Banner photo: Tumalo Falls in Oregon.

Accessibility info on demos

This year, for the first time, the accessibility chairs of UIST 2022 collected information about the different types of sensory experiences involved in demos. We hope this information helps attendees choose demos that meet their sensory preferences. Please note that the information provided here is based on what the demo authors supplied, and the accessibility chairs have been unable to verify its accuracy. If you find any inconsistencies in this information, or if you have any feedback on how we could improve this process and make demos more accessible in future years, please reach out to the accessibility chairs Amanda Swearngin, Venkatesh Potluri, and Yi-Hao Peng.

Motion

In the following demos, attendees will look at horizontal or vertical scrolling, video shot by a hand-held camera, or video from a first-person perspective, e.g., the kind found in a first-person shooter video game.

Demonstration of Geppetteau: Enabling haptic perceptions of virtual fluids in various vessel profiles using a string-driven haptic interface

The user will be seated at a table. They will be wearing an HTC Vive. They will be able to pick up and play around with our device while seated at the table.

Interactive Zoetrope with a Strobing Flashlight

Users will hold a flashlight and shine it into the spinning display. The flashlight strobes at a rate such that the spinning gives the illusion of animation.

Music Scope Pad

Attendees can enjoy sounds and images in the VR space using iPads and headphones.

SpinOcchietto: A Wearable Skin-Slip Haptic Device for Rendering Width and Motion of Objects Gripped Between the Fingertips

Participants strap a tracker on their wrist, strap the prototype onto their thumb and index fingers, wear a VR head-mounted display, and reach out their hand wearing the prototype to grab, hold, and release virtual objects they see in front of them. For attendees unwilling to wear the VR headset, we plan to display what is happening on a monitor on the table.

Demonstrating a High-Precision Pen that Senses Translation and Rotation on Passive Surfaces

Attendees will be shown a pen device that controls interactive widgets on a screen. Attendees can also try the pen themselves.

Point Cloud Capture and Editing for AR Environmental Design

Attendees will use an iPad to capture the surrounding environment in the form of a point cloud and then edit it on the same iPad.

TouchVR: A Modality for Instant VR Experience

Attendees will interact with a smartphone while watching VR content. Attendees also need to touch the back of the smartphone with their fingers.

Augmented Chironomia for Presenting Data to Remote Audiences

Participants will either watch a demo author give a presentation using visualization overlays composited over live webcam video, or they will be invited to present this content themselves to other demo attendees. In the latter case, participants will be expected to perform mid-air pointing and pinching gestures to manipulate composited visualization overlays.

Demonstrating p5.fab: Direct Control of Digital Fabrication Machines from a Creative Coding Environment

Attendees can write code on a laptop to programmatically control a 3D printer.

DATALEV: Acoustophoretic Data Physicalisation

Attendees will see physical objects moving in mid-air and projected images in 3D space; at the same time, they can use gestures to control the objects' movement.

Knitted Force Sensors

These are stand-alone demonstrators showing different knitted textile samples. The demos either have a small screen that shows values or are connected to a bigger screen where users can play a game. There is also a window blind that users can control through the textile interface.

Anywhere Hoop: Virtual Free Throw Training System

Attendees can experience a simplified version of the experimental system used in this research: a system for practicing basketball free throws through AR glasses.

Beeping noises or high-frequency sounds

Participants will experience listening to beeps or high-frequency noises in the following demos.

Demonstration of Geppetteau: Enabling haptic perceptions of virtual fluids in various vessel profiles using a string-driven haptic interface

The user will be seated at a table. They will be wearing an HTC Vive. They will be able to pick up and play around with our device while seated at the table.

Fibercuit: Prototyping High-Resolution Flexible and Kirigami Circuits with a Fiber Laser Engraver

Description unavailable.

Demonstration of MetamorphX: An Ungrounded Moment Display that Changes its Physical Properties

By shaking the gyro-moment-based torque presentation device, participants can experience various kinds of haptic feedback via the grasped object.

Thermoformable Shell

Attendees watch a demo of thermoforming a TF-shell-embedded figure. They can also try thermoforming small figures themselves.

Strobing or blinking lights

The following demos contain strobing or blinking lights.

Demonstration of Geppetteau: Enabling haptic perceptions of virtual fluids in various vessel profiles using a string-driven haptic interface

The user will be seated at a table. They will be wearing an HTC Vive. They will be able to pick up and play around with our device while seated at the table.

Interactive Zoetrope with a Strobing Flashlight

Users will hold a flashlight and shine it into the spinning display. The flashlight strobes at a rate such that the spinning gives the illusion of animation.

ForceSight: Non-Contact Force Sensing with Laser Speckle Imaging

Attendees will be able to play an action game (similar to Flappy Bird) with our non-contact force sensor as the input.

Touchibo: Multimodal Texture-Changing Robotic Platform for Shared Human Experiences

Attendees will explore two prototypes, using touch as the interaction. We designed two experiences: an underwater one, where the robot will be underwater and people can touch and sense it, and a stationary desk robot. Both prototypes were designed to be explored by people with and without visual impairments.

SenSequins: Smart Textile Using 3D Printed Conductive Sequins

Attendees can touch the garment made of Sensequins on the mannequin.

DATALEV: Acoustophoretic Data Physicalisation

Attendees will see physical objects moving in mid-air and projected images in 3D space; at the same time, they can use gestures to control the objects' movement.

Fibercuit: Prototyping High-Resolution Flexible and Kirigami Circuits with a Fiber Laser Engraver

Description unavailable.

Accessibility Chairs

Questions? Contact: accessibility2022@uist.org
