This year, for the first time, the accessibility chairs of UIST 2022 collected information about the different types of sensory experiences involved in demos. We hope this information helps attendees choose demos that meet their sensory preferences. Please note that the information provided here is based on what the demo authors have provided; the accessibility chairs have been unable to verify its accuracy. If you find any inconsistencies in this information, or if you have any feedback on how we could improve this process and make demos more accessible in future years, please reach out to the accessibility chairs Amanda Swearngin, Venkatesh Potluri, and Yi-Hao Peng.
In the following demos, attendees will look at horizontal or vertical scrolling, video shot by a hand-held camera, or video from a first-person perspective (e.g., as found in a first-person shooter video game).
The user will be seated at a table. They will be wearing an HTC Vive. They will be able to pick up and play around with our device while seated at the table.
Users will hold a flashlight and shine it into the spinning display. The flashlight strobes at a rate such that the spinning creates the illusion of animation.
Attendees can enjoy sounds and images in the VR space using iPads and headphones.
Participants strap a tracker on their wrist, strap the prototype onto their thumb and index finger, wear a VR head-mounted display, and reach out with the hand wearing the prototype to grab, hold, and release virtual objects they see in front of them. For attendees unwilling to wear the VR headset, we plan to display the VR content on a monitor on the table.
Attendees will be shown a pen device that controls interactive widgets on a screen. Attendees can also try the pen themselves.
Attendees will use an iPad to capture the surrounding environment in the form of a point cloud and then edit it on the same iPad.
Attendees will interact with a smartphone to watch VR content. Attendees will also touch the back of the smartphone with their fingers.
Participants will either watch a demo author give a presentation using visualization overlays composited over live webcam video, or they will be invited to present this content themselves to other demo attendees. In the latter case, participants will be expected to perform mid-air pointing and pinching gestures to manipulate composited visualization overlays.
Attendees can write code on a laptop to programmatically control a 3D printer.
The user will be seated at a table. They will be wearing an HTC Vive. They will be able to pick up and play around with our device while seated at the table.
Attendees will see physical objects moving in mid-air and images projected in 3D space; at the same time, they can use gestures to control the objects' movement.
These are stand-alone demonstrators showing different knitted textile samples. The demos have a small screen that shows values, or are connected to a bigger screen where users can play a game. There is also a window blind that users can control through the textile interface.
Attendees can experience a simplified version of the experimental system used in this research (a system for practicing basketball free throws through AR glasses).
In the following demos, participants will hear beeps or high-frequency noises.
The user will be seated at a table. They will be wearing an HTC Vive. They will be able to pick up and play around with our device while seated at the table.
Description unavailable.
By shaking the gyro-moment-based torque presentation device, participants can experience various forms of haptic feedback through the grasped object.
Attendees watch a demo of thermoforming a TF-shell-embedded figure. They can also try thermoforming small figures themselves.
The following demos contain strobing or blinking lights.
The user will be seated at a table. They will be wearing an HTC Vive. They will be able to pick up and play around with our device while seated at the table.
Users will hold a flashlight and shine it into the spinning display. The flashlight strobes at a rate such that the spinning creates the illusion of animation.
Attendees will be able to play an action game (similar to Flappy Bird) using our non-contact force sensor as the input.
The following demos involve touch interactions.
Attendees will explore two prototypes using touch as the interaction. We designed two experiences: an underwater one, where the robot will be underwater and people can touch and sense it, and a second one with a stationary desk robot. These prototypes were designed to be explored by people with and without visual impairments.
Attendees can touch the garment made of Sensequins on the mannequin.
Attendees will see physical objects moving in mid-air and images projected in 3D space; at the same time, they can use gestures to control the objects' movement.
Description unavailable.
Questions? Contact: accessibility2022@uist.org