Keywords
UIST2.0 Archive - 20 years of UIST

tracking

eye tracking

In Proceedings of UIST 2000

The reading assistant: eye gaze triggered auditory prompting for reading remediation (p. 101-107)

In Proceedings of UIST 2005

ViewPointer: lightweight calibration-free eye tracking for ubiquitous handsfree deixis (p. 53-61)

In Proceedings of UIST 2005

eyeLook: using attention to facilitate mobile media consumption (p. 103-106)

In Proceedings of UIST 2007

Gaze-enhanced scrolling techniques (p. 213-216)

Abstract

Scrolling is an essential part of our everyday computing experience. Contemporary scrolling techniques rely on the explicit initiation of scrolling by the user. The act of scrolling is tightly coupled with the user's ability to absorb information via the visual channel. The use of eye gaze information is therefore a natural choice for enhancing scrolling techniques. We present several gaze-enhanced scrolling techniques for manual and automatic scrolling which use gaze information as a primary input or as an augmented input. We also introduce the use of off-screen gaze-actuated buttons for document navigation and control.

finger tracking with computer vision

In Proceedings of UIST 2004

Visual tracking of bare fingers for interactive surfaces (p. 119-122)

hand tracking

In Proceedings of UIST 2006

Robust computer vision-based detection of pinching for one and two-handed gesture input (p. 255-258)

object tracking

In Proceedings of UIST 2002

The actuated workbench: computer-controlled actuation in tabletop tangible interfaces (p. 181-190)

projector based tracking

In Proceedings of UIST 2005

Moveable interactive projected displays using projector based tracking (p. 63-72)

projector-based tracking

In Proceedings of UIST 2007

Hybrid infrared and visible light projection for location tracking (p. 57-60)

Abstract

A number of projects within the computer graphics, computer vision, and human-computer interaction communities have recognized the value of using projected structured light patterns for the purposes of doing range finding, location dependent data delivery, projector adaptation, or object discovery and tracking. However, most of the work exploring these concepts has relied on visible structured light patterns resulting in a caustic visual experience. In this work, we present the first design and implementation of a high-resolution, scalable, general purpose invisible near-infrared projector that can be manufactured in a practical manner. This approach is compatible with simultaneous visible light projection and integrates well with future Digital Light Processing (DLP) projector designs -- the most common type of projector today. By unifying both the visible and non-visible pattern projection into a single device, we can greatly simplify the implementation and execution of interactive projection systems. Additionally, we can inherently provide location discovery and tracking capabilities that are unattainable using other approaches.

vision tracking

In Proceedings of UIST 2003

VisionWand: interaction techniques for large displays using a passive wand tracked in 3D (p. 173-182)