Keywords
UIST2.0 Archive - 20 years of UIST

infrared

In Proceedings of UIST 1997

HoloWall: designing a finger, hand, body, and object sensitive wall (p. 209-210)

In Proceedings of UIST 1999

Implementing phicons: combining computer vision with infrared technology for interactive physical icons (p. 67-68)

In Proceedings of UIST 2009

Optically sensing tongue gestures for computer input (p. 177-180)

Abstract

Many patients with paralyzing injuries or medical conditions retain the use of their cranial nerves, which control the eyes, jaw, and tongue. While researchers have explored eye-tracking and speech technologies for these patients, we believe there is potential for directly sensing explicit tongue movement for controlling computers. In this paper, we describe a novel approach of using infrared optical sensors embedded within a dental retainer to sense tongue gestures. We describe an experiment showing our system effectively discriminating between four simple gestures with over 90% accuracy. In this experiment, users were also able to play the popular game Tetris with their tongues. Finally, we present lessons learned and opportunities for future work.
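The abstract describes discriminating four simple tongue gestures from infrared optical sensor readings. A minimal nearest-centroid sketch of that classification step follows; the sensor layout, gesture names, and centroid values are invented for illustration and are not taken from the paper.

```python
import math

# Hypothetical calibration data: mean (left_ir, right_ir) reflectance
# readings per gesture, e.g. collected in a short per-user calibration phase.
CENTROIDS = {
    "left":  (0.9, 0.2),
    "right": (0.2, 0.9),
    "up":    (0.8, 0.8),
    "rest":  (0.1, 0.1),
}

def classify(sample):
    """Return the gesture whose calibration centroid is nearest to sample."""
    return min(CENTROIDS, key=lambda g: math.dist(sample, CENTROIDS[g]))

print(classify((0.85, 0.15)))  # -> left
print(classify((0.75, 0.82)))  # -> up
```

With only four well-separated gestures, even this simple distance rule can plausibly reach the high accuracy the abstract reports; the paper's actual classifier is not specified here.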

infrared (ir) tag

In Proceedings of UIST 2002

That one there! Pointing to establish device identity (p. 151-160)

infrared detection

In Proceedings of UIST 1997

A finger-mounted, direct pointing device for mobile computing (p. 41-42)

infrared projection

In Proceedings of UIST 2007

Hybrid infrared and visible light projection for location tracking (p. 57-60)

Abstract

A number of projects within the computer graphics, computer vision, and human-computer interaction communities have recognized the value of using projected structured light patterns for the purposes of doing range finding, location dependent data delivery, projector adaptation, or object discovery and tracking. However, most of the work exploring these concepts has relied on visible structured light patterns resulting in a caustic visual experience. In this work, we present the first design and implementation of a high-resolution, scalable, general purpose invisible near-infrared projector that can be manufactured in a practical manner. This approach is compatible with simultaneous visible light projection and integrates well with future Digital Light Processing (DLP) projector designs -- the most common type of projectors today. By unifying both the visible and non-visible pattern projection into a single device, we can greatly simplify the implementation and execution of interactive projection systems. Additionally, we can inherently provide location discovery and tracking capabilities that are unattainable using other approaches.
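The location discovery the abstract mentions typically works by projecting a temporal sequence of structured-light frames that encode each pixel's coordinate, which a photosensor can decode. A sketch assuming a Gray-code sequence (a common choice for structured light; the paper's exact coding is not specified here):

```python
def gray_encode(x):
    """Standard binary-reflected Gray code of x."""
    return x ^ (x >> 1)

def frames_seen(x, n_bits):
    """Bits a photosensor at projector column x observes over n_bits frames:
    frame k lights every column whose Gray code has bit k set."""
    g = gray_encode(x)
    return [(g >> k) & 1 for k in range(n_bits)]

def decode_position(bits):
    """Convert the observed Gray-code bits back to the column index."""
    g = sum(b << k for k, b in enumerate(bits))
    x = 0
    while g:          # Gray decode: XOR of all right-shifts of g
        x ^= g
        g >>= 1
    return x

print(decode_position(frames_seen(713, 10)))  # -> 713
```

Gray coding is often preferred here because adjacent columns differ in only one frame, limiting the damage from a single misread bit at a pattern boundary.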

In Proceedings of UIST 2010

Enabling beyond-surface interactions for interactive surface with an invisible projection (p. 263-272)

Abstract

This paper presents a programmable infrared (IR) technique that utilizes invisible, programmable markers to support interaction beyond the surface of a diffused-illumination (DI) multi-touch system. We combine an IR projector and a standard color projector to simultaneously project visible content and invisible markers. Mobile devices outfitted with IR cameras can compute their 3D positions based on the markers perceived. Markers are selectively turned off to support multi-touch and direct on-surface tangible input. The proposed techniques enable a collaborative multi-display multi-touch tabletop system. We also present three interactive tools: i-m-View, i-m-Lamp, and i-m-Flashlight, which consist of a mobile tablet and projectors that users can freely interact with beyond the main display surface. Early user feedback shows that these interactive devices, combined with a large interactive display, allow more intuitive navigation and are reportedly enjoyable to use.
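The abstract notes that devices with IR cameras compute their 3D positions from the perceived markers. A minimal pinhole-camera sketch of one ingredient of that computation: markers of known physical spacing let a camera estimate its distance to the surface from their apparent pixel spacing. The focal length and marker spacing below are made-up example values, and the actual system recovers a full 3D pose from multiple markers rather than distance alone.

```python
FOCAL_PX = 800.0         # assumed camera focal length, in pixels
MARKER_SPACING_M = 0.05  # assumed physical distance between adjacent markers

def distance_to_surface(pixel_spacing):
    """Pinhole relation d = f * X / x: distance (in meters) at which markers
    MARKER_SPACING_M apart appear pixel_spacing pixels apart in the image."""
    return FOCAL_PX * MARKER_SPACING_M / pixel_spacing

print(distance_to_surface(100.0))  # -> 0.4
```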

infrared sensing

In Proceedings of UIST 2007

ThinSight: versatile multi-touch sensing for thin form-factor displays (p. 259-268)

Abstract

ThinSight is a novel optical sensing system, fully integrated into a thin form-factor display, capable of detecting multiple fingers placed on or near the display surface. We describe this new hardware in detail, and demonstrate how it can be embedded behind a regular LCD, allowing sensing without degradation of display capability. With our approach, fingertips and hands are clearly identifiable through the display. The approach of optical sensing also opens up the exciting possibility for detecting other physical objects and visual markers through the display, and some initial experiments are described. We also discuss other novel capabilities of our system: interaction at a distance using IR pointing devices, and IR-based communication with other electronic devices through the display. A major advantage of ThinSight over existing camera and projector based optical systems is its compact, thin form-factor making such systems even more deployable. We therefore envisage using ThinSight to capture rich sensor data through the display which can be processed using computer vision techniques to enable both multi-touch and tangible interaction.
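In systems like the one described above, fingertips near the surface appear as bright spots in the grid of IR sensor readings, and a standard way to turn such a frame into touch points is thresholding followed by connected-component grouping. A hedged sketch of that step; the grid values and threshold are invented for illustration and do not come from the paper.

```python
def find_touches(grid, threshold=0.5):
    """Return one (row, col) centroid per connected bright region."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    touches = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and (r, c) not in seen:
                # Flood-fill one blob of above-threshold cells.
                stack, blob = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] > threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                # One touch point per blob: the mean cell position.
                touches.append((sum(y for y, _ in blob) / len(blob),
                                sum(x for _, x in blob) / len(blob)))
    return touches

frame = [
    [0.0, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.9, 0.8, 0.0, 0.0],
    [0.0, 0.7, 0.9, 0.0, 0.7],
    [0.0, 0.0, 0.0, 0.0, 0.8],
]
print(len(find_touches(frame)))  # -> 2
```

A real pipeline would also smooth the readings, interpolate centroids to sub-cell precision, and track blobs across frames; this shows only the core grouping idea.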