Keywords
UIST2.0 Archive - 20 years of UIST

reality

augmented reality

In Proceedings of UIST 1995

The world through the computer: computer augmented interaction with real world environments (p. 29-36)

In Proceedings of UIST 1995

Retrieving electronic documents with real-world objects on InteractiveDESK (p. 37-38)

In Proceedings of UIST 1997

HoloWall: designing a finger, hand, body, and object sensitive wall (p. 209-210)

In Proceedings of UIST 1997

Audio aura: light-weight audio augmented reality (p. 211-212)

In Proceedings of UIST 1997

The metaDESK: models and prototypes for tangible user interfaces (p. 223-232)

In Proceedings of UIST 1998

Of Vampire mirrors and privacy lamps: privacy management in multi-user augmented environments (p. 171-172)

In Proceedings of UIST 1999

Real-world interaction using the FieldMouse (p. 113-119)

In Proceedings of UIST 1999

Linking and messaging from real paper in the Paper PDA (p. 179-186)

In Proceedings of UIST 2000

System lag tests for augmented and virtual environments (p. 161-170)

In Proceedings of UIST 2001

View management for virtual and augmented reality (p. 101-110)

In Proceedings of UIST 2002

The missing link: augmenting biology laboratory notebooks (p. 41-50)

In Proceedings of UIST 2002

An annotated situation-awareness aid for augmented reality (p. 213-216)

In Proceedings of UIST 2004

DART: a toolkit for rapid design exploration of augmented reality experiences (p. 197-206)

In Proceedings of UIST 2005

Moveable interactive projected displays using projector based tracking (p. 63-72)

In Proceedings of UIST 2005

Supporting interaction in augmented reality in the presence of uncertain spatial knowledge (p. 111-114)

In Proceedings of UIST 2007

Hybrid infrared and visible light projection for location tracking (p. 57-60)

Abstract

A number of projects within the computer graphics, computer vision, and human-computer interaction communities have recognized the value of using projected structured light patterns for the purposes of doing range finding, location dependent data delivery, projector adaptation, or object discovery and tracking. However, most of the work exploring these concepts has relied on visible structured light patterns resulting in a caustic visual experience. In this work, we present the first design and implementation of a high-resolution, scalable, general purpose invisible near-infrared projector that can be manufactured in a practical manner. This approach is compatible with simultaneous visible light projection and integrates well with future Digital Light Processing (DLP) projector designs -- the most common type of projectors today. By unifying both the visible and non-visible pattern projection into a single device, we can greatly simplify the implementation and execution of interactive projection systems. Additionally, we can inherently provide location discovery and tracking capabilities that are unattainable using other approaches.
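
For readers unfamiliar with how a projected structured light pattern can encode location, the sketch below shows a minimal Gray-code scheme in Python: each projected frame carries one bit of every projector column's index, so a sensor that records the on/off sequence at a point can recover which column illuminates it. The projector width and bit count are assumptions for illustration; this is not the decoding pipeline described in the paper.

import numpy as np

PROJ_WIDTH = 1024                         # assumed projector width (hypothetical)
NUM_BITS = (PROJ_WIDTH - 1).bit_length()  # 10 bits index columns 0..1023

def binary_to_gray(n: int) -> int:
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def column_patterns() -> np.ndarray:
    """One row per projected frame: the 0/1 stripe value for every projector column."""
    gray = np.array([binary_to_gray(c) for c in range(PROJ_WIDTH)])
    return np.stack([(gray >> b) & 1 for b in reversed(range(NUM_BITS))])

def decode_column(observed_bits) -> int:
    """Recover the projector column from the on/off sequence seen at one point."""
    g = 0
    for bit in observed_bits:
        g = (g << 1) | int(bit)
    return gray_to_binary(g)

if __name__ == "__main__":
    patterns = column_patterns()           # shape (NUM_BITS, PROJ_WIDTH)
    column = 437                           # pretend a sensor sits under this column
    observed = patterns[:, column]         # the bit sequence it would observe
    print(decode_column(observed))         # -> 437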

In Proceedings of UIST 2007

Lucid touch: a see-through mobile device (p. 269-278)

Abstract

Touch is a compelling input modality for interactive devices; however, touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with. In this paper, we present LucidTouch, a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device. The key to making this usable is what we call pseudo-transparency: by overlaying an image of the user's hands onto the screen, we create the illusion of the mobile device itself being semi-transparent. This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand. LucidTouch also supports multi-touch input, allowing users to operate the device simultaneously with all 10 fingers. We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front, due to reduced occlusion, higher precision, and the ability to make multi-finger input.
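
The core of the pseudo-transparency idea can be expressed compactly: blend an image of the hands behind the device over the UI, and mirror rear-touch coordinates into front-screen space. The sketch below is a toy reconstruction from the abstract, with an assumed screen resolution and blend factor; it is not the LucidTouch implementation.

import numpy as np

SCREEN_W, SCREEN_H = 480, 320              # assumed screen resolution

def composite(ui_frame: np.ndarray, hand_frame: np.ndarray,
              hand_alpha: float = 0.4) -> np.ndarray:
    """Alpha-blend the rear hand image over the UI (both HxWx3, floats in [0, 1])."""
    return (1.0 - hand_alpha) * ui_frame + hand_alpha * hand_frame

def rear_touch_to_screen(x_rear: float, y_rear: float):
    """A touch on the back of the device appears mirrored left-right from the front."""
    return SCREEN_W - x_rear, y_rear

if __name__ == "__main__":
    ui = np.zeros((SCREEN_H, SCREEN_W, 3))          # stand-in for the rendered UI
    hands = np.ones((SCREEN_H, SCREEN_W, 3)) * 0.8  # stand-in for a rear camera frame
    frame = composite(ui, hands)
    print(frame.shape, rear_touch_to_screen(100, 50))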

In Proceedings of UIST 2008

Foldable interactive displays (p. 287-290)

Abstract

Modern computer displays tend to be fixed in size, rigid, and rectilinear, rendering them insensitive to the visual area demands of an application or the desires of the user. Foldable displays offer the ability to reshape and resize the interactive surface at our convenience and even permit us to carry a very large display surface in a small volume. In this paper, we implement four interactive foldable display designs using image projection with low-cost tracking and explore display behaviors using orientation sensitivity.
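
One plausible ingredient of such a projected display, sketched here under assumptions the abstract does not spell out, is warping the projected image onto the tracked corners of a (flat) display segment via a homography. The corner coordinates below are invented for illustration.

import numpy as np

def homography(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Direct linear transform from four or more point correspondences (Nx2 arrays)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def warp_point(h: np.ndarray, x: float, y: float):
    p = h @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

if __name__ == "__main__":
    image_corners = np.array([[0, 0], [640, 0], [640, 480], [0, 480]], float)
    tracked_corners = np.array([[80, 60], [700, 90], [660, 520], [50, 470]], float)
    H = homography(image_corners, tracked_corners)
    print(warp_point(H, 320, 240))   # where the image centre lands on the tracked surface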

In Proceedings of UIST 2010

Gilded gait: reshaping the urban experience with augmented footsteps (p. 185-188)

Abstract

In this paper we describe Gilded Gait, a system that changes the perceived physical texture of the ground, as felt through the soles of users' feet. Ground texture, in spite of its potential as an effective channel of peripheral information display, has so far been paid little attention in HCI research. The system is designed as a pair of insoles with embedded actuators, and utilizes vibrotactile feedback to simulate the perceptions of a range of different ground textures. The discreet, low-key nature of the interface makes it particularly suited for outdoor use, and its capacity to alter how people experience the built environment may open new possibilities in urban design.
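
As a rough illustration of how texture-specific vibrotactile signals might be synthesized per footstep, the sketch below mixes a sinusoidal carrier with noise under a decaying impact envelope. The texture presets and sample rate are invented for illustration; the paper does not specify its actuator signals.

import numpy as np

SAMPLE_RATE = 8000  # Hz, assumed actuator drive rate

# hypothetical texture presets: (carrier frequency in Hz, roughness 0..1)
TEXTURES = {
    "pavement": (250.0, 0.1),
    "gravel":   (180.0, 0.8),
    "grass":    (90.0,  0.3),
}

def footstep_waveform(texture: str, duration_s: float = 0.25, rng=None) -> np.ndarray:
    """Return a mono waveform in [-1, 1] for one footstep on `texture`."""
    rng = rng or np.random.default_rng(0)
    freq, roughness = TEXTURES[texture]
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * freq * t)
    noise = rng.uniform(-1, 1, t.size)
    envelope = np.exp(-t / 0.08)               # decaying "impact" envelope
    wave = envelope * ((1 - roughness) * carrier + roughness * noise)
    return np.clip(wave, -1.0, 1.0)

if __name__ == "__main__":
    for name in TEXTURES:
        w = footstep_waveform(name)
        print(name, w.shape, float(np.abs(w).max()))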

In Proceedings of UIST 2010

Combining multiple depth cameras and projectors for interactions on, above and between surfaces (p. 273-282)

Abstract

Instrumented with multiple depth cameras and projectors, LightSpace is a small room installation designed to explore a variety of interactions and computational strategies related to interactive displays and the space that they inhabit. LightSpace cameras and projectors are calibrated to 3D real world coordinates, allowing for projection of graphics correctly onto any surface visible by both camera and projector. Selective projection of the depth camera data enables emulation of interactive displays on un-instrumented surfaces (such as a standard table or office desk), as well as facilitating mid-air interactions between and around these displays. For example, after performing multi-touch interactions on a virtual object on the tabletop, the user may transfer the object to another display by simultaneously touching the object and the destination display. Or the user may "pick up" the object by sweeping it into their hand, see it sitting in their hand as they walk over to an interactive wall display, and "drop" the object onto the wall by touching it with their other hand. We detail the interactions and algorithms unique to LightSpace, discuss some initial observations of use and suggest future directions.
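
The calibration idea in the abstract can be illustrated with a simple pinhole model: if a projector is calibrated like a camera, with intrinsics and a world-space pose, any 3D point reported by a depth camera maps to the projector pixel that illuminates it. The calibration numbers below are invented for illustration; this is a sketch of the general technique, not the LightSpace implementation.

import numpy as np

def project_to_projector(point_world: np.ndarray, K: np.ndarray,
                         R: np.ndarray, t: np.ndarray):
    """Map a 3D world point (metres) to projector pixel coordinates."""
    p_proj = R @ point_world + t          # world -> projector coordinates
    u, v, w = K @ p_proj                  # perspective projection
    return u / w, v / w

if __name__ == "__main__":
    # Illustrative calibration: a 1280x800 projector looking straight down its z-axis.
    K = np.array([[1400.0, 0.0, 640.0],
                  [0.0, 1400.0, 400.0],
                  [0.0, 0.0, 1.0]])
    R = np.eye(3)
    t = np.zeros(3)
    table_corner = np.array([0.2, -0.1, 1.5])   # a point reported by the depth camera
    print(project_to_projector(table_corner, K, R, t))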

augmented reality (ar)

immersive virtual reality

In Proceedings of UIST 1995

The virtual tricorder: a uniform interface for virtual reality (p. 39-40)

mixed reality

In Proceedings of UIST 2004

DART: a toolkit for rapid design exploration of augmented reality experiences (p. 197-206)

In Proceedings of UIST 2010

MAI painting brush: an interactive device that realizes the feeling of real painting (p. 97-100)

Abstract

Many digital painting systems have been proposed and their quality is improving. In these systems, graphics tablets are widely used as input devices. However, because of its rigid nib and indirect manipulation, the operational feeling of a graphics tablet is different from that of a real paint brush. We solved this problem by developing the MR-based Artistic Interactive (MAI) Painting Brush, which imitates a real paint brush, and constructed a mixed reality (MR) painting system that enables direct painting on physical objects in the real world.

reality

In Proceedings of UIST 2010

The engineering of personhood (p. 343-346)

Abstract

Any subset of reality can potentially be interpreted as a computer, so when we speak about a particular computer, we are merely speaking about a portion of reality we can understand computationally. That means that computation is only identifiable through the human experience of it. User interface is ultimately the only grounding for the abstractions of computation, in the same way that the measurement of physical phenomena provides the only legitimate basis for physics. But user interface also changes humans. As computation is perceived, the natures of self and personhood are transformed. This process, when designers are aware of it, can be understood as an emerging form of applied philosophy or even applied spirituality.

virtual reality

In Proceedings of UIST 1995

Amortizing 3D graphics optimization across multiple frames (p. 13-19)

In Proceedings of UIST 1996

The go-go interaction technique: non-linear mapping for direct manipulation in VR (p. 79-80)

In Proceedings of UIST 1996

Language-level support for exploratory programming of distributed virtual environments (p. 83-94)

In Proceedings of UIST 1996

The Lego interface toolkit (p. 97-98)

In Proceedings of UIST 1997

Immersion in desktop virtual reality (p. 11-19)

In Proceedings of UIST 1997

Worldlets---3D thumbnails for wayfinding in virtual environments (p. 21-30)

In Proceedings of UIST 1997

The omni-directional treadmill: a locomotion device for virtual worlds (p. 213-221)

In Proceedings of UIST 1998

Of Vampire mirrors and privacy lamps: privacy management in multi-user augmented environments (p. 171-172)

In Proceedings of UIST 2000

System lag tests for augmented and virtual environments (p. 161-170)

In Proceedings of UIST 2006

Procedural haptic texture (p. 179-186)

virtual reality (vr)

virtual reality technology

In Proceedings of UIST 1995

A 3D tracking experiment on latency and its compensation methods in virtual environments (p. 41-49)