Keywords
UIST2.0 Archive - 20 years of UIST

technique

3d interaction technique

advanced interaction technique

In Proceedings of UIST 2000

The architecture and implementation of CPN2000, a post-WIMP graphical application (p. 181-190)

animation technique

In Proceedings of UIST 1993

Animation support in a user interface toolkit: flexible, robust, and reusable abstractions (p. 57-67)

demonstrational technique

In Proceedings of UIST 1992

A history-based macro by example system (p. 99-106)

fluid interaction technique

In Proceedings of UIST 2003

Fluid interaction techniques for the control and annotation of digital video (p. 105-114)

graphical technique

In Proceedings of UIST 1996

Ambiguous intentions: a paper-like interface for creative design (p. 183-192)

input technique and device

In Proceedings of UIST 2006

Camera phone based motion sensing: interaction techniques, applications and performance study (p. 101-110)

interaction technique

In Proceedings of UIST 1992

Progress in building user interface toolkits: the world according to XIT (p. 181-190)

In Proceedings of UIST 1993

A graphics toolkit based on differential constraints (p. 109-120)

In Proceedings of UIST 1994

Reconnaissance support for juggling multiple processing options (p. 27-28)

In Proceedings of UIST 1994

An architecture for an extensible 3D interface toolkit (p. 59-67)

In Proceedings of UIST 1994

Translucent patches---dissolving windows (p. 121-130)

In Proceedings of UIST 1995

Retrieving electronic documents with real-world objects on InteractiveDESK (p. 37-38)

In Proceedings of UIST 1996

Tilting operations for small screen interfaces (p. 167-168)

In Proceedings of UIST 1998

Path drawing for 3D walkthrough (p. 173-174)

In Proceedings of UIST 1999

The VideoMouse: a camera-based multi-degree-of-freedom input device (p. 103-112)

In Proceedings of UIST 1999

Integrated manipulation: context-aware manipulation of 2D diagrams (p. 159-160)

In Proceedings of UIST 1999

The role of kinesthetic reference frames in two-handed input performance (p. 171-178)

In Proceedings of UIST 1999

Linking and messaging from real paper in the Paper PDA (p. 179-186)

In Proceedings of UIST 2000

Sensing techniques for mobile interaction (p. 91-100)

In Proceedings of UIST 2000

The reading assistant: eye gaze triggered auditory prompting for reading remediation (p. 101-107)

In Proceedings of UIST 2000

ToolStone: effective use of the physical manipulation vocabularies of input devices (p. 109-117)

In Proceedings of UIST 2001

Voice as sound: using non-verbal voice input for interactive control (p. 155-156)

In Proceedings of UIST 2001

A suggestive interface for 3D drawing (p. 173-181)

In Proceedings of UIST 2002

StyleCam: interactive stylized 3D navigation using integrated spatial & temporal controls (p. 101-110)

In Proceedings of UIST 2002

TiltType: accelerometer-supported text entry for very small devices (p. 201-204)

In Proceedings of UIST 2002

WebThumb: interaction techniques for small-screen browsers (p. 205-208)

In Proceedings of UIST 2003

VisionWand: interaction techniques for large displays using a passive wand tracked in 3D (p. 173-182)

In Proceedings of UIST 2004

Navigating documents with the virtual scroll ring (p. 57-60)

In Proceedings of UIST 2004

A remote control interface for large displays (p. 127-136)

In Proceedings of UIST 2004

Interacting with hidden content using content-aware free-space transparency (p. 189-192)

In Proceedings of UIST 2004

The MaggLite post-WIMP toolkit: draw it, connect it and run it (p. 257-266)

In Proceedings of UIST 2006

Multi-layer interaction for digital tables (p. 269-272)

In Proceedings of UIST 2007

Shadow reaching: a new perspective on interaction for large displays (p. 53-56)

Abstract

We introduce Shadow Reaching, an interaction technique that makes use of a perspective projection applied to a shadow representation of a user. The technique was designed to facilitate manipulation over large distances and enhance understanding in collaborative settings. We describe three prototype implementations that illustrate the technique, examining the advantages of using shadows as an interaction metaphor to support single users and groups of collaborating users. Using these prototypes as a design probe, we discuss how the three components of the technique (sensing, modeling, and rendering) can be accomplished with real (physical) or computed (virtual) shadows, and the benefits and drawbacks of each approach.
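The abstract distinguishes real (physical) from computed (virtual) shadows. As a rough illustration of the computed case only, the sketch below projects a tracked body point through a virtual light onto the display plane; the coordinate setup and light placement are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of a computed (virtual) shadow in the spirit of Shadow Reaching.
# Assumptions (not from the paper): the display is the plane z = 0, the user stands
# in front of it at z > 0, and a virtual point light sits behind the user. A tracked
# body point (e.g., the hand) casts a "shadow" where the ray from the light through
# that point meets the display plane.

from dataclasses import dataclass

@dataclass
class Point3:
    x: float
    y: float
    z: float

def project_to_display(light: Point3, body_point: Point3) -> tuple[float, float]:
    """Intersect the ray light -> body_point with the display plane z = 0."""
    dz = body_point.z - light.z
    if dz == 0:
        raise ValueError("ray is parallel to the display plane")
    t = -light.z / dz                      # ray parameter where z reaches 0
    sx = light.x + t * (body_point.x - light.x)
    sy = light.y + t * (body_point.y - light.y)
    return sx, sy

# Moving the virtual light closer to the user enlarges the shadow, which is what
# lets a user "reach" across a wall-sized display from a distance.
light = Point3(0.0, 1.5, 4.0)              # virtual light 4 m from the wall
hand = Point3(0.4, 1.4, 2.0)               # hand held out, 2 m from the wall
print(project_to_display(light, hand))     # shadow position on the display, in metres
```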

In Proceedings of UIST 2007
Article Picture

Blui: low-cost localized blowable user interfaces (p. 217-220)

Abstract

We describe a unique form of hands-free interaction that can be implemented on most commodity computing platforms. Our approach supports blowing at a laptop or computer screen to directly control certain interactive applications. Localization estimates are produced in real time to determine where on the screen the person is blowing. Our approach relies solely on a single microphone, such as those already embedded in a standard laptop or one placed near a computer monitor, which makes our approach very cost-effective and easy to deploy. We show example interaction techniques that leverage this approach.
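The abstract does not spell out how the single-microphone localization works. The sketch below shows one plausible calibration-and-template-matching scheme; the spectral features, cosine-similarity matching, and per-region calibration are assumptions of mine, not the paper's actual pipeline.

```python
# Illustrative sketch only: localize a blow by comparing its magnitude spectrum
# against templates recorded while blowing at known screen regions.
import numpy as np

def spectrum(frame: np.ndarray) -> np.ndarray:
    """Unit-length magnitude spectrum of one windowed audio frame."""
    mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    return mag / (np.linalg.norm(mag) + 1e-9)

def build_templates(calibration: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """Average the spectra recorded while blowing at each known screen region."""
    return {region: np.mean([spectrum(f) for f in frames], axis=0)
            for region, frames in calibration.items()}

def localize(frame: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the screen region whose template best matches the incoming frame."""
    feat = spectrum(frame)
    return max(templates, key=lambda r: float(np.dot(feat, templates[r])))
```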

In Proceedings of UIST 2009

Disappearing mobile devices (p. 101-110)

Abstract

In this paper, we extrapolate the evolution of mobile devices in one specific direction, namely miniaturization. While we maintain the concept of a device that people are aware of and interact with intentionally, we envision that this concept can become small enough to allow invisible integration into arbitrary surfaces or human skin, and thus truly ubiquitous use. Assuming this outcome, we investigate what technology would be most likely to provide the basis for these devices, what abilities such devices can be expected to have, and whether or not devices of that size can still allow for meaningful interaction. We survey candidate technologies, drill down on gesture-based interaction, and demonstrate how it can be adapted to the desired form factors. While the resulting devices offer only the bare minimum in feedback and only the most basic interactions, we demonstrate that simple applications remain possible. We complete our exploration with two studies in which we investigate the affordance of these devices more concretely, namely marking and text entry using a gesture alphabet.

In Proceedings of UIST 2009

Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices (p. 121-124)

Abstract

We present Abracadabra, a magnetically driven input technique that offers users wireless, unpowered, high fidelity finger input for mobile devices with very small screens. By extending the input area to many times the size of the device's screen, our approach is able to offer a high C-D gain, enabling fine motor control. Additionally, screen occlusion can be reduced by moving interaction off of the display and into unused space around the device. We discuss several example applications as a proof of concept. Finally, results from our user study indicate radial targets as small as 16 degrees can achieve greater than 92% selection accuracy, outperforming comparable radial, touch-based finger input.
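As a loose illustration of the radial targets mentioned in the abstract, the sketch below maps a sensed field direction to a target index. The assumption that the device exposes a 2D magnetic field vector in the screen plane, and the choice of 22 targets (roughly 16 degrees each), are mine for illustration and are not details from the paper.

```python
# Minimal sketch of radial target selection around a very small device,
# in the spirit of Abracadabra's off-screen, magnet-driven finger input.
import math

def select_radial_target(field_x: float, field_y: float, n_targets: int = 22) -> int:
    """Map a sensed 2D field vector to the index of the radial target it points at."""
    bearing = math.degrees(math.atan2(field_y, field_x)) % 360.0
    return int(bearing * n_targets / 360.0)

# Example: a field vector pointing up and slightly to the right (bearing ~79 degrees)
# falls into target index 4 when the ring is split into 22 equal sectors.
print(select_radial_target(0.2, 1.0))
```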

In Proceedings of UIST 2010

PhoneTouch: a technique for direct phone interaction on surfaces (p. 13-16)

Abstract

PhoneTouch is a novel technique for integration of mobile phones and interactive surfaces. The technique enables the use of phones to select targets on the surface by direct touch, facilitating, for instance, pick&drop-style transfer of objects between phone and surface. The technique is based on separate detection of phone touch events by the surface, which determines the location of the touch, and by the phone, which contributes device identity. The device-level observations are merged based on correlation in time. We describe a proof-of-concept implementation of the technique, using vision for touch detection on the surface (including discrimination of finger versus phone touch) and acceleration features for detection by the phone.
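The merging step described in the abstract can be illustrated with a small sketch: the surface reports where a touch happened, the phone reports which device touched, and the two observations are paired when their timestamps are close. The 50 ms window and the data layout below are illustrative assumptions, not values from the paper.

```python
# Sketch of pairing surface-side touch locations with phone-side identity events
# by correlation in time.
from dataclasses import dataclass

@dataclass
class SurfaceTouch:
    t: float          # timestamp in seconds
    x: float
    y: float

@dataclass
class PhoneBump:
    t: float          # timestamp in seconds
    device_id: str    # identity contributed by the phone

def merge_events(surface_touches: list[SurfaceTouch],
                 phone_bumps: list[PhoneBump],
                 window: float = 0.05) -> list[tuple[str, float, float]]:
    """Pair each phone-side event with the closest-in-time surface touch."""
    merged = []
    for bump in phone_bumps:
        candidates = [s for s in surface_touches if abs(s.t - bump.t) <= window]
        if candidates:
            best = min(candidates, key=lambda s: abs(s.t - bump.t))
            merged.append((bump.device_id, best.x, best.y))
    return merged
```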

interactive technique

In Proceedings of UIST 1995

SDM: selective dynamic manipulation of visualizations (p. 61-70)

lens interaction technique

In Proceedings of UIST 1997

Debugging lenses: a new class of transparent tools for user interface debugging (p. 179-187)

navigation technique

In Proceedings of UIST 1995

Using information murals in visualization applications (p. 73-74)

novel interaction technique

In Proceedings of UIST 2001

Cursive: a novel interaction technique for controlling expressive avatar gesture (p. 151-152)

pointing technique

In Proceedings of UIST 2006

Multi-layer interaction for digital tables (p. 269-272)

search technique

In Proceedings of UIST 1998

Scratchpad: mechanisms for better navigation in directed Web searching (p. 1-8)

selection technique

In Proceedings of UIST 2005

Zoom-and-pick: facilitating visual zooming and precision pointing with interactive handheld projectors (p. 73-82)