Keywords
UIST2.0 Archive - 20 years of UIST

device

3d input device

consumer device

In Proceedings of UIST 2003

Rapid serial visual presentation techniques for consumer digital video devices (p. 115-124)

device aggregation

In Proceedings of UIST 2005

Dial and see: tackling the voice menu navigation problem with cross-device user experience integration (p. 187-190)

device ensemble

In Proceedings of UIST 2008

Iterative design and evaluation of an event architecture for pen-and-paper interfaces (p. 111-120)

Abstract

This paper explores architectural support for interfaces combining pen, paper, and PC. We show how the event-based approach common to GUIs can apply to augmented paper, and describe additions to address paper's distinguishing characteristics. To understand the developer experience of this architecture, we deployed the toolkit to 17 student teams for six weeks. Analysis of the developers' code provided insight into the appropriateness of events for paper UIs. The usage patterns we distilled informed a second iteration of the toolkit, which introduces techniques for integrating interactive and batched input handling, coordinating interactions across devices, and debugging paper applications. The study also revealed that programmers created gesture handlers by composing simple ink measurements. This desire for informal interactions inspired us to include abstractions for recognition. This work has implications beyond paper - designers of graphical tools can examine API usage to inform iterative toolkit development.
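The GUI-style event dispatch described above can be pictured in miniature. This is a hedged sketch only; the class and handler names (`PaperRegion`, `on_ink`, `dispatch`) are invented for illustration and are not the toolkit's actual API:

```python
# Minimal sketch of event-based dispatch for paper regions (names hypothetical).

class PaperRegion:
    """A rectangular region on a printed page that ink events can target."""
    def __init__(self, name, bounds):
        self.name, self.bounds = name, bounds  # bounds = (x0, y0, x1, y1)
        self.handlers = []

    def on_ink(self, handler):
        """Register a callback fired when a stroke lands in this region."""
        self.handlers.append(handler)

    def contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1

def dispatch(regions, stroke):
    """Route a pen stroke (a list of (x, y) points) by its first point."""
    x, y = stroke[0]
    return [h(stroke) for r in regions if r.contains(x, y) for h in r.handlers]

button = PaperRegion("submit", (0, 0, 100, 50))
button.on_ink(lambda s: "submit: %d points" % len(s))
print(dispatch([button], [(10, 10), (12, 11)]))  # ['submit: 2 points']
```

The batched-input and cross-device coordination features the paper describes would sit on top of a dispatch loop of roughly this shape.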

device integration

In Proceedings of UIST 2005

Dial and see: tackling the voice menu navigation problem with cross-device user experience integration (p. 187-190)

display device

In Proceedings of UIST 2004

Using light emitting diode arrays as touch-sensitive input and output devices (p. 287-290)

exertion device

In Proceedings of UIST 1997

The omni-directional treadmill: a locomotion device for virtual worlds (p. 213-221)

foldable input device

In Proceedings of UIST 2008

Towards more paper-like input: flexible input devices for foldable interaction styles (p. 283-286)

Abstract

This paper presents Foldable User Interfaces (FUI), a combination of a 3D GUI with windows imbued with the physics of paper, and Foldable Input Devices (FIDs). FIDs are sheets of paper that allow realistic transformations of graphical sheets in the FUI. Foldable input devices are made out of construction paper augmented with IR reflectors, and tracked by computer vision. Window sheets can be picked up and flexed with simple movements and deformations of the FID. FIDs allow a diverse lexicon of one-handed and two-handed interaction techniques, including folding, bending, flipping and stacking. We show how these can be used not only to ease the creation of simple 3D models, but also for tasks such as page navigation.

hand-held device

handheld device

In Proceedings of UIST 2002

WebThumb: interaction techniques for small-screen browsers (p. 205-208)

input device

In Proceedings of UIST 1992

A testbed for characterizing dynamic response of virtual environment spatial sensors (p. 15-22)

In Proceedings of UIST 1997

Pick-and-drop: a direct manipulation technique for multiple computer environments (p. 31-39)

In Proceedings of UIST 1997

A finger-mounted, direct pointing device for mobile computing (p. 41-42)

In Proceedings of UIST 1997

The omni-directional treadmill: a locomotion device for virtual worlds (p. 213-221)

In Proceedings of UIST 1997

The metaDESK: models and prototypes for tangible user interfaces (p. 223-232)

In Proceedings of UIST 1998

Interaction and modeling techniques for desktop two-handed input (p. 49-58)

In Proceedings of UIST 1998

A user interface using fingerprint recognition: holding commands and data objects on fingers (p. 71-79)

In Proceedings of UIST 1999

The VideoMouse: a camera-based multi-degree-of-freedom input device (p. 103-112)

In Proceedings of UIST 1999

Real-world interaction using the FieldMouse (p. 113-119)

In Proceedings of UIST 2000

Sensing techniques for mobile interaction (p. 91-100)

In Proceedings of UIST 2000

ToolStone: effective use of the physical manipulation vocabularies of input devices (p. 109-117)

In Proceedings of UIST 2001

Empirical measurements of intrabody communication performance under varied physical configurations (p. 183-190)

In Proceedings of UIST 2001

Pop through mouse button interactions (p. 195-196)

In Proceedings of UIST 2003

Synchronous gestures for multiple persons and computers (p. 149-158)

In Proceedings of UIST 2003

VisionWand: interaction techniques for large displays using a passive wand tracked in 3D (p. 173-182)

In Proceedings of UIST 2003

PreSense: interaction techniques for finger sensing input devices (p. 203-212)

In Proceedings of UIST 2004

Using light emitting diode arrays as touch-sensitive input and output devices (p. 287-290)

In Proceedings of UIST 2006

Mobile interaction using paperweight metaphor (p. 111-114)

In Proceedings of UIST 2008

An exploration of pen rolling for pen-based interaction (p. 191-200)

Abstract

Current pen input mainly utilizes the position of the pen tip, and occasionally, a button press. Other possible device parameters, such as rolling the pen around its longitudinal axis, are rarely used. We explore pen rolling as a supporting input modality for pen-based interaction. Through two studies, we are able to determine 1) the parameters that separate intentional pen rolling for the purpose of interaction from incidental pen rolling caused by regular writing and drawing, and 2) the parameter range within which accurate and timely intentional pen rolling interactions can occur. Building on our experimental results, we present an exploration of the design space of rolling-based interaction techniques, which showcase three scenarios where pen rolling interactions can be useful: enhanced stimulus-response compatibility in rotation tasks [7], multi-parameter input, and simplified mode selection.
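The separation of intentional from incidental rolling described above can be pictured as a simple threshold test. This is a hedged sketch: the threshold values below are invented placeholders, whereas the paper derives the real parameters empirically:

```python
# Sketch of classifying a pen roll as intentional vs. incidental.
# The thresholds are illustrative assumptions, not the paper's measured values.

def is_intentional_roll(angle_deg, duration_ms,
                        min_angle=10.0, max_duration_ms=2000.0):
    """Treat a roll as intentional when the rotation is large enough and
    completed quickly enough; small or slow drift counts as incidental."""
    return abs(angle_deg) >= min_angle and duration_ms <= max_duration_ms

print(is_intentional_roll(45.0, 400))   # deliberate roll -> True
print(is_intentional_roll(3.0, 1500))   # drift while writing -> False
```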

In Proceedings of UIST 2009

Mouse 2.0: multi-touch meets the mouse (p. 33-42)

Abstract

In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementations of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study.

In Proceedings of UIST 2009

Disappearing mobile devices (p. 101-110)

Abstract

In this paper, we extrapolate the evolution of mobile devices in one specific direction, namely miniaturization. While we maintain the concept of a device that people are aware of and interact with intentionally, we envision that this concept can become small enough to allow invisible integration into arbitrary surfaces or human skin, and thus truly ubiquitous use. This outcome assumed, we investigate what technology would be most likely to provide the basis for these devices, what abilities such devices can be expected to have, and whether or not devices that size can still allow for meaningful interaction. We survey candidate technologies, drill down on gesture-based interaction, and demonstrate how it can be adapted to the desired form factors. While the resulting devices offer only the bare minimum in feedback and only the most basic interactions, we demonstrate that simple applications remain possible. We complete our exploration with two studies in which we investigate the affordance of these devices more concretely, namely marking and text entry using a gesture alphabet.

In Proceedings of UIST 2010

MAI painting brush: an interactive device that realizes the feeling of real painting (p. 97-100)

Abstract

Many digital painting systems have been proposed and their quality is improving. In these systems, graphics tablets are widely used as input devices. However, because of its rigid nib and indirect manipulation, the operational feeling of a graphics tablet is different from that of a real paint brush. We solved this problem by developing the MR-based Artistic Interactive (MAI) Painting Brush, which imitates a real paint brush, and constructed a mixed reality (MR) painting system that enables direct painting on physical objects in the real world.

input output device

In Proceedings of UIST 2002

TiltType: accelerometer-supported text entry for very small devices (p. 201-204)

input technique and device

In Proceedings of UIST 2006

Camera phone based motion sensing: interaction techniques, applications and performance study (p. 101-110)

interacting with a group of device

In Proceedings of UIST 2005

Dial and see: tackling the voice menu navigation problem with cross-device user experience integration (p. 187-190)

interaction device

In Proceedings of UIST 1997

A finger-mounted, direct pointing device for mobile computing (p. 41-42)

mobile device

In Proceedings of UIST 2000

Sensing techniques for mobile interaction (p. 91-100)

In Proceedings of UIST 2000

The metropolis keyboard - an exploration of quantitative techniques for virtual keyboard design (p. 119-128)

In Proceedings of UIST 2001

Toward more sensitive mobile phones (p. 191-192)

In Proceedings of UIST 2002

TiltType: accelerometer-supported text entry for very small devices (p. 201-204)

In Proceedings of UIST 2006

Camera phone based motion sensing: interaction techniques, applications and performance study (p. 101-110)

In Proceedings of UIST 2006

Mobile interaction using paperweight metaphor (p. 111-114)

In Proceedings of UIST 2008

Scratch input: creating large, inexpensive, unpowered and mobile finger input surfaces (p. 205-208)

Abstract

We present Scratch Input, an acoustic-based input technique that relies on the unique sound produced when a fingernail is dragged over the surface of a textured material, such as wood, fabric, or wall paint. We employ a simple sensor that can be easily coupled with existing surfaces, such as walls and tables, turning them into large, unpowered and ad hoc finger input surfaces. Our sensor is sufficiently small that it could be incorporated into a mobile device, allowing any suitable surface on which it rests to be appropriated as a gestural input surface. Several example applications were developed to demonstrate possible interactions. We conclude with a study that shows users can perform six Scratch Input gestures at about 90% accuracy with less than five minutes of training and on a wide variety of surfaces.

In Proceedings of UIST 2008

Lightweight material detection for placement-aware mobile computing (p. 279-282)

Abstract

Numerous methods have been proposed that allow mobile devices to determine where they are located (e.g., home or office) and in some cases, predict what activity the user is currently engaged in (e.g., walking, sitting, or driving). While useful, this sensing currently only tells part of a much richer story. To allow devices to act most appropriately to the situation they are in, it would also be very helpful to know about their placement - for example whether they are sitting on a desk, hidden in a drawer, placed in a pocket, or held in one's hand - as different device behaviors may be called for in each of these situations. In this paper, we describe a simple, small, and inexpensive multispectral optical sensor for identifying materials in proximity to a device. This information can be used in concert with, e.g., location information to estimate, for example, that the device is "sitting on the desk at home", or "in the pocket at work". This paper discusses several potential uses of this technology, as well as results from a two-part study, which indicates that this technique can detect placement at 94.4% accuracy with real-world placement sets.
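The material-identification step can be pictured as nearest-neighbour matching of a sensor reading against stored reflectance signatures. This is a hedged sketch: the signature values and material names below are invented for illustration, not the paper's measured data:

```python
# Sketch of placement classification from multispectral readings.
# Signatures are made-up per-wavelength reflectance values (assumptions).

SIGNATURES = {
    "desk (wood)":     [0.8, 0.6, 0.5],
    "pocket (fabric)": [0.2, 0.3, 0.3],
    "hand (skin)":     [0.5, 0.4, 0.6],
}

def classify(reading):
    """Return the material whose signature is closest in squared distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(SIGNATURES, key=lambda m: dist(SIGNATURES[m], reading))

print(classify([0.78, 0.58, 0.52]))  # -> 'desk (wood)'
```

In the paper's framing, the predicted material would then be combined with location context to produce estimates such as "sitting on the desk at home".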

In Proceedings of UIST 2009

TapSongs: tapping rhythm-based passwords on a single binary sensor (p. 93-96)

Abstract

TapSongs are presented, which enable user authentication on a single "binary" sensor (e.g., button) by matching the rhythm of tap down/up events to a jingle timing model created by the user. We describe our matching algorithm, which employs absolute match criteria and learns from successful logins. We also present a study of 10 subjects showing that after they created their own TapSong models from 12 examples (< 2 minutes), their subsequent login attempts were 83.2% successful. Furthermore, aural and visual eavesdropping of the experimenter's logins resulted in only 10.7% successful imposter logins by subjects. Even when subjects heard the target jingles played by a synthesized piano, they were only 19.4% successful logging in as imposters. These results are attributable to subtle but reliable individual differences in people's tapping, which are supported by prior findings in music psychology.
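The rhythm-matching idea can be sketched as comparing normalized inter-tap intervals against a stored timing model. This is a hedged illustration only: the tolerance rule and example rhythm are assumptions, not the paper's actual match criteria or learning scheme:

```python
# Sketch of rhythm-based matching in the spirit of TapSongs (not the authors' code).

def normalize(intervals):
    """Scale inter-tap intervals to sum to 1, removing overall tempo."""
    total = sum(intervals)
    return [i / total for i in intervals]

def matches(model_intervals, attempt_intervals, tolerance=0.25):
    """Accept when every normalized interval is within a relative tolerance
    of the corresponding model interval."""
    if len(model_intervals) != len(attempt_intervals):
        return False
    m, a = normalize(model_intervals), normalize(attempt_intervals)
    return all(abs(x - y) <= tolerance * x for x, y in zip(m, a))

model = [250, 125, 125, 250, 250, 500]            # stored jingle, in ms
print(matches(model, [260, 120, 130, 240, 255, 490]))  # close rhythm -> True
print(matches(model, [250, 250, 250, 250, 250, 250]))  # flat rhythm  -> False
```

Normalizing by total duration makes the check tempo-invariant, which fits the abstract's emphasis on rhythm rather than absolute speed.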

In Proceedings of UIST 2009

Disappearing mobile devices (p. 101-110)


In Proceedings of UIST 2009

SemFeel: a user interface with semantic tactile feedback for mobile touch-screen devices (p. 111-120)

Abstract

One of the challenges with using mobile touch-screen devices is that they do not provide tactile feedback to the user. Thus, the user is required to look at the screen to interact with these devices. In this paper, we present SemFeel, a tactile feedback system which informs the user about the presence of an object where she touches on the screen and can offer additional semantic information about that item. Through multiple vibration motors that we attached to the backside of a mobile touch-screen device, SemFeel can generate different patterns of vibration, such as ones that flow from right to left or from top to bottom, to help the user interact with a mobile device. Through two user studies, we show that users can distinguish ten different patterns, including linear patterns and a circular pattern, at approximately 90% accuracy, and that SemFeel supports accurate eyes-free interactions.

In Proceedings of UIST 2009

Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices (p. 121-124)

Abstract

We present Abracadabra, a magnetically driven input technique that offers users wireless, unpowered, high fidelity finger input for mobile devices with very small screens. By extending the input area to many times the size of the device's screen, our approach is able to offer a high C-D gain, enabling fine motor control. Additionally, screen occlusion can be reduced by moving interaction off of the display and into unused space around the device. We discuss several example applications as a proof of concept. Finally, results from our user study indicate radial targets as small as 16 degrees can achieve greater than 92% selection accuracy, outperforming comparable radial, touch-based finger input.
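The radial-target selection can be pictured as mapping a sensed finger angle around the device to one of n equal sectors. This is a hedged sketch: the 2D-position-to-sector mapping below is an assumption for illustration; 16-degree targets correspond to roughly 22 sectors:

```python
# Sketch of radial target selection around a small device (illustrative only).
import math

def select_target(x, y, n_targets):
    """Map a finger position (derived from magnetic sensing) to a sector index."""
    angle = math.degrees(math.atan2(y, x)) % 360.0
    return int(angle // (360.0 / n_targets))

# 22 targets of ~16.4 degrees each, close to the 16-degree targets in the study
print(select_target(1.0, 0.0, 22))   # angle 0   -> target 0
print(select_target(0.0, 1.0, 22))   # angle 90  -> target 5
```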

In Proceedings of UIST 2010

Sensing foot gestures from the pocket (p. 199-208)

Abstract

Visually demanding interfaces on a mobile phone can diminish the user experience by monopolizing the user's attention when they are focusing on another task and impede accessibility for visually impaired users. Because mobile devices are often located in pockets when users are mobile, explicit foot movements can be defined as eyes-and-hands-free input gestures for interacting with the device. In this work, we study the human capability associated with performing foot-based interactions which involve lifting and rotation of the foot when pivoting on the toe and heel. Building upon these results, we then developed a system to learn and recognize foot gestures using a single commodity mobile phone placed in the user's pocket or in a holster on their hip. Our system uses acceleration data recorded by a built-in accelerometer on the mobile device and a machine learning approach to recognizing gestures. Through a lab study, we demonstrate that our system can classify ten different foot gestures at approximately 86% accuracy.

mobile device and interface

In Proceedings of UIST 2002

Ambient touch: designing tactile interfaces for handheld devices (p. 51-60)

mobile device interaction

In Proceedings of UIST 2008

SideSight: multi-"touch" interaction around small devices (p. 201-204)

Abstract

Interacting with mobile devices using touch can lead to fingers occluding valuable screen real estate. For the smallest devices, the idea of using a touch-enabled display is almost wholly impractical. In this paper we investigate sensing user touch around small screens like these. We describe a prototype device with infra-red (IR) proximity sensors embedded along each side and capable of detecting the presence and position of fingers in the adjacent regions. When this device is rested on a flat surface, such as a table or desk, the user can carry out single and multi-touch gestures using the space around the device. This gives a larger input space than would otherwise be possible which may be used in conjunction with or instead of on-display touch input. Following a detailed description of our prototype, we discuss some of the interactions it affords.

pen input device

In Proceedings of UIST 2006

Multi-layer interaction for digital tables (p. 269-272)

personal device

In Proceedings of UIST 2010

PhoneTouch: a technique for direct phone interaction on surfaces (p. 13-16)

Abstract

PhoneTouch is a novel technique for integration of mobile phones and interactive surfaces. The technique enables use of phones to select targets on the surface by direct touch, facilitating for instance pick&drop-style transfer of objects between phone and surface. The technique is based on separate detection of phone touch events by the surface, which determines location of the touch, and by the phone, which contributes device identity. The device-level observations are merged based on correlation in time. We describe a proof-of-concept implementation of the technique, using vision for touch detection on the surface (including discrimination of finger versus phone touch) and acceleration features for detection by the phone.
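The time-correlation merge can be sketched as pairing surface events (which carry a location) with phone events (which carry a device identity) whose timestamps fall within a small window. This is a hedged sketch: the 50 ms window and event format are assumptions, not the paper's parameters:

```python
# Sketch of merging surface and phone observations by time correlation.

def pair_events(surface_events, phone_events, window_ms=50):
    """surface_events: list of (timestamp_ms, (x, y));
    phone_events: list of (timestamp_ms, device_id).
    Returns (device_id, (x, y)) pairs whose timestamps are within window_ms."""
    pairs = []
    for t_s, pos in surface_events:
        for t_p, dev in phone_events:
            if abs(t_s - t_p) <= window_ms:
                pairs.append((dev, pos))
    return pairs

surface = [(1000, (120, 80)), (2500, (300, 210))]
phones = [(1012, "phoneA"), (2490, "phoneB")]
print(pair_events(surface, phones))
# [('phoneA', (120, 80)), ('phoneB', (300, 210))]
```

Pairing by time is what lets the surface contribute "where" and the phone contribute "which device" without any direct communication at touch time.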

pointing device

In Proceedings of UIST 2003

Considering the direction of cursor movement for efficient traversal of cascading menus (p. 91-94)

In Proceedings of UIST 2006

Soap: a pointing device that works in mid-air (p. 43-46)

portable device

In Proceedings of UIST 1999

Generalized and stationary scrolling (p. 1-9)

reconfigurable input device

In Proceedings of UIST 2009

A reconfigurable ferromagnetic input device (p. 51-54)

Abstract

We present a novel hardware device based on ferromagnetic sensing, capable of detecting the presence, position and deformation of any ferrous object placed on or near its surface. These objects can include ball bearings, magnets, iron filings, and soft malleable bladders filled with ferrofluid. Our technology can be used to build reconfigurable input devices -- where the physical form of the input device can be assembled using combinations of such ferrous objects. This allows users to rapidly construct new forms of input device, such as a trackball-style device based on a single large ball bearing, tangible mixers based on a collection of sliders and buttons with ferrous components, and multi-touch malleable surfaces using a ferrofluid bladder. We discuss the implementation of our technology, its strengths and limitations, and potential application scenarios.

relative pointing device

In Proceedings of UIST 2008

Kinematic templates: end-user tools for content-relative cursor manipulations (p. 47-56)

Abstract

This paper introduces kinematic templates, an end-user tool for defining content-specific motor space manipulations in the context of editing 2D visual compositions. As an example, a user can choose the "sandpaper" template to define areas within a drawing where cursor movement should slow down. Our current implementation provides templates that amplify or dampen the cursor's speed, attenuate jitter in a user's movement, guide movement along paths, and add forces to the cursor. Multiple kinematic templates can be defined within a document, with overlapping templates resulting in a form of function composition. A template's strength can also be varied, enabling one to improve one's strokes without losing the human element. Since kinematic templates guide movements, rather than strictly prescribe them, they constitute a visual composition aid that lies between unaided freehand drawing and rigid drawing aids such as snapping guides, masks, and perfect geometric primitives.
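The "sandpaper" template can be pictured as a position-dependent gain applied to cursor deltas. This is a minimal hedged sketch; the gain value and the rectangle-based API are invented for illustration:

```python
# Sketch of a kinematic-template-style gain field (illustrative only).

def make_sandpaper(rect, gain=0.25):
    """rect = (x0, y0, x1, y1); returns a function (pos, delta) -> delta
    that damps cursor movement inside the rectangle."""
    x0, y0, x1, y1 = rect
    def template(pos, delta):
        x, y = pos
        if x0 <= x <= x1 and y0 <= y <= y1:
            return (delta[0] * gain, delta[1] * gain)
        return delta
    return template

sandpaper = make_sandpaper((100, 100, 200, 200), gain=0.25)
print(sandpaper((150, 150), (8.0, 4.0)))  # inside: slowed to (2.0, 1.0)
print(sandpaper((50, 50), (8.0, 4.0)))    # outside: unchanged (8.0, 4.0)
```

Composing several such functions over the same document would give the overlap behaviour the abstract describes as a form of function composition.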

small screen device

In Proceedings of UIST 2004

Collapse-to-zoom: viewing web pages on small screen devices by interactively removing irrelevant content (p. 91-94)

spatially aware device

In Proceedings of UIST 2009

Virtual shelves: interactions with orientation aware devices (p. 125-128)

Abstract

Triggering shortcuts or actions on a mobile device often requires a long sequence of key presses. Because the functions of buttons are highly dependent on the current application's context, users are required to look at the display during interaction, even in many mobile situations when eyes-free interactions may be preferable. We present Virtual Shelves, a technique to trigger programmable shortcuts that leverages the user's spatial awareness and kinesthetic memory. With Virtual Shelves, the user triggers shortcuts by orienting a spatially-aware mobile device within the circular hemisphere in front of her. This space is segmented into definable and selectable regions along the phi and theta planes. We show that users can accurately point to 7 regions on the theta and 4 regions on the phi plane using only their kinesthetic memory. Building upon these results, we then evaluate a proof-of-concept prototype of the Virtual Shelves using a Nokia N93. The results show that Virtual Shelves is faster than the N93's native interface for common mobile phone tasks.
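Mapping a device orientation to one of the regions can be sketched as quantizing theta and phi. The region counts come from the abstract (7 along theta, 4 along phi); the angular ranges below are assumptions for illustration:

```python
# Sketch of quantizing orientation into Virtual Shelves-style regions.
# Region counts from the abstract; angular ranges are assumed.

THETA_REGIONS, PHI_REGIONS = 7, 4
THETA_RANGE = (-90.0, 90.0)   # degrees, left to right across the hemisphere
PHI_RANGE = (0.0, 90.0)       # degrees, low to high

def region_for(theta, phi):
    """Return (theta_index, phi_index) for an orientation in the hemisphere."""
    def index(value, lo, hi, n):
        i = int((value - lo) / (hi - lo) * n)
        return min(max(i, 0), n - 1)   # clamp to a valid region
    return (index(theta, *THETA_RANGE, THETA_REGIONS),
            index(phi, *PHI_RANGE, PHI_REGIONS))

print(region_for(0.0, 45.0))   # centre of the hemisphere -> (3, 2)
```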

transparent device

In Proceedings of UIST 2007

Lucid touch: a see-through mobile device (p. 269-278)

Abstract

Touch is a compelling input modality for interactive devices; however, touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with. In this paper, we present LucidTouch, a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device. The key to making this usable is what we call pseudo-transparency: by overlaying an image of the user's hands onto the screen, we create the illusion of the mobile device itself being semi-transparent. This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand. LucidTouch also supports multi-touch input, allowing users to operate the device simultaneously with all 10 fingers. We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front, due to reduced occlusion, higher precision, and the ability to make multi-finger input.

virtual device interface