Keywords
UIST2.0 Archive - 20 years of UIST

hardware

In Proceedings of UIST 1996

The Lego interface toolkit (p. 97-98)

In Proceedings of UIST 2001

Empirical measurements of intrabody communication performance under varied physical configurations (p. 183-190)

In Proceedings of UIST 2006

Soap: a pointing device that works in mid-air (p. 43-46)

In Proceedings of UIST 2008

Going beyond the display: a surface technology with an electronically switchable diffuser (p. 269-278)

Abstract

We introduce a new type of interactive surface technology based on a switchable projection screen which can be made diffuse or clear under electronic control. The screen can be continuously switched between these two states so quickly that the change is imperceptible to the human eye. It is then possible to rear-project what is perceived as a stable image onto the display surface, when the screen is in fact transparent for half the time. The clear periods may be used to project a second, different image through the display onto objects held above the surface. At the same time, a camera mounted behind the screen can see out into the environment. We explore some of the possibilities this type of screen technology affords, allowing surface computing interactions to extend 'beyond the display'. We present a single self-contained system that combines these off-screen interactions with more typical multi-touch and tangible surface interactions. We describe the technical challenges in realizing our system, with the aim of allowing others to experiment with these new forms of interactive surfaces.
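
As a rough illustration of the timing described above, the sketch below (in Python, with entirely hypothetical screen, projector and camera objects) alternates the surface between its diffuse and clear states and schedules projection and capture in each half-cycle; the switching rate and device API are assumptions, not details from the paper.

    import time

    SWITCH_HZ = 120                      # assumed switching rate, fast enough to look stable
    HALF_PERIOD = 1.0 / SWITCH_HZ / 2.0

    def run_frame_loop(screen, projector, camera, surface_image, through_image):
        """Alternate diffuse/clear screen states and synchronise projection and capture."""
        while True:
            # Diffuse half-cycle: the surface scatters light, so project the image
            # the user should perceive as a stable on-screen picture.
            screen.set_diffuse(True)
            projector.show(surface_image)
            time.sleep(HALF_PERIOD)

            # Clear half-cycle: the surface is transparent, so project a second image
            # through it onto objects above, and let the rear-mounted camera see out.
            screen.set_diffuse(False)
            projector.show(through_image)
            camera.trigger_exposure()
            time.sleep(HALF_PERIOD)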

novel hardware

In Proceedings of UIST 2007

ThinSight: versatile multi-touch sensing for thin form-factor displays (p. 259-268)

Abstract

ThinSight is a novel optical sensing system, fully integrated into a thin form factor display, capable of detecting multiple fingers placed on or near the display surface. We describe this new hardware in detail, and demonstrate how it can be embedded behind a regular LCD, allowing sensing without degradation of display capability. With our approach, fingertips and hands are clearly identifiable through the display. The approach of optical sensing also opens up the exciting possibility for detecting other physical objects and visual markers through the display, and some initial experiments are described. We also discuss other novel capabilities of our system: interaction at a distance using IR pointing devices, and IR-based communication with other electronic devices through the display. A major advantage of ThinSight over existing camera and projector based optical systems is its compact, thin form factor, making such systems even more deployable. We therefore envisage using ThinSight to capture rich sensor data through the display, which can be processed using computer vision techniques to enable both multi-touch and tangible interaction.
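
The sensing pipeline such hardware implies can be sketched roughly as follows: the display yields a low-resolution grid of IR reflectance values, which is thresholded and segmented into fingertip blobs. This is an illustrative Python sketch using standard image-processing steps; the sensor grid, threshold and read-out are assumptions rather than the paper's actual implementation.

    import numpy as np
    from scipy import ndimage

    TOUCH_THRESHOLD = 0.6   # assumed normalised reflectance level indicating a fingertip

    def detect_touches(ir_frame):
        """Return (row, col) centroids of bright blobs in a normalised IR frame."""
        mask = ir_frame > TOUCH_THRESHOLD               # pixels bright enough to be a fingertip
        labels, n_blobs = ndimage.label(mask)           # connected-component labelling
        return ndimage.center_of_mass(ir_frame, labels, range(1, n_blobs + 1))

    # Example: a fake 8x8 sensor frame with two fingertips on the surface.
    frame = np.zeros((8, 8))
    frame[1:3, 1:3] = 0.9
    frame[5:7, 4:6] = 0.8
    print(detect_touches(frame))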

In Proceedings of UIST 2008

SideSight: multi-"touch" interaction around small devices (p. 201-204)

Abstract

Interacting with mobile devices using touch can lead to fingers occluding valuable screen real estate. For the smallest devices, the idea of using a touch-enabled display is almost wholly impractical. In this paper we investigate sensing user touch around small screens like these. We describe a prototype device with infra-red (IR) proximity sensors embedded along each side, capable of detecting the presence and position of fingers in the adjacent regions. When this device is rested on a flat surface, such as a table or desk, the user can carry out single and multi-touch gestures using the space around the device. This gives a larger input space than would otherwise be possible, which may be used in conjunction with or instead of on-display touch input. Following a detailed description of our prototype, we discuss some of the interactions it affords.
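
One way to picture how a strip of proximity sensors becomes a touch position is the small Python sketch below: it takes one edge's readings and estimates where along that edge a finger sits using a weighted centroid. The sensor spacing, threshold and single-finger assumption are illustrative, not the prototype's calibration.

    SENSOR_PITCH_MM = 10.0        # assumed spacing between adjacent IR sensors
    PRESENCE_THRESHOLD = 0.2      # assumed minimum reading meaning "a finger is present"

    def finger_position(readings):
        """Estimate (offset along the edge in mm, signal strength) from one edge's readings."""
        if not readings or max(readings) < PRESENCE_THRESHOLD:
            return None                                   # nothing close enough to the edge
        total = sum(readings)
        centroid = sum(i * r for i, r in enumerate(readings)) / total
        return centroid * SENSOR_PITCH_MM, max(readings)

    # Example: a finger hovering nearest the third sensor on the right-hand edge.
    right_edge = [0.05, 0.1, 0.7, 0.3, 0.05, 0.0, 0.0, 0.0, 0.0, 0.0]
    print(finger_position(right_edge))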

In Proceedings of UIST 2009

Mouse 2.0: multi-touch meets the mouse (p. 33-42)

Abstract

In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementations of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study.
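
To make the combination concrete, here is a toy Python sketch of the input model a multi-touch mouse implies: each sample carries both the usual relative pointer motion and a set of touch contacts on the device surface, so software can react to gestures alongside ordinary pointing. The data structures are illustrative and not an API from the paper.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class MultiTouchMouseSample:
        dx: int                                     # relative pointer motion, as with a normal mouse
        dy: int
        contacts: List[Tuple[float, float]] = field(default_factory=list)   # (x, y) touches on the shell

    def classify(sample):
        """Very rough split of samples into pointing versus gesture input."""
        if len(sample.contacts) >= 2:
            return "multi-touch gesture"            # e.g. a pinch or two-finger swipe
        if sample.contacts:
            return "touch-augmented pointing"
        return "plain pointing"

    print(classify(MultiTouchMouseSample(dx=3, dy=-1, contacts=[(0.2, 0.5), (0.6, 0.5)])))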