Keywords
UIST2.0 Archive - 20 years of UIST

mouse

In Proceedings of UIST 1999

Real-world interaction using the FieldMouse (p. 113-119)

In Proceedings of UIST 2001

Pop through mouse button interactions (p. 195-196)

In Proceedings of UIST 2005

Predictive interaction using the delphian desktop (p. 133-141)

In Proceedings of UIST 2006

Soap: a pointing device that works in mid-air (p. 43-46)

In Proceedings of UIST 2007

Dirty desktops: using a patina of magnetic mouse dust to make common interactor targets easier to select (p. 183-186)

Abstract

A common task in graphical user interfaces is controlling onscreen elements using a pointer. Current adaptive pointing techniques require applications to be built using accessibility libraries that reveal information about interactive targets, and most do not handle path/menu navigation. We present a pseudo-haptic technique that is OS and application independent, and can handle both dragging and clicking. We do this by associating a small force with each past click or drag. When a user frequently clicks in the same general area (e.g., on a button), the patina of past clicks naturally creates a pseudo-haptic magnetic field with an effect similar to that of snapping or sticky icons. Our contribution is a bottom-up approach to making targets easier to select without requiring prior knowledge of them.
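The mechanism described above lends itself to a compact sketch: deposit a small amount of "dust" at each past click, then damp pointer motion where dust has accumulated. The Python below is illustrative only; the class and method names (ClickPatina, adjust_motion) and the constants are assumptions for this sketch, not the paper's actual force model.

```python
from collections import defaultdict

class ClickPatina:
    """Illustrative sketch of a click-patina pseudo-haptic effect.

    Each past click deposits a small amount of 'dust' at its location.
    Pointer motion over accumulated dust is slightly attenuated, which
    approximates the sticky/snapping feel described in the abstract.
    (Hypothetical names and constants; not the paper's actual model.)
    """

    CELL = 8          # grid cell size in pixels for binning clicks
    MAX_DUST = 20.0   # cap so one hot spot cannot dominate

    def __init__(self):
        self.dust = defaultdict(float)  # (cell_x, cell_y) -> accumulated weight

    def record_click(self, x, y, weight=1.0):
        cell = (int(x) // self.CELL, int(y) // self.CELL)
        self.dust[cell] = min(self.dust[cell] + weight, self.MAX_DUST)

    def adjust_motion(self, x, y, dx, dy):
        """Scale a raw pointer delta (dx, dy) by local dust density.

        More dust under the pointer -> smaller effective motion, so the
        cursor lingers over frequently clicked targets.
        """
        cell = (int(x) // self.CELL, int(y) // self.CELL)
        density = self.dust.get(cell, 0.0) / self.MAX_DUST   # 0..1
        damping = 1.0 - 0.5 * density                        # never below 0.5
        return dx * damping, dy * damping


# Usage: feed clicks and pointer deltas from the OS event stream.
patina = ClickPatina()
patina.record_click(412, 306)                      # user clicks a toolbar button
print(patina.adjust_motion(414, 305, 10.0, -3.0))  # motion near that spot is damped
```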

In Proceedings of UIST 2009

Mouse 2.0: multi-touch meets the mouse (p. 33-42)

Abstract

In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementations of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study.

mouse input

In Proceedings of UIST 2008

OctoPocus: a dynamic guide for learning gesture-based command sets (p. 37-46)

Abstract

We describe OctoPocus, an example of a dynamic guide that combines on-screen feedforward and feedback to help users learn, execute and remember gesture sets. OctoPocus can be applied to a wide range of single-stroke gestures and recognition algorithms and helps users progress smoothly from novice to expert performance. We provide an analysis of the design space and describe the results of two experiments that show that OctoPocus is significantly faster and improves learning of arbitrary gestures, compared to conventional Help menus. It can also be adapted to a mark-based gesture set, significantly improving input time compared to a two-level, four-item Hierarchical Marking menu.
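The feedforward idea can be sketched as prefix matching: compare the partial stroke against each template's prefix, prune weak candidates, and show the remaining path of the survivors. The Python below is a minimal sketch under that assumption; the function names, scoring, and threshold are hypothetical and are not the OctoPocus algorithm itself.

```python
import math

def prefix_distance(stroke, template):
    """Mean point-to-point distance between the partial stroke and the
    template prefix of equal point count (both lists of (x, y) tuples,
    assumed to be sampled at similar spacing)."""
    n = min(len(stroke), len(template))
    if n == 0:
        return float("inf")
    return sum(math.dist(stroke[i], template[i]) for i in range(n)) / n

def update_guide(partial_stroke, templates, threshold=40.0):
    """Return candidate gestures still worth displaying, each with the part
    of its template the user has not yet drawn (the feedforward path)."""
    guide = []
    for name, points in templates.items():
        d = prefix_distance(partial_stroke, points)
        if d <= threshold:
            remaining = points[len(partial_stroke):]
            guide.append((name, d, remaining))
    return sorted(guide, key=lambda item: item[1])

# Usage with two toy templates sampled at the same point spacing.
templates = {
    "copy":  [(0, 0), (10, 0), (20, 0), (30, 0)],     # straight right
    "paste": [(0, 0), (10, 10), (20, 20), (30, 30)],  # diagonal
}
partial = [(0, 0), (9, 1)]
for name, score, rest in update_guide(partial, templates):
    print(name, round(score, 1), rest)
```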

spatial mouse

In Proceedings of UIST 2005

Circle & identify: interactivity-augmented object recognition for handheld devices (p. 107-110)