

We present the satellite cursor, a novel technique that uses multiple cursors to improve pointing performance by reducing input movement. The satellite cursor associates every target with a separate cursor in its vicinity for pointing, which realizes MAGIC (manual and gaze input cascaded) pointing without gaze tracking. We discuss the problem of visual clutter caused by multiple cursors and propose several designs to mitigate it. Two controlled experiments evaluated satellite cursor performance in a simple reciprocal pointing task and in a complex task with multiple targets at varying layout densities. Results show that the satellite cursor significantly reduces mouse movement and, consequently, pointing time, especially for sparse target layouts, and that its performance can be accurately modeled by Fitts' Law.
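The Fitts' Law modeling mentioned above can be illustrated with a short sketch. The Shannon formulation predicts movement time as MT = a + b · log2(D/W + 1); the intercept `a` and slope `b` below are illustrative placeholders (real values are fit by regression on pointing data), but the sketch shows why shortening the effective distance, as a nearby satellite cursor does, lowers the predicted time:

```python
import math

def fitts_mt(distance, width, a=0.1, b=0.15):
    """Predicted movement time (s), Shannon form of Fitts' Law:
    MT = a + b * log2(D/W + 1).
    a, b are placeholder constants for illustration only."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# A satellite cursor starts near its target, shrinking the effective
# distance D and hence the index of difficulty and predicted time.
direct = fitts_mt(distance=800, width=40)     # cursor far from target
satellite = fitts_mt(distance=100, width=40)  # cursor already nearby
```

Here `satellite < direct`: the index of difficulty drops from log2(21) ≈ 4.39 bits to log2(3.5) ≈ 1.81 bits, which is the mechanism behind the reported movement and time savings.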

Most of today's GUIs are designed for the typical, able-bodied user; atypical users are, for the most part, left to adapt as best they can, perhaps using specialized assistive technologies as an aid. In this paper, we present an alternative approach: SUPPLE++ automatically generates interfaces that are tailored to an individual's motor capabilities and can be easily adjusted to accommodate varying vision capabilities. SUPPLE++ models users' motor capabilities based on a one-time motor performance test and uses this model in an optimization process to generate a personalized interface. A preliminary study indicates that while there is still room for improvement, SUPPLE++ allowed one user to complete tasks that she could not perform using a standard interface, while for the remaining users it resulted in an average time savings of 20%, ranging from a slowdown of 3% to a speedup of 43%.

One of the challenges with mobile touch-screen devices is that they do not provide tactile feedback, so the user must look at the screen to interact with them. In this paper, we present SemFeel, a tactile feedback system that informs the user of the presence of an object at the position she touches on the screen and can offer additional semantic information about that item. Through multiple vibration motors attached to the back of a mobile touch-screen device, SemFeel can generate different patterns of vibration, such as ones that flow from right to left or from top to bottom, to help the user interact with a mobile device. Through two user studies, we show that users can distinguish ten different patterns, including linear patterns and a circular pattern, at approximately 90% accuracy, and that SemFeel supports accurate eyes-free interactions.
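A "flowing" pattern like the ones described above can be produced by pulsing spatially arranged motors in sequence. The sketch below is a minimal illustration, not SemFeel's actual implementation: the motor names and the `pulse_activate` driver callback are hypothetical stand-ins for a real hardware interface.

```python
import time

# Hypothetical motor layout on the back of the device, ordered so that
# pulsing them in sequence reads as a right-to-left flow.
FLOW_RIGHT_TO_LEFT = ["right", "center", "left"]

def play_flow(pulse_activate, motors=FLOW_RIGHT_TO_LEFT,
              pulse_ms=80, gap_ms=20):
    """Pulse each motor in turn so the vibration appears to travel
    across the device. pulse_activate(motor, ms) is a placeholder for
    the real motor driver; durations are illustrative."""
    for motor in motors:
        pulse_activate(motor, pulse_ms)
        time.sleep(gap_ms / 1000)

# Usage with a stub driver that records the pulses instead of vibrating:
log = []
play_flow(lambda m, ms: log.append((m, ms)), gap_ms=0)
```

Distinguishable patterns then differ only in motor ordering and timing, e.g. a top-to-bottom list or a loop of edge motors for the circular pattern.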