Director for Mixed Reality Research
Institute for Creative Technologies
USC School of Cinematic Arts Interactive Media Division
In the good old days, the human was here, the computer there, and a good living was to be made by designing ways to interface between the two. Now we find ourselves unthinkingly pinching to zoom in on a picture in a paper magazine. User interfaces are changing instinctual human behavior and instinctual human behavior is changing user interfaces. We point or look left in the "virtual" world just as we point or look left in the physical.
It is clear that nothing is clear anymore: the need for "interface" vanishes when the boundaries between the physical and the virtual disappear. We are at a watershed moment when to experience being human means to experience being machine. When there is no user interface - it is just what you do. When instinct supplants mice and menus and the interface insinuates itself into the human psyche.
We are redefining and creating what it means to be human in this new physical/virtual integrated reality - we are not just designing user interfaces, we are designing users.
Mark Bolas is the Director of the Mixed Reality Lab at the USC Institute for Creative Technologies and an Associate Professor in the Interactive Media & Games division of the School of Cinematic Arts, where he directs the Mixed Reality Studio. His work focuses on researching perception, agency, and intelligence - creating virtual environments and transducers that fully engage one's perception and cognition to create a visceral memory of the experience.
Bolas leads research projects for the Army Research Office, the Office of Naval Research, and DARPA, as well as a variety of other clients, including content for the entertainment industry. He has led the development of a number of influential products including the open-source FOV2GO, which informed the design of the Oculus Rift; the Wide-5 HMD; Pinch interface gloves; and the Boom and Molly telepresence system. Bolas' 1988-89 thesis work "Design and Virtual Environments" was the first effort to map the breadth of virtual reality as a new medium.
In addition to USC, he has taught at Stanford University and Keio University, where his projects explored tangible interfaces, context-sensitive audio interfaces, socially interactive toys, augmented reality, confocal and computational illumination, and mobile phone web logging.
Bolas co-founded Fakespace Labs, Inc. in 1988 and developed and sold VR hardware and systems for dozens of major research labs over the decades. He holds more than twenty patents and has been recognized with awards from the Consumer Electronics Association, Popular Science, SIGGRAPH Best Emerging Technology, IEEE's Industry Excellence, and IEEE's Virtual Reality Technical Achievement Award.
New representations of thought — written language, mathematical notation, information graphics, etc — have been responsible for some of the most significant leaps in the progress of civilization, by expanding humanity's collectively-thinkable territory.
But at debilitating cost. These representations, having been invented for static media such as paper, tap into a small subset of human capabilities and neglect the rest. Knowledge work means sitting at a desk, interpreting and manipulating symbols. The human body is reduced to an eye staring at tiny rectangles and fingers on a pen or keyboard.
Like any severely unbalanced way of living, this is crippling to mind and body. But less obviously, and more importantly, it is enormously wasteful of the vast human potential. Human beings naturally have many powerful modes of thinking and understanding. Most are incompatible with static media. In a culture that has contorted itself around the limitations of marks on paper, these modes are undeveloped, unrecognized, or scorned.
We are now seeing the start of a dynamic medium. To a large extent, people today are using this medium merely to emulate and extend static representations from the era of paper, and to further constrain the ways in which the human body can interact with external representations of thought.
But the dynamic medium offers the opportunity to deliberately invent a humane and empowering form of knowledge work. We can design dynamic representations which draw on the entire range of human capabilities — all senses, all forms of movement, all forms of understanding — instead of straining a few and atrophying the rest.
This talk suggests how each of the human activities in which thought is externalized (conversing, presenting, reading, writing, etc) can be redesigned around such representations.
Bret Victor has designed experimental UI concepts at Apple, interactive data graphics for Al Gore, and musical instruments at Alesis. He’s responsible for "Inventing on Principle", "Learnable Programming", "Media for Thinking the Unthinkable", "Up and Down the Ladder of Abstraction", and everything else at worrydream.com. He holds a BSEE from Caltech and MSEE from UC Berkeley.