Research at the HIT Lab
Reality is what we experience in our daily lives. It is the world we are familiar with. Our experiences in the real world, collected in our memory, are what set most of our expectations for new experiences. Examples include simple things like walking down the street, having a coffee, arguing with a friend, or surfing the Internet. The real world, however, is just one of many realities we can experience.
In their seminal 1994 paper, Paul Milgram and Fumio Kishino formulated the notion of the "Reality-Virtuality Continuum" [1,2]. This work gave us, for the first time, a way of talking about the relationship between (real) reality, virtual reality, and augmented reality as one of increasing immersion in the virtual, combined with decreasing connection to the real.
Our work at the HIT Lab NZ is focused on providing people with technological support for experiencing various realities to enhance work and daily life. We put people first, looking at the tasks they are trying to accomplish, and then adding an appropriate mix of technology to support these tasks within a given environment. In some cases, the "appropriate mix" may be no technology at all, though most of what we are interested in involves at least some combination of real and virtual content — somewhere to the right of the Real Environment on the continuum.
Human-Robot Interaction (HRI) involves people interacting with physical entities that have the ability to sense, reason about, and act on the real world. While there are many aspects to HRI, we can view this interaction as sitting just to the right of the Real Environment in the figure above. Examples include interacting with robots face-to-face in the home or in shops, or tele-operating remote search-and-rescue robots.
Augmented Reality (AR) is the merging of digital content into the otherwise real world. AR adds synthetically generated information to our perception within the context of the real world. Examples include so-called Smart Glasses which overlay contextual graphics registered with real-world objects, such as graphical call-outs indicating points of interest (e.g., a cafe) in the direction the user is looking, or mobile-phone-based AR for historical tour guiding.
Augmented Virtuality (AV) is the merging of real-world content into an otherwise virtual world. AV adds information captured from the real world to our perception within the context of a virtual world. Examples include real-world actors blended into virtual sets (e.g., the weatherman effect), or live scans of nearby real objects displayed within the user's virtual environment.
The main difference between AR and AV is the notion of the “primary world.” In AR, the real world is the primary world, while in AV the virtual world is the primary world.
Virtual Reality (VR) sits at the extreme right end of the Reality-Virtuality Continuum. In VR, everything the user sees, hears, touches, and interacts with is virtual. The user is completely immersed in the virtual world, just as a person in a swimming pool is completely immersed in water, with all the senses stimulated through that surrounding medium.
The HIT Lab NZ has deep expertise in applying technology across the entire Reality-Virtuality Continuum. We draw on this deep, human-centered understanding to design effective systems for work and play.
[1] P. Milgram, F. Kishino, "A taxonomy of mixed reality visual displays", IEICE (Institute of Electronics, Information and Communication Engineers) Transactions on Information and Systems, Special Issue on Networked Reality, Dec. 1994.
[2] P. Milgram, H. Takemura, A. Utsumi, F. Kishino, Proc. SPIE 2351, Telemanipulator and Telepresence Technologies, 282 (December 21, 1995), doi:10.1117/12.197321.