Reality–virtuality continuum
The reality-virtuality continuum is a theoretical framework that describes the continuous scale between the completely virtual, a virtuality, and the completely real, reality. The reality-virtuality continuum therefore encompasses all possible variations and compositions of real and virtual objects. It has been described as a concept in new media and computer science.
The concept was first introduced in 1994 by Paul Milgram, a professor at the University of Toronto who pioneered wearable computing research.[1][2] Since the continuum's inception, scholars have argued that it should be updated to match the current state of wearable computing systems.
The area between the two extremes, where both the real and the virtual are mixed, is called mixed reality. This in turn is said to consist of both augmented reality, where the virtual augments the real, and augmented virtuality, where the real augments the virtual.[3] This continuum has been extended into a two-dimensional plane of virtuality and mediality.[4]
The development of virtual reality products has allowed society to integrate and apply AR and AV in industries such as education, e-commerce, entertainment, and gaming. Products such as VR headsets, haptic suits, digital measuring devices, green screens, and smart glasses allow users to interact with real and virtual objects simultaneously.
Components across the continuum
The reality-virtuality continuum is a continuous scale ranging from the real environment to the virtual environment. Mixed reality covers the variation between the two, based on the composition of real and virtual objects in a scene. Augmented reality (AR) and augmented virtuality (AV) are mixed realities on the continuum, positioned according to how closely their composition approaches either extreme.[1]
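A system's position on the continuum can be thought of as a function of how much of the scene is virtual. The sketch below is purely illustrative: Milgram's continuum defines no numeric thresholds, so the `virtual_fraction` scalar and the 0.5 cut-off between AR and AV are assumptions made for the example.

```python
def classify(virtual_fraction: float) -> str:
    """Map the share of virtual content in a scene (0.0-1.0) to a
    region of the reality-virtuality continuum. The thresholds are
    illustrative; the continuum itself defines no numeric cut-offs."""
    if not 0.0 <= virtual_fraction <= 1.0:
        raise ValueError("virtual_fraction must be between 0 and 1")
    if virtual_fraction == 0.0:
        return "real environment"      # one extreme of the continuum
    if virtual_fraction == 1.0:
        return "virtual environment"   # the other extreme
    # Everything strictly in between is mixed reality (MR).
    if virtual_fraction < 0.5:
        return "augmented reality"     # mostly real, some virtual overlays
    return "augmented virtuality"      # mostly virtual, some real elements

print(classify(0.2))  # → augmented reality
```

The endpoints are treated as exact values because, on the continuum, the real and virtual environments are the two extremes rather than regions.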
Real Environments
The real environment comprises the physical, tangible world, where all things occur naturally rather than being artificially made. Interactions between biotic factors, such as animals and plants, and abiotic factors, such as water, air, and natural resources, are naturally occurring and physical.[5] In contrast, the virtual environment synthetically builds virtual content through computers. The real environment includes the interaction of individuals with its physical aspects.
Virtual Environments
A virtual environment simulates aspects of human sensory experience using digital information delivered through computer networks.[3] Users interact with computer environments such as email, chat logs, and document sharing through the web at a surface level.[3] Virtual environments can progress to immersive virtual environments (IVEs) when individuals enter a psychological state of perceiving themselves as existing within the virtual environment.[3]
Mixed Reality
Mixed reality is the application of technology that merges the virtual and real environments by simulating physical environments in the digital world. Human senses such as sight, hearing, and touch can be replicated by technology to reproduce real-world occurrences. An individual's perception of reality shifts based on the composition of real and virtual objects present. Virtual content blends the real and virtual environments by adapting and responding to user interactions.[3]
Augmented Reality
Augmented reality (AR) applies digital information onto the real world, enhancing human perception of physical surroundings overlaid with digital content. AR is positioned towards the real-environment end of the continuum, as virtual objects are integrated into the real environment. The main focus of AR applications is to enhance and augment the real environment with virtual objects. Virtual objects are spatially registered to the real world, so users can see and interact with the real environment continuously.[3]
Augmented Virtuality
Augmented virtuality (AV) integrates real-world objects into the virtual environment. AV is positioned towards the virtual-environment end of the continuum, as real-world elements are integrated into the virtual environment. AV focuses on bringing a sense of the real world into a simulated space with digital applications. Real objects are integrated into the virtual environment, where users can see and interact with it continuously. In contrast to AR, AV focuses on embedding live real-environment content into the virtual space.[3]
History
Predating the Continuum
Early characterizations of technologies within the continuum date back to the 1970s, when Myron Krueger introduced the terms "artificial reality" and "videoplace". While earning his computer science Ph.D. at the University of Wisconsin–Madison, his early interactive computer graphics work marked him as a first-generation researcher in the space. In the 1980s Jaron Lanier, often called the "godfather of VR", founded VPL Research, one of the earliest companies to develop and sell VR products.[6][7] During his twenties he gave a widely distributed interview with Whole Earth Review, in which he shared ideas about possible applications for virtual reality and dissected a framework for it.[8]
Origins of the Reality–Virtuality Continuum
In 1994, Paul Milgram, Haruo Takemura, Akira Utsumi and Fumio Kishino published "Augmented Reality: A class of displays on the reality-virtuality continuum". The article coined the term "reality-virtuality continuum", defining it as a continuous line whose endpoints are the real environment and the virtual environment. It categorized existing mixed reality hardware within the continuum using a three-dimensional framework of extent of world knowledge, reproduction fidelity, and extent of presence.[9]
Applications
[edit ]AR Technology
Applications of hardware that mixes real and virtual environments, including augmented reality (AR) technologies, can begin inside classrooms.[10] For example, some online science textbooks include interactive pop-up simulations. Children also use AR technology in classrooms to build their own 3D storybooks with downloadable programs, one of the most widely used being ZooBurst.[10]
More complex AR applications include the Augmented Reality through Graphic Overlays on Stereovideo (ARGOS) system, developed to convey real-world spatial information for remotely operated machinery.[11] Notable descendants of this approach include digital measuring devices such as the ruler and tape-measure features in current phone applications, as well as tools from online technology companies such as Amazon's "try on" features, which let users measure products in a real environment before purchasing.
AV Technology
Augmented virtuality (AV) technologies aim to do the inverse of AR: bring the real world into a virtual space or environment. The most widely advertised examples today use green screens, in which a real object is placed in a completely virtual environment, as often seen in weather forecasting and in online meeting applications such as Zoom or Apple's FaceTime.[12]
VR Technology
Current virtual reality (VR) tools include 3D modeling software such as Autodesk's AutoCAD, as well as the 3D animation software Cinema 4D.[13] At the fully virtual end of the continuum, VR headsets are a crucial part of describing the virtual side.[1] As the name suggests, VR technologies aim to synthesize the real world, immersing humans in a "cyberspace" that satisfies sensory stimuli enough to produce a sense of reality.[13]
VR headsets also utilize stimulation techniques known as haptic sensing and haptic feedback. Haptic sensing gathers real-time information from human interaction and produces virtual results.[13] An example in human–machine interaction (HMI) comes from virtual instrument applications, where users' hand movements produce sounds in real time.[13] Haptic feedback is the inverse technique, creating a stimulus felt by the user as a result of virtual interactions, making the environment seem as real as possible.[13] Some VR games with more intense scenes and gameplay are programmed to create vibrational stimuli felt by the user.[14]
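As a rough illustration of the feedback direction described above, the sketch below maps a virtual impact to a vibration strength for a haptic actuator. The function name, the linear scaling, and the 10 m/s saturation point are all hypothetical choices for the example, not drawn from any particular headset SDK.

```python
def vibration_amplitude(impact_speed: float, max_speed: float = 10.0) -> float:
    """Convert the speed of a virtual impact (m/s) into a normalized
    vibration amplitude in [0, 1] for a haptic actuator.

    Scaling is linear for simplicity and clamps at the actuator's
    maximum; real systems typically use tuned, non-linear curves."""
    if impact_speed <= 0:
        return 0.0  # no contact, no feedback
    return min(impact_speed / max_speed, 1.0)

# A gentle touch produces weak feedback; a hard impact saturates it.
print(vibration_amplitude(2.5))   # → 0.25
print(vibration_amplitude(15.0))  # → 1.0
```

The direction of data flow is the point: here a virtual event drives a physical stimulus (feedback), whereas haptic sensing would run the other way, turning a physical measurement into a virtual result.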
Military Applications
Within the military, training and simulation are currently the primary applications of XR/AR systems. Live-fire drills can be replaced with virtual scenarios, making the training process more efficient and cost-effective. Smart glasses can also be used during maintenance in high-risk situations to improve accuracy and safety.[15]
The United States Army is also actively funding research and development of XR/AR technologies for eventual military deployment. In 2018 Microsoft was awarded a 22ドル billion Other Transaction Agreement (OTA) over the following 10 years to develop the Integrated Visual Augmentation System (IVAS), an AR system for improving situational awareness in combat. The headsets can display information traditionally available only to commanders directly to individual combat units, improving the capabilities of dismounted soldiers. There have been several variants of the IVAS since its inception, each adding incremental improvements to user comfort, communication, and night-vision capabilities.
In 2025 the IVAS contract was transferred through a contract novation to Anduril Industries and Rivet Industries under a new name, the Soldier Borne Mission Command (SBMC) program. Anduril now oversees the development and production of the IVAS system, taking over the original contract from Microsoft alongside an additional 159ドル million contract. IVAS will be built upon Anduril's Lattice platform and Microsoft Azure, allowing the headsets to leverage sensor fusion, computer vision, machine learning, and artificial intelligence capabilities.[16]
In October 2025, Anduril announced EagleEye in partnership with Meta, Oakley, Qualcomm and Gentex. The EagleEye headset enhances visual perception and improves battlefield communication by providing soldiers with combat information and mission objectives.[17] In an announcement video, a soldier can be seen interacting with the headset through its heads-up display, using it to plan a mission, place visual markers linked to his rifle, and observe enemies and fellow soldiers through a physical wall.[18]
Future considerations
[edit ]Privacy concerns
As with any software system, devices across the reality-virtuality continuum pose data privacy risks for users. In August 2022, Meta removed the Facebook login requirement for Quest headset users, replacing it with a new Meta Horizon profile. Users no longer needed to tie their real names to their Quest accounts, and any previous users who had logged into Quest systems through Facebook could decouple their identity information by creating a new Meta Horizon account.[19]
Wearable devices can also be utilized to counteract modern surveillance systems. Mann describes methods of counteracting surveillance by using wearable cameras and screens to extend reality.[20] By repositioning surveillance technologies onto the user, such devices can playfully counteract the modern panopticon of social surveillance in public spaces, so as to "surveil the surveillance".[20]
Motion sickness
Developing immersive consumer virtual reality/augmented reality systems is a highly complex hardware engineering problem. As companies improve hardware specifications, user experience improves and virtual reality motion sickness can be mitigated.[21]
Advances in optics and display systems can improve stereoscopic image quality and deepen immersion. Current top-of-the-line consumer devices such as the Bigscreen Beyond utilize aspheric lenses to minimize the physical lens system while maintaining a wide field of view.[22] Maximizing field of view can mitigate the tunnel-vision effect that contributes to VR motion sickness.[23] Improved displays help as well: LCD and OLED panels with high refresh rates and high resolution can reduce visual artifacts and mitigate the screen-door effect.[24][25]
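One way to see why refresh rate matters is to express it as frame time, the interval between successive images; longer frame times increase perceived latency and judder, which contribute to motion sickness. A minimal calculation, included only as an illustration:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Return the time between frames in milliseconds for a display
    running at the given refresh rate (frames per second)."""
    if refresh_hz <= 0:
        raise ValueError("refresh rate must be positive")
    return 1000.0 / refresh_hz

# Moving from 60 Hz to 120 Hz halves the per-frame interval,
# so each rendered image reflects more recent head motion.
print(round(frame_time_ms(60), 2))   # → 16.67
print(round(frame_time_ms(120), 2))  # → 8.33
```

The relationship is a simple reciprocal, so each doubling of refresh rate halves the worst-case age of the image the user sees.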
See also
- Computer-mediated reality
- Extended reality
- Simulated reality
- Spatial computing
- Transreality gaming
- Virtual environment
- Natural environment
- Virtual reality sickness
- ^ a b c Milgram, Paul; H. Takemura; A. Utsumi; F. Kishino (1994). "Augmented Reality: A class of displays on the reality-virtuality continuum". Proceedings of SPIE - The International Society for Optical Engineering Vol. 2351. Retrieved 2021年06月01日.
- ^ Gutiérrez Alonso, Mario Arturo; Mario Gutierrez; F. Vexo; Daniel Thalmann (2008). Stepping Into Virtual Reality. Springer London. p. 7. ISBN 9781848001176 . Retrieved May 13, 2025.
- ^ a b c d e f g Blascovich, Jim (2002), Schroeder, Ralph (ed.), "Social Influence within Immersive Virtual Environments", The Social Life of Avatars: Presence and Interaction in Shared Virtual Environments, Computer Supported Cooperative Work, London: Springer, pp. 127–145, doi:10.1007/978-1-4471-0277-9_8, ISBN 978-1-4471-0277-9 , retrieved 2025年11月13日
- ^ "Mediated Reality with implementations for everyday life". Presence Connect, the online companion to the MIT Press journal PRESENCE: Teleoperators and Virtual Environments. MIT Press. August 6, 2002.
- ^ Johnson, D. L.; Ambrose, S. H.; Bassett, T. J.; Bowen, M. L.; Crummey, D. E.; Isaacson, J. S.; Johnson, D. N.; Lamb, P.; Saul, M.; Winter-Nelson, A. E. (1997). "Meanings of Environmental Terms". Journal of Environmental Quality. 26 (3): 581–589. Bibcode:1997JEnvQ..26..581J. doi:10.2134/jeq1997.00472425002600030002x. ISSN 1537-2537.
- ^ "Jaron Lanier, "the Father of VR", addresses tech enthusiasts at UC Berkeley". UC Tech News. Retrieved 2025年11月22日.
- ^ "Jaron Lanier at Microsoft Research". Microsoft Research. Retrieved 2025年11月22日.
- ^ "A Portrait of the Young Visionary". www.jaronlanier.com. Retrieved 2025年11月22日.
- ^ Milgram, Paul; H. Takemura; A. Utsumi; F. Kishino (1994). "Augmented Reality: A class of displays on the reality-virtuality continuum". Proceedings of SPIE - The International Society for Optical Engineering Vol. 2351. Retrieved 2021年06月01日.
- ^ a b "A Taxonomy of Mixed Reality Visual Displays". ResearchGate. Archived from the original on 2025年03月23日. Retrieved 2025年11月13日.
- ^ Drascic, David; Grodski, Julius J.; Milgram, Paul; Ruffo, Ken; Wong, Peter; Zhai, Shumin (1993年05月01日). "ARGOS: A display system for augmenting reality". Proceedings of the SIGCHI conference on Human factors in computing systems - CHI '93. New York, NY, USA: Association for Computing Machinery. p. 521. doi:10.1145/169059.169506. ISBN 978-0-89791-575-5.
- ^ Stapleton, Christopher; Davies, Jim (October 2011). "Imagination: The third reality to the virtuality continuum". 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities. pp. 53–60. doi:10.1109/ISMAR-AMH.2011.6093657. ISBN 978-1-4673-0059-9 – via IEEE Xplore.
- ^ a b c d e Olmedo, Héctor (2013年01月01日). "Virtuality Continuum's State of the Art". Procedia Computer Science. 2013 International Conference on Virtual and Augmented Reality in Education. 25: 261–270. doi:10.1016/j.procs.2013.11.032. ISSN 1877-0509.
- ^ Shi, Yuxiang; Shen, Guozhen (2024). "Haptic Sensing and Feedback Techniques toward Virtual Reality". Research (Washington, D.C.). 7 0333. Bibcode:2024Resea...7..333S. doi:10.34133/research.0333. ISSN 2639-5274. PMC 10964227 . PMID 38533183.
- ^ "AR and VR In Defense Market Size, Share, 2025-2030 Outlook". Mordor Intelligence. 2025年09月19日. Retrieved 2025年11月21日.
- ^ Source, Microsoft (2025年02月11日). "Anduril and Microsoft partner to advance Integrated Visual Augmentation System (IVAS) program for the U.S. Army". Source. Retrieved 2025年11月22日.
- ^ "Anduril's EagleEye Puts Mission Command and AI Directly into the Warfighter's Helmet". Anduril. 2025年10月13日. Retrieved 2025年11月22日.
- ^ Anduril Industries (2025年10月16日). EagleEye: Superpowers For Superheroes . Retrieved 2025年11月22日 – via YouTube.
- ^ Machkovech, Sam (2022年07月09日). "Meta removes Facebook account mandate from Quest VR—but is that enough?". Ars Technica. Retrieved 2025年11月21日.
- ^ a b Mann, Steve; Nolan, Jason; Wellman, Barry (2003). "Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments". Surveillance & Society. 1 (3): 331–355. doi:10.24908/ss.v1i3.3344. ISSN 1477-7487.
- ^ Kourtesis, Panagiotis; Collina, Simona; Doumas, Leonidas A. A.; MacPherson, Sarah E. (2019). "Technological Competence Is a Pre-condition for Effective Implementation of Virtual Reality Head Mounted Displays in Human Neuroscience: A Technological Review and Meta-Analysis". Frontiers in Human Neuroscience. 13 342. doi:10.3389/fnhum.2019.00342 . ISSN 1662-5161. PMC 6783565 . PMID 31632256.
- ^ "Aspheric Surfaces - Part 1: Introduction to Aspherical Surfaces in Optical Design". Ansys Optics. Retrieved 2025年11月21日.
- ^ "Motion Sickness in VR: Why it happens and how to minimise it". VirtualSpeech. 2024年10月09日. Retrieved 2025年11月21日.
- ^ Descheneaux, Charles R.; Reinerman-Jones, Lauren; Moss, Jason; Krum, David; Hudson, Irwin (2020). "Negative Effects Associated with HMDS in Augmented and Virtual Reality". In Chen, Jessie Y. C.; Fragomeni, Gino (eds.). Virtual, Augmented and Mixed Reality. Design and Interaction. Lecture Notes in Computer Science. Vol. 12190. Cham: Springer International Publishing. pp. 410–428. doi:10.1007/978-3-030-49695-1_27. ISBN 978-3-030-49695-1.
- ^ Milgram, Paul (June 2006). "Some Human Factors Considerations for Designing Mixed Reality Interfaces" (PDF). Meeting Proceedings Paper or File – via NATO Science and Technology Organization.