Augmented And Virtual Reality - Development History And Future Advances

Augmented and virtual reality are changing how we see and interact with digital information. Because of their ability to recreate interactions between computer-generated visuals and the real environment, these near-eye displays have attracted a great deal of attention and development effort.

Author: Suleman Shah
Reviewer: Han Ju
Mar 05, 2023
Virtual and augmented reality offer an appealing new way for people to experience their surroundings. In contrast to traditional display technologies such as televisions, laptops, and smartphones, augmented and virtual reality displays are intended to change the interaction between the viewer, the display, and the surrounding world.
By superimposing virtual images on the real environment, viewers can immerse themselves in a creative universe that merges fiction and reality. Augmented and virtual reality displays aim to deliver realistic images that can imitate, integrate into, or reconstruct the surrounding world without causing discomfort to the wearer.
To provide visual comfort, the optical system should offer an appropriate field of view, render 3D images with correct depth and high resolution, and deliver sufficient contrast and brightness.
For long-term user comfort, a compact and lightweight construction is preferred. Meeting these goals is difficult because of the trade-offs between different optical components and system architectures.
A range of augmented and virtual reality devices is emerging thanks to significant advances in optical components, display technology, and digital processing.

Developments In Virtual Reality Display

The fundamental challenge in computer-generated 3D image production is delivering an immersive experience of a virtual world. Stereo vision is an essential feature of the human visual system to consider when assessing how well a virtual reality system can produce 3D images.
When viewing a 3D object in the real world, the accommodation cue (the focal distance of the eyes) and the vergence cue (the relative rotation of the eyes) naturally match. In most modern virtual reality systems, however, there is just one fixed display plane onto which all content is rendered.
The viewer's eyes must focus on that display plane to gather image information, yet the computer-generated 3D object usually does not lie in the display plane. Consequently, the visual system in the viewer's brain drives the eyes to converge on the virtual 3D object.
At the same time, the eye lenses remain focused on the display plane, resulting in mismatched accommodation and vergence distances. This condition, known as vergence-accommodation conflict (VAC), induces dizziness and nausea.
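As a rough illustration of the size of this mismatch, here is a minimal Python sketch that compares the vergence demand set by a virtual object with the accommodation demand set by a fixed display plane. The 63 mm interpupillary distance, the 2 m display plane, and the 0.5 m object are illustrative assumptions, not values from any particular headset.

    import math

    IPD_M = 0.063  # assumed interpupillary distance in meters

    def vergence_angle_deg(distance_m):
        """Angle between the two lines of sight for an object at this distance."""
        return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

    def vac_dioptres(object_m, display_m):
        """Mismatch between vergence demand (object) and accommodation demand (display)."""
        return abs(1 / object_m - 1 / display_m)

    # Display plane fixed at 2 m, virtual object rendered at 0.5 m:
    print(round(vergence_angle_deg(0.5), 1))  # eyes converge as if the object were 0.5 m away
    print(vac_dioptres(0.5, 2.0))             # 1.5 dioptres of conflict between the two cues
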
Aside from visual comfort, the total weight and bulk of the device restrict how long and where it can be used. To ensure user comfort, the device should be as light as feasible while still maintaining a wide field of view of the virtual scene.
A man wearing a virtual reality headset

How To Reduce Vergence Accommodation Conflict In Virtual Reality

Use Of Multi-Focal System

In the late 1990s, multi-focal displays were proposed as a solution to the vergence-accommodation conflict in head-mounted displays. The approach entails creating multiple image planes or shifting the image plane position so that it matches both the vergence and accommodation distances.
Stacking many transparent panels raises cost and produces visible moiré patterns. Beam splitters can instead be used to set up a space-multiplexing system.
Because the optical distance between each beam splitter and the eye varies, the images are presented at different depths.
The temporal multiplexing approach relies on dynamic components that can change the panel distance or the effective focal length from moment to moment. Although it is still difficult to produce an adaptive lens with a wide tuning range and a fast response time, this technique minimizes the number of physical components, resulting in a substantially smaller system volume.
This design can provide multiple focal planes while reducing system size and weight; however, it requires a costly spatial light modulator, and the image quality is not yet suitable for commercial products.
Polarization multiplexing generates multiple image planes by exploiting distinct polarization states. Light from the display panel passes through a pixelated polarization modulation layer (PPML), which can adjust the ratio of the two orthogonal polarization states, allowing each pixel to be assigned to its corresponding focal plane individually.
For a linearly polarized system, the PPML can be a polarization rotator; for a circularly polarized system, it can be an integrated polarization rotator and quarter-wave plate.
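Whatever the multiplexing scheme, rendered content still has to be routed to the most appropriate image plane. The following minimal Python sketch assigns each virtual object to the nearest available focal plane in dioptric terms; the three plane distances and the nearest-plane rule are illustrative assumptions rather than any specific product's algorithm.

    FOCAL_PLANES_M = [0.5, 1.0, 3.0]  # hypothetical image-plane distances in meters

    def nearest_plane(object_distance_m):
        """Pick the focal plane whose dioptric distance is closest to the object's."""
        return min(FOCAL_PLANES_M, key=lambda p: abs(1 / p - 1 / object_distance_m))

    for d in (0.4, 0.8, 2.0, 10.0):
        print(f"object at {d} m -> render on the {nearest_plane(d)} m plane")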

Use Of Micro Lens Array System

A more advanced design places a micro-lens array (MLA) in front of the display screen to adjust the location of the virtual images globally or individually.
When the micro-lens array is properly aligned with the display, a minor movement of the array results in a significant shift in the focus of the virtual image.
Instead of moving a larger display panel or a heavier lens over a longer distance, shifting the micro-lens array plate by a small amount can drastically reduce vergence-accommodation conflict.
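To see why such a small movement has such a large effect, here is a minimal Python sketch based on the thin-lens relation for a display placed just inside the focal length of a lenslet; the 5 mm focal length and the gap values are illustrative assumptions.

    def virtual_image_distance_mm(f_mm, gap_mm):
        """Distance of the virtual image when the display sits inside the focal length."""
        return gap_mm * f_mm / (f_mm - gap_mm)

    f = 5.0                    # assumed micro-lens focal length in mm
    for gap in (4.80, 4.90):   # display-to-lenslet gap in mm
        print(gap, "mm gap ->", round(virtual_image_distance_mm(f, gap)), "mm virtual image")
    # Shifting the array by only 0.1 mm moves the virtual image from about 120 mm to 245 mm.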

Use Of Light Field System

An ideal light field display reproduces the physical wavefront formed by a real object. To present a virtual 3D object, the points on the object are traced and the corresponding pixels on the display screen are lit.
The light field around those points can then be approximated with discrete emitted rays. Although this approach provides accurate depth information and natural retinal blur, it sacrifices spatial resolution.
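The resolution cost follows from simple arithmetic, as this minimal Python sketch shows; the panel resolution and the number of reproduced ray directions per lenslet are illustrative assumptions.

    panel_px = (3840, 2160)   # assumed resolution of the underlying display panel
    views_per_axis = 4        # assumed number of discrete ray directions per lenslet, per axis

    effective_px = (panel_px[0] // views_per_axis, panel_px[1] // views_per_axis)
    print(effective_px)       # (960, 540): depth cues are gained, but spatial detail is lost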

Development Of Pancake Virtual Reality

A compact optical design that also takes the headset's center of gravity into account is urgently needed. Polarization-based folded optics (also called pancake optics), with their smaller form factors, have recently gained popularity.
The main idea is to construct a cavity in which the optical path is folded into a smaller region. Because the light interacts with the half-mirror beam splitter (BS) twice, only 25% of the total energy is transmitted to the viewer's side. Recent improvements in holographic optics provide an even wider variety of optical element options.
Flat holographic films can be used for both the reflective polarizer and the BS, which means the theoretical system efficiency can increase from 25% toward 100%.
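The 25% figure follows from simple arithmetic, as the following minimal Python sketch shows for the idealized, otherwise lossless case.

    # Light meets the 50/50 half-mirror beam splitter twice in a folded (pancake) path.
    first_pass_transmission = 0.5   # transmitted through the half mirror on the way in
    second_pass_reflection = 0.5    # reflected by the same half mirror after folding

    print(first_pass_transmission * second_pass_reflection)  # 0.25 -> the 25% ceiling
    # A polarization-selective holographic film avoids both 50% losses, which is why
    # the theoretical ceiling can rise toward 100%.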

Advancement In Augmented Reality Display Architectures

Unlike virtual reality displays, which provide a fully immersive experience, augmented reality displays strive for see-through systems that overlay computer-generated images onto the real scene. Near-eye devices with high transmittance, an appropriate field of view, and a small form factor are required to deliver this distinct visual experience with wearing comfort.
Freeform optics with a wide field of view and excellent transmittance are therefore an attractive option for augmented reality displays. This architecture, however, has a large volume and considerable weight owing to its prism shape.
Lightguide-based structures and free-space couplers are also widely employed to strike a delicate balance between visual comfort and wearing comfort while reducing system size.

Use Of Freeform Prisms And Beam Splitters

Thanks to advances in diamond-turning equipment, freeform prisms have received great attention. The freeform prism in an augmented reality system typically needs a partially reflective surface and a total internal reflection (TIR) surface to overlay the computer-generated images while transmitting light from the surrounding environment.
This structure combines two refractive surfaces, a total internal reflection surface, and a partially reflective surface into a single element, allowing for more design flexibility.
This design produces high-quality images with a broad field of view, but the complete system is large and heavy because of the prism's volume.
Another common freeform-based augmented reality device uses a carefully designed beam splitter cube as the coupler. This architecture offers the simplest route to an augmented reality display with a wide field of view, but at the cost of a larger physical size.
Furthermore, étendue, which scales with the product of the field of view and the eye box, is conserved in such a system, so there is an inherent trade-off between the field of view and the eye box (or exit pupil): the wider the field of view, the narrower the eye box.
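As a back-of-the-envelope illustration of this trade-off, the following minimal Python sketch treats étendue as a fixed field-of-view × eye-box product; the starting values are assumptions chosen only to show the scaling.

    etendue = 40 * 10  # e.g. a 40-degree field of view paired with a 10 mm eye box

    for fov_deg in (40, 60, 80):
        print(fov_deg, "deg field of view ->", round(etendue / fov_deg, 1), "mm eye box")
    # Widening the field of view from 40 to 80 degrees halves the eye box to 5 mm.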

Use Of Lightguide-Based Architectures

The lightguide-based structure outperforms the freeform design in terms of aesthetics and wearing comfort, particularly thanks to its small and thin form factor.
Lightguide-based near-eye displays (LNEDs) have been among the most extensively used architectures for augmented reality over the last decade, appearing in numerous commercial devices such as the HoloLens 2, Magic Leap 1, and Lumus.
Input and output couplers are the critical optical components that determine the performance of an LNED. The input coupler typically has high efficiency so that the light generated by the optical engine is used as completely as possible.
The output coupler, on the other hand, has a low, graded efficiency across the exit pupil to guarantee an extended and homogeneous eye box.
Depending on their coupler designs, LNEDs are classified into two types: grating-based lightguides and geometrical lightguides.
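The need for a graded output coupler can be made concrete with a minimal Python sketch; it assumes an idealized lossless lightguide with five identical out-coupling interactions, an illustrative simplification rather than a description of any particular product.

    N = 5             # assumed number of out-coupling interactions across the eye box
    remaining = 1.0   # fraction of the guided light still inside the lightguide

    for k in range(1, N + 1):
        fraction = 1.0 / (N - k + 1)  # extract just enough to emit 1/N of the input here
        emitted = remaining * fraction
        remaining -= emitted
        print(f"interaction {k}: extract {fraction:.2f} of the guided light, emit {emitted:.2f}")
    # Every interaction emits 0.20 of the input only because the extraction fraction
    # is graded from 20% at the first interaction up to 100% at the last.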
A man wearing a goggle-style headset for a virtual online meeting

Use Of Free Space Coupler-Based Architectures

Unlike freeform optical devices or LNEDs, free-space couplers offer more architectural flexibility and no volume or total internal reflection limits.
Because of this high degree of design freedom, numerous designs based on free-space couplers have been proposed, each with its own advantages and disadvantages.
Based on their operating principles, these systems can be divided into three types: reflective couplers, diffusive couplers, and diffractive couplers.
  • Reflective Coupler: A reflective free-space coupler is based on surface reflection from a flat or curved surface. A lens collimates the computer-generated images projected by the display, which are then reflected into the viewer's eye. The Meta 2 by Meta Vision, the DreamGlass by Dream World, and the NorthStar by LeapMotion have all used this design effectively.
  • Diffusive Coupler: A diffusive free-space coupler is built on the light scattering of optical components. In such a setup, the displayed images are projected directly onto the coupler, which is typically a diffuser with a flat or curved surface. The diffuser should be angularly selective so that it scatters the off-axis incoming image while transmitting the ambient light in front of the eye, preserving the see-through capability.
  • Diffractive Coupler: A diffractive free-space coupler is made of flat diffractive optical components with predetermined phase profiles that can act as a lens or as freeform optics. The Maxwellian system uses the Maxwellian-view principle, which projects an always-in-focus image directly onto the retina; a standard 2D display or a laser light source can serve as the image source. Exit-pupil shifting can be used to enlarge the region covered by the focal point and so widen the eye box. A light-field system with a micro-lens array, like the one used in virtual reality displays, can also be applied in an augmented reality system. A projection system is commonly used to relay the original image from the image source to near the focal point of the diffractive coupler. Like the multiplexing approaches in virtual reality displays, these augmented reality architectures are not mutually exclusive and can be combined to balance their distinct benefits and trade-offs.

People Also Ask

What Is Augmented Reality?

Augmented reality is a view of the physical world enhanced with computer-generated elements. These inputs can range from audio and video to graphics and GPS overlays.

Is Augmented Reality And Virtual Reality The Same?

Augmented reality (AR) typically uses a smartphone's camera to overlay digital elements onto a live scene. Snapchat glasses and the game Pokemon Go are two examples of augmented reality experiences.
Virtual reality (VR), by contrast, is an entirely immersive experience that isolates the user from the outside world.

What Is The Impact Of Virtual And Augmented Reality?

Virtual and augmented reality let people engage with one another in new ways and can enhance their communication abilities. Furthermore, no previously available device could give the user such an outstanding and comprehensive experience.

What Is Virtual Reality And Augmented Reality?

Augmented reality (AR) enhances your surroundings by adding digital features to a live view, frequently using a smartphone's camera. Virtual reality (VR) is a completely immersive experience that simulates a real-life world.

Conclusion

Thanks to a variety of advanced architectures with unique features, such as reducing vergence-accommodation conflict with adjustable lenses, resolving compactness issues with polarizing films, and providing a large field of view with freeform optics, augmented and virtual reality displays hold both scientific significance and broad application prospects.
Although it is still difficult for any of these architectures to meet every requirement for visual and wearing comfort at this stage, understanding and assessing these advanced systems helps focus attention on the unsolved difficulties and should inspire more elegant solutions.