There may be a human visual system response, similar to the [[vestibulo-ocular reflex]] (VOR), in systems without positional tracking; it can be counteracted by displaying content at a long focal distance. A software system with an attached IMU can build on this idea by disabling the nearest parts of the image whenever the head moves, avoiding VOR-related discomfort.
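
A minimal sketch of how such a safeguard could be wired up is shown below; the IMU query, the layered renderer, and the numeric thresholds are illustrative assumptions rather than an interface described in the cited sources.

<syntaxhighlight lang="python">
# Illustrative sketch only: hide near-field content while the head is rotating,
# using a hypothetical IMU interface and layered renderer (not from the cited sources).
import math

ROTATION_THRESHOLD = math.radians(20.0)  # rad/s (about 20 deg/s); assumed cutoff for "head is moving"
NEAR_CUTOFF_M = 1.0                      # metres; layers closer than this count as "near" (assumed)

def update_near_layers(imu, renderer):
    """Hide layers rendered close to the eye while the head rotates quickly."""
    wx, wy, wz = imu.get_angular_velocity()         # rad/s about each axis (hypothetical IMU API)
    speed = math.sqrt(wx * wx + wy * wy + wz * wz)  # overall rotation rate
    for layer in renderer.layers:                   # hypothetical list of drawable layers
        if layer.distance_m < NEAR_CUTOFF_M:
            layer.visible = speed < ROTATION_THRESHOLD
</syntaxhighlight>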
 
NEDs are also known as [[Head-mounted Display|head-mounted displays]] (HMDs) and encompass electronic viewfinders. Bhakta et al. (2014) noted that “near-eye displays are the headphones of the display world, creating small, portable, personal viewing experiences.” They have several advantages over traditional displays, such as compact size, light weight, and low power demands, and they can be see-through, producing a virtual image that looks like a big-screen TV from a small form factor. Furthermore, NEDs fall into two general categories: immersive and see-through. Immersive NEDs block the user’s view of the real world and create a large field-of-view image (e.g., a VR headset). See-through NEDs allow the user to see the real world, generating a transparent image or a very small opaque image that blocks only a small portion of the user’s peripheral vision. Examples of see-through NEDs are [[augmented reality headset]]s and smart glasses such as [[Google Glass]].<ref name="1">Bhakta, V.R., Richuso, J. and Jain, A. (2014). DLP Technology for Near Eye Display. Retrieved from http://www.ti.com/lit/wp/dlpa051/dlpa051.pdf</ref><ref name="2">Lanman, D. and Luebke, D. (2013). Near-Eye Light Field Displays. ACM Transactions on Graphics, 32(6).</ref><ref name="3">Stanford University. Near-Eye Light Field Displays. Retrieved from https://talks.stanford.edu/douglas-lanman-near-eye-light-field-displays/</ref>
Near-eye light field displays introduce a light-field-based approach to NEDs. This allows for thinner and lighter HMDs capable of depicting accurate accommodation, convergence, and binocular-disparity [[depth cue]]s. The two human eyes perceive the world slightly differently; in the same way, light rays that enter the pupil at different locations encode slightly different pictures of the observed world.<ref name="3" /><ref name="4">Fattal, D. (2016). The ultimate guide to 3D technologies. Retrieved from https://thenextweb.com/insider/2016/04/23/guide-to-3d-tech/#</ref> A light field is composed of all the light rays at every point in space travelling in every direction. It is four-dimensional data, since every point in three-dimensional space is associated with a direction. The concept arose in the 1990s as a solution to problems in computer graphics and vision.<ref name="5">LightField Forum. Refocus your Eyes: Nvidia presents Near-Eye Light Field Display Prototype. Retrieved from http://lightfield-forum.com/2013/07/refocus-your-eyes-nvidia-presents-near-eye-light-field-display-prototype/</ref> To trigger accommodation, near-eye light field displays must independently render the light rays coming from every direction through every point in space. Sharp images are depicted from out-of-focus display elements by synthesizing light fields that correspond to virtual scenes located within the viewer’s natural accommodation range (Figure 1). Lanman and Luebke (2013) mention that “conventional displays are intended to emit light isotropically. In contrast, a light field display supports the control of tightly-clustered bundles of light rays, modulating radiance as a function of position and direction across its surface.”<ref name="2" /><ref name="3" /><ref name="4" />
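
As a rough illustration of how position and direction together make up the four light-field dimensions, the sketch below computes the single ray that one display pixel emits through a microlens array, treating each lenslet as a pinhole. The pixel pitch, lenslet pitch, and display-to-lenslet gap are assumed example values, not the parameters of any published prototype.

<syntaxhighlight lang="python">
# Rough sketch of the ray a single pixel emits through a microlens array
# (simple pinhole-per-lenslet model; all parameters are illustrative assumptions).
import numpy as np

PIXEL_PITCH_MM = 0.008   # display pixel size (assumed example value)
LENSLET_PITCH_MM = 1.0   # width of each square lenslet (assumed example value)
GAP_MM = 3.3             # separation between display panel and lenslet plane (assumed)

def pixel_ray(px, py):
    """Return (origin, direction) of the ray emitted by display pixel (px, py).

    The 2D lenslet position plus the 2D ray direction give the four light-field dimensions.
    """
    # Pixel position on the display plane (z = 0), in millimetres.
    pixel = np.array([px * PIXEL_PITCH_MM, py * PIXEL_PITCH_MM, 0.0])
    # Centre of the lenslet covering this pixel, on the plane z = GAP_MM.
    lenslet = np.array([
        (np.floor(pixel[0] / LENSLET_PITCH_MM) + 0.5) * LENSLET_PITCH_MM,
        (np.floor(pixel[1] / LENSLET_PITCH_MM) + 0.5) * LENSLET_PITCH_MM,
        GAP_MM,
    ])
    direction = lenslet - pixel  # ray passes from the pixel through the lenslet centre
    return lenslet, direction / np.linalg.norm(direction)

# Example: two pixels under the same lenslet emit rays in different directions.
print(pixel_ray(10, 10))
print(pixel_ray(60, 10))
</syntaxhighlight>
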
Traditional HMDs provide only a single display plane; without a proper focus cue, the display decouples accommodation from the vergence of the eyes. Because of this mismatch, the observer has to rely only on binocular vision to perceive 3D space, which can lead to visual discomfort, fatigue, eye strain, and headaches.<ref name="6">Stanford Computational Imaging Lab (2015). The Light Field Stereoscope - SIGGRAPH 2015 [Video]. Retrieved from https://www.youtube.com/watch?v=YJdMPUF8cDM</ref>
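
As a concrete illustration of the mismatch, the sketch below compares the accommodation demand fixed by a single display plane with the vergence demand of virtual objects at other depths; the 2 m focal plane and 63 mm interpupillary distance are assumed example values, not measurements from the cited work.

<syntaxhighlight lang="python">
# Illustrative calculation of the vergence-accommodation mismatch for a
# single-focal-plane HMD (assumed example values).
import math

IPD_M = 0.063        # interpupillary distance (typical adult value, assumed here)
FOCAL_PLANE_M = 2.0  # fixed optical distance of the HMD's virtual image (assumed example)

def vergence_angle_deg(distance_m):
    """Angle between the two eyes' lines of sight when fixating at a given distance."""
    return math.degrees(2.0 * math.atan((IPD_M / 2.0) / distance_m))

accommodation_demand_d = 1.0 / FOCAL_PLANE_M  # demand set by the display plane, in dioptres
for depth_m in (0.5, 1.0, 2.0, 4.0):
    vergence_demand_d = 1.0 / depth_m         # demand implied by binocular disparity
    mismatch_d = abs(vergence_demand_d - accommodation_demand_d)
    print(f"object at {depth_m} m: vergence {vergence_angle_deg(depth_m):4.1f} deg, "
          f"vergence-accommodation mismatch {mismatch_d:.2f} D")
</syntaxhighlight>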
 
With the [[Oculus Rift]], commercial interest in HMDs increased. Indeed, over the last few years interest in VR has been growing among both researchers and consumers. NED technology has a vast range of applications beyond gaming and entertainment: it can be applied in education, teleconferencing, scientific visualization, training and simulation, phobia treatment, and surgical training, for example. Immersive VR has also been shown to be effective in the treatment of post-traumatic stress disorder. For the continued development of VR, it is essential to provide a visually comfortable experience, for instance by diminishing the vergence-accommodation conflict that occurs in most HMDs. Improving light field displays is one path towards better and more visually comfortable headsets.<ref name="2" /><ref name="7" /><ref name="8">Huang, Fu-Chung, Chen, K. and Wetzstein, G. (2015). The Light Field Stereoscope | SIGGRAPH 2015. Retrieved from http://www.computationalimaging.org/publications/the-light-field-stereoscope/</ref> Huang et al. (2015) wrote that “correct or nearly correct focus cues significantly improve stereoscopic correspondence matching, 3D shape perception becomes more veridical, and people can discriminate different depths better. Vergence and accommodation cues are neurally coupled in the human brain; it seems intuitive that displays supporting all depth cues improve visual comfort and performance in long-term experiences.”<ref name="7" />
 
__NOTOC__
 
==Prototypes==
 
[[File:NE-LF prototype.png|thumb|NVIDIA’s near-eye light field display prototype (image: Lanman and Luebke, 2013)]]
[[File:Lightfield stereoscope.jpg|thumb|The [[light field stereoscope]] prototype (Image: Huang et al., 2015)]]
 
[[File:LFS images.jpg|thumb|Figure 4. Images with front and rear focus produced by the light field stereoscope (Image: Huang et al., 2015)]]
 