In a system without positional tracking, the human visual system may produce a response similar to the [[vestibulo-ocular reflex]] (VOR); this response can be counteracted by presenting displayed content at a long focal distance. The idea can be built into software on a headset with an attached IMU: when the IMU detects head movement, the nearest parts of the image are disabled to avoid VOR discomfort.
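A minimal sketch of this behaviour is shown below, assuming a hypothetical IMU driver and renderer; the gyro threshold, fade rate, and function names are illustrative assumptions rather than part of any particular headset SDK.

<syntaxhighlight lang="python">
import math

# Hypothetical tuning values (assumptions, not from any specific SDK)
ROTATION_THRESHOLD = 0.35   # rad/s of head rotation above which near content is hidden
FADE_RATE = 4.0             # opacity change per second

def update_near_layer_opacity(opacity, gyro_rates, dt):
    """Fade the near-focus layer out while the head is turning, and back in when it is still."""
    speed = math.sqrt(sum(w * w for w in gyro_rates))  # magnitude of angular velocity from the IMU
    target = 0.0 if speed > ROTATION_THRESHOLD else 1.0
    step = FADE_RATE * dt
    if opacity < target:
        return min(opacity + step, target)
    return max(opacity - step, target)

# Example per-frame use (read_gyro() and render() stand in for a real IMU driver and renderer):
# opacity = 1.0
# opacity = update_near_layer_opacity(opacity, read_gyro(), dt=1 / 90)
# render(near_layer_alpha=opacity)  # far, long-focal-distance content remains visible
</syntaxhighlight>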
Near-eye displays (NEDs) project images into a viewer’s eyes, creating a virtual image in the viewer’s field of view. The image appears at a distance, and larger than the small display panel and optics used to create it. However, according to [[Doug Lanman]] and Luebke (2013), this kind of display has a fundamental problem: the unaided human eye cannot accommodate (focus) on objects placed in close proximity.<ref name="1">Bhakta, V.R., Richuso, J. and Jain, A. (2014). DLP Technology for Near Eye Display. Retrieved from http://www.ti.com/lit/wp/dlpa051/dlpa051.pdf</ref><ref name="2">Lanman, D. and Luebke, D. (2013). Near-Eye Light Field Displays. ACM Transactions on Graphics, 32(6)</ref><ref name="3">Stanford University. Near-Eye Light Field Displays. Retrieved from https://talks.stanford.edu/douglas-lanman-near-eye-light-field-displays/</ref>
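A simple magnifier illustrates why the virtual image appears farther away and larger than the panel itself; the numbers below are illustrative assumptions, not the specifications of any particular NED. With a panel placed <math>d_o = 4.8\,\text{cm}</math> behind an eyepiece of focal length <math>f = 5\,\text{cm}</math>, the thin-lens equation gives

:<math>\frac{1}{d_i} = \frac{1}{f} - \frac{1}{d_o} = \frac{1}{5\,\text{cm}} - \frac{1}{4.8\,\text{cm}} = -\frac{1}{120\,\text{cm}},</math>

so the viewer sees a virtual image about 1.2 m away with lateral magnification <math>m = -d_i/d_o = 25</math>, even though the panel sits only a few centimetres from the optics.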
NEDs are also known as [[Head-mounted Display|head-mounted displays]] (HMDs) and encompass electronic viewfinders. Bhakta et al. (2014) noted that “near-eye displays are the headphones of the display world, creating small, portable, personal viewing experiences.” They have several advantages over traditional displays, such as compact size, light weight, and low power demand, and they can be see-through, producing a virtual image that looks like a big-screen TV from a small form factor. NEDs fall into two general categories: immersive and see-through. Immersive NEDs block the user’s view of the real world and create a large field-of-view image (e.g., a VR headset). See-through NEDs let the user see the real world, generating either a transparent image or a very small opaque image that blocks only a small portion of the user’s peripheral vision. Examples of see-through NEDs are [[augmented reality headset]]s and smart glasses such as [[Google Glass]].<ref name="1" /><ref name="2" /><ref name="3" />
Near-eye light field displays introduce a light-field-based approach to NEDs. This allows for thinner and lighter HMDs capable of depicting accurate accommodation, convergence, and binocular-disparity [[depth cue]]s. The two human eyes perceive the world slightly differently; in the same way, light rays that enter the pupil at different locations encode slightly different pictures of the world being observed.<ref name="3" /><ref name="4">Fattal, D. (2016). The ultimate guide to 3D technologies. Retrieved from https://thenextweb.com/insider/2016/04/23/guide-to-3d-tech/#</ref> A light field is composed of all the light rays at every point in space travelling in every direction. It is four-dimensional data, since each ray is specified by a two-dimensional position and a two-dimensional direction. The concept came about in the 1990s as a solution to problems in computer graphics and vision.<ref name="5">LightField Forum. Refocus your Eyes: Nvidia presents Near-Eye Light Field Display Prototype. Retrieved from http://lightfield-forum.com/2013/07/refocus-your-eyes-nvidia-presents-near-eye-light-field-display-prototype/</ref> Near-eye light field displays must independently render the light rays coming from every direction through every point in space in order to trigger accommodation. Sharp images from out-of-focus display elements are depicted by synthesizing light fields that correspond to virtual scenes located within the viewer’s natural accommodation range (Figure 1). Lanman and Luebke (2013) mention that “conventional displays are intended to emit light isotropically. In contrast, a light field display supports the control of tightly-clustered bundles of light rays, modulating radiance as a function of position and direction across its surface.”<ref name="2" /><ref name="3" /><ref name="4" />
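In symbols (a standard parameterization rather than a formula taken from the cited papers), the display modulates a radiance function over its surface:

:<math>L = L(x, y, \theta, \phi),</math>

where <math>(x, y)</math> is the position on the display surface and <math>(\theta, \phi)</math> is the direction of the emitted ray, which is what makes the light field a four-dimensional quantity.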
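The ray-by-ray rendering described above can be sketched for a microlens-array design such as the NVIDIA prototype covered by the cited sources: each panel pixel sits under one lenslet, and its offset from that lenslet’s centre fixes the direction of the ray it reproduces. The pitches, gap, scene function, and all names below are illustrative assumptions, not values from the prototype.

<syntaxhighlight lang="python">
import numpy as np

# Assumed geometry (millimetres); not the prototype's actual specifications.
PIXEL_PITCH = 0.0078   # spacing between panel pixels
LENSLET_PITCH = 1.0    # spacing between microlenses
GAP = 3.3              # separation between panel and microlens array

def pixel_ray(px, py):
    """Return (origin, direction) of the ray that pixel (px, py) contributes to the light field."""
    # Pixel position on the panel (z = 0), with the panel centre at the origin.
    x, y = px * PIXEL_PITCH, py * PIXEL_PITCH
    # Optical centre of the lenslet covering this pixel (z = GAP).
    lx = round(x / LENSLET_PITCH) * LENSLET_PITCH
    ly = round(y / LENSLET_PITCH) * LENSLET_PITCH
    origin = np.array([lx, ly, GAP])
    direction = np.array([lx - x, ly - y, GAP])      # from the pixel through the lenslet centre
    return origin, direction / np.linalg.norm(direction)

def shade(origin, direction):
    """Placeholder virtual scene: a checkerboard on a plane 500 mm in front of the display."""
    t = (500.0 - origin[2]) / direction[2]
    hit = origin + t * direction
    return (int(hit[0] // 20) + int(hit[1] // 20)) % 2

# Render a small elemental-image region: each pixel stores the radiance of "its" ray.
image = np.zeros((128, 128))
for py in range(128):
    for px in range(128):
        o, d = pixel_ray(px - 64, py - 64)
        image[py, px] = shade(o, d)
</syntaxhighlight>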