The affordable sensors, chips, and high-resolution displays critical to rendering a decent VR experience were engineered for iPhones and Galaxys, not Rifts and Vives. Early on, VR pioneer Oculus built prototypes with 1080p AMOLED displays from Samsung Galaxy S4 smartphones. The ideal experience is one in which the user can't discern a single pixel on the screen, a threshold referred to as retina resolution.
The best VR displays today sit somewhere between the pixelated "screen door" effect and retina resolution. Retina resolution depends on a number of factors, one of which is how close the display is to your eyes. That means for VR to hit retina resolution, we'll need displays with way more, way smaller pixels.
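To make that distance dependence concrete, here's a rough back-of-the-envelope sketch. The 60-pixels-per-degree acuity figure is a common rule of thumb for 20/20 vision, and the headset field-of-view and panel-width numbers are illustrative assumptions, not specs from any real device:

```python
import math

def retina_ppi_direct_view(distance_in: float, acuity_ppd: float = 60.0) -> float:
    """PPI needed for 'retina' sharpness when viewing a flat screen
    at distance_in inches. acuity_ppd is the pixels per degree the
    eye can resolve (~60 for 20/20 vision, an assumed figure)."""
    # One inch of screen at this distance subtends this many degrees
    degrees_per_inch = math.degrees(2 * math.atan(0.5 / distance_in))
    return acuity_ppd * degrees_per_inch

def retina_ppi_headset(fov_deg: float, display_width_in: float,
                       acuity_ppd: float = 60.0) -> float:
    """PPI needed in a headset whose lenses stretch display_width_in
    inches of panel across fov_deg degrees of field of view."""
    return acuity_ppd * fov_deg / display_width_in

# A phone held a foot away needs only a few hundred PPI...
print(round(retina_ppi_direct_view(12)))    # ~286 PPI
# ...but a hypothetical 100-degree headset with a 3.5-inch-wide
# panel per eye needs thousands.
print(round(retina_ppi_headset(100, 3.5)))  # ~1714 PPI
```

The takeaway: magnifying a panel across a wide field of view multiplies the pixel density required, which is why headsets demand far denser displays than phones.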
Luckily, science is on it, and with retina resolution laptops and phones yawn-worthy at this point, VR and AR are now key technologies driving cutting-edge research in high-res displays. In a recent example, a team of scientists led by Samsung’s Won-Jae Joo and Stanford’s Mark Brongersma published a paper in Science describing a new meta-OLED display that can pack in 10,000 pixels per inch with room to scale.
In comparison, today’s smartphone and VR displays are less than 1,000 pixels per inch. The team says current displays, sufficient for TVs or smartphones, can’t meet the pixel density needs of near-eye VR and AR applications.
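A quick way to build intuition for these densities is to convert pixels per inch into pixel pitch, the center-to-center spacing between pixels. The densities plugged in below are just the figures quoted in the article:

```python
def pixel_pitch_um(ppi: float) -> float:
    """Center-to-center pixel spacing in micrometers
    for a given pixels-per-inch density (1 in = 25,400 um)."""
    return 25_400 / ppi

# Roughly today's best phone and VR panels vs. the new meta-OLED
print(pixel_pitch_um(1_000))   # 25.4 um per pixel
print(pixel_pitch_um(10_000))  # 2.54 um per pixel
```

At 10,000 pixels per inch, each pixel spans only a couple of micrometers, smaller than a red blood cell.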
They’re looking beyond headsets too, writing, “An ultrahigh density of 10,000 pixels per inch readily meets the requirements for the next-generation microdisplays that can be fabricated on glasses or contact lenses.” Of course, they aren’t alone in their quest for ultra-high-def, and the display is still firmly in the research phase, but it hints at what the future holds for stunning AR/VR experiences.
The new display was born from a breakthrough in solar cells, where Brongersma's lab used optical metasurfaces (surfaces with built-in nanoscale structures that control a material's properties) to manipulate light. He realized the same approach could be useful in organic light-emitting diode (OLED) displays too. Some of the top displays in the world, like the ones in high-end televisions and iPhones, use OLEDs because they're very thin and flexible and known for their deep, pure colors.
Currently, there are two ways to make OLED displays. For smaller displays, like those in phones, manufacturers lay red, green, and blue OLED subpixels side by side; this method has limitations both in how small the subpixels can be and how large the display can go. For larger displays like televisions, manufacturers opt for white OLEDs with red, green, and blue filters sitting on top of them.
Instead of filters or color-specific OLED materials, the new display makes use of a surface bristling with tiny silver nanopillars 80 nanometers high. Larger squares, each containing red, green, and blue subpixels, make up the display's pixels.
Because the pixels are no longer shaded by filters (and thanks to a curious property of the metasurface that allows light to build up and resonate, a bit like sound in a musical instrument), the display's color is very pure, and it achieves greater brightness with less power.
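The "light builds up and resonates" idea is, at heart, optical cavity resonance. As a toy illustration only, here is a simple Fabry-Perot cavity model, not the device's actual nanopillar physics; the cavity length and refractive index below are assumed values chosen so the first-order resonance lands in the visible range:

```python
def fabry_perot_resonances(cavity_nm: float, n: float = 1.8,
                           orders=(1, 2, 3)) -> list[float]:
    """Resonant wavelengths (nm) of an idealized Fabry-Perot cavity:
    lambda_m = 2 * n * L / m, where L is the cavity length, n the
    refractive index of the medium, and m the resonance order.
    Only wavelengths satisfying this condition build up inside the
    cavity, like standing waves in an organ pipe."""
    return [2 * n * cavity_nm / m for m in orders]

# An assumed ~150 nm cavity: first-order resonance lands near
# green light (~540 nm); shorter cavities would shift it blue.
print(fabry_perot_resonances(150))
```

Tuning the cavity geometry shifts which wavelength resonates, which is the general principle behind getting different colors from the same emitting material.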
The team estimates the approach could yield pixel density up to 20,000 pixels per inch.
The next step is to make a full-size display, something Samsung is currently working to make happen. Of course, other research groups are chasing ultrahigh-resolution displays too. Most displays over 1,000 pixels per inch demonstrated so far are monochromatic, and full-color displays at those densities remain challenging. Still, quality virtual and augmented reality interfaces may solve a few of the less palatable side effects of 2D displays, because they're better tailored to how our brains operate in the world.