A team of Stanford researchers has developed a new AI-powered holographic AR technology that is said to be thinner, lighter, and higher quality than anything seen before.

The Stanford lab’s device uses a thinner stack of holographic components than previous designs, one that can almost fit inside regular eyeglass frames, and it can be trained to project realistic, full-color, moving 3D images that appear at varying depths.

Like other AR glasses, the device uses waveguides, the components that direct light through the glasses and into the wearer’s eyes. The researchers say they developed both a unique nanophotonic metasurface waveguide, which could eliminate the need for bulky optics, and a physical waveguide model that uses AI algorithms to drastically improve image quality. According to the study, the models are automatically calibrated using camera feedback.

According to Dr. Gun-Yeal Lee, no other AR system compares in both capability and compactness.

Companies like Meta have spent billions buying and building AR glasses technology in hopes of eventually producing a product the size and shape of regular eyeglasses.

Source: The Verge