Stanford University researchers have developed a novel method that lets ordinary image sensors sense light in three dimensions. With this technology, cameras can estimate the distance to objects in a scene, producing a three-dimensional image. Put simply, smartphone cameras may soon be able to capture 3D images.
LIDAR (light detection and ranging) works by firing a laser at an object and measuring the light that bounces back. It can tell how far away an object is, whether it is moving, how fast, and whether it is approaching. Existing LIDAR systems are cumbersome, but the Stanford research team's system was able to capture megapixel-resolution depth maps using a commercially available digital camera. Their approach is a cost-effective alternative to the LIDAR hardware currently found in smartphones, which so far appears only in the iPhone 13 Pro and iPhone 13 Pro Max.
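The round-trip measurement described above rests on a simple time-of-flight calculation. The sketch below illustrates that general principle only (it is not the Stanford team's method, and the function name is a placeholder):

```python
# Illustrative sketch of time-of-flight ranging, the principle behind LIDAR.
# Not the Stanford researchers' implementation.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given a laser pulse's round-trip time."""
    # The pulse travels out to the object and back, so halve the path length.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 100 nanoseconds indicates an object about 15 m away.
print(round(distance_from_round_trip(100e-9), 2))  # → 14.99
```

Because light covers roughly 30 cm per nanosecond, depth precision depends on timing the returning pulse extremely accurately, which is one reason conventional LIDAR hardware is bulky and expensive.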
These new approaches could form the foundation for a new type of compact, low-cost, energy-efficient LIDAR. Having 3D imaging on a smartphone opens up a slew of possibilities for fitness, health, and sports apps. Athletes in training, for example, might track and analyse their movements to gain valuable biomechanical insights, and the same capability could help users correct their posture and joint angles during exercise or yoga.