Solid-State FLASH LiDAR with a Revolutionary New Architecture
Tesla boss Elon Musk has no use for LiDAR, the laser-based sensing system most automakers install on their autonomous vehicles. But the CEO of North Carolina-based startup Sense Photonics is confident his company has built a system that actually “sees” better than existing LiDAR, using what it calls a flash LiDAR architecture.
The future of 3D sensing requires a new approach to 3D image capture. Rather than using complex scanning solutions to create an image pixel by pixel, we believe the best approach is to capture the entire frame at once—just like a camera. We call this Flash LiDAR, and it delivers high-performance 3D sensing that is simple, modular, scalable, and reliable.
This new paradigm for 3D depth sensing serves a variety of industries.
Proprietary Laser Array
We make our own lasers. Our patented process not only dramatically cuts the cost of the emitter, but also enables a new kind of flexible laser array.
This enables us to deliver, for the first time, high-powered, long-range Flash LiDAR across very wide vertical and horizontal fields of view.
Our LiDAR system operates similarly to a camera by using a solid-state, high-powered, laser emitter coupled with a solid-state sensor with many pixels.
Our system does not scan in any way—no spinning, no rotating mirrors, no pivoting MEMS mirrors—no mechanical movement of any kind.
Solid-state technology simplifies manufacturing, enhances reliability and eliminates concerns about vibrations or recalibration during the product’s lifetime.
Like other LiDAR systems, ours generates 3D point clouds. However, because it captures information like a 2D camera, it also natively outputs depth images without the need for additional processing. This enables easy integration with high resolution RGB data.
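To illustrate the relationship between a depth image and a point cloud, here is a minimal sketch that back-projects a depth frame into 3D using standard pinhole-camera intrinsics. The function name, the frame size, and the intrinsic values (`fx`, `fy`, `cx`, `cy`) are illustrative placeholders, not actual sensor specifications:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an N x 3 point cloud
    using pinhole-camera intrinsics (focal lengths fx, fy and
    principal point cx, cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no return

# Illustrative 4x4 depth frame at a uniform 2 m range.
depth = np.full((4, 4), 2.0)
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```

Because the depth values are already organized on the sensor's pixel grid, the same `(u, v)` coordinates can index a registered RGB or intensity image, which is what makes the fusion straightforward.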
Our LiDAR also captures intensity data like a near-infrared camera, enabling it to produce monochromatic images that are spatially and temporally correlated with the 3D data. These images have sufficient resolution to be processed directly by deep learning algorithms developed for standard cameras, and they can address situations where standard cameras fail due to lighting conditions.
To address the concerns of operating outdoors or in the presence of multiple LiDAR systems, we’ve included ambient light suppression and interference mitigation into our 3D sensor.
This enables our systems to work well both in bright sunlight and in the dark. And, because we have coded the information, our LiDAR systems do not interfere with each other when facing each other or imaging the same target.
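One common way to make coded emissions mutually non-interfering is to tag each unit's pulses with its own pseudorandom code and matched-filter the returns; the sketch below demonstrates that principle with NumPy. The code length and the ±1 coding scheme are assumptions for illustration only—Sense Photonics has not published the details of its actual coding method:

```python
import numpy as np

rng = np.random.default_rng(0)
CODE_LEN = 128

# Each unit is assigned its own pseudorandom +/-1 pulse code
# (hypothetical scheme for illustration).
code_a = rng.choice([-1.0, 1.0], CODE_LEN)  # our unit's code
code_b = rng.choice([-1.0, 1.0], CODE_LEN)  # interfering unit's code

# Received samples: our own echo plus an equally strong echo
# from the other unit imaging the same target.
received = code_a + code_b

# Normalized correlation: the matched filter keeps returns carrying
# our code and suppresses the other unit's contribution, because
# long random codes are nearly orthogonal.
own_peak = np.dot(received, code_a) / CODE_LEN
cross_talk = np.dot(code_a, code_b) / CODE_LEN

print(f"own echo: {own_peak:.2f}, cross-talk: {cross_talk:.2f}")
```

The correlation against the unit's own code stays near 1, while the cross-correlation with the foreign code stays near 0, which is why two coded systems can face each other without corrupting each other's range measurements.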