Lucid Is Bringing AI-Enhanced 3D and Depth to Dual Camera Mobile and Smart Devices

Updated: 2018/6/26 16:02

Device manufacturers can now leverage simple dual cameras with Lucid’s core software solution to deliver highly accurate human-like 3D and depth perception

Mobile World Congress Shanghai – June 26, 2018 — Lucid, the maker of the first VR180 3D camera, LucidCam, announces it is scaling its core 3D software technology into more mobile and smart devices featuring dual or multi cameras. Lucid’s proprietary real-time 3D Fusion technology, which mimics how the brain processes and learns from what humans see with two eyes, will enable device manufacturers to leverage dual and multi cameras to capture 3D and depth.

This expansion comes as Lucid has become profitable and is in strong growth mode, with revenues up 10x from a year ago. “We see this as a unique opportunity where our technology syncs with the acceleration of the industry as dual cameras move into many more devices,” said Han Jin, CEO, Lucid. In mobile alone, 300 million phones shipped with dual cameras in 2017, and analysts predict 400 percent year-over-year growth in that segment, reaching 50 percent penetration by 2020.

Lucid’s pure software solution gives the many devices featuring dual cameras, such as mobile phones, robots, drones, VR/AR headsets and security cameras, human-like depth perception, eliminating the need for expensive and space-consuming hardware such as depth sensors, structured light and time-of-flight systems. “The way we as humans accurately perceive three dimensions and distances is not solely based on our two eyes but rather a combination of experience, learning and inference. As chips and servers begin to approach the processing power of our brains, we believe we can mimic this intelligence in software using AI and data on top of dual cameras,” said Jin.
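Lucid’s 3D Fusion algorithm itself is proprietary, but classical stereo vision illustrates the underlying principle of extracting depth from two cameras: a feature visible in both images shifts by a disparity that is inversely proportional to its distance. The sketch below (illustrative only, not Lucid’s method; the focal length and baseline figures are hypothetical) computes depth from disparity by triangulation.

```python
# Classical stereo triangulation: Z = f * B / d, where
#   f = focal length in pixels, B = baseline between lenses in meters,
#   d = disparity in pixels between the matched points in the two images.
# Illustrative only; Lucid's 3D Fusion layers AI-based refinement on top.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Return depth in meters for one matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical dual-camera phone: 1000 px focal length, 12 mm baseline.
f, B = 1000.0, 0.012
for d in (50.0, 20.0, 5.0):
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d, f, B):.2f} m")
```

Note how small disparities map to large depths: this is why a short phone baseline limits accuracy at range, and why software-side learning and inference can add value beyond pure geometry.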

Lucid is already incorporating its software into devices from several mobile phone, camera and robot makers, and is working with laptop, drone and chip makers to enhance their next-generation products. The company began its research into replicating human vision capabilities with the LucidCam, at a time when dual-camera devices were a small minority of the market. “The depth information captured through the addition of a second camera is what separates a device that can recommend you similar clothes from one that knows much more, such as the precise size, shape, style, fit and texture. This leads to a much bigger benefit for consumers, and thus belongs in many more products than just the LucidCam,” said Jin.

When Lucid was founded, Jin and his team began by creating robot eyes; rather than building out an entire robot, they decided to focus on the eyes as a camera. Today the company uses that embedded software technology to enhance any dual-camera device by inserting a unique “vision profile” during the manufacturing process. The vision profile is similar to the personalized prescription anyone receives from an optometrist after an eye exam. “Like a baseball player who can train to improve his vision, the vision profile in dual cameras will also continue to get better through continuous learning. In this way devices attain exceptional visual perception without sophisticated hardware,” said Jin.

Lucid’s 3D Fusion technology uses AI and machine learning coupled with historic data in the cloud to provide added vision intelligence to ultimately deliver the most accurate 3D/depth perception. This solution is ideal for any real-time 3D/depth enabled applications and supports even live feeds into holographic displays. The technology has been refined over the past three years, and has been integrated seamlessly into several devices which are already commercially available and selling in the millions of units globally. By offering a software only solution to serve the growing demand for 3D and depth in the industry, Lucid is helping manufacturers significantly reduce their hardware costs and development efforts.
