Software-Definable LiDAR Improves Safety, Reduces Vehicle Cost


The latest-generation cars integrate an increasing number of advanced features designed to improve vehicle safety and allow for more efficient and comfortable driving. Solutions such as advanced driver assistance systems (ADAS) and various levels of autonomous driving are now available on many vehicles, and they require advanced sensors that combine a high degree of accuracy with a competitive price.

Among these, LiDAR is an essential sensor for detecting, within its maximum operating range, everything outside the vehicle. By building an accurate map of the scene of interest in real time, LiDAR identifies objects or obstacles in the immediate vicinity of the vehicle, helping it avoid possible collisions with other vehicles, cyclists, pedestrians, or animals.

LiDARs, which may have electromechanical moving parts or be entirely solid state depending on the model, are notoriously among the most expensive on-board sensors integrated into vehicles. In this article, we present a LiDAR sensor with a particularly competitive cost (an order of magnitude lower than that of traditional devices), characterized by high accuracy and a customizable software component that extends the range of possible applications.

Leveraging a long history in the defense industry and its parent company's development of high-speed sensors and edge processing for active protection systems, PreAct Technologies was founded in 2018 with the aim of creating collision detection systems for the automotive market. In 2019, the company began working on ADAS and pre-crash applications with multiple OEMs and Tier 1 suppliers.

PreAct has not limited itself to developing a single sensor; it has created a complete hardware and software solution able to meet the requirements of multiple applications in the automotive industry and beyond. The fully solid-state LiDAR is a flash-type sensor that illuminates the entire scene with each ‘flash’, as opposed to more traditional LiDARs that use a laser to scan a scene line by line.

“We have developed a near-field flash LiDAR, but we consider ourselves more as a software company than a hardware company,” said Paul Drysch, founder and CEO at PreAct Technologies. “Rather than focus on manufacturing, our business model is to license our LiDAR reference design at a loss, perhaps even for free. We then provide control and application software-as-a-service that continually enhances the value proposition for our customers.”

PreAct is currently negotiating contracts with European and U.S. Tier 1 and Tier 2 suppliers to produce these units and sell them through their commercial channels along with the required software licenses. The sensor, named TrueSense, is shown in Figure 1. It is a fast, low-cost, accurate, and reliable continuous-wave time-of-flight flash LiDAR that uses the indirect time-of-flight (iToF) principle to measure the distance of each pixel simultaneously.
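The article does not detail PreAct's implementation, but the measurement principle behind continuous-wave iToF can be illustrated with a short sketch: the illumination is modulated at a fixed frequency, each pixel samples the returned signal at four phase offsets, and the phase shift between emitted and received light is converted into distance. The 10-MHz modulation frequency and four-sample phase recovery below are generic assumptions, not PreAct's actual parameters.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def itof_depth(a0, a1, a2, a3, f_mod=10e6):
    """Per-pixel distance from four phase-stepped CW-iToF correlation samples.

    a0..a3 are samples taken at 0, 90, 180, and 270 degrees of the modulation
    period (scalars or arrays, one value per pixel). f_mod is the assumed
    modulation frequency in Hz; it sets the unambiguous range c / (2 * f_mod).
    """
    phase = np.arctan2(a1 - a3, a0 - a2)        # phase shift, -pi..pi
    phase = np.mod(phase, 2 * np.pi)            # wrap to 0..2*pi
    distance = C * phase / (4 * np.pi * f_mod)  # round trip folded into one way
    return distance, C / (2 * f_mod)


# Synthetic example: a pixel 12.5 m away, sampled with 10-MHz modulation
true_d = 12.5
phi = 4 * np.pi * 10e6 * true_d / C
samples = [np.cos(phi - k * np.pi / 2) for k in range(4)]
d, max_range = itof_depth(*samples)
print(f"estimated {d:.2f} m (unambiguous up to {max_range:.1f} m)")
```

A single modulation frequency limits the unambiguous range (about 15 m at the 10 MHz assumed here), which is one reason iToF sensors are typically positioned as near-field devices or operated with multiple modulation frequencies.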

“It is our software that enables us to work outside in harsh ambient conditions. That’s because we’ve made some breakthroughs on reducing the noise floor and stabilizing the 3D point cloud that enables iToF to work outside of the cabin,” said Kurt Brendley, founder and COO at PreAct Technologies. “We will continually shrink the form factor and cost as we transition from an FPGA-based product to a programmable system-on-a-chip that continues to support our software-as-a-service business model.”

Today, near-field sensing (up to 25 meters) can be performed either with some combination of radar, ultrasound, and cameras, or with LiDAR systems. The first option is inexpensive but largely ineffective, because its frame rate and resolution are very low. Commercially available LiDARs offer high performance, but they are very expensive, slow, and of limited use in the near field.

TrueSense, on the other hand, is based on inexpensive time-of-flight imaging chips and LED emitters, includes a high-resolution RGB camera, and achieves sample rates of up to 200 fps. It is an automotive-grade sensor able to work even in bright sunlight and in the typical scenarios where time-of-flight sensing usually struggles.

TrueSense works in conjunction with two other PreAct products, TrueDrive and TrueSim, to provide customers with a complete object tracking solution. If TrueSense is the heart, TrueDrive is the brain of the near-field object detection and tracking solution. It converts sensor data acquired by one or more TrueSense sensors into integrated 3D point clouds that are used to rapidly define and track objects. Each TrueDrive can connect and synchronize up to four TrueSense units, enabling different configurations (see Figure 2). The device integrates a hyper ECU that uses parallel processing and targeted AI to quickly perform object definition and tracking, collision detection, volumetric measurements, and more.
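PreAct has not published TrueDrive's internals, but the first step it describes, merging the output of several synchronized sensors into one integrated 3D point cloud, amounts to transforming each sensor's points through its mounting pose into a common vehicle frame. The sketch below assumes a known extrinsic calibration (rotation and translation) per sensor; the poses and point data are made up for illustration.

```python
import numpy as np


def to_vehicle_frame(points, rotation, translation):
    """Transform one sensor's Nx3 point cloud into the common vehicle frame.

    rotation (3x3) and translation (length 3) describe the sensor's mounting
    pose on the vehicle, i.e. its extrinsic calibration.
    """
    return points @ rotation.T + translation


def fuse_point_clouds(clouds, extrinsics):
    """Merge per-sensor clouds (list of Nx3 arrays) into one integrated cloud."""
    merged = [to_vehicle_frame(c, R, t) for c, (R, t) in zip(clouds, extrinsics)]
    return np.vstack(merged)


# Hypothetical example: one sensor on the front bumper, one on the left fender
front = (np.eye(3), np.array([3.8, 0.0, 0.5]))
left = (np.array([[0.0, -1.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0]]), np.array([2.0, 0.9, 0.8]))
cloud = fuse_point_clouds([np.random.rand(100, 3), np.random.rand(100, 3)],
                          [front, left])
print(cloud.shape)  # (200, 3): both sensors' points in the vehicle frame
```

In a real system the interesting work starts after this step (time synchronization, motion compensation, and the AI-based object definition the article mentions), but a common reference frame is the prerequisite for all of it.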

TrueSim is a physics-based vehicle simulator that accurately models PreAct sensors and algorithms. It allows PreAct to simulate the sensors on a customer’s platform within a high-fidelity virtual environment. TrueSim is offered with a large library of vehicles, pedestrians, bicycles, traffic signs, maps, and more. Custom objects can be defined and added to the library as well. By offering accurate optics models (such as simulating retro-reflectors), TrueSim enables fast algorithm prototyping and ensures sensors will work for custom applications.

“With our own in-house simulation tool that emulates our LiDAR performance in a realistic environment, we can assess our ability to meet the customer’s requirements and, in parallel, we can start writing the perception software that would be plugged into that LiDAR,” Drysch said.

Due to its high accuracy, PreAct’s software-definable LiDAR is also suitable for robotics and factory automation. In those applications, one or more TrueSense units can replace existing sensors, simplifying integration and reducing overall costs.

Among the Tier 1 suppliers collaborating with PreAct is ZF Friedrichshafen. The two companies showed off some of their work last year at Plug and Play’s Startup Autobahn, which included an automatic door actuator. Basically, you walk up to your car and perform a gesture, and PreAct’s LiDAR, mounted on the vehicle, detects not only your gesture but also whether there is an obstacle (e.g., a pole, trash can, or fire hydrant) or oncoming traffic (a pedestrian, bicyclist, car, or motorcycle) that could cause damage or prevent the door from opening safely. The same sensor can be used for curb detection, lane change assist, and more.
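Neither PreAct nor ZF has published the actuation logic behind the door demo, but the decision described, open only when a gesture is recognized and nothing would be hit, can be sketched as a simple gate over the tracked objects the sensor reports. The class, field names, and thresholds below are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class TrackedObject:
    label: str         # e.g. "pedestrian", "cyclist", "pole"
    distance_m: float  # distance from the door edge
    speed_mps: float   # closing speed toward the door, 0 for static objects


def door_may_open(gesture_detected, tracked_objects,
                  swing_clearance_m=1.1, min_time_to_contact_s=3.0):
    """Decide whether an automatic door may open.

    The door opens only if the user's gesture was recognized, nothing sits
    inside the door's swing arc, and no moving object would reach the door
    sooner than the chosen time-to-contact threshold.
    """
    if not gesture_detected:
        return False
    for obj in tracked_objects:
        if obj.distance_m < swing_clearance_m:  # e.g. a pole or fire hydrant
            return False
        if obj.speed_mps > 0 and obj.distance_m / obj.speed_mps < min_time_to_contact_s:
            return False                        # e.g. an approaching cyclist
    return True


# Example: gesture recognized, but a cyclist 6 m away closing at 4 m/s
print(door_may_open(True, [TrackedObject("cyclist", 6.0, 4.0)]))  # False
```

A production system would of course work from the fused point cloud and the tracker output rather than hand-built objects, but the gating idea is the same.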

“As one would expect for a software-defined system with the capabilities of our flash LiDAR, we are constantly responding to customer demands in fields extending from automotive to robotics and manufacturing. In addition, with software providing the basis for our solution to real-world operational problems, such as outdoor operations, it can be and is being continually modified to increase base performance,” Drysch said.

Maurizio Di Paolo Emilio has a Ph.D. in Physics and is a Telecommunications Engineer. He has worked on various international projects in the field of gravitational wave research, designing a thermal compensation system, X-ray microbeams, and space technologies for communications and motor control. Since 2007, he has collaborated with several Italian and English blogs and magazines as a technical writer, specializing in electronics and technology. From 2015 to 2018, he was the editor-in-chief of Firmware and Elettronica Open Source. Maurizio enjoys writing and telling stories about Power Electronics, Wide Bandgap Semiconductors, Automotive, IoT, Digital, Energy, and Quantum. Maurizio is currently editor-in-chief of Power Electronics News and EEWeb, and European Correspondent of EE Times. He is the host of PowerUP, a podcast about power electronics. He has contributed to a number of technical and scientific articles as well as a couple of Springer books on energy harvesting and data acquisition and control systems.

Two remarks: More like a stereo camera than a LiDAR. No problem with that. But 20 m of sight at 200 km/h (as claimed) means 360 ms, one third of a second of reaction time! What can be done in that time? Steering? At 200 km/h, lateral movement is rather limited. Braking? Top-notch sports cars with cup tyres reach 11 m/s² on dry tarmac. OK, that reduces the impact speed to 186 km/h, in theory, ignoring the mechanical reaction time of the brakes; in practice, more like 190 km/h. Pre-fasten seatbelts? Even at 190 km/h, of limited use. OK, 20 m is sufficient for many urban traffic scenarios. That's it. /Carsten
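For readers who want to check the arithmetic in the comment above, it holds under the stated idealized assumptions (constant 11 m/s² deceleration, no brake actuation delay); a quick sketch:

```python
import math

v0 = 200 / 3.6    # 200 km/h in m/s (about 55.6 m/s)
sight = 20.0      # assumed sensor range in m
decel = 11.0      # braking deceleration in m/s^2 (dry tarmac, cup tyres)

reaction_window = sight / v0                               # time until impact
v_impact = math.sqrt(max(v0**2 - 2 * decel * sight, 0.0))  # v^2 = v0^2 - 2*a*d
print(f"{reaction_window * 1000:.0f} ms window, impact at {v_impact * 3.6:.0f} km/h")
# -> roughly a 360 ms window and an impact speed of about 185 km/h,
#    in line with the figures in the comment
```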

Have just re-read the article after reading your comment, and I don't see 200 km/h mentioned; I see 200 fps. Otherwise I agree with your comment that at that speed there is not a lot to be done with a 20 m sensor range.

