The world through the eyes of a car: how do driverless vehicles see it?
Progress in driverless vehicles has accelerated rapidly in recent years. Since December ? 2018, autonomous cars have been allowed to drive on public roads in Moscow and Tatarstan. It seems that just a little more and we will be able to get into a car and go about our business while it handles the entire process of driving. But dreams aside, what is such a car really capable of, and will it displace the human driver?
The Toshiba Visconti processor handles images from four cameras, evaluating them against many criteria at once: lane markings, moving and parked cars, traffic lights and signs, headlights, pedestrians, and cyclists. After detecting and classifying objects in the video, the processor passes the information to the car's "brain", whose autopilot then decides on the optimal behavior. This is how an advanced driver assistance system (ADAS) works, preventing collisions with vehicles and pedestrians (a criterion of Level 2 autonomy).
The combination of cameras and the Toshiba Visconti processor monitors the traffic situation better and more attentively than a human. Source: Toshiba
In past generations of Toshiba Visconti, the full cycle from receiving an image to issuing recognition results took up to 100 ms. In Visconti ? the cycle was reduced to 50 ms. By comparison, a driver's reaction time is at best about 500 ms. In that time, a car traveling at 80 km/h covers 11 meters: a long distance when a dangerous situation unfolds on the road.
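The arithmetic here is easy to verify. A short sketch in Python (the `distance_covered` helper is illustrative; the speeds and delays are the ones quoted above):

```python
def distance_covered(speed_kmh: float, delay_s: float) -> float:
    """Meters travelled during a reaction or processing delay."""
    return speed_kmh / 3.6 * delay_s  # km/h -> m/s, then multiply by time

# Human driver: best-case 500 ms reaction at 80 km/h
print(distance_covered(80, 0.5))   # ~11.1 m, as stated above

# Visconti processing cycles: 100 ms (older generations) vs 50 ms
print(distance_covered(80, 0.10))  # ~2.2 m
print(distance_covered(80, 0.05))  # ~1.1 m
```

The gap between a half-second human reaction and a 50 ms processing cycle is the entire safety argument: the chip "notices" a hazard ten meters earlier than the driver would.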
Visconti also solves the problem of monocular vision: the processor can build a three-dimensional reconstruction of the space by analyzing the sequence of frames as the car moves. This works for both moving and stationary objects, on the roadway and beyond it.
Cameras cannot recognize distant objects or build detailed maps, and their performance depends directly on weather conditions. These shortcomings can be compensated by radars, which emit radio signals at frequencies of tens of gigahertz and are excellent at locating obstacles in space. Radars operating at 24 GHz and 77 GHz are already used in expensive ADAS systems for early braking when a pedestrian or another car is detected crossing the vehicle's path. Unlike cameras, radar has a very narrow field of view, inversely proportional to the desired range. In addition, radar is costly (on the order of $ ?000), which immediately limits its use to executive and premium cars.
Radars do an excellent job of localizing objects, but without determining their shape, and only within a narrow field of view.
Lidars are considered the most effective, but at the same time the most controversial, sensors for autonomous cars. They build a detailed picture of the surrounding world using laser beams that reflect off obstacles and return, and they do so with an accuracy inaccessible to other sensors. With a lidar, a car creates its own 3D map for tens of meters around, recognizing cars, people, and any obstacles.
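The principle behind that 3D map is round-trip timing: the lidar measures how long a laser pulse takes to reach an obstacle and return, then halves the light-travel distance. A minimal sketch (the function name is mine, not from any vendor's API):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to an obstacle from a lidar pulse's round-trip time."""
    return C * round_trip_s / 2  # the pulse travels there and back

# A pulse returning after ~1.33 microseconds came from roughly 200 m away,
# the upper end of the effective range mentioned later in the article.
print(tof_distance(1.334e-6))
```

Because a full sweep fires hundreds of thousands of such pulses per second in all directions, the car accumulates a dense point cloud rather than a single range reading.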
This is how a car with a lidar sees the world
However, lidar has more disadvantages than advantages. First, lidars become helpless in heavy rain or snowfall: the laser beams are reflected by raindrops and snowflakes. Second, a lidar needs a full circular view, which means a "hump" on the roof of the car. Third, lidars are not just expensive but very expensive: early Velodyne production models cost 75 thousand dollars, and modern Waymo units cost ?500 dollars.
The Velodyne lidar lineup. Source: Velodyne
The emergence of "solid-state" lidars without moving parts should reduce the cost of these devices by orders of magnitude in the coming years. Velodyne claims to have made a breakthrough that will bring the price of a lidar down to $50.
Toshiba, in turn, is working to improve the efficiency of lidars. This year it presented a new chip, the 3r3190, which combines circuits for analyzing data at both long and short distances. This doubled the effective range of lidars to 200 m and eliminated the glare problem that degraded the quality of reflections.
How it works in practice
To implement its autopilot, Tesla installs a system of eight cameras with different viewing angles and ranges, 12 ultrasonic sensors around the perimeter, and a long-range front radar. The ultrasonic sensors detect cars in adjacent lanes and obstacles at low speeds; the cameras search for pedestrians, cars, markings, and signs, with the radar assisting them. GPS handles routing, while the sensors keep the car strictly within its lane and away from accidents. On the one hand, this allows the Tesla autopilot to be used in any city; on the other hand, the autopilot still requires the driver's attention to operate.
Tesla deliberately does not use lidar: Elon Musk openly opposes lidars, citing their price and poor performance in bad weather. It is hard to disagree with him: an extra 7-10 thousand dollars on the price tag and a "hump" on the roof would not add to a Tesla's appeal.
However good the combination of cameras, radar, and ultrasonic sensors may look, it does fail. In 201? a Tesla Model S in autopilot mode crashed into a road divider, killing the driver. As an investigation by electric-car owners showed, the Tesla autopilot could not correctly read the worn-out lane markings, while the cameras and radar saw no danger in the rapidly approaching steel barrier.
The Waymo system uses a lidar, five radars, eight cameras, and GPS; Chrysler Pacifica Hybrid minivans (currently 60? 62 thousand units planned) and the Jaguar I-PACE (20 thousand units planned) were chosen as its serial commercial carriers.
The electric Waymo Jaguar I-PACE is not as utilitarian as the spacious Chrysler Pacifica, but it looks amazing: even the lidar on the roof does not spoil the view. Copyright: Waymo
While driving, the Waymo system uses Google Street View data, cross-checking it against its sensors. This ensures full autonomy: unlike Tesla's cars, Waymo's really do not require driver intervention and simply carry passengers. Also unlike Tesla, Waymo sells not cars but a transportation service, that is, a robotaxi.
A panoramic Waymo video helps you understand how an autonomous car recognizes the surrounding space.
The main disadvantage of Waymo is the extremely limited list of cities where its driverless cars operate: for the autopilot to function correctly, the urban environment must first be captured in 3D, a long and complicated procedure. For now, Waymo operates in only about two dozen American cities. Expanding coverage of the road network is only a matter of time. A lot of time.
Yandex presented its driverless vehicle project just a year ago. A Toyota Prius was equipped with a set of components typical for an autonomous car: a lidar, cameras, radar, GPS, and an IMU. From Yandex, the car received a software platform that performed well both on the cramped streets of Moscow's Khamovniki district and on a long trip from Moscow to Kazan.
The Yandex car that reached Tatarstan stayed there, becoming the first driverless taxi in Russia. It now operates in the city of Innopolis, carrying passengers between five main points. In October, a similar taxi appeared on the territory of Skolkovo. In the longer term, the company plans to bring driverless taxis to city streets on a commercial basis.
In 201? the state institute NAMI showed the ShATL, a driverless "minibus" that was then no more than an experimental concept car. Two years later, the KAMAZ-1221 ShATL was announced as a future serial project to be put into production in 2022. The mini electric bus with lidars, cameras, and ultrasonic sensors still moves gently at 10 km/h, but as the software platform improves, its developers promise to raise the speed to 110 km/h.
What will driverless cars change?
Removing the human factor will allow the limits of car operation to be pushed: raising the maximum speed, narrowing lanes, and shortening the distance between cars in traffic. As a result, road capacity will increase significantly, average speeds will rise, and the number of traffic jams will fall.
According to the American Highway Capacity Manual, one highway lane passes about ?200 human-driven cars per hour. Various studies suggest that switching to autonomous cars would raise this figure to 7200-12000 cars per hour. Such an impressive leap in road-use efficiency comes from raising the safe speed and shrinking the gap between cars in a lane from 40-50 meters to 6-7 meters: for cars that exchange data about their speed and intended maneuvers, this distance is enough for safe movement.
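These capacity figures follow from simple headway arithmetic: each car occupies its own length plus the gap ahead of it, and lane capacity is how many such slots pass a point per hour. A rough sketch (the 4.5 m car length and 100 km/h speed are my illustrative assumptions, not figures from the manual):

```python
def lane_capacity(speed_kmh: float, gap_m: float, car_length_m: float = 4.5) -> float:
    """Vehicles per hour for one lane: lane-meters travelled per hour
    divided by the meters each vehicle occupies (gap + body length)."""
    meters_per_hour = speed_kmh * 1000
    return meters_per_hour / (gap_m + car_length_m)

# Human drivers keeping ~45 m gaps at 100 km/h
print(round(lane_capacity(100, 45)))   # ~2000 vehicles/hour

# Networked autonomous cars keeping ~6.5 m gaps at the same speed
print(round(lane_capacity(100, 6.5)))  # ~9000, inside the 7200-12000 range
```

The model ignores merging, lane changes, and speed variance, but it shows where the four- to six-fold gain comes from: almost all of it is the shorter following distance, not higher speed.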
However, such a driverless future is still far off. Production cars from well-known automakers have only just mastered Level 2 autonomy, and the best and most expensive models are preparing to step up to Level 3. As for Level 5, it is not worth even dreaming about it in the next decade: humans will remain the main actors on the road for a long time yet.