Shenyi Translation Bureau is 36Kr's compilation team. It focuses on technology, business, the workplace, and lifestyle, with an emphasis on introducing new technologies, new ideas, and new trends from abroad.
Editor's note: The COVID-19 pandemic has forced most businesses to suspend operations. Has the development of autonomous vehicles been affected as well? Road tests that require on-site personnel certainly cannot be carried out, but many autonomous vehicle companies have shifted their work to simulation. KYLE WIGGERS takes stock of these efforts. The original article was published on VentureBeat under the title "The challenges of developing autonomous vehicles during a pandemic." Because of its length, we are publishing it in two parts; this is the second half.
Related reading: The challenges of developing autonomous vehicles during the COVID-19 pandemic (Part 1)
Aurora, a self-driving car company founded by former Waymo engineer Chris Urmson, says its Virtual Testing Suite platform now runs more than 1 million tests per day. The platform and related tools let company engineers quickly identify, review, and triage most incidents and interesting road conditions, convert them into virtual tests, and then run thousands of those tests to evaluate every change to the main codebase.
The Virtual Testing Suite includes codebase testing, perception testing, manual driving evaluation, and simulation. Engineers write unit tests (for example, checking whether a method that computes speed returns the correct answer) and integration tests (for example, checking whether that same method works correctly with other parts of the system). New work must pass all relevant tests before it can be merged into the larger codebase, so engineers can identify and fix problems early.
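To make the distinction concrete, here is a minimal sketch of what such a unit test and integration test might look like. The function names (`compute_speed`, `estimate_stopping_distance`) and the test harness are hypothetical illustrations, not Aurora's actual code:

```python
import math

def compute_speed(p1, p2, dt):
    """Hypothetical helper: average speed (m/s) between two (x, y)
    positions sampled dt seconds apart."""
    if dt <= 0:
        raise ValueError("dt must be positive")
    return math.dist(p1, p2) / dt

def estimate_stopping_distance(speed, decel=7.0):
    """Hypothetical downstream consumer of compute_speed:
    distance (m) to brake to a stop at constant decel (m/s^2)."""
    return speed ** 2 / (2 * decel)

# Unit test: does compute_speed give the right answer in isolation?
def test_compute_speed():
    assert compute_speed((0, 0), (10, 0), 2.0) == 5.0  # 10 m in 2 s

# Integration test: does compute_speed still behave correctly when
# another part of the system consumes its output?
def test_stopping_distance_uses_speed():
    speed = compute_speed((0, 0), (0, 28), 2.0)        # 14 m/s
    assert abs(estimate_stopping_distance(speed) - 14.0) < 1e-9

test_compute_speed()
test_stopping_distance_uses_speed()
print("all tests passed")
```

The unit test pins down the method's own arithmetic; the integration test catches mistakes that only surface when its output flows into a neighboring component.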
Aurora has created a series of specialized simulated perception tests based on actual log data. The company says it is developing "highly realistic" sensor simulations so that tests can be generated for rare, high-risk situations. Other experiments it regularly runs with the Virtual Testing Suite include evaluations of the Aurora Driver (Aurora's full-stack autonomous driving platform) to see how it performs across a series of driving benchmarks.
Regardless of the nature of the test, Aurora's custom-built tools automatically extract information from its log data (for example, pedestrian walking speed) and feed it into the various simulation models, saving engineers time.
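A sketch of what that kind of pipeline could look like: pull a parameter (here, walking speed) out of tracked-object log records and inject it into a scenario configuration. The log schema, field names, and scenario keys are all hypothetical; Aurora's actual formats are not public:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class TrackedObject:
    """Hypothetical log record: one observation of a tracked object."""
    object_id: str
    kind: str         # e.g. "pedestrian", "vehicle"
    timestamp: float  # seconds
    x: float          # metres
    y: float

def pedestrian_speeds(log):
    """Group pedestrian observations by id and estimate each
    pedestrian's average speed from consecutive positions."""
    tracks = {}
    for obs in log:
        if obs.kind == "pedestrian":
            tracks.setdefault(obs.object_id, []).append(obs)
    speeds = {}
    for oid, obs_list in tracks.items():
        obs_list.sort(key=lambda o: o.timestamp)
        step_speeds = []
        for a, b in zip(obs_list, obs_list[1:]):
            dt = b.timestamp - a.timestamp
            if dt > 0:
                dist = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
                step_speeds.append(dist / dt)
        if step_speeds:
            speeds[oid] = mean(step_speeds)
    return speeds

# Feed the extracted parameter into a (hypothetical) scenario config.
log = [
    TrackedObject("ped-1", "pedestrian", 0.0, 0.0, 0.0),
    TrackedObject("ped-1", "pedestrian", 1.0, 1.4, 0.0),
    TrackedObject("ped-1", "pedestrian", 2.0, 2.8, 0.0),
]
scenario = {"pedestrian_walk_speed_mps": pedestrian_speeds(log)["ped-1"]}
print(scenario)  # ped-1 walks at roughly 1.4 m/s
```

The point of automating this step is exactly what the article describes: engineers get realistic scenario parameters from recorded driving instead of guessing them by hand.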
Above: Visualization of real driving data from an Aurora test car.
The company says that in the months since Aurora suspended all physical testing, its vehicle operators have been working with the triage and labeling team to mine manual and autonomous driving data for road events and convert them into simulated virtual tests. Aurora also says it is developing new tools, such as web applications designed to make simulation easier for engineers, while enhancing existing approaches to support the creation of new test scenarios.
An Aurora spokesperson also said the company's engineers are continuing to develop and improve its onboard map, Aurora Atlas, chiefly for the areas where the Aurora Driver will operate once road testing resumes. They are also adding new maps to Cloud Atlas, a database built specifically to store Atlas data, which can use machine learning models to automatically generate annotations for traffic lights and other features.
Advances in artificial intelligence and machine learning have made it easier to teach driverless cars to handle roads they have never seen before. Researchers at MIT's Computer Science and Artificial Intelligence Laboratory recently published a paper describing an approach similar to Aurora's: Virtual Image Synthesis and Transformation for Autonomy (VISTA), a photorealistic simulator that uses only a real-world corpus to synthesize vehicle trajectories from potential viewpoints. VISTA can train a model that can then navigate a vehicle on a road it has never seen, even when the car is near a simulated collision.
Urmson said in a statement: "In the long run, we do not expect COVID-19 to slow our progress, thanks largely to our investment in virtual testing. But the pandemic has also shown the urgent need for autonomous driving that can move people and goods safely and quickly without a human at the wheel. So we are more committed to our mission than ever. We will continue to recruit experts in every field, continue to pay everyone in the company, and do everything possible to advance the development of the Aurora Driver. As long as our industry works together, ingenuity, dedication, and thoughtful leadership will carry us through this challenging period."
Uber's Advanced Technologies Group (ATG), the pioneer of Uber's autonomous vehicle program, currently has a team that is still expanding the test set in Uber's simulator based on test-track and on-road behavior data. Adrian Thompson, head of systems engineering and testing at ATG, said that whenever engineers make any adjustment to the autonomous driving software, the entire suite of simulation tests is automatically re-run.
ATG engineers also have their own tools, such as DataViz, a web application developed by ATG and Uber's data visualization team that shows how a car in a simulated environment interprets and perceives the virtual world. DataViz renders elements such as cars, ground imagery, lane markings, and signs realistically, while information generated by algorithms (such as object classification, prediction, planning, and look-ahead) is shown as abstract representations through color and geometric encoding. Combining the figurative and abstract views lets employees inspect and debug information collected through offline and online testing, and explore it while creating new scenarios.
Above: Uber's Autonomous Visualization System, a web-based vehicle data platform.
Thompson said Uber's decision two years ago to accelerate the development of its modeling and simulation tools is paying off. In some cases, he said, the company can start from more than 2 million miles of sensor logs and, combined with simulation, complete "most of" its AI training and verification work.
Thompson said: "Our AI model development has been largely unaffected by our inability to go on the road. Our test-track testing exists to validate models, so during this period we have maintained our original development pace, if not accelerated it."
It is perhaps unsurprising that, according to Thompson, the number of virtual miles driven in Uber's simulation environment has increased compared with before the pandemic. He did not explicitly attribute this to the health crisis, but said COVID-19 provided an opportunity to keep expanding simulation.
"We had formulated a comprehensive strategic plan to further expand our simulation mileage. Our model-based development approach has made our operations more robust during this pandemic. Admittedly, there was some luck involved," he added. "For the foreseeable future, we will continue to rapidly expand our simulation capabilities. Even after this epidemic, we have no plans to reduce our simulated mileage."
When Lyft was forced to halt all physical road testing, it was in the middle of developing a new vehicle platform. Even so, Jonny Dyer, engineering director of Lyft's Level 5 autonomous driving division, said the company is "doubling down" on simulation, using data from roughly 100,000 miles its own driverless vehicles have driven in real environments to calibrate the simulated environment before verification.
Specifically, Lyft is improving the technology its simulations use to make agents (such as virtual pedestrians) react realistically to vehicles, partly with the help of AI and machine learning models. It is also developing tools such as a benchmarking framework that lets engineers compare behavior detectors and improve performance, and a dashboard with dynamically updating visualizations to help create diverse simulation content.
Dyer said Lyft's focus is not on challenges such as simulating camera, lidar, and radar sensor data. Instead, the team concentrates on traditional physics-based mechanisms and on methods for identifying the right parameter sets to use in simulation. He said: "This isn't about playing a large-scale game with a simulation model; it's about simulating ordinary driving with high fidelity. We focus on fidelity to ensure that what happens in simulation stays close to real driving conditions. It's not just a question of simulated mileage, but of the correctness of the simulation."
Dyer said Lyft has also redesigned its verification strategy to allow more evaluation of structural and dynamic simulations during the outbreak. The company had originally planned to conduct physical tests before these steps (and such tests will still be part of the process), but stay-at-home orders forced its hardware engineers to shift their work from hardware engineering to simulation.
For example, one senior compute engineer installed a high-performance server in her bedroom to run Lyft's self-driving technology stack; the machine has eight graphics cards and a powerful x86 processor, cooled by four desktop fans. Another engineer built an electrolytic etching rig in his garage using a Raspberry Pi and circuit boards bought on eBay. Yet another turned the cricket pitch in his backyard into a test range for lidar sensors, using full-size road signs for calibration in support of Lyft's plan to integrate as many new sensors as possible.
Despite the great efforts autonomous vehicle companies have made in response to the COVID-19 crisis, few seem likely to escape the epidemic unscathed. Some experts assert that simulation simply cannot replace real-world drive testing.
A long-standing challenge for simulation built on real data is that every scene must respond to the self-driving car's movements, even movements the original sensors never recorded. Angles and perspectives not captured in any photo or video must likewise be rendered or simulated with predictive models, because simulations have historically relied on computer-generated graphics and physics-based rendering, which represent only rough approximations of the world. (Notably, even the British startup Wayve, which trains its driverless models primarily in simulated environments, relies on feedback from safety drivers to fine-tune those models.)
A paper by researchers at Carnegie Mellon University outlines other challenges facing simulation as a substitute for real-world hardware development:
Reality gap: A simulated environment does not always adequately represent physical reality. For example, a simulation without an accurate tire model may fail to account for how a car actually behaves when cornering at high speed.
Resource cost: The computational overhead of simulation requires dedicated hardware such as graphics cards, which can drive up cloud computing bills. According to a recent report by Synced, training a state-of-the-art machine learning model such as the University of Washington's Grover, which generates and detects fake news, can cost more than $25,000 over two weeks.
Reproducibility: Even the best simulators may contain nondeterministic elements that make test results impossible to reproduce.
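The reproducibility problem above is commonly mitigated by pinning every source of randomness to a fixed seed, so that a failing scenario can be replayed exactly. A toy sketch (the "simulation" here is a hypothetical stand-in, not any company's actual code):

```python
import random

def simulate_crossing(seed=None):
    """Toy nondeterministic 'simulation': a pedestrian either crosses
    or waits, with a randomly drawn walking speed. Without a fixed
    seed, repeated runs of the same test can diverge."""
    rng = random.Random(seed)      # isolated, seedable RNG
    crosses = rng.random() < 0.5
    speed = rng.uniform(0.8, 2.0)  # m/s
    return crosses, round(speed, 3)

# Unseeded runs may differ from one another...
a, b = simulate_crossing(), simulate_crossing()

# ...but seeded runs are bit-for-bit repeatable, so a failing
# scenario can be replayed and debugged exactly.
assert simulate_crossing(seed=42) == simulate_crossing(seed=42)
print("seeded runs reproduce:", simulate_crossing(seed=42))
```

Seeding only addresses randomness the test controls; nondeterminism from threading, hardware timing, or floating-point reduction order is harder to pin down, which is part of why the challenge persists.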
Indeed, Yandex, which still conducts live road testing in areas where it is permitted (such as Moscow), points out that while simulation helps in developing autonomous vehicles, public testing remains crucial. The company says that shifting research entirely to simulation, without drive testing, would slow the development of autonomous vehicles in the short term, because building simulations with 100% accuracy and complexity would require as many resources as developing the autonomous driving technology itself.
A Yandex spokesperson said: "[Without real-world testing,] autonomous driving companies cannot collect important real driving data. [In addition,] simulated driving and driving on a test track can help prove that a vehicle meets the specific requirements of a laboratory environment. But when driving on public roads, an autonomous driving platform must face far more complex real-world dynamics, including different weather conditions and varied pedestrian and driver behavior."
Beyond exposing an automated driving system to these complex dynamics, Timothy B. Lee of Ars Technica points out that testing ensures the failure rate of sensors and other hardware is low, that the car chooses safe places to drop off passengers, and that fleet operators are well trained to handle any emergency. Testing also lets a company identify potential operational problems, such as whether enough vehicles are available for rush-hour service.
Dyer does not entirely disagree with these views, but on the whole he is more optimistic about the prospects of simulation testing. He said simulation is well suited to structural and functional testing against test-track data, which makes up a large part of Lyft's autonomous driving roadmap.
He said: "In fact, all simulation is somewhat limited, because you have to use reality to calibrate and validate simulated driving. ... So simulation won't replace road testing any time soon, [because] you can't do everything in simulation. But what I can confirm is that we are making great progress in the simulated environment, and there the epidemic has had no impact at all. Large-scale engineering projects inevitably accumulate things like technical debt and infrastructure in need of repair, which are very hard to fix while operations are running. I think if we invest in these now, we will see great returns once we restart."
Brian Collie, a senior partner and managing director at Boston Consulting Group, is among the skeptics; he believes the epidemic will delay the commercialization of driverless car technology by at least three years. Ford appears to have hinted as much, announcing that it would postpone the launch of its self-driving car service until 2022. The automaker has been partnering with Argo AI, testing its go-to-market strategy through pilot programs with Postmates, Walmart, Domino's, and local partners.
Karmel concedes there may be some bumps along the way, especially with Waymo's testing suspended, but he said confidently that the outbreak has not had a substantial impact on the planned launch.
Karmel said: "If you focus only on synthetic mileage and never validate it in the real world, it is hard to see what actual progress we have made. Either way, what we have to do is keep learning — even during the epidemic, we still have thousands of years of [driving] experience to draw on."