When Making Autonomous Vehicles Smart and Safe, It Helps to Have the Right Tools

Siemens Digital Industries Software has sponsored this post.

“Hop in your car—it’s taking you to Starbucks!”

That used to sound like a line from a sci-fi movie, but given the progress being made towards driverless automotive systems, the technology seems increasingly inevitable to the public.

Autonomous vehicles (AVs) have enormous potential. However, that potential is predicated on an AV's ability to sense and perceive its surroundings and make decisions independently in real time. It falls to engineers to endow AVs with those abilities while assuring passenger safety. The pressure is on.

With the right hardware and software, AVs can protect passengers by responding appropriately to unpredictable scenarios. The critical hardware component is the artificial intelligence (AI) chip, which needs to be customized for different usage patterns and conditions. In addition, the software needs to synergize with the AI chip to achieve the fastest and most accurate analytics and decision-making (inference). Lastly, after the system is trained and optimized with specific simulations, it must be rigorously validated.

Autonomous vehicle development involves many moving parts, including actuators and a vast array of sensors such as LiDAR and radar. AI chips are essential for connecting and integrating these sensors and actuators, and the software stack rides on top of the chips. AI chips take in data from the car's sensors and process it in software so that the system can make accelerating, steering and braking decisions.
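As a deliberately simplified cartoon of that flow, the sketch below fuses two hypothetical range readings and picks a longitudinal action. Every name, weight and threshold in it is invented for illustration; it bears no relation to any production stack.

```python
# A toy sense-process-act loop. All names, weights and thresholds are
# hypothetical stand-ins; real stacks fuse far more data on dedicated AI silicon.

def fuse(camera_gap_m, radar_gap_m):
    """Blend two range estimates, weighting radar slightly higher."""
    return 0.4 * camera_gap_m + 0.6 * radar_gap_m

def decide(gap_m, speed_mps, safe_time_s=2.0):
    """Maintain at least a two-second following gap."""
    if gap_m < speed_mps * safe_time_s:
        return "brake"
    if gap_m > speed_mps * (safe_time_s + 1.0):
        return "accelerate"
    return "hold"

frame = {"camera_gap_m": 28.0, "radar_gap_m": 30.0, "speed_mps": 20.0}
gap = fuse(frame["camera_gap_m"], frame["radar_gap_m"])
print(decide(gap, frame["speed_mps"]))  # "brake": 29.2 m is under the 40 m gap
```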

This article will describe how Xcelerator, the digital design and networking portfolio from Siemens, enables engineers to develop autonomous vehicle technology, from Chip to City.

Why It’s Necessary to Customize AI Chips

Different vehicles have different usage patterns and operate in various environments. For example, family sedans and carrier trucks have different drivers, operating hours and routes. In addition, different vehicles contain and integrate assorted product features or sensors. Vehicle security requirements vary, as well. Therefore, each type of vehicle requires a specifically customized AI chip that can effectively collect and integrate the data for analytics.

Customizing the AI Chip and Choosing the Right Software

Developing and testing AI chips can be challenging. After the chip is designed and built, it needs to be tested through simulations in a virtual driving environment. However, traditional testing methods are inadequate for comprehensive environment testing.

Xcelerator provides software from leading suppliers to make this development job easier. It combines the Siemens Digital Innovation Platform with MindSphere, Siemens Cloud Solutions, Mentor Solutions and Mendix to allow for fast and easy construction, integration and extension of existing data and networking systems. For example, within the Xcelerator portfolio, the Catapult high-level synthesis (HLS) flow can drive chip design and shorten the entire design and verification process by 50 percent.

Testing the Chips

Siemens' PAVE360 solution can provide a comprehensive digital twin, which is a virtual representation that incorporates a product's mechanical, electrical and software aspects. By incorporating simulation, data analytics and AI capabilities, a digital twin can go beyond testing the functions of processors by providing a holistic view of the behavior of autonomous vehicles.

PAVE360 provides a 360-degree environment in which simulated self-driving cars operate; it includes automation hardware and software subsystems, full vehicle models, sensor data fusion, traffic flows and smart city simulations. In addition, PAVE360 can employ AI techniques to generate synthetic traffic conditions to extend testing scenarios.

The PAVE360 platform supports cross-ecosystem collaboration among carmakers, chipmakers, tier-one suppliers, software developers and other vendors. Sharing ideas and best practices can happen throughout design, development, customization, construction and testing. The result is a scalable, closed-loop design-simulation-emulation solution.

Vehicle Validation is Key

After the vehicle control system's hardware and software components are integrated, they must be tested further and validated via simulation. Simulation is then applied to train the system software so that it performs analytics and decision-making accurately when the autonomous vehicle is in the field. Simulation can also help optimize the synergy between hardware and software.

While real-world autonomous vehicle testing has increased over the years, the industry is still inching towards fully autonomous driving—Level 5 in the hierarchy outlined by the Society of Automotive Engineers (SAE):

  • Level 0: Manual driving
  • Level 1: Driver assistance
  • Level 2: Partial driving automation
  • Level 3: Conditional driving automation
  • Level 4: High driving automation
  • Level 5: Full driving automation

Even today, Level 2 and 3 deployments remain few and far between. Several high-profile crashes have increased the public’s skepticism about the feasibility of Level 4 or 5 driving, and the industry is far away from mass-producing Level 4 vehicles.

While testing and validation can prevent potentially costly in-field accidents, testing bottlenecks are hampering these efforts. It is not physical testing that is causing the bottlenecks, but rather virtual simulation and training. One reason is the sheer number of scenarios that require testing. Another is that autonomous driving differs from other simulation domains in that its success depends on the quality of the data used to train the vehicles.

Siemens Simcenter Prescan360 software is an off-the-shelf, mass-scale solution for validating and verifying autonomous vehicles.

Scenario and Vehicle Modeling

Before releasing autonomous vehicles to the general public, it’s essential to simulate how they will perform under different scenarios. Such simulation helps to ensure the vehicles will behave as intended and make accurate and reliable decisions independently.

At the highest level of automation, an autonomous vehicle is expected to model the behavior of an alert and non-aggressive driver. On top of that, the autonomous vehicle will use probabilistic modeling to predict the likelihood of events in any given scenario: What is the probability of a frontal crash if the vehicle ahead suddenly brakes? How much time will the car or truck behind the vehicle that suddenly braked take to respond? What’s the risk of a lateral crash if a leading vehicle accepts a lane change? 
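The snippet below is a minimal sketch of how one of those questions might be posed numerically: a Monte Carlo estimate of frontal-crash probability when the lead vehicle brakes hard. All parameter values and distributions (gap, speeds, deceleration rates, reaction-time spread) are hypothetical, chosen only to make the calculation concrete.

```python
import random

# Monte Carlo sketch of one probabilistic question from above: what is the
# chance of a frontal crash if the lead vehicle suddenly brakes? All values
# below are hypothetical, chosen only for illustration.

def frontal_crash_probability(gap_m=15.0, speed_mps=20.0,
                              lead_decel=8.0, ego_decel=7.0,
                              trials=100_000):
    """Estimate P(crash) when the lead vehicle brakes at lead_decel.

    Both vehicles start at the same speed; the ego vehicle reacts after a
    random perception-and-actuation delay, then brakes at ego_decel.
    """
    crashes = 0
    for _ in range(trials):
        reaction_s = max(random.gauss(0.5, 0.15), 0.0)  # ego response delay
        # Constant-deceleration stopping distances. Since the ego brakes later
        # and softer, the gap shrinks until both stop, so comparing total
        # stopping distances is a valid (if simplified) crash test.
        lead_stop = speed_mps**2 / (2 * lead_decel)
        ego_stop = speed_mps * reaction_s + speed_mps**2 / (2 * ego_decel)
        if ego_stop > gap_m + lead_stop:
            crashes += 1
    return crashes / trials

print(f"Estimated crash probability: {frontal_crash_probability():.3%}")
```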

Numerous factors affect the environment in which autonomous vehicles operate. For example, even if we only take into account five variations in weather, five coefficients of tire friction and five types of visibility, we already have 125 possible scenarios. Testing all the plausible combinations of conditions is not feasible in a reasonable amount of time. Therefore, it’s imperative to automatically eliminate the infeasible combinations from the initial setup so that only the feasible scenarios are tested.
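A toy version of that pruning step is sketched below, using the same three five-value axes as the example above. The axis values and feasibility rules are invented; a real test matrix would encode engineering knowledge about which combinations are physically contradictory.

```python
from itertools import product

# Prune infeasible scenario combinations before simulation. The five values
# per axis are hypothetical stand-ins for a real test matrix.
weather = ["clear", "rain", "heavy rain", "snow", "fog"]
friction = [0.9, 0.7, 0.5, 0.3, 0.1]        # tire-road friction coefficient
visibility_m = [2000, 1000, 500, 200, 50]    # visibility range, meters

def is_feasible(w, mu, vis):
    """Reject physically contradictory combinations."""
    if w == "clear" and mu < 0.5:       # dry road with icy friction
        return False
    if w == "clear" and vis <= 200:     # clear sky with near-zero visibility
        return False
    if w == "snow" and mu > 0.5:        # snow-covered road with dry-road grip
        return False
    if w == "fog" and vis > 500:        # fog with long sight lines
        return False
    return True

scenarios = [s for s in product(weather, friction, visibility_m)
             if is_feasible(*s)]
print(f"{len(scenarios)} feasible scenarios out of "
      f"{len(weather) * len(friction) * len(visibility_m)}")
```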

Scene templates in Simcenter Prescan360 can model dynamic scenarios. Sometimes, for example, a detailed model is needed to simulate the vehicle's behavior or its response to the environment, people or road conditions. In these cases, an appropriate-fidelity model in MATLAB's Simulink environment or Simcenter Amesim can be employed.

Communications

For an autonomous vehicle to navigate safely, it needs to acquire information from its environment via sensors such as cameras, radar and LiDAR, as well as from other cars nearby, the Global Positioning System (GPS), surrounding infrastructure and network services such as Google Maps and Network Time Protocol.

It is also critical to model the autonomous vehicle’s reaction if its communication with other cars, GPS and the network infrastructure is jammed. Modeling these reactions is key to ensuring the secure design and development of components, subsystems and systems.

Subsystem Modeling

Subsystems, such as the algorithms that control the vehicle, need to have their parameters fine-tuned and validated to meet the intended design requirements. They can be verified through simulation using hardware-in-the-loop (HiL), software-in-the-loop (SiL) and/or model-in-the-loop (MiL) methods.
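As a toy illustration of the MiL idea, the sketch below closes the loop between a PI speed controller and a point-mass vehicle model entirely in software. The gains, limits and grade disturbance are invented values, not drawn from any Siemens tool.

```python
# Minimal model-in-the-loop (MiL) sketch: a PI speed controller exercised
# against a trivial point-mass plant, entirely in software.

DT = 0.05          # simulation step, s
A_MAX = 3.0        # actuator acceleration limit, m/s^2
GRADE = 0.3        # constant uphill deceleration disturbance, m/s^2

def simulate(target_mps=25.0, kp=1.0, ki=0.2, duration_s=30.0):
    speed, integral = 0.0, 0.0
    for _ in range(int(duration_s / DT)):
        error = target_mps - speed
        accel_cmd = kp * error + ki * integral
        if -A_MAX < accel_cmd < A_MAX:      # anti-windup: freeze integral at limits
            integral += error * DT
        accel_cmd = max(-A_MAX, min(A_MAX, accel_cmd))
        speed += (accel_cmd - GRADE) * DT   # point-mass plant model
    return speed

final = simulate()
assert abs(final - 25.0) < 0.1, "controller failed to settle on target"
print(f"Speed after 30 s: {final:.2f} m/s")
```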

Multi-physics system simulation software solutions such as Simcenter Amesim software and MATLAB’s Simulink environment can be utilized to gain insights into the system’s response before committing designs to hardware. For example, Simcenter Amesim and Simcenter Tyre can simulate systems such as road-to-tire friction for a specific circumstance.


Urban Environment and Traffic Condition Databases

Data about previous accidents is a valuable source for scenario modeling. Using Simcenter Prescan360 to recreate these accidents can help engineers understand why they happened so that the system can learn from them via simulation. For example, accident databases such as the German In-Depth Accident Study (GIDAS) and the China In-Depth Accident Study (CIDAS) contain accident descriptions that can be imported into Simcenter Prescan360.

Integration with Requirements and Test Case Management

Simulating millions of miles alone is not enough to build confidence in autonomous vehicle safety. What’s needed are rigorous and more formal verification and validation of autonomous systems.

Polarion Requirements is a formal model toolbox that interfaces with Simcenter Prescan360. The toolbox receives the simulation output from Simcenter Prescan360 and, based on that output, either confirms a successful verification or generates a counterexample or edge case. The counterexample and edge-case scenarios are then looped back to the simulation platform for further processing and validation.
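The shape of that cycle is easy to picture. The sketch below is a generic illustration of a verify-or-collect-counterexamples loop; it does not use the actual Polarion or Simcenter Prescan360 APIs, and the scenario runner and requirement predicate are hypothetical stand-ins.

```python
# Generic verify-or-counterexample loop. The runner and requirement below
# are toy stand-ins, not any vendor API.

def requirement_min_gap(trace, min_gap_m=2.0):
    """Requirement: the ego vehicle never gets closer than min_gap_m."""
    return all(step["gap_m"] >= min_gap_m for step in trace)

def verify(scenarios, run_scenario, requirement):
    counterexamples = []
    for scenario in scenarios:
        trace = run_scenario(scenario)          # simulation output
        if not requirement(trace):
            counterexamples.append(scenario)    # looped back for re-test
    return counterexamples

def fake_run(scenario):
    """Toy stand-in for a simulator: the gap shrinks with initial speed."""
    gap = scenario["initial_gap_m"]
    return [{"gap_m": gap - 0.4 * t * scenario["speed_mps"] / 10}
            for t in range(10)]

cases = [{"initial_gap_m": g, "speed_mps": v}
         for g in (5.0, 10.0) for v in (10.0, 30.0)]
failed = verify(cases, fake_run, requirement_min_gap)
print(f"{len(failed)} counterexample(s): {failed}")
```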

Scenario Space Exploration

In addition to algorithm optimization, it is critical to identify scenarios that make the autonomous vehicle fail (“falsification”). Hierarchical Evolutionary Engineering Design System (HEEDS) algorithms are used to generate relevant new scenarios to help identify such situations.
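HEEDS itself is proprietary, but the general shape of falsification by evolutionary search can be sketched generically: mutate scenario parameters, keep the candidates with the smallest safety margin, and stop when one fails. The "safety margin" objective below is a made-up stand-in for what would really be a call into the simulator.

```python
import random

# Generic falsification-by-evolutionary-search sketch. The objective is a
# hypothetical stand-in; a real setup would run the scenario in simulation.

def safety_margin(cut_in_gap_m, cut_in_speed_mps):
    """Toy stand-in: smaller gaps and faster cut-ins leave less margin."""
    return cut_in_gap_m - 0.25 * cut_in_speed_mps

def mutate(parent, scale=1.0):
    gap, speed = parent
    return (max(1.0, gap + random.gauss(0, scale)),
            max(0.0, speed + random.gauss(0, 2 * scale)))

def falsify(generations=50, population=20):
    pool = [(random.uniform(5, 40), random.uniform(0, 30))
            for _ in range(population)]
    for _ in range(generations):
        pool.sort(key=lambda s: safety_margin(*s))   # lowest margin first
        if safety_margin(*pool[0]) < 0:              # failing scenario found
            return pool[0]
        survivors = pool[: population // 4]
        pool = survivors + [mutate(random.choice(survivors))
                            for _ in range(population - len(survivors))]
    return None

worst = falsify()
print("Failing scenario:" if worst else "No failure found:", worst)
```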

Connected Factory

The Xcelerator portfolio also offers solutions for Industry 4.0, including additive manufacturing (AM). Also known as 3D printing, AM is revolutionizing industrial production by creating lighter, stronger parts and systems.

AI technology can be deployed during the AM process to get production right the first time. Analyzing a full job right before execution of a 3D print can address the errors that originate from suboptimal scan strategies and process parameters. As a result, AI-driven AM can solve overheating challenges, reduce scrap and increase yield.

The number of variables that influence the quality of an additively manufactured component is too large for finite element analysis (FEA) to handle alone. In addition, conventional simulation cannot capture the simultaneous emergence of the part and its material during a build, and it cannot generate an optimal design by systematically varying design parameters, structure or shape. Conventional simulation is also slower than a trained AI algorithm by a factor of seven to ten.
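One common way to realize such a speed-up is the surrogate-model pattern: fit a cheap regression to a modest number of expensive solver runs, then query the fit instead of re-running the solver. The sketch below illustrates the pattern with a made-up analytic stand-in for a thermal simulation; it is not Siemens code, and the physics is purely illustrative.

```python
import numpy as np

# Surrogate-model sketch: fit a cheap regression to "expensive" runs, then
# query the fit. The simulator here is a made-up analytic stand-in.

def expensive_sim(power_w, speed_mms):
    """Hypothetical peak melt-pool temperature from an FEA run (stand-in)."""
    return 900 + 1.8 * power_w - 0.6 * speed_mms + 0.002 * power_w * speed_mms

rng = np.random.default_rng(0)
power = rng.uniform(150, 400, 50)        # laser power samples, W
speed = rng.uniform(500, 1500, 50)       # scan speed samples, mm/s
temps = expensive_sim(power, speed)

# Linear-plus-interaction surrogate fitted by least squares. (The stand-in
# lies exactly in this model class, so the fit is near-perfect here.)
X = np.column_stack([np.ones_like(power), power, speed, power * speed])
coef, *_ = np.linalg.lstsq(X, temps, rcond=None)

def surrogate(p, s):
    return coef @ [1.0, p, s, p * s]

print(f"FEA stand-in: {expensive_sim(300, 1000):.1f} K, "
      f"surrogate: {surrogate(300, 1000):.1f} K")
```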

Connected Factory Floor (MindSphere)

All the factory floor systems can be managed with MindSphere, a cloud-based Internet of Things (IoT) solution. MindSphere connects products, plants, systems and machines to harness the data generated by IoT and analyze it with AI.

AI on the Factory Floor and in Edge Devices

In addition to contributing to product design and testing, digital twins can virtualize and optimize production processes such as material flow, resource allocation and logistics. This enables manufacturers to produce more affordable, higher-quality customized products quickly and efficiently.

For example, AI can take on the critical problem of unplanned downtime. A digital twin can collect data, use AI-based analytics to detect anomalies and schedule predictive maintenance proactively to avoid downtime.
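As a minimal illustration of that anomaly-detection step, the sketch below flags readings that stray several standard deviations from a rolling baseline. The vibration trace and thresholds are invented; a real deployment would use MindSphere's analytics rather than hand-rolled statistics.

```python
import random
from collections import deque
from statistics import mean, stdev

# Toy anomaly detector for predictive maintenance: flag readings that
# stray far from a rolling baseline. Data and thresholds are invented.

def detect_anomalies(readings, window=20, threshold=3.0):
    baseline = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(baseline) == window:
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                alerts.append((i, value))   # candidate for a maintenance ticket
        baseline.append(value)
    return alerts

random.seed(1)
vibration = [random.gauss(1.0, 0.05) for _ in range(200)]
vibration[150] = 1.8                        # injected bearing-fault spike
print(detect_anomalies(vibration))
```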

AI can also be used to connect the factory to the vehicles on the road. All the data collected in the field can be fed back into the comprehensive digital twin. This feedback makes it possible to update models, fix issues, optimize systems and predict maintenance issues in the field and on the factory floor. The Xcelerator portfolio provides a unique solution from chip to city.

Edge computing, which moves data processing close to the source of the data (the edge), avoids the delays that occur when data is forced to travel from the edge to the central cloud for analytics. Real-time analytics become possible, enabling more timely decisions.


Smart City Connection

AI can also transform a city's infrastructure with digital solutions. For example, AI can predict traffic patterns based on real-time data collected on the road and on historical data. In addition, 5G technology will enable rapid communication among vehicles. That speedy communication makes it possible to adjust travel routes to minimize congestion. Passenger and freight traffic lanes could be managed in a similar way.


The introduction of Simcenter Prescan360 marks the launch of an off-the-shelf engineering environment for autonomous vehicles, one that realizes massive virtualization and verification of ADAS and AV technology, from Chip to City. With this solution, developers can now take a smarter approach to large-scale testing, verification, and validation of numerous autonomous vehicle aspects.

For more information, download Siemens’ white paper, “From Chip to City: AI’s influence on the future of autonomous mobility.”