Article 3 of 7 in our Content Series: There is a big difference between the 3D graphics requirements for sensor simulation in ADAS development and those for fully immersing human test drivers in a simulator. Dennis Marcus explains.
Cruden’s automotive customers use a variety of tools to simulate vehicle dynamics, sensors, traffic and more. All driver-in-the-loop simulation content starts with a 3D virtual environment but the nature, extent and components of that environment will differ depending on the simulation work that needs to be done. The schematic above identifies two different simulation paths – engineering-centric and human-centric – as defined by their different aims and tools. This distinction is relevant to discussions about 3D content requirements.
Content for sensor simulation: the engineering approach
The environment for testing vehicle sensors – and how they work together in sensor simulation or sensor fusion – requires scenario editors and tools that enable engineers to devise road layouts quickly and simulate new use cases. This is the engineering-centric approach represented in the upper level of the schematic of the ADAS/AD virtual test environment.
Here, the visuals need not be photo-realistic, just sufficient for engineering use. The 3D element is presented as part of the simulation only when required.
Content for driver-in-the-loop simulation: the human approach
If the sensor for which you’re building content is the human eye, however, then you need more detail – and a different type of detail – to create the most realistic view: one that will impress and immerse a human driver in a driver-in-the-loop (DIL) simulator. This is represented by the lower level of the schematic.
As we saw in the previous article, Cruden only creates highly detailed, graphically rich, human-centric 3D content for its driving simulators. In fact, we believe that the Cruden artists’ 3D work is the best level of driving simulator graphics available given current visualization technology.
For engineers to get the most out of sessions with a large number of regular drivers in a driving simulator, these non-expert participants must be presented with a realistic environment. Without it, they will never become fully immersed and will behave as if they are driving in a game or taking part in an experiment, rather than as they would in a real car. Graphical quality must therefore be looked after if a driving simulator is to be used properly; neglect it and the driver feedback data being collected loses much of its value.
You need to make the road or track as realistic in the simulator as it is in the real world because the feel is all-important if the simulator is going to be of value as an engineering tool.
The driving simulation will, of course, still require sensors that provide an object list and situational awareness – both to challenge the ADAS controller to make its decisions and to present the human with a realistic scenario. Is there a car in front? What speed is it travelling at? But where the focus is on the driver, the sensors themselves need not receive the same attention: they do not need to be fully modelled.
We believe that in 95% of cases where a simulator is used for driver-in-the-loop testing in ADAS and AD experiments, ground-truth sensors are sufficient. Yet we see many OEMs and Tier 1s use their advanced, physics-based sensor models in their driving simulators while developing and validating ADAS controllers.
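A ground-truth sensor in this sense can be as simple as a direct query of the simulation’s own object list – no ray tracing, noise models or detection physics. The sketch below is purely illustrative (the class and function names are our own, not from any particular simulation package): it answers “is there a car in front, and what speed is it doing?” straight from simulation state.

```python
from dataclasses import dataclass

@dataclass
class SimObject:
    """State of one traffic object, taken directly from the simulation."""
    object_id: int
    lane_id: int
    s: float      # position along the road centreline (m)
    speed: float  # absolute speed (m/s)

def ground_truth_lead_vehicle(ego_lane, ego_s, objects, max_range=150.0):
    """Return (gap_m, lead_speed) for the nearest vehicle ahead of the
    ego car in its own lane, or None if there is none in range.
    This is 'ground truth': the simulation simply tells us the answer."""
    ahead = [o for o in objects
             if o.lane_id == ego_lane and 0.0 < o.s - ego_s <= max_range]
    if not ahead:
        return None
    lead = min(ahead, key=lambda o: o.s - ego_s)
    return (lead.s - ego_s, lead.speed)
```

A physics-based sensor model would instead simulate the radar or camera itself; for most DIL work, a query like this is all the ADAS controller under test needs as input.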
The problem with this way of working is that you end up driving for hours in your simulator, waiting for a false positive to happen so you can evaluate the impact it might have on the driver. We read about the billions of kilometers of testing required for AD, but that is for developing sensor systems and AD controllers – all of which can be done in off-line simulation. It is not for human testing, which should focus on the interaction between the driver and the automated systems in the car. Those billions of off-line kilometers do matter for DIL simulation, however: they provide the input for the DIL testing scenarios.
Where 3D content disappoints
When automotive simulation engineers talk to external third-party content suppliers about modelling graphics, they may find companies that know how to create 3D models of roads and houses – but can they do it properly? Do they know how to make a model that synchronizes with information such as the OpenDrive road definition, point cloud or mesh?
For a DIL simulator experience to work, the different content layers have to match. The tire models interact with the detailed point cloud definition of the road, while the graphics models of the tires must sit exactly on the road surface, appearing neither above nor below it. This is hard. Many outfits offer 3D graphics design, but very few can do it properly for a driving simulator used in automotive engineering.
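To make the idea of matching layers concrete, here is a minimal, hypothetical sketch (not Cruden’s actual pipeline): if the wheel graphics are placed using the same surface query that the tire model uses – here a bilinear interpolation over a regular grid of road-height samples – the rendered tire cannot float above or sink below the road the vehicle model “feels”.

```python
def road_height(heightmap, cell_size, x, y):
    """Bilinearly interpolate the road surface height at world position
    (x, y) from a regular grid of height samples, heightmap[row][col]
    (rows along y, columns along x, spaced cell_size metres apart).
    Placing the wheel graphics at this height keeps the graphics layer
    consistent with the surface the tire model interacts with."""
    col = min(int(x / cell_size), len(heightmap[0]) - 2)
    row = min(int(y / cell_size), len(heightmap) - 2)
    fx = x / cell_size - col          # fractional position inside the cell
    fy = y / cell_size - row
    h00 = heightmap[row][col]
    h10 = heightmap[row][col + 1]
    h01 = heightmap[row + 1][col]
    h11 = heightmap[row + 1][col + 1]
    return ((1 - fx) * (1 - fy) * h00 + fx * (1 - fy) * h10
            + (1 - fx) * fy * h01 + fx * fy * h11)
```

In practice the surface comes from a measured point cloud or mesh rather than a simple grid, but the principle is the same: one authoritative surface definition shared by physics and graphics.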
From years of working in vehicle dynamics simulation, Cruden understands the science of 3D content and modelling and how to get it right for the most realistic simulation. Just as a vehicle model uses the road definition and its interaction with the tire model to calculate the steering torque, in ADAS controller testing the vehicle models interact with the 3D world in a similar way, but with other elements of the virtual environment.
Furthermore, Cruden works with and deeply understands OpenDrive layers, the logical definition of the road that has become the standard in automotive simulation. This tells the simulation exactly where the car is and supplies information such as the number of lanes and the locations of sidewalks and highway entries.
But creating OpenDrive layers is not easy, and many 3D graphics design companies are unfamiliar with OpenDrive. When using a 3D model in an automotive simulator, the accuracy of the OpenDrive definition is essential.
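To illustrate what this logical layer looks like, here is a deliberately stripped-down, hypothetical OpenDrive-style fragment and a few lines of Python that read lane information from it. A real OpenDrive file also carries road geometry (plan view), lane widths, road marks, junctions and much more; this sketch only shows the kind of information – lane counts, sidewalk locations – that the simulation reads from the OpenDrive layer rather than from the 3D graphics model.

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal OpenDrive-style fragment: one road, one lane
# section, with driving lanes and a sidewalk.
SAMPLE = """<OpenDRIVE>
  <road name="demo" id="1" length="200.0">
    <lanes>
      <laneSection s="0.0">
        <left><lane id="1" type="driving"/></left>
        <center><lane id="0" type="none"/></center>
        <right>
          <lane id="-1" type="driving"/>
          <lane id="-2" type="sidewalk"/>
        </right>
      </laneSection>
    </lanes>
  </road>
</OpenDRIVE>"""

def lane_summary(xml_text):
    """For each road, count driving lanes and collect sidewalk lane ids
    from the logical road definition."""
    summary = {}
    for road in ET.fromstring(xml_text).iter("road"):
        lanes = list(road.iter("lane"))
        summary[road.get("id")] = {
            "driving": sum(1 for l in lanes if l.get("type") == "driving"),
            "sidewalks": [int(l.get("id")) for l in lanes
                          if l.get("type") == "sidewalk"],
        }
    return summary
```

The 3D graphics model must stay synchronized with this logical description: if the OpenDrive layer says there is a sidewalk at lane −2, the visuals have to show one in exactly that place.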
Watch our video below to learn more about this topic.
To speak to an expert about 3D content creation for your driving simulation, please contact Dennis Marcus via email@example.com or on +31 20 707 4646.
If you have enjoyed this article, why not sign up to receive more like this via our occasional emails?
Other articles in the series:
Article 6: Rendering a world of new possibilities