List of figures

Figure 1: Levels of scenario abstraction (Source: [13])
Figure 2: From abstract to concrete
Figure 3: From concrete to abstract
Figure 4: An illustration of a set of traces accepted by an ASAM OpenSCENARIO model
Figure 5: An illustration of how the set of traces accepted by a scenario is a subset of the composition of the traces accepted by the scenario invocations, which themselves are subsets of the traces accepted by their type scenario
Figure 6: Different behavior invocation overlappings allowed by overlap kinds equal, start, end, initial, and final
Figure 7: Different behavior invocation overlappings allowed by overlap kinds inside, full, and any
Figure 8: An illustration of minimum and maximum start-to-start offsets
Figure 9: An illustration of an ASAM OpenSCENARIO scenario execution state showing scenario instances and field bindings
Figure 10: Entity overview
Figure 11: Yaw, pitch, and roll angle in an ISO 8855:2011 compliant coordinate system
Figure 12: Route-based s/t-coordinate system with origin at the beginning of the route
Figure 13: Vehicle coordinate system
Figure 14: Entity overview
Figure 15: Entity overview
Figure 16: Entity overview
Figure 17: Entity overview
Figure 18: Basic junction
Figure 19: Junction routes
Figure 20: A typical crossing
Figure 21: A crossing
Figure 22: Entity overview
Figure 23: Actions that prioritize exact reproduction
Figure 24: Actions that prioritize respecting physical movement constraints
Figure 25: A remain_stationary action
Figure 26: Crossing with line from free space points
Figure 27: Picture of crossing with specified start_angle
Figure 28: Understanding product testing
Figure 29: Recommending scenarios and parameters
Figure 30: Understanding AV/ADAS developer scenarios
Figure 31: Specifying regulation scenarios
Figure 32: Tracing back requirements
Figure 33: Sharing scenarios with other companies
Figure 34: Scenario sharing with auditors and regulators
Figure 35: Sharing scenarios with customers
Figure 36: Including traffic models and agents
Figure 37: Describing the real world with scenarios
Figure 38: Creating self-checking scenarios
Figure 39: Creating abstract scenarios for documentation purposes
Figure 40: Re-utilizing scenarios for research
Figure 41: Deriving a simulation scenario from findings of a SOTIF analysis
Figure 42: Deriving new hazardous events, system insufficiencies or triggering conditions for SOTIF from findings during a simulation run
Figure 43: Replaying different variants of a critical scenario observed on the road in simulation to derive new SOTIF insights
Figure 44: Evaluating residual risk due to unknown scenarios
Figure 45: Integrating tools from other vendors
Figure 46: Processing and comparing results
Figure 47: Creating natural language scenarios without technical details
Figure 48: Tracing back verification
Figure 49: Workflow for cross-company scenario testing
Figure 50: Creating platform-independent scenarios
Figure 51: Tracing back requirements
Figure 52: Specifying a driving mission
Figure 53: Accomplishing driving missions
Figure 54: Converting abstract test descriptions into scenarios
Figure 55: Reusing and combining scenario elements to avoid copy-paste
Figure 56: Specifying test aspects
Figure 57: Converting between abstraction levels
Figure 58: Using real-world data for scenarios
Figure 59: Performing automated scenario execution
Figure 60: Using different tool chains
Figure 61: Converting abstract to concrete scenarios
Figure 62: Running tests in different environments
Figure 63: Describing test track scenarios
Figure 64: Measuring the verification progress
Figure 65: Migrating from ASAM OpenSCENARIO {VER_XML_LATEST} to ASAM OpenSCENARIO
Figure 66: Re-using constructs, artifacts and libraries
Figure 67: Migrating from ASAM OpenSCENARIO {VER_XML_LATEST} to ASAM OpenSCENARIO
Figure 68: Executing simulations randomly