BASELABS Create Embedded - Frequently Asked Questions

This page answers typical questions about sensor fusion technology, integration, and safety aspects of the sensor fusion library BASELABS Create Embedded.

1 What type of problem is addressed by BASELABS Create Embedded?

When developing automated driving functions like Autonomous Emergency Braking (AEB) or piloted driving, data from several sensors and sensor types needs to be combined to provide a unified representation of the environment – the so-called environmental model – to the actual driving function. Such a data fusion is a complex piece of software that is cost-intensive to develop manually, especially if it has to run on embedded devices and in safety-critical contexts. BASELABS Create Embedded is an embedded data fusion library that can be configured for a specific sensor set and driving function. It takes in data from any number of radar, camera, and lidar sensors and provides a unified list of objects. Its dependency-free and customizable C code compiles and runs on typical automotive embedded hardware. It is production-ready due to its ISO 26262 compliance.

2 What kind of sensor fusion is contained in BASELABS Create Embedded?

BASELABS Create Embedded provides the configuration, generation, and test of so-called object fusion systems. These systems address several typical perception issues, e.g.:

  • Missing detections / false negatives
  • Clutter detections / false positives
  • Sensor measurement errors
  • Overlapping and complementary fields of view
  • Heterogeneous sensor technologies
  • Temporal synchronization

These systems take in object-level data from an arbitrary number of sensors (see also question 4) and output a unified object-level representation (see also question 5).

3 How can I integrate the sensor fusion system in my middleware, e.g. ROS, ADTF, vADASdeveloper, RTMaps, or AUTOSAR?

From the configured sensor setup, BASELABS Create Embedded creates a dedicated data fusion system that is contained in a C library. This library is middleware-independent and thus allows for the integration into any middleware that supports C/C++ libraries.

For each configured sensor, the created data fusion library provides a dedicated function that incorporates the latest sensor measurements into the data fusion result. For an exemplary 2-sensor system, this would result in two functions, one for each sensor:
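For illustration, such a pair of processing functions might look as follows. All identifiers in this sketch are assumptions; the generated names depend on your configuration:

    /* Illustrative sketch: one processing function per configured sensor.
       The actual names and types are generated from your configuration. */
    void data_fusion_process_radar_measurements(
        float timestamp, const radar_measurements_t* measurements);

    void data_fusion_process_camera_measurements(
        float timestamp, const camera_measurements_t* measurements);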

Each of these functions takes in the time or timestamp of the measurements and the measurements themselves. Note that the type of the measurements is sensor-specific; its content depends on the system configuration (see also question 4).

Additionally, a function for the processing of the ego-motion quantities velocity and yaw rate is provided:
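A sketch of what this function might look like; the name and signature are assumptions:

    /* Sketch: incorporate the ego vehicle's velocity and yaw rate. */
    void data_fusion_process_ego_motion(
        float timestamp, float velocity, float yaw_rate);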

To integrate the created data fusion library into a middleware, these functions need to be called, e.g. immediately when new measurements are available or according to more sophisticated execution schemes like deterministic buffering (see also question 10).

To get the result of the data fusion, a function to retrieve the data fusion result is part of the data fusion library (see also question 9):
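The function name data_fusion_get_tracks() is used throughout this FAQ; the parameters and the track_list_t type in this sketch are assumptions:

    /* Sketch: retrieve the fused tracks for a given timestamp. */
    void data_fusion_get_tracks(float timestamp, track_list_t* tracks);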

3.1 ROS Integration

BASELABS Create Embedded optionally generates the source code that integrates the data fusion library into a data fusion ROS node which calls the aforementioned functions.

For each configured sensor, the generated data fusion node subscribes to the sensor-specific topic that holds the sensor measurements, calls the sensor-specific processing function from the created data fusion library and publishes the data fusion result.

3.2 vADASdeveloper Integration

BASELABS Create Embedded optionally generates the source code that integrates the data fusion library into a vADASdeveloper component that calls the aforementioned functions. For each configured sensor, the generated data fusion component provides a sensor-specific input pin that receives the sensor measurements, calls the sensor-specific processing function from the created data fusion library, and provides the data fusion result on an output pin.

3.3 Integration for RTMaps, ADTF, Simulink and custom middleware

The created data fusion library integrates like any other C library. For that, include the created header file and initialize the data fusion library:
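A minimal sketch; the header and function names are assumptions, as the generated identifiers depend on your configuration:

    #include "data_fusion.h"  /* assumed name of the generated header */

    data_fusion_init();       /* assumed one-time initialization call */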

Each time a sensor provides new measurements, convert its data to the sensor-specific data structure (here the example of a radar is shown)
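A sketch with assumed type and field names; incoming and incoming_count are provided by your middleware:

    /* Sketch: copy the middleware's radar message into the generated
       sensor-specific structure. */
    radar_measurements_t radar_measurements;
    radar_measurements.count = incoming_count;
    for (int i = 0; i < incoming_count; ++i) {
        radar_measurements.items[i].range   = incoming[i].range;
        radar_measurements.items[i].azimuth = incoming[i].azimuth;
        radar_measurements.items[i].doppler = incoming[i].doppler;
    }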

and call the sensor-specific processing function from the created data fusion library:
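Continuing the sketch; the function name is an assumption:

    /* Incorporate the measurements at their acquisition timestamp. */
    data_fusion_process_radar_measurements(timestamp, &radar_measurements);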

To retrieve the data fusion result, call the data_fusion_get_tracks() function from the data fusion library (see also question 9):
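Continuing the sketch; data_fusion_get_tracks() is the documented name, while the track_list_t type is an assumption:

    track_list_t tracks;
    data_fusion_get_tracks(timestamp, &tracks);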

Now, you may access the data fusion result for each track or object:
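A sketch with assumed field names; the actual fields depend on the configured output interface (see question 5):

    /* Iterate over the fused tracks. */
    for (int i = 0; i < tracks.count; ++i) {
        const track_t* track = &tracks.items[i];
        /* e.g. track->x, track->y, track->vx, track->vy,
           track->existence_probability, track->id */
    }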

4 Which sensor interfaces are supported?

BASELABS Create Embedded supports any sensor which provides detections on an object level.

For each sensor type, a dedicated sensor interface is used. These interfaces have in common that they contain a list of object detections, bounding boxes, or tracks. The type or interface of the list elements is highly flexible and configurable – the interface/code listings below are just examples. BASELABS Create Embedded ships with typical interfaces for camera, radar, and lidar sensors and allows for direct usage of many sensors. These interfaces can be easily extended if any of your sensors provide additional information you would like to use in your data fusion.

Camera: Both bounding boxes and tracked objects are supported.

  • Example bounding boxes: The sensor or an additional pre-processing such as a detector based on a deep learning network has already analyzed the raw image from the camera and provides detected objects or detections via its interface, e.g. a list of bounding boxes where each bounding box consists of the object’s row, column, and width in the image plane and the object’s classification (car/truck/pedestrian/…).
    Sensor example: In-house sensor and/or image detection algorithm development
  • Example tracked objects: BASELABS Create Embedded also supports “smart” cameras that provide objects or tracks in the vehicle frame, i.e. a list of tracks where each track consists of the object’s position x/y, its velocity, acceleration, size, and classification.
    Sensor example: Mobileye EyeQ

Radar: Both object detections and tracked objects are supported.

Most automotive radar sensors can be directly used. They provide either object detections where each detection typically consists of range, azimuth angle, and Doppler speed or tracked objects where each track contains x, y, vx, and vy. Optionally, radar detections or tracks may provide classification information (car/truck/pedestrian/…) and the radar cross-section (RCS) value.
Sensor examples: Continental ARS, Bosch MRR, Delphi/APTIV ESR

Lidar: Both bounding boxes and tracked objects are supported.

The sensor or an additional pre-processing like DBSCAN or a detector based on a deep learning network needs to provide extracted object detections from the lidar point cloud, e.g. a list of bounding boxes where each 2-dimensional or 3-dimensional bounding box consists of the object center x, y (and z), the extension width, length (and height), and, if available, a classification.
Sensor examples: Valeo Scala, Ibeo HAD

Custom sensors: Any custom sensor on an object level is supported.

If one of your sensors’ interfaces contains quantities that are not supported by any of the standard sensor interfaces BASELABS Create Embedded ships with, you can easily add new sensor interfaces using the extension wizards.

5 What does the output of the sensor fusion contain?

The output of the data fusion is a list of objects or tracks where each object contains:

  • Kinematic states like position (x/y), velocity (vx/vy), acceleration (ax/ay) (any combination),
  • Classification like car, truck or pedestrian (classification value customizable, optional),
  • Additional quantities like length and width (optional),
  • An arbitrary number of additional custom quantities (optional),
  • Confidence level/track score/existence probability,
  • Unique identifier (ID),
  • Error/uncertainty representations like covariance matrices and probability distributions.

The aforementioned quantities can be configured and arbitrarily combined, so that a data fusion A might provide position, velocity, and class only while another data fusion B provides all mentioned quantities plus some custom value. The following listing shows the resulting C data structures for both exemplary systems:
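A sketch of what the two structures might look like; all type and field names are assumptions, as the actual structures follow your configuration:

    #include <stdint.h>

    /* Data fusion A: position, velocity, and class only. */
    typedef struct {
        float x, y;          /* position */
        float vx, vy;        /* velocity */
        int classification;  /* e.g. car/truck/pedestrian */
        float existence_probability;
        uint32_t id;
    } track_a_t;

    /* Data fusion B: all mentioned quantities plus a custom value. */
    typedef struct {
        float x, y;
        float vx, vy;
        float ax, ay;        /* acceleration */
        float width, length; /* object extent */
        int classification;
        float custom_value;  /* exemplary custom quantity */
        float existence_probability;
        uint32_t id;
    } track_b_t;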

To define custom data fusion outputs or interfaces, BASELABS Create Embedded provides several extension wizards that support the creation of specific data fusion interfaces.

6 How can I handle sensors with (partially) overlapping fields of view?

For data fusion, it is crucial to handle sensor setups with overlapping fields of view as this provides the redundancy that reduces false alarms, increases accuracy, and improves availability.

BASELABS Create Embedded provides an integrated way to handle many different sensor constellations such as (partially) overlapping fields of view. It is based on a central state for each object which is continuously updated by the measurements of the different sensors. Besides the actual object quantities like position, velocity, and size, this central state also includes a value that qualifies an object or track as existing or not – the so-called existence probability. Typically, this value needs to be large before a driving function reacts to the object. As this value is crucial for the reliability of the driving function, it needs to be properly determined.

For this, BASELABS Create Embedded needs to know when (or under which circumstances) a sensor can detect an object. This is typically the case if the object is within the sensor’s field of view, but other criteria can also influence the detectability of an object. For example, the detector stage of a camera may have been trained to detect only rear views of vehicles; for a crossing vehicle, the detectability or the probability of detection by this sensor would be very low or even zero. With this knowledge about the sensor’s detection characteristics at hand, the data fusion is aware that a crossing object cannot be detected by the exemplary camera sensor even though the object is within the sensor’s field of view, and thus will not reduce the existence probability for this object if the sensor does not provide a detection or measurement for it. Vice versa, if the sensor’s characteristics allow for the detection of an object but the sensor does not provide a detection or measurement for it, the data fusion will decrease the existence probability for that object, as it is now more likely that the object or track represents a false alarm. The data fusion provided by BASELABS Create Embedded incorporates such sensor characteristics using so-called detection models.

For each configured sensor, a dedicated detection model provides information about the sensor’s detection characteristics, i.e. when the sensor can detect an object. Since each object is represented by a central state that is updated independently for each sensor, this state automatically benefits if the object can be observed by more than one sensor at a given position, i.e. if the object is currently located within an overlapping zone.

Most detection models limit the sensor’s detectability to the sensor’s field of view, e.g. by using the maximum detection range and opening angle as parameters. However, BASELABS Create Embedded supports custom descriptions of the field of view by its API, e.g. using a polygon. Additionally, this API allows for the implementation of sophisticated detection models, e.g. depending on the object type and the location, different detection probabilities can be set.

7 How can I reduce false alarms?

To reduce false alarms in general, a track should be confirmed several times by measurements or detections. These detections may come from one sensor at different timestamps or from multiple sensors.

In BASELABS Create Embedded, the degree of track confidence is expressed by the so-called existence probability. A value close to one indicates that the system believes that this track represents an existing object. A value close to zero indicates that the system is sure that the track does not represent an existing object – the track is likely to be clutter or a false alarm. The existence probability changes over time and mainly depends on the detection characteristics of the sensor (see question 6), the number of sensors that can detect the object for a given object position and the measurement uncertainty of the sensors.

Typically, the driving function defines the minimum existence probability a track needs before the driving function reacts to it. Below you can find a code example that illustrates how the existence value of a track is compared against a threshold:
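A sketch; the threshold value and all field names are illustrative, and tracks is assumed to have been retrieved via data_fusion_get_tracks():

    /* React only to sufficiently confirmed tracks. */
    #define MIN_EXISTENCE_PROBABILITY 0.95f  /* defined by the driving function */

    for (int i = 0; i < tracks.count; ++i) {
        if (tracks.items[i].existence_probability >= MIN_EXISTENCE_PROBABILITY) {
            /* hand this track over to the driving function */
        }
    }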

8 How can I compensate for missing detections? How can I use asynchronous sensors?

Although the posed questions address different issues, they have a common cause.

‘Missing detections’ means that a sensor does not provide detection(s) for a tracked object even though the sensor’s detection model claims high detectability for this object (see also question 6).

‘Asynchronous sensors’ provide their measurements at different timestamps. If one sensor provides its measurements from time A, the resulting central data fusion state will also be at time A. If another sensor then provides its measurements from time B, the central data fusion state needs to be synchronized to time B as well.

In both cases, measurements are not available for a given timestamp, i.e. they are not available when they are needed.

To overcome this limitation, the central data fusion state needs to be synchronized to the requested timestamp. This synchronization is often referred to as prediction and requires a so-called system or motion model. Motion models take in a state at a given timestamp and predict it to another timestamp. BASELABS Create Embedded ships with multiple motion models that can be selected and configured, and custom motion models can be easily added.
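To illustrate what a motion model does, the following generic sketch shows a constant-velocity prediction step; it is not the product's actual motion model API:

    /* Generic constant-velocity prediction: advance a state by dt seconds. */
    typedef struct { float x, y, vx, vy; } cv_state_t;

    void cv_predict(cv_state_t* state, float dt)
    {
        state->x += state->vx * dt;  /* position advances linearly */
        state->y += state->vy * dt;
        /* the velocity is assumed constant; a real filter would also
           propagate the state uncertainty (covariance) here */
    }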

With the motion model at hand, BASELABS Create Embedded compensates for sensor outages or missing detections for a short time by predicting the central state to the requested timestamp. This prediction-only approach comes at the price of increased uncertainty in the central state and the output of the data fusion, e.g. the area that defines likely object positions becomes larger with increasing prediction times.

The same approach is used by BASELABS Create Embedded to handle asynchronous sensors. When new measurements from a sensor should be processed, the central data fusion state is automatically predicted to the timestamp of the measurements using the selected motion model and then updated with the measurements. As this approach is followed for all configured sensors, they are automatically synchronized by the data fusion. The source code that realizes the described approach is part of the data fusion reference architecture of BASELABS Create Embedded.
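Conceptually, the flow might be sketched as follows; all function names are assumptions:

    /* Sketch: predict-then-update sequence that synchronizes
       asynchronous sensors. */
    void on_radar_measurements(float timestamp,
                               const radar_measurements_t* measurements)
    {
        predict_central_state_to(timestamp);  /* motion model */
        update_central_state(measurements);   /* measurement model */
    }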

9 How can I adjust the update rate of the sensor fusion output?

To adjust the update rate of the data fusion output, call the provided data_fusion_get_tracks() function at the required frequency and provide the requested timestamp:
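A sketch; apart from data_fusion_get_tracks(), the identifiers are assumptions:

    /* Retrieve the fused tracks at the output rate of the
       driving function, e.g. every 50 ms. */
    track_list_t tracks;
    float output_timestamp = get_current_time();  /* assumed time source */
    data_fusion_get_tracks(output_timestamp, &tracks);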

The data_fusion_get_tracks() function synchronizes or predicts the central data fusion state to the given time using the configured motion model (see also question 8) and outputs the tracks at that time.

10 How can I account for sensors with different delays and out-of-sequence measurements?

When using multiple sensors, data may arrive delayed at the data fusion, e.g. due to pre-processing or communication. To handle these so-called out-of-sequence measurements, BASELABS Create Embedded uses a deterministic buffering approach. When using this buffering method, the measurements of the different sensors are buffered and then processed in chronological order. The deterministic approach sets a maximum time that the data fusion system waits for delayed sensor measurements, depending on the sensor’s update rate and maximum delay. If measurements for a sensor do not arrive within this timespan, the data fusion system continues processing the measurements that have already arrived.

To use the deterministic buffering approach, each sensor’s update rate and maximum latency need to be configured.

From that, BASELABS Create Embedded determines buffer sizes and the processing strategy.

11 How can I use object length and width information?

Some sensors provide additional object attributes like width and length besides position and motion. These additional attributes allow characterizing an object as an extended object. The estimation of width and length can be added to the data fusion algorithm by configuring additional system models, measurement models, and state proposers.

By default, a single system model is configured in the ‘Target Object’ section of the Data Fusion Designer. This first system model is the motion model. It describes how the position, velocity and acceleration of an object change over time (see also question 8). To add width and length system models, press the + button twice and select ‘Width model’ and ‘Length model’.

Now, for each sensor in your system, choose whether this sensor provides information or measurements for the width and length quantities, e.g. a camera might provide width measurements only while a lidar could provide both width and length measurements.

The described adaption leads to several changes in the generated data fusion system:

  • The output interface now contains additional quantities for width and length, i.e. the resulting C structures now contain additional fields for width and length (see the sketch after this list).
  • The input interfaces of the sensor data now contain additional quantities for width and length (if the sensor supports it).
  • The built-in birds-eye-view visualization additionally shows a bounding box for each object.
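A sketch of the extended output structure; the field names are assumptions:

    #include <stdint.h>

    /* Sketch: output structure extended by width and length. */
    typedef struct {
        float x, y;
        float vx, vy;
        float width;   /* new: estimated object width  */
        float length;  /* new: estimated object length */
        float existence_probability;
        uint32_t id;
    } track_t;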


12 How can I incorporate object classification into the sensor fusion?

Some sensors classify detected objects, e.g. with respect to the object type. Such classification values represent valuable information that can be used to improve the driving function, e.g. by applying different reactions for cars and pedestrians. Besides the driving function, classification values can also be used to improve the performance of the data fusion itself, e.g. different detection models might be applied depending on the object type (see question 6) to express differences in a sensor’s detection capability. Additionally, the classification might be used to improve the measurement-to-track association.

These classifications can be described using so-called categorical variables, i.e. variables that can take on a defined number of discrete values. For a categorical variable that holds the object type, exemplary valid values could be ‘car’, ‘truck’, ‘motorcycle’, ‘pedestrian’ and ‘bicycle’. While some sensors of the data fusion system could provide exactly the aforementioned classification values, other sensors could provide only a subset of these values or could combine multiple classes into one class. Still others might not provide any classification information at all. When multiple sensors provide classification information, this is often referred to as classification fusion.

The estimation of an object’s classification can be added to the data fusion by configuring an additional system model as well as additional measurement models and state proposers (see also question 11). For that, categorical variables need to be defined first. BASELABS Create Embedded ships with a built-in categorical variable that can take on the values ‘Car’, ‘Truck’, ‘Motorcycle’, ‘Pedestrian’, and ‘Bicycle’. Additional categorical variables can be added by a wizard.

Next, we need a model that describes how the value of a categorical or classification variable may change over time. For that, BASELABS Create Embedded provides different options which can be selected in the appropriate model wizard.

BASELABS Create Embedded ships with a built-in system model ‘Object class model’ for the aforementioned categorical variable with its 5 possible values, which will be used in the following steps.

To add a classification value to the configured ‘Target Object’, click the + button to add a new system model and select the ‘Object class model’. This automatically selects the categorical value for the classes ‘Car’, ‘Truck’, ‘Motorcycle’, ‘Pedestrian’, and ‘Bicycle’ as well.

Now, for each sensor, a categorical measurement model is needed that describes how a sensor classifies different object types, i.e. which values are reported by the sensor for each object type at which probability. Note that the values a sensor reports might differ from the values the system needs to determine. To create a new categorical measurement model, use the wizard that comes with BASELABS Create Embedded.

For each sensor, select the appropriate categorical measurement model.

The described adaption leads to several changes in the generated data fusion system:

  • The input interfaces of the sensor data now contain additional fields that hold the observed categorical value or classification (if the sensor supports it).
  • For each track, the output interface now contains a field that holds the distribution over the categorical value, from which the resulting classification probabilities can be retrieved (see the sketch after this list).
  • The built-in birds-eye-view visualization now additionally shows the classification value with the largest probability.
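The following sketch illustrates both the input and the output side; all identifiers are assumptions:

    /* Write the observed class into a camera measurement. */
    camera_measurements.items[i].classification = OBJECT_CLASS_CAR;

    /* Read the fused class distribution from a track. */
    const track_t* track = &tracks.items[0];
    float p_car        = track->class_probabilities[OBJECT_CLASS_CAR];
    float p_pedestrian = track->class_probabilities[OBJECT_CLASS_PEDESTRIAN];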

13 Which target platforms are supported?

All embedded platforms are supported if a C99 compiler is available for the platform and if the platform has a floating-point unit (FPU) with IEEE-754 support, with or without support for denormal or subnormal numbers. BASELABS Create Embedded provides self-contained C99 source code which is dependency-free (except C functions of the standard library, e.g. sin(), sqrt()). For memory and CPU consumption, see questions 15 and 16.

For pre-development activities, PC (Windows/Linux) and ARM-based platforms such as NVIDIA DRIVE are supported.

14 Are the algorithms numerically stable?

BASELABS Create Embedded provides high numerical stability on embedded devices with single floating-point precision.

To achieve such numerical stability, Kalman filters are implemented based on square root filtering, and operate on LDL factorized covariance matrices.

Furthermore, BASELABS Create Embedded offers functionality to observe the condition of several data fusion components during runtime.

15 How much memory does the resulting sensor fusion system require on the target?

The memory consumption of the resulting data fusion systems mainly depends on

  • the maximum number of tracks, i.e. the maximum number of objects the data fusion should take care of,
  • the maximum number of measurements each sensor may provide,
  • the sizes of the interfaces of the sensors and data fusion output (see questions 4 and 5), and
  • the number of unique sensor types, i.e. sensors that share the same interface, maximum number of measurements, and models.

For example, a 3R1C data fusion system consisting of a long-range radar (64 measurements, 10 Hz), 2 identical corner radars (each 64 measurements, 10 Hz) and a camera (20 measurements, 20 Hz) requires approximately 60 KB ROM and 20 KB RAM.


16 How many sensors can be processed on my target hardware?

The CPU consumption of the resulting data fusion systems mainly depends on

  • the maximum number of tracks, i.e. the maximum number of objects the data fusion should take care of,
  • the maximum number of measurements each sensor may provide, and
  • the sizes of the interfaces of the sensors and data fusion output (see questions 4 and 5).

Both the number of tracks and the number of measurements have a largely linear influence on CPU consumption.

The following diagram illustrates the CPU consumption for typical data fusion systems and platforms. All numbers were measured running the data fusion on a single core at peak load. For numbers on specific platforms, please feel free to contact us.


17 Does the sensor fusion algorithm take advantage of multi-core architectures?

Currently, the data fusion of BASELABS Create Embedded runs on a single core. However, due to the open architecture (see also questions 18 and 20), multi-core partitioning can be applied to this architecture in a manual step.

18 Which portions of the sensor fusion algorithm are provided as source code?

In short: 100%.

However, full source code access including the data fusion library is available with the Embedded License only (see also question 19).


19 Which external dependencies does the sensor fusion algorithm have?

None.

The data fusion systems of BASELABS Create Embedded are self-contained and do not rely on any external or third-party library, code or license terms.

The only exceptions are functions from the C standard library, e.g. mathematical functions like sin(), sqrt(), and fabs(), which are typically part of your compiler toolchain.

20 How can I customize the sensor fusion algorithm?

BASELABS Create Embedded comes with a ready-to-use sensor fusion architecture and implementation. However, it is often required to customize the final algorithm. While BASELABS Create Embedded is a completely white-boxed solution (cp. question 18) which in general allows modifying any aspect of the algorithm, it provides many explicitly foreseen variation points in its contained sensor fusion architecture. These variation points make it very efficient to influence the performance of the resulting sensor fusion algorithm and to customize it for a specific sensor or use case. Among others, there are variation points for:

  • the system or temporal behavior of objects,
  • the sensor-specific detectability and appearance of objects,
  • the way new tracks are created and no longer relevant tracks are removed, and
  • the prioritization of tracks.

All foreseen variation points come with a dedicated API that - if implemented - provides the custom functionality to the sensor fusion. As an example, the following code snippet illustrates how to implement a custom detection model of an assumed sensor whose detection rate decreases depending on the object's distance:
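A sketch of such a detection model; the function signature is an assumption, as the actual variation-point API may differ:

    #include <math.h>

    /* Sketch: detection probability that decreases linearly with the
       object's distance from the sensor. */
    float custom_detection_probability(float object_x, float object_y)
    {
        const float max_range = 80.0f;  /* assumed maximum detection range in m */
        const float p_near    = 0.95f;  /* detection rate close to the sensor */

        float distance = sqrtf(object_x * object_x + object_y * object_y);
        if (distance >= max_range) {
            return 0.0f;                /* outside the field of view */
        }
        /* the detection rate decreases linearly with distance */
        return p_near * (1.0f - distance / max_range);
    }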

21 Why is it safe to use BASELABS Create Embedded for safety-critical systems in serial production?

BASELABS Create Embedded enables the safety-compliant development of data fusion algorithms. The contained embedded software library is developed according to ASPICE and ISO 26262. As a result, the software can be directly used in series development and drastically reduces the development effort.

BASELABS Create Embedded has been developed in accordance with a relevant sub-set of ISO 26262:2018 requirements and methods for ASIL B. It comes with:

  • A safety case that can be integrated into the safety case of the system to be developed.
  • A safety manual that gives users guidelines on how to use the product in a safety-related context.

Parts of BASELABS Create Embedded are considered a software tool, i.e. this tool might need to be qualified when used in the context of ISO 26262 (see ISO 26262:2018-8, Chapter 11). Create Embedded supports this tool qualification by two means:

  • The software tool itself has been developed according to relevant aspects of ISO 26262 and ASPICE, addressing the possibility for ‘evaluation of the tool development process’.
  • A test suite is provided to enable users to perform the ‘validation of the software tool’.

For the software units that come with the embedded data fusion library, users benefit from

  • the safety concept of BASELABS Create Embedded,
  • the extensive documentation containing all relevant algorithmic and design details,
  • the implementation and testing according to ISO 26262 guidelines,
  • the implemented error detection mechanisms (derived from failure mode analysis) that provide a solid basis to perform error handling at the software-architecture level, specific to the system to be developed, and
  • the test reports for the intended hardware, compiler, and operating system and the specific version of BASELABS Create Embedded.

The confirmation reviews of relevant ISO 26262:2018 work products were performed by a Principal Safety Engineer of exida GmbH. The assessment of the compliance of the development process with ASPICE was performed by an intacs™ Certified Competent Assessor.

In addition to the product offering, BASELABS consults on

  • functional and system architectural aspects,
  • safety aspects like safety architectures and support in safety analysis (ISO 26262),
  • process aspects like requirements engineering, ReqIF, dashboards, and tools.

22 What tooling do you provide to support me when customizing my sensor fusion algorithm?

BASELABS Create Embedded includes a ready-to-use sensor fusion architecture and implementation. This architecture can be extended at multiple variation points (cp. question 20). In addition to these APIs, BASELABS Create Embedded comes with tooling to further support this process. As of today, this includes:

  • Wizard dialogs for the variation points that help you initially use the APIs
  • Extension for the integrated development environment (IDE) Visual Studio Code

In the following example, the complete tool-based workflow for adding and modifying a custom detection model is illustrated.

Adding a custom detection model

Inside the Data Fusion Designer, a new detection model can be added.

In the final step, you can enter the source code that implements the behavior of this model. In this example, we implement a detection model that has a varying detection rate based on the distance of the object (cp. question 20).

After all this information has been provided, a new source code file containing the complete implementation of this custom detection model is generated. You can now use this implementation in your data fusion configuration.

Use your IDE to further develop your extension.

23 How does BASELABS accelerate your series development?

Today, innovation in mobility and individual transport is mostly driven by software. The amount of software and its complexity increase, while at the same time its interconnection with safety-critical sub-systems such as steering or braking grows.

Success Factors for Efficient Software Development

It is well accepted that a couple of principles and concepts should be applied to master the challenge of efficiently developing complex software. These are some of the proven practices in modern software development:

  • Reuse code by creating and utilizing domain-specific libraries. By sharing and reusing code, you can clearly increase productivity in your software project. This may be applied to a certain department, a company, or even the whole industry.
  • Pursue a well-structured architectural software design that allows a high level of flexibility. A proper software architecture ensures that code can be reused for different applications. If you manage to break it down into individual components with powerful APIs, you can also account for new use cases and keep flexibility for custom extensions.
  • Systematically and continuously apply static code analysis practices from day one of your software development to ensure immediate feedback. This also helps to prevent late changes in crucial design decisions of your software project.

Relevant Constraints in Automotive Software Development

In the traditional automotive industry, the C programming language is the de-facto standard for implementing software functionality. Its usage is highly regulated by standards and best practices such as MISRA-C:2012. In a nutshell, MISRA is a rule set that proposes best practices to make your software implementation less risky, given the typical pitfalls you see when applying C to real-world problems. This especially applies to safety-relevant systems. There are some relevant rules that have to be applied:

  • Do not perform dynamic memory allocation during runtime. This should prevent your application from running into trouble if no additional memory is available during a potential runtime allocation. This accounts for predictability and determinism.
  • Do not deploy any unused code (in terms of the requirements specification) to your ECU. It is well accepted that you should only deploy code to your ECU that is really required. While this keeps the memory footprint of your application small, it also ensures that no unwanted code can be called by accident. This is highly relevant if you decide to design with software libraries.
  • Respect compiler warnings (better turn them into fatal errors); apply consistent unit testing, and strive for the required coverage according to your ASIL goals. Finally, use comprehensive static code analysis for immediate feedback during development.

If we compare these rules to the initial design goals for efficient software development, we realize that modern complex systems in the automotive sector can hardly be mastered with the dominant programming language C. The main reason for this is that plain C lacks good abstraction capabilities for complex software architectures.

One potential option to overcome these limitations would be the introduction of C++ for automotive software development. A first initiative was the proposal of a MISRA rule set for C++. However, MISRA-C++:2008 still does not consider relevant features that have been introduced with recent C++ standards (from C++11 to C++20). In general, it was written by safety experts who assumed an average audience with rather limited knowledge of C++. Hence, the result addresses an obsolete language standard and practices. It shall be highlighted that C++ is not just a ‘better C’. More precisely, it is a language of its own that shares some concepts with C but also has important differences (e.g. a different type system or implementation-dependent behavior) that might lead to risky misunderstandings.

Current initiatives around MISRA-C++:202x try to close that gap by bringing together recent C++ best practices with safe software development practices. However, this standard is still under development. Even when this new standard is released, the next challenge will be the broad availability of qualified development toolchains for the relevant target ECUs.

BASELABS's Contribution

With BASELABS Create Embedded, a further alternative was introduced to the market. It aims to resolve the above-mentioned conflicts by proposing several technologies:

  • BASELABS provides a domain-specific SDK/library for sensor data fusion applications. This includes general purpose algebraic functionality, Kalman filtering, and a statistical layer to realize complex multi-object tracking applications.
  • BASELABS developed the Trait-C language, an extension of the C language on the preprocessor level that enables features such as templates/generics. With this technology, we can ensure software reusability while still respecting the automotive safety constraints.
  • The BASELABS software is 100% self-contained. This means, it has no external dependencies. This allows straightforward integration into your continuous integration infrastructure and your target ECUs.
  • The code of the BASELABS SDK was developed according to ASIL B practices. It has 100% branch coverage. This directly scales into your project. We provide coverage reports and a compiler test kit.
  • The BASELABS workflow is compatible with any C99 development toolchain. So, you can directly start today by applying it to your existing infrastructure.
  • The complete codebase was developed according to ISO 26262:2018 and MISRA-C:2012. You will get access to our safety manual.
  • To leverage the efficiency of your developers, we provide plugins for IDEs such as Visual Studio Code.
  • The SDK comes with comprehensive tutorials, API documentation, and architectural design documents. Moreover, we provide a descriptive way to configure and implement object fusion use cases based on a reference architecture.

Do you have a question that is not on the list? Write to us to get in touch.
