Video Injection Methods in a Real-world Vehicle for Increasing Test Efficiency


DEVELOPMENT SIMULATION AND TESTING

Video Injection Methods in a Real-world Vehicle for Increasing Test Efficiency

IPG Automotive

AUTHORS
Dipl.-Wirt.-Ing. Raphael Pfeffer is Product Manager Test Systems at IPG Automotive in Karlsruhe (Germany).
Dipl.-Ing. Marc Haselhoff is Engineer in the Electronics Development Department at IPG Automotive in Karlsruhe (Germany).

For the testing of camera-based driver assistance systems under real conditions, simulation methods can contribute to increased test efficiency. IPG Automotive presents the potential of video injection methods in real components based on the Video Interface Box.

VARIETY OF SCENARIOS

There is one aspect all active safety systems have in common: they must deliver reliable functionality, which is subject to country-specific testing by various testing organisations. In Europe, the independent Euro NCAP (European New Car Assessment Programme) organisation carries out these tests and evaluates new vehicles. One of the key criteria in Euro NCAP testing of current and future generations of advanced driver assistance systems (ADAS) is the reliable detection of situations in which pedestrians or cyclists might be involved. One of the prerequisites for faultless functionality is extensive in-development testing covering a large number of relevant scenarios. To generate appropriate scenarios, simulation offers a nearly unlimited variety of exactly reproducible combinations of environment, traffic objects and vehicle dynamics. For virtual test driving, IPG Automotive has developed diverse coordinated software and hardware products which, by means of simulation, make these scenarios usable for the development of safety-relevant assistance functions such as an emergency-braking assist in the MiL, SiL, HiL and ViL environments.

POTENTIAL OF AN ARTIFICIAL PEDESTRIAN

When crossing a street, a pedestrian transforms, with a single step, a non-critical situation into a potentially life-threatening one. In order to gain time, and thus the required braking distance, in a situation like this a camera-based assistance system must be able to detect the pedestrian's intention of stepping into the street in advance. Real human beings, when starting to walk, shift their upper body forwards even before moving their legs. Currently used passive dummies, so-called pedestrian targets on remote-controlled platforms, behave in nearly the opposite manner: as a result of inertia, dummies, when starting to move, lean backwards with their upper body. Furthermore, the extremities of simple models are often rigid and therefore unsuitable for stimulating modern detection algorithms. An anatomically correct emulation of the human motion sequence can only be achieved with very high technical effort. Damage to or loss of such a target would mean that it could no longer be used for conclusive tests. To avoid this, faulty detection by the test vehicle must be prevented by complex measures, for instance by pulling the dummy upwards out of the danger zone immediately prior to the impact. This requires additional, cost-intensive mechanisms.

FIGURE 1 Real Euro NCAP pedestrian targets (EPTa and EPTc, left) versus virtual pedestrians (adult and child, right) (© Euro NCAP and IPG Automotive)
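The reproducible combination options mentioned above can be illustrated with a short sketch that enumerates a pedestrian-crossing scenario family from a parameter grid. All names and values below are illustrative assumptions, not CarMaker syntax and not official Euro NCAP test parameters.

```python
from itertools import product

# Hypothetical parameter ranges for a pedestrian-crossing scenario family
# (illustrative values only, not official Euro NCAP test parameters).
vehicle_speeds_kmh = [20, 30, 40, 50, 60]
pedestrian_speeds_ms = [1.0, 1.5, 2.0]        # walking to jogging
crossing_sides = ["near-side", "far-side"]
light_conditions = ["day", "night"]

def build_scenarios():
    """Enumerate every combination as one exactly repeatable scenario record."""
    return [
        {
            "vehicle_speed_kmh": v,
            "pedestrian_speed_ms": p,
            "crossing_side": side,
            "light": light,
        }
        for v, p, side, light in product(
            vehicle_speeds_kmh, pedestrian_speeds_ms,
            crossing_sides, light_conditions,
        )
    ]

scenarios = build_scenarios()
print(len(scenarios))  # 5 * 3 * 2 * 2 = 60 distinct, reproducible runs
```

Unlike physical dummy runs, every record in such a grid can be replayed bit-identically, which is what makes systematic coverage of the parameter space feasible.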
Furthermore, the reflection behaviour of the entire set-up, consisting of the dummy, drive and securing system with metallic components, may differ significantly from that of a real-world pedestrian. This carries particular weight in investigations of radar and image data fusion. Therefore, shifting the test scenarios into the virtual world of the CarMaker open integration and test platform may provide an alternative solution. Here, all relevant elements such as the vehicle, road, road users and traffic objects can be modelled analogously to real-world road testing, including the pedestrians of relevance to active safety systems, FIGURE 1.

VIRTUAL IMAGE DATA INJECTION INTO REAL-WORLD COMPONENTS

The virtual pedestrians and cyclists in IPGMovie, the 3-D visualisation tool of the CarMaker product family, feature a realistic motion sequence and are therefore suited for stimulating image-data-based detection algorithms. The scenarios required by Euro NCAP can be simulated in CarMaker and used for testing assistance and safety functions, FIGURE 2. Their utilisation is possible along the entire V-model, in the MiL and SiL areas as well as on the HiL test bench. Even in the real-world test vehicle the systems can be used in ViL operation (i.e. the combination of a real-world vehicle and a virtual environment). For application in the software-in-the-loop stage, the Sensor Model Extension Package offers a wide range of possibilities to provide the scenes generated by IPGMovie with filters or faults, and to transfer them to the algorithms under test via a network socket. If, at a later stage in the development process, an electronic control unit (ECU) exists as a real-world prototype, or even a production version is available, simulation again offers several options to import the generated image data into the ECU.

07-08I2016 Volume 118 45
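The software-in-the-loop hand-off described above, in which rendered scenes are passed to the algorithm under test via a network socket, can be sketched minimally as follows. The length-prefixed wire format is an assumption for illustration; it is not the actual Sensor Model Extension Package protocol.

```python
import socket
import struct
import threading

# Minimal sketch of a socket-based frame hand-off for a SiL set-up.
# The 4-byte length-prefixed format is an illustrative assumption,
# not the real Sensor Model Extension Package protocol.

def send_frame(sock, frame_bytes):
    # Prepend a big-endian length header so the receiver knows the frame size.
    sock.sendall(struct.pack(">I", len(frame_bytes)) + frame_bytes)

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock):
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

if __name__ == "__main__":
    producer, consumer = socket.socketpair()
    fake_frame = bytes(range(256)) * 4          # stand-in for rendered pixels
    t = threading.Thread(target=send_frame, args=(producer, fake_frame))
    t.start()
    received = recv_frame(consumer)
    t.join()
    print(received == fake_frame)  # True
```

In practice the receiver would be the detection algorithm under test; the point of the sketch is only that a simple framed byte stream suffices to decouple the rendering process from the algorithm process.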

FIGURE 2 Euro NCAP related test with virtual pedestrian target (© IPG Automotive)

Monitor HiL provides the fastest and most cost-effective way to achieve this. In this case, the complete camera system, consisting of optics, image sensor (imager) and ECU, is placed in front of a high-resolution monitor in a way that allows the system to capture the scene and the objects depicted. In scenarios with very large differences in brightness, such as those at tunnel exits or in oncoming traffic at night, these set-ups reach their limits. The representation of scenes for extreme wide-angle lenses ("fisheyes") can be achieved only with a major investment of effort and with high precision during assembly. For these cases, and in order to avoid the further inadequacies of filming a monitor, the Video Interface Box, FIGURE 3, has been created to provide a technical means of injecting the image data directly into the ECU. For this purpose, the optics and image sensor are physically separated from the rest of the camera system, and a tailored hardware interface is developed for the interface thus created. As the optics and image sensors in most systems are purchased mass-production items, an interface, frequently realised as a cable connection, is usually available to begin with. Hence linking is possible without modifying the ECU's engineering design, in other words using a production version. As the optical path, consisting of the lens and the colour filter applied to the sensor, is removed, an emulation of these components is necessary in the visualisation. The previously mentioned Sensor Model Extension Package provides all the tools required for this purpose. In order to make the test system as flexible as possible, the hardware interface is realised as an exchangeable piggyback board within the Video Interface Box.

This board merely serves to adapt the signals to the plug connection and the electrical levels of the original connection between the imager and the ECU. All aspects relating to timing and to the embedding of data that does not contain the actual image information are implemented as FPGA code on the motherboard of the Video Interface Box, and can thus be adapted to the specific project and largely parameterised by the user. The parameterisation is carried out in IPGMovie, which allows a convenient adjustment to be made without the need to interfere with the FPGA code.

FIGURE 3 Video Interface Box (technical block diagram) (© IPG Automotive)

The connection between IPGMovie and the Video Interface Box is created via the HDMI output of a high-performance, off-the-shelf video card. IPG Automotive's own protocol, in connection with widely proven HDMI transmission technology, guarantees low-lag and efficient transmission of image data with reliable, exact timing. This high-precision timing makes it possible to limit data buffering to a minimum and thus to reduce the lag caused by signal adjustment to an absolute minimum. As the four different camera views are jointly transmitted as tiles using split-screen technology, the views are optimally synchronised with one another and may also be used for injection into stereo camera ECUs, whose algorithms can deliver faulty results even in the case of minor synchronisation differences. Via the HDMI connection IPGMovie also accesses the register structure of the emulated imager. Due to this access, IPGMovie can, for instance, read a previously issued command to change the exposure time and respond accordingly. In the opposite direction, the initialisation of the register structure and the adjustment of values at runtime can be carried out via this feedback channel. It is particularly the dynamic response to changes in exposure settings that enables real in-the-loop testing in the first place, as the ECU does not have to be operated in a so-called HiL mode in which some features of the production software are deactivated. IPG Automotive is currently developing emulations of image sensors made by the leading manufacturers and of the most commonly used models, as well as hardware for signal adjustment to customary camera-to-ECU connections such as differential NTSC or diverse variants of LVDS. The growing number of emulations, adaptation boards and their combinations reduces the effort to be invested in, and thus the costs incurred for, individual projects.
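The split-screen transport of several camera views within a single frame can be sketched as follows. The 2x2 tile layout and the view names are assumptions made for illustration; the actual tiling and transport protocol of the Video Interface Box are proprietary and not detailed here.

```python
import numpy as np

# Sketch of recovering four camera views from one split-screen frame.
# The 2x2 layout and view names are illustrative assumptions; the real
# Video Interface Box tile arrangement is not specified in the article.

def split_tiles(frame):
    """Split an (H, W, 3) frame into four equally sized quadrant views."""
    h, w = frame.shape[0] // 2, frame.shape[1] // 2
    return {
        "front": frame[:h, :w],
        "rear":  frame[:h, w:],
        "left":  frame[h:, :w],
        "right": frame[h:, w:],
    }

if __name__ == "__main__":
    # Synthetic 4x4 test frame whose quadrants hold distinct constant values.
    frame = np.zeros((4, 4, 3), dtype=np.uint8)
    frame[:2, :2] = 10
    frame[:2, 2:] = 20
    frame[2:, :2] = 30
    frame[2:, 2:] = 40
    views = split_tiles(frame)
    print([int(views[k][0, 0, 0]) for k in ("front", "rear", "left", "right")])
    # [10, 20, 30, 40]
```

Because all tiles travel in the same frame, they necessarily share one transmission instant, which is the property that keeps stereo views synchronised.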
VEHICLE-IN-THE-LOOP FOR TESTING IN THE FULL REAL-WORLD VEHICLE

The utilisation of the Video Interface Box (VIB) in the vehicle-in-the-loop test vehicle marks a further step in testing camera-based advanced driver assistance systems under real-world conditions, FIGURE 4. The vehicle-in-the-loop method combines the advantages of real-world road tests and simulation, and is particularly attractive in late development stages. For this purpose the device under test, a camera-based system in this case, is integrated in the full vehicle. In other words, the unit is already embedded in the final array of integrated systems. However, in contrast to real-world road testing, elements of the environment are calculated in the simulation. The advantage of this method is that even complex scenarios can be easily generated, or adopted from earlier simulation stages, and run in a completely reproducible manner. In spite of this, the behaviour of the vehicle with respect to its handling characteristics fully corresponds to its behaviour in a real-world road test. For an implementation according to the vehicle-in-the-loop method, linking the simulation and the real-world vehicle, essentially two sub-tasks have to be performed:
- determination of the position of the vehicle in the real world, and transfer of the position, movement and situation into the simulation
- perception of the simulated environment, and transfer to the respective sensors or downstream components in the vehicle (this item can, with full transferability, be carried out in the ViL case as well, using the Video Interface Box method discussed above).

Utilising an inertial measurement platform has proved a viable means of determining the position. Its high-precision acceleration and angular rate sensors are used to determine the relative position and position change. To avoid drift over time, the system is supported by absolute position data from GPS and DGPS. As a result, with an update rate of up to 1000 Hz, accuracies of 1 to 2 cm can be achieved (1σ environment). Additionally, a compact real-time system (RoadBox), on which the simulation core runs in hard real time and to which the position and motion data of the real-world vehicle are transmitted via a CAN interface, is carried on board. The vehicle model in the simulation is completely replaced by the measured data in this process; the vehicle dynamics are real. The real-time computer itself is connected via Ethernet to a host PC which, for one, enables the test engineer / test driver to control the simulation and, for the other, contains the video card for the Video Interface Box.

FIGURE 4 VIB in a vehicle-in-the-loop set-up (© IPG Automotive)

During the test driving events the driver moves the vehicle across an open proving ground. All the objects which are relevant for the test case are generated in the simulation. Augmented reality glasses with see-through technology are used to enable the driver to perceive the driving situation, particularly for closed-loop scenarios. The glasses, similar to a head-up display, show the virtual objects as overlays. The movement of the head is captured using a tracking system so that the virtual object overlays can always be displayed appropriately. The result is a total system that is oriented to real-world road testing to the extent possible, but enables the tests to be run with significantly reduced effort and expense.

COMPARABLE METHODS

The validation effort for camera-based assistance functions has long reached its limits in terms of feasibility and economy. The utilisation of the Video Interface Box described in this article within a vehicle-in-the-loop environment makes risk-free and reproducible testing of even complex test scenarios possible, at high levels of efficiency and under real-world boundary conditions.
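The principle behind an inertial platform supported by absolute GPS/DGPS fixes can be sketched in one dimension: dead reckoning integrates a biased acceleration signal at the inertial update rate, and periodic absolute fixes pull the drifting position estimate back. The rates match the figures quoted above, but the filter gain, bias and noise-free signals are illustrative assumptions, not the specification of the actual measurement platform.

```python
# 1-D sketch of inertial dead reckoning corrected by absolute position fixes.
# Gain, bias and ideal signals are illustrative assumptions.

IMU_RATE_HZ = 1000          # inertial update rate, as quoted in the article
GPS_RATE_HZ = 10            # assumed absolute-fix rate
BLEND = 0.05                # complementary-filter gain toward each GPS fix

def track(accels, gps_fixes):
    """Integrate acceleration at the IMU rate; blend in GPS fixes when available."""
    dt = 1.0 / IMU_RATE_HZ
    step = IMU_RATE_HZ // GPS_RATE_HZ
    pos = vel = 0.0
    for i, a in enumerate(accels):
        if i % step == 0 and i // step < len(gps_fixes):
            pos += BLEND * (gps_fixes[i // step] - pos)   # absolute correction
        vel += a * dt
        pos += vel * dt                                   # dead reckoning
    return pos

if __name__ == "__main__":
    n = 2000                              # 2 s of constant 1 m/s^2 acceleration
    dt = 1.0 / IMU_RATE_HZ
    true_accel, bias = 1.0, 0.02          # small accelerometer bias causes drift
    accels = [true_accel + bias] * n
    # Idealised GPS fixes of the true position every 0.1 s.
    fixes = [0.5 * true_accel * (k / GPS_RATE_HZ) ** 2
             for k in range(n // (IMU_RATE_HZ // GPS_RATE_HZ))]
    true_final = 0.5 * true_accel * (n * dt) ** 2
    err_raw = abs(track(accels, []) - true_final)      # uncorrected drift
    err_cor = abs(track(accels, fixes) - true_final)   # with GPS support
    print(err_cor < err_raw)  # True: absolute fixes bound the inertial drift
```

The real system fuses full 3-D inertial data with DGPS and achieves 1 to 2 cm accuracy; the sketch only shows why combining a fast relative sensor with a slow absolute one is necessary at all.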
Future generations of image analysis and functional algorithms will take aspects into account which physical dummy targets can no longer reproduce. At that time, if not sooner, comparable methods will become indispensable.
