Test-bed for Unified Perception & Decision Architecture
Luca Bombini, Stefano Cattani, Pietro Cerri, Rean Isabella Fedriga, Mirko Felisa, and Pier Paolo Porta

Abstract. This paper presents the test-bed being developed for a Unified Perception & Decision Architecture (UPDA). Given the increasing number of ADAS functions mounted on cars, it is increasingly important to develop a unified architecture through which these systems can communicate and share information. This is the aim of an ERC-funded project; to develop and test such an architecture, a car has been equipped with many different sensors.

1 Introduction

VisLab is undertaking highly innovative research within its ERC-funded European project, whose topic is the development of an open standard for the perception and decision subsystems of intelligent vehicles. Currently, many commercial vehicles include sophisticated control devices such as ABS and ESP. These control units have been developed independently by car manufacturers and suppliers. Generally, they also act independently and are tuned individually. Nevertheless, new methods to improve overall performance are currently under development, exploiting communication and cooperation among these devices: the recently introduced Unified Chassis Control (UCC) is an example. Deploying the UCC in the mass market requires adapting and rethinking all control subsystems to provide communication, data fusion, and overall tuning: in short, integrating them all together. From the car manufacturers' and suppliers' point of view, the introduction of the UCC requires the redesign of each single block (ABS, ESP, ...), meaning an additional financial effort, besides the obvious delay in reaching the market. Had a complete UCC architecture been defined well in advance of the development of each single block, its implementation would have been straightforward, less costly, and would have reached the market earlier.

Fig. 1.
Test-bed vehicle for the UPDA architecture

Perception and decision modules are at an earlier development stage than control ones: the advanced driver assistance systems currently available on the market are only basic ones (Lane Departure Warning, Blind Spot Monitoring, Adaptive Cruise Control, Collision Mitigation), independently developed by different car manufacturers and suppliers. The state of the art of advanced driver assistance systems, in fact, has not yet defined a complete architecture that would allow all these basic blocks to be fused together so as to benefit from their integration. The availability of such an architecture would make it possible to define a standard module interface, so that subsequent research efforts could focus on providing modular systems already engineered to fit into this architecture. In order to develop these concepts, VisLab is setting up a vehicle prototype (three more are expected for 2009) integrating various sensing technologies. A new architecture (UPDA, Unified Perception & Decision Architecture)
is meant to take as input all the data coming from the different perception modules, to fuse them, and to build a more sophisticated world model to be used by the subsequent decision level. In doing this, a standard interface will be defined for each perception module, allowing different providers to integrate their own perception system into this standard architecture, thus boosting competition. The prototype vehicle will be used for testing both the UPDA and ADAS applications. The car integrates 10 cameras, 4 laserscanners of 2 different brands, a laser, a GPS unit connected to an Inertial Measurement Unit, and 4 PCs; the whole car has been set up for drive-by-wire and can be controlled via CAN messages. In the following, the overall system aims are described in detail:

Crossings assistance [ref A]: lateral perception is important when facing an intersection; since junctions may have very different layouts, the lateral system must be able to perceive traffic coming from different angular directions. During a traffic merging manoeuvre, the vehicle is allowed to pull into traffic only when oncoming traffic leaves a gap of a sufficient time interval (usually at least 10 seconds). Hence the vehicle, regardless of the junction layout, needs to perceive the oncoming traffic as well as estimate car speeds from a long distance.

Overtaking assistance [ref A]: when driving on a road with multiple lanes, the lane change manoeuvre is a common task when overtaking slower traffic. Two rear cameras acquire images of the road behind and beside the vehicle. The cameras are installed so that they frame the area close to the vehicle and extend their view to the horizon. This system can overcome some LIDAR limitations, such as its inability to provide useful results on dusty roads, where the clouds produced by the vehicle itself negatively affect the laser beams.
This is a problem mainly for the rear LIDAR, since dust clouds are produced by the vehicle in motion; it sometimes also affects the front ones, especially during sudden stops when dust clouds overtake the car. Despite this problem, LIDAR data have been used to refine the position measurement of detected obstacles, thus reducing the errors introduced by vehicle pitching, since distance estimation is performed by a monocular system.

Obstacle & Pedestrian detection [ref A]: sensing obstacles up to a distance of 50 m is of great importance when driving in an urban environment, since other vehicles travelling in either direction, as well as fixed and moving obstacles, may be on the vehicle's trajectory. Stereoscopic systems offer great advantages, since they allow the 3D environment to be reconstructed and therefore help to detect anything above the road surface.

Lane & Stop Line detection, Lane keeping [ref A]: navigation in an urban environment has to be very precise. The vehicle must be able to detect obstacles and lane markings with high confidence in both detection probability and position accuracy; moreover, the system must detect stop lines, to allow precise positioning of the vehicle at intersections, and lane markings in sharp curves. The perception system must therefore cover a distance range suitable for detection when driving at medium-to-high speeds. The vehicle includes two stereo systems (one at the front and one at the back) which provide precise sensing in its close proximity.

Parking assistance [refs C and D]: the system aims at supporting the driver during parking manoeuvres.

Road Signs detection [ref B]: the vehicle is able to detect and classify a large variety of road signs. This functionality may warn drivers and supply additional environmental information to other on-board systems such as the ACC.

Stop-and-Go & ACC [ref B]: this functionality is used for speed adjustment with respect to the preceding vehicle.
Stop-and-Go is for queue driving, when the vehicle speed is lower than 15 km/h in an urban-like environment, whereas the ACC operates at higher speeds, e.g. on highways. Once engaged, throttle and brake control are handled by the system until a driver's action on the pedals disables the automatic speed control of the car.

2 Hardware Architecture

The vehicle is equipped with a variety of devices, including sensors for world perception, navigation and control systems, localization devices, computers, displays, batteries, and wiring. A lot of effort was spent to achieve a high level
of integration: from the outside the devices are barely visible, while on the inside all necessary controls and displays are integrated into the dashboard, headrests and armrest. Figure 2 shows the overall equipment schema. The acronyms used are the following:

Cameras:
DLR and DLL = DragonFly2 Lateral Right and Left
FRR and FRL = FireFly2 Rear Right and Left
DWR and DWL = DragonFly2 Wide (baseline) Right and Left
DGNR and DGNL = DragonFly2 Grey (b/w) Narrow (baseline) Right and Left
DBR and DBL = DragonFly2 Back Right and Left

Laser and Laserscanners:
HFR and HFL = Hella Front Right and Left
HIDIS = Hella Idis
IL = IBEO Lux
HB = Hella Back

Fig. 2. Complete scheme of the vehicle's hardware components
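The acronym legend above maps naturally onto a small software registry. The sketch below is purely illustrative: the acronyms and model names come from the Fig. 2 legend, but the data structure itself (the `Sensor` record and its `kind` labels, including tagging HIDIS as the non-scanning laser) is an invented representation, not part of the UPDA specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sensor:
    acronym: str
    model: str
    kind: str  # "camera", "laserscanner" or "laser"

# Registry built from the Fig. 2 legend.
SENSORS = [
    Sensor("DLR", "DragonFly2 Lateral Right", "camera"),
    Sensor("DLL", "DragonFly2 Lateral Left", "camera"),
    Sensor("FRR", "FireFly2 Rear Right", "camera"),
    Sensor("FRL", "FireFly2 Rear Left", "camera"),
    Sensor("DWR", "DragonFly2 Wide Right", "camera"),
    Sensor("DWL", "DragonFly2 Wide Left", "camera"),
    Sensor("DGNR", "DragonFly2 Grey Narrow Right", "camera"),
    Sensor("DGNL", "DragonFly2 Grey Narrow Left", "camera"),
    Sensor("DBR", "DragonFly2 Back Right", "camera"),
    Sensor("DBL", "DragonFly2 Back Left", "camera"),
    Sensor("HFR", "Hella Front Right", "laserscanner"),
    Sensor("HFL", "Hella Front Left", "laserscanner"),
    Sensor("HIDIS", "Hella Idis", "laser"),        # assumed: the non-scanning laser
    Sensor("IL", "IBEO Lux", "laserscanner"),
    Sensor("HB", "Hella Back", "laserscanner"),
]

# Group acronyms by device kind, e.g. for wiring up a fusion layer.
by_kind = {}
for s in SENSORS:
    by_kind.setdefault(s.kind, []).append(s.acronym)
```

A quick sanity check against Section 1 is that the legend yields exactly the "10 cameras, 4 laserscanners ... a laser" listed there.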
2.1 Perception and Localization Devices

The UPDA architecture implemented on this vehicle receives as input information coming from laserscanners, cameras, the GPS antenna, the inertial navigation system, and the vehicle state. All this information is gathered to create, and continuously update, a perception map of the environment in which the car moves, on the basis of which the implemented driver assistance actions are evaluated and taken. In this section we go through the details of the perception and localization devices.

Cameras

The equipment described in this section supplies the vehicle with an overall view of the road around it in terms of pedestrians, generic obstacles, and overtaking cars, in order to safeguard its safety and that of the other road users.

Fig. 3. Cameras set-up

The camera devices are the following:

2 Point Grey DragonFly2 [7] colour cameras with 1024x768 resolution and 2.5 mm optics are placed near the end of the hood, above the tires at both ends of the bumper.

2 Point Grey FireFly2 [7] colour cameras with 1024x768 resolution and 6 mm optics are installed in both wing mirror housings, beside the respective mirrors.

2 Point Grey DragonFly2 colour cameras and 2 Point Grey DragonFly2 b/w cameras, all with 1024x768 resolution and 6.4 mm optics, are installed in the main frontal area, behind the windscreen.

2 Point Grey DragonFly2 colour cameras with 1024x768 resolution and mm optics are placed at the back of the vehicle, between the rear bumper and the number plate.

The front lateral cameras (Point Grey DragonFly2) are employed for the following aims:

Crossing assistance: the very wide field of view makes these cameras particularly useful in this task, allowing a variety of intersection layouts to be handled (e.g. private-to-public road exits).

The front cameras (Point Grey DragonFly2) are employed for the following aims:
Obstacle & Pedestrian detection and Lane & Stop Line detection, Lane keeping (see Section 1).

The wing mirror cameras (Point Grey FireFly2) are employed for the following aims:

Overtaking assistance (see Section 1).

The rear cameras (Point Grey DragonFly2) are employed for the following aims:

Parking assistance (see Section 1).

Laser and Laserscanners

Lasers and scanning lasers (laserscanners) are typically very precise in measuring distances and very robust with respect to lighting conditions. At the same time, they are very dependent on the vehicle attitude (roll, pitch) and provide very little information useful for classifying the object causing a given reflection at a given time. Hence, they are the ideal complement to vision-based sensors (i.e. cameras), which are normally less precise in distance estimation but very powerful in providing colour/texture information. For an in-depth examination of laser-camera fusion, see [1], [2].

Fig. 4. Laser and laserscanners set-up

The vehicle is equipped with a total of 5 laser-based devices (see the schema in Fig. 4). The higher speeds are normally reached when driving forward, hence a larger number of sensors has been planned for the front: 4 are on the front and 1 is on the back. Let us discuss them separately. The laser devices are the following:

3 Hokuyo UTM-30LX [3] single-layer laserscanners, mounted at the left and right front corners and at the centre of the back. They have a 270-degree and 30-metre scanning range. The angular resolution is 0.25 degrees and the scanning frequency is 25 Hz;
1 IBEO Lux [4], a 4-layer laserscanner with a 100-degree and 200-metre range, mounted at the centre front. The distance resolution is 4 cm, the scanning frequency is 12.5 Hz and the vertical field of view is 3.4 degrees;

1 Hella MultiBeam [5] infrared laser. This is not a scanning device, but a multi-beam laser with 16 beams, each with a 1-degree horizontal and 3-degree vertical field of view. It can be seen as a very low-resolution range camera [6].

The front laser/laserscanners (IBEO and Hella) are employed for the following aims:

Obstacle & Pedestrian detection: here the IBEO Lux's long detection range (200 m) is especially exploited, to compensate for the (relatively) short range of the cameras and of the beam scanner, which, on the other hand, provide 3D information about the detected items;

Stop-and-Go & ACC: while the IBEO sensor is a truly general-purpose device, the Hella range sensor is specifically designed (and limited) for this scope, thanks to its ability to provide 3D information even at very short distances (outside the stereo field of view);

The corner laserscanners (Hokuyo) are employed for the following aims:

Crossing assistance: the very wide field of view makes these laserscanners particularly useful in this task, allowing a variety of intersection layouts to be handled (e.g.
private-to-public road exits);

Overtaking assistance: again, the wide field of view is exploited to detect overtaking vehicles in the blind spot;

Parking assistance: free-space detection when moving inside parking lots, looking for an available and suitable parking spot, can be performed by motion stereo with the monocular cameras available on the vehicle's sides; however, the reinforcement in distance estimation given by the laserscanners is very valuable.

The rear laserscanner (Hokuyo) is employed for the following aim:

Parking assistance: the backup manoeuvre normally occurs at reduced speed, typically during parking, hence the main support requested by the driver is a warning about the presence of obstacles such as pedestrians, other vehicles, poles, etc. For this reason only one Hokuyo UTM-30LX laserscanner has been mounted on the back, placed at the centre.

Other Devices

Besides the previous perception devices, in order to determine the vehicle's position in world coordinates and its basic motion information (in terms of type, rate and direction), two other devices are installed: a Racelogic VBOX II SX GPS (Global Positioning System) and a three-axis Racelogic VBOX II IMU (Inertial Measurement Unit), both with a 20 Hz update rate (see Fig. 2).

2.2 Processing Devices

The processing of all data acquired by the perception equipment described above is carried out by four computers, all placed in the vehicle's boot. Three of these computers were specially assembled and employ two AmericanInc A and one Voom2 motherboards, whereas the fourth one is a dSPACE MicroAutoBox (see Fig. 5). The dSPACE computer is dedicated to managing the low-level devices such as the ESP, EPS and ACC systems, interfacing with the car's switchboard.
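Section 1 notes that the car is drive-by-wire and can be controlled via CAN messages, with the dSPACE unit bridging to the low-level systems. As an illustration only, the paper does not document the vehicle's actual CAN protocol, so the arbitration ID and payload layout below are entirely made up; a speed-control command of the kind the Stop-and-Go & ACC function would issue could be packed into a classic 8-byte CAN frame like this:

```python
import struct

# Hypothetical arbitration ID for an automatic speed-control command;
# NOT the real ID used on this vehicle (the paper does not publish it).
SPEED_CMD_ID = 0x120

def pack_speed_command(target_speed_kmh: float, brake_pct: int) -> bytes:
    """Pack a target speed (0.1 km/h units, unsigned 16-bit little-endian)
    and a brake percentage (0-100, unsigned 8-bit) into an 8-byte payload."""
    speed_raw = int(round(target_speed_kmh * 10))
    if not 0 <= speed_raw <= 0xFFFF:
        raise ValueError("speed out of range")
    if not 0 <= brake_pct <= 100:
        raise ValueError("brake percentage out of range")
    # "5x" appends 5 padding bytes to keep the classic 8-byte CAN data length.
    return struct.pack("<HB5x", speed_raw, brake_pct)

def unpack_speed_command(payload: bytes) -> tuple:
    """Inverse of pack_speed_command: recover (speed in km/h, brake %)."""
    speed_raw, brake_pct = struct.unpack("<HB5x", payload)
    return speed_raw / 10.0, brake_pct

# Example: a Stop-and-Go request below the 15 km/h threshold of Section 1.
payload = pack_speed_command(14.5, 0)
```

The fixed-point encoding (0.1 km/h units in a 16-bit field) mirrors a common CAN design choice: it avoids floating point on the bus while still covering the full speed range of the vehicle.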
Fig. 5. Computers set-up

2.3 Console and Displays

In order to provide visual feedback on the overall system behaviour and let the driver interact with it, five displays are installed in the vehicle's passenger compartment. One 15" touchscreen monitor (EASYPANEL EP150AD33-15W-U-DM), used for debugging purposes, is installed on the dashboard in front of the passenger's seat. A 7" touchscreen monitor (EASYPANEL EP150AD33-15W-U-DM) is placed on the command panel above the car radio housing, and two others (EASYPANEL EP070AV28EP DM) in the passengers' headrests. The last monitor, a 7" Hardstone MM070, is installed in the rear-view mirror (see Fig. 6).

Fig. 6. Interface set-up

3 Project's objectives

The project is aimed at the development of autonomous vehicles and supervised driving systems, with the ultimate goal of defining a common open architecture that will be proposed as a standard to the automotive sector. Besides providing clear safety advantages for road users, the availability of an open architecture will encourage and enable the sharing of knowledge between public and private research communities (academia and the automotive industry) and thus speed up the design of a standard platform for future vehicles. Further research steps will be eased (and therefore made more effective) thanks to the common and open architectural layer proposed. The project is divided into the following two main milestones:

1. the development of fully autonomous vehicles and,
2. the extension towards driving assistance systems, namely systems able to supervise a driver and to intervene when necessary.

Figure ??? shows the evolution of driving assistance systems leading to autonomous driving. The first 4 steps have a human being as the vehicle's main leader: starting from a set of independent warning systems (step 2), such as lane departure warning, moving to independent active systems (step 3), such as adaptive cruise control and collision avoidance, and finally to a unified perception and decision architecture (step 4) devoted to performing active cooperative driving. From step 5 onward, the vehicle leader changes from the human being to the electronic pilot, with increasing sensing capabilities: first with autonomous driving and then (step 6) with supervised driving, in which the human instructs and directs the manoeuvre while the main control is owned by the electronic pilot. The human contribution, in this case, is treated like that of any other sensing device, and is thus overridable. This project will reach steps 3 and 4, using the new concept architecture to make all the systems cooperate. The first milestone will be a demonstration of fully autonomous vehicles able to cope with real scenarios and not only with controlled environments. This requires developing both an extended perception system able to build an accurate world model and a sophisticated decision system. This part will be based on the work already developed by VisLab for other projects (such as ARGO [1p] and the DARPA Challenges [2p]) and on its experience as a primary player in this field. In the second stage, leading to the second milestone, a perception module able to analyze the driver's intentions, as well as a Human Machine Interface, will be added in order to enable driving assistance features. As mentioned, this requires extending both the perception and decision functionalities in order to integrate new inputs.
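The standard perception-module interface and the fusion step that the milestones above build on are described only at a conceptual level in the paper. As a purely illustrative sketch (every class, field and threshold below is invented, not the real UPDA interface), the intended data flow — independent perception modules publishing observations through one common interface, a fusion stage maintaining the world model, and a decision layer consuming it — might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    sensor_id: str    # e.g. "IL" (IBEO Lux) or "DWR" (wide-baseline camera)
    timestamp: float  # seconds
    kind: str         # "obstacle", "lane_marking", "road_sign", ...
    position_m: tuple # (x, y) in the vehicle frame, metres
    confidence: float # 0..1

class PerceptionModule:
    """Hypothetical standard interface each perception provider implements."""
    def observations(self, t: float) -> list:
        raise NotImplementedError

@dataclass
class WorldModel:
    objects: list = field(default_factory=list)

    def update(self, obs, min_confidence: float = 0.5):
        # Deliberately trivial "fusion": keep only confident detections.
        self.objects.extend(o for o in obs if o.confidence >= min_confidence)

def fuse(modules, t: float) -> WorldModel:
    """One cycle of the perception level: poll every module, merge results."""
    world = WorldModel()
    for m in modules:
        world.update(m.observations(t))
    return world

class FakeLidar(PerceptionModule):
    """Stand-in module used here only to exercise the interface."""
    def observations(self, t):
        return [Observation("IL", t, "obstacle", (42.0, -1.5), 0.9),
                Observation("IL", t, "obstacle", (80.0, 3.0), 0.3)]

world = fuse([FakeLidar()], t=0.0)
```

The point of the sketch is the plug-in boundary: any provider whose module satisfies `PerceptionModule` can feed the world model without the fusion or decision layers knowing which sensor produced the data, which is the competitive-integration property Section 1 argues for.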
During both stages, the logical architecture of the vehicle, i.e. (i) the autonomous system and (ii) the supervisory system, will be designed, tested, and validated through intermediate tests of its completeness, feasibility, and scalability by means of the test-bed described.

References

[7] Specification and details at
[A] A. Broggi, A. Cappalunga, S. Cattani, and P. Zani, "Lateral Vehicles Detection Using Monocular High Resolution Cameras on TerraMax", in Procs. IEEE Intelligent Vehicles Symposium 2008, Eindhoven, Netherlands, June 2008.
[B] Real time road signs recognition (Marelli, IV07).
[C] 3D parking assistant system.
[D] Evaluation of a Vision-Based Parking Assistance System.
[1] Alberto Broggi, Stefano Cattani, Pier Paolo Porta, and Paolo Zani, "A Laserscanner-Vision Fusion System Implemented on the TerraMax Autonomous Vehicle", in International Conference on Intelligent Robots and Systems (IROS06), Beijing, China, October 2006.
[2] Massimo Bertozzi, Luca Bombini, Pietro Cerri, Paolo Medici, Pier Claudio Antonello, and Maurizio Miglietta, "Obstacle Detection and Classification fusing Radar and Vision", in Procs. IEEE Intelligent Vehicles Symposium 2008, Eindhoven, Netherlands, June 2008.
[3] Specification and details at
[4] Specification and details at
[5] Specification and details at
[6] S.B. Gokturk, H. Yalcin, and C. Bamji, "A Time-of-Flight Depth Sensor - System Description, Issues, and Solutions", in Proc. IEEE Workshop on Real-Time 3D Sensors, 2004.
[1p] Alberto Broggi, Massimo Bertozzi, Alessandra Fascioli, and Gianni Conte, Automatic Vehicle Guidance: the Experience of the ARGO Vehicle. World Scientific, Singapore, April 1999.
[2p] Yi-Liang Chen, Venkataraman Sundareswaran, Craig Anderson, Alberto Broggi, Paolo Grisleri, Pier Paolo Porta, Paolo Zani, and John Beck, "TerraMax: Team Oshkosh Urban Robot", Journal of Field Robotics, 25(10), October 2008.

Luca Bombini, Stefano Cattani, Pietro Cerri, Rean Isabella Fedriga, Mirko Felisa, and Pier Paolo Porta
VisLab, Dipartimento di Ingegneria dell'Informazione, Università degli Studi di Parma, Italy
{bombini,cattani,cerri,fedriga,felisa,porta}@vislab.it

Keywords: ADAS, Autonomous vehicle, Unified Perception and Decision Architecture.
More informationPerSEE: a Central Sensors Fusion Electronic Control Unit for the development of perception-based ADAS
10-4 MVA2015 IAPR International Conference on Machine Vision Applications, May 18-22, 2015, Tokyo, JAPAN PerSEE: a Central Sensors Fusion Electronic Control Unit for the development of perception-based
More informationTeam Kanaloa: research initiatives and the Vertically Integrated Project (VIP) development paradigm
Additive Manufacturing Renewable Energy and Energy Storage Astronomical Instruments and Precision Engineering Team Kanaloa: research initiatives and the Vertically Integrated Project (VIP) development
More informationLane Detection Using Median Filter, Wiener Filter and Integrated Hough Transform
Journal of Automation and Control Engineering Vol. 3, No. 3, June 2015 Lane Detection Using Median Filter, Wiener Filter and Integrated Hough Transform Sukriti Srivastava, Manisha Lumb, and Ritika Singal
More informationProject. Document identification
Project GRANT AGREEMENT NO. ACRONYM TITLE CALL FUNDING SCHEME TITLE 248898 2WIDE_SENSE WIDE SPECTRAL BAND & WIDE DYNAMICS MULTIFUNCTIONAL IMAGING SENSOR ENABLING SAFER CAR TRANSPORTATION FP7-ICT-2009.6.1
More informationNew Automotive Applications for Smart Radar Systems
New Automotive Applications for Smart Radar Systems Ralph Mende*, Hermann Rohling** *s.m.s smart microwave sensors GmbH Phone: +49 (531) 39023 0 / Fax: +49 (531) 39023 58 / ralph.mende@smartmicro.de Mittelweg
More informationIntroducing LISA. LISA: Laboratory for Intelligent and Safe Automobiles
Introducing LISA LISA: Laboratory for Intelligent and Safe Automobiles Mohan M. Trivedi University of California at San Diego mtrivedi@ucsd.edu Int. Workshop on Progress and Future Directions of Adaptive
More informationA simple embedded stereoscopic vision system for an autonomous rover
In Proceedings of the 8th ESA Workshop on Advanced Space Technologies for Robotics and Automation 'ASTRA 2004' ESTEC, Noordwijk, The Netherlands, November 2-4, 2004 A simple embedded stereoscopic vision
More informationThe Evolution of Nano-Satellite Proximity Operations In-Space Inspection Workshop 2017
The Evolution of Nano-Satellite Proximity Operations 02-01-2017 In-Space Inspection Workshop 2017 Tyvak Introduction We develop miniaturized custom spacecraft, launch solutions, and aerospace technologies
More informationWHO. 6 staff people. Tel: / Fax: Website: vision.unipv.it
It has been active in the Department of Electrical, Computer and Biomedical Engineering of the University of Pavia since the early 70s. The group s initial research activities concentrated on image enhancement
More informationTHE FUTURE OF AUTOMOTIVE - AUGMENTED REALITY VERSUS AUTONOMOUS VEHICLES
The 14 International Conference RELIABILITY and STATISTICS in TRANSPORTATION and COMMUNICATION 2014 Proceedings of the 14th International Conference Reliability and Statistics in Transportation and Communication
More informationDesign of a Remote-Cockpit for small Aerospace Vehicles
Design of a Remote-Cockpit for small Aerospace Vehicles Muhammad Faisal, Atheel Redah, Sergio Montenegro Universität Würzburg Informatik VIII, Josef-Martin Weg 52, 97074 Würzburg, Germany Phone: +49 30
More informationMoving Obstacle Avoidance for Mobile Robot Moving on Designated Path
Moving Obstacle Avoidance for Mobile Robot Moving on Designated Path Taichi Yamada 1, Yeow Li Sa 1 and Akihisa Ohya 1 1 Graduate School of Systems and Information Engineering, University of Tsukuba, 1-1-1,
More informationA Winning Combination
A Winning Combination Risk factors Statements in this presentation that refer to future plans and expectations are forward-looking statements that involve a number of risks and uncertainties. Words such
More informationVolkswagen Group: Leveraging VIRES VTD to Design a Cooperative Driver Assistance System
Volkswagen Group: Leveraging VIRES VTD to Design a Cooperative Driver Assistance System By Dr. Kai Franke, Development Online Driver Assistance Systems, Volkswagen AG 10 Engineering Reality Magazine A
More informationFinal Report Non Hit Car And Truck
Final Report Non Hit Car And Truck 2010-2013 Project within Vehicle and Traffic Safety Author: Anders Almevad Date 2014-03-17 Content 1. Executive summary... 3 2. Background... 3. Objective... 4. Project
More informationFig 1. Statistical Report for death by road user category
Vehicle Accident Prevention Using Assistant Braking System Jim Harrington J 1, Kavianand G 2, Jeevanth K 3 3 rd year UG Scholar, Department of Electronics and Communication Engineering, Panimalar Engineering
More informationHIGHTS: towards sub-meter positioning accuracy in vehicular networks. Jérôme Härri (EURECOM) on Behalf of HIGHTS ETSI ITS Workshop March 6-8, 2018
HIGHTS: towards sub-meter positioning accuracy in vehicular networks Jérôme Härri (EURECOM) on Behalf of HIGHTS ETSI ITS Workshop March 6-8, 2018 The HIGHTS Consortium 09.03.2018 H2020 HIGHTS Project 2
More informationApplications of Millimeter-Wave Sensors in ITS
Applications of Millimeter-Wave Sensors in ITS by Shigeaki Nishikawa* and Hiroshi Endo* There is considerable public and private support for intelligent transport systems ABSTRACT (ITS), which promise
More informationFusion in EU projects and the Perception Approach. Dr. Angelos Amditis interactive Summer School 4-6 July, 2012
Fusion in EU projects and the Perception Approach Dr. Angelos Amditis interactive Summer School 4-6 July, 2012 Content Introduction Data fusion in european research projects EUCLIDE PReVENT-PF2 SAFESPOT
More informationAn Image Processing Based Pedestrian Detection System for Driver Assistance
I J C T A, 9(15), 2016, pp. 7369-7375 International Science Press An Image Processing Based Pedestrian Detection System for Driver Assistance Sandeep A. K.*, Nithin S.** and K. I. Ramachandran*** ABSTRACT
More informationHelicopter Aerial Laser Ranging
Helicopter Aerial Laser Ranging Håkan Sterner TopEye AB P.O.Box 1017, SE-551 11 Jönköping, Sweden 1 Introduction Measuring distances with light has been used for terrestrial surveys since the fifties.
More informationTsuyoshi Sato PIONEER CORPORATION July 6, 2017
Technology R&D for for Highly Highly Automated Automated Driving Driving Tsuyoshi Sato PIONEER CORPORATION July 6, 2017 Agenda Introduction Overview Architecture R&D for Highly Automated Driving Hardware
More informationMachine Vision Beyond the Factory. Jeff Burnstein President October 18, 2012 Beijing
Machine Vision Beyond the Factory Jeff Burnstein President October 18, 2012 Beijing 1 Machine Vision is No Longer Tied to the Factory Biometrics Medical Imaging Traffic Management High end Security and
More informationDigital Engines for Smart and Connected Cars By Bob O Donnell, TECHnalysis Research Chief Analyst
WHITE PAPER On Behalf of Digital Engines for Smart and Connected Cars By Bob O Donnell, TECHnalysis Research Chief Analyst SUMMARY Interest in advanced car electronics is extremely high, but there is a
More informationArgos Ingegneria S.p.A. October 2009
Argos Ingegneria S.p.A. October 2009 1 Photometric Measurement Systems SMF/M SMF/M General description SMF/M is the photometric measurement system for AGL equipment especially designed and developed by
More informationTechnical Datasheet. Blaxtair is an intelligent cameraa with the ability to generate alarms when a pedestrian is detected
BlaXtair 1 Product Overview Technical Datasheet Figure 1 Blaxtair sensor head Blaxtair is an intelligent cameraa with the ability to generate alarms when a pedestrian is detected in a predefined area.
More informationThe GATEway Project London s Autonomous Push
The GATEway Project London s Autonomous Push 06/2016 Why TRL? Unrivalled industry position with a focus on mobility 80 years independent transport research Public and private sector with global reach 350+
More informationLED flicker: Root cause, impact and measurement for automotive imaging applications
https://doi.org/10.2352/issn.2470-1173.2018.17.avm-146 2018, Society for Imaging Science and Technology LED flicker: Root cause, impact and measurement for automotive imaging applications Brian Deegan;
More informationAutomotive Needs and Expectations towards Next Generation Driving Simulation
Automotive Needs and Expectations towards Next Generation Driving Simulation Dr. Hans-Peter Schöner - Insight fromoutside -Consulting - Senior Automotive Expert, Driving Simulation Association September
More informationWhite paper on SP25 millimeter wave radar
White paper on SP25 millimeter wave radar Hunan Nanoradar Science and Technology Co.,Ltd. Version history Date Version Version description 2016-08-22 1.0 the 1 st version of white paper on SP25 Contents
More informationRECOMMENDATION ITU-R M.1310* TRANSPORT INFORMATION AND CONTROL SYSTEMS (TICS) OBJECTIVES AND REQUIREMENTS (Question ITU-R 205/8)
Rec. ITU-R M.1310 1 RECOMMENDATION ITU-R M.1310* TRANSPORT INFORMATION AND CONTROL SYSTEMS (TICS) OBJECTIVES AND REQUIREMENTS (Question ITU-R 205/8) Rec. ITU-R M.1310 (1997) Summary This Recommendation
More informationIntelligent Driving Agents
Intelligent Driving Agents The agent approach to tactical driving in autonomous vehicles and traffic simulation Presentation Master s thesis Patrick Ehlert January 29 th, 2001 Imagine. Sensors Actuators
More informationHAVEit Highly Automated Vehicles for Intelligent Transport
HAVEit Highly Automated Vehicles for Intelligent Transport Holger Zeng Project Manager CONTINENTAL AUTOMOTIVE HAVEit General Information Project full title: Highly Automated Vehicles for Intelligent Transport
More informationspecifications as these arise from the requirements and the applications coming mainly from the automotive industry.
issue 2 December 2010 Editorial Welcome to the MiniFaros EC funded project second newsletter. MiniFaros completed its first year and within this time the first outcomes have been made available. Within
More informationUsing FMI/ SSP for Development of Autonomous Driving
Using FMI/ SSP for Development of Autonomous Driving presented by Jochen Köhler (ZF) FMI User Meeting 15.05.2017 Prague / Czech Republic H.M. Heinkel S.Rude P. R. Mai J. Köhler M. Rühl / A. Pillekeit Motivation
More informationInter- and Intra-Vehicle Communications
Inter- and Intra-Vehicle Communications Gilbert Held A Auerbach Publications Taylor 5* Francis Group Boca Raton New York Auerbach Publications is an imprint of the Taylor & Francis Croup, an informa business
More informationCAPACITIES FOR TECHNOLOGY TRANSFER
CAPACITIES FOR TECHNOLOGY TRANSFER The Institut de Robòtica i Informàtica Industrial (IRI) is a Joint University Research Institute of the Spanish Council for Scientific Research (CSIC) and the Technical
More informationBluetooth Low Energy Sensing Technology for Proximity Construction Applications
Bluetooth Low Energy Sensing Technology for Proximity Construction Applications JeeWoong Park School of Civil and Environmental Engineering, Georgia Institute of Technology, 790 Atlantic Dr. N.W., Atlanta,
More informationTHE SCHOOL BUS. Figure 1
THE SCHOOL BUS Federal Motor Vehicle Safety Standards (FMVSS) 571.111 Standard 111 provides the requirements for rear view mirror systems for road vehicles, including the school bus in the US. The Standards
More informationNCS Lecture 2 Case Study - Alice. Alice Overview
NCS Lecture 2 Case Study - Alice Richard M. Murray 17 March 2008 Goals: Provide detailed overview of a a model networked control system Introduce NCS features to be addressed in upcoming lectures Reading:
More informationAutomatic Maneuver Recognition in the Automobile: the Fusion of Uncertain Sensor Values using Bayesian Models
Automatic Maneuver Recognition in the Automobile: the Fusion of Uncertain Sensor Values using Bayesian Models Arati Gerdes Institute of Transportation Systems German Aerospace Center, Lilienthalplatz 7,
More informationDirectional Driver Hazard Advisory System. Benjamin Moore and Vasil Pendavinji ECE 445 Project Proposal Spring 2017 Team: 24 TA: Yuchen He
Directional Driver Hazard Advisory System Benjamin Moore and Vasil Pendavinji ECE 445 Project Proposal Spring 2017 Team: 24 TA: Yuchen He 1 Table of Contents 1 Introduction... 3 1.1 Objective... 3 1.2
More informationCOST Action: TU1302 Action Title: Satellite Positioning Performance Assessment for Road Transport SaPPART. STSM Scientific Report
COST Action: TU1302 Action Title: Satellite Positioning Performance Assessment for Road Transport SaPPART STSM Scientific Report Assessing the performances of Hybrid positioning system COST STSM Reference
More informationAvailable theses (October 2011) MERLIN Group
Available theses (October 2011) MERLIN Group Politecnico di Milano - Dipartimento di Elettronica e Informazione MERLIN Group 2 Luca Bascetta bascetta@elet.polimi.it Gianni Ferretti ferretti@elet.polimi.it
More informationBy Pierre Olivier, Vice President, Engineering and Manufacturing, LeddarTech Inc.
Leddar optical time-of-flight sensing technology, originally discovered by the National Optics Institute (INO) in Quebec City and developed and commercialized by LeddarTech, is a unique LiDAR technology
More informationThe Research of the Lane Detection Algorithm Base on Vision Sensor
Research Journal of Applied Sciences, Engineering and Technology 6(4): 642-646, 2013 ISSN: 2040-7459; e-issn: 2040-7467 Maxwell Scientific Organization, 2013 Submitted: September 03, 2012 Accepted: October
More informationMOBY-DIC. Grant Agreement Number Model-based synthesis of digital electronic circuits for embedded control. Publishable summary
MOBY-DIC Grant Agreement Number 248858 Model-based synthesis of digital electronic circuits for embedded control Report version: 1 Due date: M24 (second periodic report) Period covered: December 1, 2010
More information1. INTRODUCTION: 2. EOG: system, handicapped people, wheelchair.
ABSTRACT This paper presents a new method to control and guide mobile robots. In this case, to send different commands we have used electrooculography (EOG) techniques, so that, control is made by means
More informationJournal of Mechatronics, Electrical Power, and Vehicular Technology
Journal of Mechatronics, Electrical Power, and Vehicular Technology 8 (2017) 85 94 Journal of Mechatronics, Electrical Power, and Vehicular Technology e-issn: 2088-6985 p-issn: 2087-3379 www.mevjournal.com
More informationOPEN CV BASED AUTONOMOUS RC-CAR
OPEN CV BASED AUTONOMOUS RC-CAR B. Sabitha 1, K. Akila 2, S.Krishna Kumar 3, D.Mohan 4, P.Nisanth 5 1,2 Faculty, Department of Mechatronics Engineering, Kumaraguru College of Technology, Coimbatore, India
More informationCivil Radar Systems.
Civil Radar Systems www.aselsan.com.tr Civil Radar Systems With extensive radar heritage exceeding 20 years, ASELSAN is a new generation manufacturer of indigenous, state-of-theart radar systems. ASELSAN
More informationSmart eye using Ultrasonic sensor in Electrical vehicles for Differently Able.
IOSR Journal of Electrical and Electronics Engineering (IOSR-JEEE) e-issn: 2278-1676,p-ISSN: 2320-3331, Volume 9, Issue 2 Ver. V (Mar Apr. 2014), PP 01-06 Smart eye using Ultrasonic sensor in Electrical
More informationRevised and extended. Accompanies this course pages heavier Perception treated more thoroughly. 1 - Introduction
Topics to be Covered Coordinate frames and representations. Use of homogeneous transformations in robotics. Specification of position and orientation Manipulator forward and inverse kinematics Mobile Robots:
More informationBehaviour-Based Control. IAR Lecture 5 Barbara Webb
Behaviour-Based Control IAR Lecture 5 Barbara Webb Traditional sense-plan-act approach suggests a vertical (serial) task decomposition Sensors Actuators perception modelling planning task execution motor
More informationC-ELROB 2009 Technical Paper Team: University of Oulu
C-ELROB 2009 Technical Paper Team: University of Oulu Antti Tikanmäki, Juha Röning University of Oulu Intelligent Systems Group Robotics Group sunday@ee.oulu.fi Abstract Robotics Group is a part of Intelligent
More informationVSI Labs The Build Up of Automated Driving
VSI Labs The Build Up of Automated Driving October - 2017 Agenda Opening Remarks Introduction and Background Customers Solutions VSI Labs Some Industry Content Opening Remarks Automated vehicle systems
More informationA LASER RANGE-FINDER SCANNER SYSTEM FOR PRECISE MANEOUVER AND OBSTACLE AVOIDANCE IN MARITIME AND INLAND NAVIGATION
A LASER RANGE-FINDER SCANNER SYSTEM FOR PRECISE MANEOUVER AND OBSTACLE AVOIDANCE IN MARITIME AND INLAND NAVIGATION A.R. Jiménez, R.Ceres and F. Seco Instituto de Automática Industrial - CSIC Ctra. Campo
More information2014 Market Trends Webinar Series
Robotic Industries Association 2014 Market Trends Webinar Series Watch live or archived at no cost Learn about the latest innovations in robotics Sponsored by leading robotics companies 1 2014 Calendar
More information