Platforms & Applications for Embedded Vision. The Spring 2015 Computing Systems Week, May 5-7, Oslo. Embedded Computer Vision


Platforms & Applications for Embedded Vision
Presenter: Emanuel M. Popovici, Electrical & Electronic Engineering, University College Cork, Ireland, e.popovici@ucc.ie
The Spring 2015 Computing Systems Week, May 5-7, Oslo. Embedded Computer Vision.
"Vision is the art of seeing what is invisible to others." Jonathan Swift

University College Cork (1845): the largest university in the south of Ireland. George Boole was UCC's first Professor of Mathematics (www.georgeboole.com). "When Boole meets Shannon", 2nd irisc workshop, 1-2 Sept 2015.

Presentation Overview
MISSION STATEMENT
Embedded Vision; Main Ingredients; Some projects; Sense the invisible; Future developments.
"I am always building totally useless gadgets just because I think they're fun to make." Claude Shannon

nD Embedded Vision: Our View
Embedded Systems: a collection of small components brought together into one system to serve the purpose of a specific situation.
nD Vision: vision is a very powerful sensor generating information-rich data. Multi-dimensional (nD) vision is about sensing beyond human capabilities.
Vision Technology: a departure from tradition with IR, thermal, LIDAR, ultraviolet, enriched with a myriad of other sensors.
Application Requirements: low power is the norm.
Applications for nD vision: medical, entertainment, safety, security, automotive, education, environment, etc.
Making Sensors Smart: the key is to provide low-energy algorithms for image and sensor data processing, sensor fusion, machine learning, etc. Efficient vision systems rely on low-energy computation/processing.

Main Ingredients: Heterogeneous Everything
Vision Sensors: CMOS cameras, thermal imaging, IR, LIDAR, stereo vision, UV.
Image/Video Signal Processing: low resolution to high resolution, low frame rate to high frame rate, low power to not-so-low power.
Other Sensors: accelerometers, temperature, humidity, microphones, gases, PIR, luminosity, ...
(Wireless) Processing Platforms: Inforce, Intel (Edison, Quark), Myriad, Raspberry Pi, Spark Core, TI, Aldebaran, Custom (ARM), MikroElektronika, Movidius, ...
Image/Video + Signal Processing: sensor fusion for smarter, lower-power vision systems.
Human/Robot Interaction: sensors and processing everywhere. Power consumption is the key. Support, and particularly compilers, are essential ingredients in the design process.

Some (Serious) Projects, by subject area
U-Play: toys and interfaces for interacting with toys for children with disabilities.
i-bees: accelerometers, temperature, humidity, microphones, gases, PIR, luminosity, IR, ...
Safe/Secure Farm: Inforce, Intel (Edison, Quark), Myriad, Raspberry Pi, Spark Core, TI, Aldebaran, Custom (ARM).
Human-robot interactions; robot-robot interaction; visual and vision at core; non-intrusive health status.
Inside-the-hive view; bee tracking; more than bees.
Detecting intruders using energy-neutral vision? Detecting alive things.
Interdisciplinarity generates the best ideas.

U-Play2: 2nd Prize, IEEE/IBM Smarter Planet Challenge, 2013. Team: M. Donovan, J. Cunningham, F. Edwards-Murphy, T. Jezequel, T. Lambe, A. Zagoneanu, M. Bradley, J. McCarthy, E. Popovici

Main Project Aims and Goals: U-Play 2
Children: develop scenarios for human-robot and robot-robot interaction.
Hexbugs: low-power vision algorithms for the bugs and for Nao; FLiR Lepton.
Customisation: end-user driven, scenario driven, medical, etc.
Nao: a vision recognition system for the bugs in an area, using the vision of a fixed camera as well as NAO; bugs with vision; swarm.
Fusion: thermal imaging, standard camera, other sensors.
Integration.

Image Processing: Low Power
Cascaded Object Detectors: detect objects whose aspect ratios don't show significant variance.
Color Detection System: using RGB imaging, examine individual pixels to detect desirable colors.
Image Segmentation: dividing an image into multiple parts to extract any relevant information from it.
Cascaded Object Detection: two cascaded object detectors have been created; one detects the Hexbug form, the other the triangle form.
Color Detection: a color detection algorithm has been completed to a prototype stage but requires further refinement.
Identification: lines and triangles have been segmented; an algorithm to distinguish between different blobs is underway.
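To illustrate the per-pixel color detection and blob identification described above, here is a minimal sketch using OpenCV in Python. The HSV thresholds, camera index and minimum blob area are assumptions for illustration, not the project's actual parameters.

```python
# Minimal color-detection sketch (OpenCV 4; thresholds and camera index are assumed values).
import cv2
import numpy as np

LOWER = np.array([20, 100, 100])   # assumed lower HSV bound for the target color
UPPER = np.array([35, 255, 255])   # assumed upper HSV bound

cap = cv2.VideoCapture(0)          # assumed camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)                  # per-pixel color test
    # Segment the mask into blobs and keep only sizeable ones
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 100]
    print("detected blobs:", len(blobs))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
```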

Hexbug U-Play Platform
U-Play Chassis & PCB: a smart microcontroller & PCB with sensors, transceiver & microphone mounted on a donor spider body.
Teensy 3.1 Microcontroller: low-cost, small ARM processor with a large amount of I/O.
TI ez430-RF2500 Radio: connects Hexbugs in a mesh network to pass commands from users to a specific node.
Spark Core.
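As a rough picture of how a command reaches a specific node over the radio link, the sketch below packs an addressed command and writes it to a serial-attached bridge. The packet layout, serial port and command codes are hypothetical, not the actual U-Play protocol.

```python
# Hypothetical host-side sketch: sending an addressed command to one Hexbug node.
# Packet layout (start byte, node id, command, checksum) and port are assumptions.
import struct
import serial  # pyserial

def make_packet(node_id: int, command: int) -> bytes:
    payload = struct.pack("BBB", 0x7E, node_id, command)   # start byte, address, command
    checksum = sum(payload) & 0xFF                          # simple additive checksum
    return payload + bytes([checksum])

port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)       # assumed bridge to the radio
port.write(make_packet(node_id=3, command=0x01))            # e.g. "move forward" on node 3
```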

A First Step Towards Smart Toys
Focus on toys; cost efficiency: explored the existing platforms and integrated the smart processing, communication and vision/sensor systems.
Power consumption: working transfer of data using the Texas Instruments ez430-RF2500 modules.
Learning from others how to play: interacting with doctors, teachers, psychologists, gaming/computer scientists.

Nao Robot and Hexbugs: Our Toys
Hardware: Nao is a highly versatile humanoid robot equipped with many sensors and actuators, including two cameras, tactile sensors and directional microphones, to facilitate interaction with its environment.
Choregraphe Environment: programming software with an intuitive graphical user interface containing both standard and advanced functions for creating user-defined movements and behaviors.
Toys and Kids: applications for kids with physical and learning disabilities and autism, but also promoting engineering.
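Besides Choregraphe's graphical blocks, Nao behaviors can also be scripted against the NAOqi Python SDK. Below is a minimal sketch; the robot's IP address is an assumption, and 9559 is the default NAOqi port.

```python
# Minimal NAOqi scripting sketch (robot IP is an assumption for illustration).
from naoqi import ALProxy

NAO_IP = "192.168.1.10"   # assumed address of the robot on the local network

tts = ALProxy("ALTextToSpeech", NAO_IP, 9559)
motion = ALProxy("ALMotion", NAO_IP, 9559)

motion.wakeUp()                           # stiffen joints and stand up
tts.say("Hello, let's find the Hexbugs")  # speak through the robot
motion.moveTo(0.2, 0.0, 0.0)              # walk 20 cm forward
motion.rest()                             # relax back to a safe posture
```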

Some implementations (IEEE ISTAS 2015)

FLiR Lepton Thermal Camera: A New Dimension for Toys
Hardware: the Lepton transmits data over SPI and I2C to the microcontroller. The camera captures 80x60-pixel images through the reception of infrared radiation.
Raspberry Pi / Spark Core: data transferred from the camera module is encoded, stored or communicated wirelessly.
Cool Group: the images and videos captured feature an auto-scaling temperature function adapting to the hottest object in the room.
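The auto-scaling step can be illustrated with a small sketch that rescales a raw 80x60 Lepton frame so the hottest pixel maps to full brightness. The frame is assumed to arrive as a raw array from whatever SPI driver is in use; the acquisition itself is not shown.

```python
# Auto-scaling sketch for one 80x60 Lepton frame (acquisition function assumed;
# raw Lepton pixels are 14-bit values proportional to incident IR).
import numpy as np

def autoscale(raw_frame: np.ndarray) -> np.ndarray:
    """Map the coldest..hottest pixel range of this frame onto 0..255."""
    frame = raw_frame.astype(np.int32)          # widen to avoid overflow while scaling
    lo, hi = frame.min(), frame.max()
    span = max(hi - lo, 1)                      # avoid division by zero on a flat frame
    return ((frame - lo) * 255 // span).astype(np.uint8)

# Example with synthetic data standing in for one SPI frame read:
raw = np.random.randint(7000, 8200, size=(60, 80), dtype=np.uint16)
img = autoscale(raw)
print(img.min(), img.max())                     # 0 ... 255, hottest object is brightest
```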

Implementations
I4Santa: a system to catch Santa (using cameras, a PIR sensor and imagination).
Other Interfaces: speaker ID, gesture recognition, wireless commands, EOG/EMG/ECG interfaces.
Sensor Fusion: swarm of toys built; distributed processing; etc.
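A PIR-triggered snapshot of the kind used in I4Santa can be sketched on a Raspberry Pi as follows. The GPIO pin number and the output path are assumptions for illustration.

```python
# PIR-triggered snapshot sketch for a Raspberry Pi (pin and file path are assumed).
import time
import RPi.GPIO as GPIO
from picamera import PiCamera

PIR_PIN = 17                      # assumed BCM pin wired to the PIR output

GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)
camera = PiCamera()

try:
    while True:
        GPIO.wait_for_edge(PIR_PIN, GPIO.RISING)      # block until motion is detected
        stamp = time.strftime("%Y%m%d-%H%M%S")
        camera.capture("/home/pi/santa-%s.jpg" % stamp)
finally:
    GPIO.cleanup()
```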

i-bees: redefining the hive. 1st Prize, IEEE/IBM Smarter Planet Challenge, 2014. Team: F. Edwards Murphy, P. Whelan, L. Pinson, L. O'Leary, K. Troy, K. Hetherington, E. Lahiff, E. Popovici. A project funded by the Irish Research Council.

Main Project Aims and Goals: i-bees
Bees: smart bee health and behaviour.
Processing Algorithms: image, vision, sound.
Decision Support Systems: from the beehive to the cloud and back.
Cameras; Sensor Fusion; Smart Bee Hive.
Vision Recognition System: see beyond the usual.
Embedded Vision Platforms: combining thermal imaging with a standard camera.
Integration.

Image and Video Processing (IWASI 2015)
Thermal camera: detect beyond the hive: health, status, temperature, etc.
System at a glance: energy-neutral operation.
Thermal imaging: cost, low power, operating range, algorithms for image and video analysis.
IR camera: used for visualisation inside the hive.
3B (Big Brother for Bees) System: solar powered, energy optimisation, accurate decisions, adaptive sensing algorithms.
IR camera: low power, event/user triggered, night vision, security, etc.
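One way to read "adaptive sensing" in an energy-neutral, solar-powered system is to stretch or shrink the sampling interval with the available energy. The voltage thresholds and intervals below are illustrative assumptions, not the 3B system's actual policy.

```python
# Illustrative adaptive-sampling sketch: sampling interval follows the battery state.
# Thresholds and intervals are assumptions, not the deployed 3B settings.
def sampling_interval_s(battery_volts: float) -> int:
    if battery_volts > 4.0:      # battery nearly full, solar surplus: sample often
        return 60
    elif battery_volts > 3.7:    # comfortable margin
        return 300
    elif battery_volts > 3.5:    # conserve energy
        return 900
    else:                        # survival mode: minimum duty cycle
        return 3600

for v in (4.1, 3.8, 3.6, 3.4):
    print(v, "V ->", sampling_interval_s(v), "s between frames")
```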

From Bees to Bits to Information
Colony Status and Weather Forecast (SAS 2015).
Bee Counting and Tracking: Tom Goddard (USF), bee tracking.
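A basic approach to counting bees at the hive entrance is background subtraction followed by blob counting. The sketch below uses OpenCV; the video path and the minimum blob area are assumptions, and it stands in for, rather than reproduces, the project's tracking pipeline.

```python
# Bee-counting sketch: background subtraction + blob counting with OpenCV 4.
import cv2

cap = cv2.VideoCapture("hive_entrance.mp4")          # assumed recording of the hive entrance
bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                           # moving pixels (bees) against the hive
    mask = cv2.medianBlur(mask, 5)                   # suppress single-pixel noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    bees = [c for c in contours if cv2.contourArea(c) > 30]
    print("bees in frame:", len(bees))
cap.release()
```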

S-Farm: Smart / Safe / Secure Farming. Team: Jack McCarthy, William Healy, Jonathan Hourihane, L. Marnane, E. Popovici

Main Project Aims and Goals: S-Farm
Children: farm setup and beyond.
Cameras + Sensors.
Processing Algorithms: ranging, real-time decision making; sensor fusion.
Customisation: wearable, automotive and beyond.
Smart Radar.
Vision Recognition System: alive things.
Embedded Vision Platforms: combining thermal/IR imaging, UWB, etc.
Integration.

Processing
Smart processing requirement (Fraunhofer Institute).
System design: smart radar detecting alive objects.
Visualisation/warning system: key for usability.
nD Vision: accuracy is the key goal.
Processing: low power, real-time, heterogeneous vision.
Processing: real-time decision support system.
Processing: sensor fusion, high accuracy.

InfiniTime System (M. Magno, ETH Zurich) and Goals
[Block diagram: ADXL362 accelerometer (vibrational wake-up), e-paper display, ADMP404 MEMS analog microphone with amplifier, MSP430FR5969 MCU with enable-controlled DC-DC converters, ADC and GPIO, analog camera, energy harvester + battery, solar panels, TEGs, NFC/UWB & wake-up radio, wrist-mounted electronics.]
Versatility: cost, low power, operating range, algorithms for image and video analysis.
Energy-neutral system: solar powered, energy optimisation, accurate decisions, adaptive sensing algorithms.

Thank you!
MISSION STATEMENT
http://sites.google.com/site/embedded0101
www.youtube.com: embedded.systems@ucc
E.popovici@ucc.ie
Always looking for sponsors, partners and collaborators.