Newsletter. Date: 16th of February, 2017. Research Area: Robust and Flexible Automation (RA2)

www.sfimanufacturing.no

This newsletter is published prior to each workshop of SFI Manufacturing. The aim is to keep the community up to date with the research being carried out within and related to the SFI. This issue of the newsletter focuses on the research and achievements from the area Robust and Flexible Automation.

In this issue:
- Introduction to the research area
- From the virtual world into the real world
- New projects and project proposals
- PhD and postdoc progress reports

SFI Manufacturing: a cross-disciplinary centre for research-based innovations for competitive high-value manufacturing in Norway.

About the research area
The overall objective of the research within the area Robust and Flexible Automation is to further develop and link novel technologies and methodologies within automation to support innovation processes and advanced work systems in the manufacturing industries.

Figure 1. Visualization of the overall objective of the research area.

Recent developments within automation technologies increase the ability to rapidly adapt to changing conditions, and open new ways to use automation and robotics in manufacturing systems. Some examples of advances based on novel technology:
- Increased flexibility and robustness
- In-process monitoring and real-time control
- Faster and easier reconfiguration
- Intuitive and adaptive manufacturing adapted to human needs
- Intuitive programming and tasking
- Trajectory planning with adaptation to changing environments
- Sensor systems capable of mapping and analysing changing environments
- Multi-robot and human-robot coordination and cooperation
- Real-time sensor-based control

A number of research challenges will be addressed within the research area. The work is organised into two research and technology development work packages (WPs):
- WP2.1 Extended pick-and-place robotic handling of non-uniform objects
- WP2.2 Flexible and integrated production systems
The WPs are highly interlinked: WP2.2 focuses on using and integrating the specific technologies and methods developed in WP2.1, as well as other technologies, at the production system level. In addition, activities within these WPs also link to the two other research areas within the SFI, i.e. RA1 and RA3. The 2017 work package activities have been chosen based on the mapping of industrial use cases and requirements, technology watch, and the preliminary results from 2016.

From the virtual world into the real world
On the 14th of March 2017, the next SFI workshop will take place, with Rolls-Royce as the host organization. The research area Robust and Flexible Automation has the main responsibility for the scientific part of the workshop. In the following sections, we give a brief status report and an introduction to the main topics we plan to present at the workshop: grasping using deep learning, 3D point clouds for object localization, and simulation for development and virtual testing of production systems.

1) Deep Learning
Automatic grasping using Deep Learning
Pose estimation of objects is one of the key problems in automatic grasping with robots. Within the research area Robust and Flexible Automation, a long-term goal is to create a generic system for grasping all types of objects in cluttered scenes. As a step towards this goal, we have developed a system for robot grasping using deep learning.

Figure 2. Development of artificial intelligence methods.

Deep learning algorithms can learn features and tasks directly from images, and can automatically extract high-level, complex abstractions from them. Instead of teaching a traditional machine vision system each new object it should handle, a deep learning algorithm can be trained to handle a large spectrum of objects, including objects in cluttered scenes. These strengths make deep learning an important tool for achieving a more generic grasping system.

What is Deep Learning and why use it?
Deep learning is based on a hierarchical learning architecture and is motivated by artificial intelligence emulating the deep, layered learning process of the primary sensory areas of the human brain, which automatically extracts features and abstractions from the underlying data. Deep learning models have demonstrated strong capabilities in learning hierarchical features, which greatly facilitates computer vision tasks like object detection and recognition.

Deep Learning for detecting robotic grasps
Traditionally, object localization and grasping have been solved with a fixed setup and an exactly known object position. Alternatively, vision-based grasping can be applied, which removes the need for a fixed setup; however, the objects then have to be known to the vision system. In this work, we present a vision-based robotic grasping system which can not only recognize different objects but also estimate their poses using a deep learning model, and finally grasp them and move them to a predefined destination. We apply a 3D convolutional neural network (3D CNN) to detect potential grasps and estimate their pose.
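As an illustration of what such a network can look like, the sketch below shows a minimal 3D CNN in PyTorch that scores voxelized grasp candidates and regresses a pose. The layer sizes, voxel resolution and output parameterization are illustrative assumptions only, not the network developed in SFI Manufacturing.

```python
# Minimal, illustrative 3D CNN for grasp detection (not the project's actual network).
# Input: voxelized depth/point-cloud crops around grasp candidates;
# output: a grasp success score and a 6-DoF pose estimate per candidate.
import torch
import torch.nn as nn

class GraspNet3D(nn.Module):
    def __init__(self, voxel_size=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                      # 32 -> 16
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                      # 16 -> 8
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                      # 8 -> 4
        )
        flat = 64 * (voxel_size // 8) ** 3
        self.score_head = nn.Sequential(nn.Linear(flat, 128), nn.ReLU(), nn.Linear(128, 1))
        self.pose_head = nn.Sequential(nn.Linear(flat, 128), nn.ReLU(), nn.Linear(128, 6))

    def forward(self, voxels):
        x = self.features(voxels).flatten(1)
        grasp_score = torch.sigmoid(self.score_head(x))   # probability that the grasp succeeds
        grasp_pose = self.pose_head(x)                     # x, y, z, roll, pitch, yaw of the gripper
        return grasp_score, grasp_pose

# Example: one batch of eight 32x32x32 occupancy grids cropped around candidate grasp points.
candidates = torch.rand(8, 1, 32, 32, 32)
scores, poses = GraspNet3D()(candidates)
```

In practice such a model would be trained on labelled grasp attempts, for example the simulated and real grasp data discussed below.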

We have built a database comprising five types of objects with different poses and illuminations for experimental performance evaluation. The experimental results demonstrate that the vision-based robotic system can grasp objects successfully under different poses, illuminations and scenes.

Figure 3. 3D image of a bin with steel parts. The red arrow shows the gripping point and angle.
Figure 4. By using a differential flow sensor it is possible to distinguish a successful grasp from an unsuccessful one.

Using VR for training
Deep learning models need huge amounts of data in order to produce a good model. Last year, Google ran an experiment on learning to grasp: between six and 14 robots were picking up objects at any given time, and over the course of two months the robots performed 800 000 grasp attempts. We have developed a simulation tool that produces simulated data to train our model. This will give an improved model, reduce the total number of real grasps needed, and save time.

Further work
Even though the algorithm can learn how to grasp from simulated data, a physical experiment is needed to learn from real-world data. To improve the grasping model, we will automate the learning process by using a robotic manipulator that continuously tries to grasp objects based on the already developed algorithm. By sensing successful grasps in the experimental setup, the algorithm will continue to learn how to make the best possible grasps.

2) 3D vision
Machine vision is the use and processing of images or point clouds by computers to execute tasks such as verification and localization of objects. In SFI Manufacturing we focus on 3D vision with point clouds. A point cloud represents the external surface of objects and can be described as a list of point coordinates x, y and z, visualised with or without colours. Recent developments in sensor technology enable us to generate point clouds quickly and with high accuracy, which makes them useful for industrial inspection and quality control. Compared to 2D vision, 3D vision is robust to variation in lighting conditions, simplifies object localization, and makes inspection of more unconstrained scenes possible.

3D vision in SFI Manufacturing
SFI Manufacturing has two approaches to 3D vision. We develop new algorithms to locate objects in point clouds using deep learning, and we use these algorithms, as well as existing methods in available industrial vision systems, to solve generic industrial issues such as bin-picking. In 2016 and the beginning of 2017, we have been evaluating the open source Point Cloud Library (PCL) and the commercial systems Scorpion and Halcon, in order to compare them with the methods developed in SFI Manufacturing, as well as to use them directly for object localization and inline inspection in industrial applications. The innovation project NAP¹, with help from SFI Manufacturing, has successfully demonstrated the use of point clouds in combination with laser triangulation to localize large parts with highly reflective material, see figures 6 and 7.

Figure 5. Colour point cloud of shiny metal tool parts from Teeness.
Figure 6 and 7. Detection of burrs using a 3D sensor.

Further work
In 2017, the plan is to get a better understanding of state-of-the-art methods, and to use this knowledge as input and reference for further development of new algorithms, as well as to apply it to new industrial applications. We will especially focus on reducing the time needed to set up an object localization system based on 3D vision.

¹ NAP: Nullfeilproduksjon i Autonome Produksjonssystemer (zero-defect production in autonomous production systems)
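To make the object-localization step concrete, here is a minimal sketch that registers a part model against a scene scan with ICP using the open-source Open3D library. The file names, voxel size and ICP parameters are placeholder assumptions; this is not the PCL-, Scorpion- or Halcon-based pipeline evaluated in the project.

```python
# Illustrative only: localizing a known part in a scene point cloud with ICP (Open3D).
import numpy as np
import open3d as o3d

scene = o3d.io.read_point_cloud("scene_scan.pcd")   # point cloud from the 3D sensor (placeholder file)
model = o3d.io.read_point_cloud("part_model.pcd")   # point cloud sampled from the CAD model (placeholder file)

# Downsample to speed up registration; 2 mm voxels assumed for a sensor working in metres.
scene_ds = scene.voxel_down_sample(voxel_size=0.002)
model_ds = model.voxel_down_sample(voxel_size=0.002)

# Refine an initial guess (here simply the identity) with point-to-point ICP.
result = o3d.pipelines.registration.registration_icp(
    model_ds, scene_ds,
    0.01,                                            # 10 mm correspondence search radius
    np.eye(4),                                       # initial pose guess
    o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

print("Estimated object pose (4x4 homogeneous transform):")
print(result.transformation)
print("Fitness (fraction of inlier correspondences):", result.fitness)
```

A real setup would replace the identity initial guess with a coarse alignment step (e.g. feature matching), which is part of what makes setup time a focus area.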
3) Simulation
Simulation in manufacturing control consists of developing a model and a virtual scene that form the basis for development and virtual testing of control systems. Simulation has the potential to drastically reduce development time for new processes. Many commercial products are available and used in industry, but much of the potential remains untapped.

Simulation in SFI Manufacturing
Continuing the work of SFI Norman, SFI Manufacturing has been working on verification of control applications using simulation models. In SFI Norman, a demonstration was made using 3D Create from Visual Components, with the integration done through a software stack with a proprietary protocol. The rise of OPC UA as the main Industry 4.0 protocol offers new possibilities by reducing the work needed to connect the control stack to the simulation system and the real shop floor. OPC UA is a standard available in many products such as PLCs, vision systems and MES systems, but realizing its potential requires standardisation of its profiles. SFI Manufacturing, together with the project NAP, has been focusing on building knowledge about OPC UA and improving its software ecosystem and device profiles.
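As a small, hedged illustration of the reduced integration effort OPC UA enables, the sketch below reads and writes a value with the open-source python-opcua (FreeOpcUa) client. The endpoint URL and node identifier are hypothetical placeholders, not the device profiles being developed in SFI Manufacturing or NAP.

```python
# Illustrative only: talking to an OPC UA server (simulated or real) with python-opcua.
from opcua import Client

ENDPOINT = "opc.tcp://localhost:4840/freeopcua/server/"   # hypothetical simulation server

client = Client(ENDPOINT)
client.connect()
try:
    # Look up a node by its namespace/identifier (placeholder node id).
    conveyor_speed = client.get_node("ns=2;s=Conveyor.Speed")
    print("Current conveyor speed:", conveyor_speed.get_value())

    # Write a new setpoint back to the (simulated or real) shop floor.
    conveyor_speed.set_value(0.5)
finally:
    client.disconnect()
```

With standardised device profiles, the same client code could address the simulation system and the physical equipment interchangeably, which is the point of the ongoing profile work.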

Figure 8. Shop floor simulation using Visual Components 3D Create.

Together with the innovation project AutoFlex², SFI Manufacturing uses a method of near-simultaneous execution of a physical environment and a simulated environment. Augmented reality (AR) is used to visualize the physical and simulated environments; figures 9 and 10 show the environment before and after AR is applied. Part of the result from AutoFlex is a method for effective and intuitive programming of robots and robot assembly applications, through restructuring existing CAD models and adding the required data to them. The structure of these CAD models has been developed further in SFI Manufacturing to simplify extraction of the required data. The CAD models are input to a simulated environment, where changes can be simulated and verified to be automation friendly.

Figure 9 and 10. The environment before and after AR is used for visualization.

Further work
We will continue working on visualisation to decrease development time for new manufacturing processes and modifications of existing processes. Furthermore, we will work on using and pushing interoperability standards and develop new methods for effective and intuitive programming.

² AutoFlex: Flexible automated manufacturing of large and complex products

New projects and project proposals
In the latest round of the Research Council's programme for User-driven Research-based Innovation (BIA), a total of 42 innovation projects and 5 researcher-driven projects were invited to negotiations. The projects described below are all relevant to the research topics within the area Robust and Flexible Automation, because they partly address the same research topics and have one or several industrial and/or research partners from the SFI. In addition, we have added a brief description of a submitted EU proposal with SFI partners and relevance.

Innovation project DAMP: Fast Development of new Automated Manufacturing Processes through digital integration and testing
Main objective: To achieve a digitally integrated manufacturing development process capable of delivering proven new and automated production processes in the shortest possible time, at minimum cost, and as automated as possible.
Expected results: To greatly reduce the time and cost involved in production process development, starting with a digital product model and ending with a verified and proven automated production process. The outcome will include automatic programming of virtual production processes from digital product models, automatic programming of production processes on manufacturing equipment, and automatic data capture and analysis for physical product and production process verification.

Innovation project FREM: Fleksibel Robotisert Elektromekanisk Montasje (flexible robotized electromechanical assembly)
Main objective: To enable automatic assembly of electromechanical products in small batches through research breakthroughs in human-robot cooperation and other key technological areas.
Expected results:
- Flexible solutions for secure and efficient human-robot cooperation
- 3D machine vision for automatic localization of components from CAD models
- Robotic force-based insertion for high-precision component assembly
- Communication and data models for information flow between production and design processes
- Methods for quick and efficient programming of a large number of assembly operations
- Generic, usable rules for design and development that take the latest assembly technologies into account and thus facilitate positioning and assembly with cooperating robots

Innovation project SmartChain: Digitaliserte, automatiserte og integrerte verdikjeder (digitalised, automated and integrated value chains)
Main objective: To develop methods and solutions that ensure efficient supply chains with a high degree of technology-supported production, management and control.
Expected results:
- Strategic concept for efficient supply chains with a high degree of automation and digitalization
- Method for automated production of volume products alongside the production of premium products and technical prototypes in a laboratory environment
- Methodology for integrated development of production-friendly products
- Integrated system for real-time monitoring and control, which effectively manages material flow and unexpected events in the supply chain

Knowledge-building project for industry CPS: The Cyber Physical System Plant Perspective
Main objective: To develop, utilize, implement and evaluate enabling technologies for the Norwegian industries of the future in a combined physical and virtual industrial model built on the principles of Industry 4.0, including digitalization and interconnectivity.
Expected results:
- Framework for a Norwegian approach to a digital manufacturing industry
- Roadmap of new digital (Industry 4.0) technologies
- Demonstration and evaluation of new technologies for CPS, where decision support through simulation capacities improves overall plant efficiency
- Methods for how advanced automated and human potentials and solutions in the plant manufacturing system can be increased, based on a new digital plant platform
- Use of cloud resources providing functionalities and computational capacities

Submitted EU proposal: Optimus (OPTimized In-line Measurement and control for manufacturing Systems), together with 11 other international partners
Main objective: To develop, implement and demonstrate a system-level architecture for metrology and control to improve the first-pass yield of micron-accurate products from 80-90% to 99%.
Expected results: The development and demonstration of three non-destructive inspection methods, a holistic process control approach, and system-level architectures, which will be implemented and demonstrated in three pilot production lines.

PhD and postdoc progress reports
PhDs and postdocs are essential resources within the SFI. Linn, Signe and Mathias are connected to the research area Robust and Flexible Automation.

Linn Danielsen Evjemo
My name is Linn Danielsen Evjemo and I started my PhD on the 1st of December, 2016. I have my master's degree from the Department of Engineering Cybernetics at NTNU. In my master's work I attempted to perform real-time telemanipulation of a humanoid robot using hand-held motion trackers. In my PhD I will focus on large-scale, robotized additive manufacturing using industrial robots and cold metal transfer welding. I will investigate whether it is possible to combine the large workspace of an industrial robot arm with the flexibility and relative affordability of traditional additive manufacturing methods.

Signe Moe
My name is Signe Moe and I started my postdoc on the 1st of January, 2017. I have a master's degree and a PhD from the Department of Engineering Cybernetics at NTNU. In my master's work, I studied path following of marine vehicles in the presence of unknown ocean currents. I finished my PhD in November 2016 on the topic of guidance and control of marine vehicles and set-based control of robotic systems. I am now planning to extend set-based theory to industrial needs.
I will partly collaborate with Linn Danielsen Evjemo on 3D printing using robots, and will also focus on applications such as collision avoidance and orientation control.

Mathias Hauan Aarbo
I am Mathias and I started my PhD in the fall of 2015. My PhD focuses on robotic assembly and sensor fusion. I come from the Department of Engineering Cybernetics and work mainly with sensor fusion and robotics. My master's thesis was on sensor fusion of delayed displacement measurements; the Bayesian formulation of how to handle that delay was my main topic. In my PhD I look at assembly with articulated robots under uncertainty. With enclosed robots we can know exactly where everything is placed and can make plans that will work in those environments. We as humans operate in an environment of uncertainty: objects are larger or smaller than we expected, but we compensate. That is what we are trying to make the robots do in a systematic fashion. The probabilistic approach to uncertainty treats things as likelihoods: it is likely that the piece is there, and based on how likely it is, the robot can form a better understanding of the object. Of particular interest is the screw-in process, where a small uncertainty can cause large problems. Estimating the angle of entry, the location of the hole and the relative orientation of the screw, combined with robust control strategies, will allow users to do faster prototyping of robot tasks and simplify automation.
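As a toy illustration of this probabilistic view, the sketch below fuses a prior estimate of a hole's position (e.g. from a CAD model) with a noisy measurement using a standard one-dimensional Gaussian update. The numbers are made up and the example is not taken from the PhD work itself.

```python
# Toy illustration only: a one-dimensional Bayesian update of the belief about a hole's position.
prior_mean, prior_var = 0.0, 4.0      # hole position from the CAD model (mm) and its uncertainty
meas, meas_var = 1.2, 1.0             # measured position (mm) and sensor noise variance

# Standard Gaussian (Kalman-style) update: posterior combines prior and measurement.
gain = prior_var / (prior_var + meas_var)
post_mean = prior_mean + gain * (meas - prior_mean)
post_var = (1.0 - gain) * prior_var

print(f"Posterior hole position: {post_mean:.2f} mm, variance {post_var:.2f}")
# The posterior variance can inform the controller whether to insert directly
# or to probe/search before attempting the screw-in operation.
```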

Centre manager SFI Manufacturing
Sverre Gulbrandsen-Dahl, SINTEF Raufoss Manufacturing
E-mail: sverre.gulbrandsen-dahl@sintef.no
Phone: +47 916 01 205

Leader of research area Robust and Flexible Automation
Lars Tore Gellein, SINTEF Raufoss Manufacturing
E-mail: lars.tore.gellein@sintef.no
Phone: +47 920 38 688

Stay updated!
Visit the website, or follow SFI Manufacturing on Twitter, for updates and information about the programme and research areas.
sfimanufacturing.no
@SFI_Manufact