D2.3 Safety evaluation and standardization


Project Acronym: ColRobot
Project full title: Collaborative Robotics for Assembly and Kitting in Smart Manufacturing
Project No:
Call: H2020-ICT-2015
Coordinator: ENSAM
Project start date: February 1, 2016
Project duration: 36 months

Abstract

This public deliverable describes the evaluation of the safety components and presents work relevant for standardization bodies.

Document control sheet
Title of Document: Safety evaluation and standardization
Work Package: WP2 Safety and standardization
Status: Final Document
Version: v.8
File Name: ColRobot D2.3
Dissemination Level: Public
Partner Responsible: IFF

Versioning and contribution history
v.1 (15/12/2017): First draft version - IFF
v.2: Second draft with input from UC and CITC - UC, CITC
v.3: Revised version - IFF
v.4: Revisions - IFF
v.5: Final version ready for internal review - IFF
v.6: Coordinator review - ENSAM
v.7: Changes from Technaid and formatting - Technaid, IFF
v.8: Changes from UC and IFF - UC, IFF

Disclaimer

This document is provided "as is" with no warranties whatsoever, including any warranty of merchantability, noninfringement, fitness for any particular purpose, or any warranty otherwise arising out of any proposal, specification or sample. No license, express or implied, by estoppel or otherwise, to any intellectual property rights is granted herein. The members of the project ColRobot do not accept any liability for actions or omissions of ColRobot members or third parties and disclaim any obligation to enforce the use of this document. This document reflects only the authors' view and the Commission is not responsible for any use that may be made of the information it contains. This document is subject to change without notice.

Index

1. Introduction
2. Safety evaluation
2.1. Experimental evaluation of the workspace monitoring system for safety (Objective, Description, Results)
2.2. Evaluation of the approach speed of humans (CITC) (Objective, Description, Results)
2.3. Evaluation of hand speed estimation (UC) (Objective, Description, Results)
3. High resolution camera with structured aperture evaluations
3.1. Evaluation of the soft safety functionality (Objective, Description, Results)
3.2. Evaluation of the process support functionality (Objective, Description, Results)
3.3. Discussion of the high resolution camera system with a structured aperture (Competing requirements, Outlook)
4. Standardization
4.1. Relevant standards
4.2. Standardization-relevant issues
4.3. Activities to coordinate with standards
5. Summary

1. Introduction

In this document, we present testing results of the safety system in use in ColRobot, including specific results that will contribute to standards and best practices. The safety components used and developed in the ColRobot project are specific to mobile manipulators and seek to advance the state of the art regarding human detection and dynamic safety areas. This is particularly important for mobile manipulators, as a high degree of flexibility is needed to accommodate the robot moving around in a given workspace.

As a recap, the overall objectives of work package 2 can be summarized as follows:
- Set up a system for human and environment detection
- Detect human operators working in the robot's workspace and track their position relative to the robot in order to influence the robot's velocity
- Provide dynamic safety areas that change in size and position according to the risks and robot states, and investigate how different safeguarding methods can be combined for specific applications
- Evaluate the systems according to international standards

Based upon the risk analysis of the specific use cases, a number of different safety sensors safeguard the mobile platform, the manipulator, and the tool and/or handled parts during different phases of work. Put simply, laser scanners safeguard the platform during normal platform motion. Additionally, a workspace monitoring system has been adapted to work on the various ColRobot demonstrators in order to safeguard different tools. To support the workspace monitoring system and to better calculate the minimum required safety distance at any time, a combination of UWB and IMU sensors is used to determine the speed and position of human operators and their arms in the workspace relative to the robot. In addition to the safety functionality offered by the workspace monitoring system, further soft safety features have also been implemented to support the robot and to maintain high system availability and throughput. An overview of the components of the workspace monitoring system and the functionalities offered by each is given in Figure 1.

Figure 1: Overview of the workspace monitoring hardware and the functionalities offered by the different components

This document describes the tests and evaluations carried out on the safety systems and on the novel soft safety functionalities of the workspace monitoring system. In particular, the following evaluations have been carried out and are described here:

- Evaluation of the detection of intruding objects for hard safety by the workspace monitoring system under various lighting conditions according to IEC 61496-4
- Evaluation of the performance of the UWB sensors for measuring the position and speed of humans relative to the mobile manipulator
- Evaluation of the performance of the IMU sensors for measuring the position and speed of human arms
- Evaluation of the performance of the soft safety functionality of the workspace monitoring system for the following tasks:
  o Detection and classification of humans for soft safety applications
  o Measurement of objects for supporting bin-picking or related tasks

Furthermore, this document describes which standards this work could interest and offers insights, questions, and input for specific standardization committees.

2. Safety evaluation

In this section we describe the evaluations carried out related to safety functionalities.

2.1. Experimental evaluation of the workspace monitoring system for safety

An experiment was carried out to test the limitations of the monitoring system under adverse lighting, as described in IEC 61496-4 "Safety of machinery - Electro-sensitive protective equipment - Part 4: Particular requirements for equipment using vision based protective devices (VBPD)".

Objective

The objective of the experiment is to determine what effect different adverse lighting conditions have on the workspace monitoring system. In particular, we want to know whether the system always moves into a failsafe state or whether lighting conditions can cause a dangerous failure in which a real intrusion goes unreported (a non-alarm). Furthermore, we aim to determine the limits of external lighting for the current set-up (how much extra light is too much, and which kinds of lighting are challenging).

Description

The test set-up utilized the current demonstrator of the ColRobot workspace monitoring system available at the IFF facilities. This includes a table and the workspace monitoring system, mounted approximately 1.5 m above the ground and at a distance of approximately 1.5 m from the table.

Figure 2: Test set-up with workspace monitoring system overseeing the ColRobot work table

For testing purposes, we mounted an aluminium rod with a diameter of 12 mm to the table (Figure 3). As per the standard, the rod was colored matte black as a worst case. We then conducted two sets of tests: one with the stationary rod in the workspace, and a second control without any objects in the workspace. A virtual safety zone with dimensions 50 cm x 50 cm x 60 cm was manually positioned on the table surface, so that the stationary rod was either inside the safety zone or not (Figure 4).

Figure 3: Test rod with diameter 12 mm attached to a stationary flange for tests (left) and approximate size of the monitored zone (right)

Figure 4: Tests with stationary rod in monitored safety zone (left) and with monitored safety zone empty (right)

During our tests we logged the camera measurements and the output signal from the workspace monitoring system, which would be passed on to the safety circuit in the ColRobot systems.

Figure 5: Exemplary adverse lighting set-up (with incandescent lamp)

The following table shows the specifications of the different lights used to simulate adverse lighting conditions.

Table 1: Specification of lights used to simulate adverse lighting conditions

a) Incandescent light: Hedler H25s halogen (quartz) lamp
   - Rated power: ... W
   - Rated voltage: 230 V
   Comment: The light intensity was adjusted by changing the distance of the interfering light to the test set-up.

b) Fluorescent light: two different light sources were used. The first one was fixed 1200 mm above the workspace and provided an additional 600 lux.
   - Size: T8 x 600 mm (25 mm diameter)
   - Rated power: 55 W, 230 V
   - Colour temperature: ... K
   The second fluorescent light source was on a tripod.
   - Size: T8 x 600 mm (25 mm diameter)
   - Rated power: 18-20 W
   - Colour temperature: ... K
   Comment: The light intensity was adjusted by changing the distance of the interfering light (on the tripod) to the test set-up.

c) Flashing beacon: Nikon SB600
   Comment: Manually triggered at approximately 1 Hz.

d) Stroboscopic light: Cameo Thunder Wash 100RGB
   - RGB LEDs, each LED with 0.2 W
   Comment: Manually programmed with RGB channels at full intensity (white light) and a 1 Hz strobe frequency.

The ambient lighting was measured at 400 lux, and the accuracy of the light intensity measurement was +/- 5%. The set-up was in a laboratory with no direct sunlight, on an overcast winter day. The tests were carried out such that the light was switched on/off during the measurement, so that not only the stationary behaviour at a specific light intensity was measured, but also the system reaction to sudden changes in light type and intensity. As the results will show, besides the overall limit on how much incandescent light the system is able to handle, most errors were due to a sudden change in the light, which resulted in a short false measurement.

Each individual test was repeated 10 times. The results are listed together with how many times out of 10 that result was observed (e.g. "No 9/10" means that in nine out of the ten tests no intrusion was detected).

Results

Table 2 shows the test sequence and the test results. Of particular importance was the transition from one lighting situation to another. We observed that sudden changes in light were often more of a challenge than the static lighting conditions. Therefore, for all cases except those with stroboscopic light, there are separate results for the transitions (turning the interfering light on and off) and the static situation (interfering light on).

Table 2: Test sequence and results. For each step, the first result is without the test piece (nothing should be detected) and the second with the test piece (an intrusion should be detected); each entry states whether an intrusion was reported (yes/no) and in how many of the 10 repetitions.

Q-Tests - Normal operation

Test 1: switch on incandescent light with 250 lux increase over ambient light (400 lux -> 690 lux): No (10/10) | Yes (10/10)
  incandescent light on with 250 lux increase over ambient light (690 lux): No (10/10) | Yes (10/10)
  switch off interfering light (690 lux -> 400 lux): No (9/10) | Yes (10/10)

Test 2: switch on flashing beacon placed at the outer limit of the sensing zone, at least 3 m from the optical axis of the sensor and 2 m in height (800 lux): No (10/10) | Yes (10/10)

Test 3: switch on fluorescent light sources (with uniform light intensity increase of 250 lux over ambient light) (400 lux -> 640 lux): No (10/10) | Yes (10/10)
  fluorescent light on (640 lux): No (10/10) | Yes (10/10)
  switch off interfering light (640 lux -> 400 lux): No (10/10) | Yes (10/10)

Test 4: switch on incandescent light source with a round object in front of the light to cast a shadow on the passive pattern (<50% of the area viewed by the projection system) (789 lux): No (7/10) | Yes (10/10)
  interfering light with shadow on (789 lux): No (10/10) | Yes (10/10)
  switch off interfering light (789 lux -> 400 lux): No (9/10) | Yes (10/10)

R-Test - Failure to danger caused by indirect light (pattern)

Test 5: switch on incandescent light source, producing a light increase of 1000 lux over 500 lux ambient light (400 lux -> 1400 lux): Yes (10/10) | Yes (10/10)
  incandescent light source on (1400 lux): No (10/10) | Yes (10/10)
  switch off interfering light (1400 lux -> 400 lux): Yes (9/10) | Yes (10/10)

Test 6: switch on stroboscopic light source (400 lux): No (10/10) | Yes (10/10)

Test 7: switch on fluorescent light sources, producing a uniform light intensity increase of 500 lux over ambient light of 500 lux (400 lux -> 1000 lux): No (9/10) | Yes (10/10)
  fluorescent light on (1000 lux): No (10/10) | Yes (10/10)
  switch off interfering light (1000 lux -> 400 lux): No (10/10) | Yes (10/10)

S-Test - Failure to danger caused by direct light interference (sensor)

Test 8: switch on incandescent light source producing a light increase of 3000 lux over 500 lux ambient light; stroboscopic light source placed at the outer limit of the sensing zone, at least 3 m from the optical axis of the sensor and 2 m in height (400 lux -> 3500 lux): Yes (10/10) | Yes (10/10)
  incandescent light and stroboscopic light on (3500 lux): Yes (10/10) | Yes (10/10)
  switch off interfering light (3500 lux -> 400 lux): Yes (10/10) | Yes (10/10)

Test 9: switch on fluorescent light sources (producing a uniform light intensity increase of 1000 lux over ambient light) (400 lux -> 1400 lux): Yes (6/10) | Yes (10/10)
  fluorescent light on (1400 lux): No (10/10) | Yes (10/10)
  switch off interfering light (1400 lux -> 400 lux): No (10/10) | Yes (10/10)

T-Test - Failure to danger due to fading ambient light

Test 10: reduce ambient light to 250 lux (250 lux): No (10/10) | Yes (10/10)

Test 11: lights completely out (5 lux): No (10/10) | Yes (10/10)

Two main insights were gained during the tests. 1) There is an absolute limit to how much additional incandescent light the system is able to handle. This was measured at approximately 1100 lux over ambient light and was very reproducible (the light intensity was adjusted by manually moving the incandescent light source closer to and further away from the table). 2) The system reacts poorly to quick changes in lighting (both fluorescent in Test 9 and incandescent in Test 5), resulting in false positive detections.

From a safety standpoint, it is important that any failure lead to a failsafe mode. Therefore, a false positive is less critical than a false negative (a situation where the workspace monitoring system does not detect a real intrusion). There were no false negative situations. False positives occurred either during a transition in the lighting (a sudden change in either direction) or, in the case of Test 8, when there was generally too much light in the scene.

2.2. Evaluation of the approach speed of humans (CITC)

Objective

The objective of the experiment is to evaluate the performance of the UWB sensors for measuring the speed and position of humans relative to the robot in a shared workspace. Relevant KPIs include:
- Accuracy
- System frequency
- Latency for sending the signal to the safety controller

Description

A UWB geolocation solution (provided by the Ubisense company) is deployed within the ENSAM platform, composed of 6 UWB antennas, 1 controller, and UWB tags: 2 on the mobile robot to identify its angle of movement and 1 in the helmet of the human operator. A simple schematic drawing of this is shown in Figure 6.

Figure 6: Schematic drawing of the UWB sensor set-up with multiple antennas in the room, two receivers on the robot, and one in the helmet of the human operator

Two positioning algorithms are used, AOA (angle of arrival) and TDOA (time difference of arrival), to obtain better accuracy and reliability of the geolocation information. The theoretical spatial resolution is between 150 and 500 mm and the frequency of measurements reaches 100 Hz. We developed a web visualization interface that supervises the positions of the mobile robot and the human operator (Figure 7) and evaluates the instantaneous and average speeds of both. We also developed a web API able to send instantaneous geolocation information of the human and robot to other applications or to a safety controller:

[operator, x=., y=., time=.] [Robot_Front, x=., y=., time=.] [Robot_Back, x=., y=., time=.]
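As an illustration of how such streamed geolocation messages can be consumed, the following minimal Python sketch estimates the operator's instantaneous speed from two successive position fixes and derives the human-robot distance. The dictionary layout mirrors the message format above; the numeric values are hypothetical and this is not the actual ColRobot code.

```python
import math

def speed(p_prev, p_curr):
    """Instantaneous 2D speed (m/s) from two timestamped UWB fixes.

    Each fix is a dict like {'x': ..., 'y': ..., 'time': ...}, with
    coordinates in metres and time in seconds, mirroring the
    [operator, x=., y=., time=.] messages of the web API.
    """
    dt = p_curr['time'] - p_prev['time']
    if dt <= 0:
        raise ValueError("fixes must be strictly ordered in time")
    return math.hypot(p_curr['x'] - p_prev['x'], p_curr['y'] - p_prev['y']) / dt

def human_robot_distance(operator, robot_front, robot_back):
    """Distance (m) from the operator to the nearer of the two robot tags."""
    return min(
        math.hypot(operator['x'] - r['x'], operator['y'] - r['y'])
        for r in (robot_front, robot_back)
    )

# Hypothetical pair of consecutive fixes at roughly 30 Hz:
prev_fix = {'x': 1.00, 'y': 2.00, 'time': 10.000}
curr_fix = {'x': 1.04, 'y': 2.03, 'time': 10.033}
print(round(speed(prev_fix, curr_fix), 2), "m/s")  # ~1.52 m/s
```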

Figure 7: Web visualization interface to supervise the positions of the mobile robot and the human operator

Different tests and experiments were carried out by the CITC and Ubisense teams to benchmark the UWB geolocation solution. The main results are presented in the following.

Results

For the UWB solution deployed within the ENSAM platform, the frequency of the tags is limited to 30 Hz. The web visualization interface we developed shows an accuracy of 500 mm. A second benchmark platform was developed by Ubisense to determine the accuracy for a running person within an area (Figure 8). The frequency of the tags is set to 100 Hz and the accuracy is about 200 mm. A video presenting this test is available.

Figure 8: Test of the UWB geolocation solution with a running person, evaluating accuracy and latency

Finally, we tested a geofencing solution developed by Ubisense, dedicated to the traceability of objects in bins or on shelves (Figure 9). The frequency of the tags is set to 100 Hz and the accuracy is better than 100 mm (the bins are spaced 100 mm apart). A video presenting this test is also available.

Figure 9: Test of the UWB geofencing solution, evaluating accuracy and latency

The latency required to display the geolocation information (or to send this information to a controller) was verified for all three tests. The Ubisense geolocation solutions proved very efficient in terms of latency, even at high tag speeds (a running person or an object thrown from one place to another).

In conclusion, our tests showed that the UWB solution developed by Ubisense is efficient and reliable and can be used to evaluate the safety zone of the human operator, on the condition that the update frequency of the tags is sufficiently high; the lifetime of the tag batteries is, of course, inversely proportional to the tag frequency.
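To illustrate how such speed and position estimates feed into dynamic safety areas, the sketch below computes a simplified protective separation distance in the spirit of speed and separation monitoring (ISO/TS 15066). All parameter values are placeholders, not ColRobot figures, and the braking model is a deliberate simplification.

```python
def protective_separation_distance(v_human, v_robot, t_reaction, t_stop,
                                   c_intrusion=0.2, z_uncertainty=0.5):
    """Simplified protective separation distance, loosely following the
    structure of speed and separation monitoring (ISO/TS 15066):

        S_p = v_h * (t_r + t_s) + s_robot + C + Z

    v_human:       measured human approach speed (m/s), e.g. from UWB
    v_robot:       current robot speed towards the human (m/s)
    t_reaction:    sensing + processing reaction time (s)
    t_stop:        robot stopping time (s)
    c_intrusion:   intrusion distance constant (m), placeholder value
    z_uncertainty: position measurement uncertainty (m), e.g. UWB accuracy
    """
    s_human = v_human * (t_reaction + t_stop)
    # Robot travels at full speed during the reaction time, then brakes;
    # a triangular (linear) braking profile is assumed here.
    s_robot = v_robot * t_reaction + 0.5 * v_robot * t_stop
    return s_human + s_robot + c_intrusion + z_uncertainty

# Placeholder numbers: human walking at 1.6 m/s, robot at 0.5 m/s,
# 0.1 s reaction time, 0.4 s stopping time, 0.5 m UWB uncertainty.
print(round(protective_separation_distance(1.6, 0.5, 0.1, 0.4), 2), "m")  # 1.65 m
```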

2.3. Evaluation of hand speed estimation (UC)

Objective

The objective of the experiment is to evaluate the performance of the IMU sensors for measuring the position (and the speed derived from the positional data) of human arms relative to their bodies. Relevant KPIs include:
- Accuracy in estimating the arm pose in space in different configurations
- System speed (including communication latencies)

Description

Human arm position estimation is required to avoid collisions in human-robot collaborative operations, as in the ColRobot prototype (Figure 10). Knowledge of the human arm position can be used to speed up robot task execution and at the same time improve safety conditions for human workers.

Figure 10: Schematic drawing of the use of IMUs to track the speed of human arms, to be combined with information from the UWB sensors for improved human position and speed tracking.

In order to capture human arm movement, the Tech-MCS V3 IMU system from Technaid is used. The Tech-MCS V3 incorporates 3D inertial sensors called "Tech-IMU" (each containing 3D accelerometer, gyroscope, magnetometer, and temperature sensors) and a hub device called "Tech-HUB V3" that organizes the data obtained from the Tech-IMUs and sends it to a PC. Data is transmitted by USB cable or by Bluetooth. These IMUs can be attached directly to the human body. Five Tech-IMUs are used to capture arm movements, placed on different body segments: left forearm, left arm, right forearm, right arm, and chest.

Several kinds of data can be extracted from the IMU system; in this case the system provides the 3D orientation of each Tech-IMU, which is then expressed relative to the Tech-IMU placed on the chest of the human. Taking into account the kinematics of the human torso, the distance between the human chest and each human hand can be obtained. Nevertheless, these distances are subject to errors. There are two main sources of error: those related to the estimation of the IMU orientation provided by each Tech-IMU, and those related to the rough estimation of the human's body measures (as well as the differences from human to human in terms of body dimensions). The body measurements we considered are: (1) the distance between the human belly and the human shoulder, (2) the length of the human arm (the distance between shoulder and elbow), and (3) the length of the human forearm (the distance between elbow and wrist). Note that the length of the human hand was not taken into account in this approach. The velocity is estimated by differentiating the estimated arm position.

From the human upper-body kinematics, the following equation is obtained for the positioning error (e_max) when estimating the position of the human wrist:

e_max = θ_max (3 d_10 + 2 d_7 + d_3 + d_4)

where:
- θ_max is the maximum angular error of each Tech-IMU (in radians);
- d_10 is the forearm length (distance between the elbow and the wrist);
- d_7 is the arm length (distance between the shoulder and the elbow);
- d_3 is the distance between the belly and the throat;
- d_4 is the distance between the throat and the shoulder.

The forearm length is weighted most heavily because orientation errors in the chest, arm, and forearm IMUs all propagate to it, whereas the torso segments are affected by the chest IMU alone. The above variables are illustrated in Figure 11.
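A minimal sketch of this error bound follows, with hypothetical body measures (the actual ColRobot subject measurements are not given in this document):

```python
import math

def wrist_error_bound(theta_max_deg, d10, d7, d3, d4):
    """Worst-case wrist positioning error per the bound above:
    e_max = theta_max * (3*d10 + 2*d7 + d3 + d4).

    theta_max_deg:   maximum angular error per Tech-IMU (degrees)
    d10, d7, d3, d4: body segment lengths (m) as defined in the text
    """
    theta_max = math.radians(theta_max_deg)  # the formula expects radians
    return theta_max * (3 * d10 + 2 * d7 + d3 + d4)

# Hypothetical measures: forearm 0.26 m, arm 0.30 m,
# belly-throat 0.45 m, throat-shoulder 0.18 m; 0.7 deg angular error.
e = wrist_error_bound(0.7, 0.26, 0.30, 0.45, 0.18)
print(f"e_max = {e * 100:.1f} cm")  # ~2.5 cm, the same order as the measured +-2 cm
```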

Figure 11: Schematic drawing illustrating the body measures d_3, d_4, d_7, d_10 and the wrist positioning error e_max of each Tech-IMU

Results

The provider of the IMU system states an angular error θ_max of 0.7 degrees RMS for each Tech-IMU in static conditions. We conducted experiments to evaluate the real system error by placing humans in known configurations (in static conditions) and comparing the estimated minimum distance with the ground truth. Humans between 1.65 and 1.85 meters tall, whose body measures differed by up to 5%, took part in these experiments. The wrist position was estimated with an average error of about +-2 centimetres. This error is affected by the angular errors of the IMUs and by the human physiognomy. We receive human wrist position estimates at 25 Hz.

3. High resolution camera with structured aperture evaluations

As described in the introduction, the workspace monitoring system combines a number of features into one system. In this section, we evaluate the functionalities offered by the high resolution camera with a structured aperture, focusing on the tasks of human detection and process support.

3.1. Evaluation of the soft safety functionality

In this section we describe the evaluation of the soft safety functionality of the workspace monitoring system, namely the ability of the system to detect humans and distinguish humans from other objects in the workspace. This is an evaluation of the software we use with the camera system; our approach uses RGB images and deep learning. For the experiment, we used the mobile robot Annie equipped with a

head-mounted camera system similar to the one we developed in ColRobot. Since we are focusing on the software aspects and since the image quality is comparable, this substitution does not affect the evaluation.

Objective

The objective of the experiment is to determine the performance of the human detection algorithm. We wanted to know whether it is possible to use an arbitrary dataset of persons for the detection of persons and human body parts in our specific laboratory and production environment. The set-up is meant to test variations in the number of persons, their clothes, head cover, gloves, and occlusion of body parts. For evaluation, we determined the true positives and false positives based on the Intersection over Union of the detected bounding box and a ground truth box. The relevant KPIs include the hit rate, the precision, and the mean average precision.

Description

For the human detection, we used the Tensorflow Object Detection API, which provides several models of neural networks and detection methods. For this application, we chose the ResNet101 network, which won the 2015 COCO dataset challenge, together with the Faster-RCNN detection architecture. For the Tensorflow API, a Faster-RCNN-ResNet101 model with weights pre-trained on the COCO dataset is available. To develop the soft safety functionality, we labeled a subset of the ImageNet dataset and created bounding boxes for safety-relevant categories. These seven categories are person, head, body, arm, hand, leg, and foot, and they can occur multiple times in one image. Figure 12 shows example images of the labeled dataset, illustrating the arbitrary content regarding people. In total, we labeled 200 images, with the most frequent category being head (511 objects) and the least frequent foot (154 objects) (Table 3). We split the labeled dataset into a training set and a test set, which are processed by the transfer-learning method of the Tensorflow Object Detection API. The training was done for 100,000 steps.

Figure 12: Variation in training and testing datasets
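For reference, inference with such a frozen Faster-RCNN-ResNet101 graph typically looks like the following TensorFlow 1.x sketch. The file path is a placeholder, and this is an assumed-standard usage pattern of the Object Detection API, not the actual ColRobot code:

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x, as used by the Object Detection API at the time

PATH_TO_FROZEN_GRAPH = 'faster_rcnn_resnet101_coco/frozen_inference_graph.pb'  # placeholder

# Load the frozen detection graph once.
graph = tf.Graph()
with graph.as_default():
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(PATH_TO_FROZEN_GRAPH, 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

def detect(image_rgb):
    """Run detection on one HxWx3 uint8 RGB image; returns boxes, scores, classes."""
    with tf.Session(graph=graph) as sess:
        outputs = [graph.get_tensor_by_name(n) for n in
                   ('detection_boxes:0', 'detection_scores:0', 'detection_classes:0')]
        image_tensor = graph.get_tensor_by_name('image_tensor:0')
        boxes, scores, classes = sess.run(
            outputs, feed_dict={image_tensor: np.expand_dims(image_rgb, axis=0)})
    return boxes[0], scores[0], classes[0]
```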

Table 3: Partitioning of the dataset into training and testing subsets (per-category object counts and image totals for person, head, body, arm, hand, leg, and foot)

For our experiment we used the mobile robot Annie equipped with a camera system similar to the one developed for ColRobot (Figure 13). The ground truth dataset was hand-labelled (Table 4).

Figure 13: Mobile robot Annie equipped with a head-mounted camera system similar to the ColRobot workspace monitoring system

Table 4: Labelled ground truth data
person: 246, head: 216, body: 316, arm: 305, hand: 216, leg: 250, foot: 262
Objects total: 1811; Images total: 184

We made different types of images focusing on the following variations:
- single and multiple persons,
- changes in illumination,
- partly hidden persons and body parts (gloves, helmets).

Moreover, we used different clothes (color of jackets, lab coat) and changed the body posture (sitting, standing, walking). Figure 14 illustrates a subset of these images.

Figure 14: Image subset of the experiment

We determined the performance of the algorithm on these datasets by calculating the Intersection over Union (IoU) of the bounding boxes of a detected object and a ground truth object. If the IoU is greater than a certain threshold, the detection is counted as a true positive (TP), otherwise as a false positive (FP). If the detector misses an object which is in the ground truth, it is counted as a false negative (FN). We used 0.5 as the threshold for the IoU. For the evaluation, we determine the hit rate (R) and the precision (P) for each class, and the mean average precision (MAP) of the classifier. The hit rate indicates whether relevant objects are detected (and not left out), whereas the precision describes whether a detection is relevant. We computed these metrics as follows:

Hit rate R of category c: R(c) = TP(c) / (TP(c) + FN(c))

Precision P of category c: P(c) = TP(c) / (TP(c) + FP(c))

Mean average precision: MAP = (1/|C|) * Σ_{c in C} P(c)
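A minimal sketch of these metrics follows; the TP/FP/FN counts are assumed to have been produced upstream by IoU-matching detections against the ground truth, and the example numbers are hypothetical:

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix_min, iy_min = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix_max, iy_max = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def hit_rate(tp, fn):
    return tp / (tp + fn)

def precision(tp, fp):
    return tp / (tp + fp)

def mean_average_precision(per_class_precision):
    """MAP as defined in the text: the mean of the per-class precisions."""
    return sum(per_class_precision.values()) / len(per_class_precision)

# Hypothetical example: two partially overlapping boxes and one class's counts.
print(round(iou((0, 0, 2, 2), (1, 1, 3, 3)), 3))       # 0.143 (overlap 1 / union 7)
print(hit_rate(tp=200, fn=46), precision(tp=200, fp=10))
```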

Results

The method is able to detect objects and classify them into the seven categories. We obtained correct detections mostly for persons and heads (Figure 15 (a) and (b)), although the lighting was at times challenging (e.g. humans not well lit, or backlighting) (Figure 15 (d)). The classifier is, however, erroneous in some images. It did not detect all categories (Figure 15 (c): arms and hands) and misclassifies objects which are not in the ground truth (Figure 15 (e): a robotic arm is not a human arm). The overall performance for the detection of the seven categories was measured by the MAP, using an IoU threshold of 0.5 (Table 5). Our results show that the detection of persons is stable in most images (Table 5). Furthermore, the algorithm detects the head and body more reliably, whereas for the other body parts it has a lower performance.

Figure 15: Image subset of the experimental results

Table 5: Detection results at IoU = 0.5 (TP, FN, FP, hit rate, and precision per category, plus overall MAP)

When considering safety, a false negative signal is more critical than a false positive. However, given that the rationale behind this human detection is to avoid unnecessary machine stops, false positive classifications can be considered the more critical case here and are what should be minimized. It is therefore a good result that the number of false positive signals is much lower than the number of false negatives, especially for the extremities (arms, hands, legs, and feet). We can also see that the detection of the head, the body, and of a person as a whole is much better than the detection of the extremities. This is due to the low contrast between the extremities and the environment in our tests, as well as the much larger variation in poses the extremities can take in a given picture. This points to a need for a larger dataset for training these body parts. However, given our goal of soft safety, detection of a whole person is sufficient. This means that the image must be large enough for the body and head to be captured (a requirement on the optics).

In the future we will test the results against another neural network (as a benchmark), and during testing at the end-user sites we will also be able to generate more training data. Since light and environmental conditions play a large role in detection, it will also be good to have tests in the end-user facilities. In particular, we believe that these conditions will be more favourable in the Thales facilities, with one large room (vs. smaller rooms with different lighting at the IFF facilities) and without daylight (which can lead to strong variations in lighting). Given that the high resolution camera with a structured aperture is used both for soft safety functionality and for process support, an overall conclusion regarding the complete camera system is discussed in Section 3.3.

3.2. Evaluation of the process support functionality

Objective

The objective of the experiment is to determine the performance of the high resolution camera with a structured aperture for supporting the picking processes. We originally chose a high resolution camera with a structured aperture in order to detect objects and their positions with the same camera system that is already being used for safety purposes and which has a good view of the scene, thereby supporting picking processes. The functional principle of the camera is described in detail in an earlier deliverable.

Description

To test the measurement error, we took images of a-priori known objects with the high resolution camera with a structured aperture. The objects had been previously measured by hand. The objects were

placed on a table at a distance of approximately 1 m from the camera system. This distance corresponds to the planned distance between the camera and the objects during normal operation. The images were taken at an ambient lighting of approximately 800 lux, which also corresponds to the lighting conditions to be expected. The objects were moved to different orientations and positions on the table, and in total over 10 images were taken. The images were processed according to the stereo techniques used, and a measurement of a known length of the object was taken. These measurements were compared to the ground truth measurements to determine the error.

Results

For the first test, a wooden box with side length 172 x 172 mm was placed on the table surface ca. 1000 mm from the camera system (Figure 16). Given the basis length of 24 mm from the calibration of the high resolution camera with a structured aperture, and given the 50 mm lens used, we expected quite a high error in the depth component of a measurement. The high resolution camera with a structured aperture measured a side length of 138 mm (averaged over multiple measurements), resulting in an error of approximately 20%.

Figure 16: Wooden box with length and width of 172 mm

A further measurement was carried out with an aluminium profile (Figure 17) with a side length of 200 mm, also placed at a distance of ca. 1000 mm from the camera system. Here the average measurement was 215 mm, resulting in an error of approximately 7.5%.

Figure 17: Aluminium profile with length of 200 mm

The differences in error can be attributed to the low depth resolution and the different viewing angles of the measured objects. The length measurement of the wooden box lay more along the z-direction of the image and is therefore more affected by the depth resolution. The aluminium profile was oriented more parallel to the image plane of the camera, so its length measurement depends less on the z-component.

In general, given the combination of the relatively small basis length of the stereo system (24 mm), the wide angle of view (a 50 mm lens was used), and the distance between the camera and the object (ca. 1000 mm), we expected a relatively high uncertainty in the depth information. Rough calculations of the sensitivity of the depth measurement show that the system should have a depth resolution of approximately 8 mm in the given configuration. In the original conception of the camera system solely for process information, the use of a tele-lens (150 mm - 200 mm) was considered, which would have a smaller field of view but a much higher depth resolution.

One particular challenge for the measurement was the fact that our structured aperture set-up produced individual images with different gray values. This made the use of traditional stereo matching techniques very hard. A further challenge was the fact that, again due to the structured aperture, the focal plane was quite small and objects outside of the focal plane were quite unsharp. Both of these challenges also contributed to the large error we measured.
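A worked sketch of such a rough sensitivity calculation, using the standard stereo relation Δz ≈ z²·Δd / (f·b) and assuming a disparity accuracy Δd of roughly one 10 µm pixel (the actual pixel pitch is not stated in this document):

```python
def depth_resolution(z_mm, focal_mm, baseline_mm, disparity_accuracy_mm):
    """Approximate stereo depth resolution: dz = z^2 * dd / (f * b).

    z_mm:                  object distance (mm)
    focal_mm:              lens focal length (mm)
    baseline_mm:           stereo basis length (mm)
    disparity_accuracy_mm: disparity measurement accuracy on the sensor (mm)
    """
    return (z_mm ** 2) * disparity_accuracy_mm / (focal_mm * baseline_mm)

# ColRobot configuration: 1000 mm distance, 50 mm lens, 24 mm basis length.
# Assuming ~0.01 mm (one 10 um pixel) disparity accuracy:
print(round(depth_resolution(1000, 50, 24, 0.01), 1), "mm")   # ~8.3 mm, matching the ~8 mm estimate
# The once-considered 150 mm tele-lens would give a much finer resolution:
print(round(depth_resolution(1000, 150, 24, 0.01), 1), "mm")  # ~2.8 mm
```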

3.3. Discussion of the high resolution camera system with a structured aperture

In this section we discuss the results from Sections 3.1 and 3.2 in the context of the overall ColRobot system. We first highlight the competing requirements arising from the various functionalities assigned to the high resolution camera with a structured aperture. We then discuss how these competing requirements, and the ensuing compromises made in the design, contributed to the results achieved in this evaluation.

Competing requirements

We would first like to point out that the two main tasks assigned to the high resolution camera with a structured aperture impose a number of competing requirements. These competing requirements influence the angle of view of the lens and the overall dimensions and size of the system. With regard to the lens, we have seen that it should have as wide an angle of view as possible in order to detect humans for the soft safety functionality. On the other hand, to support process information, the lens should have a much smaller angle of view. The compromise of using a 50 mm lens was in practice optimal for neither situation.

A further requirement on the system design was to make it as small and lightweight as possible. In particular, this drove us to try a structured aperture so that only one camera is needed. This saved both overall size (normal stereo systems have a larger basis length between the individual cameras) and weight (only one camera was needed instead of two or more).

Outlook

Due to the strongly diverging requirements and the resulting compromises with regard to the high resolution camera with a structured aperture, we propose focusing on one capability and improving the corresponding results within the time remaining in the project. We therefore suggest focusing on how the high resolution camera with a structured aperture can increase the performance of the human detection algorithms. We propose to segment the images using the depth information provided by the camera system. This pre-processing step would mask the distracting features of the background. With this step taken, we expect the results of the human detection algorithm to improve due to fewer distracting features in the images. The depth resolution, even with a wider field of view and longer viewing distances, is sufficient for image segmentation. In this case, only the relevant areas near the robot and tools are considered for human detection.

4. Standardization

4.1. Relevant standards

Relevant standards for mobile manipulation and robotic safety include:
- ISO 10218-1 and -2
- ISO/TS 15066
- ISO EN
- RIA (currently in development)

4.2. Standardization-relevant issues

Specific issues and/or questions regarding these standards:

- ISO/TS 15066 / ISO 10218:
  o Measuring the speed of humans: what specific requirements apply to this measurement (performance level, etc.)? Is this separate system to be viewed as part of one complete system (sensor for human detection + sensor for speed measurement + all communication pathways + all software = the sensor that needs to fulfil PL d, Cat. 3 according to IEC 61508)? Considering current system integration approaches that use off-the-shelf components, this is an extreme barrier to innovation. What could be a sufficient pathway forward here?
  o We would like to inform the community about a novel use of a tactile sensor as a 3-position enabling switch for hand-guided movement. Currently only the traditional 3-position enabling switch is used, which can be ergonomically awkward for the user. Our approach uses a tactile sensor ring, which provides 3-position functionality (through thresholds) and which further requires contact at two cells (not neighbors) to ensure a full grip around the wrist. This allows users to change their grip between movements to the most comfortable and ergonomically viable position.

4.3. Activities to coordinate with standards

Standardizing bodies such as the ISO are consensus-based organizations that make their decisions based on input from the members of national committees. It is very difficult to influence a standardizing committee from the outside, and the best means to interact with the standards committees is through individuals who are on the committee and who can raise concerns based on new input. The ColRobot consortium has identified a number of ways to interact with standardizing committees to raise awareness of the issues mentioned above:

1) The Fraunhofer IFF is on the national committee for the working group that is responsible for ISO/TS 15066. In national meetings, the IFF can raise questions and issues that have been identified in ColRobot.
2) ColRobot can interact with other individuals who are involved in standardization through specific workshops. The euRobotics-sponsored ERF often has workshops focusing on standardization issues and is attended by many individuals who are active on ISO standards at the international level. ColRobot representatives from the Fraunhofer IFF and ENSAM will be present at the ERF 2018 workshop on standardization and will discuss issues relevant to ColRobot there.
3) Dissemination of best practices, especially with regard to the issues mentioned above.
4) The EU project COVR, which started in January 2018, focuses on issues regarding safety and shared safety facilities. The Fraunhofer IFF is a consortium member. This project provides a larger base for sharing best practices and a further opportunity for the best practices developed in ColRobot regarding risk analysis for mobile platforms to be disseminated in the community among the relevant stakeholders.

5. Summary

In summary, we have presented the results of various evaluations of the systems used for safety in the ColRobot project, as well as of the high resolution camera with a structured aperture and the software developed for the further soft safety and process support functionalities. Furthermore, we have identified standardization-relevant issues that have arisen in ColRobot and listed activities to coordinate with standards bodies and to disseminate best practices among the various stakeholders (not just standardization bodies, but also robotics end-users, component manufacturers, and system integrators).


Franka Emika GmbH. Our vision of a robot for everyone sensitive, interconnected, adaptive and cost-efficient.

Franka Emika GmbH. Our vision of a robot for everyone sensitive, interconnected, adaptive and cost-efficient. Franka Emika GmbH Our vision of a robot for everyone sensitive, interconnected, adaptive and cost-efficient. Even today, robotics remains a technology accessible only to few. The reasons for this are the

More information

BeNoGo Image Volume Acquisition

BeNoGo Image Volume Acquisition BeNoGo Image Volume Acquisition Hynek Bakstein Tomáš Pajdla Daniel Večerka Abstract This document deals with issues arising during acquisition of images for IBR used in the BeNoGo project. We describe

More information

D8.1 PROJECT PRESENTATION

D8.1 PROJECT PRESENTATION D8.1 PROJECT PRESENTATION Approval Status AUTHOR(S) NAME AND SURNAME ROLE IN THE PROJECT PARTNER Daniela De Lucia, Gaetano Cascini PoliMI APPROVED BY Gaetano Cascini Project Coordinator PoliMI History

More information

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Marco Cavallo Merging Worlds: A Location-based Approach to Mixed Reality Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Introduction: A New Realm of Reality 2 http://www.samsung.com/sg/wearables/gear-vr/

More information

Section 2 concludes that a glare meter based on a digital camera is probably too expensive to develop and produce, and may not be simple in use.

Section 2 concludes that a glare meter based on a digital camera is probably too expensive to develop and produce, and may not be simple in use. Possible development of a simple glare meter Kai Sørensen, 17 September 2012 Introduction, summary and conclusion Disability glare is sometimes a problem in road traffic situations such as: - at road works

More information

Using Auto FP High-Speed Sync to Illuminate Fast Sports Action

Using Auto FP High-Speed Sync to Illuminate Fast Sports Action Using Auto FP High-Speed Sync to Illuminate Fast Sports Action by Today s sports photographer not only needs to capture the action, but oftentimes produce a unique feature image for a client. Using Nikon

More information

Improving the Safety and Efficiency of Roadway Maintenance Phase II: Developing a Vision Guidance System for the Robotic Roadway Message Painter

Improving the Safety and Efficiency of Roadway Maintenance Phase II: Developing a Vision Guidance System for the Robotic Roadway Message Painter Improving the Safety and Efficiency of Roadway Maintenance Phase II: Developing a Vision Guidance System for the Robotic Roadway Message Painter Final Report Prepared by: Ryan G. Rosandich Department of

More information

Bluetooth Low Energy Sensing Technology for Proximity Construction Applications

Bluetooth Low Energy Sensing Technology for Proximity Construction Applications Bluetooth Low Energy Sensing Technology for Proximity Construction Applications JeeWoong Park School of Civil and Environmental Engineering, Georgia Institute of Technology, 790 Atlantic Dr. N.W., Atlanta,

More information

Physics 3340 Spring Fourier Optics

Physics 3340 Spring Fourier Optics Physics 3340 Spring 011 Purpose Fourier Optics In this experiment we will show how the Fraunhofer diffraction pattern or spatial Fourier transform of an object can be observed within an optical system.

More information

PROJECT FACT SHEET GREEK-GERMANY CO-FUNDED PROJECT. project proposal to the funding measure

PROJECT FACT SHEET GREEK-GERMANY CO-FUNDED PROJECT. project proposal to the funding measure PROJECT FACT SHEET GREEK-GERMANY CO-FUNDED PROJECT project proposal to the funding measure Greek-German Bilateral Research and Innovation Cooperation Project acronym: SIT4Energy Smart IT for Energy Efficiency

More information

Photometric Measurements in the Field. Ronald B. Gibbons Group Leader, Lighting and Infrastructure Technology

Photometric Measurements in the Field. Ronald B. Gibbons Group Leader, Lighting and Infrastructure Technology Photometric Measurements in the Field Ronald B. Gibbons Group Leader, Lighting and Infrastructure Technology Photometric Measurements in the Field Traditional Methods Luminance Meters Current Methods CCD

More information

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Quy-Hung Vu, Byeong-Sang Kim, Jae-Bok Song Korea University 1 Anam-dong, Seongbuk-gu, Seoul, Korea vuquyhungbk@yahoo.com, lovidia@korea.ac.kr,

More information

A Low Cost Optical See-Through HMD - Do-it-yourself

A Low Cost Optical See-Through HMD - Do-it-yourself 2016 IEEE International Symposium on Mixed and Augmented Reality Adjunct Proceedings A Low Cost Optical See-Through HMD - Do-it-yourself Saul Delabrida Antonio A. F. Loureiro Federal University of Minas

More information

Model OB-ITF Infrared Through-Beam Fiber Optic Optical Barrier Operator s Manual. Version

Model OB-ITF Infrared Through-Beam Fiber Optic Optical Barrier Operator s Manual. Version Model OB-ITF Infrared Through-Beam Fiber Optic Optical Barrier Operator s Manual Version 11-2010 Contents 1. Introduction... 2 2. Description... 2 2.1 Model Nomenclature... 2 2.2 Operating Principle...

More information

Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design

Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design Outline Chapter 1: Introduction Overview: Integration of Optical Systems Survey on current optical system design Case demo of optical system design 1 Overview: Integration of optical systems Key steps

More information

ISO INTERNATIONAL STANDARD. Photography Electronic still-picture cameras Methods for measuring optoelectronic conversion functions (OECFs)

ISO INTERNATIONAL STANDARD. Photography Electronic still-picture cameras Methods for measuring optoelectronic conversion functions (OECFs) INTERNATIONAL STANDARD ISO 14524 Second edition 2009-02-15 Photography Electronic still-picture cameras Methods for measuring optoelectronic conversion functions (OECFs) Photographie Appareils de prises

More information

ISO INTERNATIONAL STANDARD. Photography Electronic still-picture cameras Methods for measuring opto-electronic conversion functions (OECFs)

ISO INTERNATIONAL STANDARD. Photography Electronic still-picture cameras Methods for measuring opto-electronic conversion functions (OECFs) INTERNATIONAL STANDARD ISO 14524 First edition 1999-12-15 Photography Electronic still-picture cameras Methods for measuring opto-electronic conversion functions (OECFs) Photographie Appareils de prises

More information

Technical Application Guide

Technical Application Guide Lighting Control Wireless Occupancy Sensor, Wireless Multi Sensor Technical Application Guide Easily enhance your smart lighting system with the Philips wireless occupancy sensor and multi sensor, which

More information

The safe & productive robot working without fences

The safe & productive robot working without fences The European Robot Initiative for Strengthening the Competitiveness of SMEs in Manufacturing The safe & productive robot working without fences Final Presentation, Stuttgart, May 5 th, 2009 Objectives

More information

Indoor Positioning by the Fusion of Wireless Metrics and Sensors

Indoor Positioning by the Fusion of Wireless Metrics and Sensors Indoor Positioning by the Fusion of Wireless Metrics and Sensors Asst. Prof. Dr. Özgür TAMER Dokuz Eylül University Electrical and Electronics Eng. Dept Indoor Positioning Indoor positioning systems (IPS)

More information

It s set up! VISOR. The vision sensor with which you can immediately get going.

It s set up! VISOR. The vision sensor with which you can immediately get going. It s set up! VISOR. The vision sensor with which you can immediately get going. 1 Unpack, set up and get going never before have vision sensors been so powerful and so easily and intuitively operated.

More information

Motion Control of a Three Active Wheeled Mobile Robot and Collision-Free Human Following Navigation in Outdoor Environment

Motion Control of a Three Active Wheeled Mobile Robot and Collision-Free Human Following Navigation in Outdoor Environment Proceedings of the International MultiConference of Engineers and Computer Scientists 2016 Vol I,, March 16-18, 2016, Hong Kong Motion Control of a Three Active Wheeled Mobile Robot and Collision-Free

More information

Standard Operating Procedure for Flat Port Camera Calibration

Standard Operating Procedure for Flat Port Camera Calibration Standard Operating Procedure for Flat Port Camera Calibration Kevin Köser and Anne Jordt Revision 0.1 - Draft February 27, 2015 1 Goal This document specifies the practical procedure to obtain good images

More information

O ccupancy and Light Level Sensor L ow Vo l tage Ceiling Fixture Mount

O ccupancy and Light Level Sensor L ow Vo l tage Ceiling Fixture Mount FS - 2 0 5 v 2 O ccupancy and Light Level Sensor L ow Vo l tage Ceiling Fixture Mount SPECIFICATIONS Power Voltage.......................................24VDC Current Consumption...........................6.5

More information

CERTIFIED PROFESSIONAL PHOTOGRAPHER (CPP) TEST SPECIFICATIONS CAMERA, LENSES AND ATTACHMENTS (12%)

CERTIFIED PROFESSIONAL PHOTOGRAPHER (CPP) TEST SPECIFICATIONS CAMERA, LENSES AND ATTACHMENTS (12%) CERTIFIED PROFESSIONAL PHOTOGRAPHER (CPP) TEST SPECIFICATIONS CAMERA, LENSES AND ATTACHMENTS (12%) Items relating to this category will include digital cameras as well as the various lenses, menu settings

More information

Workshop IROS 2015 Robotic co-workers methods, challenges and industrial test cases

Workshop IROS 2015 Robotic co-workers methods, challenges and industrial test cases Björn Matthias, ABB Corporate Research, 2015-09-28 New safety standards for collaborative robots, ABB YuMi dual-arm robot Workshop IROS 2015 Robotic co-workers methods, challenges and industrial test cases

More information

Note: Objective: Prelab: ME 5286 Robotics Labs Lab 1: Hello Cobot World Duration: 2 Weeks (1/28/2019 2/08/2019)

Note: Objective: Prelab: ME 5286 Robotics Labs Lab 1: Hello Cobot World Duration: 2 Weeks (1/28/2019 2/08/2019) ME 5286 Robotics Labs Lab 1: Hello Cobot World Duration: 2 Weeks (1/28/2019 2/08/2019) Note: At least two people must be present in the lab when operating the UR5 robot. Upload a selfie of you, your partner,

More information

figure 1 figure 2 INSTRUCTIONS AND TIPS FOR THE INSTALLATION OF THE HC MODEL CINEMA MIRROR x = y

figure 1 figure 2 INSTRUCTIONS AND TIPS FOR THE INSTALLATION OF THE HC MODEL CINEMA MIRROR x = y CONSIDERATIONS 1 Cinema Mirror is an active optic set to be installed in false ceilings hiding the projection equipment, and that through a pulsation it opens a floodgate of 335x335 mm., allowing the projection

More information

Technology offer. Aerial obstacle detection software for the visually impaired

Technology offer. Aerial obstacle detection software for the visually impaired Technology offer Aerial obstacle detection software for the visually impaired Technology offer: Aerial obstacle detection software for the visually impaired SUMMARY The research group Mobile Vision Research

More information

Homeostasis Lighting Control System Using a Sensor Agent Robot

Homeostasis Lighting Control System Using a Sensor Agent Robot Intelligent Control and Automation, 2013, 4, 138-153 http://dx.doi.org/10.4236/ica.2013.42019 Published Online May 2013 (http://www.scirp.org/journal/ica) Homeostasis Lighting Control System Using a Sensor

More information

X-FPM(4L)/X-FPM(4L)-AR

X-FPM(4L)/X-FPM(4L)-AR LC-Tec Displays AB X-FPM(4L)/X-FPM(4L)-AR product specification February, 2016 X-FPM(4L)/X-FPM(4L)-AR PRODUCT SPECIFICATION Content 1. Revision history... 2 2. Product description... 2 3. Ordering information...

More information

MEM455/800 Robotics II/Advance Robotics Winter 2009

MEM455/800 Robotics II/Advance Robotics Winter 2009 Admin Stuff Course Website: http://robotics.mem.drexel.edu/mhsieh/courses/mem456/ MEM455/8 Robotics II/Advance Robotics Winter 9 Professor: Ani Hsieh Time: :-:pm Tues, Thurs Location: UG Lab, Classroom

More information

More Info at Open Access Database by S. Dutta and T. Schmidt

More Info at Open Access Database  by S. Dutta and T. Schmidt More Info at Open Access Database www.ndt.net/?id=17657 New concept for higher Robot position accuracy during thermography measurement to be implemented with the existing prototype automated thermography

More information

Test Plan for Hearing Aid Compatibility

Test Plan for Hearing Aid Compatibility Test Plan for Hearing Aid Compatibility Version Number 3.1 February 2017 2017 CTIA - The Wireless Association. All rights reserved. CTIA hereby grants to CTIA Authorized Testing Laboratories (CATLs), and

More information

ABB i-bus KNX Lighting Constant lighting control

ABB i-bus KNX Lighting Constant lighting control Practical knowledge ABB i-bus KNX Lighting Constant lighting control This manual describes practical knowledge for constant light control. Subject to changes and errors excepted. Limitation of liability:

More information

Elements of Exposure

Elements of Exposure Elements of Exposure Exposure refers to the amount of light and the duration of time that light is allowed to expose film or a digital-imaging sensor. Exposure is controlled by f-stop, shutter speed, and

More information

ITS USE CASE. Disclaimer

ITS USE CASE. Disclaimer ITS USE CASE Use Case Title: Green Light Optimal Speed Advisory (GLOSA) Project Name: Standaardisatie Tafel (NL) Source: Amsterdam Group (AG), EcoAT, ISO-19091, ETSI-TS103301, SAE-J2735 Date: 2015-11-25

More information

A Comparative Study of Structured Light and Laser Range Finding Devices

A Comparative Study of Structured Light and Laser Range Finding Devices A Comparative Study of Structured Light and Laser Range Finding Devices Todd Bernhard todd.bernhard@colorado.edu Anuraag Chintalapally anuraag.chintalapally@colorado.edu Daniel Zukowski daniel.zukowski@colorado.edu

More information

FPM(L)-NIR(1100) Content PRODUCT SPECIFICATION

FPM(L)-NIR(1100) Content PRODUCT SPECIFICATION LC-Tec Displays AB FPM(L)-NIR(1100) product specification February, 2016 FPM(L)-NIR(1100) PRODUCT SPECIFICATION Content 1. Revision history... 2 2. Product description... 2 3. Ordering information... 2

More information

ScienceDirect. Analysis of Goal Line Technology from the perspective of an electromagnetic field based approach

ScienceDirect. Analysis of Goal Line Technology from the perspective of an electromagnetic field based approach Available online at www.sciencedirect.com ScienceDirect Procedia Engineering 72 ( 2014 ) 279 284 The 2014 Conference of the International Sports Engineering Association Analysis of Goal Line Technology

More information

e2v Launches New Onyx 1.3M for Premium Performance in Low Light Conditions

e2v Launches New Onyx 1.3M for Premium Performance in Low Light Conditions e2v Launches New Onyx 1.3M for Premium Performance in Low Light Conditions e2v s Onyx family of image sensors is designed for the most demanding outdoor camera and industrial machine vision applications,

More information

OBSTACLE DETECTION AND COLLISION AVOIDANCE USING ULTRASONIC DISTANCE SENSORS FOR AN AUTONOMOUS QUADROCOPTER

OBSTACLE DETECTION AND COLLISION AVOIDANCE USING ULTRASONIC DISTANCE SENSORS FOR AN AUTONOMOUS QUADROCOPTER OBSTACLE DETECTION AND COLLISION AVOIDANCE USING ULTRASONIC DISTANCE SENSORS FOR AN AUTONOMOUS QUADROCOPTER Nils Gageik, Thilo Müller, Sergio Montenegro University of Würzburg, Aerospace Information Technology

More information

Safe and Efficient Autonomous Navigation in the Presence of Humans at Control Level

Safe and Efficient Autonomous Navigation in the Presence of Humans at Control Level Safe and Efficient Autonomous Navigation in the Presence of Humans at Control Level Klaus Buchegger 1, George Todoran 1, and Markus Bader 1 Vienna University of Technology, Karlsplatz 13, Vienna 1040,

More information

FCam: An architecture for computational cameras

FCam: An architecture for computational cameras FCam: An architecture for computational cameras Dr. Kari Pulli, Research Fellow Palo Alto What is computational photography? All cameras have optics + sensors But the images have limitations they cannot

More information

The Mobile CNC Measurement and 3D Scanning System. WENZEL ScanTec MobileScan3D

The Mobile CNC Measurement and 3D Scanning System. WENZEL ScanTec MobileScan3D The Mobile CNC Measurement and 3D Scanning System WENZEL ScanTec MobileScan3D MobileScan3D What is it and how does it work? MobileScan3D is a truly mobile CNC laser scanning solution allowing fully automatic

More information

MEM: Intro to Robotics. Assignment 3I. Due: Wednesday 10/15 11:59 EST

MEM: Intro to Robotics. Assignment 3I. Due: Wednesday 10/15 11:59 EST MEM: Intro to Robotics Assignment 3I Due: Wednesday 10/15 11:59 EST 1. Basic Optics You are shopping for a new lens for your Canon D30 digital camera and there are lots of lens options at the store. Your

More information