Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity

Adiyan Mujibiya
The University of Tokyo
adiyan@acm.org
http://lab.rekimoto.org/projects/mirage-exploring-interactionmodalities-using-off-body-static-electric-field-sensing/

1. Project Goal

Electronic devices with significant computational resources can now be carried everywhere and are becoming more ubiquitous. Such advancements have led to growing research interest in new human-computer interfaces that go beyond the traditional paradigm of keyboard, mouse, and touch screen, including explorations that leverage human body motion, gesture, and activity to support always-available computing, either with devices that people carry on their bodies or with devices embedded in the environment. Electric field (EF) sensing offers a strategic solution to these challenges and has recently gained significant attention due to the availability of inexpensive electronic components capable of measuring the relatively small signals involved. However, it is difficult to acquire a stable and easily interpretable signal, which is essential for aggregating meaningful contextual information in a passive (for low-power, simple hardware implementation) and non-body-contact configuration that supports a broad range of interaction modalities (see publication 1 for a thorough review of related work). Our research addresses these issues to make EF sensing more accessible to interface designers.

In this research, we seek to achieve an effective technique for inferring the amount and type of body motion, gesture, and activity, as well as for location classification and multiple-user differentiation, using non-body-contact, passive static electric field sensing. This approach involves passively measuring the static electric field of the environment flowing through a sense electrode. The sensing method leverages the distortion of the electric field caused by the presence of an intruder (e.g., a human body) and supports ultra-low-power operation. It requires no instrumentation of the user and can be configured as an environmental, mobile, or peripheral-attached sensor. Since electric fields penetrate non-conductors, the electrode sensors can be hidden, providing protection from weather and wear while simultaneously adding the element of a disappearing input interface. Our method also works outdoors, enabling truly mobile solutions. Results from our experiments have demonstrated that our system performs reasonably well for a range of activities and gestures (see publications 1 and 2). Additionally, we achieved encouraging results on location classification within a building, as well as multiple-user differentiation using Independent Component Analysis (ICA).

2. Technical Breakthrough

PASSIVE AMBIENT STATIC ELECTRIC FIELD SENSING

We extend previous work by proposing a sensing method that relies on much simpler hardware while aggregating a more stable and reliable signal. Furthermore, we also investigate multiple-user differentiation and location classification using ambient EF fingerprints. We are not aware of previous work that has explicitly investigated passive (non-signal-transmitting) sensing of ambient (off-body) static EF for HCI. Our proposed method uses simple analog circuitry.

Figure 1. Circuit model of capacitive coupling between the user's body, the environment, and the sense electrode. The sensing voltage (Vs) is measured between the sense electrode and earth ground.

Firstly, we observe a capacitance CB between the body and the environment. In Figure 1, this is separated into two capacitances: the coupling capacitance CF between the user's feet and the ground, and the coupling capacitance Cw between the body and other objects in the environment, such as the walls. We assume that there are two highly resistive layers between the subject's feet and the ground: the sole of the subject's footwear and the surface of the floor. The capacitance CF may be calculated as the sum of the capacitance Cf of the sole and the capacitance Cl of the floor surface. Unlike the on-body sensing case, there is now a coupling capacitance Cd between the body and the sense electrode, which is mostly a function of the user's proximity to the sense electrode. The sensing unit is essentially a probe or antenna of arbitrary shape connected to an ADC input of a microcontroller; the probe's size and shape are assumed not to disturb the field being measured. Finally, the sensing voltage (Vs) is measured from the sense electrode to earth ground (i.e., across the sense capacitor Cs). Since the value of the sense capacitor (Cs) is fixed, changes in any of the coupling capacitances result in an AC voltage change in the sensing voltage (Vs). For normal interactions, Vs is most affected by changes in (1) the distance between the user and the sense electrode (δCd), (2) the user's contact area with the floor (δCF), e.g., standing on one foot versus two, and (3) the proximity of the user to other objects in the interaction space (δCw).

SIGNAL PROCESSING

Since we are interested only in the AC signal, we need to DC-bias the signal in order to sample it with the single-ended analog-to-digital converter (ADC) on the microcontroller board. In previous on-body sensing approaches, this was accomplished with custom hardware in front of the ADC. We instead use a simple channel-switching method to DC-bias the ADC signal: we alternately sample the internal voltage reference (VREF) and the analog input. In a Successive Approximation Register (SAR) ADC, sampling VREF pre-charges the ADC input to a known level, which establishes a DC level when we switch back to sample the analog input (which shares that stored charge). A broad range of microcontroller boards, such as the Arduino, incorporate SAR ADCs in their designs due to their low cost and ease of interfacing.
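The project's firmware is not reproduced in this summary; the following Arduino-style sketch is only a minimal illustration of the channel-switching idea on an ATmega328P (e.g., Arduino Uno), where the internal 1.1 V bandgap channel is converted first to leave a known charge on the sample-and-hold capacitor before the high-impedance electrode pin is converted. Register names follow the ATmega328P datasheet; the pin assignment, settling delay, and output rate are assumptions made for this example.

```cpp
// Illustrative Arduino (ATmega328P) sketch of the channel-switching DC bias.
// Not the authors' firmware: pin choice, settling delay, and rate are assumed.

const uint8_t SENSE_CHANNEL   = 0;     // ADC0 (pin A0) <- sense electrode
const uint8_t BANDGAP_CHANNEL = 0x0E;  // MUX[3:0] = 1110 selects the internal 1.1 V bandgap

// Run one conversion on the given ADC multiplexer channel (AVcc reference).
uint16_t readAdcChannel(uint8_t channel) {
  ADMUX = _BV(REFS0) | (channel & 0x0F);  // AVcc as reference, select channel
  delayMicroseconds(100);                 // let the sample-and-hold capacitor settle
  ADCSRA |= _BV(ADSC);                    // start conversion
  while (ADCSRA & _BV(ADSC)) {}           // busy-wait until the conversion completes
  return ADC;                             // 10-bit result
}

void setup() {
  Serial.begin(115200);
  analogRead(SENSE_CHANNEL);  // one normal read to initialise the ADC subsystem
}

void loop() {
  // 1) Convert the internal reference so the S/H capacitor holds a known charge.
  uint16_t vref  = readAdcChannel(BANDGAP_CHANNEL);
  // 2) Convert the high-impedance electrode input, which now starts from that
  //    stored charge -- the DC bias described in the text.
  uint16_t sense = readAdcChannel(SENSE_CHANNEL);

  Serial.print(vref);
  Serial.print(',');
  Serial.println(sense);

  delay(50);  // roughly the 20 Hz per-channel rate used in the paper
}
```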

Figure 2. Signal acquisition: raw sample (red line), DC component (green line), AC component (blue line), main signal (white line), event detection (green and yellow highlights), and gesture segmentation (grey highlights).

We sample each channel at 20 Hz, a sampling rate that is too low to pick up significant noise other than the EF disturbances we are examining, but sufficient to represent the relevant spectrum for our purposes. A serial-communication client written in Java is used to interface the sensing unit with the PC. This program provides live visualization of the data from our sensor(s) (Figure 2, red line) and implements the following signal processing and analysis steps (an illustrative sketch of the decomposition in steps 1 and 2 appears below):

1) aggregate the DC component by applying a 3rd-order Butterworth IIR low-pass filter with a 3 dB corner at 7 Hz (Figure 2, green line);
2) aggregate the AC component by applying a 3rd-order Butterworth IIR high-pass filter with a 3 dB corner at 7 Hz (Figure 2, blue line);
3) aggregate the base signal (Figure 2, white line) from the DC component and the ratio of the ideal versus actual VREF reading measured against Vcc (Figure 2, yellow line);
4) perform real-time event detection (Figure 2, green and yellow highlights) and segmentation (Figure 2, grey highlights); and
5) perform motion, activity, and large-body gesture recognition, which combines a heuristic-based adaptive threshold with an SVM-based machine learning approach (more detail in publication 1).

We report on a series of experiments with 10 participants showing robust activity and gesture recognition, as well as promising results for robust location classification and multiple-user differentiation. Please refer to our paper (publication 1) for the complete experimental report.
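The Java client itself is not included here. As a rough, self-contained C++ stand-in for the DC/AC decomposition (steps 1 and 2 above), the sketch below runs a 3rd-order Butterworth IIR low-pass with a 3 dB corner at 7 Hz at the 20 Hz sample rate; the coefficients were derived offline via the bilinear transform and should be treated as illustrative. For brevity the AC component is shown as the raw sample minus the DC estimate rather than the separate high-pass filter used in the actual pipeline, and the input samples are made up for demonstration.

```cpp
// Illustrative stand-in (C++, not the authors' Java client) for the DC/AC
// decomposition of the 20 Hz sense-electrode stream.
#include <cstdio>
#include <vector>

// Direct-form I, 3rd-order Butterworth low-pass, 3 dB corner at 7 Hz, fs = 20 Hz.
// Coefficients computed offline with the bilinear transform (assumed values).
struct ButterworthLowPass3 {
  const double b[4] = {0.374453, 1.123359, 1.123359, 0.374453};
  const double a[4] = {1.0, 1.161908, 0.695950, 0.137765};
  double x[3] = {0.0, 0.0, 0.0};  // previous inputs
  double y[3] = {0.0, 0.0, 0.0};  // previous outputs

  double step(double xn) {
    double yn = b[0] * xn + b[1] * x[0] + b[2] * x[1] + b[3] * x[2]
              - a[1] * y[0] - a[2] * y[1] - a[3] * y[2];
    x[2] = x[1]; x[1] = x[0]; x[0] = xn;
    y[2] = y[1]; y[1] = y[0]; y[0] = yn;
    return yn;
  }
};

int main() {
  ButterworthLowPass3 dcEstimator;
  // A short, made-up run of raw 10-bit ADC samples, for demonstration only.
  std::vector<double> raw = {512, 514, 511, 530, 560, 540, 500, 470, 505, 512};
  for (double sample : raw) {
    double dc = dcEstimator.step(sample);  // slowly varying baseline (cf. green line)
    double ac = sample - dc;               // motion-induced disturbance (cf. blue line)
    std::printf("raw=%6.1f  dc=%6.1f  ac=%+7.2f\n", sample, dc, ac);
  }
  return 0;
}
```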

3. Innovative Applications

Figure 3. Our interactive applications build on (a) continuous and discrete activity recognition for activity monitoring, (b) discrete activity recognition embedded in an avatar-controlling game where the user has to physically walk, run, or jump to avoid obstacles, and (c) gesture recognition embedded in a Tetris game where user controls are mapped to whole-body gestures.

We developed three applications to demonstrate our method's real-time interactive capabilities. The first application demonstrates activity monitoring: it provides real-time visualization of the raw data stream, the results of the signal processing (AC/DC components, base signal, and FFT of the signal), and context aggregation results such as standing still, walking, running, and jumping, with their respective speeds and step counts (Figure 3a). The second application leverages activity detection to control the movement of a game character; Figure 3b shows a user playing the game, avoiding obstacles by walking, running, and jumping. The third application is a Tetris game in which the user controls are mapped to the player's whole-body gestures. Although a wide range of gestures can be trained, we leverage intuitive arm and foot motions: lifting the left or right arm for left or right movement, respectively; a one-handed rotation gesture for Tetris block rotation; and a jumping gesture to drop the block onto the top of the stack. In this application, we pre-trained the gesture classifier (SMO) using 10 examples of each gesture. Figure 3c shows a user playing our Tetris game.

Figure 4. In this project we explore the feasibility of inferring the type and amount of body motion, gesture, and activity by passively measuring ambient (off-body) static electric fields. Here we show three configurations representing the application domains supported by our proposed method.

As shown in Figure 4, we also envision three different application domains that can benefit from our proposed sensing method.
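The Tetris control scheme amounts to a fixed mapping from recognized whole-body gestures to discrete game commands. The enum and switch below are a hypothetical illustration of that mapping (the names are ours, not from the project's code); the gesture labels are assumed to arrive from the upstream SMO-trained classifier, one per segmented event.

```cpp
// Hypothetical mapping from recognized whole-body gestures to Tetris commands.
// Gesture labels are assumed to come from the upstream SMO/SVM classifier.
#include <cstdio>

enum class Gesture { LiftLeftArm, LiftRightArm, RotateOneHand, Jump };
enum class TetrisCommand { MoveLeft, MoveRight, RotateBlock, DropBlock };

TetrisCommand toCommand(Gesture g) {
  switch (g) {
    case Gesture::LiftLeftArm:   return TetrisCommand::MoveLeft;
    case Gesture::LiftRightArm:  return TetrisCommand::MoveRight;
    case Gesture::RotateOneHand: return TetrisCommand::RotateBlock;
    case Gesture::Jump:          return TetrisCommand::DropBlock;
  }
  return TetrisCommand::RotateBlock;  // unreachable; keeps compilers happy
}

int main() {
  // Example: a detected jump drops the current block.
  Gesture detected = Gesture::Jump;
  std::printf("gesture %d -> command %d\n",
              static_cast<int>(detected),
              static_cast<int>(toCommand(detected)));
  return 0;
}
```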

4. Academic Achievement

We have published our research results in top-tier conferences representing multiple research domains, including the ACM Symposium on User Interface Software and Technology (UIST '13) and the ACM Conference on Embedded Networked Sensor Systems (SenSys '13). Moreover, we plan to submit an additional paper to UIST '14. We also published another paper, in collaboration with MSR researchers, at the ACM International Conference on Interactive Tabletops and Surfaces (ITS '13).

5. Achievement in Talent Fostering

The principal investigator of this project is a PhD student at The University of Tokyo. This project is one of the main themes of his dissertation.

6. Collaboration with Microsoft Research

The principal investigator of this project conducted long-term research internships with Microsoft Research Asia in Beijing and Microsoft Research HQ in Redmond. We also conducted demos and discussions during CORE project meetings held occasionally in Tokyo, as well as during the CORE 8 and 9 review meetings.

7. Project Development

Our extensions of this project will be submitted to ACM UIST '14 and/or other upcoming HCI-related conferences.

8. Publications

Paper publications:

1) Adiyan Mujibiya and Jun Rekimoto. 2013. Mirage: exploring interaction modalities using off-body static electric field sensing. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology (UIST '13). ACM, New York, NY, USA, 211-220. DOI=10.1145/2501988.2502031 http://doi.acm.org/10.1145/2501988.2502031

2) Adiyan Mujibiya and Jun Rekimoto. 2013. Mirage: body motion and activity recognition using off-body static electric field sensing. In Proceedings of the 11th ACM Conference on Embedded Networked Sensor Systems (SenSys '13). ACM, New York, NY, USA, Article 65, 2 pages. DOI=10.1145/2517351.2517383 http://doi.acm.org/10.1145/2517351.2517383

3) Adiyan Mujibiya, Xiang Cao, Desney S. Tan, Dan Morris, Shwetak N. Patel, and Jun Rekimoto. 2013. The sound of touch: on-body touch and gesture sensing based on transdermal ultrasound propagation. In Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces (ITS '13). ACM, New York, NY, USA, 189-198. DOI=10.1145/2512349.2512821 http://doi.acm.org/10.1145/2512349.2512821