Synchronous stereo-video and biosignal recording: a basic setup for Human-Computer-Interface applications
O.P. Burmeister, M.G. Litza, M. Nitschke and U.G. Hofmann
Institute for Signal Processing, University of Luebeck, Germany
Clinic for Neurology, University Hospital of Schleswig-Holstein, Campus Luebeck, Germany
hofmann@isip.uni-luebeck.de

Abstract

The work presented reports on the development of a data acquisition system intended for Human-Computer-Interface applications, synchronously recording upper limb movement and the corresponding biosignals, be it EEG, ECoG, EMG or multi-unit neuronal signals. Ultimately, this aims at the clinical recording of ECoG signals from awake patients undergoing tumor resection. For that purpose, stereo-video camera frames are used to detect and triangulate hand and finger positions with high precision while, at the same time, a DSP-based 32-channel board acquires wideband biosignals. We validated the synchronous acquisition of both space coordinates and bioelectrical signals (EMG) by performing simple grasp experiments. Further research will be dedicated to the system's clinical application.

I. INTRODUCTION

One of the most depressing fears humans may face is the outlook of becoming unable to voluntarily control body functions and movements. Unfortunately, this state is ultimately reached in patients suffering from traumatic tetraplegia, brain-stem strokes or long-term chronic diseases like amyotrophic lateral sclerosis. Thankfully, the last couple of years have seen an immense increase in research efforts, knowledge and public awareness aimed at alleviating at least some of the consequences of this state, sometimes called locked-in syndrome [1]. The technological mainstream of these developments are efforts to directly access the patient's brain signals and utilize them for a Human-Computer-Interface (HCI) [2] to provide a new, non-classical path of communication.
Two general families of these HCIs are under development. The least traumatizing one aims at analysing macroscopic EEG signals from the brain (called a Brain-Computer-Interface) [3], whereas the other one gains microscopic signals from many single neurons in deeper cortical structures [4]. Clearly, while the former has the advantage of absolutely non-invasive read-outs from the brain [5], [6], the latter in principle features higher information transfer rates, even suitable for real-time control tasks [7]-[9]. Quite recently another, less prominent subgroup of BCIs re-appeared on the scientific map: macroscopic recordings of local field potentials from the surface of the brain, so-called ECoG recordings, proved to be helpful for predicting imaginary and real hand movements [10]-[14], thus providing exciting new signals to be utilized for semi-invasive BCIs. All those recent developments, however, are based on a very small number of animal experiments or, for good ethical reasons, restricted to some patients undergoing very specialized brain surgery. On the other side, recordings with strip electrodes from the cortex are quite common in neurosurgery (e.g. [15]) and are used to the advantage of many, e.g. young tumor patients [16]. It therefore seems rewarding for BCI research to widen the group of potential participants for subdural, BCI-oriented recordings with an experimental setup that can be used beneficially in any neurosurgical operating room. In addition to the standard cortical mapping performed with strip electrodes, the awake patient could perform an easy movement task while at the same time his local cortical field potentials are recorded. The paradigm to be investigated will be the correlation between hand-grasp movements and the brain activity generating them.
For that purpose it is necessary to track the patient's hand and finger positions in three dimensions with the least possible disturbance for the subsequent surgery, while simultaneously recording the corresponding biosignal activity. The following describes the development of a recording system that performs three-dimensional, marker-free hand tracking and general multichannel biosignal recording synchronously and on one computer, to be used in an operating room setting. We will report on its real-world use in subsequent clinical publications.

II. MATERIALS

In order to achieve the minimal disturbance possible in the actual clinical setting, we decided to develop a stereo-camera based, marker-free system for handtracking. Both the stereo-video and the biosignal recording system reside in the same industrial PC chassis, equipped with an Intel Pentium IV, 3.06 GHz CPU and 1 GB SDRAM (DSM Computer, Munich, Germany). The utilized Intel chip supports Hyperthreading, which permits the handtracking and the biosignal recording application to run nearly in parallel. We used Windows 2000 as operating system and a 250 GB S-ATA hard disk drive. A sketch of the complete system is depicted in Figure 1.

A. Stereo-video system

We are using two Sony XC-ST50CE monochrome CCD cameras with Pentax H612A-TH lenses, connected to a
PC2Vision framegrabber from Coreco Imaging (Saint-Laurent, Quebec).

Fig. 1. Sketch of the recording system

This framegrabber provides us with the ability to acquire the video data of two cameras synchronously without external genlocking. Due to movement artifacts when recording in interlaced mode, we used its progressive scan mode instead. Each received image frame has a size of 760x280 pixels and is upscaled to 760x560 pixels; the frame rate was manually reduced to 25 Hz. Faster frame grabbing does not seem necessary due to the time restrictions imposed by the tracking algorithm. The cameras are fixed on a custom-built, portable optical bench, on movable mounts so as to maintain permanent visibility of the tracked hand. For the laboratory setup presented here, we restricted our tracking to a uniformly colored background in order to reduce the complexity of segmenting the hand.

B. Biosignal recording

The data acquisition system is based on an off-the-shelf DSP board (M67, Innovative Integration, Thousand Oaks, CA). The board features a Texas Instruments C6701 high-performance DSP and is augmented with two analog-to-digital conversion modules (AD16 OMNIBUS modules). The DSP target is connected to the PC host via the PCI bus. This data acquisition system was developed for fast multichannel recording and acquires data from 32 channels simultaneously with a sample rate of up to 50 kHz at 16 bit resolution [17]. Details can be found elsewhere [18]. Amplification of the biosignals was achieved by a custom-built, patient-safe, wideband, 32-channel amplifier, but may be done by any type of analog multichannel amplifier suitable for human use.

III. METHODS

A. 3D-Handtracking

Although there are plenty of reports dealing with tracking of the hand, most of them did not try to achieve high precision in the hand-finger positions, but instead were interested in the gesture of the hand [19].
Gesture recognition may be done with or without markers and reduces position detection in the image to a simple blob analysis. However, using markers seems prohibitive in the operating room. Therefore, one of the biggest problems for marker-free handtracking is the segmentation of the hand out of an arbitrary image [20]. In fact, the most suitable segmentation method depends on real-world conditions impossible to foresee:
1) Is it possible to restrict the cameras' background to a static or single-colored one?
2) Are monochrome or color cameras used?
3) Which lighting conditions will be experienced in the OR?
Following earlier approaches [21]-[23], we use a uniformly colored background with monochrome cameras, so the actual segmentation can be done via gray scale thresholding. In order to generalize this approach, static background subtraction [24] and histogram-based segmentation [25] can be used as well. Alternative methods based on movement detection [24], skin detection [24], [26] and detection through stereo information [26] were dropped due to their computing time requirements or expensive equipment. In order to support the segmentation process, Jennings [26] and Sato [27] have searched images for circular structures resembling finger projections: independent of the camera position, fingers always look semicircular. Such structures were searched for by cross-correlating images with finger templates. The video setup was made independent of changes in light level by static, small-source illumination and optical filtering of daylight. The handtracking application thus has two constraints: 1) the palm of the hand must be parallel to the image plane of the cameras, 2) the background must be static and single-colored. To compute the position of the hand in space, we first calculate the positions in each camera's frames.
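Under the two constraints above, the gray scale thresholding step can be sketched as follows. This is a minimal illustration; the frame values and the threshold are hypothetical, not the paper's actual parameters.

```python
# Gray scale threshold segmentation of a bright hand against a dark,
# uniform background. Frame values and threshold are illustrative.

def segment_hand(frame, threshold=128):
    """Return a binary mask: 1 where a pixel is brighter than the
    threshold (hand), 0 elsewhere (background)."""
    return [[1 if px > threshold else 0 for px in row] for row in frame]

# A toy 3x5 gray scale frame: bright pixels form the "hand" region.
frame = [
    [10,  20, 200, 210, 15],
    [12, 190, 220, 205, 11],
    [14,  18, 215,  19, 13],
]
mask = segment_hand(frame)
```

With a uniform dark background this single comparison per pixel is enough, which is why the laboratory setup restricts the background color.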
This enables us to triangulate the third dimension [28], based on a public domain stereo-calibration routine [29] included in the well-known OpenCV computer vision library. Following segmentation, we perform a distance transformation [30] on the binary image of the hand. The distance transformation gives the smallest distance of an unmarked (black) pixel to a marked (white) pixel in the binary image according to

D(q) = min_{p in O} ||p - q||,  q not in O,

where O is the set of all marked pixels.

Fig. 2. Computation of the hand center by distance transformation: (a) binary image of the hand, (b) distance transform of (a)

As can be seen in Figure 2, the maximum of the distance transform locates the center of the palm. Due to the time-consuming computation of a full-image distance transform, we use a region of interest (ROI) in which the distance transform is calculated. That way we even reduce artifacts due to faultily marked, bright areas in the binary image by forcing the distance transform's center to lie at least on the palm. The ROI has to be chosen for the very first hand position only. The position of the distance transform's maximum defines the center and its scaled value, respectively, the size of a registered square estimating the orientation of the hand. This square is rotated concentrically such that one side is oriented parallel to the hand's ulnar edge, providing a good estimate of the hand's directionality (see Fig. 3). Its rotation value is deduced by minimizing the standard deviation of all distance transform values lying on this side:

alpha = argmin_beta std(S_beta),

where S_beta is the set of all points lying on the right side of the palm's square.

Fig. 3. Computation of fingertips and center of palm. Small blue dots: edges and middles of the fingers. Big blue dots: estimated fingertips

Fig. 4. Estimation of fingertips via gray scale profiles: (a) profile through the fingers, (b) gray scale values of pixels on the profile

The knowledge of the hand's orientation gives us an idea where the fingers are supposed to be. That way we are able to confine our search for the fingertips via gray scale profiles.
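The palm-center step can be sketched with a brute-force distance transform. As an illustration, the code below computes, for every hand pixel, the distance to the nearest background pixel and takes its maximum as the palm center; the mask is a toy example, and a real implementation would use the fast chamfer algorithm of [30] restricted to the ROI.

```python
import math

def distance_transform(mask):
    """Brute-force distance transform: for every foreground (1) pixel,
    the Euclidean distance to the nearest background (0) pixel."""
    h, w = len(mask), len(mask[0])
    background = [(y, x) for y in range(h) for x in range(w) if mask[y][x] == 0]
    dist = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                dist[y][x] = min(math.hypot(y - by, x - bx)
                                 for by, bx in background)
    return dist

def palm_center(mask):
    """The maximum of the distance transform marks the palm center."""
    dist = distance_transform(mask)
    return max(((y, x) for y in range(len(dist)) for x in range(len(dist[0]))),
               key=lambda p: dist[p[0]][p[1]])

# A toy 5x5 binary hand mask; the deepest interior pixel is (2, 2).
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
```

The brute-force search is O(pixels x background pixels), which is exactly why the paper restricts the computation to a ROI chosen once for the first hand position.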
Considering a gray scale profile through a finger in the binary image, we obtain a periodic sequence of background values, hand values, and again background values. Anatomically, the size of the hand-value part is limited. So we get two conditions for a finger-like part in the gray scale profile. A profile through all four fingers and the gray scale values along the profile are depicted in Figure 4. The profiles are parallel to the closer side of the palm's square. Peaks in the profile's derivative give the locations of the finger's edges, whereas a position between two peaks belongs to the finger itself. A line connecting the central finger positions of two profiles points to the fingertip position (Fig. 3). An overview of the handtracking algorithm is depicted in Figure 5.

B. Synchronizing stereo-video and biosignal recording

The basic idea behind this project was to provide synchronous data of hand movement in space and the corresponding biosignals, in the future from subdural electrodes. The problem when synchronizing stereo-video and biosignals, however, is the strongly differing sampling rate. Where normal video cameras record images at a frame rate of 25 Hz, our biosignal recording system uses a minimum rate of 5000 Hz. There are two possibilities to cope with these different sampling rates:
1) integration of the biosignals over a given time period to approach the sampling rate of the cameras [31], or
2) matching of samples by timestamps.
Even though the first method seems well suited for slow EEG recordings, our intended recording modality, ECoG, provides wider frequency content (up to 200 Hz), not to mention multi-unit neuronal recordings (up to 15 kHz). If we integrate the signal to approach the sampling rate of a camera, we lose these important frequencies, since an integrator acts as a lowpass filter.
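The two finger conditions on a single profile can be sketched as a run-length test: a finger shows up as a bright run bounded by background, and the run must be anatomically narrow. The threshold and the width limit below are hypothetical values, not the paper's parameters.

```python
def finger_centers(profile, threshold=128, max_width=6):
    """Scan a gray scale profile for bright (hand) runs no wider than
    an anatomical limit and return the center index of each run."""
    centers, start = [], None
    for i, value in enumerate(profile + [0]):    # sentinel closes a final run
        if value > threshold and start is None:
            start = i                            # a bright run begins
        elif value <= threshold and start is not None:
            if i - start <= max_width:           # narrow enough to be a finger
                centers.append((start + i - 1) // 2)
            start = None
    return centers

# A profile crossing four fingers: four bright runs separated by gaps.
profile = [10, 200, 210, 15, 205, 220, 12, 190, 200, 14, 210, 215, 11]
```

Running two such profiles at different distances from the palm and connecting the corresponding run centers yields the fingertip direction described in the text.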
We therefore chose the latter way, assigning each hand-finger position as well as each biosignal sample point a high-resolution system timestamp. Since both the signal acquisition and the stereo-video boards reside in the same PC, we avoid the time-synchronization problems that would have to be considered for separate computers. The delay produced by moving a camera image into memory, where it gets timestamped, was measured to be 30 msec. By subtracting that from the stamped time, we are able to perform stereo-video and signal acquisition synchronized to within milliseconds.
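The timestamp matching can be sketched as a nearest-neighbour lookup: each frame timestamp is corrected by the measured 30 ms grabbing delay and then matched to the closest biosignal sample. The function name and the synthetic timestamps are illustrative, not taken from the paper's software.

```python
import bisect

VIDEO_DELAY_S = 0.030   # measured frame-transfer delay from the text

def match_frames_to_samples(frame_times, sample_times):
    """For each delay-corrected frame timestamp, return the index of the
    nearest biosignal sample (sample_times must be sorted ascending)."""
    matches = []
    for t in frame_times:
        t -= VIDEO_DELAY_S                       # undo the grabbing delay
        i = bisect.bisect_left(sample_times, t)
        if i > 0 and (i == len(sample_times) or
                      t - sample_times[i - 1] < sample_times[i] - t):
            i -= 1                               # left neighbour is closer
        matches.append(i)
    return matches

# 0.2 s of a 5 kHz biosignal vs. four 25 Hz video frames stamped ~30 ms late
samples = [k / 5000 for k in range(1000)]
frames = [0.030 + k / 25 for k in range(4)]
indices = match_frames_to_samples(frames, samples)
```

At these rates every frame maps to one of 200 biosignal samples per frame interval, which is the many-to-one relation described in the text.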
Fig. 5. Application flow of the handtracking

Clearly, this way we get many more biosignal sample points than stereo-video sample points, but it seems necessary to correlate detailed biological signals with information about the position of the hand.

IV. RESULTS

Tests of the system were performed in a laboratory setting, not the operating room, to validate the overall concept of this synchronous approach while recording biosignals. We chose to record surface EMG signals correlated to a hand-grasp movement as proof of concept and due to their simple accessibility (see Figure 6).

Fig. 6. Validation setup to synchronously record stereo-video and EMG signals

A test person sat in front of a black background and was asked to reach for a pole within grasping distance. This resulted in a movement distance of approximately 30 cm. The stereo cameras were positioned 60 cm away from the target pole, at a distance of 8 cm from each other. Clearly, the precision to be expected depends on the optical and geometrical setup, but also on the speed of the movement. All parameters can easily be adjusted to other geometries, but resulted in our case in a measurement precision of approximately 2 mm in a volume of about 50x40x20 cm3 (see Figure 7).

Fig. 7. Exemplary space coordinates [mm] of a hand movement trajectory

The exemplary trajectories in Figure 7 reveal the well-known difficulty of precisely copying a given movement: a large spread in the coordinates can be found. We consequently detected the final grasp by the sudden disappearance of any finger-related structures from the video frames (see Figure 8).

Fig. 8. A cylindrical grasp to a target: (a) open, (b) grasp. The rectangle surrounding the hand shows the ROI for the distance transform.

This time corresponded very well to an increase in EMG activity of the digital flexor muscle (see the red vertical line in Figure 9).
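The grasp-detection rule, the disappearance of finger-related structures, can be sketched over the per-frame tracking output. The fingertip counts below are synthetic; the tracker's actual data structures are not shown in the paper.

```python
def detect_grasp(fingertip_counts):
    """Return the index of the first frame in which all finger-related
    structures have disappeared (the hand closed around the target),
    or None if no grasp occurs."""
    for i, count in enumerate(fingertip_counts):
        if count == 0:
            return i
    return None

# Fingertips stay visible while reaching, then vanish at the grasp.
counts = [4, 4, 4, 3, 2, 0, 0, 0]
```

The timestamp of the returned frame is what gets compared against the rise in EMG activity.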
The EMG signals from the volunteer's main arm muscles were picked up by self-adhesive electrodes (Bluesense, Ambu, Denmark) connected to the amplifier. The increase in grip force is visible as a clear rise in EMG signal variation, as detected by the RMS value of the signal.
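The RMS detection of grip force can be sketched over non-overlapping windows. The sample values are synthetic, and the actual window length used is not stated in the paper.

```python
import math

def windowed_rms(signal, window):
    """Root-mean-square of the signal over consecutive, non-overlapping
    windows; a rise in RMS reflects increased EMG activity."""
    rms = []
    for i in range(0, len(signal) - window + 1, window):
        chunk = signal[i:i + window]
        rms.append(math.sqrt(sum(v * v for v in chunk) / window))
    return rms

# Quiet baseline followed by a burst of EMG activity (synthetic values).
emg = [0.1, -0.1, 0.1, -0.1, 1.0, -1.0, 1.0, -1.0]
levels = windowed_rms(emg, 4)
```

Because RMS squares the signal before averaging, it captures the variance increase of the raw EMG regardless of its sign, which is why it serves as a simple grip-force indicator here.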
Fig. 9. EMG signals of the digital flexor muscle showing an increase in grip force corresponding to the video-detected hand closing (red line)

V. CONCLUSION

This project was intended to develop a single-computer system to synchronously record marker-free hand-arm movements and their corresponding biosignals. We showed the validity of this concept by detecting hand-grasp movements and EMG signals at the same time with high temporal and spatial precision. We are now able to measure upper limb trajectories with a patient-safe video system and will report on the clinical results elsewhere.

ACKNOWLEDGMENT

This work was partly funded by the European Union grant VSAMUEL IST and the German Research Ministry grant navegate 16SV1433. We want to express our gratitude to M. Zelazny for building the amplifier.

REFERENCES

[1] J. R. Patterson and M. Grabois, "Locked-in syndrome: a review of 139 cases," Stroke, vol. 17, no. 4.
[2] M. A. L. Nicolelis, "Actions from thoughts," Nature, vol. 409.
[3] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan, "Brain-computer interfaces for communication and control," Clinical Neurophysiology, no. 113.
[4] J. K. Chapin, K. A. Moxon, R. S. Markowitz, and M. A. L. Nicolelis, "Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex," Nature Neuroscience, vol. 2.
[5] N. Birbaumer and A. Kübler, "The thought translation device (TTD) for completely paralyzed patients," IEEE Trans. Rehab. Eng., vol. 8, 2000.
[6] J. R. Wolpaw, D. McFarland, G. Neat, and C. Forneris, "An EEG-based brain-computer interface for cursor control," Electroencephalogr. Clin. Neurophysiol., no. 78.
[7] D. M. Taylor, S. I. H. Tillery, and A. B. Schwartz, "Information conveyed through brain-control: cursor versus robot," IEEE Trans. Neural Syst. Rehab. Eng., vol. 11, no. 2, 2003, p. 195 ff.
[8] M. Serruya, N. Hatsopoulos, L. Paninski, M. Fellows, and J. Donoghue, "Instant neural control of a movement signal," Nature, vol. 416.
[9] M. Laubach, J. Wessberg, and M. A. L. Nicolelis, "Cortical ensemble activity increasingly predicts behaviour outcomes during learning of a motor task," Nature, vol. 405.
[10] P. Kennedy, D. Andreasen, P. Ehirim, B. King, T. Kirby, H. Mao, and M. Moore, "Using human extra-cortical local field potentials to control a switch," Journal of Neural Engineering, vol. 1.
[11] P. Kennedy, S. S. Mirra, and R. A. E. Bakay, "The cone electrode: ultrastructural studies following long-term recordings in rat and monkey cortex," Neuroscience Letters, vol. 142, no. 1.
[12] E. C. Leuthardt, G. Schalk, J. R. Wolpaw, J. G. Ojemann, and D. W. Moran, "A brain-computer interface using electrocorticographic signals in humans," Journal of Neural Engineering, vol. 1.
[13] T. Ball, M. P. Nawrot, T. Pistohl, A. Aertsen, A. Schulze-Bonhage, and C. Mehring, "Towards an implantable brain-machine interface based on epicortical field potentials," in Biomedizinische Technik / Biomedical Engineering, 38. Jahrestagung der Deutschen Gesellschaft für Biomedizinische Technik im VDE - BMT 2004, vol. 49, Ilmenau, 2004.
[14] C. Mehring, J. Rickert, E. Vaadia, S. Cardoso de Oliveira, A. Aertsen, and S. Rotter, "Inference of hand movements from local field potentials in monkey motor cortex," Nature Neuroscience, vol. 6, no. 12.
[15] O. Suess, T. Kombos, O. Ciklatekerlio, R. Stendel, S. Suess, and M. Brock, "Impact of brain shift on intraoperative neurophysiological monitoring with cortical strip electrodes," Acta Neurochir., vol. 144.
[16] P. M. Black, "The present and future of cerebral tumor surgery in children," Child's Nerv. System, vol. 16.
[17] U. G. Hofmann, A. Folkers, F. Mösch, D. Höhl, M. Kindlundh, and P. Norlin, "A 64(128)-channel multisite neuronal recording system," Biomedizinische Technik, vol. 47, no. E1.
[18] U. G. Hofmann et al., "A novel high channel-count system for acute multi-site neuronal recordings," submitted to IEEE TBME.
[19] B. Stenger, A. Thayananthan, P. H. S. Torr, and R. Cipolla, "Human hand pose estimation using hierarchical detection," in Intl. Workshop on Human-Computer Interaction, Prague, Czech Republic.
[20] K. Dorfmüller-Ulhaas and D. Schmalstieg, "Finger tracking for interaction in augmented environments," in Proceedings ISAR '01, New York.
[21] K. Abe, H. Saito, and S. Ozawa, "3D drawing system via hand motion recognition from two cameras," in IEEE International Conference on Systems, Man and Cybernetics.
[22] J. Segen and S. Kumar, "Shadow gestures: 3D hand pose estimation using a single camera," in IEEE Computer Vision and Pattern Recognition, vol. 1.
[23] J. M. Rehg and T. Kanade, "DigitEyes: vision-based human hand tracking," Tech. Rep.
[24] J. Martin, V. Devin, and J. L. Crowley, "Active hand tracking," in IEEE Third International Conference on Automatic Face and Gesture Recognition, FG '98, Nara, Japan.
[25] S. Ahmad, "A usable real-time 3D hand tracker," in Proc. 28th Asilomar Conference on Signals, Systems and Computers. IEEE Computer Society Press, 1995.
[26] C. Jennings, "Robust finger tracking with multiple cameras," in Proc. of the International Workshop on Recognition, Analysis and Tracking of Faces and Gestures in Real-Time Systems, 1999.
[27] Y. Sato, Y. Kobayashi, and H. Koike, "Fast tracking of hands and fingertips in infrared images for augmented desk interface," in Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition. IEEE Computer Society, 2000.
[28] E. Trucco and A. Verri, Introductory Techniques for 3-D Computer Vision. New Jersey: Prentice Hall.
[29] J.-Y. Bouguet, "Camera calibration toolbox for Matlab," doc/.
[30] G. Borgefors, "Distance transformations in digital images," Computer Vision, Graphics and Image Processing, vol. 34.
[31] J. D. Echard, M. Burrow, J. W. Miao, and D. Humphrey, "Pattern recognition techniques for cortical control of a robotic arm."
More informationAn Overview of Brain-Computer Interface Technology Applications in Robotics
An Overview of Brain-Computer Interface Technology Applications in Robotics Janet F. Reyes Florida International University Department of Mechanical and Materials Engineering 10555 West Flagler Street
More informationOptic Flow Based Skill Learning for A Humanoid to Trap, Approach to, and Pass a Ball
Optic Flow Based Skill Learning for A Humanoid to Trap, Approach to, and Pass a Ball Masaki Ogino 1, Masaaki Kikuchi 1, Jun ichiro Ooga 1, Masahiro Aono 1 and Minoru Asada 1,2 1 Dept. of Adaptive Machine
More information3. Sound source location by difference of phase, on a hydrophone array with small dimensions. Abstract
3. Sound source location by difference of phase, on a hydrophone array with small dimensions. Abstract A method for localizing calling animals was tested at the Research and Education Center "Dolphins
More informationA SURVEY ON HAND GESTURE RECOGNITION
A SURVEY ON HAND GESTURE RECOGNITION U.K. Jaliya 1, Dr. Darshak Thakore 2, Deepali Kawdiya 3 1 Assistant Professor, Department of Computer Engineering, B.V.M, Gujarat, India 2 Assistant Professor, Department
More informationReal-Time Face Detection and Tracking for High Resolution Smart Camera System
Digital Image Computing Techniques and Applications Real-Time Face Detection and Tracking for High Resolution Smart Camera System Y. M. Mustafah a,b, T. Shan a, A. W. Azman a,b, A. Bigdeli a, B. C. Lovell
More informationCamera Overview. Olympus Digital Cameras for Materials Science Applications: For Clear and Precise Image Analysis. Digital Cameras for Microscopy
Digital Cameras for Microscopy Camera Overview For Materials Science Microscopes Olympus Digital Cameras for Materials Science Applications: For Clear and Precise Image Analysis Passionate about Imaging
More informationThe CMU Pose, Illumination, and Expression (PIE) Database
Appeared in the 2002 International Conference on Automatic Face and Gesture Recognition The CMU Pose, Illumination, and Expression (PIE) Database Terence Sim, Simon Baker, and Maan Bsat The Robotics Institute,
More informationImprovement of Accuracy in Remote Gaze Detection for User Wearing Eyeglasses Using Relative Position Between Centers of Pupil and Corneal Sphere
Improvement of Accuracy in Remote Gaze Detection for User Wearing Eyeglasses Using Relative Position Between Centers of Pupil and Corneal Sphere Kiyotaka Fukumoto (&), Takumi Tsuzuki, and Yoshinobu Ebisawa
More informationMaster Thesis Proposal: Chess Brain-Computer Interface Design and Optimization for Low-Bandwidth and Errors
Master Thesis Proposal: Chess Brain-Computer Interface Design and Optimization for Low-Bandwidth and Errors Samuel A. Inverso Computer Science Department College of Computing and Information Sciences Rochester
More information8.2 IMAGE PROCESSING VERSUS IMAGE ANALYSIS Image processing: The collection of routines and
8.1 INTRODUCTION In this chapter, we will study and discuss some fundamental techniques for image processing and image analysis, with a few examples of routines developed for certain purposes. 8.2 IMAGE
More informationMEASURING AND ANALYZING FINE MOTOR SKILLS
MEASURING AND ANALYZING FINE MOTOR SKILLS PART 1: MOTION TRACKING AND EMG OF FINE MOVEMENTS PART 2: HIGH-FIDELITY CAPTURE OF HAND AND FINGER BIOMECHANICS Abstract This white paper discusses an example
More informationComputer Access Devices for Severly Motor-disability Using Bio-potentials
Proceedings of the 5th WSEAS Int. Conf. on COMPUTATIONAL INTELLIGENCE, MAN-MACHINE SYSTEMS AND CYBERNETICS, Venice, Italy, November 20-22, 2006 164 Computer Access Devices for Severly Motor-disability
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationSIMULATION-BASED MODEL CONTROL USING STATIC HAND GESTURES IN MATLAB
SIMULATION-BASED MODEL CONTROL USING STATIC HAND GESTURES IN MATLAB S. Kajan, J. Goga Institute of Robotics and Cybernetics, Faculty of Electrical Engineering and Information Technology, Slovak University
More informationLaboratory Project 1B: Electromyogram Circuit
2240 Laboratory Project 1B: Electromyogram Circuit N. E. Cotter, D. Christensen, and K. Furse Electrical and Computer Engineering Department University of Utah Salt Lake City, UT 84112 Abstract-You will
More informationOLYMPUS Digital Cameras for Materials Science Applications: Get the Best out of Your Microscope
Digital Cameras for Microscopy Camera Overview For Materials Science Microscopes OLYMPUS Digital Cameras for Materials Science Applications: Get the Best out of Your Microscope Passionate About Imaging
More informationNon-Invasive Brain-Actuated Control of a Mobile Robot
Non-Invasive Brain-Actuated Control of a Mobile Robot Jose del R. Millan, Frederic Renkens, Josep Mourino, Wulfram Gerstner 5/3/06 Josh Storz CSE 599E BCI Introduction (paper perspective) BCIs BCI = Brain
More informationDistributed Vision System: A Perceptual Information Infrastructure for Robot Navigation
Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp
More informationDigital Image Processing
Digital Image Processing Digital Imaging Fundamentals Christophoros Nikou cnikou@cs.uoi.gr Images taken from: R. Gonzalez and R. Woods. Digital Image Processing, Prentice Hall, 2008. Digital Image Processing
More informationShape sensing for computer aided below-knee prosthetic socket design
Prosthetics and Orthotics International, 1985, 9, 12-16 Shape sensing for computer aided below-knee prosthetic socket design G. R. FERNIE, G. GRIGGS, S. BARTLETT and K. LUNAU West Park Research, Department
More informationDigital Image Fundamentals. Digital Image Processing. Human Visual System. Contents. Structure Of The Human Eye (cont.) Structure Of The Human Eye
Digital Image Processing 2 Digital Image Fundamentals Digital Imaging Fundamentals Christophoros Nikou cnikou@cs.uoi.gr Images taken from: R. Gonzalez and R. Woods. Digital Image Processing, Prentice Hall,
More informationDigital Image Fundamentals. Digital Image Processing. Human Visual System. Contents. Structure Of The Human Eye (cont.) Structure Of The Human Eye
Digital Image Processing 2 Digital Image Fundamentals Digital Imaging Fundamentals Christophoros Nikou cnikou@cs.uoi.gr Those who wish to succeed must ask the right preliminary questions Aristotle Images
More informationEfficient Construction of SIFT Multi-Scale Image Pyramids for Embedded Robot Vision
Efficient Construction of SIFT Multi-Scale Image Pyramids for Embedded Robot Vision Peter Andreas Entschev and Hugo Vieira Neto Graduate School of Electrical Engineering and Applied Computer Science Federal
More informationA Study on Gaze Estimation System using Cross-Channels Electrooculogram Signals
, March 12-14, 2014, Hong Kong A Study on Gaze Estimation System using Cross-Channels Electrooculogram Signals Mingmin Yan, Hiroki Tamura, and Koichi Tanno Abstract The aim of this study is to present
More informationDigital Image Processing
Digital Image Processing Digital Imaging Fundamentals Christophoros Nikou cnikou@cs.uoi.gr Images taken from: R. Gonzalez and R. Woods. Digital Image Processing, Prentice Hall, 2008. Digital Image Processing
More informationVarious Calibration Functions for Webcams and AIBO under Linux
SISY 2006 4 th Serbian-Hungarian Joint Symposium on Intelligent Systems Various Calibration Functions for Webcams and AIBO under Linux Csaba Kertész, Zoltán Vámossy Faculty of Science, University of Szeged,
More informationCHAPTER 7 INTERFERENCE CANCELLATION IN EMG SIGNAL
131 CHAPTER 7 INTERFERENCE CANCELLATION IN EMG SIGNAL 7.1 INTRODUCTION Electromyogram (EMG) is the electrical activity of the activated motor units in muscle. The EMG signal resembles a zero mean random
More informationDriver Assistance for "Keeping Hands on the Wheel and Eyes on the Road"
ICVES 2009 Driver Assistance for "Keeping Hands on the Wheel and Eyes on the Road" Cuong Tran and Mohan Manubhai Trivedi Laboratory for Intelligent and Safe Automobiles (LISA) University of California
More informationMEASUREMENT OF ROUGHNESS USING IMAGE PROCESSING. J. Ondra Department of Mechanical Technology Military Academy Brno, Brno, Czech Republic
MEASUREMENT OF ROUGHNESS USING IMAGE PROCESSING J. Ondra Department of Mechanical Technology Military Academy Brno, 612 00 Brno, Czech Republic Abstract: A surface roughness measurement technique, based
More informationAn Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques
An Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques Kevin Rushant, Department of Computer Science, University of Sheffield, GB. email: krusha@dcs.shef.ac.uk Libor Spacek,
More informationMarkerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces
Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei
More informationImage Measurement of Roller Chain Board Based on CCD Qingmin Liu 1,a, Zhikui Liu 1,b, Qionghong Lei 2,c and Kui Zhang 1,d
Applied Mechanics and Materials Online: 2010-11-11 ISSN: 1662-7482, Vols. 37-38, pp 513-516 doi:10.4028/www.scientific.net/amm.37-38.513 2010 Trans Tech Publications, Switzerland Image Measurement of Roller
More informationNon-Invasive Brain-Actuated Control of a Mobile Robot
Non-Invasive Brain-Actuated Control of a Mobile Robot Jose del R. Millan 1 ' 2, Frederic Renkens 2, Josep Mourino 3, Wulfram Gerstner 2 1 Dalle Molle Institute for Perceptual Artificial Intelligence (IDIAP)
More informationA new seal verification for Chinese color seal
Edith Cowan University Research Online ECU Publications 2011 2011 A new seal verification for Chinese color seal Zhihu Huang Jinsong Leng Edith Cowan University 10.4028/www.scientific.net/AMM.58-60.2558
More informationAR 2 kanoid: Augmented Reality ARkanoid
AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular
More informationPerception. Read: AIMA Chapter 24 & Chapter HW#8 due today. Vision
11-25-2013 Perception Vision Read: AIMA Chapter 24 & Chapter 25.3 HW#8 due today visual aural haptic & tactile vestibular (balance: equilibrium, acceleration, and orientation wrt gravity) olfactory taste
More informationVoice Assisting System Using Brain Control Interface
I J C T A, 9(5), 2016, pp. 257-263 International Science Press Voice Assisting System Using Brain Control Interface Adeline Rite Alex 1 and S. Suresh Kumar 2 ABSTRACT This paper discusses the properties
More informationAutomatic Locating the Centromere on Human Chromosome Pictures
Automatic Locating the Centromere on Human Chromosome Pictures M. Moradi Electrical and Computer Engineering Department, Faculty of Engineering, University of Tehran, Tehran, Iran moradi@iranbme.net S.
More informationShape Memory Alloy Actuator Controller Design for Tactile Displays
34th IEEE Conference on Decision and Control New Orleans, Dec. 3-5, 995 Shape Memory Alloy Actuator Controller Design for Tactile Displays Robert D. Howe, Dimitrios A. Kontarinis, and William J. Peine
More informationVein and Fingerprint Identification Multi Biometric System: A Novel Approach
Vein and Fingerprint Identification Multi Biometric System: A Novel Approach Hatim A. Aboalsamh Abstract In this paper, a compact system that consists of a Biometrics technology CMOS fingerprint sensor
More informationColour correction for panoramic imaging
Colour correction for panoramic imaging Gui Yun Tian Duke Gledhill Dave Taylor The University of Huddersfield David Clarke Rotography Ltd Abstract: This paper reports the problem of colour distortion in
More informationA Brain-Controlled Wheelchair Based on P300 and Path Guidance
A Brain-Controlled Wheelchair Based on P300 and Path Guidance Brice Rebsamen 1, Etienne Burdet 2,1, Cuntai Guan 3, Haihong Zhang 3, Chee Leong Teo 1, Qiang Zeng 1, Marcelo Ang 1 and Christian Laugier 4
More informationME 6406 MACHINE VISION. Georgia Institute of Technology
ME 6406 MACHINE VISION Georgia Institute of Technology Class Information Instructor Professor Kok-Meng Lee MARC 474 Office hours: Tues/Thurs 1:00-2:00 pm kokmeng.lee@me.gatech.edu (404)-894-7402 Class
More informationBlood Vessel Tree Reconstruction in Retinal OCT Data
Blood Vessel Tree Reconstruction in Retinal OCT Data Gazárek J, Kolář R, Jan J, Odstrčilík J, Taševský P Department of Biomedical Engineering, FEEC, Brno University of Technology xgazar03@stud.feec.vutbr.cz
More informationInsights into High-level Visual Perception
Insights into High-level Visual Perception or Where You Look is What You Get Jeff B. Pelz Visual Perception Laboratory Carlson Center for Imaging Science Rochester Institute of Technology Students Roxanne
More informationThe Use of Neural Network to Recognize the Parts of the Computer Motherboard
Journal of Computer Sciences 1 (4 ): 477-481, 2005 ISSN 1549-3636 Science Publications, 2005 The Use of Neural Network to Recognize the Parts of the Computer Motherboard Abbas M. Ali, S.D.Gore and Musaab
More informationAxioCam MRc 5 A World of Digital Possibilities
Microscopy from Carl Zeiss AxioCam MRc 5 A World of Digital Possibilities More flexibility and more performance in microscope camera technology Impressive Performance A trend setter in digital microscopy,
More informationAutomatic Crack Detection on Pressed panels using camera image Processing
8th European Workshop On Structural Health Monitoring (EWSHM 2016), 5-8 July 2016, Spain, Bilbao www.ndt.net/app.ewshm2016 Automatic Crack Detection on Pressed panels using camera image Processing More
More informationResearch Article. ISSN (Print) *Corresponding author Jaydip Desai
Scholars Journal of Engineering and Technology (SJET) Sch. J. Eng. Tech., 2015; 3(3A):252-257 Scholars Academic and Scientific Publisher (An International Publisher for Academic and Scientific Resources)
More informationStudent Attendance Monitoring System Via Face Detection and Recognition System
IJSTE - International Journal of Science Technology & Engineering Volume 2 Issue 11 May 2016 ISSN (online): 2349-784X Student Attendance Monitoring System Via Face Detection and Recognition System Pinal
More informationHigh Performance Imaging Using Large Camera Arrays
High Performance Imaging Using Large Camera Arrays Presentation of the original paper by Bennett Wilburn, Neel Joshi, Vaibhav Vaish, Eino-Ville Talvala, Emilio Antunez, Adam Barth, Andrew Adams, Mark Horowitz,
More informationTHE idea of moving robots or prosthetic devices not by
1026 IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, VOL. 51, NO. 6, JUNE 2004 Noninvasive Brain-Actuated Control of a Mobile Robot by Human EEG José del R. Millán*, Frédéric Renkens, Josep Mouriño, Student
More informationFingertip Detection: A Fast Method with Natural Hand
Fingertip Detection: A Fast Method with Natural Hand Jagdish Lal Raheja Machine Vision Lab Digital Systems Group, CEERI/CSIR Pilani, INDIA jagdish@ceeri.ernet.in Karen Das Dept. of Electronics & Comm.
More informationBRAIN COMPUTER INTERFACE (BCI) RESEARCH CENTER AT SRM UNIVERSITY
BRAIN COMPUTER INTERFACE (BCI) RESEARCH CENTER AT SRM UNIVERSITY INTRODUCTION TO BCI Brain Computer Interfacing has been one of the growing fields of research and development in recent years. An Electroencephalograph
More informationDisplacement Measurement of Burr Arch-Truss Under Dynamic Loading Based on Image Processing Technology
6 th International Conference on Advances in Experimental Structural Engineering 11 th International Workshop on Advanced Smart Materials and Smart Structures Technology August 1-2, 2015, University of
More informationDemonstration of a Frequency-Demodulation CMOS Image Sensor
Demonstration of a Frequency-Demodulation CMOS Image Sensor Koji Yamamoto, Keiichiro Kagawa, Jun Ohta, Masahiro Nunoshita Graduate School of Materials Science, Nara Institute of Science and Technology
More informationThe all-in-one solution for wireless recording and stimulating in vivo
Wireless-Systems The all-in-one solution for wireless recording and stimulating in vivo 4, 8, 16 or 32 channels Options for electrical and optical Lightweight headstages Excellent signal--noise ratio Parallel
More informationAvailable online at (Elixir International Journal) Control Engineering. Elixir Control Engg. 50 (2012)
10320 Available online at www.elixirpublishers.com (Elixir International Journal) Control Engineering Elixir Control Engg. 50 (2012) 10320-10324 Wavelet analysis based feature extraction for pattern classification
More information