Radio Deep Learning Efforts Showcase Presentation


November 2016
Tim O'Shea, Senior Research Associate
hume@vt.edu | www.hume.vt.edu

Program Overview

Program Objective:
- Rethink fundamental approaches to radio signal processing from a data-centric, machine-learning-driven perspective
- Provide large steps forward in system performance, generalization, and autonomy

High-Level Motivation:
- The vision, voice, and text fields have seen major changes in how signal processing is done over the last 5 years, driven by advances in deep learning
- Learn from and adopt many of these methods to greatly advance our radio capabilities:
  - Move from expert-based features to learned features
  - Move from rule-based control systems to learned control systems
  - Focus on methods which generalize well and learn directly from data
  - Leverage architectures which work well in vision/voice and adapt them to the radio domain: convolutional networks, recurrent networks, attention models, deep reinforcement learning

Program Personnel:
- Tim O'Shea, Research Faculty
- Kayla Brosie, Graduate Student
- Seth Hitefield, Graduate Student

Technical Focus Areas

Signal Sensing and Spectrum Awareness:
- Signal detection and separation
- Signal modulation and protocol identification
- Control system learning and resource optimization
- Anomaly detection and semi-supervised learning of emitters

Learning to Communicate:
- Learning new physical-layer representations for radio transport
- Learning from sparse structure on legacy physical layers

Network Optimization Methods:
- Hyper-parameter optimization and architecture search methods
- Network parameter learning; probabilistic alternatives to back-propagation

(The current deep-dive areas are detailed on the following slides.)

Signal Modulation and Protocol Classification

Dataset Generation:
- GNU Radio based labeled synthetic dataset with known ground truth and realistic channel effects (fading, Doppler, noise)
- Working to expand modulations, variation, and channel effects
- More information available in the GRCon paper and at http://radioml.com
  (http://pubs.gnuradio.org/index.php/grcon/article/view/11)

Modulation Recognition:
- Demonstrated a conv-net based classifier, learning on raw RF data, which outperforms most current-day expert classifiers
- With some tuning, complexity is 4-6x lower in FLOP count for forward classification (no on-line learning) than the state-of-the-art DARPA expert system
- Full paper at https://arxiv.org/abs/1602.04105
- Looking at leveraging more recent advances from the vision domain
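The raw-I/Q conv-net idea can be sketched in miniature. The sketch below uses plain NumPy with random, untrained weights; the layer sizes, filter counts, and 11-class output are illustrative choices (not the paper's exact architecture) and only show how a 2xN block of I/Q samples flows through a convolutional layer into per-class probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 2x128 window of raw I/Q samples (I and Q as two real channels),
# the input format used for learning directly on raw RF data.
x = rng.standard_normal((2, 128))

def conv1d(x, w, b):
    """Valid 1-D convolution: x is (in_ch, T), w is (out_ch, in_ch, k)."""
    out_ch, in_ch, k = w.shape
    T = x.shape[1] - k + 1
    y = np.zeros((out_ch, T))
    for t in range(T):
        # Contract each filter against the current length-k patch.
        y[:, t] = np.tensordot(w, x[:, t:t + k], axes=([1, 2], [0, 1])) + b
    return y

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

n_classes = 11  # illustrative, e.g. a set of 11 candidate modulations

# Forward pass: conv layer -> ReLU -> flatten -> dense -> softmax.
w1, b1 = 0.1 * rng.standard_normal((16, 2, 8)), np.zeros(16)
h = relu(conv1d(x, w1, b1))                 # feature maps, shape (16, 121)
w2 = 0.1 * rng.standard_normal((n_classes, h.size))
p = softmax(w2 @ h.ravel())                 # class probabilities

print(p.shape)  # (11,)
```

In the real classifier these weights are learned by back-propagation over the labeled synthetic dataset; here they are random, so only the shapes and data flow are meaningful.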

Signal Modulation and Protocol Classification

Semi-Supervised Learning:
- How can we learn from unlabeled radio data, since this is most of the world's radio data?
- Organize and identify classes autonomously based on sparse representations and clustering methods
- Bootstrap from similar supervised learned features and from unsupervised feature learning

Raw RF Anomaly Detection:
- Apply reconstruction-based anomaly detection to raw wide-band radio data to detect anomalies (malicious users, interference, failures, etc.)
- Leverage recurrent sequence prediction models in place of a Kalman predictor, in an error-vector-distribution based novelty detection method
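The error-vector-distribution idea can be sketched with a simple stand-in predictor. In the sketch below, a short least-squares autoregressive predictor takes the place of the learned recurrent model, and the synthetic tone, injected burst, and 2x-quantile threshold are all illustrative assumptions; the point is only the mechanism (predict, measure error, threshold on the normal-data error distribution):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "normal" signal: a noisy complex tone, a toy stand-in for
# wide-band RF samples.
t = np.arange(4000)
x = np.exp(1j * 0.2 * t) + 0.05 * (rng.standard_normal(4000)
                                   + 1j * rng.standard_normal(4000))

# Inject an anomaly: a strong interference burst at samples 3000-3049.
x[3000:3050] += 2.0 * np.exp(1j * 1.3 * t[3000:3050])

# Fit a one-step autoregressive predictor on the (assumed clean) first half.
order = 4
X = np.column_stack([x[i:2000 - order + i] for i in range(order)])
y = x[order:2000]
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

# Prediction error magnitude over the full stream.
Xa = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
err = np.abs(Xa @ coeffs - x[order:])

# Threshold taken from the error distribution on normal data.
thresh = 2.0 * np.quantile(err[:2000 - order], 0.999)
alarms = np.where(err > thresh)[0] + order  # sample indices flagged anomalous

print(len(alarms) > 0)  # True: the burst produces large prediction errors
```

The actual approach replaces the AR fit with a recurrent sequence model trained on raw RF, but the thresholding of the prediction-error distribution works the same way.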

Hyper-Parameter Optimization Methods

Hyper-Parameter Search:
- In neural networks we typically have two kinds of parameters that define the network:
  - Weight Parameters: used in each neuron to define the transfer function from input to output; tuned by back-propagating loss-function gradients through the network
    - Fully connected layer weight and bias matrices
    - Convolutional layer filter weights
    - Recurrent layer weights and biases
  - Hyper-Parameters: chosen, typically before training begins, to define how the network is structured; they determine how many actual parameters gradient-based training must optimize
    - Network depth, width, layer types, connections, activation functions, convolutional filter configurations, weight initializations, learning rate, dropout rate, regularization rates
- Since gradient descent for a single model can be very expensive, we need a lot of concurrency to efficiently search both the parameter space and the hyper-parameter space
- Ongoing work on an open distributed architecture for performing hyper-parameter searches: https://github.com/osh/dist_hyperas
  - Runs on top of Keras (Theano and TensorFlow backends)
  - Architecture: a meta-model plus dataset feed a hyper-param search controller, which dispatches SGD model optimization jobs to GPU worker nodes and ranks their performance
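A minimal serial sketch of that search loop: the hyper-parameter names and ranges below are hypothetical (not dist_hyperas's actual configuration), and a mock scoring function stands in for the expensive distributed SGD training run on a worker node; only the sample-evaluate-rank structure is the point:

```python
import random

# Illustrative hyper-parameter search space (names/ranges are assumptions).
space = {
    "depth": [2, 3, 4, 5],
    "width": [32, 64, 128, 256],
    "dropout": [0.0, 0.25, 0.5],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def evaluate(cfg):
    """Stand-in for an expensive SGD training run on a worker node.

    Returns a mock validation score; a real system would train the
    model defined by cfg and report held-out performance.
    """
    random.seed(str(sorted(cfg.items())))  # deterministic mock score
    base = 0.5 + 0.05 * cfg["depth"] + 0.0005 * cfg["width"]
    return min(base - 0.1 * cfg["dropout"] + random.uniform(-0.05, 0.05), 1.0)

def random_search(space, n_trials, seed=0):
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        # Controller samples a configuration from the search space...
        cfg = {k: rng.choice(v) for k, v in space.items()}
        # ...dispatches it for (mock) training, and records the score.
        trials.append((evaluate(cfg), cfg))
    # Performance-ranking step: best configuration first.
    trials.sort(key=lambda t: t[0], reverse=True)
    return trials

trials = random_search(space, n_trials=20)
best_score, best_cfg = trials[0]
print(best_cfg)
```

A distributed version farms the `evaluate` calls out to GPU workers in parallel; smarter controllers replace the random sampler with Bayesian or evolutionary proposals, but the controller/worker/ranking split stays the same.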