REBUILDING THE BRAIN: ENGINEERING NEUROMORPHIC PROCESSING


Conference Section B12, Paper #170

Disclaimer: This paper partially fulfills a writing requirement for first-year (freshman) engineering students at the University of Pittsburgh Swanson School of Engineering. This is a student paper, not a professional paper. It is based on publicly available information and may not provide complete analyses of all relevant data. If this paper is used for any purpose other than the authors' partial fulfillment of a writing requirement for first-year engineering students at the University of Pittsburgh Swanson School of Engineering, the user does so at his or her own risk.

REBUILDING THE BRAIN: ENGINEERING NEUROMORPHIC PROCESSING

Jonathan Coles, jmc307@pitt.edu, Sanchez 5:00; drv7@pitt.edu, Mahboobin 10:00

Abstract: We discuss the societal impact of neuromorphic technology, a processing method that mimics human thought with silicon microchips. With radically higher processing speeds, these microchips, called neuromorphic chips, greatly outperform digital computers. If they are perfected, brain-mimicking microchips will likely be found in nearly every computational device in the near future. The basis of neuromorphic chip technology is mimicking neural function in silicon. The chips comprise millions of silicon neurons that emulate the human brain's processing capability; their basic composition is that of today's microprocessors with additional silicon neurons, which work in an analogous manner to process information. The newest front of neuromorphic engineering is IBM's chip, TrueNorth. This chip has approximately one million silicon neurons, similar to the brain of a bee. Because of this, it is capable of executing upwards of 46 billion operations per second for every watt of energy it consumes, a vast improvement over today's digital computers. The processing power comes from the neural networks created by silicon circuits within the chip, which is about the size of a postage stamp.
TrueNorth integrates pattern recognition and memory, the core of artificial intelligence. This paper elaborates on how neuromorphic engineering will have a strong global impact on technology and society, using examples of current research, detailed technological description, and rational assessment of ethics to evaluate neuromorphic engineering's possibility and potential.

Key words: artificial intelligence, biomimicry, computer science, electrical engineering, neuromorphic engineering, Google, processing, Qualcomm

MODERN PROCESSING AND NEURAL BIOMIMICRY

Within the last century, computers have become an integral part of our everyday lives. Roughly 2 billion personal computers were in frequent use in 2015, with even more in use by 2017 [1]. The computational processing programmed into our laptops and PCs has allowed our species to take great leaps into historically untouched depths of science and engineering. Our computers have evolved along with our scientific knowledge; despite this, we have been using iterations of the same computer design for decades. Although tech giants such as IBM, Intel, and Qualcomm have made innovations in computer processing, industry-standard computers are still designed around digital number-crunching that reads binary data: information that consists of billions of ones and zeroes. The speed and power of a computer using this form of computation are limited by battery power and random access memory (RAM); the main innovations to digital computation have come from innovations to RAM and batteries. Digital processors in modern computers have gotten faster at reading large amounts of numbers, but doing so still takes far too much energy and hardware space. Neuromorphic technology cuts that energy use and increases speed while reducing processor size.

Neuromorphic engineering is a field focused on biomimicry of the mammalian brain.
Scientists in this field build tiny silicon nodes that mimic the neurons within an animal's brain, allowing information to be processed organically. Much like our own brain, a chip using neuromorphic technology can process images very quickly, learn and study patterns of information, make informed inductions, and execute millions of operations every second while using very little energy. This technology is reinventing how computers work as a whole by designing solutions to modern computational processing's biggest problems. We will describe and illustrate the functionality and design of current neuromorphic research by detailing the technology and construction of neuromorphic chips, and show that neuromorphic engineering is vital to improving information processing and artificial intelligence. If done correctly, neuromorphic engineering could have massive global implementations in every field of engineering and computer science, and will find its way into the modern computer.

WHAT IS NEUROMORPHIC ENGINEERING?

To fully comprehend all aspects of neuromorphic engineering, one must have an advanced understanding of not only the development of electronics, but also a deep comprehension of how our brains work. Tom Simonite wrote on this topic in an article for the MIT Technology Review: "Brains compute in parallel as the electrically active cells inside them, called neurons, operate simultaneously and unceasingly. Bound into intricate networks by threadlike appendages, neurons influence one another's electrical pulses via connections called synapses. When information flows through a brain, it processes data as a fusillade of spikes that spread through its neurons and synapses" [2]. The brain contains on the order of one hundred billion neurons and approximately one quadrillion synapses. Each synapse functions somewhat like a microprocessor, passing nervous-system signals from one neuron to another. Processing power is measured in petaflops; one petaflop equals one quadrillion calculations per second. The most powerful supercomputer can process about 16 petaflops of data, while the human brain can process over 35 [3]. Neurological processing outweighs the ability of any modern computer. As you read this sentence, billions of neurons are firing in your brain, allowing you to process the information written on this page. Your brain is simultaneously handling motor control, breathing, organ function, and auditory and visual cognition, with almost no conscious thought put into any of these processes.

Biological and Silicon Systems

There are only two effective computing devices in the physical world: digital computers and brains. Both are physical machines; one is built from silicon and metal while the other is made from hydrocarbons and aqueous solutions. Both are able to cipher information. Brains, however, can actually interpret and react to sensory and motor interfaces within their environment while performing cognitive-level tasks [4]. It is almost unbelievable how efficiently the brain is able to compute data.
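The petaflop comparison above can be made concrete with a quick back-of-envelope calculation. The figures below are the estimates cited in the text (per [3]), not precise measurements, and the one-signal-per-second synapse rate is an assumed round number used only for an order-of-magnitude check.

```python
# Back-of-envelope comparison of the processing rates cited above.
PETAFLOP = 1e15  # one quadrillion calculations per second

supercomputer_pflops = 16  # most powerful supercomputer (per [3])
brain_pflops = 35          # estimated human brain throughput (per [3])

ratio = brain_pflops / supercomputer_pflops
print(f"The brain outpaces the supercomputer by a factor of ~{ratio:.2f}")

# Sanity check from the synapse count: ~1 quadrillion synapses, each
# passing roughly one signal per second, is already ~1e15 events per
# second, the same order of magnitude as a petaflop machine.
synapses = 1e15
events_per_second = synapses * 1.0
print(f"Synaptic events per second: {events_per_second:.0e}")
```

By this rough accounting the brain delivers about twice the throughput of the fastest supercomputer while drawing only a few tens of watts, which is exactly the efficiency gap neuromorphic engineering targets.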
If this efficiency were integrated into normal computers, there would be a huge technological shift; information processing would be utterly revolutionized. Uncovering the innermost functions of the human brain can lead to these advancements. This is the basis of neuromorphic engineering: understanding the computational principles used by the brain in order to develop technology that mimics biological processes. Neuromorphic engineering, a relatively new subfield of electrical engineering, uses analog electrical circuits to mimic neuro-biological structures present in the nervous system [4]. It is a comprehensive integration of brain function and technology: the frontier of artificial intelligence. Neuromorphic engineering is an integral part of electrical engineering, whose principal undertaking is to propel technology forward. The amalgamation of the body and technology has propelled our civilization forward within the past century, and the future of technology lies in the hands of neuromorphic engineers.

One of the most influential neuromorphic engineering groups is the Qualcomm Zeroth team, which is dedicated to creating neuromorphic technology. Its goals include biologically inspired learning, enabling devices to see and perceive the world as humans do, and creating and defining a neural processing unit. Instead of the conventional coding technique of preprogramming behaviors and their possible outcomes, the engineers at Qualcomm have created software that adapts to its environment and learns from consequences as it processes information. This is an important feature because it truly mimics neurological methods of computation. Another pivotal point of the Zeroth technology is that it replicates biological senses and simulates the synapses of brain communication. The ability to program these senses comes from mathematical data collected by neuroscientists that depicts neuron behavior.
This data precisely characterizes when certain neurons send and receive information, and then analyzes how and where it is processed within the brain. Neurons can be timed through the electrical impulses in each cell's membrane. This innovation in neuroscience has helped inspire innovations in neuromorphic technology. The Qualcomm engineers are striving to create a definite form of neural processing that can be directly implemented in society [5]. The high caliber of their work sets a precedent for future data processing, and this research is an important stepping stone for all future neuromorphic engineering endeavors. A relatively recent invention, the neuromorphic chip, integrates all of this information and implements true neurobiological processing into conventional computing.

NEUROMORPHIC CHIPS: THE FUTURE OF INFORMATION PROCESSING

Neuromorphic engineers have integrated biological language with computing in order to effectively imitate subconscious brain function. The basis of neuromorphic chip technology is mimicking neural function in silicon. The chips comprise millions of neurons that emulate the human brain's processing capability; their basic composition is that of today's microprocessors with additional silicon neurons, which work in an analogous manner to process information. Engineers are trying to imitate brain functionality by building mechanical versions of neurobiological systems. These chips are designed to process auditory and sensory data and then respond based on qualitative measures; these responses are not preprogrammed but are learned through observation and careful analysis of environmental stimuli [6]. Generally, the chips are made by using collections of transistors to imitate the electrical spiking behavior of a neuron and then creating silicon synapses between them by wiring them together. The chips also contain silicon microcolumns, repeating clumps of neurons that perform certain functions [2]. Programming all of these aspects in a harmonious way is an extremely delicate process: the largest working surface neuromorphic engineers use is smaller than a pinhead and contains hundreds of neurons and thousands of synapses that must be individually programmed.

Inside the Neuromorphic Chip

Neuromorphic chips transmit and respond to information sent in spikes of energy. This differs from conventional computers, which operate with continuously varying voltages. The spikes charge only a small fraction of the silicon neuron as they are processed, whereas conventional processing techniques must hold every transmission line at a certain voltage, constantly and continuously [7]. This creates a dramatic difference in the amount of power used. Neuromorphic chips are not endlessly running and changing voltage; they are able to pinpoint and spike certain neural processes at the time they occur, which is wildly more efficient. This difference in energy consumption is one of the most important factors when comparing conventional processing to neuromorphic processing.

The newest designs for neuromorphic chips are based on two technologies: lateral spin valves and memristors. A spin valve is a device consisting of conducting magnetic materials; the electrical resistance between the materials can change depending on the relative alignment of the magnetization between the layers. The lateral spin valves within the chips are tiny magnets connected by metal wires that can switch orientation depending on the spin of the electrons passing through them [7]. They act like a brain's neural network. The way these valves are connected throughout the chip creates a framework that information can easily pass through.
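The energy argument above can be sketched with a toy model: a conventional line must be driven on every clock tick, while an event-driven line spends energy only when a spike actually occurs. All numbers here are illustrative assumptions chosen to show the scaling, not measured hardware values.

```python
# Toy comparison of always-on (conventional) versus event-driven
# (spiking) signalling. SPIKE_RATE and ENERGY_PER_EVENT are assumed
# values for illustration only, not real chip measurements.
TICKS = 1_000_000        # clock ticks simulated
SPIKE_RATE = 0.001       # fraction of ticks that actually carry a spike
ENERGY_PER_EVENT = 1.0   # arbitrary energy units per driven tick

continuous_energy = TICKS * ENERGY_PER_EVENT             # line always held at voltage
spiking_energy = TICKS * SPIKE_RATE * ENERGY_PER_EVENT   # energy spent only on spikes
savings = continuous_energy / spiking_energy

print(f"continuous: {continuous_energy:.0f} units")
print(f"spiking:    {spiking_energy:.0f} units")
print(f"savings:    {savings:.0f}x")
```

The point is only the scaling: when activity is sparse, event-driven signalling wins by roughly the inverse of the spike rate, which is why spiking hardware can be orders of magnitude more frugal than hardware that holds every line at voltage.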
Energy is then sent through this framework in spikes, as mentioned above. The spin valves operate at terminal voltages measured in millivolts, which is significant because this is much less than conventional microchips require. Memristors are two-terminal electrical components that link electric charge and magnetic flux. Fundamentally, they act like resistors with memory storage, and they regulate the way the lateral spin valves store information. The structure these two technologies create is an ideal environment for exactly the kind of processing biological systems do well: analog data sensing, cognitive computing, and associative memory [7]. Memristors and lateral spin valves are both relatively new in the field of neuromorphic engineering; their development only further blurs the boundary between biological and synthetic systems.

This is not the only way neuromorphic chips are being made. Another method of constructing them is through programmable neural silicon. Carver Mead, the creator of the first silicon retina, began mimicking ion flow across a neuron's membrane with electron flow through a transistor's channel. The electrons within the transistor act very similarly to the chemical ions in our neural membranes. This is integrated into the chip's hardware through versatile ion-channel analogs and reconfigurable synaptic connections [8]. The analog emulates a specific range of behaviors based on the arrangement of particular displays or transmissions. The neural silicon is soft-wired within this analog and assigned unique addresses, so that every time a spike occurs the chip outputs that specific neuron's address; an analogous process can be found in the human brain as well. These silicon neurons are modeled after biological neurons using transistors and circuits: a circuit is created and converted into an array of silicon neurons.
The neurons are linked with transistors throughout the circuit. The purpose of each neuron is to integrate synaptic inputs leading to depolarization, which is how signals actually travel through the circuit, and to initiate an action potential that propagates to the neuron's terminals within the transistors. These terminals affect the channel current through specific voltages [9]. Because of the capability of the silicon neurons, the terminal voltages are spikes rather than continuous current, which leads to much less energy consumption. The components of the circuit mimic biological processing.

FIGURE 1 [11]: Example of a silicon neuron within an engineered circuit

TrueNorth: Pointing towards the Future

The newest frontier of neuromorphic engineering is IBM's chip, TrueNorth. The chip is made up of hundreds of circuits working in parallel. It contains 5.4 billion transistors but uses only 70 milliwatts of power; by comparison, modern computer processors have approximately 1.4 billion transistors but draw power measured in watts. The efficiency of this chip is astronomical: it can compute over 46 billion operations a second per watt of energy [6]. Its low power consumption could have a serious impact on present technology; energy consumption could be tremendously decreased if IBM's technology were integrated into conventional computers. The transistors in the chip work similarly to the neural networks within the brain. TrueNorth has one million silicon neurons, making it about as complex as the brain of a bee. The neurons are able to signal when certain information passes a specific threshold [6], and they organize this data based on pre-programmed patterns. The difference between this processing technique and conventional computation is that the chip is able to learn from these patterns and integrate new solutions based on its environment. This all comes from the basis of neuromorphic engineering. The chip can hold all of this information, much like a memory, and use it for future tasks. TrueNorth integrates pattern recognition with memory storage, which allows the chip to function like many tiny microcolumns, each performing a specific task. This is the core of artificial intelligence: engineers have created a mind that can process and store information based on the past and the present.

TrueNorth, IBM's brain-like microprocessor, has been found to be exceptionally proficient at inference work for deep neural networks. In particular, "the chip has demonstrated it's especially good at image recognition, being able [to] accurately classify such data much more efficiently, from an energy perspective, than traditional processor architectures, suggesting new applications in mobile computing, IoT, robotics, autonomous cars, and HPC" [7]. TrueNorth is one of the first truly energy-efficient neuromorphic chips. The neural networks within the chip are much faster at processing information than the computers of today, and this technology can propel many aspects of mainstream machinery forward.
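The thresholded behavior described above (integrate inputs, spike when a threshold is crossed, then reset) is the classic leaky integrate-and-fire model. The sketch below is a minimal software version of that idea; the threshold and leak parameters are illustrative choices, not values taken from TrueNorth.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential
# integrates incoming current, decays ("leaks") each step, and emits a
# spike when it crosses a threshold, after which it resets.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:              # threshold crossed
            spikes.append(t)
            potential = 0.0                     # reset after spiking
    return spikes

# A weak stimulus never reaches threshold; a stronger burst fires repeatedly.
stimulus = [0.05] * 10 + [0.4] * 10
print(simulate_lif(stimulus))
```

The neuron stays silent for the weak input and produces a regular spike train once the input is strong enough: the event-driven, threshold-gated behavior that the silicon neurons approximate in hardware.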
Processing speed will increase and power consumption will decrease through the implementation of this technology. Implementing neuromorphic concepts will lessen the gap between humans and machines, which is a slightly frightening idea. Do we really want to bridge this divide? Will creating artificial intelligence hinder human progress? These are ethical questions neuromorphic engineers must face as they complete their research. There is still a long way to go before the human brain is completely replicated, but neuromorphic engineering is a step towards the future.

FIGURE 2 [10]: Example of a TrueNorth chip core

FUTURE IMPROVEMENTS ON CURRENT COMPUTING STANDARDS

Modern computers function using digital processing, a method that reads binary data in ones and zeroes. These ones and zeroes signal on or off to the computer, informing the processor of the details of the data (such as pixel arrangement or numerical value). This is the language of modern digital computers. A byte of information consists of eight binary digits, so spelling out a word like "porridge" takes eight bytes, or sixty-four ones and zeroes. Most images on our computers consist of several megabytes of binary code; one megabyte is approximately eight million binary digits. Programs and applications can contain several gigabytes of information, upwards of eight billion ones and zeroes, and some hard drives can store a terabyte, or eight trillion binary digits. If an image contains three megabytes of information, a computer has to process twenty-four million ones and zeroes to display it. Modern computers are capable of reading this amount of information in a few milliseconds, but doing so still takes far more energy and time than our brains would need to analyze the same image. Neuromorphic engineering rethinks this approach to information processing.
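The binary bookkeeping described above can be made concrete. Assuming standard eight-bit ASCII encoding, each letter of "porridge" becomes one byte:

```python
# Spell out "porridge" as the ones and zeroes a digital processor reads.
# Assumes eight-bit ASCII encoding of each character.
word = "porridge"
bits = " ".join(format(ord(ch), "08b") for ch in word)

print(bits)
print(f"{len(word)} characters -> {len(word) * 8} binary digits")
```

Eight letters at eight bits each come to sixty-four ones and zeroes; scaling the same arithmetic up is what turns a three-megabyte image into the twenty-four million binary digits mentioned above.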
In modelling computer chips on the mammalian brain, neuromorphic engineers are designing new chips that will conquer the current problems of large data and high energy usage [10]. These chips do so by mimicking organic neural synapses with silicon nodes. Electrical current is transferred very quickly across the fabricated neural net of the neuromorphic chips, producing an emulation of mammalian thought and reaction [11]. This is a much faster process than crunching billions of numbers to process an image. Computers with neuromorphic chips can be programmed for human-level vision and speech functions using microphones and cameras. These features would make computer image and sound recognition function at the same level as human sight and hearing [12]. The main innovation made by neuromorphic engineers comes from a shift in processing ideology: instead of encoding data into machine-dictated strings of countless numbers, neuromorphic chips are designed to read data as a human or animal would. These chips are similar to an animal's brain in more ways than one; neuromorphic processing is also more energy-efficient than digital processing. This increase in efficiency reduces the need for large batteries, as less energy is needed for the computer to operate. Smaller batteries and greater energy efficiency will improve the environmental sustainability of our computers, as less electricity, battery acid, and metal will be used to fabricate and operate the billions of computers in use today. In addition, neuromorphic chips are also much faster and more powerful than digital processors, just like our brains [2]. This increase in speed and power would diminish the need for large amounts of RAM and memory storage, as neuromorphic chips would process data with greater speed and energy efficiency; this would also greatly decrease the size of the hardware and alleviate the environmental strain of manufacturing large computers.

Societal Impact

If modern computers were built with neuromorphic processors, members of every engineering field, as well as all computer-using consumers, would experience a strong upgrade in available technology. New computers built with neuromorphic chips would feature battery life nearly twice as long as the current standard, smaller devices, more ergonomic hardware design, processing times up to 3000% faster, and similarly higher processing power [12].
With all of these innovations, new software and operating systems would be able to offer historically unprecedented functionality such as speech and sight [12], thanks to the staggeringly high processing speed and power of the neuromorphic chips. A computer that uses neuromorphic processing would function faster and stronger than a digital computer while using a fraction of the energy; this would cut the massive amount of energy used by the world's 2 billion computers in half. The largest, most expensive digital computers used by engineers could be replaced by the smallest of laptops using neuromorphic processing, as the two would be equal in speed and power. Software developers would be able to create programs with functionalities that are currently only dreamt of, as they would not be nearly as restricted by RAM and processing speed. The possibility of software innovations arising from neuromorphic engineering is highlighted in Jeffrey Burt's article on the future of memristor research for Next Platform: "Much of the talk around artificial intelligence these days focuses on software efforts (various algorithms and neural networks) and such hardware devices as custom ASICs for those neural networks and chips like GPUs and FPGAs that can help the development of reprogrammable systems" [13]. With faster development of programs that can function with fewer energy and power restrictions, the average quality and capability of software would greatly increase. Glitches and drops in frame rate would not occur, and programs would run on less energy, making our computers more sustainable. Applications could be developed that predict and evaluate real-world situations, making the engineer's job much easier. Engineers could get a lot more done with less stress, and do better work [12].
Here is an example: when using Google SketchUp, a popular 3D modelling program used by many architects and civil engineers, my 2015 MacBook Pro would freeze every couple of seconds. The more detail I added to my model, the more frequently I experienced glitches, because my computer's processor was struggling to analyze the billions of zeroes and ones being fed to it by the program. With a neuromorphic processor, the program could function much more simply. My computer would be able to comprehend the needs of a program the same way our brains would: the program would communicate with the silicon neurons of the neuromorphic chip, interpret my interactions with the keyboard and mouse, organically understand what I was trying to do and how I was trying to edit my model, and then do just that for me. Neuromorphic chips could function the same way for other programs, making everyday computer use much simpler and faster once completed neuromorphic processors are implemented in commercial computers.

Market and Industry Sustainability

With nearly a decade of research on neuromorphic technology, developers are nearing completion of market-ready processors. The efficiency of modern computers has been rising only slowly for many years, and the technology industry is in need of a big leap. The switch from the current standard of digital computing to neuromorphic computing would be very worthwhile, as upwards of 2 billion computers would use less energy and function up to 3000 times faster [12]. The digital-processing computers that become obsolete would create a great deal of waste after being replaced by neuromorphic computers, putting strain on the environment. However, these machines would be replaced anyway within 2-4 years, the common lifecycle of a computer [14].
Many computers would have already gone through their lifecycles by this time; implementing neuromorphic chips would simply replace them with much better technology. When neuromorphic chips are developed to be industry-ready, computer companies such as Apple, Microsoft, and Dell must be prepared to make the switch from digital to neuromorphic. They likely will be, as implementing this new technology will greatly increase the performance of their products. Most Dell and Apple computers already use digital processors developed by the same technology companies that are leaders in neuromorphic engineering, such as Qualcomm and Intel. The computer companies could simply replace the digital processors in their designs with neuromorphic processors made by the same supplier, making the switch very simple. Future market implementation will be dictated by how the companies that produce neuromorphic processors market their products to the companies producing the computers. Research and development on the first round of market-ready chips is set to conclude by 2026, by which point the global market for neuromorphic chips is predicted to grow to 10 billion dollars [15]; in other words, neuromorphic chips are projected to have a prominent presence in the market by then. That figure seems large until it is compared to the 2.9-trillion-dollar size of the current global technology market; a 10-billion-dollar market is very small in the grand scheme of the technology industry. This investment would lead to massive improvements in the overall functionality and environmental sustainability of modern technology, diminishing the energy use of 2 billion computers by almost 50%. The market implementation of neuromorphic chips would not only revolutionize computer processing, but would also be a fantastic victory for global environmentalists.

CONCLUSION

The possibilities of neuromorphic technology point towards the next great revelation in computer and electrical engineering. Neuromorphic processing presents a rise in computer capability that is historically unrivalled, and could lead the way to revolutionary changes in our everyday lives. The concept of organic interaction with our computers gives fuel to the hope of creating a truly capable form of artificial intelligence, and further underlines the possibilities that could arise from neuromorphic technology.
Silicon synapse technology used in neuromorphic processors is the greatest opportunity the human race has had thus far to move past the decades-old method of binary computing and head into a new age of technological advancement. With robust improvements to energy usage and the potential for massive increases in the functionality of modern software, neuromorphic technology is in a position to make the world's 2 billion computers run faster and be more environmentally friendly. It is set to take over the insides of our beloved laptops and cellphones within a decade.

SOURCES

[1] "How Many Computers Are in the World?" Worldometers.info.
[2] Simonite, T. "Thinking in Silicon." MIT Technology Review.
[3] Webster, S. "Earth's supercomputing power surpasses human brain three times over." Raw Story.
[4] Indiveri, Giacomo and Horiuchi, Timothy K. "Frontiers in Neuromorphic Engineering." Frontiers in Neuroscience. Published 03/10/2011.
[5] "Introducing Qualcomm Zeroth Processors: Brain-Inspired Computing." Qualcomm. Published 03/10/2013.
[6] Hof, R. "Neuromorphic Chips." MIT Technology Review.
[7] "Intel Reveals Neuromorphic Chip Design." MIT Technology Review.
[8] "Programmable Neural Silicon." Brains in Silicon, Stanford University.
[9] "Designing and Testing Neuromorphic Chips." Brains in Silicon, Stanford University. Accessed 3/23.
[10] Markoff, John. "IBM Develops a New Chip That Functions Like a Brain." The New York Times.
[11] Feldman, M. "IBM Finds Killer App for TrueNorth Neuromorphic Chip." TOP500.
[12] Esser, Steven. "Convolutional Networks for Fast, Energy-Efficient Neuromorphic Computing." IBM Research Almaden.
[13] Burt, Jeffrey. "Memristor Research Highlights Neuromorphic Device Future." Next Platform.
[14] Dubash, Manek. "The Desktop Lifecycle: How Long Is It Anyway?" The Register.
[15] "Global Neuromorphic Chip Market to Surpass US$ 10 Billion in Revenues by 2026; Smart Machine Integration and AI Systems Driving Market Growth." Future Market Insights.

ACKNOWLEDGEMENTS

Firstly, we would like to thank Beth Newborg, who did a fantastic job providing feedback as our writing instructor, and Alyssa Srock and Mr. Wunderley for guiding us through the writing process as our conference chairs. Of course, we would like to thank our parents, Radisav Vidic, Natasa Vidic, Nicholas Coles, and Jennifer Matesa, all four of whom work for the University of Pittsburgh and have greatly influenced our education. We would not be where we are today were it not for their amazing guidance and trust. Lastly, we would like to thank our friends; without them our days would have been much harder and filled with a lot less laughter.



More information

BLUE BRAIN - The name of the world s first virtual brain. That means a machine that can function as human brain.

BLUE BRAIN - The name of the world s first virtual brain. That means a machine that can function as human brain. CONTENTS 1~ INTRODUCTION 2~ WHAT IS BLUE BRAIN 3~ WHAT IS VIRTUAL BRAIN 4~ FUNCTION OF NATURAL BRAIN 5~ BRAIN SIMULATION 6~ CURRENT RESEARCH WORK 7~ ADVANTAGES 8~ DISADVANTAGE 9~ HARDWARE AND SOFTWARE

More information

ISSCC 2003 / SESSION 1 / PLENARY / 1.1

ISSCC 2003 / SESSION 1 / PLENARY / 1.1 ISSCC 2003 / SESSION 1 / PLENARY / 1.1 1.1 No Exponential is Forever: But Forever Can Be Delayed! Gordon E. Moore Intel Corporation Over the last fifty years, the solid-state-circuits industry has grown

More information

What is Artificial Intelligence? Alternate Definitions (Russell + Norvig) Human intelligence

What is Artificial Intelligence? Alternate Definitions (Russell + Norvig) Human intelligence CSE 3401: Intro to Artificial Intelligence & Logic Programming Introduction Required Readings: Russell & Norvig Chapters 1 & 2. Lecture slides adapted from those of Fahiem Bacchus. What is AI? What is

More information

Computer Science as a Discipline

Computer Science as a Discipline Computer Science as a Discipline 1 Computer Science some people argue that computer science is not a science in the same sense that biology and chemistry are the interdisciplinary nature of computer science

More information

A Divide-and-Conquer Approach to Evolvable Hardware

A Divide-and-Conquer Approach to Evolvable Hardware A Divide-and-Conquer Approach to Evolvable Hardware Jim Torresen Department of Informatics, University of Oslo, PO Box 1080 Blindern N-0316 Oslo, Norway E-mail: jimtoer@idi.ntnu.no Abstract. Evolvable

More information

A Balanced Introduction to Computer Science, 3/E

A Balanced Introduction to Computer Science, 3/E A Balanced Introduction to Computer Science, 3/E David Reed, Creighton University 2011 Pearson Prentice Hall ISBN 978-0-13-216675-1 Chapter 10 Computer Science as a Discipline 1 Computer Science some people

More information

Neuromorphic Analog VLSI

Neuromorphic Analog VLSI Neuromorphic Analog VLSI David W. Graham West Virginia University Lane Department of Computer Science and Electrical Engineering 1 Neuromorphic Analog VLSI Each word has meaning Neuromorphic Analog VLSI

More information

Neural Networks The New Moore s Law

Neural Networks The New Moore s Law Neural Networks The New Moore s Law Chris Rowen, PhD, FIEEE CEO Cognite Ventures December 216 Outline Moore s Law Revisited: Efficiency Drives Productivity Embedded Neural Network Product Segments Efficiency

More information

AI Frontiers. Dr. Dario Gil Vice President IBM Research

AI Frontiers. Dr. Dario Gil Vice President IBM Research AI Frontiers Dr. Dario Gil Vice President IBM Research 1 AI is the new IT MIT Intro to Machine Learning course: 2013 138 students 2016 302 students 2017 700 students 2 What is AI? Artificial Intelligence

More information

Cognitronics: Resource-efficient Architectures for Cognitive Systems. Ulrich Rückert Cognitronics and Sensor Systems.

Cognitronics: Resource-efficient Architectures for Cognitive Systems. Ulrich Rückert Cognitronics and Sensor Systems. Cognitronics: Resource-efficient Architectures for Cognitive Systems Ulrich Rückert Cognitronics and Sensor Systems 14th IWANN, 2017 Cadiz, 14. June 2017 rueckert@cit-ec.uni-bielefeld.de www.ks.cit-ec.uni-bielefeld.de

More information

Executive summary. AI is the new electricity. I can hardly imagine an industry which is not going to be transformed by AI.

Executive summary. AI is the new electricity. I can hardly imagine an industry which is not going to be transformed by AI. Executive summary Artificial intelligence (AI) is increasingly driving important developments in technology and business, from autonomous vehicles to medical diagnosis to advanced manufacturing. As AI

More information

THE NEXT WAVE OF COMPUTING. September 2017

THE NEXT WAVE OF COMPUTING. September 2017 THE NEXT WAVE OF COMPUTING September 2017 SAFE HARBOR Forward-Looking Statements Except for the historical information contained herein, certain matters in this presentation including, but not limited

More information

SenseMaker IST Martin McGinnity University of Ulster Neuro-IT, Bonn, June 2004 SenseMaker IST Neuro-IT workshop June 2004 Page 1

SenseMaker IST Martin McGinnity University of Ulster Neuro-IT, Bonn, June 2004 SenseMaker IST Neuro-IT workshop June 2004 Page 1 SenseMaker IST2001-34712 Martin McGinnity University of Ulster Neuro-IT, Bonn, June 2004 Page 1 Project Objectives To design and implement an intelligent computational system, drawing inspiration from

More information

Weebit Nano (ASX: WBT) Silicon Oxide ReRAM Technology

Weebit Nano (ASX: WBT) Silicon Oxide ReRAM Technology Weebit Nano (ASX: WBT) Silicon Oxide ReRAM Technology Amir Regev VP R&D Leti Memory Workshop June 2017 1 Disclaimer This presentation contains certain statements that constitute forward-looking statements.

More information

GenNet, 20 Neurons, 150 Clock Ticks 1.2. Output Signal 0.8. Target Output Time

GenNet, 20 Neurons, 150 Clock Ticks 1.2. Output Signal 0.8. Target Output Time TiPo A d Pointer Neural Net Model with Superior Evolvabilities for Implementation in a Second-Generation Brain-Building Machine BM2 Jonathan Dinerstein Sorenson Media, Inc. jon@sorenson.com (435) 792-37

More information

Application Areas of AI Artificial intelligence is divided into different branches which are mentioned below:

Application Areas of AI   Artificial intelligence is divided into different branches which are mentioned below: Week 2 - o Expert Systems o Natural Language Processing (NLP) o Computer Vision o Speech Recognition And Generation o Robotics o Neural Network o Virtual Reality APPLICATION AREAS OF ARTIFICIAL INTELLIGENCE

More information

History and Philosophical Underpinnings

History and Philosophical Underpinnings History and Philosophical Underpinnings Last Class Recap game-theory why normal search won t work minimax algorithm brute-force traversal of game tree for best move alpha-beta pruning how to improve on

More information

Vishnu Nath. Usage of computer vision and humanoid robotics to create autonomous robots. (Ximea Currera RL04C Camera Kit)

Vishnu Nath. Usage of computer vision and humanoid robotics to create autonomous robots. (Ximea Currera RL04C Camera Kit) Vishnu Nath Usage of computer vision and humanoid robotics to create autonomous robots (Ximea Currera RL04C Camera Kit) Acknowledgements Firstly, I would like to thank Ivan Klimkovic of Ximea Corporation,

More information

CSC384 Intro to Artificial Intelligence* *The following slides are based on Fahiem Bacchus course lecture notes.

CSC384 Intro to Artificial Intelligence* *The following slides are based on Fahiem Bacchus course lecture notes. CSC384 Intro to Artificial Intelligence* *The following slides are based on Fahiem Bacchus course lecture notes. Artificial Intelligence A branch of Computer Science. Examines how we can achieve intelligent

More information

The Power of Exponential Thinking

The Power of Exponential Thinking The Power of Exponential Thinking An Introduction to Singularity University 2016 Singularity University What is Singularity University (SU)? We are a global community using exponential technologies to

More information

SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY

SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY Sidhesh Badrinarayan 1, Saurabh Abhale 2 1,2 Department of Information Technology, Pune Institute of Computer Technology, Pune, India ABSTRACT: Gestures

More information

Supercomputers have become critically important tools for driving innovation and discovery

Supercomputers have become critically important tools for driving innovation and discovery David W. Turek Vice President, Technical Computing OpenPOWER IBM Systems Group House Committee on Science, Space and Technology Subcommittee on Energy Supercomputing and American Technology Leadership

More information

Neuro-Fuzzy and Soft Computing: Fuzzy Sets. Chapter 1 of Neuro-Fuzzy and Soft Computing by Jang, Sun and Mizutani

Neuro-Fuzzy and Soft Computing: Fuzzy Sets. Chapter 1 of Neuro-Fuzzy and Soft Computing by Jang, Sun and Mizutani Chapter 1 of Neuro-Fuzzy and Soft Computing by Jang, Sun and Mizutani Outline Introduction Soft Computing (SC) vs. Conventional Artificial Intelligence (AI) Neuro-Fuzzy (NF) and SC Characteristics 2 Introduction

More information

Chapter 1: Introduction to Neuro-Fuzzy (NF) and Soft Computing (SC)

Chapter 1: Introduction to Neuro-Fuzzy (NF) and Soft Computing (SC) Chapter 1: Introduction to Neuro-Fuzzy (NF) and Soft Computing (SC) Introduction (1.1) SC Constituants and Conventional Artificial Intelligence (AI) (1.2) NF and SC Characteristics (1.3) Jyh-Shing Roger

More information

Harnessing the Power of AI: An Easy Start with Lattice s sensai

Harnessing the Power of AI: An Easy Start with Lattice s sensai Harnessing the Power of AI: An Easy Start with Lattice s sensai A Lattice Semiconductor White Paper. January 2019 Artificial intelligence, or AI, is everywhere. It s a revolutionary technology that is

More information

Global Intelligence. Neil Manvar Isaac Zafuta Word Count: 1997 Group p207.

Global Intelligence. Neil Manvar Isaac Zafuta Word Count: 1997 Group p207. Global Intelligence Neil Manvar ndmanvar@ucdavis.edu Isaac Zafuta idzafuta@ucdavis.edu Word Count: 1997 Group p207 November 29, 2011 In George B. Dyson s Darwin Among the Machines: the Evolution of Global

More information

ES 492: SCIENCE IN THE MOVIES

ES 492: SCIENCE IN THE MOVIES UNIVERSITY OF SOUTH ALABAMA ES 492: SCIENCE IN THE MOVIES LECTURE 5: ROBOTICS AND AI PRESENTER: HANNAH BECTON TODAY'S AGENDA 1. Robotics and Real-Time Systems 2. Reacting to the environment around them

More information

Deep Green. System for real-time tracking and playing the board game Reversi. Final Project Submitted by: Nadav Erell

Deep Green. System for real-time tracking and playing the board game Reversi. Final Project Submitted by: Nadav Erell Deep Green System for real-time tracking and playing the board game Reversi Final Project Submitted by: Nadav Erell Introduction to Computational and Biological Vision Department of Computer Science, Ben-Gurion

More information

GPU Computing for Cognitive Robotics

GPU Computing for Cognitive Robotics GPU Computing for Cognitive Robotics Martin Peniak, Davide Marocco, Angelo Cangelosi GPU Technology Conference, San Jose, California, 25 March, 2014 Acknowledgements This study was financed by: EU Integrating

More information

What We Talk About When We Talk About AI

What We Talk About When We Talk About AI MAGAZINE What We Talk About When We Talk About AI ARTIFICIAL INTELLIGENCE TECHNOLOGY 30 OCT 2015 W e have all seen the films, read the comics or been awed by the prophetic books, and from them we think

More information

In 1984, a cell phone in the U.S. cost $3,995 and

In 1984, a cell phone in the U.S. cost $3,995 and In 1984, a cell phone in the U.S. cost $3,995 and weighed 2 pounds. Today s 8GB smartphones cost $199 and weigh as little as 4.6 oz. Technology Commercialization Applied Materials is one of the most important

More information

5G R&D at Huawei: An Insider Look

5G R&D at Huawei: An Insider Look 5G R&D at Huawei: An Insider Look Accelerating the move from theory to engineering practice with MATLAB and Simulink Huawei is the largest networking and telecommunications equipment and services corporation

More information

SpiNNaker. Human Brain Project. and the. Steve Furber. ICL Professor of Computer Engineering The University of Manchester

SpiNNaker. Human Brain Project. and the. Steve Furber. ICL Professor of Computer Engineering The University of Manchester SpiNNaker and the Human Brain Project Steve Furber ICL Professor of Computer Engineering The University of Manchester 1 200 years ago Ada Lovelace, b. 10 Dec. 1815 "I have my hopes, and very distinct ones

More information

ASIC-based Artificial Neural Networks for Size, Weight, and Power Constrained Applications

ASIC-based Artificial Neural Networks for Size, Weight, and Power Constrained Applications ASIC-based Artificial Neural Networks for Size, Weight, and Power Constrained Applications Clare Thiem Senior Electronics Engineer Information Directorate Air Force Research Laboratory Agenda Nano-Enabled

More information

OECD WORK ON ARTIFICIAL INTELLIGENCE

OECD WORK ON ARTIFICIAL INTELLIGENCE OECD Global Parliamentary Network October 10, 2018 OECD WORK ON ARTIFICIAL INTELLIGENCE Karine Perset, Nobu Nishigata, Directorate for Science, Technology and Innovation ai@oecd.org http://oe.cd/ai OECD

More information

Hardware Software Science Co-design in the Human Brain Project

Hardware Software Science Co-design in the Human Brain Project Hardware Software Science Co-design in the Human Brain Project Wouter Klijn 29-11-2016 Pune, India 1 Content The Human Brain Project Hardware - HBP Pilot machines Software - A Neuron - NestMC: NEST Multi

More information

SUPERCHARGED COMPUTING FOR THE DA VINCIS AND EINSTEINS OF OUR TIME

SUPERCHARGED COMPUTING FOR THE DA VINCIS AND EINSTEINS OF OUR TIME SUPERCHARGED COMPUTING FOR THE DA VINCIS AND EINSTEINS OF OUR TIME We pioneered a supercharged form of computing loved by the most demanding computer users in the world scientists, designers, artists,

More information

Computational Intelligence Introduction

Computational Intelligence Introduction Computational Intelligence Introduction Farzaneh Abdollahi Department of Electrical Engineering Amirkabir University of Technology Fall 2011 Farzaneh Abdollahi Neural Networks 1/21 Fuzzy Systems What are

More information

Application of AI Technology to Industrial Revolution

Application of AI Technology to Industrial Revolution Application of AI Technology to Industrial Revolution By Dr. Suchai Thanawastien 1. What is AI? Artificial Intelligence or AI is a branch of computer science that tries to emulate the capabilities of learning,

More information

Perspectives on Neuromorphic Computing

Perspectives on Neuromorphic Computing Perspectives on Neuromorphic Computing Todd Hylton Brain Corporation hylton@braincorporation.com ORNL Neuromorphic Computing Workshop June 29, 2016 Outline Retrospective SyNAPSE Perspective Neuromorphic

More information

Artificial intelligence: powering the deeplearning machines of tomorrow

Artificial intelligence: powering the deeplearning machines of tomorrow White Paper Artificial intelligence: powering the deeplearning machines of Deep learning neural networks demand sophisticated power solutions Abstract Once very much a science fiction dream, artificial

More information

Introduction to Artificial Intelligence

Introduction to Artificial Intelligence Introduction to Artificial Intelligence By Budditha Hettige Sources: Based on An Introduction to Multi-agent Systems by Michael Wooldridge, John Wiley & Sons, 2002 Artificial Intelligence A Modern Approach,

More information

CS 131 Lecture 1: Course introduction

CS 131 Lecture 1: Course introduction CS 131 Lecture 1: Course introduction Olivier Moindrot Department of Computer Science Stanford University Stanford, CA 94305 olivierm@stanford.edu 1 What is computer vision? 1.1 Definition Two definitions

More information

Artificial Intelligence and Robotics Getting More Human

Artificial Intelligence and Robotics Getting More Human Weekly Barometer 25 janvier 2012 Artificial Intelligence and Robotics Getting More Human July 2017 ATONRÂ PARTNERS SA 12, Rue Pierre Fatio 1204 GENEVA SWITZERLAND - Tel: + 41 22 310 15 01 http://www.atonra.ch

More information

Integrate-and-Fire Neuron Circuit and Synaptic Device with Floating Body MOSFETs

Integrate-and-Fire Neuron Circuit and Synaptic Device with Floating Body MOSFETs JOURNAL OF SEMICONDUCTOR TECHNOLOGY AND SCIENCE, VOL.14, NO.6, DECEMBER, 2014 http://dx.doi.org/10.5573/jsts.2014.14.6.755 Integrate-and-Fire Neuron Circuit and Synaptic Device with Floating Body MOSFETs

More information

Demystifying Machine Learning

Demystifying Machine Learning Demystifying Machine Learning By Simon Agius Muscat Software Engineer with RightBrain PyMalta, 19/07/18 http://www.rightbrain.com.mt 0. Talk outline 1. Explain the reasoning behind my talk 2. Defining

More information

Semiconductors: A Strategic U.S. Advantage in the Global Artificial Intelligence Technology Race

Semiconductors: A Strategic U.S. Advantage in the Global Artificial Intelligence Technology Race Semiconductors: A Strategic U.S. Advantage in the Global Artificial Intelligence Technology Race Falan Yinug, Director, Industry Statistics & Economic Policy, Semiconductor Industry Association August

More information

INTEL INNOVATION GENERATION

INTEL INNOVATION GENERATION INTEL INNOVATION GENERATION Overview Intel was founded by inventors, and the company s continued existence depends on innovation. We recognize that the health of local economies including those where our

More information

What we are expecting from this presentation:

What we are expecting from this presentation: What we are expecting from this presentation: A We want to inform you on the most important highlights from this topic D We exhort you to share with us a constructive feedback for further improvements

More information

2017 Technology, Media and Telecommunications Predictions Middle East edition

2017 Technology, Media and Telecommunications Predictions Middle East edition 2017 Technology, Media and Telecommunications Predictions Middle East edition Foreword Welcome to the 2017 edition of Deloitte s Predictions for the technology, media and telecommunications (TMT) sectors.

More information

FROM BRAIN RESEARCH TO FUTURE TECHNOLOGIES. Dirk Pleiter Post-H2020 Vision for HPC Workshop, Frankfurt

FROM BRAIN RESEARCH TO FUTURE TECHNOLOGIES. Dirk Pleiter Post-H2020 Vision for HPC Workshop, Frankfurt FROM BRAIN RESEARCH TO FUTURE TECHNOLOGIES Dirk Pleiter Post-H2020 Vision for HPC Workshop, Frankfurt Science Challenge and Benefits Whole brain cm scale Understanding the human brain Understand the organisation

More information

Integrate-and-Fire Neuron Circuit and Synaptic Device using Floating Body MOSFET with Spike Timing- Dependent Plasticity

Integrate-and-Fire Neuron Circuit and Synaptic Device using Floating Body MOSFET with Spike Timing- Dependent Plasticity JOURNAL OF SEMICONDUCTOR TECHNOLOGY AND SCIENCE, VOL.15, NO.6, DECEMBER, 2015 ISSN(Print) 1598-1657 http://dx.doi.org/10.5573/jsts.2015.15.6.658 ISSN(Online) 2233-4866 Integrate-and-Fire Neuron Circuit

More information

INTRODUCTION TO DEEP LEARNING. Steve Tjoa June 2013

INTRODUCTION TO DEEP LEARNING. Steve Tjoa June 2013 INTRODUCTION TO DEEP LEARNING Steve Tjoa kiemyang@gmail.com June 2013 Acknowledgements http://ufldl.stanford.edu/wiki/index.php/ UFLDL_Tutorial http://youtu.be/ayzoubkuf3m http://youtu.be/zmnoatzigik 2

More information

2. The Crypto Story So Far

2. The Crypto Story So Far 0 Contents 1. Abstract 2. The crypto story so far 2.1. The problem 3. Fornix Our purpose 4. The Fornix Solution 4.1. Master-nodes 4.2. Proof-of-Stake System 5. Use Cases 6. Coin Details 7. Project Roadmap

More information

TRUSTING THE MIND OF A MACHINE

TRUSTING THE MIND OF A MACHINE TRUSTING THE MIND OF A MACHINE AUTHORS Chris DeBrusk, Partner Ege Gürdeniz, Principal Shriram Santhanam, Partner Til Schuermann, Partner INTRODUCTION If you can t explain it simply, you don t understand

More information

Static Power and the Importance of Realistic Junction Temperature Analysis

Static Power and the Importance of Realistic Junction Temperature Analysis White Paper: Virtex-4 Family R WP221 (v1.0) March 23, 2005 Static Power and the Importance of Realistic Junction Temperature Analysis By: Matt Klein Total power consumption of a board or system is important;

More information

ARTIFICIAL INTELLIGENCE

ARTIFICIAL INTELLIGENCE ARTIFICIAL INTELLIGENCE AN INTRODUCTION Artificial Intelligence 2012 Lecture 01 Delivered By Zahid Iqbal 1 Course Logistics Course Description This course will introduce the basics of Artificial Intelligence(AI),

More information

Swarm Intelligence W7: Application of Machine- Learning Techniques to Automatic Control Design and Optimization

Swarm Intelligence W7: Application of Machine- Learning Techniques to Automatic Control Design and Optimization Swarm Intelligence W7: Application of Machine- Learning Techniques to Automatic Control Design and Optimization Learning to avoid obstacles Outline Problem encoding using GA and ANN Floreano and Mondada

More information

MENA-ECA-APAC NETWORK MEETINGS, 2017

MENA-ECA-APAC NETWORK MEETINGS, 2017 MENA-ECA-APAC NETWORK MEETINGS, 2017 INNOVATION AND DISRUPTIVE TECHNOLOGY Sleem Hasan, Founder and CEO, Privity November 15, 2017 "Technology is the ONLY discipline I have identified that has the ability

More information

GPU ACCELERATED DEEP LEARNING WITH CUDNN

GPU ACCELERATED DEEP LEARNING WITH CUDNN GPU ACCELERATED DEEP LEARNING WITH CUDNN Larry Brown Ph.D. March 2015 AGENDA 1 Introducing cudnn and GPUs 2 Deep Learning Context 3 cudnn V2 4 Using cudnn 2 Introducing cudnn and GPUs 3 HOW GPU ACCELERATION

More information

The Impact of Artificial Intelligence. By: Steven Williamson

The Impact of Artificial Intelligence. By: Steven Williamson The Impact of Artificial Intelligence By: Steven Williamson WHAT IS ARTIFICIAL INTELLIGENCE? It is an area of computer science that deals with advanced and complex technologies that have the ability perform

More information

Technology trends in the digitalization era. ANSYS Innovation Conference Bologna, Italy June 13, 2018 Michele Frascaroli Technical Director, CRIT Srl

Technology trends in the digitalization era. ANSYS Innovation Conference Bologna, Italy June 13, 2018 Michele Frascaroli Technical Director, CRIT Srl Technology trends in the digitalization era ANSYS Innovation Conference Bologna, Italy June 13, 2018 Michele Frascaroli Technical Director, CRIT Srl Summary About CRIT Top Trends for Emerging Technologies

More information

Introduction to Neuromorphic Computing Insights and Challenges. Todd Hylton Brain Corporation

Introduction to Neuromorphic Computing Insights and Challenges. Todd Hylton Brain Corporation Introduction to Neuromorphic Computing Insights and Challenges Todd Hylton Brain Corporation hylton@braincorporation.com Outline What is a neuromorphic computer? Why is neuromorphic computing confusing?

More information

Smarter oil and gas exploration with IBM

Smarter oil and gas exploration with IBM IBM Sales and Distribution Oil and Gas Smarter oil and gas exploration with IBM 2 Smarter oil and gas exploration with IBM IBM can offer a combination of hardware, software, consulting and research services

More information

Years 9 and 10 standard elaborations Australian Curriculum: Digital Technologies

Years 9 and 10 standard elaborations Australian Curriculum: Digital Technologies Purpose The standard elaborations (SEs) provide additional clarity when using the Australian Curriculum achievement standard to make judgments on a five-point scale. They can be used as a tool for: making

More information

The Evolution of Artificial Intelligence in Workplaces

The Evolution of Artificial Intelligence in Workplaces The Evolution of Artificial Intelligence in Workplaces Cognitive Hubs for Future Workplaces In the last decade, workplaces have started to evolve towards digitalization. In the future, people will work

More information

For personal use only

For personal use only 23 April 2015 NORTH AMERICA, ASIA AND AUSTRALIA ROADSHOW PRESENTATION Please find attached a presentation for the Company s upcoming roadshow through North America, Asia and Australia relating to the proposed

More information

UTILIZATION OF ROBOTICS AS CONTEMPORARY TECHNOLOGY AND AN EFFECTIVE TOOL IN TEACHING COMPUTER PROGRAMMING

UTILIZATION OF ROBOTICS AS CONTEMPORARY TECHNOLOGY AND AN EFFECTIVE TOOL IN TEACHING COMPUTER PROGRAMMING UTILIZATION OF ROBOTICS AS CONTEMPORARY TECHNOLOGY AND AN EFFECTIVE TOOL IN TEACHING COMPUTER PROGRAMMING Aaron R. Rababaah* 1, Ahmad A. Rabaa i 2 1 arababaah@auk.edu.kw 2 arabaai@auk.edu.kw Abstract Traditional

More information

ENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS

ENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS BY SERAFIN BENTO MASTER OF SCIENCE in INFORMATION SYSTEMS Edmonton, Alberta September, 2015 ABSTRACT The popularity of software agents demands for more comprehensive HAI design processes. The outcome of

More information

John Lazzaro and John Wawrzynek Computer Science Division UC Berkeley Berkeley, CA, 94720

John Lazzaro and John Wawrzynek Computer Science Division UC Berkeley Berkeley, CA, 94720 LOW-POWER SILICON NEURONS, AXONS, AND SYNAPSES John Lazzaro and John Wawrzynek Computer Science Division UC Berkeley Berkeley, CA, 94720 Power consumption is the dominant design issue for battery-powered

More information

EXPLORING THE EVALUATION OF CREATIVE COMPUTING WITH PIXI

EXPLORING THE EVALUATION OF CREATIVE COMPUTING WITH PIXI EXPLORING THE EVALUATION OF CREATIVE COMPUTING WITH PIXI A Thesis Presented to The Academic Faculty by Justin Le In Partial Fulfillment of the Requirements for the Degree Computer Science in the College

More information

MA/CS 109 Computer Science Lectures. Wayne Snyder Computer Science Department Boston University

MA/CS 109 Computer Science Lectures. Wayne Snyder Computer Science Department Boston University MA/CS 109 Lectures Wayne Snyder Department Boston University Today Artiificial Intelligence: Pro and Con Friday 12/9 AI Pro and Con continued The future of AI Artificial Intelligence Artificial Intelligence

More information

Appendices master s degree programme Artificial Intelligence

Appendices master s degree programme Artificial Intelligence Appendices master s degree programme Artificial Intelligence 2015-2016 Appendix I Teaching outcomes of the degree programme (art. 1.3) 1. The master demonstrates knowledge, understanding and the ability

More information

AI 101: An Opinionated Computer Scientist s View. Ed Felten

AI 101: An Opinionated Computer Scientist s View. Ed Felten AI 101: An Opinionated Computer Scientist s View Ed Felten Robert E. Kahn Professor of Computer Science and Public Affairs Director, Center for Information Technology Policy Princeton University A Brief

More information

A.I in Automotive? Why and When.

A.I in Automotive? Why and When. A.I in Automotive? Why and When. AGENDA 01 02 03 04 Definitions A.I? A.I in automotive Now? Next big A.I breakthrough in Automotive 01 DEFINITIONS DEFINITIONS Artificial Intelligence Artificial Intelligence:

More information

Chapter 6: DSP And Its Impact On Technology. Book: Processor Design Systems On Chip. By Jari Nurmi

Chapter 6: DSP And Its Impact On Technology. Book: Processor Design Systems On Chip. By Jari Nurmi Chapter 6: DSP And Its Impact On Technology Book: Processor Design Systems On Chip Computing For ASICs And FPGAs By Jari Nurmi Slides Prepared by: Omer Anjum Introduction The early beginning g of DSP DSP

More information

Real- Time Computer Vision and Robotics Using Analog VLSI Circuits

Real- Time Computer Vision and Robotics Using Analog VLSI Circuits 750 Koch, Bair, Harris, Horiuchi, Hsu and Luo Real- Time Computer Vision and Robotics Using Analog VLSI Circuits Christof Koch Wyeth Bair John. Harris Timothy Horiuchi Andrew Hsu Jin Luo Computation and

More information

Analog Circuit for Motion Detection Applied to Target Tracking System
