Application of AI Technology to Industrial Revolution
By Dr. Suchai Thanawastien

1. What is AI?

Artificial Intelligence (AI) is a branch of computer science that tries to emulate the human capabilities of learning, comprehension, problem solving, collaboration, and innovation. Historically, the term Artificial Intelligence was coined in 1956 by Dartmouth Assistant Professor John McCarthy. AI is a general term that refers to hardware or software that exhibits behavior which appears intelligent. In the words of Professor McCarthy, it is the science and engineering of making intelligent machines, especially intelligent computer programs. This technology can extend human capability, allowing people and society to achieve more.

AI technology has existed for decades, via rules-based programs or Expert Systems that deliver rudimentary intelligence in specific contexts. The progress of computer science brought us the Neural Network, which can simulate some of the basic mechanics of how humans process data to solve a problem. This technology is the foundation of machine learning and deep learning, which are now practical AI technologies that can produce tangible results in many key industries and businesses.

In a nutshell, current AI technologies include:
- Expert systems
- Intelligent agents
- Machine learning
- Deep learning
- Natural language processing
- Chatbots
- Computer vision
- Autonomous cars
- Voice recognition
- Surveillance
- Big data analytics

In this dawn of AI technologies, any business area that integrates AI capabilities will gain adaptability in dealing with abnormal or unforeseen scenarios. With the flourishing of IoT devices, AI is the only technology that can handle and process extreme volumes of data to uncover patterns and new insights for managing the modern enterprise and the intelligent industrial complex. Examples of AI applications in several domains follow.
1. Problem Solving: Legal assessment, financial asset management, financial application processing, games.
2. Knowledge Creation: Medical diagnosis, e-commerce product recommendation, purchase prediction, stock trading, fraud prevention.
3. Goal Planning: Logistics, scheduling, navigation, chess playing, predictive maintenance, demand forecasting, inventory management.
4. Natural Language Processing: Voice control, chatbots, real-time language translation, real-time transcription.
5. Deep Context-Aware Computing: Autonomous vehicles, surveillance, judiciary decisions.
2. Application of AI Technology to Industrial Revolution

Industrial evolution is all about applying multiple technologies to create new opportunities and potential. In this decade, the industry transformation is branded as Industry 4.0, or the 4th industrial revolution. To put this in context, we will briefly summarize the previous industrial revolutions.

The First Industrial Revolution (Industry 1.0) began in Great Britain around 1760. It involved mechanization with water and steam power. The Second Industrial Revolution (Industry 2.0) began around 1870 in Germany, the United States, and Great Britain. It was characterized by machines powered by electricity, and by telephones. The Third Industrial Revolution (Industry 3.0) began in the 1950s, powered by the emergence of computers, with electronic control systems arriving in the 1970s and the internet later. The progress involved automated production lines and digital communications.

The key to a successful 4th industrial revolution is the Internet of Things (IoT), Big Data processing by AI, and the creation of Cyber-Physical Systems for smart manufacturing that rely on IoT, Big Data, and AI processing. The manufacturing processes of Industry 4.0 generate huge volumes of real-time data; machine learning is used to improve manufacturing scheduling by analyzing the data recorded about a manufacturing process to find inefficiencies and suggest improvements. Robotics is used extensively by factories of all sizes, with AI assuming the roles of operators, supervisors, and quality assurance officers; hence the coming of the intelligent manufacturing robot.

The significance of moving toward complete factory automation using Artificial Intelligence for Industry 4.0 can be seen from the agenda laid out at the G20 Conference by Prof. Wolfgang Wahlster, DFKI, Germany (Chair), et al.
They reached a consensus whereby the G20 countries should support coordinated research, development and deployment activities on AI for the fourth industrial revolution, in particular in the following nine priority areas:
1. Hybrid Teams of Human Workers and Collaborative Robots in Smart Factories.
2. Deep Learning for State-based and Predictive Maintenance of Networked Production Machines and for Understanding Human Behaviors of Shop Floor Workers.
3. Semantic Technologies for Worldwide Interoperability of Machine-to-Machine Communication in Smart Factories and Logistics.
4. Human-Aware and Real-Time Production Planning & Scheduling for Multi-agent Systems and Dynamic Plan Revision.
5. Intelligent Industrial Assistance Systems for Human Workers: Proactive and Situation-Aware On-line Help and Training on the Shop Floor.
6. Trusted Industrial Data Exchange Hubs and Machine Learning for Industrial Process Mining.
7. Active Digital Product Memories and Digital Twins for Intelligent Asset Tracking and Production Cockpits.
8. Security Technologies for Intelligent Intrusion Detection and Penetration Testing for Smart Factories.
9. Long-Term Autonomy and Self-Learning as well as Self-Healing Capabilities of Industrial Components.

They also outlined the agenda for international collaboration among the G20 members that can boost the application of AI in manufacturing:
1. There is an urgent need for international collaboration on open standards for AI in Industrie 4.0.
2. An AI on-demand platform and a large-scale AI infrastructure that offers open specifications and example implementations of basic AI components for Industrie 4.0, based on top-notch, cloud-based computing and data services, should be supported to provide a framework for the fast adoption of AI technology, also for SMEs in the production sector.
3. Reference Models, Semantic Representation Languages, and Simulation Platforms must integrate the latest AI developments to ensure long-term impact.
4. Open and Secure Data Exchange of Production Data should be supported so that advanced Machine Learning can be applied to these training data sets in order to reach a new level of productivity, efficiency and quality in manufacturing.
5. A consensus on the social, legal, ethical and privacy implications of AI technology in manufacturing will help to increase acceptance and early adoption.
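The predictive maintenance of networked production machines named in priority area 2 can be illustrated with a deliberately simplified sketch. The sensor values, function names, and 3-sigma threshold below are invented for illustration; a real Industry 4.0 system would apply learned deep-learning models to live machine data rather than a fixed statistical rule.

```python
import statistics

def train_baseline(readings):
    """Learn a normal-operation baseline (mean, spread) from historical sensor readings."""
    return statistics.mean(readings), statistics.pstdev(readings)

def flag_anomalies(readings, mean, stdev, k=3.0):
    """Flag readings more than k standard deviations from the baseline."""
    return [i for i, r in enumerate(readings) if abs(r - mean) > k * stdev]

# Hypothetical vibration readings from a production machine under normal operation.
normal = [10.0, 10.2, 9.9, 10.1, 9.8, 10.0, 10.3, 9.7]
mean, stdev = train_baseline(normal)

# Live readings: the third one deviates sharply, suggesting wear before failure.
live = [10.1, 9.9, 14.5, 10.0]
print(flag_anomalies(live, mean, stdev))  # indices of suspect readings
```

The point of the sketch is the workflow, not the statistics: learn what "normal" looks like from recorded data, then flag departures early enough to schedule maintenance before breakdown.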
3. AI's Self-Learning, Deep Learning and Big Data

AI exhibits human-like capabilities: reasoning to solve problems; understanding, gathering and representing knowledge; planning and execution to achieve goals; communication in machine, coded or natural language; perceiving the world around it; and learning and improving itself. Basically, AI's machine learning and deep learning capabilities can be extended to provide self-learning capability.

I) Self-Learning

Current attempts at developing self-learning AI programs can be seen in the following projects. A team of researchers from Belgium used an AI algorithm called reservoir computing, combined with another algorithm called backpropagation, to develop a neuro-inspired analog computer that can train itself and improve at whatever task it is performing.

At Google, the next generation of machine learning, AutoML, can self-update and will be capable of creating custom programs to solve unforeseen problems. Google's AutoML system recently produced a series of machine-learning models more efficient than those made by the researchers themselves: AutoML's software creates self-learning code and runs simulations to find areas of improvement, working in a plan-do-check-act loop until the goal is reached.

Also, Intel has announced a neuromorphic artificial intelligence (AI) test chip named Loihi, which it said is aimed at mimicking brain functions by learning from data gained from its environment. The effort has resulted in a self-learning, energy-efficient AI chip that uses asynchronous spiking to take inferences from its environment and become constantly smarter.

II) Machine Learning (ML)

ML is a subset of AI: all machine learning is AI, but not all AI is machine learning. ML is basically a set of algorithms. Machine learning algorithms learn through training.
Therefore, the quality of their predictions improves with experience: the more input data, the better the results. Machine learning uses neural networks and/or other algorithmic models, customized for a specific problem context too complex for humans to solve, to find solutions. The goal of most machine learning is to develop a prediction engine for a particular use case. There are more than 15 algorithmic models for ML, such as Random Forests, Bayesian Networks, and Support Vector Machines. In practical implementations, the designer usually has to experiment with several algorithms, or use a set of algorithms, to find the best solution to the problem.

III) Deep Learning

In identifying objects, we would need a separate AI program for every type of object we want to identify. Even with general machine learning (random forests, Bayesian networks, support vector machines and more) it is difficult to write programs that perform this type of task. Deep learning (DL) is a subset of machine learning in which feature specification and optimization are handled by the model itself. Deep learning deploys multiple layers of neural networks; the depth of processing through these layers gives the technique its name. Usually there is an input layer, multiple hidden layers, and an output layer that provides the results. Some of the more notable examples of using Deep Learning include:
- Navigation of self-driving cars: cars are learning to recognize obstacles and react to them appropriately.
- Precision medicine: designing genetically tailored medicines.

IV) Big Data

ML and DL benefit tremendously from Big Data, which drives the AI algorithms to maturity in specific application domains. With access to large volumes of data, businesses can now find patterns for business insight or for detecting abnormality. The trend is toward big data analytics that better support AI processing: cloud implementations of big data warehouses, more innovative big data analytics, and data cleansing tools that themselves use ML or DL.
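The layered structure described above (an input layer, hidden layers, and an output layer) can be sketched as a minimal forward pass through a fully connected network. The layer sizes, weights, and biases below are invented for illustration; in a real deep-learning system they would be learned through training, not written by hand.

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer: weighted sum of inputs plus bias, tanh activation."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(x, network):
    """Pass an input vector through every layer in turn; the last output is the result."""
    for weights, biases in network:
        x = layer(x, weights, biases)
    return x

# Hypothetical tiny network: 2 inputs -> 3 hidden units -> 1 output.
network = [
    ([[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]], [0.0, 0.1, -0.1]),  # hidden layer
    ([[1.0, -1.0, 0.5]], [0.2]),                                  # output layer
]
print(forward([1.0, 2.0], network))
```

A "deep" network simply stacks more such hidden layers; training adjusts the weights so that the final layer's output matches the desired prediction.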
4. FINTECH Applications of AI

CB Insights reported in July 2016 that 41 startups may be introducing AI to Fintech. As many as 658 AI deals were closed, with startup funding of $5021 B USD, in 2016. Japan Exchange Regulation and Tokyo Stock Exchange, Inc. announced in February 2017 that AI will be developed for market surveillance using technologies from NEC Corp. and Hitachi, Ltd. The following provides a brief overview of AI applications in the financial industry.

Stock Market

The big data stream of stock trading provides a great opportunity to create market surveillance systems, which already use algorithmic rules and deep learning to analyze and discover new patterns in the data for meaningful human interpretation. Currently, a branch of AI processing called cognitive computing is used to process large volumes of unstructured data in conjunction with structured trade data, as revealed by Nasdaq's SMARTS market surveillance system. The next step would be to apply full AI processing, not just to discover new patterns but to interpret them. In this area, the application of AI is still at the infancy stage. Trading activities that can be processed efficiently by AI include:
- Unstructured data processing.
- Trader communications such as email, memos, speech, collusive communications, and trader-to-financial-entity relationships.
- Trader profiling.
- Predictive evaluation.
- Trading pattern analysis.
- Clusters of traders' trading patterns.
- Rule-based alerts with alert scoring.
- Applying context-awareness to AI to better interpret an alert.
- Illegal trade detection.
- Fraud detection.

Hedge Fund (Bridgewater Associates LP)
Uses deep learning to analyze large-scale patterns in data from all sources, including social media, to make market predictions that reduce risk and increase return.
Develop New Investment Strategies (Sentient Technologies)
Uses distributed AI to process huge volumes of investment data continuously, developing new and unique investment strategies for the global financial market.

Autonomous Software (HK-based Aidyia)
Aims to develop AI software that can work independently without human interaction.

Customer Service, Pre-Screening, Advisement
Creating human-like chatbots.
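The rule-based alerting with alert scoring mentioned in the surveillance list above can be sketched as follows. The rules, weights, and threshold are entirely hypothetical; production systems such as Nasdaq's SMARTS combine far richer features, and the next step described in the text would layer learned models on top of such rules.

```python
# Hypothetical surveillance rules: (name, predicate over a trade record, weight).
RULES = [
    ("large_order",  lambda t: t["size"] > 10_000,       2.0),
    ("off_hours",    lambda t: not 9 <= t["hour"] < 17,  1.5),
    ("rapid_cancel", lambda t: t["cancel_ratio"] > 0.8,  3.0),
]

def score_trade(trade, threshold=4.0):
    """Sum the weights of all triggered rules; raise an alert when the score crosses the threshold."""
    score = sum(w for _, rule, w in RULES if rule(trade))
    return score, score >= threshold

# A suspicious trade trips all three rules and crosses the alert threshold.
trade = {"size": 15_000, "hour": 22, "cancel_ratio": 0.9}
print(score_trade(trade))
```

Scoring rather than hard-triggering lets a human analyst rank alerts by severity, which is where the context-aware AI interpretation mentioned above would take over.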