Implementing Physical Capabilities for an Existing Chatbot by Using a Repurposed Animatronic to Synchronize Motor Positioning with Speech


Alex Johnson, Tyler Roush, Mitchell Fulton, Anthony Reese
Kent State University, Computer Science Department, OH, United States

Abstract: A chatbot is a type of computer program that can conduct conversations through auditory or textual means. Such programs have recently shifted from rule-based logic to more advanced artificial intelligence for carrying on conversations [1]. Our project examined implementing physical capabilities for an existing chatbot by using a repurposed animatronic controlled by an Arduino microcontroller. It sought to synchronize motor positioning, including eye and jaw movement, with speech output generated from text obtained through intelligent web APIs.

Keywords: Chatbot, Robotics, Animatronic, Arduino, Speech Synthesis, Cleverbot

I. INTRODUCTION

Human-Robot Interaction (HRI) is an area of computer science that has seen much recent interest and many applications, especially in social robotics [7], [8]. Emotion recognition, particularly from facial expressions, plays a central role in this area [12], [14], and newer methods can further improve its accuracy and performance [9], [10], [11]. Many techniques are available for improving accuracy in the learning stage [15], [16], [17], [18], and traits such as facial-feature data can be used to make interaction with humans more natural [18], [19], [20].

Using the physical components and chassis of a WowWee Robotics Alive Elvis animatronic, along with several web APIs and .NET libraries, our group created a robot capable of holding a conversation with a human user through verbal and physical communication.
The animatronic's servo motors were attached to an Arduino microcontroller and driven by compiled C++ code running on the Arduino together with an external program written in C#. The external program runs on an attached computer and handles speech recognition, taking audio input from the user and using the result to call multiple web APIs, such as Cleverbot and Watson, to determine an intelligent response. Commands for precise motor movement based on the response are then issued to the Arduino over a serial port and synchronized with the speech output. While the end result of this project is still a prototype, our robot handles the basic attributes of imitating human conversation. In the next section we explain the algorithm, code, and circuits in detail; Section III presents the flowchart; Section IV describes the client program; and the last section concludes the paper.

WWW.IJASCSE.ORG

II. ARDUINO CODE

While speech processing is handled externally, motor control is handled directly by the Arduino. The Arduino interfaces with external devices through its integrated serial port, which sends data over the channel one bit at a time at a set baud rate. Bytecodes are sent from the external C# program to the Arduino, where they are interpreted into specific commands for the servo motors. Sending compact bytecodes that are resolved into commands makes serial communication far faster than sending full command strings.

[Table 1. The wire connectivity for each facial function (not reproduced in this transcription).]

The Arduino is wired to several servo motors, though with a larger board and shield more of the motors in the robot's head would be accessible. A servo motor is a type of actuator that allows precise positioning; it contains a coupled sensor that provides position feedback to its controller [6]. Servo motors are therefore extremely useful in robotics. As the motors in this project are used to mimic facial features, certain parts, such as the jaw, have a relatively low degree range. The amount of power applied to a motor depends on both the distance and the speed at which it needs to travel.

[Fig. 1. The motor and Arduino connectivity (not reproduced in this transcription).]

III. FLOWCHART

Figure 2 shows the flowchart of the algorithm.

[Fig. 2. The flowchart of the algorithm (not reproduced in this transcription).]

First, the system captures audio from the user and converts it to text, then sends the query to Watson and Cleverbot. Watson and Cleverbot generate their responses, and the commands derived from those responses are sent to the Arduino, which moves the motors that animate the facial parts while the client produces the spoken reply.

IV. CLIENT PROGRAM

The external program, which runs on either a connected laptop or a single-board computer such as a Raspberry Pi, executes a loop that controls the conversation between the human and the robot. The loop begins by using functions from a speech recognition library with a detected audio input device to translate spoken words into text, which can be processed more easily. What the user said is thus resolved from audio to text using C#'s built-in libraries. The text is then used for several concurrent network calls, one of which is to the Cleverbot API. The call to the cloud-based Cleverbot API runs in a background worker thread and determines an appropriate response.

A. Cleverbot

Cleverbot is a web application that uses artificial intelligence to converse with humans based on past interactions stored in an enormous database. Unlike most chatbots, it learns from human input in past conversations and typically hosts over 80,000 conversations at any given time [2]. Our program sends a string to the API, which processes it and returns a response. From our program's perspective, using Cleverbot is quite simple: all that is needed is an API user account and key.

B. Watson

Concurrently, in another thread, a call is made to the Watson API, which attempts to perform tone analysis on the text to determine the user's mood; this in turn helps determine the robot's nonverbal response. Watson is a computer system developed by IBM that is capable of answering questions posed in natural language.
One specific functionality offered is the ability to determine the mood and tone of a conversation [3]. This function, the Watson Tone Analyzer service, can detect three types of tones in a given text: emotions, social tendencies, and writing style. The emotions and social tendencies stem from the Big Five personality traits commonly used by psychologists. Although the API is not entirely free, a developer API key allows a

programmer to demo the functionality, which is sufficient for our project.

Table 2. Byte code examples for motor commands

  Part  Speed  X Pos  Y Pos  Effect
  0     255    0      125    Open the jaw halfway at the fastest speed
  0     55     0      0      Slowly close the jaw
  1     255    100    150    Move the eyes to look at coordinate (100, 150)
  2     -      -      -      Blink the eyes

Once the results have returned from the web, a suitable response is determined and a bytecode is sent over serial to the Arduino. In the event of networking or server-side issues with the APIs, generic values are passed instead so the conversation can continue. As shown in the table, bytecodes for this project consist of four bytes: one for the part, one for the speed, and two for the desired coordinates. By sending compact bytecodes to the Arduino rather than full command strings, much time is saved when reading the input. After the Serial.readBytes() function has run, which reads the characters from the serial port into a buffer, a call to analogRead() takes only around 100 microseconds, and so can reach a rate of nearly 10,000 reads per second [4]. Though this project uses only a handful of motors, the approach taken in the code could easily be scaled to many more.

C. .NET Libraries

Our external, or client, program, which runs on a laptop or a Raspberry Pi running Windows separate from the Arduino, is a Windows GUI application, also known as a Windows Forms (WinForms) application. Windows Forms is an event-driven GUI class library included in Microsoft's .NET Framework [5]. The client program additionally uses several of C#'s built-in libraries, including Speech Recognition and Speech Synthesis. These allow the program to communicate with the user in real time as it interprets speech to text and later outputs spoken text.
The provided user interface lets the user open and close serial communication between the client device and the Arduino, and additionally offers the ability to command motors manually and to start the API routines.

D. Syllable Count

Another key element of the program is a function that calculates the number of syllables in a given word. This is required to properly synchronize the speech output with the motors: depending on word length and syllable count, different commands are sent to the motors to adjust accordingly. To determine the syllables of a word, simply counting the vowels often provides an accurate answer, but cases such as silent endings and double vowels must be accounted for. The program does this for each word in a given sentence while mapping the output to mouth movements.

[Listing: code for sending mouth-movement bytecodes to the Arduino from the Windows Forms application while synchronizing with the speech synthesizer output (not reproduced in this transcription).]

V. CONCLUSIONS

This project involved several different aspects that proved difficult during implementation. As our group's robot focused on human interaction,

efficiency in programming was necessary, as quick responses were needed to maintain a conversation. Across the different API calls, concurrency was required because remote calls over the internet acted as a performance bottleneck. Another challenge we faced was controlling the chassis's many servo motors. Servo motors are closed-loop servomechanisms controlled by an electric signal that determines the position to move to. They can typically turn a total of 180 degrees, 90 in each direction. Since a bytecode field sent to the Arduino can only hold a value from 0 to 255, exact positioning requires scaling the passed value. In one instance, due to an issue with determining the value of the bytecode, the robot's jaw became stuck attempting to open to an angle it could not reach. Regardless of the difficulties of implementation, our group was able to successfully repurpose the animatronic into a chatbot. Although the hardware used was relatively simple, the approach taken in handling event-driven motor coordination could easily be scaled to much more complex and precise hardware.

REFERENCES

[1] J. C. Wong, "What is a chat bot, and should I be using one?" The Guardian, 2016. [Online]. Available: https://www.theguardian.com/technology/2016/apr/06/what-is-chatbot-kik-bot-shop-messaging-platform.
[2] "Creating a bot instance," Cleverbot.io. [Online]. Available: https://docs.cleverbot.io/docs.
[3] "Tone Analyzer service documentation," IBM.com. [Online]. Available: https://www.ibm.com/watson/developercloud/doc/toneanalyzer/.
[4] Arduino, "analogRead()," 2016. [Online]. Available: https://www.arduino.cc/en/reference/analogread.
[5] Microsoft, "How to: Create a Windows Forms application," 2016. [Online]. Available: https://msdn.microsoft.com/en-us/library/ms235634(v=vs.80).aspx.
[6] K. Ross, "What's a servo: A quick tutorial." [Online]. Available: http://www.seattlerobotics.org/guide/servos.html.
[7] M. Ghayoumi, A. K. Bansal, "Real Emotion Recognition by Detecting Symmetry Patterns with Dihedral Group," MCSI, Greece, 2016.
[8] M. Ghayoumi et al., "Follower Robot with an Optimized Gesture Recognition System," RSS, USA, 2016.
[9] M. Ghayoumi, A. K. Bansal, "Architecture of Emotion in Robots Using Convolutional Neural Networks," RSS, USA, 2016.
[10] M. Ghayoumi, A. K. Bansal, "Emotion in Robots Using Convolutional Neural Networks," ICSR 2016, USA, 2016.
[11] M. Ghayoumi, A. K. Bansal, "Multimodal Convolutional Neural Networks Model for Emotion in Robots," FTC, USA, 2016.
[12] M. Ghayoumi, A. Bansal, "Unifying Geometric Features and Facial Action Units for Improved Performance of Facial Expression Analysis," CSSCC 2015.
[13] M. Ghayoumi, "A Review of Multimodal Biometric Systems: Fusion Methods and Its Applications," ICIS, USA, 2015.
[14] M. Ghayoumi, A. Bansal, "An Integrated Approach for Efficient Analysis of Facial Expressions," SIGMAP 2014.
[15] M. Ghayoumi, "A Quick Review of Deep Learning in Facial Expression," Journal of Communication and Computer, 2017.
[16] M. Ghayoumi, A. K. Bansal, "Emotion Analysis Using Facial Key Points and Dihedral Group," International Journal of Advanced Studies in Computer Science and Engineering (IJASCSE), 2017.
[17] M. Ghayoumi, M. Tafar, A. K. Bansal, "A Formal Approach for Multimodal Integration to Derive Emotions," Journal of Visual Languages and Sentient Systems, 2016.
[18] T. Zee, M. Ghayoumi, "Comparative Graph Model for Facial Recognition Systems," CSCI, Las Vegas, USA, 2016.
[19] M. Ghayoumi, M. Tafar, A. K. Bansal, "Towards Formal Multimodal Analysis of Emotions for Affective Computing," DMS, Italy, 2016.
[20] H. Abrishami Moghaddam, M. Ghayoumi, "Facial Image Feature Extraction Using Support Vector Machines," Proc. VISAPP 2006, Setubal, Portugal.