Concept of ELFi Educational program. Android + LEGO

ELFi Robotics, 2015

Authors: Oleksiy Drobnych, PhD, Java Coach, Assistant Professor at Uzhhorod National University, CTO at ELFi Robotics. Mark Drobnych, Google Science Fair 2014 Global Finalist, Mindstorms robot expert, high school student. Myroslava Drobnych, consultant in child psychology, Senior Lecturer at Uzhhorod National University.

Aim: to explain in a simple way how autonomous intelligent robots work.

Audience: students and enthusiasts aged 14+.

Equipment prerequisites: access to a LEGO EV3 Home or Core set and a PC running Windows, Linux, or Mac, plus an Android 4+ phone or tablet.

Expected knowledge: understanding of NXT-G or EV3-G graphical programming and basic knowledge of mechanical patterns for LEGO Technic parts.

Program description: this program is intended for STEM education groups, FLL teams, and individuals who learn robotics with the popular LEGO EV3 sets. It opens the door to professional software development, software architecture, and the fundamental concepts of building intelligent autonomous robots for beginners, especially those who already have some experience with simple mechanics and graphical programming of LEGO EV3 controllers.

Introduction. Problem: there is a wide gap between school robotics and professional robotics. Thick books on autonomous robots, natural language understanding, and artificial intelligence look like a huge barrier to kids and students interested in robotics. Many kids are good at mechanics, controller code, and FLL and WRO contests, but remain isolated from the thriving world of real talking and thinking robots. Solution: provide a simplified application programming interface and simple examples of voice recognition, natural language understanding, and face and visual object detection built on the famous LEGO MINDSTORMS sets. This simplicity is now possible thanks to new libraries and cloud services developed by numerous startups, including ELFi Robotics. In short, we propose to add a robotic brain to EV3-based LEGO models so that kids can start talking to their robots. After this first step we'll show how to teach a personal robot and how to program its brain. Result: after testing this program with a small group of high school students, we observed a growing interest in programming, machine learning, and in generating ideas for new applications of thinking, talking, and observing robots.

The concept of layered architecture. The layered architecture of the ELFi Brain cleanly separates concerns in the source code. There are three layers: Brain, Algorithms, Controllers. The Controllers layer contains code familiar to every EV3 LEGO enthusiast: it controls motors, sensors, the Bluetooth mailbox, and the other parts of the MINDSTORMS platform. So if the robot needs to perform an action such as raising its left hand, you first add that code to an EV3-G program. The Algorithms layer is the glue between controller programs and the robotic brain. For example, a "dance" command reuses and repeats many simple, small controller programs; otherwise we would have to create tons of repetitive EV3-G code. This approach lets students grasp one of the fundamental principles of professional software development: DON'T REPEAT YOURSELF. The Brain is the most interesting and intriguing part of the ELFi architecture. It makes the robot behave like a human: it creates complex reactions to the external world, to events, voice commands, moving faces or other objects in the visible area, or messages from another robot arriving over a dedicated communication channel. The robot can analyze many contexts and prior history before acting in response to a particular event. This is why we call ELFi an intelligent robot.
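To make the layering concrete, here is a minimal Java sketch of how the three layers could be separated. The interface and class names (ControllerLayer, AlgorithmLayer, Brain) and the program names passed to run() are hypothetical illustrations of the idea, not the actual ELFi API.

    // Controllers: thin wrappers around EV3-G programs running on the brick.
    interface ControllerLayer {
        void run(String controllerProgram);   // e.g. "raise_left_hand" (placeholder name)
    }

    // Algorithms: glue that combines small controller programs into behaviors.
    class AlgorithmLayer {
        private final ControllerLayer controllers;
        AlgorithmLayer(ControllerLayer controllers) { this.controllers = controllers; }

        void dance() {
            // Reuse small controller programs instead of duplicating EV3-G code.
            for (int i = 0; i < 4; i++) {
                controllers.run("raise_left_hand");
                controllers.run("raise_right_hand");
                controllers.run("turn_around");
            }
        }
    }

    // Brain: maps recognized speech (and other events) to algorithms.
    class Brain {
        private final AlgorithmLayer algorithms;
        Brain(AlgorithmLayer algorithms) { this.algorithms = algorithms; }

        void onVoiceCommand(String intent) {
            if ("dance".equals(intent)) {
                algorithms.dance();
            }
        }
    }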

We'll use a simple Brain / Algorithms / Controllers architecture chart in many steps of the educational program to give a quick idea of which software layers are involved in each task.

Learning Path. STEP 1: What is an intelligent autonomous robot? This intro lesson explains what a robotic brain is and how dramatically it can change the behavior of any EV3 model. We created a new robotic platform for learning how to build robots with a brain using popular construction sets. We call these robots ELFi, which is short for EV3 LEGO Friend. Exercise: propose your own use for such a robot. Let's discuss! The target of this step is to understand: the general concept of an intelligent autonomous robot; what a robotic brain can do; why your old EV3 models are not intelligent robots; what machine learning is; and how an intelligent robot can move without a remote control or a colored line on the floor.

STEP 2: Building a simple talking robot with LEGO and Android. You will become familiar with the programming tools needed throughout the whole course of our educational program. First of all, you will install the Brain App on your Android phone or tablet. You can talk to this app and ask it to show the building instructions for the Small ELFi robot. We'll look at the other software components of the ELFi Educational Program and learn how to connect the Brain App to the body of a Small ELFi robot. Exercise: install the ELFi Robotics Brain App on an Android device and connect it to the Small ELFi body. Play. The target of this step is hands-on experience with: installing the ELFi brain on an Android phone or tablet; building the Small ELFi body from a LEGO MINDSTORMS set; connecting the ELFi brain and the EV3 brick together; and talking to ELFi.

Here is Small ELFi, a minimalistic member of the ELFi robot family:

STEP 3: Architecture of robotic software. Brain, Algorithms, Controllers. Full-stack example. At this level we'll see the whole ELFi Robotics platform in practice. We'll quickly overview the software architecture of the ELFi robot and study its components and their purpose. Finally, we'll do some quick coding across all three layers of the architecture to get a practical understanding of the robotic platform. Exercise: add left, right, and forward movement commands. The target of this step is to understand: what an AI cloud is; how a voice command works; where EV3-G code fits in the ELFi architecture; and how to create a simple algorithm.
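A minimal sketch, reusing the hypothetical ControllerLayer interface from the architecture sketch above, of what the Algorithms-layer half of this exercise could look like; the EV3-G program names in quotes are placeholders.

    // Hypothetical Algorithms-layer code for left / right / forward commands.
    class MovementAlgorithms {
        private final ControllerLayer controllers;
        MovementAlgorithms(ControllerLayer controllers) { this.controllers = controllers; }

        void forward()   { controllers.run("move_forward"); }  // EV3-G program names
        void turnLeft()  { controllers.run("turn_left"); }     // are placeholders
        void turnRight() { controllers.run("turn_right"); }
    }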

See the affected parts of ELFi Software Architecture:

STEP 4: Algorithms. We'll learn the existing controller commands that can be reused in your own algorithms. In fact, you can go a long way without writing a single controller program. In this lesson we'll combine commands like "go forward" and "turn left" to draw a picture of a small house. Exercise: draw the house on paper or a whiteboard using a pen attached to Small ELFi. The target of this step is to understand: where to find the existing controller programs; how to combine controller programs into algorithms; and how to execute an algorithm without adding a new voice command.
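Continuing with the hypothetical MovementAlgorithms sketch from the previous step, the house exercise could combine controller programs roughly like this; the turn angles are assumed to be fixed inside the underlying EV3-G programs.

    // Hypothetical algorithm: draw the square base of the house by reusing
    // the small forward / turn controller programs (pen attached to the robot).
    class DrawHouse {
        private final MovementAlgorithms move;
        DrawHouse(MovementAlgorithms move) { this.move = move; }

        void drawBase() {
            for (int side = 0; side < 4; side++) {
                move.forward();    // one side of the square
                move.turnLeft();   // assumed 90-degree turn in the EV3-G program
            }
            // The roof would need additional turn programs with different angles.
        }
    }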

See the affected parts of ELFi Software Architecture:

STEP 5: Controllers. This lesson focuses on the Controller layer of the ELFi software architecture. We're going to learn how to add a new EV3-G program to the controller layer and how to connect it to the "Party time" spoken command. Exercise: write your own dance for Small ELFi. Have fun. The target of this step is to understand: where to put a new controller program; and how to bind a new controller program to an algorithm.
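A hypothetical sketch of the binding idea: a registry maps a spoken command to an algorithm, which in turn runs the new EV3-G controller program. None of these names come from the real ELFi code.

    // Hypothetical binding of the "Party time" spoken command to a new
    // EV3-G controller program through a small algorithm.
    class CommandRegistry {
        private final java.util.Map<String, Runnable> commands = new java.util.HashMap<>();

        void bind(String spokenCommand, Runnable algorithm) {
            commands.put(spokenCommand.toLowerCase(), algorithm);
        }

        void onRecognized(String spokenCommand) {
            Runnable algorithm = commands.get(spokenCommand.toLowerCase());
            if (algorithm != null) algorithm.run();
        }
    }

    // Usage: registry.bind("party time", () -> controllers.run("my_dance"));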

See the affected parts of ELFi Software Architecture:

STEP 6: Brain. The ultimate task of the robotic brain is to support conversation with a human. Face detection, voice recognition, text to speech, XMPP chat, and emotions all work together as the Brain of the ELFi robot under the control of the Android operating system. You can give a voice command, talk, or even create a new command through a voice conversation with ELFi. Exercise: describe a new command verbally. The command should be a synonymous intent for an existing voice command. Play. Share. The target of this step is to understand: which operations the ELFi brain performs; what face detection is; what voice recognition is; what natural language understanding is; what the XMPP channel is; and how to set up a new voice command verbally.
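The text-to-speech part of the Brain can be illustrated with the standard Android TextToSpeech API; this is a generic Android sketch rather than the actual ELFi Brain code.

    import android.content.Context;
    import android.speech.tts.TextToSpeech;

    // Generic Android text-to-speech sketch (not the actual ELFi Brain code).
    class RobotVoice {
        private TextToSpeech tts;

        RobotVoice(Context context) {
            tts = new TextToSpeech(context, status -> {
                if (status == TextToSpeech.SUCCESS) {
                    // Speak a greeting once the engine is ready.
                    tts.speak("Hello, I am ELFi", TextToSpeech.QUEUE_FLUSH, null, "greeting");
                }
            });
        }
    }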

See the affected parts of ELFi Software Architecture:

STEP 7: Java for robots. Crash course. This is the right time to start writing some code for the ELFi brain. We'll write this code in the Java programming language. We don't need an extensive Java course, just a starting point; you will pick up more in the following lessons. In this lesson you will learn what a variable, a reference, an object, a class, and a method are. Exercise: set a new emotion for the voice command from the previous lesson. The target of this step is to understand: how to work with Java code; where to look for answers to questions about the Java language; what the main building blocks of Java code are; and how to change ELFi's emotions from Java code.
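A tiny, self-contained Java example of the building blocks named above (class, field, constructor, method, object, reference); the Emotion class here is just a teaching placeholder, not part of the ELFi code.

    // A class describes a kind of object; here it bundles a name with a behavior.
    class Emotion {
        private final String name;          // a variable (field) holding state

        Emotion(String name) {              // a constructor
            this.name = name;
        }

        String describe() {                 // a method
            return "ELFi feels " + name;
        }
    }

    class CrashCourse {
        public static void main(String[] args) {
            Emotion happy = new Emotion("happy");   // 'happy' is a reference to a new object
            System.out.println(happy.describe());   // prints: ELFi feels happy
        }
    }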

See the affected parts of ELFi Software Architecture:

STEP 8: Android for robots. Crash course. At this level we'll continue learning Java, but in the context of Android programming. We'll take a good look at the most important parts of the Android SDK: Activity, Service, Broadcast Receiver, and Content Provider, and their use in the ELFi robotic Brain. Our robot can use the Android front camera for face detection; there are existing Android libraries for this purpose, and we'll apply one of the simplest, FaceDetector. Exercise: make the robot turn to face you. This ability creates a strong personal touch when you talk to the robot. The target of this step is to understand: what Android is; which parts of Android the ELFi Brain uses; what a direct Bluetooth command is; and what FaceDetector is and how it can be used.
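A minimal sketch built on the standard android.media.FaceDetector class mentioned above; the turnTowards() call that would send the direct Bluetooth command to the EV3 brick is hypothetical.

    import android.graphics.Bitmap;
    import android.graphics.PointF;
    import android.media.FaceDetector;

    // Sketch: detect a face in a camera frame and steer the robot toward it.
    // The Bitmap must be in RGB_565 format for android.media.FaceDetector.
    class FaceFollower {
        void centerOnFace(Bitmap frame) {
            FaceDetector detector = new FaceDetector(frame.getWidth(), frame.getHeight(), 1);
            FaceDetector.Face[] faces = new FaceDetector.Face[1];
            int found = detector.findFaces(frame, faces);
            if (found > 0) {
                PointF midPoint = new PointF();
                faces[0].getMidPoint(midPoint);
                // Offset of the face from the horizontal center of the frame.
                float offset = midPoint.x - frame.getWidth() / 2f;
                turnTowards(offset);   // hypothetical: send a direct Bluetooth turn command
            }
        }

        private void turnTowards(float offset) {
            // Placeholder for a direct EV3 command proportional to the offset.
        }
    }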

See the affected parts of ELFi Software Architecture:

STEP 9: Event machine. ELFi's Brain has many slots where you can inject your own code. One of them is the Brain Event Listener, a Java class that lets you add custom event-processing code before, after, or instead of the original event processor. Exercise: substitute your own picture for the Ninja emotion. Add a brand-new emotion and link it to a spoken command. Play. Share. The target of this step is to understand: what an event, a listener, and a processor are; and how to intercept an event.
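A hypothetical sketch of the listener idea described above; the BrainEventListener interface, the BrainEvent class, and the event type strings are illustrations of the pattern, not the real ELFi classes.

    // Hypothetical event-interception interface in the spirit of the description above.
    interface BrainEventListener {
        // Return true to swallow the event ("instead of" the original processor).
        boolean before(BrainEvent event);
        void after(BrainEvent event);
    }

    class BrainEvent {
        final String type;       // e.g. "voice_command", "face_detected"
        final String payload;
        BrainEvent(String type, String payload) { this.type = type; this.payload = payload; }
    }

    class NinjaEmotionListener implements BrainEventListener {
        @Override public boolean before(BrainEvent event) {
            if ("voice_command".equals(event.type) && "ninja".equals(event.payload)) {
                // Show our own picture instead of the built-in Ninja emotion.
                return true;     // skip the original processor
            }
            return false;
        }
        @Override public void after(BrainEvent event) { /* logging, cleanup, ... */ }
    }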

See the affected parts of ELFi Software Architecture:

STEP 10: NLU processing. It's time to dive more deeply into Natural Language Understanding and its integration with Android code. We'll create another robotic game resembling the popular "millionaire" quiz show. Our task is to inject code for generating random questions and for driving the whole logical workflow of the game, accompanied as usual by ELFi's gestures, music, and emotions. Exercise: create the Millionaire robotic game. The target of this step is to understand: what an intent, an entity, a context, and a parameter are; and how to work with NLU inside the ELFi Brain.
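To make the intent/entity/parameter vocabulary concrete, here is a hypothetical sketch of handling an NLU result in the millionaire game; the NluResult shape only mimics what a typical NLU cloud returns and is not any specific vendor's API.

    import java.util.Map;

    // Hypothetical shape of an NLU response: an intent name plus extracted parameters.
    class NluResult {
        final String intent;                  // e.g. "millionaire.answer"
        final Map<String, String> parameters; // e.g. {"letter": "B"}
        NluResult(String intent, Map<String, String> parameters) {
            this.intent = intent; this.parameters = parameters;
        }
    }

    class MillionaireGame {
        private String correctLetter = "B";   // set when a question is generated

        void onNlu(NluResult result) {
            if ("millionaire.answer".equals(result.intent)) {
                String letter = result.parameters.get("letter");
                boolean right = correctLetter.equalsIgnoreCase(letter);
                // React with gestures, music, and emotions depending on 'right'.
            }
        }
    }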

See the affected parts of ELFi Software Architecture:

STEP 11: Indoor navigation. An intelligent autonomous robot has to share living space with humans; in other words, it must recognize rooms and know how to move from one room to another. This is what we call indoor navigation. There are several strategies for implementing it. We'll use algorithms that move between rooms, orienting by walls and IR beacon detection. Exercise: develop a Home Patrol mode. In this mode, ELFi loops over a defined set of rooms and sends an email with a picture taken by the front camera whenever a human face is detected. We'll use Wi-Fi signal strength to determine the current position and an NLU command to start the patrol mode. The target of this step is to understand: which strategies exist for indoor navigation; which controller programs are involved in indoor navigation; and how we model a compass for ELFi.
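How Wi-Fi signal strength can stand in for the current room can be sketched with the standard Android WifiManager API; the BSSID and the RSSI thresholds below are placeholders that a real setup would calibrate per home.

    import android.content.Context;
    import android.net.wifi.ScanResult;
    import android.net.wifi.WifiManager;
    import java.util.List;

    // Sketch: guess the current room from the RSSI of a known access point.
    // Requires Wi-Fi and location permissions; thresholds are illustrative only.
    class RoomLocator {
        private static final String HOME_BSSID = "aa:bb:cc:dd:ee:ff"; // placeholder

        String guessRoom(Context context) {
            WifiManager wifi = (WifiManager) context.getApplicationContext()
                    .getSystemService(Context.WIFI_SERVICE);
            List<ScanResult> results = wifi.getScanResults();
            for (ScanResult result : results) {
                if (HOME_BSSID.equalsIgnoreCase(result.BSSID)) {
                    int rssi = result.level;           // in dBm, e.g. -45 near the router
                    if (rssi > -50) return "living room";
                    if (rssi > -70) return "hallway";
                    return "bedroom";
                }
            }
            return "unknown";
        }
    }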

See the affected parts of ELFi Software Architecture:

STEP 12: Scheduler. Orientation in time is also very important for intelligent robots. We'll consider the Android SDK patterns for dealing with time and periodic tasks: Alarm Manager and Handler. Exercise: teach ELFi to play with your cat (imaginary or real) at a defined time of day. We'll also discover another important feature: detecting whether some object is marked with a Bluetooth beacon. We need this additional functionality to determine whether the cat is in the same room, so that the cat can see the mouse. The target of this step is to understand: what an Android alarm is; how to assign periodic tasks to ELFi; and how to read the Bluetooth signal strength of another device.
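A minimal sketch of a periodic task using android.os.Handler (AlarmManager would be the choice for schedules that must survive the app being killed); the startPlayWithCat() call is a hypothetical hook into the Algorithms layer.

    import android.os.Handler;
    import android.os.Looper;

    // Sketch: a periodic task scheduled with android.os.Handler.
    class PlayTimeScheduler {
        private final Handler handler = new Handler(Looper.getMainLooper());
        private static final long ONE_HOUR_MS = 60L * 60L * 1000L;

        void schedulePlayWithCat() {
            handler.postDelayed(new Runnable() {
                @Override public void run() {
                    // startPlayWithCat();  // hypothetical call into the Algorithms layer
                    handler.postDelayed(this, ONE_HOUR_MS);  // reschedule for the next hour
                }
            }, ONE_HOUR_MS);
        }
    }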

See the affected parts of ELFi Software Architecture:

STEP 13: Color recognition. This lesson is dedicated to robotic vision. The robot can detect objects and colors using the front camera of the Android device. Exercise: create a Toreador game. You start the game by showing ELFi a colored object; ELFi then follows it. Imagine you are a toreador and ELFi is a bull. Have fun. The target of this step is to understand: what robotic vision is; which targets work well for follow-me operations; and where to put the code that works with images taken by the front camera.
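A minimal color-tracking sketch using only the standard Bitmap and Color classes; a real implementation would likely work in HSV space on a downscaled frame, but the idea of "find the colored blob, then steer toward it" is the same. The thresholds are illustrative.

    import android.graphics.Bitmap;
    import android.graphics.Color;

    // Sketch: locate a predominantly red object in a camera frame and report
    // whether it is left of, right of, or near the center of the image.
    class ColorTracker {
        String findRedTarget(Bitmap frame) {
            long sumX = 0;
            int count = 0;
            for (int y = 0; y < frame.getHeight(); y += 4) {       // sample every 4th pixel
                for (int x = 0; x < frame.getWidth(); x += 4) {
                    int pixel = frame.getPixel(x, y);
                    if (Color.red(pixel) > 180 && Color.green(pixel) < 100
                            && Color.blue(pixel) < 100) {
                        sumX += x;
                        count++;
                    }
                }
            }
            if (count < 50) return "no target";
            long centerX = sumX / count;
            if (centerX < frame.getWidth() / 3) return "turn left";
            if (centerX > frame.getWidth() * 2 / 3) return "turn right";
            return "go forward";
        }
    }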

See the affected parts of ELFi Software Architecture:

STEP 14: Robotic game. We'll create a new Android Activity in ELFi style and develop a real robotic game application. Exercise: develop Tic-Tac-Toe powered by object recognition. The idea of this robotic game is the following: a Tic-Tac-Toe board is shown instead of ELFi's face, and we use Android face detection to indicate our next move in the game. The target of this step is to understand: why face detection is so important for Android devices; alternative ways of interacting with a robot; and how many faces ELFi can detect at the same time.
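One simple way to turn a detected face position into a move is to divide the camera frame into a 3x3 grid and pick the cell the face midpoint falls into; this is only an illustrative sketch of the idea, not the actual game code.

    // Sketch: map the midpoint of a detected face to one of the 9 board cells.
    class BoardCellPicker {
        // Returns a cell index 0..8 (row-major) for a face midpoint in a frame.
        int pickCell(float faceX, float faceY, int frameWidth, int frameHeight) {
            int column = Math.min(2, (int) (faceX * 3 / frameWidth));
            int row = Math.min(2, (int) (faceY * 3 / frameHeight));
            return row * 3 + column;
        }
    }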

See the affected parts of ELFi Software Architecture:

STEP 15: Robot-to-robot communication. Collaborating robots is a very important concept: one ELFi can delegate a task to another. Eventually we can have several robots, each specialized in certain tasks, exchanging messages. We'll explore the XMPP protocol for message exchange between ELFis. Exercise: in this lesson we'll create a specialized robot with a water pump placed near a plant. A Small ELFi will delegate the "water the plant" task to it according to a schedule. The target of this step is to understand: what XMPP is; why HTTP is not well suited for peer-to-peer communication channels; what a session, a chat, and a message are; and how to use XMPP for robot-to-robot communication.
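As a rough sketch of the idea, sending a "water the plant" message could look like this with the open-source Smack library (4.x API); the server domain, account names, and password are placeholders, and the real ELFi channel may differ.

    import org.jivesoftware.smack.AbstractXMPPConnection;
    import org.jivesoftware.smack.chat2.Chat;
    import org.jivesoftware.smack.chat2.ChatManager;
    import org.jivesoftware.smack.tcp.XMPPTCPConnection;
    import org.jivesoftware.smack.tcp.XMPPTCPConnectionConfiguration;
    import org.jxmpp.jid.EntityBareJid;
    import org.jxmpp.jid.impl.JidCreate;

    // Sketch of robot-to-robot messaging over XMPP using the Smack 4.x library.
    // Accounts, password, and the server domain are placeholders.
    class RobotMessenger {
        void delegateWatering() throws Exception {
            XMPPTCPConnectionConfiguration config = XMPPTCPConnectionConfiguration.builder()
                    .setXmppDomain("example.org")
                    .setUsernameAndPassword("small-elfi", "secret")
                    .build();
            AbstractXMPPConnection connection = new XMPPTCPConnection(config);
            connection.connect().login();

            EntityBareJid gardener = JidCreate.entityBareFrom("garden-elfi@example.org");
            Chat chat = ChatManager.getInstanceFor(connection).chatWith(gardener);
            chat.send("water the plant");

            connection.disconnect();
        }
    }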

See the affected parts of ELFi Software Architecture:

STEP 16: AI methods. Text classification and sentiment analysis. This lesson is very important for understanding the principles of machine learning. In fact, we have already used those principles indirectly, via the AI Cloud, to interpret spoken text in the previous lessons. Now we'll try to understand how it works, in a practical way as usual. Our task is to get the context of an unknown command, so that ELFi can react and keep up a conversation with a human outside the command-execution paradigm. We'll start with text classification: ELFi will try to find the general context of an unknown command. Imagine we say "ELFi, do you know that gravitation is a fundamental force in our Universe?". This is not a command, but using text classification ELFi can work out that we're talking about physics and astronomy. It can then react with some interesting news about those branches of science or propose watching an educational video. We'll use existing AI Clouds for this task, but first we'll learn how this function works: we'll consider a simple clustering algorithm called K-Means for classifying unknown speech. Finally, we'll use another AI method that detects emotions in spoken text. This method can also find the subject of the emotion. For example, the phrase "I like to play with my cat" will be recognized as positive and directed at a "cat" entity. This allows ELFi to align its emotional state with the human's and keep the conversation going.

Exercise: connect ELFi to an AI Cloud for speech classification and sentiment analysis. Develop some strategies for reacting to unknown speech. The target of this step is to understand: what machine learning is; how the K-Means method works; what text classification is; and what sentiment analysis is.
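To make the K-Means idea concrete, here is a small self-contained Java sketch that clusters two-dimensional feature vectors (for real text, the vectors would come from word counts or embeddings); it is a teaching illustration, not the AI Cloud implementation.

    import java.util.Arrays;
    import java.util.Random;

    // Minimal K-Means on 2-D points: assign each point to the nearest centroid,
    // then move each centroid to the mean of its points, and repeat.
    class KMeans {
        static int[] cluster(double[][] points, int k, int iterations) {
            Random random = new Random(42);
            double[][] centroids = new double[k][];
            for (int i = 0; i < k; i++) {
                centroids[i] = points[random.nextInt(points.length)].clone();
            }
            int[] assignment = new int[points.length];
            for (int iter = 0; iter < iterations; iter++) {
                // Assignment step: nearest centroid for each point.
                for (int p = 0; p < points.length; p++) {
                    int best = 0;
                    double bestDist = Double.MAX_VALUE;
                    for (int c = 0; c < k; c++) {
                        double dx = points[p][0] - centroids[c][0];
                        double dy = points[p][1] - centroids[c][1];
                        double dist = dx * dx + dy * dy;
                        if (dist < bestDist) { bestDist = dist; best = c; }
                    }
                    assignment[p] = best;
                }
                // Update step: move each centroid to the mean of its points.
                double[][] sums = new double[k][2];
                int[] counts = new int[k];
                for (int p = 0; p < points.length; p++) {
                    sums[assignment[p]][0] += points[p][0];
                    sums[assignment[p]][1] += points[p][1];
                    counts[assignment[p]]++;
                }
                for (int c = 0; c < k; c++) {
                    if (counts[c] > 0) {
                        centroids[c][0] = sums[c][0] / counts[c];
                        centroids[c][1] = sums[c][1] / counts[c];
                    }
                }
            }
            return assignment;
        }

        public static void main(String[] args) {
            double[][] points = {{1, 1}, {1.2, 0.8}, {0.9, 1.1}, {8, 8}, {8.2, 7.9}, {7.8, 8.1}};
            System.out.println(Arrays.toString(cluster(points, 2, 10)));
        }
    }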

See the affected parts of ELFi Software Architecture:

STEP 17: Your own robot. This is the right time to build your own robot. You can start with some sketches on a piece of paper and a calculation of the motors, EV3 bricks, beams, and other parts you'll need. After creating the physical body of your robot, you have to adjust the controller programs responsible for left/right/forward movement; those movements are integral parts of many algorithms such as indoor navigation, follow-me mode, dancing, etc. Then you can switch to creating the unique features of your robot and its spoken abilities. Exercise: build your own intelligent robot. Take your time; this is a long-term task. Let's stay in touch. Share your progress.

Big ELFi by Mark Drobnych