Monday 4 January 2016

Eclipse Open IoT Challenge 2.0 - project beginning

Hi! For edition 2.0 of the Eclipse Open IoT Challenge, I'm trying to create a mobile robot based on the principles of Cloud Robotics. The robot will be designed as a toy: it will demonstrate sophisticated behaviour, and its main purpose is to stimulate a baby's crawling development.

Sample use case

The robot is placed on the floor of the room, and the user starts it from the robot management web app. The robot begins the "Search" behaviour, trying to detect a baby (hereinafter, the Object) who is playing, say, on a play rug. When the robot detects the Object, it starts the "Approach" behaviour. Once the robot infers that the Object has been found, it starts the "Found" behaviour: it takes an image from the camera and sends a notification to the management web app. The robot then stops near the Object and starts the "Entertainment" behaviour (playing music and showing funny smiles on the LED panel), trying to capture the Object's attention. If the robot determines that the distance to the Object is shortening (the baby is trying to catch the robot), it starts the "Stimulate" behaviour and tries to increase the distance. If the robot detects that it is close to an obstacle on the opposite side, for example the room wall, it starts the "Escape" behaviour: it turns 90 degrees and moves forward.
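The behaviour switching described above can be sketched as a simple transition function. This is only an illustration: the trigger conditions, their priority order, and all names here are my assumptions, not the project's actual implementation.

```java
/** Illustrative sketch of the behaviour switching in the use case;
 *  trigger conditions and their priority are assumptions. */
public class BehaviourDemo {
    enum Behaviour { SEARCH, APPROACH, FOUND, ENTERTAINMENT, STIMULATE, ESCAPE }

    /** Picks the next behaviour from simple sensor-derived conditions. */
    static Behaviour next(Behaviour current, boolean objectDetected,
                          boolean objectReached, boolean distanceShrinking,
                          boolean obstacleBehind) {
        if (obstacleBehind) return Behaviour.ESCAPE;        // wall behind: turn and go
        if (distanceShrinking) return Behaviour.STIMULATE;  // baby closes in: retreat
        if (objectReached) return current == Behaviour.FOUND
                ? Behaviour.ENTERTAINMENT                   // already notified: entertain
                : Behaviour.FOUND;                          // first arrival: notify web app
        if (objectDetected) return Behaviour.APPROACH;
        return Behaviour.SEARCH;
    }

    public static void main(String[] args) {
        Behaviour b = Behaviour.SEARCH;
        b = next(b, true, false, false, false);  // baby detected
        b = next(b, true, true, false, false);   // robot reached the baby
        b = next(b, true, true, false, false);   // notification already sent
        System.out.println(b);                   // ENTERTAINMENT
    }
}
```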


Architecture

The robot itself will implement only basic actions and commands, such as: go forward, stop, turn right, turn left, play music, take a camera image, measure distance, detect motion, etc. All the complex behaviour will be implemented in the Cloud part of the solution. The robot will communicate with its "cloud brains" via the CoAP protocol, acting as a CoAP server that provides several resources. For example:
GET robot/sensor/distance/front
GET robot/sensor/distance/rear
GET robot/sensor/motion/left
GET robot/sensor/motion/right
PUT robot/move/forward
PUT robot/move/backward
PUT robot/action/playmusic
PUT robot/action/takeimage
PUT robot/action/showsmileyfaces
Sensor resources will be treated as observable resources, pushing collected data as notifications to the CoAP client side in the Cloud.
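On the robot, CoAP observe would be handled by the CoAP stack itself (e.g. Californium), but the essence of the relationship can be shown in plain Java: a client registers once and then receives a notification each time the sensor value changes. All class and method names below are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

/** Plain-Java sketch of the CoAP observe flow for a distance sensor:
 *  observers register once and get a notification on every value change. */
public class ObservableDistance {
    private final List<Consumer<Integer>> observers = new ArrayList<>();
    private int distanceCm = -1;

    /** A CoAP GET carrying the Observe option maps to this registration. */
    public void observe(Consumer<Integer> client) {
        observers.add(client);
    }

    /** The robot calls this after each measurement; every registered
     *  observer receives the new reading as a notification. */
    public void update(int cm) {
        if (cm == distanceCm) return;   // notify on change only
        distanceCm = cm;
        observers.forEach(o -> o.accept(cm));
    }

    public static void main(String[] args) {
        ObservableDistance front = new ObservableDistance();
        front.observe(cm -> System.out.println("cloud got: " + cm + " cm"));
        front.update(120);
        front.update(35);
    }
}
```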



Components of the solution:
  1. Two-wheeled, autonomous, battery-powered mobile robot based on a Raspberry Pi, connected to the wireless router via a Wi-Fi USB adapter dongle. Several sensors, a camera module, an LED matrix, an audio amplifier and speakers will be mounted on the robot chassis. The robot software will be deployed on the RPi as a fat jar containing Rhiot project (https://github.com/rhiot/rhiot) components such as the Pi4j component and the Webcam component, along with Apache Camel components (the CoAP, Exec and Netty components). Custom Camel routes and processors will be implemented for sensor data collection and camera image capture, as well as for controlling the robot's servos, uploading and playing back audio files, and driving the LED matrix. The robot will receive control commands from the Cloud client via the CoAP protocol and push collected sensor data to the Cloud in the form of observable resource notifications.


The following components will be deployed on a single DigitalOcean Ubuntu 14.04 droplet, where each component (except the data-store) will run inside its own Docker container as a microservice. These components will interact with each other through Apache Camel HTTP-based endpoints.

  1. CoAP Gateway component, based on Apache Camel and Californium. Works as a CoAP-HTTP cross-proxy. This component will provide RESTful API to enable other components to interact with the robot.
  2. Robot Behaviour Component, based on Apache Camel and custom Java code. The component will implement the robot behaviour using the behaviour-based robotics (BBR) paradigm and the Subsumption architecture.
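The core idea of the Subsumption architecture is that behaviours form priority layers, and a higher layer that triggers suppresses everything below it. A minimal plain-Java sketch of that arbitration (the layer names, trigger conditions and command strings are illustrative, not the project's real design):

```java
import java.util.List;
import java.util.function.BiPredicate;

/** Sketch of subsumption-style arbitration: layers are ordered by
 *  priority, and the first layer whose trigger fires suppresses the rest. */
public class SubsumptionArbiter {
    record Layer(String name, BiPredicate<Integer, Boolean> trigger, String command) {}

    // Highest priority first: Escape subsumes Approach, which subsumes Search.
    static final List<Layer> LAYERS = List.of(
        new Layer("Escape",   (frontCm, motion) -> frontCm < 20, "turn-90-then-forward"),
        new Layer("Approach", (frontCm, motion) -> motion,       "move/forward"),
        new Layer("Search",   (frontCm, motion) -> true,         "rotate-and-scan")
    );

    /** Returns the command of the highest-priority triggered layer. */
    static String arbitrate(int frontCm, boolean motionDetected) {
        return LAYERS.stream()
                .filter(l -> l.trigger().test(frontCm, motionDetected))
                .findFirst()
                .orElseThrow()
                .command();
    }

    public static void main(String[] args) {
        System.out.println(arbitrate(150, true));  // Approach wins: move/forward
        System.out.println(arbitrate(10, true));   // Escape subsumes Approach
    }
}
```

The appeal of this scheme for the robot is that each layer stays simple and stateless; new behaviours can be added as layers without rewriting the ones below.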
  3. Management Component, implemented as a RESTful API plus a single-page web client application. The purpose of this component is to provide the end user with management tools: manual robot control, configuration of certain types of robot behaviour (for example, creating a playlist of children's songs for uploading, or setting up symbol sequences for the robot's LED display), and viewing the history of the robot's perceptions and actions stored in the Data-store component. Additionally, this component will use the WebSocket protocol to display the robot's current status, robot state-change events and the current image from the robot's camera.
  4. Data-store component, based on a PostgreSQL database. The component persists time-series robot sensor measurements, camera images, robot behaviour history, notification event data and the robot's current state.

Currently I'm working on the robot hardware/software part.