This post demonstrates the second-generation version of the robot from the previous post. The second-generation robot communicates directly with the smartphone, without the help of a router, by making the smartphone a WiFi access point. I have installed sensors to monitor environmental parameters such as temperature, pressure, and light intensity. The Raspberry Pi, the PIC microcontroller, and the sensors form a hierarchical I2C bus architecture. To avoid real-time delays in the robot's navigation, the driving control for the motors has been moved to the Pi. The sensor data is available to the user on the smartphone. The block diagram depicts the system assembly.
Hierarchical I2C Bus Architecture:
Hardware Implementation:
The Raspberry Pi is a 3.3V system and requires a level shifter to communicate with the PIC microcontroller, which is a 5V device. I found a level-shifter circuit, shown below, at http://playground.arduino.cc/Main/I2CBi-directionalLevelShifter, which helped me build my own level shifter with transistors.
I2C is a communication protocol that allows data transfer among devices over two wires, SCL and SDA, which carry data serially. The PIC18F25K22 has two independent I2C bus modules: one bus is used for communication between the RPI (master) and the PIC (slave), while the other forms a communication path between the PIC (master) and the I2C sensors (slaves). In the I2C bus architecture, each device has its own address through which the master and slave devices communicate with each other. In my application, the RPI acts as the hierarchical master, communicating with the sensors and servo motors indirectly through the sub-master PIC. The PIC microcontroller continuously reads the sensor values; when the RPI issues a command to pass on the sensor readings or to control the Tilt/Pan system, the PIC suspends its routine work, services the interrupt from the RPI, and resumes its work after the service.
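The request/decode logic on the Pi side of this hierarchy can be sketched roughly as follows. The I2C address, command bytes, and two-byte register layout here are all assumptions for illustration; the real PIC firmware defines its own protocol.

```python
# Sketch of the RPi-side request/decode logic for the sub-master PIC.
# PIC_ADDR and the command codes are hypothetical.

PIC_ADDR = 0x20          # assumed I2C address of the PIC sub-master
CMD_READ_TEMP = 0x01     # hypothetical command bytes understood by the PIC
CMD_READ_LUX = 0x02
CMD_SET_PAN = 0x10

def decode_reading(msb, lsb, scale=1.0):
    """Combine the two bytes returned by the PIC into one scaled reading."""
    return ((msb << 8) | lsb) * scale

# On the Pi this would use python-smbus, roughly:
#   import smbus
#   bus = smbus.SMBus(1)
#   raw = bus.read_i2c_block_data(PIC_ADDR, CMD_READ_TEMP, 2)
#   temperature = decode_reading(raw[0], raw[1], scale=0.1)

print(decode_reading(0x01, 0x2C, scale=0.1))  # 300 raw counts -> 30.0
```

The same pattern extends to the Tilt/Pan servos: the Pi writes a command byte plus an angle, and the PIC interrupt handler applies it before resuming its sensor-polling loop.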
System software design:
Libraries used in RPI for navigation and video streaming:
WEBIOPI - WebIOPi is a REST framework that allows you to control the Raspberry Pi's GPIO from a browser. It is written in JavaScript on the client side and in Python on the server side. You can fully customize it and easily build your own web app, and you can even use all the power of WebIOPi directly in your own Python script, registering your functions so they can be called from the web app. WebIOPi also includes other features such as software PWM for all GPIOs. In our application, the Android smartphone communicates with the Pi through this library using HTTP client services.
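A rough sketch of how the Pi could cache the sensor readings and expose them to the Android app is shown below. The function names, the CSV reply format, and the cache layout are assumptions; on the Pi the getter would be decorated with `@webiopi.macro` and the script loaded through the WebIOPi configuration file, so only the plain-Python part is shown here.

```python
# Hedged sketch: a sensor cache on the Pi, refreshed from the PIC and
# served to the Android app through a WebIOPi macro (decorator omitted
# so the sketch runs standalone).

sensor_cache = {"temperature": 0.0, "pressure": 0.0, "lux": 0.0}

def update_cache(temperature, pressure, lux):
    """Called periodically after polling the PIC over I2C."""
    sensor_cache.update(temperature=temperature, pressure=pressure, lux=lux)

def get_sensor_data():
    """Macro the Android app would call over HTTP; returns a CSV string."""
    return "%.1f;%.1f;%.1f" % (sensor_cache["temperature"],
                               sensor_cache["pressure"],
                               sensor_cache["lux"])

update_cache(26.4, 1012.5, 118.0)
print(get_sensor_data())  # 26.4;1012.5;118.0
```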
GSTREAMER - GStreamer is a pipeline-based multimedia framework that allows a programmer to create a variety of media-handling components, including simple audio playback, audio and video playback, recording, streaming, and editing. The pipeline design serves as a base for many types of multimedia applications such as video editors, streaming media broadcasters, and media players. We use this library to decode the video information from the USB webcam.
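A GStreamer pipeline for this kind of setup is usually assembled as a chain of elements joined by `!`. The sketch below builds one such pipeline string in Python; the smartphone IP, port, resolution, and the choice of JPEG encoding are assumptions for illustration, while `v4l2src`, `jpegenc`, and `udpsink` are standard GStreamer elements.

```python
# Sketch of a gst-launch pipeline string for streaming the USB webcam
# from the Pi to the phone. PHONE_IP and the caps are assumptions.

PHONE_IP = "192.168.43.1"   # hypothetical address of the smartphone AP

pipeline = " ! ".join([
    "v4l2src device=/dev/video0",                      # USB webcam source
    "video/x-raw,width=640,height=480,framerate=15/1", # capture caps
    "jpegenc",                                         # encode frames as JPEG
    "udpsink host=%s port=5000" % PHONE_IP,            # send over UDP
])

print("gst-launch-1.0 " + pipeline)
```

On the Pi this command line would be launched alongside WebIOPi, with the Android side decoding the incoming UDP stream.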
Basic algorithm:
- Setting up the RPI as a wirelessly accessible device using a wireless USB adapter.
- An Android application is developed to command the robot and to receive the video stream from it. The application communicates with the RPI over HTTP, using the RPI's IP address, for navigation control and for receiving the sensor data.
- Getting the data from the sensors to the PIC through I2C bus communication, and also interfacing the servo-based Tilt/Pan system on which the camera is mounted.
- The PIC is then interfaced with the hierarchical master RPI: it sends the sensor values whenever the RPI requests them and receives commands to control the Tilt/Pan system.
- As the RPI gets the values from the PIC microcontroller, it stores them in memory, and the Android application is updated with them every 500 milliseconds.
- Interfacing the DC motors with the RPI through an L298 driver, which is controlled according to the commands received from the Android application.
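The motor-driving step above can be sketched as a simple mapping from navigation commands to the H-bridge driver's input pins. The BCM pin numbers and command names below are assumptions; the real wiring and the command set used by the Android app may differ.

```python
# Sketch of the command-to-pin mapping on the Pi for an H-bridge motor
# driver. Pin numbers and command names are hypothetical.

IN1, IN2, IN3, IN4 = 17, 27, 22, 23   # assumed GPIO pins to driver inputs

COMMANDS = {
    "forward":  (1, 0, 1, 0),
    "backward": (0, 1, 0, 1),
    "left":     (0, 1, 1, 0),
    "right":    (1, 0, 0, 1),
    "stop":     (0, 0, 0, 0),
}

def pin_states(command):
    """Return (IN1, IN2, IN3, IN4) levels for a navigation command."""
    return COMMANDS.get(command, COMMANDS["stop"])

# On the Pi, RPi.GPIO would drive the pins, roughly:
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   for pin, level in zip((IN1, IN2, IN3, IN4), pin_states("forward")):
#       GPIO.output(pin, level)

print(pin_states("forward"))  # (1, 0, 1, 0)
```

Unknown commands fall back to "stop", which is a safer default for a robot than leaving the motors in their last state.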
The pictures below demonstrate how the light-sensor reading (lux) changes with the light intensity falling on the sensor. In the first picture, the robot is roaming in a dark region, so the measured light intensity is low. As the robot moves to a brighter region, the light intensity value increases, as seen in the second picture.