Sunday, December 29, 2013

Stellaris Launchpad First Steps


I've got my LM4F120 Launchpad up and running. Here are some initial notes someone might find useful.

The install process takes a while, but is not difficult if you follow the instructions on the TI Wiki. I used the combined StellarisWare and Code Composer Studio download labelled EK-LM4F120XL-CCS on this page. You will need both Code Composer Studio (CCS) and StellarisWare. CCS is apparently used by a wide variety of development boards in addition to the Launchpad. StellarisWare gets you the sample projects and Launchpad-specific drivers and libraries.

Instructions for installing the software and loading the sample projects are here.

The "Hello" example blinks the outrageously bright onboard RGB LED and outputs "Hello, world!" to the serial port. As part of the Launchpad ICDI drivers, you get a serial port driver that the Launchpad can print debug information to.

It was not immediately obvious that Code Composer Studio does not display this output. It's a serial port, so you can use a program like Tera Term or PuTTY to see it. The Hello app is configured to use 115200,N,8,1.

You'll see some references to a terminal plugin for CCS. I tried this, and wrestled with it for a while before finding a reference to a known bug that prevents it from working at greater than 9600 baud. I removed it and switched to Tera Term. It works great.
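
If you'd rather watch the serial output from a script instead of a terminal program, a few lines of Python with the pyserial library will do it. This is just a sketch - the port name is whatever your machine assigns to the Stellaris Virtual Serial Port (COM3 below is only a guess; check Device Manager), and pyserial's defaults already give you 8 data bits, no parity, 1 stop bit.

import serial  #pyserial

#port name is machine-specific - look for the Stellaris Virtual Serial Port
port = serial.Serial("COM3", baudrate=115200)

while True:
    line = port.readline()   #blocks until the Launchpad sends a newline
    print(line.strip())      #e.g. "Hello, world!"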

I am very excited to have the board up and running, along with the dev environment. I look forward to writing some code for it - I intend to use it for my sonar project.

Sunday, December 22, 2013

New Project - Sonar experiments



Digital Sonar Experiment Part 1

For the last 10 years or so, I have wanted to build an underwater robot of some sort. I assume this is the result of reading too much Cussler, and doing a bit of SCUBA diving, but I can't seem to shake it. I've been following a number of projects, particularly OpenROV, with great interest. One of the challenges that does not seem to have been tackled much by hobbyists is an affordable underwater sonar. There are plenty of inexpensive sonar units for land and aerial robotics - the AR Drone makes good use of them for low altitude sensing - but there don't seem to be any for water use yet.

I figured it made the most sense to understand the algorithms and get them debugged in air first, since sound travels much faster in water, and there are issues with acoustic coupling and waterproofing in water that are not a problem in air.

Although I intend to build the unit with a microprocessor board eventually, I chose to explore what could be done with a sound card first, just for ease of programming. I found that it has been done before, successfully, and documented well by this gentleman. His page gave some very useful information about echo detection, introducing me to the idea of correlation functions to detect a short signal in another signal stream. With that information, I found an online digital signal processing (DSP) book that discussed correlation functions and their uses.

Echo Detection

I knew nothing of signal processing when I started this, and I was surprised that it's not really difficult from an algorithm standpoint. It does involve a fair amount of computation, though. Essentially, an audio signal, once normalized, consists of a bunch of amplitude values between -1 and 1 at some sampling rate. For human-audible frequencies, that's commonly 44100 samples per second. The reason is that to reproduce a signal faithfully, you must sample it at at least twice the highest frequency you want to capture - the Nyquist rate. Therefore, sampling at 44100 Hz can reproduce signals up to 22050 Hz, which I figured was the limit of my laptop's audio hardware. A second's worth of audio is 44100 float values between -1 and 1.
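
As a concrete example, here's roughly what that normalization looks like in Python for a mono 16-bit WAV file - the same thing the full listing at the end of this post does with SHORT_NORMALIZE. A signed 16-bit sample divided by 32768 lands between -1 and 1:

import scipy.io.wavfile

rate, raw = scipy.io.wavfile.read("test.wav")   #raw is an array of signed 16-bit samples
print("%d Hz, %d samples = %.2f seconds" % (rate, len(raw), float(len(raw)) / rate))

normalized = [s / 32768.0 for s in raw]          #each sample now falls between -1.0 and 1.0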

The simplest way of performing a correlation is to load one array with the reference signal you are looking for - the ping - and another array with the recorded audio. Start at position 0 in the recorded array, and multiply each value in the chirp with the next chirp-length's worth of samples from the recorded audio. Sum the products of each multiplication (essentially calculating the area under the resulting curve) and store the sum in a results array at position 0. Shift to position 1 in the recorded audio array and repeat, this time storing the result in position 1 of the results array. Repeat for every position in the recorded audio array, stopping when the chirp length would run past the end of the recorded audio.
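
If you have NumPy available, that whole multiply-shift-and-sum loop collapses into a single call. This is only a sketch with illustrative array names (chirp_samples and recorded_samples stand in for the normalized chirp and recording); the full listing at the end of the post does the same thing with explicit loops:

import numpy

chirp = numpy.array(chirp_samples)         #the reference ping, normalized to -1..1
recorded = numpy.array(recorded_samples)   #the recorded audio, normalized to -1..1

#slide the chirp across the recording, multiplying and summing at each offset
#mode='valid' stops before the chirp would run past the end of the recording
result = numpy.correlate(recorded, chirp, mode='valid')

peak = numpy.argmax(result)                #offset, in samples, of the strongest match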

What you are doing is creating a sliding window, comparing the chirp to each chunk of audio in the recorded audio array. The output of the correlation is really interesting - when the audio is a very close match, you get a large positive spike in the results that corresponds to a position in the recorded audio. That spike represents a copy of your chirp in the recorded audio. The first time you see it, it's the chirp your speaker sent. After that, it's echoes of that chirp off objects.

Knowing the position of each echo and your sample rate, you can compute the time that elapsed between sending the primary pulse and hearing the echo. That's the time the sound was in flight. Divide by two for the round trip, multiply by the speed of sound, and boom, you've got the distance to your target. Neat, huh?
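
In numbers, converting an echo's sample offset to a distance looks like this (343 m/s is roughly the speed of sound in air at room temperature; the offset is just an example value):

RATE = 44100             #samples per second
SPEED_OF_SOUND = 343.0   #meters per second in air, roughly

echo_offset = 441                                      #example: echo peak 441 samples after the outgoing ping
time_of_flight = echo_offset / float(RATE)             #0.01 seconds out and back
distance = (time_of_flight / 2.0) * SPEED_OF_SOUND     #about 1.7 meters to the target
print("Distance to target: %.2f meters" % distance)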

The correlation function works best on a chirp signal - audio that changes in frequency over time. The chirp must be very short, or the echo from a nearby object will arrive before the chirp finishes playing. I found that chirps in the 1.5 ms range gave the best results for me; with my setup that works out to a minimum range of 3-4 feet. I generated the chirp in Audacity, sweeping from 4000 to 10000 Hz at full amplitude.
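
I made my chirp in Audacity, but for reference, an equivalent sweep can be generated in a few lines with scipy.signal.chirp. This is just a sketch - the 1.5 ms length and 4000-10000 Hz sweep match the numbers above, and the output filename is arbitrary:

import numpy
import scipy.signal
import scipy.io.wavfile

RATE = 44100
DURATION = 0.0015                                               #1.5 ms chirp

t = numpy.arange(0, DURATION, 1.0 / RATE)                       #about 66 samples at 44100 Hz
sweep = scipy.signal.chirp(t, f0=4000, t1=DURATION, f1=10000)   #linear sweep, amplitude -1..1

#scale to 16-bit signed integers and write a WAV file the sonar script can play
scipy.io.wavfile.write("chirp.wav", RATE, (sweep * 32767).astype(numpy.int16))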

Here's a recorded slice of audio, showing the chirp, some echoes and background noise.



Here's the output of the correlation function:



Once you can do it for a single chirp, you can assign an intensity to the spikes, and graph them over many pulses. This gives you a fish finder type view - in this example, time goes from top to bottom, and distance from left to right. Note the noise on the far left side of the image - that's probably caused by the speaker continuing to "ring" for a few milliseconds after the signal stops.

I chose WinPython with the PyAudio library to test with, because it comes bundled with graphing libraries and other useful things. I used a USB combined microphone/speaker intended for chat, because I could move it around and it was pretty directional.



It works - here's a sample run showing the returns with the speaker/mic starting near the wall, then moving slowly away, then back in, etc. You can see a white line that looks like peaks and valleys corresponding to the distance, along with background noise and multipath echoes. The minimum distance was about 3 feet - the maximum was about 8.

The left side of the screen represents the location of the sensor - imagine it's attached to the hull of a boat. White and grey dots represent echoes at different distances. The white line that is produced as I bring the speaker/mic away from the wall gets farther to the right, indicating more time in flight, and longer distances. As I bring it closer to the wall, it gets closer to the left side of the screen. One way to visualize this - if the sensor was mounted on a boat, the line would show the depth of water over the bottom. You also get weaker returns for sounds that have bounced around on indirect paths, or schools of fish.




Video of operation:

Sound Card Sonar Experiment 1 from Jason Bowling on Vimeo.

Next steps:

1) Implement it on a nice fast micro, probably a Stellaris Launchpad.
2) Get it working in water with new transducers
3) If possible, at some point, maybe: Sidescan. :-)

Ideas for improvement? Did you find it useful? I'd appreciate a comment if so. Thanks!

Code follows:

import pyaudio
import wave
import struct
import math
import pylab
import scipy
import numpy
import matplotlib
import os
import scipy.io.wavfile
import threading
import datetime
import time
from Tkinter import *

class SonarDisplay:
     #Displays greyscale lines to visualize the computed echoes from the sonar pings. Time and distance increase as you go from left to right. It's like a fishfinder rotated on its side.

     def __init__(self):
        self.SCREENHEIGHT = 1000
        self.SCREENWIDTH = 1024
        self.SCREENBLOCKSIZE = 2
        master = Tk()
        self.w = Canvas(master, width=self.SCREENWIDTH, height=self.SCREENHEIGHT, background="black")
        self.w.pack()

     def getWidth(self):
       return self.SCREENWIDTH

     def getHeight(self):
       return self.SCREENHEIGHT

     def getBlockSize(self):
       return self.SCREENBLOCKSIZE

     def plotLine(self, result, y):
        x = 0
        numJunkSamples = 250
        intensityLowerThreshold = .26 #peaks lower than this don't get plotted, since they are probably just background noise
 
        #on my machine, the first couple hundred samples are super noisy, so I strip them off
        #these are the first samples immediately after the peak that results from the mic hearing the ping being sent
        #Until the echo is clear of these (about 3 feet in air) it is hard to pull out of the noise, so that is minimum range
 
        result = result[numJunkSamples:]
        #only plot as many values as will fit across the screen
        limit = self.SCREENWIDTH / self.SCREENBLOCKSIZE
 
        if limit > len(result):
          limit = len(result)
 
        for a in range(0, limit):
          intensity = 0
          if (result[a] > intensityLowerThreshold):
             intensity = result[a] * 255.0
          if (intensity > 255):
               intensity = 255
          if (intensity < 0):
               intensity = 0
     
          rgb = intensity, intensity, intensity
          Hex = '#%02x%02x%02x' % rgb
          self.w.create_rectangle(x, y, x + self.SCREENBLOCKSIZE, y + self.SCREENBLOCKSIZE, fill=str(Hex), outline=str(Hex))
          x = x + self.SCREENBLOCKSIZE    
        self.w.update()

class Sonar:

     #Mic initialization and audio recording code was taken from this example on StackOverflow: http://stackoverflow.com/questions/4160175/detect-tap-with-pyaudio-from-live-mic
     #Playback code based on Pyaudio documentation

     def callback(self, in_data, frame_count, time_info, status):
       data = self.wf.readframes(frame_count)
       return (data, pyaudio.paContinue)

     def __init__(self):
        self.FORMAT = pyaudio.paInt16 
        self.SHORT_NORMALIZE = (1.0/32768.0)
        CHANNELS = 2
        self.RATE = 44100  
        INPUT_BLOCK_TIME = .20
        self.INPUT_FRAMES_PER_BLOCK = int(self.RATE*INPUT_BLOCK_TIME)
        WAVFILE = "test.wav"

        print "Initializing sonar object..."
        #Load chirp wavefile
        self.wf = wave.open(WAVFILE, 'rb')        
        #init pyaudio
        self.pa = pyaudio.PyAudio()

        #identify mic device
        self.device_index = None            
        for i in range( self.pa.get_device_count() ):     
           devinfo = self.pa.get_device_info_by_index(i)   
           print( "Device %d: %s"%(i,devinfo["name"]) )

        for keyword in ["mic","input"]:
           if keyword in devinfo["name"].lower():
              print( "Found an input: device %d - %s"%(i,devinfo["name"]) )
              self.device_index = 1    # I selected a specific mic - I needed the USB mic. You can select an input device from the list that prints.
             
        if self.device_index == None:
           print( "No preferred input found; using default input device." )

        # open output stream using callback 
        self.stream = self.pa.open(format=self.pa.get_format_from_width(self.wf.getsampwidth()),
                channels=self.wf.getnchannels(),
                rate=self.wf.getframerate(),
                output=True,
                stream_callback=self.callback)
        
        notNormalized = []
        self.chirp = []

        #read in chirp wav file to correlate against
        srate, notNormalized = scipy.io.wavfile.read(WAVFILE)
       
        for sample in notNormalized:
           # sample is a signed short in +/- 32768. 
           # normalize it to 1.0
           n = sample * self.SHORT_NORMALIZE
           self.chirp.append(n)


     def ping(self):
        #send ping of sound

        #set up input stream
        self.istream = self.pa.open(   format = self.FORMAT,
                            channels = 1,  #The USB mic is only mono
                            rate = self.RATE,
                            input = True,
                            input_device_index = self.device_index,
                            frames_per_buffer = self.INPUT_FRAMES_PER_BLOCK)

           
        # start the stream 
        self.stream.start_stream()
    
        # wait for stream to finish 
        while self.stream.is_active():
            pass

        self.stream.stop_stream()
        #reset wave file for next ping
        self.wf.rewind()

     def listen(self):
       #record a short section of audio to capture the returned echo

       self.samples = []
       
       try:
            block = self.istream.read(self.INPUT_FRAMES_PER_BLOCK)
       except IOError as e:
            # Something bad happened during recording - close the stream and skip this cycle
            print( "Error recording: %s"%(e) )
            self.istream.close()
            return
       count = len(block)/2
       format = "%dh"%(count)
       shorts = struct.unpack( format, block )
       for sample in shorts:
           # sample is a signed short in +/- 32768. 
           # normalize it to 1.0
           n = sample * self.SHORT_NORMALIZE
           self.samples.append(n)

       self.istream.close()
       #Uncomment these lines to graph the samples. Useful for debugging.
       #matplotlib.pyplot.plot(self.samples)
       #matplotlib.pyplot.show()

     def correlate(self):
       
    #perform correlation by multiplying the signal by the chirp, then shifting over one sample and doing it again. Highest peaks correspond to best matches of original signal.
    #Highest peak will be when the mic picks up the speaker sending the pings. Then secondary peaks represent echoes.
    
       #my audio system has a significant delay between when you send audio and when it plays. As such, we send the wav, start recording, and get a large number of
       #samples before we hear the ping. That just slows correlation down, so we remove it. This probably requires tuning between different systems. Safest, but slowest, is zero.       

       junkThreshold = 5000  
       self.samples = self.samples[junkThreshold:]
 
       self.result = []

       for offset in range(0, len(self.samples)-len(self.chirp)):
           temp = 0
           for a in range(0, len(self.chirp)):
               temp = temp + (self.chirp[a] * self.samples[a + offset])
    
           self.result.append(temp)
        

     def clip(self):
       #highest peak is the primary pulse. We don't need the audio before that, or the chirp itself. Strip it + chirpLength off. Remaining highest peaks are echoes.
       largest = 0
       peak1 = 0        
       for c in range(len(self.result)):
            if (self.result[c] > largest):
               largest = self.result[c]
               peak1 = c
                 
       self.result = self.result[peak1:]
       return self.result


#main control code

#initialize sonar and display  
sonar = Sonar()
display = SonarDisplay()

screenHeight = display.getHeight()
screenWidth = display.getWidth()
screenBlockSize = display.getBlockSize()   #size of each row in pixels

#each ping results in an array of correlated values with peaks corresponding to the time that echoes were received.
#send pings until we have reached the bottom of the display. An improvement would be to add scrolling.
y = 0

for a in range(0,screenHeight-screenBlockSize,screenBlockSize):

 sonar.ping()
 sonar.listen()
 sonar.correlate()
 result = sonar.clip()     
 display.plotLine(result, y)  
 y = y + screenBlockSize
 
 

Thursday, December 19, 2013

Android Rover Block Diagram

I'm shelving the rover for a while to move on to a different project, and a coworker and his son are going to tinker with it for a while. I drew up some notes on the hardware and present them here in case they are useful to anyone.


Sunday, November 10, 2013

Android Rover Client Improvements


I've made a number of improvements to the rover's Android client over the last couple weeks. These were based on feedback from a friend who drove it.

We first set up my home router to pass connections from the internet on the rover's control ports to the rover on my home network. That enabled my friend to connect from his home and drive it around my house. Seeing him remotely control it from 40 miles away was great fun. He reported that the video worked well and the rover drove well, but the client had some issues with reliability and usability on his phone, which has a different screen resolution. He also recommended I switch to a horizontal orientation to maximize screen use, and let me tinker with his ARDrone, which has a pretty nice Android client.

I first corrected a reliability problem. Since the control device sends a command and then waits for a sensor value from the rover (currently the rover's signal strength), it's important to detect quickly when the rover is no longer responding. As on the server, I set a short timeout on the client socket, which causes an exception to be thrown if the rover doesn't respond within 2 seconds. Upon that exception, the client goes into a loop, trying to connect every 2 seconds. The rover is set to time out after a second, so it should recover and start listening again on the control port within the 2 second window the client waits.
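
The actual client is Android/Java, but the pattern is language-neutral. Here's a rough Python sketch of the same timeout-and-retry idea - the address, port, and command/reply strings are placeholders, not the rover's real protocol:

import socket
import time

ROVER_ADDR = ("192.168.2.104", 5000)   #placeholder address and port

while True:
    try:
        sock = socket.create_connection(ROVER_ADDR, timeout=2.0)   #2 second timeout on every operation
        try:
            while True:
                sock.sendall("forward\n")     #placeholder command string
                reply = sock.recv(1024)       #wait up to 2 seconds for the sensor reply
                if not reply:
                    break                     #rover closed the connection politely
                time.sleep(0.2)               #a few commands per second is plenty
        finally:
            sock.close()
    except (socket.timeout, socket.error):
        pass                                  #rover stopped responding or the network dropped
    time.sleep(2)                             #wait, then loop around and try to reconnect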

With that in place, even on a fairly flaky network connection, the client will do its best to reconnect to the rover. I also added an LED indicator and a signal strength indicator so you can tell when the rover is approaching the edge of its WiFi range.

These indicators are ImageViews which switch between different images. All the artwork is modified from images on Open Clipart, which is a very cool resource for those of us with no artistic talent.

My buddy also recommended that the app confirm presses of the exit button, and that it save the last IP address used so the user doesn't have to enter it each time. I did some reading and found the Android SharedPreferences feature, which is a dead easy way to save user preferences without messing with direct file IO.

At this point, it's pretty reliable. It's been driven over the Internet several times, including over a cellular 3G connection. Over high speed Internet connections at both ends, it's surprisingly smooth and responsive. Unsurprisingly, it noticeably lags over the cell network. A more efficient video encoding scheme, like H264, would undoubtedly improve that.

I'm pleased with the result. It's been a great way to learn some Android, and how to control something over a network.

In case it's helpful to someone, here's the code.


Sunday, October 20, 2013

Android Rover Control Client Up - First Drive



I was able to get the Android client up and running last night. It's a fairly simple affair, since I am definitely an Android GUI newbie, but it is working enough to drive around. Today I took it for my first drive outside my line of sight, driving strictly with the camera. Good fun!

First, I got a new laptop with a processor that has the right support for Intel HAXM, the hardware-accelerated emulator. That is a dramatic improvement - it boots fast and runs quickly enough to test the video streaming and network control code. I HIGHLY recommend it if your hardware supports it. It's worth the 20-30 minutes to set it up.

Right now, the client only has an ImageView for displaying the frames from the rover's camera, ImageButtons for the directional controls, and a SeekBar to set the motor speed. Additionally, there is a text box to enter the rover's IP address.

For momentary contact buttons, I used an OnTouchListener, with the direction being set with MotionEvent.ACTION_DOWN. The UP event calls a stop when you release the button.

I need to do a better job of handling rotation/app pause type events and make sure the socket code handles most events cleanly. I also intend to punch a couple holes in my home router so that some friends can try to drive it over the Internet.

Sunday, September 22, 2013

Android Rover Client: Dirt Simple Video Streaming, Part II

After a brief hiatus for summer in Ohio, some travel, and a whole lot of work, I've returned to the rover project and decided to write an Android client to drive the rover from. I started with trying to get the video feed working first.

You might recall that I'm using a very crude way of pulling video from the rover. The Android phone that controls the rover is running IP Webcam, and the client just pulls static JPEG images and displays them. This was very easy on the PC Java client, but there was a little bit of a challenge to overcome getting it to work on another Android phone.

The GUI components on Android run in a standard Activity. You can't do any time-consuming work in the GUI thread, or the OS will shut the app down to enforce a reasonable user experience. My initial thought was to launch a new thread that repeatedly downloaded the image and displayed it. I quickly found out that you cannot update an ImageView from any thread except the main GUI thread, so I started looking at other approaches.

I settled on AsyncTask, which is made for precisely this sort of thing. If you want to do a time-consuming background task that then interacts with the GUI, AsyncTask is a good place to start. It abstracts away the work of thread handling for you.

I started with an AsyncTask to download and display an image. I used a function that I found (reference given in the code sample below). This worked - it pulled a single image and displayed it. At that point I just needed to figure out how to wrap it in a loop, and I was good.

The trick to running AsyncTasks sequentially is to know that you can launch a function upon completion. I just had the AsyncTask call a launcher function to start another instance of itself as it completes.

If you try something like:

while (true)
     run_asynctask;

it won't work - it will try to launch them in parallel, which is not allowed. If you instead call the launcher function from the onPostExecute() of the AsyncTask, it runs sequentially. You can then add a conditional in your launcher to switch the feed on and off.

This code was tested and streams 320x240 JPEG frames from one phone to the other fairly smoothly, just like the PC client.

Code:

All this goes in the GUI thread: I start with a call from onCreate():

vidLoop();

Launcher function:

void vidLoop() 
    {
     if (connected == 1)
      {
      ImageDownloader id = new ImageDownloader();
      id.execute(vidURL); //vidURL is a String with the URL of the image you want
      }
    }



AsyncTask code. This downloads an image from a URL and displays it in an ImageView called imageView1, then calls vidLoop() upon completion to do it again.



//this very useful chunk of code is from http://www.peachpit.com/articles/article.aspx?p=1823692&seqNum=3
    private class ImageDownloader 
    extends AsyncTask<String, Integer, Bitmap> {
    protected void onPreExecute(){
            //Setup is done here
        }
        @Override
        protected Bitmap doInBackground(String... params) {
            //TODO Auto-generated method stub
            try{
                URL url = new URL(params[0]);
                HttpURLConnection httpCon = 
                (HttpURLConnection)url.openConnection();
                if(httpCon.getResponseCode() != 200)
                    throw new Exception("Failed to connect");
                InputStream is = httpCon.getInputStream();
                return BitmapFactory.decodeStream(is);
            }catch(Exception e){
                Log.e("Image","Failed to load image",e);
            }
            return null;
        } 
        protected void onProgressUpdate(Integer... params){
            //Update a progress bar here, or ignore it, it's up to you
        }
        protected void onPostExecute(Bitmap img){
            ImageView iv = (ImageView)findViewById(R.id.imageView1);
            if(iv!=null && img !=null){
                iv.setImageBitmap(img);
                //start next image grab
                vidLoop();
            }
        }
            protected void onCancelled(){
            }
        }

Thursday, July 25, 2013

Android Rover - Alpha Code

I have gotten a request for a copy of my code. As it stands, it's not pretty - it's been hacked on as I went, as opposed to a proper top-down design. The class design needs work. The network communication protocol is extremely wasteful - it's that way for ease of debugging, since I started out sending commands to the phone with a telnet client. There are plenty of improvements that I intend to make for robustness.

But it does work. You can drive around with it. So it might be of use to someone. As I make it better, I'll post updates.

The client program is pretty Spartan - it was my first stab at sending commands to the rover and getting sensor values back (signal strength, currently). The video window thread just repeatedly grabs static images from the phone running IP Webcam. You drive by enabling the motors, and then moving the mouse to the extreme edges of your screen.

The code that runs on the phone is an extension of the HelloIOIOService sample application that is provided for the IOIO.

Have fun. :-)

Android client program 

Android Service that runs on the robot and controls the IOIO


Recent lack o' progress...

A few weeks back, my Android build environment self destructed. Efforts to fix it on the Windows machine it was on were completely futile. No idea what happened - very frustrating.

Anyway. I set up the ADT on a Linux box and it's running well. I'm finishing up some code cleanup and I'll post it once completed. Then on to more sensors and maybe some navigation code.


Saturday, June 29, 2013

Video: The Rover In Action


A brief demo of the rover running around under WIFI control.

Link to higher quality Vimeo version: https://vimeo.com/69379853

Next step is some code cleanup on the client and server code so I can put it here. I hope it might be useful for someone.

Possibilities I am tossing around for next steps:

- Adding some sensors and allowing autonomous driving
- If I did decide to pursue the video preview frame some more, I could use the camera for some simple machine vision, and that would be cool.
- Adding GPS support and teaching it to drive to waypoints.
- An Android client so it can be driven from a phone instead of a laptop.

Saturday, June 22, 2013

Video streaming and services

Well, I found that I could get the camera preview callback function to work if I used it in an application instead of a service, but so far the approach to getting it to run from a service eludes me. I could make a separate video streaming app that runs as an application, but since IP Webcam already does a good job of that, I think I'll switch gears back to other functionality for a while.

I really like running the robot control code as a service. An application has to worry about screen orientation changes, screen blanking, and other events imposed upon it by the operating system that a service cheerfully ignores. I am not sure I want to give that up to implement my own streaming since I already have a working solution. Perhaps I'll come back to it later.

I am thinking of adding GPS support next.

Friday, June 21, 2013

Camera preview frames from a software service in Android

OK, so getting preview frames from a service in Android isn't as easy as I thought. It appears that it really doesn't want you getting preview frames unless they are being displayed visibly on a SurfaceView on the screen. I can't get my custom onPreviewFrame to fire. I'll drop back and see if I can get it to work with a simple non-service application - there is a very simple demo app that comes with the framework that I'll try next.

It must be possible - IP Webcam can run in the background - but it's not immediately obvious how.

Sunday, June 9, 2013

Integrated video streaming: Initial steps

Currently, the rover uses the excellent IPWebcam program for Android to serve video frames. The client just makes HTTP requests for an image in a tight loop. At low resolution (320x240) I'm getting 5-10 fps over my ancient WiFi. This works, and was easy to implement, but now that I've gotten the underlying architecture sorted out, and the rover driving well, I really want to integrate the video server. I expect this to be a challenge, but part of the point of the project is to learn about Android development. So here goes.

Yesterday, using examples on the web, I got a function working that opens the camera, takes a photo, and saves it to the SD card. The main challenge was doing this from a service, with no GUI - most examples are centered around giving the user a preview to aim with.

It turns out that the use of a preview surface is not optional - I tried using the camera API without one, and though it took pictures and returned byte arrays of varying sizes, the images were all black. Once I found a post describing how to set up a preview surface, it started working as expected.

My phone is set up to emit an audible shutter sound when you take a picture, and you can't turn it off. This is presumably for privacy reasons. I figured that since IP Webcam isn't emitting shutter sounds multiple times a second, it must be capturing the preview stream and converting it into images (which doesn't result in a shutter sound).

It appears that the previews are coming off the camera in YUV format, and each time a frame is available, it fires a callback function that you can define. It should just be a matter of converting the YUV image to a JPEG and then shoving it over the network.

For higher resolution, I'd need to investigate H.264 streaming, which I suspect is not trivial, so for now I am going to focus on the simpler approach.

Camera code: very alpha. This might come in handy for later to snap a high res picture of whatever the rover is looking at. I think I can use the camera object to turn on the flash LED to use as a headlight, too! :-) This code works from a service.

This code was heavily based on examples found on these sites and some others on StackOverflow:

http://p2p.wrox.com/book-professional-android-application-development-isbn-978-0-470-34471-2/72528-article-using-android-camera.html

http://handycodeworks.com/?p=19


private void takePicture()
 {
  
  Camera cam = Camera.open();
  Camera.Parameters parameters = cam.getParameters();
   
  parameters.set("jpeg-quality", 70);
  parameters.setPictureFormat(PixelFormat.JPEG);
  parameters.setPictureSize(320, 200);
  
  cam.setParameters(parameters);
  
  //You can't take a picture without mapping its preview to a surface. If you skip this, you get all-black images.
  SurfaceView view = new SurfaceView(this);
  try {
  cam.setPreviewDisplay(view.getHolder());
  cam.startPreview();
  
  } catch (IOException e) {
        }
  
  //give the startPreview time to complete, or you get a black image.
  try {
  Thread.sleep(1000);
  } catch (InterruptedException e) {}
  
  //this is what gets fired on the jpeg data when a picture gets taken below
  PictureCallback mPicture = new PictureCallback() {
         @Override
         public void onPictureTaken(byte[] data, Camera cam) {
          Long d = new Date().getTime();
          
             File pictureFile = new File("/mnt/sdcard-ext/DCIM/Camera/" + d.toString() + ".jpg");
             
             if (pictureFile == null) {
              Log.d(DEBUG_TAG, "Something bad happened while writing image file.");
                 return;
             }
             try {
              Log.d(DEBUG_TAG, "Byte array: " + data.length + " bytes");
                 FileOutputStream fos = new FileOutputStream(pictureFile);
                 fos.write(data);
                 fos.close();
             } catch (FileNotFoundException e) {

             } catch (IOException e) {
             }
         }
     };
     
     cam.takePicture(null, null, mPicture);
     
     try {
   Thread.sleep(2000);
   } catch (InterruptedException e) {}
     
     cam.release();
     
   
 }

Saturday, June 8, 2013

PWM Throttle Goodness

I realized that changing the motor controller input to a pin capable of PWM was as simple as flipping the connector that plugs into the motor board and then remapping it in software. It didn't even require lighting up the soldering iron. :-)

The throttle control works great at 1000 Hz PWM. It makes the rover capable of much finer control. Right now I'm driving around with the mouse, and it's a bit clunky. At some point I might order a USB joystick, now that the rover side is working well.


Now that basic control is working, I have the following goals:


  • Make a decent video showing the thing driving around.
  • Integrate the video streaming into the Android app that handles the IOIO, rather than using IP Webcam.
  • Improve power management.
  • Possibly write a client program for Android as well, so you can drive it from a phone.

Friday, June 7, 2013

PWM motor troubleshooting

Well, flush with my easy success of making the LED controlled with the PWM, I set about modifying the code to drive the motors that way. I changed the relevant digital IO pins to PWM outputs, set the duty cycle in each function that controls the motors, and.... nothing. Nothing at all. Yeah...

Some troubleshooting led me to the chart at https://github.com/ytai/ioio/wiki/Getting-To-Know-The-Board
that shows which pins can be used as PWM outputs. I made a basic error - I connected one of my 4 motor driver pins to pin 8, which is not PWM capable. The interesting part was that it didn't just disable the motor controlled by pin 8 - by trying to set that pin to be a PWM output in my program, it disabled ALL PWM functionality.

As soon as I commented out the references to pin 8, the other motors spun up just fine, and are nicely speed controlled. It's easily fixed - I'm going to swap the motor controller input on 8 to pin 11, and update the code. But it's late, and that can wait until tomorrow. :-)

IOIO PWM outputs... or a network dimmable LED!

The more I work with the IOIO, the more impressed I am with it. It's just a joy to code for - so well thought out and documented.

It turns out PWM is very easy. PWM (Pulse Width Modulation) refers to hardware on the IOIO that generates regular pulses of varying "on" durations. This can be used to simulate analog voltages, drive servos, or rapidly pulse the digital inputs on a motor controller to control motor speed. It's this last use that is my immediate goal.

In reading the IOIO wiki at https://github.com/ytai/ioio/wiki/PWM-Output I learned that the yellow status LED on my board is hooked to a PWM output. If you apply PWM to an LED, it dims as you change the duty cycle. Perfect. My little client program already has a JSlider component that has a value of 0-100 that's sent over the network to the rover as part of the command string. It was really easy to set up and use that value in my IOIO loop.

//initialize the pin as a PWM output
private PwmOutput led_;

//in the IOIO loop, set the duty cycle on that pin according to the value of the JSlider. Scale it to 0-1.
float dc = (float) (servoPanValue/100.0);
led_.setDutyCycle(dc); //0-1


This results in the LED being dimmed to varying brightness (off to 100%) as you move the slider. Too cool. Now I need to update my motor driver code to set up the pins as PWM outputs rather than pure digital IO to pulse the input pins on the motor controller board.

Sensor values and Metrics

Since my goal is to learn about controlling vehicles remotely over the network, I decided it was time to add some simple metrics to the client, and to return my first sensor value from the rover (aside from video).

I added some timing code to compute the video frame rate, the rate at which commands are being sent to the rover, and added a text display of the rover's current signal strength.

The control protocol is very simple. The client sends a string of commands, and the rover replies with a string of sensor values. Currently this only contains an integer representing signal strength, but will eventually include voltages from the IOIO, GPS data from the phone, orientation information from the accelerometers, etc.

Here's a quick snapshot of my little client program in action. Next step is to figure out how to use PWM on the IOIO board to control the rover's motor speed.




Thursday, May 30, 2013

It lives! First experimental drives...


I've gotten the board installed on the aluminum mounting plate, and all wired up. I've done some short experimental drives both outside and inside the house. My pug was not amused.

So far, it's driving well. I have had the rover shed a track once, but it has been otherwise reliable.

Right now it's very simple - the motor controller's inputs are driven from the IOIO PWM ports, but they are just being turned fully on or off. Now that I have a couple of successful test drives behind me, I have a number of improvements I want to make.

Lessons learned so far:

1) The Gravitech 3V->5V line level converter was less than satisfactory. I wired mine up according to the instructions, and connected the 3V outputs from the IOIO to the 3V inputs on the level converter board. No matter what I did, I only got 3V out the other side. I emailed their support account, but never got a response. This was disappointing. I found that the motor controller board seems to switch just fine at 3.3V logic, so I just eliminated the board from the design. 

2) Even with the very crude video streaming, there seems to be plenty of bandwidth even on my very old access point to stream and drive.

Now that I know the concept works, I have a lot of things I want to improve:

1) I need to set up my robot control app to start automatically when the phone boots to avoid a lengthy setup time when I want to drive it.

2) Variable speed, based on mouse location, is next. This will use the IOIO's PWM outputs to drive the PWM inputs on the motor controller board.

3) An Android app to drive the rover would be fun.

I'll also work to get some code and video posted. Here are some closeups of the deck and electronics mounting.







Saturday, April 6, 2013

Rover5 deck design and construction


The Rover5 doesn't have a top cover, but it has plastic posts designed to take self-tapping screws at each corner of the chassis. I purchased a plate of .090" 6061-T6 aluminum at a hobby shop and laid out the components. After cutting it to size with an angle grinder and cleaning up the edges on a belt sander, I rounded the corners off. I then drilled a 3/4" hole to pass the wiring from the motors and battery up through the deck. This was drilled with a step bit, a safe way to drill nice clean holes in sheet metal.


I laid out and drilled holes to attach the plate to the rover chassis, as well as holes for standoffs to mount the motor controller board and IOIO. I also drilled two 1/4" holes to mount the two power switches.



At this point, I ran out of time for the day. Next step is to mount the standoffs, mount the circuit boards, and test the motors. Right now I'm using 6xAA batteries in the holder that came with the Rover 5. Eventually I intend to replace that with a LiPoly pack, but it will work fine for testing.

Sunday, March 24, 2013

Hardware planning and initial testing

My rover base and motor controller arrived, and I'm quite pleased with them. The rover base is larger than I expected and very solid. I've gotten as far as hooking up the controller to the base and manually jumpering +5V to the control pins to get the treads to spin and confirm all four channels of the controller are ok. Everything looks good.

The motor controller has two inputs per motor - a pulse width modulation input and a direction input. Each motor thus requires two inputs.

However, two motors drive each tread, and it's a safe bet that the motors on each side will be running at the same time and in the same direction, so I plan on driving each side of the vehicle from the same two IO pins.

Because the motor controller expects 5V logic inputs, and the IOIO uses 3.3V logic, I opted to use one of the cool little logic level shifter boards from Gravitech to convert them. Mouser Electronics carries them, and I got it there. Odds are it would have switched at 3.3V, but in the interest of reliability I decided it was worth the effort. I did look at the shifters from Sparkfun, which are small and inexpensive, but this one is easier to mount to the chassis, so it won.

Next step is to add the few lines of code needed to the server program that runs on the phone to actually toggle the IOIO pins in response to client input, and verify I actually get 5V out the other side of the logic converter. At that point I'll be ready to test the motor control over the network. More to follow.

Saturday, March 2, 2013

Rover platform selection



I put a fair amount of research into the robot platform. A standard approach is to modify an R/C car, but I wanted something that could maneuver in tight spaces, be durable, and capable of climbing over modest obstacles.

I selected the Dagu Rover 5 since it appears to be pretty capable. The only negative I've heard is that it can occasionally shed tracks if heavily loaded, but the videos I saw of it in action looked perfect. It can also be converted to wheels later, if you prefer, so it's pretty flexible. It includes shaft encoders on all 4 motors, which can tell a microprocessor how much each shaft is turning. This would be very useful if I wanted to make the rover autonomous.

Driving the motors was an interesting challenge - each is rated at 2.5 amps stall current, which is a bit over the rating of the standard L293 controller boards. Dagu sells a $27 board specifically for this rover, with 4x 2.5 amp outputs and circuitry to read the shaft encoders. I eventually determined that's a pretty good deal and selected it.

Here's a demo video a fellow on YouTube shot of his Rover 5 buzzing around outside.




Sunday, February 17, 2013

WIFI Signal Strength in Android

This evening I did some significant cleanup on the Android code that will control the robot/rover. I also decided it would be nice to be able to tell if you were about to drive off past your WIFI range, so I looked at how it can be measured.

 I had added a WifiManager object earlier to lock the Wifi so that it would not be throttled back by the phone. It quite reasonably does this to save power, but you don't want that to happen if you are using the wireless link to drive around, so the application requests a lock at startup and releases it on shutdown.

 If you have a WifiManager, it's easy to ask it the current state of the link, and it will return a bunch of information including signal strength in dBm. If you want it to report "bars", it has a function to compute how many bars you are getting on whatever scale you prefer. Since I want a simple color-coded indicator on the client control panel, I just went for a 0-5 bar scale.

 Here's the relevant code for reading signal strength and requesting/releasing WIFI locks:

import android.net.wifi.WifiManager;
import android.net.wifi.WifiManager.WifiLock;
import android.net.wifi.WifiInfo;

//owned by the class

WifiManager wifiManager = null;
WifiLock lock = null;
Integer signalStrength = 0;

//Prevent Android from throttling the wifi back to save batteries
//(wifiManager is obtained elsewhere via getSystemService(Context.WIFI_SERVICE))
private void obtainWifiLock()
{
if (lock == null)
 lock = wifiManager.createWifiLock(WifiManager.WIFI_MODE_FULL, "roverWifiLock"); //create the lock if needed; the tag string is arbitrary
if (!lock.isHeld()) 
 lock.acquire(); 
}
 
private void releaseWifiLock()
{
if (lock != null) 
     if (lock.isHeld()) 
            lock.release(); 
}
 
 
private void updateWifiStats()
{
//currently just updates the WIFI signal level
Integer numLevels = 6;
  
WifiInfo currentInfo = null;
 
currentInfo = wifiManager.getConnectionInfo();
signalStrength = wifiManager.calculateSignalLevel(currentInfo.getRssi(), numLevels);
  
}

Dirt simple video streaming

As mentioned in the last post, I found that even a very crude Java client could pull images from the Android phone's IP Webcam application at 320x200 fast enough for acceptable video. Good video streaming is a whole complex subject in itself, but I was surprised that this produced functional results. Here's standalone test code that's about as simple a streaming viewer as you are likely to find.

A simple launcher class:
import java.awt.*;
import javax.swing.*;
import java.io.*;
import java.awt.event.*; 
import javax.imageio.*;
import java.awt.image.*;

import javax.swing.event.*;

import java.net.*;
import java.io.*;

public class launcher
{


public static void main(String[] args)
{
int count = 0;

MJpegViewer b = new MJpegViewer();

JFrame frame2 = new JFrame();
frame2.setSize(600,450);
frame2.setTitle("Video Feed");
frame2.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
frame2.add(b);
 
//center frame on screen
//frame2.setLocationRelativeTo(null);
 
//set absolute screen position
frame2.setLocation(250, 0);
frame2.setVisible(true);

Image image = null;

try {
    URL url = new URL("http://192.168.2.104:8080/shot.jpg"); 
      
 

while (count < 10000)
 {  
    image = ImageIO.read(url);
  
 BufferedImage buffered = (BufferedImage) image;
 b.setBufferedImage(buffered);
 count = count + 1;
 }

} catch (IOException e) {}

}

}

The extended image frame:
import java.awt.*;
import javax.swing.JComponent;
import java.util.Scanner;
import javax.swing.*;
import java.awt.*;
import java.io.*;
import java.awt.event.*;
import java.awt.geom.*;
import java.awt.image.*;
import javax.imageio.*;
import java.lang.Math;

public class MJpegViewer extends JComponent {

private static BufferedImage img = null;

public void setBufferedImage(BufferedImage newImg)
{
img = newImg;
repaint();
}



public void paintComponent(Graphics g) {
           
  //System.out.println("In paintComponent"); 
  g.drawImage(img, 0, 0, 600, 450, null);
      }


}

Saturday, February 9, 2013

Streaming Video From the Rover

While experimenting with the excellent IP Webcam video server for Android, I found that the standard web browser and programs like VLC were intolerant of disconnects of the WiFi. Each time it disconnected, it would require manual intervention to reconnect, so I started looking at what it would take to make a very simple viewer in my Java client program.

It turns out to be relatively easy, though there is plenty of room for improvement. I found that even on my little netbook, I could get smooth video at 320x200 from the Android server just by repeatedly pulling down a static JPG from the IP Webcam and shoving each image into a simple extended JComponent. I expected it to be jerky, but it worked surprisingly well.

Note: I first tried a JLabel, which is the simplest way to load a picture in Java that I'm aware of. It was VERY slow. Not recommended. :-)

While it would likely be a bit faster to read the MJPEG stream it can provide, this was much easier to get working, and should be fine for what I want to do with it.

Once I had it working in a test program, I added a dedicated thread to my client program to handle the video connection and display. So now the client program consists of a GUI thread, a data/command thread, and a video thread.

The video thread launches and goes into a nested loop. The outer loop is keyed to a flag that gets set when the user requests a connection to the robot. The inner loop grabs the images from the Android server. If the connection drops or an image retrieval error occurs, it breaks out of the inner loop, resets and continues retrying until the user requests a disconnect. As a result, like the command/sensor thread, it automatically reconnects after the WiFi connects back up, and resumes the video feed.

The handling of the loss of the network connection has been consistently the hardest part of this so far, but that nested loop approach has worked well in both threads to automatically re-establish the connection.


Communications and Control - Initial Planning

I've built some simple robots, mostly of the "drive around and use IR or ultrasound to not run into things" variety. Once I had a basic back-and-forth network socket program working, it was time to think about overall program flow.

A traditional robotics paradigm is the "Sense, Think, Act" cycle. A robot takes input from its sensors, performs processing on it to try to identify the best course of action, and then commands the robot's actuators to do something. The process then repeats.


At the moment, I'm not building a robot in the typical sense. That's because a human is in the loop, making the decisions based on sensor input. I wanted to make sure that the platform could be used as a robot, just by changing the software on the phone, but right now I'm interested in building a reasonably robust remotely operated vehicle. I'll continue to use "robot" because it's convenient. :-)

On reflection, I decided that a remotely operated vehicle can follow the same sense-think-act cycle. The primary difference is that the thinking is done off-vehicle, by the human operator.


With that understanding, I started thinking about how the communications will work. I intend to use the excellent IPCam program to stream the video from the phone. It works great, is robust, can run as a service, and can auto-start when the phone boots up. 

The rest of the program will run in a program based on the IOIO service example, described in a previous post. Thus, the video stream will be separate from the command and sensor stream.


I've run a test with the IOIO service and IPCam, and found that streaming video and sending sensor/command data back and forth at the same time works fine. I just used a browser for the video stream, and my little Java program for the sensor/command data. 

I did find that neither the browser nor VLC will attempt to reconnect to the video server on the phone if the connection drops. I may decide to integrate a simple MJPEG viewer into the Java client to make it reconnect automatically, as the command/sensor connection does. Doing so would also be a good step towards allowing control from a phone or tablet, rather than a laptop. 







Sunday, January 27, 2013

Remotely Operated Rover - Software Design and Networking

Once I had the IOIO working from the sample application, I started looking into how to control it across the network. Fairly early on, I needed to decide if the phone was going to act as the server, or the client. I figured there were advantages each way:

- If the phone was the server, it would be pretty easy to allow it to accept connections  from a client anywhere on the internet. My home router could pass the connection through to the port the phone was listening on, and you'd be off and running.

- If the phone was the client, it would be easier to make work over the 3G/4G cell network. It would reach out using whatever network connection it had (cell, wifi, either one) and connect to a fixed server, which would control it. It would be difficult or impossible to use the cell network if the phone was running the server, since the carriers almost certainly have firewalls in place.

I eventually settled on option one: the phone would be the server. A client would connect to it, receive sensor data, and issue commands to the phone to use actuators on the rover.

Next came a client, and control protocols. I wanted something fast - sensor data and control commands should be sent promptly to give a smooth driving experience. That ruled out a simple CGI/webserver sort of arrangement. A web browser could act as a client only if something like Ajax was used to stream the commands. It would be slick to do that - a client computer likely already has a web browser. But I decided against it on several grounds:

- I don't know anything about Ajax. Like, nothing at all.
- It would complicate writing the server
- I wanted something as fast as possible, and I figured I could write a tighter custom protocol that would require less parsing

So I settled on a simple socket server running on the phone, and tested it with a Telnet client on Linux. I figured a proper client program could come later.

With that decision made, I started looking at socket programming in Android and Java. I had never done sockets before, so it was a good opportunity to learn some client/server stuff. It became apparent that I would need to break my network communications off into another thread to avoid blocking the server when it was waiting for input. If you happen to be using a GUI, this is particularly important since your GUI will stop responding during the blocking operation. Android will actually clobber your app if it blocks the GUI thread for more than a fraction of a second to ensure a good user experience.

So I decided on three threads:

1) The "master" thread, based on the HelloIOIOService example code
2) The IOIO looper thread
3) The network communication thread.

Threads in Java are actually pretty easy to spawn off, and child threads can access variables and functions of the parent. I am fairly new to Java, but it's not difficult using examples on StackOverflow and tutorial sites. It was fairly easy to spawn a thread, open a socket, listen on a port, and then exchange data with the master thread.

The tricky part was to make it work reliably. I wanted a continuous stream of sensor data and commands flowing between the client and server, a few times a second. So several issues you run into are:

1) You need to properly handle a client politely disconnecting and wait for new connections.
2) You need to handle a client just vanishing, rudely, detect it, and set up for when the client comes back.
3) You need to update the master thread so it can tell the IOIO looper what to do.
4) You need to properly handle the network dropping out from under the whole mess.

The first two proved challenging. I ran into a problem where the client could connect, and begin bouncing data back and forth, and run for several minutes. It would then crash. I eventually figured out that the Droid X2 has a weird problem - when running version 2.3.5, it will disconnect from my WiFi every 5 minutes or so, for about 5 seconds. I flashed it with a new 2.3.5 ROM called Eclipse, which is very nice. The problem persists, however. I'm convinced that it's an issue in the 2.3.5 kernel, which I can't change because of the Droid's encrypted boot loader. Thanks, Motorola.

On reflection, though, I realized this was actually a great way to make sure my code was robust - both the client and server needed to detect when the wireless connection dropped out and recover, gracefully and quickly. That's working fairly well now - both ends usually detect it and they reconnect after a few seconds. This would be a real problem for an aerial or underwater vehicle, but a ground rover can just stop and wait for the connection to come back up. Either way, it's excellent practice for coding on the unreliable internet.
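
For anyone curious what that recovery looks like structurally, here's a rough Python sketch of the server side of the pattern (the real code is Java running in the Android service, and the port number here is made up): an outer loop waits for a client, an inner loop exchanges data with a timeout, and any error just drops back out to accept() again.

import socket

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("", 5000))              #made-up port number
listener.listen(1)

while True:                            #outer loop: wait for a client to (re)connect
    client, addr = listener.accept()
    client.settimeout(1.0)             #if the client vanishes, recv() raises after 1 second
    try:
        while True:                    #inner loop: exchange commands and sensor data
            command = client.recv(1024)
            if not command:
                break                  #client disconnected politely
            client.sendall("sensor data\n")   #placeholder reply
    except (socket.timeout, socket.error):
        pass                           #client vanished rudely, or the WiFi dropped out
    client.close()                     #either way, go back and wait for the next connection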

Once the code is done, I'll post it for anyone looking to do something similar, but it's not as reliable as I would like at the moment.

Remotely operated rover - Android and IOIO Basics

When I first started the project, I had never written an Android program before. I've built simple Atmel-AVR and Basic Stamp robots, and usually the first thing you do is make sure you can blink an LED from the microprocessor - it's "Hello, world" for micros. First, I needed to get the compiler up and running for Android, and understand something about how Android works.

I started with the excellent Android dev tutorials and worked through the first few programs. I then followed the tutorial at Sparkfun to get the IOIO sample projects working, and tinkered with them a bit. I found that the IOIO worked great on my Galaxy Nexus and Kyros 7127. My older Droid X2 needs the USB connection set to "charge only" to work. I plan on using the Droid for the rover. I am pretty impressed with the design of the development kit for Android, and extremely impressed with the IOIO. A great deal of thought has gone into making it easy to program for and interface things with.

 I first experimented with the Hello IOIO application, which runs a GUI application and a thread to talk to the IOIO. It worked fine, and is a great way to test the IOIO on your phone. The Hello application lets you turn an LED on the IOIO on and off from the phone.



 The sample programs also include a sample service, which runs in the background and only communicates with the user via status messages. I decided this was the way to go for controlling a rover. Android apps have a life cycle that you must track to deal with incoming calls, screen rotation, and other events. Those events actually tear down the program and restart it, and your program has to deal with that. This is not ideal for a realtime control application.

The service makes it a little trickier to provide feedback and debug, but you can use the logging feature to get what you need. A robot or rover will be talking to the user over the network anyway, right? :-)

 The next step was to decide how to handle the networking aspects, which proved to be very interesting. That will be the topic of the next post.

Remotely operated rover with Android and IOIO - Goals

Hello, interested reader. I have benefited so much from other people's blog posts, both in my hobbies and my professional life, that I decided it was high time I contributed a little. I hope that someone finds it useful.

I've been interested in remotely operated vehicles for some time -  projects like those on DIY Drones, OpenROV, and plenty of others are absolutely fascinating to me. I like the idea of being able to send a machine into inaccessible areas for science or exploration. I decided to focus on a network-driven rover at first, with the intent of learning what it takes to drive something over a network.

My intent is to document that process here. I've seen similar projects on the web, but I've never seen the design process documented - the tradeoffs, the software architecture, the naive assumptions that proved wrong. So that's what I'll try to do.

The idea was to send commands from a computer or tablet to an Android phone, which would interface to the very cool IOIO board. Android phones have the advantage of packing a lot of processing power into a small, low power package, along with numerous sensors. That makes it a pretty attractive platform for a remotely operated vehicle (ROV) or robot.

I wanted to be able to send back sensor data from the rover, such as video, voltage levels, accelerometers, GPS data, etc. and display them on a simple console. So on the network, the command traffic would look like:

rover sends current sensor data to console
console sends back commands (turn on motors, etc)
repeat

Video would be handled on a separate connection.

This seems simple. If you've ever done socket and thread programming, it probably is. If you've never done that, or written a program for Android, it's not so simple, and it's an excellent learning experience. It's also a ton of fun - getting an LED to turn on in response to a command over the network for the first time is seriously cool.

With that basic overview, future posts will detail what I'm learning as I go, what works, what didn't, etc.