Sunday, June 9, 2013

Integrated video streaming: Initial steps

Currently, the rover uses the excellent IP Webcam program for Android to serve video frames. The client just makes HTTP requests for an image in a tight loop. At low resolution (320x240) I'm getting 5-10 fps over my ancient WiFi. This works, and was easy to implement, but now that I've gotten the underlying architecture sorted out and the rover driving well, I really want to integrate the video server. I expect this to be a challenge, but part of the point of the project is to learn about Android development. So here goes.
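For reference, the client side really is that simple. Here's a minimal sketch of the tight-loop fetch, assuming IP Webcam's single-frame endpoint path (/shot.jpg) and a made-up host and port - adjust for your network:

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class FrameFetcher {
    // Build the still-frame URL. The /shot.jpg path is an assumption
    // based on IP Webcam's single-image endpoint.
    static String frameUrl(String host, int port) {
        return "http://" + host + ":" + port + "/shot.jpg";
    }

    // Fetch one frame as raw JPEG bytes.
    static byte[] fetchFrame(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream();
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        } finally {
            conn.disconnect();
        }
    }

    public static void main(String[] args) throws Exception {
        String url = frameUrl("192.168.1.10", 8080); // hypothetical rover address
        // Tight loop: each fetch is one video frame.
        // while (true) { byte[] jpeg = fetchFrame(url); /* display it */ }
    }
}
```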

Yesterday, using examples on the web, I got a function working that opens the camera, takes a photo, and saves it to the SD card. The main challenge was doing this from a service, with no GUI - most examples are centered around giving the user a preview to aim with.

It turns out that the use of a preview surface is not optional - I tried using the camera API without one, and though it took pictures and returned byte arrays of varying sizes, the images were all black. Once I found a post describing how to set up a preview surface, it started working as expected.

My phone is set up to emit an audible shutter sound when you take a picture, and you can't turn it off. This is presumably for privacy reasons. I figured that since IP Webcam isn't emitting shutter sounds multiple times a second, it must be capturing the preview stream and converting it into images (which doesn't result in a shutter sound).

It appears that the previews are coming off the camera in YUV format, and each time a frame is available, it fires a callback function that you can define. It should just be a matter of converting the YUV image to a JPEG and then shoving it over the network.
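I haven't written that part yet, but for the record, the default preview format (NV21) packs a full-resolution Y plane followed by interleaved V/U samples at quarter resolution, and Android's YuvImage class can do the YUV-to-JPEG compression directly. The per-pixel math underneath looks roughly like this (illustration only, using the standard JPEG/JFIF full-range formulas - the real on-device path would just call YuvImage.compressToJpeg()):

```java
public class YuvMath {
    // Clamp a value into the 0..255 range.
    static int clamp(int v) {
        return v < 0 ? 0 : (v > 255 ? 255 : v);
    }

    // Convert one full-range YUV sample to a packed 0xRRGGBB int,
    // using the standard JPEG/JFIF conversion coefficients.
    static int yuvToRgb(int y, int u, int v) {
        int r = clamp((int) Math.round(y + 1.402 * (v - 128)));
        int g = clamp((int) Math.round(y - 0.344136 * (u - 128) - 0.714136 * (v - 128)));
        int b = clamp((int) Math.round(y + 1.772 * (u - 128)));
        return (r << 16) | (g << 8) | b;
    }

    // An NV21 buffer is width*height luma bytes plus half that again of chroma,
    // which is why the preview byte arrays are 1.5x the pixel count.
    static int nv21BufferSize(int width, int height) {
        return width * height * 3 / 2;
    }
}
```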

For higher resolution, I'd need to investigate H.264 streaming, which I suspect is not trivial, so for now I am going to focus on the simpler approach.

Camera code: very alpha. This might come in handy for later to snap a high res picture of whatever the rover is looking at. I think I can use the camera object to turn on the flash LED to use as a headlight, too! :-) This code works from a service.

This code was heavily based on examples found on StackOverflow and a few other sites:

private void takePicture() {
  Camera cam = Camera.open();
  Camera.Parameters parameters = cam.getParameters();
  parameters.set("jpeg-quality", 70);
  parameters.setPictureSize(320, 200);
  cam.setParameters(parameters);
  //You can't take a picture without mapping its preview to a surface.
  //If you do, you get all-black images.
  SurfaceView view = new SurfaceView(this);
  try {
      cam.setPreviewDisplay(view.getHolder());
  } catch (IOException e) {
      Log.d(DEBUG_TAG, "Failed to set preview display.");
      return;
  }
  cam.startPreview();
  //Give startPreview time to complete, or you get a black image.
  try {
      Thread.sleep(1000);
  } catch (InterruptedException e) {}
  //This is what gets fired on the jpeg data when a picture gets taken below.
  PictureCallback mPicture = new PictureCallback() {
      public void onPictureTaken(byte[] data, Camera camera) {
          Long d = new Date().getTime();
          File pictureFile = new File("/mnt/sdcard-ext/DCIM/Camera/" + d.toString() + ".jpg");
          try {
              Log.d(DEBUG_TAG, "Byte array: " + data.length + " bytes");
              FileOutputStream fos = new FileOutputStream(pictureFile);
              fos.write(data);
              fos.close();
          } catch (FileNotFoundException e) {
              Log.d(DEBUG_TAG, "Something bad happened while writing image file.");
          } catch (IOException e) {
              Log.d(DEBUG_TAG, "Something bad happened while writing image file.");
          }
          camera.release();
      }
  };
  cam.takePicture(null, null, mPicture);
  //Give the callback time to fire before returning.
  try {
      Thread.sleep(2000);
  } catch (InterruptedException e) {}
}
