Saturday, October 31, 2015

Raspberry Pi Wifi Rover on Dagu Rover 5 Chassis


This is the second revision of my Wifi controlled rover. The first used an old Android phone and an IOIO board as a way to learn some Android programming and figure out how to control a vehicle over the network. It worked pretty well - a friend drove it across the internet from 40 miles away, and I learned a lot.

I did figure out that I didn't like Android for a robotics platform, since it is so highly optimized for GUI use. Keeping a program running at high priority in the background on an Android device isn't trivial, because it's not designed for that. You have to assume your program can be interrupted and resumed at any time. I decided that even though the phone came with all sorts of cool sensors and was extremely compact, I wanted the control over what was happening that comes with Linux.

My intent is to use this rover as a testbed for systems I will eventually install in an underwater remotely operated vehicle. It's also a lot of fun to drive. I wanted to document the overall design here - it has been done lots and lots of times, but a detailed writeup might be useful to someone.

So here we go. This is intended to show you one possible way to do it, and the thought processes that went into the design. It can all be improved.


Dagu Rover 5 chassis. This thing is pretty awesome, but mine occasionally sheds a track, and I haven't been able to fix it. I understand this was addressed in revisions later than mine.

A motor controller capable of handling the current of all four motors. The stall current on each motor is 2.5A, and there are four of them, so plan for roughly 10A worst case.

A Raspberry Pi B+, camera module, and WiFi dongle. I'll probably upgrade the dongle to something with a real antenna soon, as range is limited.

A 2200 mAh LiPo battery from my quadcopter, providing 12.6V. LiPo batteries are a significant fire hazard if not handled properly. The rover currently has no method to automatically kill power when pack voltage drops - consider a safer battery chemistry, like NiMH, if you are unfamiliar with the risks inherent in big unprotected LiPo packs. An excellent guide is here.

At the very least, a low voltage alarm like those commonly used in RC aircraft is a must.

A battery eliminator circuit like those used in RC aircraft to generate a nice steady 5 volts from the 12.6V LiPo pack to power the Raspberry Pi and motor board.

Network Design

The rover starts up an access point and also starts two servers on the Raspberry Pi. Each listens on a different port. Full details on the software configuration are below.

An Android device connects to the access point and is issued an IP address. The IP address of the rover is fixed - it's acting just like the router for your home internet. This makes it very easy to take the rover somewhere and run it with no additional infrastructure. However, it makes it harder to run over the internet. If that's your goal, it's better to just connect your rover to an existing WiFi network so that traffic can be routed to it from anywhere. I may add a switch later that allows me to flip between these modes.

My ROV will be designed to be taken to places with no infrastructure, and I wanted to be able to easily take the rover to show friends, so I chose to make the rover the access point.

I currently have the rover configured to act as an access point and hand out IP addresses in the 192.168.42.x range with no DNS or default gateway. The rover itself is on a fixed address in that subnet.

Software Design

A traditional robotics paradigm is the "Sense, Think, Act" cycle. A robot takes input from its sensors, processes that input to identify the best course of action, and then commands its actuators to do something. The process then repeats.

We're not building a robot in the typical sense. That's because a human is in the loop, making the decisions based on sensor input. I wanted to make sure that the platform could be used as a robot, just by changing the software on the server, but right now I'm interested in building a reasonably robust remotely operated vehicle rather than something autonomous.

On reflection, I decided that a remotely operated vehicle can follow the same sense-think-act cycle. The primary difference is that the thinking is done off-vehicle, by the human operator.

I wanted to be able to send back sensor data from the rover, such as video, voltage levels, accelerometers, GPS data, etc. and display them on a simple console. So on the network, the command traffic would look like:

rover sends current sensor data to console
console sends back commands (turn on motors, etc)

Video would be handled on a separate connection, on its own thread.

Currently, I'm not sending back any data from sensors. I will detail plans for that in the "Next Steps" below.

The server sets the appropriate IO pins, which drives the motor controller board. My rover has 4 motors, each controlled by a direction line and an enable line.

If the timeout value is exceeded, the server shuts down the motors, resets, and waits for another connection.

The Python program at the end of this post implements this. Sending the full string defining the direction is horrendously inefficient - in the next revision of the client program I'll reduce that to, say, a single character. I originally did it this way to aid in debugging the client, and never got around to fixing it.
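
As a sketch of what that tighter encoding might look like (this mapping is hypothetical - it is not what the current client sends):

# Hypothetical compact encoding: one character for direction, plus the throttle.
DIRECTION_CODES = {"forward": "f", "reverse": "b",
                   "rotateLeft": "l", "rotateRight": "r", "stop": "s"}

def encode(motorEnabled, direction, throttle):
    # e.g. encode(True, "forward", 75) -> "1f75"
    return ("1" if motorEnabled else "0") + DIRECTION_CODES[direction] + str(throttle)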

Client Program

The client I wrote is fairly simple. It rapidly makes HTTP requests to get an updated JPEG image from the rover, and updates the screen. A separate thread sends commands and gets a fake sensor value back. It attempts to reconnect when the connection is lost.

Doing the video this way is crude and eats a lot of network bandwidth compared to something like H.264, but it's easy to implement and actually works pretty well at 320x240 and 640x480.
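
For reference, a stripped-down version of that client loop might look something like this. The rover address and command rate are placeholders, the snapshot URL is mjpg-streamer's standard single-frame endpoint, and the display and joystick handling are omitted:

#!/usr/bin/env python
# Minimal console sketch: poll JPEG frames from mjpg-streamer and send drive
# commands to the control server on port 6000.
import socket
import threading
import time
import urllib2

ROVER = "192.168.42.1"   # placeholder - use the rover's fixed address
VIDEO_URL = "http://%s:6001/?action=snapshot" % ROVER
COMMAND_PORT = 6000

def commandLoop():
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((ROVER, COMMAND_PORT))
    while True:
        # motorEnabled, direction, throttle - same format the server parses
        s.send("true,forward,50")
        sensorValue = s.recv(4096)   # fake sensor value echoed back
        print "sensor:", sensorValue.strip()
        time.sleep(0.1)              # command rate is arbitrary here

t = threading.Thread(target=commandLoop)
t.daemon = True
t.start()

while True:
    jpeg = urllib2.urlopen(VIDEO_URL).read()  # one JPEG frame per request
    open("frame.jpg", "wb").write(jpeg)       # a real client would draw this on screen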

Low(er) Lag Video Streaming on the Raspberry Pi

There are a number of tutorials for using the raspistill command to grab a still frame and shove it across the network via a couple of methods. These work well for a stream that can tolerate some lag, but it results in a delay of up to a second and the framerates are low. This is due to an inherent delay in raspistill - it's not designed for that.

I got much better results using the Video4Linux (V4L) driver and MJPG-Streamer. It took some doing - you first have to compile the V4L driver. Good instructions are available here and here.

I ran into a problem getting mine to compile. I got an error, "undefined reference to symbol 'clock_gettime'". The solution was found here.

A great tutorial for compiling mjpg-streamer is here.

While compiling mjpg-streamer, I ran into a kernel-specific problem with kernel version 3.18. The solution was found here.

I use this command to launch the video server on port 6001.

/usr/local/bin/mjpg_streamer -i "/usr/local/lib/input_uvc.so -n -f 15 -q 80 -r 320x240" -o "/usr/local/lib/output_http.so -p 6001 -w /usr/local/www"

Access Point Configuration

One way to turn your Raspberry Pi into an access point is to use hostapd and dhcpd.

The Edimax WiFi dongle is not supported by the stock hostapd binary that you get with apt-get install. Dave Conroy has figured out how to make it work - he has a great document describing the process here (starts at the Prerequisites section). I used that to get it working, along with some of the configuration options described in Adafruit's tutorial. My dhcpd.conf file and hostapd.conf file are below.


# Sample configuration file for ISC dhcpd for Debian

# The ddns-updates-style parameter controls whether or not the server will
# attempt to do a DNS update when a lease is confirmed. We default to the
# behavior of the version 2 packages ('none', since DHCP v2 didn't
# have support for DDNS.)
ddns-update-style none;

# option definitions common to all supported networks...
#option domain-name "";
#option domain-name-servers,;

default-lease-time 600;
max-lease-time 7200;

# If this DHCP server is the official DHCP server for the local
# network, the authoritative directive should be uncommented.

# Use this to send dhcp log messages to a different log file (you also
# have to hack syslog.conf to complete the redirection).
log-facility local7;

subnet 192.168.42.0 netmask 255.255.255.0 {
 range 192.168.42.10 192.168.42.50;
 option broadcast-address 192.168.42.255;
 option routers 192.168.42.1;
 default-lease-time 600;
 max-lease-time 7200;
 option domain-name "local";
 option domain-name-servers 8.8.8.8, 8.8.4.4;
}
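
The hostapd.conf follows the pattern from those two guides. A minimal version is sketched below - the SSID, channel, and passphrase are placeholders you will want to change, and the driver line assumes the custom Realtek hostapd build from Dave Conroy's instructions:

interface=wlan0
# driver name comes from the custom Realtek hostapd build
driver=rtl871xdrv
# SSID, channel, and passphrase are placeholders
ssid=Rover
hw_mode=g
channel=6
#  this enables the 802.11n speeds and capabilities
ieee80211n=1
wmm_enabled=1
ht_capab=[HT40][SHORT-GI-20][DSSS_CCK-40]
macaddr_acl=0
auth_algs=1
wpa=2
wpa_passphrase=ChangeMe
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP
rsn_pairwise=CCMP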


The following commands are in a small script, /home/pi/startAP, that starts the DHCP server and hostapd:

sudo service isc-dhcp-server start
sudo hostapd /etc/hostapd/hostapd.conf &

Automatic startup

There are a number of ways to do this, but I decided the simplest way was to make small scripts to start each subsystem and then launch them from /etc/rc.local. I appended these commands to /etc/rc.local:

/home/pi/startAP &
/home/pi/ &
/home/pi/startVidServer &

Next Steps

I intend to add an Arduino that can communicate via USB to gather sensor data such as pack voltage. Ideally, the Arduino could control power to the Raspberry Pi to allow a complete shutdown. Even if you shut down the Raspberry Pi via a shutdown -h, it will still draw significant power while halted. That's not good - you need to be able to kill power when the pack is dead. I intend to design this and test it prior to using it in the ROV.
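
On the Raspberry Pi side, reading that data could be as simple as polling the Arduino's serial port with pyserial. Nothing here is built yet - the device name and message format below are assumptions:

# Sketch only: read pack voltage lines like "V:12.34" from an Arduino over USB serial.
import serial  # pyserial

ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # typical Arduino device name
while True:
    line = ser.readline().strip()
    if line.startswith("V:"):
        packVoltage = float(line[2:])
        print "pack voltage:", packVoltage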

It needs some big honkin' bright lights. Just because.

A Sharp IR sensor or ultrasonic range finder would be cool to have and would allow for simple autonomous behavior, as well as being useful to a human operator.

Control server code

#!/usr/bin/env python

## Control server for the rover. Listens on port 6000 for comma separated
## commands from the console.
## example command set: true,stop,75
## fields: motorEnabled, direction, throttle (0-100)

import socket
import sys
import traceback
import time
import syslog
import RPi.GPIO as GPIO

## pin numbers are BCM GPIO numbers
## pin 18 = PWM enable line to the motor controller
## pin 22 = back left, true = reverse
## pin 23 = front left, true = reverse
## pin 24 = back right, true = reverse
## pin 27 = front right, true = reverse
GPIO.setmode(GPIO.BCM)
GPIO.setup(18, GPIO.OUT)
GPIO.setup(23, GPIO.OUT)
GPIO.setup(24, GPIO.OUT)
GPIO.setup(22, GPIO.OUT)
GPIO.setup(27, GPIO.OUT)

pwm = GPIO.PWM(18, 1000)  # 1 kHz PWM on the enable line
pwm.start(0)

LEFT_PINS = (22, 23)
RIGHT_PINS = (24, 27)

def setMotors(direction, throttle):
    ## Set the direction lines and the PWM duty cycle. The true = reverse
    ## sense depends on how the motors are wired - swap the tuples above
    ## if the rover turns the wrong way.
    if direction == "forward":
        left, right = False, False
    elif direction == "reverse":
        left, right = True, True
    elif direction == "rotateRight":
        left, right = False, True
    elif direction == "rotateLeft":
        left, right = True, False
    else:  # "stop" or anything unrecognized
        throttle = 0
        left, right = False, False
    for pin in LEFT_PINS:
        GPIO.output(pin, left)
    for pin in RIGHT_PINS:
        GPIO.output(pin, right)
    pwm.ChangeDutyCycle(throttle)

syslog.syslog('Rover: Server starting....')

host = ''       # listen on all interfaces
port = 6000
backlog = 1
size = 4096

while 1:
    syslog.syslog("Rover: Waiting for connection...")
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((host, port))
    s.listen(backlog)

    client, address = s.accept()
    client.settimeout(2.0)  # stop the motors if commands stop arriving
    syslog.syslog("Rover: Got client connection...")

    count = 0
    while 1:
        try:
            clientReq = client.recv(size)
        except socket.timeout:
            syslog.syslog("Rover: Command timeout - stopping motors.")
            break
        except socket.error:
            syslog.syslog("Rover: Socket error")
            break

        if clientReq == "":
            syslog.syslog("Rover: Connection broken.")
            break

        parsedCommands = clientReq.split(',')
        ## parsedCommands[0] is motorEnabled
        ## parsedCommands[1] is direction
        ## parsedCommands[2] is integer 0-100 representing throttle

        if parsedCommands[0].strip() == "true":
            setMotors(parsedCommands[1].strip(), int(parsedCommands[2]))
        else:
            setMotors("stop", 0)

        response = str(5) + "\n"  # placeholder sensor value for now
        client.send(response)
        count = count + 1

    ## connection lost or timed out: stop the motors, reset, and wait again
    setMotors("stop", 0)
    syslog.syslog("Rover: Shutting down server socket.")
    client.close()
    s.close()

Saturday, June 27, 2015

Sonar Development: Towards Something That Works In Water

Bathymetric survey data from a NOAA ship.

After a few months of distractions to prepare for a new kid, build a quadcopter, and work a bunch, I'm trying to get my sonar project moving forward. As documented in previous articles, I've got a simple digital sonar working in air. It was a simple way to test the echo detection algorithms. I'm convinced if I can figure out a way to use piezo transducers to transmit sound in the water, I can make it work.

Previous installments:

Audible Frequency Chirp Sonar on the Stellaris Launchpad

Initial Experiments - Sonar in air with a conferencing speaker mic and Python

So the next challenge is how to mount the piezo element and efficiently couple the sound to the surrounding water. One way to do this appears to be to pot the transducer in a potting compound that closely matches the density of water.

This article from NOAA on building hydrophones for listening to whales details one way to pot a transducer, and also includes a high gain amplifier circuit. My current plan is to build one, and figure out how to get the ADC on the Launchpad reading the audio. From there, I can make a transmit circuit. The challenges will likely be in acoustic coupling, transducer selection, and getting enough power into the water to travel a reasonable distance.

I considered using piezo disks, but I found that getting any sort of output from them at all requires them being mounted at either their edges or nodal points in a resonant cavity known as a Helmholtz chamber. I don't think I can manufacture one to the precision needed for the small size. I'm going to work first with cylindrical piezo units as used in the hydrophone above.

I intend to try one with a resonant frequency in the audible range - that's not going to result in very good resolution, but should be easier to debug since I can hear it and use PC audio equipment to measure it. Once that works, I'll switch to higher frequencies.

The next step is to build a functioning hydrophone with a piezo element, get it working with the op-amp, and get that feeding into the ADC of the Launchpad. That will complete the receiver side, and test the methods of acoustically coupling the transducer to the water.

Sunday, June 14, 2015

F450 Quadcopter Mods: Walkera G-2D Gimbal and Xiaomi Yi Camera

I built my F450 with aerial video in mind. Once I got it flying, it was time to select a camera and gimbal. 

The camera needs to be able to record at high framerates to reduce the "jello" effect of rolling shutter. If you try to strap a cheap keychain camera to the frame of your quad, it is very likely that the result will be a garbled mess of distortion. This is because the CMOS sensors in those cameras scan each frame into memory over a small period of time. Vibration causes the frame to move as it is being captured. 

Additionally, even if you get the vibration under control, the rapid movements in all directions as the quad flies around will make you ill. It's not a lot of fun to watch. 

The solution is a camera that can record at 60 fps and a motorized gimbal to compensate for the motion of the quadcopter and keep the camera level. There are gimbals that use servo motors, but the best use brushless motors, which are quiet and smooth. They nearly instantly compensate for the motion in pitch and roll that occurs from pilot inputs and wind gusts.

I selected the Xiaomi Yi camera. This has the same imaging sensor as a GoPro without some of the frills, and is much less expensive. They are currently available on Amazon Prime for $88. They don't come with a case, or even a lens cap. The Android version of the app looks rather untrustworthy - it is currently distributed from a file sharing site I normally associate with pirated software, rather than from the company's website. "Here! Run this random APK from the Internet on your phone! It will be fine!"

Yeah. I dug out an old phone that doesn't have access to any of my important stuff and used that. I used the app to set up the video mode (60 fps at 1080p) and timelapse mode (still frame every 3 seconds). You can toggle between these modes with the camera's button - you really only need the app once.

I also ordered a Walkera G-2D 2-axis gimbal. This only compensates for pitch and roll, but uncommanded yaw motions don't seem to be much of a problem. I am extremely pleased with this gimbal for the money. It has an onboard regulator, so you can run it straight off your 3S LiPo pack. I connected it to my main power line on the quad and it fired right up. It supports the use of auxiliary channels on your receiver to aim it in roll and/or pitch, but it doesn't require it - you can set the tilt and roll angle with a couple of trim pots and leave it alone, and it requires no connection to your receiver. It even comes with a small tool to adjust the pots with and the needed Allen keys. It worked right out of the box, and bolted directly onto the lower frame of the F450, aligning nicely with the slots on the lower frame. I secured it with 4 bolts.

One note: the gimbal is not designed for the Xiaomi Yi and the existing mount doesn't fit. I found that the frame could easily be removed, a 1/4" cardboard shim cut to level off the mounting plate, and a large zip tie easily secures the camera to the gimbal. There is probably a more dignified way, but that works just fine.

I am really pleased with this combo. I am still seeing some vibration in the video that I want to eliminate, but it's by far the best video I've gotten from an RC model so far. More to come on the vibration problem as I work it out. (Update on how to fix this below)

Here are a couple of still frames of a local park, shot in timelapse mode.

And some video....

Video Test Flight 3 - Xiaomi Yi and Walkera G-2D Gimbal on F450 from Jason Bowling on Vimeo.

Update on the vibration problem, and a note about the camera:

1) The vibration was improved by replacing the vibration dampeners that came with the gimbal with more rigid ones from HobbyKing. The dampeners that it comes with are too soft.

2) Additional improvements were made by inserting soft foam earplugs into all four vibration dampeners.

3) The lens rectification function on the Xiaomi Yi makes the edges of the video very blurry. Once I fixed the vibration, the edges were still bad. I turned the lens rectification off, and it's much better. Here's a test flight with these improvements.

TestFlightNoFisheyeCompensation from Jason Bowling on Vimeo.

Saturday, June 6, 2015

F450 Quadcopter Build and Flight Testing

This is my F450 quadcopter. There are many like it, but this one is mine.

I have a lot of experience with RC airplanes, but I'm new to quadcopters, so I want to document the build in case it is useful to others. I learned on the excellent and incredibly affordable Syma X1, which is serious fun for the money and a perfect trainer when flying indoors. I put a number of flights on an ARDrone, until it went berserk and parked itself in a very tall tree. At that point, I decided something with a real, proper RC system was in order.

A few abbreviations:

ESC - electronic speed control. Converts control inputs from you (through the flight controller) into a throttle output to one of the motors.

FC - Flight controller - a small microprocessor board with gyros and accelerometers that stabilize your quadcopter in flight. It handles the mechanics of keeping the machine in the air by making small adjustments to the motor power many times a second, and turns your stick input into your desired motion.

BEC - battery eliminator circuit. Steps down the main flight pack's 12.6 volts to the 5V the receiver and flight controller need. A regulator.

3S - a 3 cell lithium polymer battery.

Here's what I selected for parts:

A combo containing the frame, motors, ESCs, and propellers. 

This contained:

1 x F450 frame kit
1 x F450 Landing gear( 4pcs/set)
4 x Sunnysky X2212 980KV Brushless motor
4 x HP SimonK 30A Speed Controller
2 x 1045(CW+CCW) Black Propeller
2 x 1045(CW+CCW) Red Propeller

Knowing what I do now, I'd not have bought this as a kit. I would have bought the components individually. More on that later - live and learn.

KK2.1 Flight Controller (FC)

FlySky FS-T6 radio system

Turnigy 2200 mah 3S Lipo pack

That covered the obvious stuff. Then as I examined the kit I determined I needed some less obvious stuff.

Heavy silicone wire to connect the quad's power distribution to the battery

XT60 connectors. You need at least one on the end of those nice wires you just bought. The other end gets soldered to the power input on the frame's power distribution system.

A set of 5 male to male JR style servo connectors. These go between the flight controller and the receiver outputs.

A 5V switching regulator, because I don't trust the linear regulators on the ESCs.

A dedicated low voltage alarm. I never got the low voltage cutoff on the flight controller to work right. This one works great. You need one or the other, since quadcopter ESCs don't have a low voltage cutoff like airplanes do. Set to 10.8V, you have 30-60 seconds to get it on the ground before you lose power.

A pack of 3S balancing wires, to connect the battery to the low voltage alarm.

Whew. OK. Once you have the stuff, building it is actually quite easy. You need a higher power soldering tool - I used a soldering gun - since you need to solder heavy wires to the copper traces on the frame. There is an extremely helpful build video from Legend RC here:

Other very useful links if you are new to quadcopters:

Identifying the props and their locations

Connectors and Plugs for Quadcopter Newbies

A great guide to quadcopter wiring. This goes over how to connect the various boards.

Be sure to read the KK2.1 manual section on powering the board carefully. I chose to cut the red wire from all 4 ESCs that connects to the FC motor outputs and power it with a dedicated switching Battery Eliminator Circuit (BEC). The switching regulator runs cooler and more efficiently than the linear regulators on the ESCs.

One of my ESCs was dead on arrival. I didn't find it until the kit was 90% built. I couldn't return the whole kit, and even returning the dead ESC to China would have been a serious pain. I tracked down the same part on Amazon and bought a replacement, along with a spare. This is a serious drawback to buying the kit.

After very carefully checking propeller rotation direction, as well as making sure the correct prop was on the correct motor, I did a quick test flight, and was surprised to find that it flew fine with stock settings on the KK 2.1. I did make some PID adjustments, but it was quite controllable.

There were bugs to work out. My KK 2.1's low voltage alarm, set to 10.8V, would howl continuously in flight, and cease on landing. I never figured out why. I turned it off and installed a dedicated low voltage alarm, listed above, and it works superbly. 

On the first few flights, I had trouble with split-second instances where the motors would just STOP. All at once, for a fraction of a second. It would fall abruptly, and then recover, unless I happened to be low. I first blamed the linear regulator on my BEC. I tested with a dedicated receiver pack, did a quick test flight, and presto, it was fixed. Victory! I installed a nice dedicated switching BEC, went flying, and SMACK, it fell out of the sky again. It finally dawned on me to range test it. On the ground, with the motors spinning just above idle, I started walking backwards. At 40 feet or so, the receiver light blinked out. A few steps forward, it came back on.

Argh. Radio trouble. Gambled. Ordered new receiver. Got lucky - that fixed the problem. No way to return cheap dead receiver, at least not economically, so into the trash it went and I ate the $15. But it passed a range test and works fine farther than I can see the quadcopter.

ALWAYS RANGE CHECK YOUR MODELS. I have known this for years, and got lazy, and it bit me.

Several more flights, and a new problem cropped up. Propeller blades started randomly separating from the hubs. Once in flight, causing a crash from 30 feet, and once on takeoff, narrowly missing me. The cheap plastic props that came with the kit are absolute garbage - to the point that they are dangerous. Into the trash they went. I ordered some 10x4.5 carbon fiber props, which are absolutely superb. My flight time immediately improved from 7 minutes to 9. I'm not sure if they would fail before the bones in my finger would, so... respect.

One final note about propellers - they aren't perfectly balanced from the factory. Take the time to balance them - mine flew much more smoothly and quietly than before they were balanced. My video quality dramatically improved too, since it eliminated the jello/rolling shutter artifacts I was getting.

I borrowed a friend's Dubro prop balancer and used scotch tape on the back of the blades to balance them. Went surprisingly quickly. I intend to buy a balancer and add it to the periodic maintenance list. I never bothered with planes, but it matters a lot for multicopters.

I now have perhaps two dozen flights, and the bugs are worked out. It is a reliable machine, climbs well, and has plenty of lifting power. I printed a camera mount for an ancient Canon point and shoot camera and it hauled it around just fine - all 1/2 lb of it. I have since upgraded camera and added a gimbal - more on that soon.

Knowing what I know now, I would not have bought the kit - I would have bought the same components, with decent propellors. That way, if I got a bad speed control, I could return it, rather than the entire kit.  Other than that, I am pretty pleased with it. 

Saturday, February 21, 2015

GT2 Belt Drive Conversion of Printrbot Simple (Wood late 2013 model)

One of the defining characteristics of the 2013/early 2014 versions of the Printrbot Simple was the use of Kevlar fishing line for the motion transfer on the X/Y axis. A rubber hose gets superglued to the stepper shaft, a Dremel sanding wheel gets glued to that, and the fishing line gets several wraps around it. It kept cost down (the original was just shy of $300 in kit form) and it works surprisingly well. Mine has held up for quite a lot of printing over the 13 months I have had it running.

However, it did have a couple of disadvantages. It required tightening every now and then. It can result in a loss of precision, because the fishing line can walk back and forth on the drum. And frankly - it's just not very dignified looking. Here is a view of the X axis drive with the bed removed.

Thanks to the work of Thingiverse contributor iamjonlawrence, there is a printable conversion to GT2 belts for both the X and Y axes. Newer Simple models come with belts, though they cost more than the original Simple kit did.

Y axis
X axis

Jon is a mechanical engineer, and it shows in his hobby work. He has released a number of upgrades for various versions of the Printrbot Simple, accompanied by professional drawings and detailed bills of material. The parts in these kits were very well thought out and fit perfectly. I highly recommend his work.

I printed two sets of the parts - I was concerned that I would have my printer torn apart, and if I messed up a part I would not be able to print replacements. This turned out to not be necessary, but I still think it is worth doing.

I made sure my printer was calibrated well before printing the parts. The tolerances are snug, but if your printer is printing accurately it will fit with only minor brushes with a file to remove burrs or other loose material from the print.

In addition to the McMaster Carr part numbers called out on Jon's BOM, I used the following components from Amazon:

808 Bearings
Belts and pulleys (there is plenty of belt for this conversion - I had enough left over to replace one of the belts if I ever need to)

Note that FDM printers tend to print holes and slots slightly small. I calibrated mine to accurately print outside dimensions, and just drill my holes to the right size. With the hardware specified on the BOM, a 7/64" bit will make the hole sized nicely for the screw to thread into. A 1/8" bit will allow the screw to pass through smoothly.

You'll need to recalibrate the X and Y axis since the pulleys are slightly larger than the original drums.

Also, have extra zip ties handy, you'll need them to put things back together.

Procedure - Y Axis

First, I clipped the zip ties holding the Y axis carriage to the motion rods. I then removed the stepper.

Next, I installed the new motor plate and bearings, and aligned the pulley.

I fed the belt in and checked motion, and secured one end of the belt to the stop.

The stop gets belted on.

The second stop gets attached. 

Securing the belt to the tension block is a little tricky. I had to remove material from the slot that the belt passes through to let it pass through twice. The drawing shows clearly that the belt should just fit through the slot when folded back on itself. A short length of filament acts as a pin to hold the belt in place. There are detailed shots of this in the X axis section.

Tension is adjusted by turning the screws in the tension block. At this point, I connected to the printer and tested the motion. All looked good, so I moved on to the X axis.

Procedure - X Axis

The X axis is more involved because you have to install a replacement motor mount plate for the X axis stepper. This is not a trivial process, but it went pretty smoothly.

First, the bed is removed and the X carriage is removed by clipping the zip ties holding it in place.

The side opposite the control board is removed.

The bottom plate can now be pulled free and the X axis motor mount is removed.

The new bearing plate is assembled.

Carefully align the pulley with the bearings. They are a snug fit, but they do fit, and don't allow any slop when assembled.

Install the new motor plate and reassemble.

The new belt ends are held in place with the zip ties securing the carriage to the motion rods. As the drawings call out, the carriage is flipped over and a new hole drilled for the X axis end stop screw.

Here is a detail shot of how the belt tension blocks work. I had to open the slots a bit with an exacto knife, just enough to pass the belt when it is folded back on itself.

Test it! I had to remove a little material from the carriage to get it to run smoothly, just rounding over an edge.

I had to modify my back clips that hold my heated bed on. I just bent and cut office clips into a Z-shape. Details on the heated bed installation are here.

Initial test prints look really good. I am in the process of recalibrating the X and Y motion in the Printrboard, since the pulleys are slightly larger than the original sanding drums. I will post the final values once they are determined.

Update: X and Y values for M92 are both just a hair above 80. I have mine printing to within .001" on a 2.000 inch square test model.

In Repetier Host, the GCode commands can be entered into the GCode command box on the manual tab, shown below. Enter the command you want to run and hit the Send button.

A good overview of the math is available on this excellent blog entry by Zheng3.
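
For reference, the expected value is easy to sanity-check. Assuming 20-tooth GT2 pulleys (2 mm pitch), a 200 step/rev motor, and the Printrboard's 1/16 microstepping, steps per mm = (200 x 16) / (20 x 2) = 80, which is why the calibrated numbers land just above 80. The new values are set and saved with commands along these lines (80.2 is a placeholder - use your own measured values):

M92 X80.2 Y80.2
M500

M500 writes the settings to the Printrboard's EEPROM so they survive a power cycle, assuming EEPROM support is enabled in the firmware.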

Sunday, January 18, 2015

Audible Frequency Chirp Sonar with the Stellaris Launchpad

Over the last year I've been working towards an underwater sonar system for ROVs and surface boats. In order to learn the basic signal processing required to detect the echoes, I initially got a simple sonar working in air with a desktop conferencing USB speaker/mic running on Windows. A writeup, including source, is here. That article describes the algorithms used in detail and would be a good read if you want the details of how this works.

The next logical step seemed to be to get it working on a microcontroller. There are plenty of low cost ultrasonic sonar modules available that work really well in air, but the idea was to work towards getting a sonar that worked in water. There are currently no low cost sonar modules for hobby use in water. Additionally, the low cost modules only give one echo - with a signal processing approach like this, you get a series of echoes that may convey more information about the environment. As an example, a boat floating above a school of fish could detect both the fish and the bottom.

I selected a Stellaris Launchpad because of the high speed analog to digital converters (ADC) and the 32 KB of RAM. At the required sample rates, the Launchpad has just enough RAM to send a chirp, and then record a fraction of a second of audio so that the echoes can be determined. Higher frequency sound will require a higher sampling rate, so I may need to switch to a Teensy 3.1, which has 64 KB of RAM.
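
To put numbers on that: the receive buffer in the code below is 6000 samples, which at 80,000 samples per second is 75 ms of audio - enough for a round trip of roughly 0.075 s x 343 m/s / 2, or about 13 meters in air - and at 4 bytes per sample it already consumes about 24 KB of that RAM.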

A chirp waveform is computed and sent to a small piezo speaker driven by a simple transistor circuit. The piezo supply voltage (VCC in the diagram below) is provided by 3 nine-volt batteries in series to obtain 27V. This diagram shows how it is connected. This is not my diagram - I found it online, but I don't have a reference. If this is yours, please drop me a line.

The return echo is detected by a small amplified microphone from Adafruit. I like this module because it has an integrated level shift. Rather than swinging from -V to +V, the output is shifted to 0 to +3.3V so that it can be connected to an ADC. It's very convenient.

A couple 3D printed parts hold it all to the board just to keep it pointed in the right direction.

The chirp is sent, and the audio immediately starts recording to catch the echo. The same correlation function as used in the previous article is used to pull the echoes out of the recorded audio. The intensities of the correlation function are sent through the debug port to the PC so that it can be plotted. 
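
Conceptually, the detection step is just a sliding dot product of the stored chirp against the recorded buffer, which is what the processSample() loop in the listing below does. On the PC it is a one-liner with numpy; this small sketch uses a synthetic echo just to show the idea:

# Matched-filter echo detection sketch (PC side). The transmitted chirp is
# correlated against the recorded audio; peaks in the result mark echo delays.
import numpy as np

fs = 80000.0                              # sample rate, Hz (matches the Launchpad code)
t = np.arange(0, 0.0015, 1.0 / fs)        # 1.5 ms pulse
chirp = np.sign(np.sin(2 * np.pi * (5000 + (8000 - 5000) * t / t[-1] / 2) * t))

recording = np.zeros(6000)
recording[2000:2000 + len(chirp)] = 0.2 * chirp   # fake echo 2000 samples after transmit

intensity = np.correlate(recording, chirp, mode="valid")
echoSample = np.argmax(intensity)
print "echo delay: %.1f ms, range about %.2f m" % (
    echoSample / fs * 1000.0, echoSample / fs * 343.0 / 2.0)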

I need to work on optimizing the echo detection code - currently it spends 4 seconds or so processing the audio from each pulse. Also, the power output of the audio transducer is very low, so range is pretty limited. It has an effective range of roughly 3 to 9 feet. Closer than 3 feet, the echo is hard to pick out of the noise produced when the pulse is sent.

As in the original experiments with the speaker/mic, the results are plotted with a simple Python program set up similarly to a fishfinder display. The results of a test run are shown below. Source for the Python display is modified from code from the previous article. 

Source code for the Launchpad is given below.

Next steps are to work on getting transducers working under water and increasing transmit power. I've made a simple hydrophone to test transducers with - update coming soon.

Audible Frequency Chirp Sonar with the Stellaris Launchpad from Jason Bowling on Vimeo.

#include "inc/hw_ints.h"
#include "inc/hw_memmap.h"
#include "inc/hw_types.h"
#include "driverlib/sysctl.h"
#include "driverlib/interrupt.h"
#include "driverlib/gpio.h"
#include "driverlib/timer.h"
#include "driverlib/debug.h"
#include "driverlib/fpu.h"
#include "driverlib/pin_map.h"
#include "driverlib/rom.h"
#include "utils/uartstdio.h"
#include "driverlib/adc.h"
#include "inc/hw_timer.h"
#include "inc/hw_ints.h"

#define numSamples 6000 //size of receive buffer
#define sampleRate 80000 //sample rate at which the audio for sending and receiving is performed
#define pulseLength .0015  //transmitted pulse duration in seconds

#define chirpStartFreq 5000  //in Hz
#define chirpEndFreq 8000  //in Hz

int chirpLength = 0;

 unsigned long g_sampleCounter = 0;
 unsigned long ulADC0_Value[1];
 unsigned long rxBuffer[numSamples];
 int pulse[900]; //stores waveform for sending and comparison. Only need integers for square wave. Could do with bits to save memory
// must be at least pulseLength * sampleRate

 //double output[501];

void initConsole()
  // Initialize the UART at 115200.
     //ROM_GPIOPinTypeUART(9600, GPIO_PIN_0 | GPIO_PIN_1);
     UARTprintf("\nConsole Initialized. System clock is %4d\n", SysCtlClockGet());


void initADC()
   // The ADC0 peripheral must be enabled for use.

         // For this example ADC0 is used with AIN0 on port E7.


         // Select the analog ADC function for these pins.


         // Enable sample sequence 3 with a processor signal trigger.  Sequence 3
         // will do a single sample when the processor sends a signal to start the
         // conversion.
         ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_PROCESSOR, 0);

         // Configure step 0 on sequence 3.  Sample channel 0 (ADC_CTL_CH0) in
         // single-ended mode (default) and configure the interrupt flag
         // (ADC_CTL_IE) to be set when the sample is done.  Tell the ADC logic
         // that this is the last conversion on sequence 3 (ADC_CTL_END).  Sequence
         // 3 has only one programmable step.

         ADCSequenceStepConfigure(ADC0_BASE, 3, 0, ADC_CTL_CH0 | ADC_CTL_IE |
                                  ADC_CTL_END);

         // Since sample sequence 3 is now configured, it must be enabled.
         ADCSequenceEnable(ADC0_BASE, 3);

         // Clear the interrupt status flag.  This is done to make sure the
         // interrupt flag is cleared before we sample.
         ADCIntClear(ADC0_BASE, 3);


 //configure 32 bit periodic timer
  TimerConfigure(TIMER0_BASE, TIMER_CFG_32_BIT_PER);

void startTimer()
 unsigned long ulPeriod;

 //set timer rate
   ulPeriod = (SysCtlClockGet()/(sampleRate*3));
   TimerLoadSet(TIMER0_BASE, TIMER_A, ulPeriod -1);

   TimerEnable(TIMER0_BASE, TIMER_A);

void stopTimer()

         // Disable the Timer0A interrupt.

         // Turn off Timer0A interrupt.
         TimerIntDisable(TIMER0_BASE, TIMER_TIMA_TIMEOUT);

         // Clear any pending interrupt flag.
         TimerIntClear(TIMER0_BASE, TIMER_TIMA_TIMEOUT);

void initLED()
 //enable GPIO pins for LED

void initPiezo()
 //enable GPIO pins for piezo

void generateChirpWaveform()
unsigned long int freq = chirpStartFreq;
int value = 1;
unsigned long int start = 0;
unsigned long int stop = 0;
unsigned long int counter = 0;
int count = 0; //temp
int sampleComplete = 0;

//step through array from 0 to 1/2*freq, setting value. Invert value. Proceed to 1/2*freq, setting value. Calculate new freq. Repeat.
//values stored in pulse[]

while (!sampleComplete)

stop = (int) start + ((1.00 / (freq * 2.00)) * sampleRate);

for (counter = start; counter < stop; counter ++)
 { //check position and set sampleComplete when at end of chirp
 if (counter < pulseLength * sampleRate)
  pulse[counter] = value;
 else
  sampleComplete = 1;
 }
//invert waveform value to be set for next half of cycle
if (value == 1)
 value = 0;
else
 value = 1;

//calculate new freq based on position in pulse. Ratio of stop/chirpLength vs freq increment / chirpEndFreq
freq = chirpStartFreq + (((chirpEndFreq- chirpStartFreq) * stop)/(pulseLength * sampleRate));

//position for writing next half cycle
start = stop;
chirpLength += 1;
} //end while


void playChirp()
long int count = 0;
long int endSample;

endSample = sampleRate * pulseLength;

 while (count < endSample)
  if (pulse[count])

  SysCtlDelay((SysCtlClockGet() / (sampleRate * 3)));
  count ++;


void ftoa(float f, char *buf)
{
 //minimal float to ASCII: integer part plus two decimal places (non-negative input)
 int pos = 0, div = 10000, started = 0;
 int whole = (int) f;
 int frac = (int) ((f - whole) * 100.0f);
 for (; div > 0; div /= 10)
  if (whole / div || started || div == 1)
   { buf[pos++] = '0' + (whole / div) % 10; started = 1; }
 buf[pos++] = '.';
 buf[pos++] = '0' + frac / 10;
 buf[pos++] = '0' + frac % 10;
 buf[pos] = '\0';
}

void processSample()
int a = 0;
int bufferStartPosition = 0;

double normalizedSample = 0.0;
double windowSum = 0.00; //cumulative sum for this window
double temp = 0.0;

char buffer[20] , *str;
str = buffer;

//audio values in rxBuffer are shifted integers. Normalized audio is -1 to 1. Recorded samples are 0 to 4096
//Divide by 4096 and subtract .5 to shift to this range.

//stored pulse is stored 0 to 1. Multiply by 2 and subtract 1 to normalize.

while (bufferStartPosition < ( numSamples - chirpLength))
for (a = 0; a < chirpLength; a++)

 normalizedSample = (rxBuffer[bufferStartPosition + a]/4096.0) - .5;
 //temp = normalizedSample * ((pulse[a] * 1.00));
 temp = normalizedSample * ((pulse[a] * 2.00) - 1.00);
 windowSum = windowSum + (normalizedSample * temp);


bufferStartPosition += 1; //increment bufferStartPosition to move window
windowSum = 0.00;
//end outer loop


int main(void)
//initialization complete

 int pingCount = 0;

 while(pingCount < 1000)
  //record audio
  pingCount = pingCount + 1;

 while (1) {}


void Timer0IntHandler(void)
 // Clear the timer interrupt

 // Trigger the ADC conversion, Wait for conversion to be completed.
 ADCProcessorTrigger(ADC0_BASE, 3);

 //everything after this can be moved out of the ISR
 //set a flag and poll for it in main()
 while(!ADCIntStatus(ADC0_BASE, 3, false))

 //take an ADC reading
 ADCIntClear(ADC0_BASE, 3);
 ADCSequenceDataGet(ADC0_BASE, 3, ulADC0_Value);

 if (g_sampleCounter < numSamples)
 {
  rxBuffer[g_sampleCounter] = ulADC0_Value[0];
  g_sampleCounter = g_sampleCounter + 1;
 }
 else
  g_sampleCounter = 0; //buffer full



Saturday, December 27, 2014

Pebble Smartwatch Review

I toyed with buying a Pebble for several months before I actually pulled the trigger. I bought it to solve a specific problem, and then discovered it solved some other problems I had not actively been working on a solution for. It also has a few drawbacks. Here's what I've learned after using it for a couple of months.

I keep my phone silenced at work, so that it does not interrupt meetings. However, I sometimes miss the silent alerts - I just don't feel the vibration when a call or text comes in. Pretty much the only people who call or text me during the work day are family, and if they do, they need to reach me. Even when I do feel the alert, I consider it bad form to fish my phone out of my pocket to see what it is - it's not very polite at best, and can give the impression you are ignoring a boss or customer at worst.

I knew the Pebble could alert me to incoming texts and phone calls. That, alone, was worth the gamble. The price dropped to $99 after the Android Wear watches came out, and I decided it was time to give it a shot. I spoke with a couple people who had the Android Wear watches, and although they are impressive technology, I was put off by the high price and very short battery life. A coworker indicated they don't even last a full day, and I didn't want that. So I bought the standard Pebble.

The good...

  • The battery life is outstanding. It will run for an entire week, the way I use it.
  • The alerting works very well. The watch vibrates against your wrist and it's hard to miss. It's quiet and subtle.
  • You can configure which alerts you want. I have mine set to only alert on texts and phone calls, but not things like Facebook notifications. That way, if my phone buzzes but the watch does not, I know it can be ignored until a convenient time.
  • It's waterproof.
  • The application on the phone is a convenient way to load apps and watch faces. 

The not so good...

  • The display driver appears to have some bugs. Sometimes it will start to have distortion or speckles across the display that range from annoying to completely obscuring the content. A restart normally fixes it and it doesn't come back for a while. UPDATE: This is a known hardware issue, caused by intermittent connection between the LCD and mainboard. A search on "pebble screen tearing" brings up lots of results that indicate the fix is to contact support and RMA the watch. I intend to do this.

  • The display darkens significantly in cold air, less than 30 deg F. 
  • The stock wrist band on the low end version is made of some sort of rubber, and it is clammy against the skin if it gets damp. It can easily be replaced, but it stands out against the rest of the device.
  • The pedometer function is wildly optimistic. If I zero it and then hop in the car and drive for 45 minutes, I will have logged 2000 steps by the time I get out of the car. I compared it to an actual pedometer and it was about 50% high by the end of the day.
  • Instead of pulling out your phone, which makes you appear distracted, you now tend to glance at your watch, giving the impression you are in a hurry.

The completely unexpectedly useful....

  • Having a silent alarm clock that wakes you by gently pulsing your wrist is extremely handy if you want to wake at a different time than your partner.
  • The app Sleep As Android integrates seamlessly with the Pebble and tracks your sleep by tracking your movements. If you tell it when you need to be up, along with an acceptable time window, it will watch your sleep and nudge you awake when you are in the shallowest part of your sleep cycle. I find I awake more refreshed and alert. 

  • Additionally, I really like being able to see who is calling and texting if I can't easily get to my phone. Examples are cold weather, or with the phone in a dry box while kayaking.
  • The ability to show different data on the watchface, including time zones or weather, is pretty handy.
  • The music playback controls are pretty cool, and work well with Google Music. It does not appear to work with Amazon Music.

It's not as dramatic a change as, say, the laptop or smartphone, but it's inexpensive and handy. I never miss important calls or texts anymore, and the sleep tracking and data displays are modest time savers. Overall, I am quite pleased with it.