Thursday, November 25, 2010

Ideas for a cheap Laser Rangefinder

The bounty for hacking the XV-11's $30 LIDAR, the teardown talk on Sparkfun, and the hacking thread on Trossen Robotics prodded me into doing some quick experiments.

I've been pondering laser-based object detection for a year now without actually doing anything about it, for both the Trinity-style firefighting competition and the Sparkfun AVC.

The XV-11 unit looks to me like a camera and a laser pointer spun by a small CD motor (details of the system in this white paper). Picture from Sparkfun below. How hard would it be to hack something like this together?



A while ago I bought a line-generating laser to identify obstacles. Powered by an LM317 regulator set to the required 3.0V, the laser generates a pretty nice line once you focus it.
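
(For reference, the LM317's output follows the usual formula Vout = 1.25 V × (1 + R2/R1). With R1 = 240 Ω, an R2 around 330 Ω lands at roughly 2.97 V, close enough to the laser's 3.0 V spec. Those resistor values are just an illustration, not necessarily what ended up on my board.)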



Based on ideas I found online, one places the laser at some point low on a robot, with the laser's axis and line parallel to the ground, while the camera is placed some distance above and pointed slightly down, so that parallax can be leveraged.

The reflected horizontal line appears lower in the camera image for nearby objects, and higher in the image for faraway objects.
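
Converting the line's row in the image into a range is just similar triangles. Here's a rough Arduino-style sketch of the math; the focal length, baseline, and image size below are made-up placeholders (and it ignores the camera's slight downward tilt), so treat it as an illustration rather than calibrated code:

    // Rough sketch of line-laser triangulation: convert the image row where the
    // laser line is found into a distance estimate. The constants below (focal
    // length in pixels, laser-to-camera separation, image center row) are
    // placeholders -- they'd come from calibrating the actual camera and mount.

    const float FOCAL_PX   = 160.0;   // focal length expressed in pixels (assumed)
    const float BASELINE_M = 0.10;    // vertical laser-to-camera separation, meters (assumed)
    const int   CENTER_ROW = 64;      // image row of the optical axis (128-row image assumed)

    // Returns estimated range in meters given the row where the laser line appears.
    // Rows below the optical axis (larger row numbers) mean closer objects.
    float rowToRange(int laserRow) {
      int offsetPx = laserRow - CENTER_ROW;     // pixels below the optical axis
      if (offsetPx <= 0) return -1.0;           // line at or above the axis: out of range
      // Similar triangles: range / baseline = focal / offset
      return BASELINE_M * FOCAL_PX / offsetPx;
    }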

Since I'm using Pokey's grayscale camera, rather than a color camera, I took pictures with and without a photographic red filter to see if that would help distinguish the laser reflection.

Laser line, red filter. Note height of line on cup, laser printer.

Laser line, no red filter

One could probably write code to detect the line even in a grayscale image, although this would be substantially easier with an IR laser and an IR filter, or with a color camera.
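
The detection itself could be as simple as scanning each column for its brightest pixel and recording the row where it lands, on the assumption that the laser really is the brightest thing in that column. A rough sketch along those lines; the frame dimensions and threshold are made up:

    // Rough sketch: find the laser line in a grayscale frame by taking, for each
    // column, the row of its brightest pixel. Frame dimensions and threshold are
    // placeholders, not the camera's real numbers.

    const int COLS = 128;
    const int ROWS = 96;
    const uint8_t MIN_BRIGHTNESS = 180;   // ignore columns with no strong reflection

    // frame[row][col] holds 8-bit grayscale pixels; laserRow[col] gets the result,
    // or -1 if nothing in that column looked like the laser.
    void findLaserLine(const uint8_t frame[ROWS][COLS], int laserRow[COLS]) {
      for (int c = 0; c < COLS; c++) {
        uint8_t best = 0;
        int bestRow = -1;
        for (int r = 0; r < ROWS; r++) {
          if (frame[r][c] > best) {
            best = frame[r][c];
            bestRow = r;
          }
        }
        laserRow[c] = (best >= MIN_BRIGHTNESS) ? bestRow : -1;
      }
    }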

Here's a color picture of the test area, my desk. Old HP laser printer to the left, McD's cup, front and center, etc.

As for the LIDAR concept: put a camera next to a laser pointer and align them to exploit horizontal parallax, so distance can be measured from the position of the dot in the image.

I was curious to see what a tiny red laser pointer dot would look like on the grayscale camera. The dot is pretty tiny at about 8' distance (see red arrows).

Distant dot (red arrow), no red filter

Distant dot, with red filter

Maybe too tiny to reliably detect, although with the LIDAR one can constrain the search area vertically and possibly horizontally.

And constrain the search algorithm to square and circular shapes.

Come to think of it, the code could correlate dot size with position for added robustness.

And the red filter actually seems to make the laser dot the brightest thing in the image, once the exposure and gain are set properly.

Nearby dot, no red filter

Nearby dot with red filter
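
Putting those constraints together, a dot search might just scan a band of rows where the dot can physically show up and take the brightest pixel in it. Another rough sketch, with made-up band limits and threshold:

    // Rough sketch: find the laser dot as the brightest pixel inside a vertical
    // band of the image. The band limits and threshold are placeholders; in a real
    // setup they'd come from the geometry of the laser/camera mount.

    const int COLS = 128;
    const int ROWS = 96;
    const int BAND_TOP    = 30;   // first row the dot could plausibly appear in (assumed)
    const int BAND_BOTTOM = 70;   // last row (assumed)
    const uint8_t MIN_DOT_BRIGHTNESS = 200;

    // Returns true if a plausible dot was found; writes its row/col through the pointers.
    bool findLaserDot(const uint8_t frame[ROWS][COLS], int *dotRow, int *dotCol) {
      uint8_t best = 0;
      *dotRow = -1;
      *dotCol = -1;
      for (int r = BAND_TOP; r <= BAND_BOTTOM; r++) {
        for (int c = 0; c < COLS; c++) {
          if (frame[r][c] > best) {
            best = frame[r][c];
            *dotRow = r;
            *dotCol = c;
          }
        }
      }
      return best >= MIN_DOT_BRIGHTNESS;
    }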

Once code and camera can find the dot, the system has to do it quickly so it can be mounted on something spinny.

Doing a 360-degree sweep at 10Hz like the XV-11, at one reading per degree, means 3600 frames per second, which suggests it is using a much faster optical device than Pokey's camera or anything like it. I am guessing Neato is using a linescan sensor. You also need a fast processor. Speculation on the Sparkfun teardown blog post points to a Texas Instruments DSP, the TMS320C2801, at the heart of the XV-11 LIDAR's image processing.

I hit a wall at 3fps with an Arduino running Pokey's camera. I'm in the middle of interfacing with a Propeller, but the absolute best I can get from that camera is 30fps. That's nowhere close to fast enough for LIDAR, though it's workable for the laser line sensor concept. But all that's best left to another blog post or two.

For now, I plan to wait for someone to hack the XV-11 LIDAR, wait for spare parts to come out of Neato, and hope I can either afford the darn thing or build it myself for $30 as promised.

Sunday, November 21, 2010

Robotics goodies on sale...

Check it out: microcontrollers and other robotics goodies on sale. No affiliation with either company; I just hope this is helpful to others on a shoestring budget like me.

Pololu is doing a Black Friday sale with great prices on what I think is really useful stuff. I'm a fan of Pololu; Pokey's brain and drivetrain came from them. Good guys. Anyway, I have my eye on the $20 Orangutan B-328, $10 dual serial motor controller, $5 step-up/step-down regulator, and $9 3-axis accelerometer, but there's a bunch of stuff others might want too. Hm, I'd better check my bank account balance...

Newark's Element-14 has some nice promotional deals on controller boards, like a $17 Multicomp Arduino board (35-day lead time, though), a Multicomp Mega for $35, a Multicomp Nano for $28, an Amicus18 PIC board for $32, and an NXP OM11042 rapid prototyping board for $41.25 with the NXP LPC2368 ARM7 MCU.

Friday, November 19, 2010

Sparkfun AVC 2011: RC Truck

I had some birthday money burning a hole in my pocket and I've been thinking about the Sparkfun Autonomous Vehicle Contest too much lately.

I bought a budget-friendly, entry-level 1:10 RC truck: an ElectrixRC Circuit.


I guess I'm going to go ahead and do this AVC thing. There's a long to do list. Let's see here...
  • Interface Arduino MCU to RC truck (done)
  • Research truck's handling limits* (in progress)
  • Collect acceleration data (in progress)
  • Devise Electronic Stability Control algorithms
  • Set up Android coding environment
  • Ramp up on Android coding
  • Explore using Android phone for navigation
  • Arduino to Android interface (bluetooth?)
  • Devise curb & object detection mechanism -- vision? laser? lidar?
Since Sparkfun hasn't actually announced the 2011 contest yet, I'm not sure how much time I have left, but I'll assume it's happening around tax time again. In fact, there's no guarantee the AVC will happen at all, but I'm hopeful based on a comment on their website by an employee:

Although we haven't made any official AVC 2011 announcement (and this isn't one either), everybody around the office is assuming that AVC will continue every year into the indefinite future. It's just too much fun not to do it! Trevor

What I've learned so far about using my phone is that it lacks gyros, which is kind of a big deal, I fear. It also makes the interface with the Arduino a little messier than I'd prefer.

Pic from Sparkfun.com
The 1-axis gyros I ordered from Sparkfun arrived, so I will most likely let the Arduino handle the IMU interfacing and have the phone manage the high-level GPS and compass navigation.
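
As a first cut, the Arduino's half of that split could be as simple as sampling the gyro at a fixed rate, integrating the yaw rate into a heading, and streaming it to the phone over serial. A rough sketch, assuming an analog-output gyro on A0; the zero-rate offset and sensitivity below are placeholders that would need calibrating on the real part:

    // Rough sketch: read a 1-axis analog gyro, integrate yaw rate into a heading,
    // and stream it over serial. The pin, zero-rate offset, and sensitivity are
    // placeholders -- they'd need to be measured on the real board.

    const int   GYRO_PIN        = A0;
    const float ZERO_RATE_VOLTS = 1.23;    // output at rest (assumed; calibrate!)
    const float VOLTS_PER_DPS   = 0.0033;  // sensitivity, volts per deg/sec (assumed)
    const unsigned long PERIOD_MS = 20;    // 50 Hz sample rate

    float heading = 0.0;                   // integrated yaw, degrees
    unsigned long lastSample = 0;

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      unsigned long now = millis();
      if (now - lastSample >= PERIOD_MS) {
        float dt = (now - lastSample) / 1000.0;
        lastSample = now;

        float volts = analogRead(GYRO_PIN) * (5.0 / 1023.0);
        float rate  = (volts - ZERO_RATE_VOLTS) / VOLTS_PER_DPS;  // deg/sec
        heading += rate * dt;   // drifts over time; the phone could correct with compass

        Serial.print("H,");     // simple line protocol the phone can parse
        Serial.println(heading);
      }
    }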

---
* Yes, of course I'm going to get to play with the truck! Duh!

Friday, November 12, 2010

Thompson Robotics Expo

Pokey has been on loan to George of SHARC, who took the robot to the Thompson Robotics Expo in Loveland last weekend.

The expo featured two robotics competitions. Of the 60 teams from the Thompson School District in Loveland, 22 teams competed to go to the VEX World Championship in Orlando, Florida while another 38 teams competed in a FIRST Lego League qualifier.

The event also showcased exhibits from the Denver Museum of Nature and Science, a robot mannequin simulator from Banner Health, MATE underwater robots, a robotic dog from Road Narrows, an Antarctica underwater robot simulator, a demonstration of Thompson Valley's and Loveland High's FIRST Robotics Challenge robots, a BEST robotics competition demo, and of course the SHARC firefighting demo with several robots, including Pokey.

George sent me an email a few days before the expo. With the motivation of an event deadline, I got busy, reassembled the robot, and thrashed on the code to add a number of enhancements.

Though still without a working fan motor driver, and lacking any kind of maze navigation code (Bluetooth remote control only), Pokey's working better than ever, and I'm told he performed well for the kids. Cool.

Pokey navigating the upper right corner of the maze. (Pic by George Mitsuoka)

Monday, November 1, 2010

Android (Phone) Robot for Robomagellan

As of a couple weeks ago, I've entered the new millennium with the purchase of an Android phone, an LG Ally (pic from intomobile.com).

Though I'm notoriously a Mac guy and most of my friends expected me to get an iPhone, my blogs, pictures, email, calendar, etc., are all hosted on Google.

And I was drawn to it because it's programmable in Java. On the other hand, programming the iPhone... well, let's just say that Robotgrrl successfully scared the bejeepers out of me in one of her blog entries with such quotes as:

If you’re ever looking for a challenge worth pulling your teeth out, making an iPhone App is a great place to start. 

and

The language, the concepts, everything is completely different from what is seen in languages like Java and C++.

Um, no thanks. Not that I'm a Java master, or anything. I'm a newb (and how many people are truly masters with an API that big anyway?).  But at least I'd be facing a learning curve instead of a learning cliff equipped with pillboxes and landmines. So Android was the obvious choice all around.

I got to thinking, gee, this thing has a GPS, a compass, and a 3-axis accelerometer, all the stuff that would cost me a couple hundred bucks to add to a scratch-built robomagellan robot.

That is, if I were to follow my dream of entering the Sparkfun Autonomous Vehicle Contest in 2011. Or any other robomagellan-esque challenge at any time in the near future.

So instead of a scratch-built robomagellan robot, why not an Android brain with the navigation smarts coded in, sitting atop some cheap RC chassis equipped with an Arduino brain stem, and some simple comm code on both ends for control?

That has to be easier and better than trying to do all this with 8-bit, 16MHz microcontrollers and C, right? I'd say so. Did I mention that OpenCV has been ported to Android?
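
To give a sense of how thin that Arduino brain stem could be, here's a rough sketch of the chassis end: it just parses simple steering/throttle commands from the phone over serial and passes them to the servo and ESC. The command format and pin numbers are made up for illustration:

    // Rough sketch of the "brain stem": accept lines like "D,1500,1600\n" from the
    // phone (steering and throttle as RC-style microsecond pulse widths) and pass
    // them to the truck's steering servo and ESC. Command format and pin numbers
    // are made up for illustration.

    #include <Servo.h>

    Servo steering;   // steering servo
    Servo throttle;   // electronic speed control, driven like a servo

    char buf[32];
    int  bufLen = 0;

    void setup() {
      Serial.begin(115200);
      steering.attach(9);       // assumed pins
      throttle.attach(10);
      steering.writeMicroseconds(1500);   // center / neutral
      throttle.writeMicroseconds(1500);
    }

    void loop() {
      while (Serial.available()) {
        char c = Serial.read();
        if (c == '\n') {
          buf[bufLen] = '\0';
          int steerUs, throtUs;
          // Expect lines like "D,1500,1600"
          if (sscanf(buf, "D,%d,%d", &steerUs, &throtUs) == 2) {
            // Clamp to a sane RC range before driving the hardware.
            steering.writeMicroseconds(constrain(steerUs, 1000, 2000));
            throttle.writeMicroseconds(constrain(throtUs, 1000, 2000));
          }
          bufLen = 0;
        } else if (bufLen < (int)sizeof(buf) - 1) {
          buf[bufLen++] = c;
        }
      }
    }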

Here's a great PDF write-up on the topic of Androids for robotics and robomagellan.