Thursday, February 27, 2014

AVC Path Following

I implemented an improved path following algorithm on Data Bus and, later, on the SHARC self-driving Jeep that won the doping class in the 2014 Sparkfun AVC.

The old, simple path following algorithm worked but didn't correct cross track error. The new algorithm chases a virtual rabbit, computing where the rabbit is and how to intercept it.

The Pure Pursuit algorithm is conceptually elegant and simple. It's easy to program, it's popular, and, clearly, it works quite well.

Pure Pursuit Algorithm

The robot follows a virtual rabbit that travels around the legs of the course, with each leg defined by two points. The rabbit is always located along the current leg of the course, A=[W0 W1].

The projection of the robot's position onto A is point P and the rabbit is located along A, a fixed distance from P. The fixed distance between P and the rabbit is called the lookahead distance.


The algorithm moves the rabbit's position at each update step, computes a relative bearing from the robot to the rabbit, and computes an arc, tangential to the robot's heading, that intercepts the rabbit. The result of this unending pursuit is a smooth correction to the robot's heading and cross track error. This algorithm is called Pure Pursuit.

Here's the math I used to compute the intercept arc. The distance from W0 to P is the projection of the robot's position onto A, given by a dot product:

  proj = |W0P| = (A · B) / |A|

Where A=[W0 W1] and B=[W0 robot]. Vector A, divided by its magnitude, is the unit vector pointing along A. The dot product can be computed with trig functions (slow) or you can do it this way (fast):

  A · B = Ax*Bx + Ay*By

with (Ax, Ay) and (Bx, By) the x and y components of A and B.

Once you have a scalar from the dot product you can find your goal point, the rabbit, like this:

  Rx = W0x + (proj + lookahead) * Ax / |A|
  Ry = W0y + (proj + lookahead) * Ay / |A|
Even if the math looks spooky, the code is trivial. Here's the Processing code I used for my simulation.

  // Leg vector
  float Ax = Xw[next] - Xw[prev];
  float Ay = Yw[next] - Yw[prev];
  // Bot vector
  float Bx = x - Xw[prev];
  float By = y - Yw[prev];
  // Leg length (magnitude of A) and projection of B onto A
  float legLength = sqrt(Ax*Ax + Ay*Ay);
  float proj = (Ax*Bx + Ay*By)/legLength;

  // Goal point ("rabbit")
  float Rx = (proj+lookAhead)*Ax/legLength + Xw[prev];
  float Ry = (proj+lookAhead)*Ay/legLength + Yw[prev];


See? Not bad at all!
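One detail the snippet above glosses over is advancing the rabbit to the next leg once proj + lookAhead runs past the end of the current one. Here's a minimal sketch of that bookkeeping in Java (the function and variable names are mine, not from the simulation code):

```java
// Sketch: advance to the next waypoint leg once the rabbit's
// projection distance runs past the end of the current leg.
public class LegAdvance {
    // Returns the index of the leg the rabbit should be on,
    // given waypoint arrays and the current leg index (prev).
    static int advanceLeg(double[] xw, double[] yw, int prev,
                          double projPlusLookahead) {
        double ax = xw[prev + 1] - xw[prev];
        double ay = yw[prev + 1] - yw[prev];
        double legLength = Math.sqrt(ax * ax + ay * ay);
        // If the goal point would fall beyond W1, move to the next
        // leg (unless we're already on the last one).
        if (projPlusLookahead > legLength && prev + 2 < xw.length) {
            return prev + 1;
        }
        return prev;
    }

    public static void main(String[] args) {
        double[] xw = {0, 10, 10};
        double[] yw = {0, 0, 10};
        // Rabbit 12 units along a 10-unit leg: advance to leg 1.
        System.out.println(advanceLeg(xw, yw, 0, 12.0));
        // Rabbit 5 units along: stay on leg 0.
        System.out.println(advanceLeg(xw, yw, 0, 5.0));
    }
}
```

You could also carry leftover distance onto the next leg for a smoother hand-off; the simple version above was enough for my purposes.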

Turning

The robot now knows where the rabbit is. The robot knows its heading and position. It can compute the relative bearing to the rabbit. But how much should the robot turn to catch the rabbit?
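For concreteness, the relative bearing falls out of atan2 plus a normalization step. A quick sketch, with names of my own choosing:

```java
public class Bearing {
    // Relative bearing from robot pose (x, y, heading) to goal
    // (gx, gy), normalized to (-pi, pi]. Zero means the goal is
    // dead ahead; the sign tells you which way to turn.
    static double relativeBearing(double x, double y, double heading,
                                  double gx, double gy) {
        double theta = Math.atan2(gy - y, gx - x) - heading;
        while (theta > Math.PI)   theta -= 2 * Math.PI;
        while (theta <= -Math.PI) theta += 2 * Math.PI;
        return theta;
    }

    public static void main(String[] args) {
        // Robot at origin heading along +x, goal straight ahead: 0.
        System.out.println(relativeBearing(0, 0, 0, 5, 0));
        // Goal directly to the left: pi/2.
        System.out.println(relativeBearing(0, 0, 0, 0, 5));
    }
}
```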

An elegant approach with smooth behavior that doesn't require complex programming logic or trial and error tuning is to use an intercept arc that intersects both robot and rabbit, and is tangential to the robot's heading. Here's how.

We have the robot at B, it's heading described by BC, and the rabbit or goal point at G. The distance between B and G is D.


A circle that intersects B and G and is tangential to BC with radius R will have its origin along a line that is perpendicular to BC and passes through B. We simply need to find R, the radius of this circle. Time to break out some trigonometry.



Draw another radius line perpendicular to BG. This line bisects BG (each segment is D/2 in length). Studying the right triangles these lines create, notice that the relative bearing, theta, also appears as the angle between this perpendicular bisector and each of the radii to B and G.


We can express D/2 in terms of R and theta, then solve for R:

  D/2 = R * sin(theta)
  R = D / (2 * sin(theta))
The robot recomputes a new intercept arc at every update step. The result is a continuous, smooth path toward the goal point that both follows the course and corrects cross track error.
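In code, that radius formula is a one-liner plus a guard for the straight-ahead case. A minimal sketch (names are mine):

```java
public class InterceptArc {
    // Radius of the arc tangent to the robot's heading that passes
    // through the goal point: R = D / (2 * sin(theta)), where dist
    // is the distance to the goal and theta the relative bearing.
    static double interceptRadius(double dist, double theta) {
        double s = 2.0 * Math.sin(theta);
        // Bearing near zero means "drive straight": the radius
        // tends to infinity, so return that rather than divide by 0.
        if (Math.abs(s) < 1e-9) return Double.POSITIVE_INFINITY;
        return dist / s;
    }

    public static void main(String[] args) {
        // Goal 10 units away at a 30 degree relative bearing:
        // R = 10 / (2 * 0.5) = 10.
        System.out.println(interceptRadius(10.0, Math.toRadians(30)));
        System.out.println(interceptRadius(10.0, 0.0));
    }
}
```

The sign of theta carries through to R, so a negative radius simply means the arc curves the other way.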

Conclusion

So, despite planning to test this on a small 1:10 RC car, I ended up proving it out on the full size Jeep. Once the turning radius and lookahead distances are set reasonably, it works a treat!

If I get some time I'll post up the calculations for converting from arc radius to steering angle to servo signal. Suffice it to say that determining the correct steering angle to traverse the intercept arc is relatively simple to figure out, using basic geometry.
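Pending that write-up, here's the usual back-of-envelope version, assuming a simple bicycle model with wheelbase L. The servo mapping is a hypothetical linear one; the center and gain constants below are placeholders, not measured values from the Jeep:

```java
public class Steering {
    // Bicycle-model steering angle (radians) needed to follow an
    // arc of radius r for a vehicle with the given wheelbase.
    // The sign of r selects the turn direction.
    static double steeringAngle(double wheelbase, double r) {
        if (Double.isInfinite(r)) return 0.0;  // straight ahead
        return Math.atan(wheelbase / r);
    }

    // Hypothetical linear map from steering angle to servo pulse
    // width in microseconds; centerUs and usPerRad would come from
    // calibrating the actual steering linkage.
    static double servoMicros(double steerRad, double centerUs,
                              double usPerRad) {
        return centerUs + usPerRad * steerRad;
    }

    public static void main(String[] args) {
        // A Jeep-ish 2.5 m wheelbase following a 10 m radius arc.
        double steer = steeringAngle(2.5, 10.0);
        System.out.println(steer);  // atan(0.25), roughly 0.245 rad
        System.out.println(servoMicros(steer, 1500, 400));
    }
}
```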

References
- "Path Tracking for a Miniature Robot" (PDF)
- Coulter, R. Craig, "Implementation of the Pure Pursuit Path Tracking Algorithm," 1992 (PDF)

Clock for my Vision Impaired, Aging Mom

My 82-year-old mother can't see very well and regularly forgets what day it is. I looked around for a clock to help her keep track of the day of the week but decided in the end to build my own using components from tindie.com.




You can build one of these yourself. Here's how I built the prototype...

Tuesday, February 25, 2014

Things-API exposes Arduino as REST

Things-API (github), by Vinicius Senger, exposes Arduino's physical computing capabilities as REST services through Java on a Raspberry Pi. I had a chance to use it during the JavaOne Raspberry Pi Embedded Challenge.

Here's an overview of how it works...

Thursday, February 20, 2014

TOTT Bot: Chassis Design

[ Requirements | Design 1 | Design 2 | Chassis ]


At long last I finished the Trash Bot's chassis. You can build your own Take Out The Trash roBot (TOTT Bot). Just read on to see the design and find out what Actobotics parts I used...

Tuesday, February 18, 2014

Linux for Robotics Hobbyists

A year ago I decided to migrate to Linux Mint. The first few weeks were frustrating but I eventually got over the hump. I've since made Linux my primary OS for robotics.

It's very fast, it can do virtually everything I need it to do, and it's free. Here's how you can make Linux a great platform for robotics design and development.

Friday, February 14, 2014

Infinite Output in Different Languages

I enjoy the Olimex Weekend Programming Challenge. I missed #33 but it's one of my favorites so far: generate infinite output with the smallest programs possible in different languages. How many languages can you do it in?


Above is how you'd enter the code in Commodore 64 BASIC (the listing would expand the keywords). The ? is PRINT; the G shift-O (right-angle character) is GOTO.

Here are the solutions Olimex received. I've selected a few interesting solutions and added some of my own:

Wednesday, February 12, 2014

OpenMV: low cost, hackable, scriptable machine vision

Introduction

OpenMV Cam will be the most hackable, low cost machine vision platform out there, because Ibrahim and I want to change the nature of hobby and educational robotics.

OpenMV Cam will be low cost. You write scripts in Micro Python, using a friendly IDE, and they run on the module itself, controlling machine vision algorithms. The module supports Serial, SPI, I2C, and USB streaming for integration into robots or your desktop. It's easily hackable: it's based on the popular STM32F4 and easily programmed with an open source toolchain, so we can readily write our own software for it.

OpenMV Cam Is Available

And you can help. We're producing a short run of OpenMV Cam modules on Tindie so folks can help us add the final polish to the software. Eventually we'll do a fundraiser campaign. The firmware and IDE are pretty far along, actually. As of 9/16 we're in the process of assembling OpenMV Cam modules. You can backorder them on Tindie. Nobody gets charged until they are ready to ship.

Demonstrations

Imagine the projects you could build using 25fps face detection. An automatic bathroom mirror light, perhaps? Or a doorbell that rings automatically when someone is at your door. Here's a video of using a Viola-Jones classifier to detect a face, then using FAST/FREAK to track it regardless of scale and rotation. That's Ibrahim in the video, which shows the IDE in action with the processing handled by the OpenMV Cam.



What if you could do 60fps multi-color blob detection? Buy two and build stereo blob tracking? Laser scanning? Sort M&M's like a boss? Flame detection for Trinity competitions? More? This video shows blob detection as seen in the IDE with the camera module doing the processing.


Imagine what hobby electronics--or STEM education--will become when machine vision is as affordable as an Arduino. Imagine the things we could do together, the problems we could solve.

Here's a video with the OpenMV Cam hooked up to an LCD to stream video. It'll also save video to the microSD card, and stream it to the computer as you saw above.


Join The Community

We're looking to build a community and we'd like you to join our Google Group.

Micro Python

You'll be able to script it in Micro Python, a lightweight Python for MCUs. It loads scripts off the microSD card. Some bindings are in place with full control in the works.

There's an IDE you can use with the camera that has a Python shell and a frame buffer viewer, and you can use it to run scripts and save them to flash.


Also, for more flexibility, you can use several OmniVision sensors on this board: the 0.3MP OV7660, the 1.3MP OV9650/OV9655, and the 2MP JPEG OV2640. It's the latter sensor we like best and which ships with the OpenMV Cam on Tindie.

Hackable Microcontroller

What have you always wanted your machine vision system to do? Because it's running a widely known STM32F4 single core MCU, you can write and flash and debug your own firmware using an open source toolchain, CMSIS, and STM's handy Standard Peripheral Library.

There's already a growing community of support around this chip family with Pixhawk, STM32F4 Discovery boards, and more.

We're presently using an STM32F407 running at 168MHz using the native camera interface. Ibrahim has experimented with overclocking to 240MHz.

Algorithms

The OpenMV currently implements multi-object, multi-color blob tracking, Viola-Jones with Haar cascades (easily converted from OpenCV), and FAST + FREAK algorithms. You can use these to do relatively high frame-rate face and object detection.

Fundraiser

We'll be doing a fundraiser (Kickstarter, Indiegogo, or something along those lines) in the future. For now we'd like to involve you in the community and put some polish on the software.


Thursday, February 6, 2014

AVR Dragon


I've been having some fun times with an Atmel AVR Dragon. I've been borrowing an awesome JTAG ICE MkII for a few years but can't afford to buy one of my very own. My fallback has been a fine Pololu AVR programmer. Here are my initial impressions for anyone considering getting an AVR Dragon.

Tuesday, February 4, 2014

SHARC AVC Meeting

Dinosaurs and coolers in 2011
SHARC met again on Saturday to discuss our plans for the 2014 Sparkfun AVC.

A number of folks new to the AVC and some new to robotics are interested in entering which is great. Some of the more experienced SHARC folks are going to give them some guidance along the way as we attend working sessions to build up our respective rovers.

Expect to see a slew of SHARC robots overtaking the AVC this year!

2013: Two wheeled robots, hovercraft, cars, trucks...
I had previously mentioned the top secret SHARC AVC entry. We talked about that some more and we have a plan. It's going to be massive.

If we can pull it off, that is. I'd rather make some progress on the iffy bits before making the big reveal. Talking about what you're going to do is much easier than doing it.

We did feel obligated to ask the Sparkfun folks if our idea would be generally allowed. To which they said, "bring it on."