A recent test run provided the following speed data from the wheel encoders, which digitize position at regular intervals.
Here's a detail view of one of the plots.
|Detail speed plot showing quantization error|
First I tried filtering the data, but I still ended up with a bunch of spikes. Before I could filter out the noise, I figured I should first understand its nature.
|Filtered speed, green, shows spikes|
|Picture of Pokey showing quantization error|
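To see where the spikes come from, here's a minimal sketch of what happens when speed is computed by differencing a quantized position signal. The encoder resolution and sample rate below are made up for illustration, not Pokey's actual numbers:

```python
import numpy as np

TICKS_PER_METER = 500   # hypothetical encoder resolution
DT = 0.02               # hypothetical 50 Hz sampling interval

# Robot moving at a steady 0.333 m/s
t = np.arange(0, 2, DT)
true_pos = 0.333 * t

# The encoder only reports whole ticks, so position is quantized
measured_pos = np.floor(true_pos * TICKS_PER_METER) / TICKS_PER_METER

# Speed estimated by differencing successive quantized positions
speed = np.diff(measured_pos) / DT

# The estimate hops between discrete levels instead of reading 0.333 m/s
print(speed.min(), speed.max())
```

Even with a perfectly constant true speed, the estimate bounces between whole-tick-per-sample levels; that's the sawtooth in the plots above.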
Dithering adds random noise to each pixel, masking the quantization error, at least to our eyes. Here's the speed plot with some noise added.
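Strictly speaking, classic dither is injected before quantization, which is what lets averaging recover detail below one quantum. A quick sketch with a made-up value:

```python
import numpy as np

rng = np.random.default_rng(42)
x = 3.37  # a value sitting between two quantization levels

# Without dither, quantizing always yields 3 -- the .37 is gone for good
plain = np.floor(np.full(10_000, x))

# With uniform noise added before quantizing, the .37 survives as the
# fraction of samples that land on 4, so averaging recovers it
dithered = np.floor(x + rng.uniform(0.0, 1.0, 10_000))

print(plain.mean(), dithered.mean())
```

The undithered mean is stuck at 3.0, while the dithered mean comes out close to 3.37.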
That doesn't help the robot much. Our eyes/brains do quite a bit of noise filtering. One of the AVC entrants pointed me to the double exponential filter, basically a super fancy moving average. I opted to try a less fancy exponential filter (the green line below).
|Dithered speed, blue, and exponential filtering, green|
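For reference, a single exponential filter is only a couple of lines; this sketch (with an arbitrarily chosen smoothing factor) shows the recurrence:

```python
def exp_filter(samples, alpha=0.2):
    """Exponential filter: y[n] = alpha*x[n] + (1 - alpha)*y[n-1].

    alpha near 1 tracks the input closely; alpha near 0 smooths hard.
    """
    y = samples[0]  # seed with the first sample to avoid start-up lag
    out = []
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y
        out.append(y)
    return out

# Noisy readings alternating between two quantization levels settle
# toward their average instead of bouncing the full tick
smoothed = exp_filter([0.3, 0.4, 0.3, 0.4, 0.3, 0.4], alpha=0.2)
print(smoothed)
```

Unlike a moving average, it needs no buffer of past samples, just one stored value, which makes it cheap to run on a microcontroller.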
It's still not glass smooth, but it doesn't need to be. More smoothing means greater lag between the filtered signal and the real one. But clearly the filtered plot is much better; as with the picture above, the effective resolution of the signal is higher.
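To put a rough number on that smoothing-vs-lag tradeoff, here's a sketch that counts how many samples the exponential filter above takes to reach 90% of a step change, for a few arbitrary smoothing factors:

```python
def samples_to_settle(alpha, threshold=0.9):
    """Count samples until the filter output reaches `threshold` of a 0->1 step."""
    y, n = 0.0, 0
    while y < threshold:
        y = alpha * 1.0 + (1.0 - alpha) * y
        n += 1
    return n

for alpha in (0.5, 0.2, 0.05):
    print(alpha, samples_to_settle(alpha))
```

The lag grows roughly as 1/alpha, so a filter heavy enough to flatten every last wiggle would also be slow to notice a real change in speed.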
Is this filtering necessary? I don't know, but I'm considering some navigation calculations that depend on speed (and distance), and it seems to me that improved resolution would reduce error in those calculations. If it's simple to implement in the real world, I may just go ahead and do it.