[FoRK] coffee drone
aaron at bavariati.org
Sun Jun 24 09:14:19 PDT 2012
On Sat, Jun 23, 2012 at 08:14:32AM -0400, Damien Morton wrote:
> My guess is that you'd want your drone to first survey the area it was
> going to be operating in, and use some offline process to build up a map of
> the area.
Your receiving station gets to do the visual processing and overall
mapmaking. Drones just need to know enough to navigate.
> Or, if you felt your drone had enough energy budget to carry something like
> a beaglebone around with it, you might be able to do the mapping on-board.
I think one of those drones demo'd was carrying an x86 chip.
> What we really need is an inexpensive low-power per-pixel depth sensor.
You cited working implementations of two of the most effective techniques used
in nature. (I count optical flow as structure from motion.)
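To make the optical-flow route concrete: under a pinhole model and pure sideways translation, a point at depth Z produces image flow u = f * T / Z, so depth falls straight out of measured flow. A minimal sketch (all numbers illustrative, not from the thread):

```python
# Depth from optical flow, assuming a pinhole camera translating
# laterally at known speed T with focal length f (in pixels).
# A scene point at depth Z produces image flow u = f * T / Z.

def depth_from_flow(flow_px_per_s: float, focal_px: float, speed_m_s: float) -> float:
    """Invert u = f * T / Z to get Z = f * T / u (pure lateral translation)."""
    return focal_px * speed_m_s / flow_px_per_s

# Hypothetical drone sliding sideways at 2 m/s, 600 px focal length,
# observing 120 px/s of flow:
z = depth_from_flow(flow_px_per_s=120.0, focal_px=600.0, speed_m_s=2.0)
# 600 * 2 / 120 = 10 m
```

Real structure-from-motion handles arbitrary camera motion, but this degenerate case shows why flow alone carries depth information when egomotion is known.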
About the only way to improve on that is with active techniques, which
pollute the EM environment and give away position. But I guess you could
build something like a CCD array with a PLL and phase detector at each
pixel, coupled with a modulated LED emitter. That, or a threshold
detector, a latch and a cap at each pixel coupled with a picosecond laser
emitter. Sounds silly, though.
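The PLL-and-phase-detector idea is essentially continuous-wave time-of-flight: each pixel measures the phase shift between the modulated LED and the returned light, and depth is d = c * phi / (4 * pi * f_mod). A rough sketch of the arithmetic (the 20 MHz modulation frequency is my own illustrative assumption):

```python
# Continuous-wave time-of-flight ranging: depth from the phase shift
# between a modulated emitter and a per-pixel phase detector.
import math

C = 299_792_458.0  # speed of light, m/s

def cw_tof_depth(phase_rad: float, f_mod_hz: float) -> float:
    """Distance for a measured phase shift; the factor of 4*pi halves
    the round-trip path (light travels out and back)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz: float) -> float:
    """Phase wraps at 2*pi, so ranges alias beyond c / (2 * f_mod)."""
    return C / (2.0 * f_mod_hz)

f_mod = 20e6  # hypothetical 20 MHz modulated LED
d = cw_tof_depth(math.pi / 2, f_mod)   # quarter-cycle shift ~ 1.87 m
r = unambiguous_range(f_mod)           # ~ 7.5 m before phase wraps
```

The threshold/latch/cap variant with a picosecond laser is the pulsed (direct) flavor of the same trick: time the first-photon edge instead of measuring phase.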