[FoRK] coffee drone
dmorton at bitfurnace.com
Sun Jun 24 10:25:07 PDT 2012
On Sun, Jun 24, 2012 at 12:14 PM, Aaron Burt <aaron at bavariati.org> wrote:
> On Sat, Jun 23, 2012 at 08:14:32AM -0400, Damien Morton wrote:
> > Or, if you felt your drone had enough energy budget to carry something
> > like a BeagleBone around with it, you might be able to do the mapping
> I think one of those drones demo'd was carrying an x86 chip.
Actually a BeagleBone or Gumstix might be able to do it on-board. These
boards are in the <40g range and draw about 2W - a battery weighing <10g
should be enough to power one for a single flight.
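Rough back-of-the-envelope check (the specific energy and depth-of-discharge
numbers below are my assumptions, not from the thread):

```python
# Endurance estimate for an on-board BeagleBone/Gumstix-class computer.
# Assumed: LiPo specific energy ~200 Wh/kg, board draw ~2 W,
# usable depth-of-discharge ~80%.

def flight_minutes(battery_grams, specific_energy_wh_per_kg=200.0,
                   draw_watts=2.0, usable_fraction=0.8):
    """Minutes of compute runtime from a battery of the given mass."""
    energy_wh = (battery_grams / 1000.0) * specific_energy_wh_per_kg
    return 60.0 * energy_wh * usable_fraction / draw_watts

print(flight_minutes(10))  # a 10 g pack -> roughly 48 minutes at 2 W
```

Comfortably longer than a typical multirotor flight, so the <10g battery
claim holds up under these assumptions.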
> > What we really need is an inexpensive low-power per-pixel depth sensor.
> You cited working implementations of two of the most effective techniques
> in nature. (I count optical flow as structure from motion.)
I was actually giving Stephen a chance to perk up about the company he is
working at - they are working on plenoptic imaging chips for mobile phones,
and these chips are capable of per-pixel depth measurement at low to
moderate z-resolution - far less resolution than Lidar, but a lot cheaper,
and sufficient for obstacle avoidance.
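The geometry behind that z-resolution limit is ordinary stereo triangulation:
the sub-aperture views of a plenoptic sensor act like tiny stereo pairs with
a very short baseline. A toy model (all numbers illustrative, not any
vendor's specs):

```python
# Depth from disparity, z = f * B / d - the same relation a plenoptic
# (light-field) sensor exploits across its sub-aperture views.
# Focal length, baseline, and pixel pitch below are made-up but plausible
# mobile-camera numbers.

def depth_from_disparity(focal_mm, baseline_mm, disparity_px, pixel_pitch_um):
    """Stereo depth in mm; disparity is converted from pixels to mm."""
    disparity_mm = disparity_px * pixel_pitch_um / 1000.0
    return focal_mm * baseline_mm / disparity_mm

# With a short baseline, one pixel of disparity spans a large depth range,
# which is why z-resolution is only low-to-moderate at distance.
near = depth_from_disparity(3.0, 2.0, disparity_px=4, pixel_pitch_um=1.4)  # ~1.1 m
far  = depth_from_disparity(3.0, 2.0, disparity_px=1, pixel_pitch_um=1.4)  # ~4.3 m
```

One pixel of disparity error near the far end moves the estimate by metres,
but that coarse map is still plenty for "don't fly into that wall".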
> About the only way to improve on that is with active techniques, which
> pollute the EM environment and give away position. But I guess you could
> build something like a CCD array with a PLL and phase detector at each
> pixel, coupled with a modulated LED emitter. That, or a threshold
> detector, a latch and a cap at each pixel coupled with a picosecond laser
> emitter. Sounds silly, though.
It's called Flash Lidar - not silly at all; per-pixel time-of-flight
sensors are a real thing.
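The modulated-LED-plus-phase-detector scheme Aaron describes is exactly
continuous-wave time-of-flight: distance falls out of the phase shift of the
returned light. A minimal sketch of the arithmetic (my own illustrative
numbers):

```python
# Continuous-wave time-of-flight: recover distance from the phase shift
# between emitted and received modulated light.

import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad, mod_freq_hz):
    """d = c * phi / (4 * pi * f): the round trip covers 2d, hence the 4 pi."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def max_unambiguous_range(mod_freq_hz):
    """Phase wraps at 2*pi, so range aliases beyond c / (2 f)."""
    return C / (2.0 * mod_freq_hz)

# At 20 MHz modulation the unambiguous range is about 7.5 m -
# short, but fine for close-in obstacle avoidance.
r = max_unambiguous_range(20e6)
```

The pulsed variant (threshold, latch and cap per pixel, picosecond laser)
trades the phase-wrap ambiguity for much tighter timing requirements; flash
Lidar units use both approaches.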