An insect-inspired autocorrelation model for visual flight control in a corridor
Abstract
In this paper we propose and demonstrate stable robot controllers that sense visual motion with minimal computation, using a small omnidirectional array of visual sensors and a fly-like autocorrelation scheme. Designing the controllers required deriving a model for the response of an array of correlators observing the motion of a flat surface such as the ground or a wall. The model operates in the frequency domain and incorporates the effects of perspective, motion parallax, and spatial blurring. Using this model, we found inter-sensor spacing and blur-width parameters that mitigate incorrect motion estimates arising from aliasing. Controllers that decompose the correlator response into harmonics to observe and control the robot's state were implemented on a fan-actuated hovercraft robot. They stabilized the robot as it moved through a corridor, the first use of correlators to control the motion of a flight-like (nonkinematic) dynamic vehicle.
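The fly-like autocorrelation scheme referred to above is of the Hassenstein-Reichardt elementary-motion-detector type, in which the signal from one photoreceptor is passed through a low-pass "delay" filter and multiplied by the undelayed signal of its neighbor, and the mirror-symmetric product is subtracted. The sketch below is a minimal illustration of that opponent correlator under assumed parameters; the function name, sampling interval, and delay time constant are illustrative choices, not the implementation or tuning used in this work.

    import numpy as np

    def reichardt_correlator(s1, s2, dt=0.01, tau=0.05):
        """Illustrative Hassenstein-Reichardt elementary motion detector.

        s1, s2 : luminance samples from two neighboring photoreceptors
        dt     : sample interval [s] (assumed for this sketch)
        tau    : time constant of the first-order low-pass "delay" [s]
        Returns the opponent correlator output; its sign indicates the
        direction of image motion from receptor 1 toward receptor 2.
        """
        alpha = dt / (tau + dt)        # discrete first-order low-pass gain
        d1 = np.zeros_like(s1)         # delayed (filtered) copy of s1
        d2 = np.zeros_like(s2)         # delayed (filtered) copy of s2
        r = np.zeros_like(s1)
        for k in range(1, len(s1)):
            d1[k] = d1[k-1] + alpha * (s1[k] - d1[k-1])
            d2[k] = d2[k-1] + alpha * (s2[k] - d2[k-1])
            # correlate each delayed signal with the undelayed neighbor,
            # then subtract the mirror-symmetric arm (opponent stage)
            r[k] = d1[k] * s2[k] - d2[k] * s1[k]
        return r

For example, feeding the detector two samples of a drifting sinusoidal grating (the second receptor seeing a phase-shifted copy of the first) yields a response whose time-averaged sign follows the drift direction, which is the property the corridor controllers exploit.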