Source: IEEE International Symposium on Circuits and Systems (ISCAS 2007), pp. 617--620 (2007)
The visual navigation system of a UAV is a complex embedded device designed to modify the path of the platform depending on objects or events detected on the ground. In the visual field of the autopilot, these events can be formalized as specific spatio-temporal signatures. Processing every pixel captured by the on-board camera(s) in real time at a high frame rate requires enormous computational effort, much of which is unnecessary. An adequate computational strategy focuses on the interesting locations only, as the visual systems of various species do. In this article we describe an automatic focusing mechanism that relies on optical flow calculation to detect moving objects on the ground, thus efficiently separating the motion of interest from the ego-motion of the platform.
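The core idea of the last sentence can be sketched in a few lines: given a dense optical-flow field, approximate the platform's ego-motion by a single dominant flow vector and flag pixels whose residual flow deviates from it. This is a minimal illustration, not the paper's algorithm; the median-based pure-translation model of ego-motion, the function name, and the threshold are all assumptions made here for clarity.

```python
import numpy as np

def detect_independent_motion(flow, thresh=1.5):
    """Given a dense optical-flow field of shape (H, W, 2), estimate the
    global ego-motion as the median flow vector (a pure-translation
    assumption, hypothetical here, not the paper's method) and flag
    pixels whose residual flow magnitude exceeds `thresh` pixels."""
    ego = np.median(flow.reshape(-1, 2), axis=0)   # dominant (camera) motion
    residual = np.linalg.norm(flow - ego, axis=2)  # per-pixel deviation from ego-motion
    return residual > thresh                       # mask of "interesting" locations

# Synthetic example: the camera pans right by 2 px per frame, while a
# small 8x8 object moves upward independently of the platform.
flow = np.zeros((64, 64, 2), dtype=np.float32)
flow[..., 0] = 2.0                # ego-motion: uniform rightward flow
flow[20:28, 30:38, 1] = -4.0      # independently moving object
mask = detect_independent_motion(flow)
print(mask.sum())                 # prints 64: only the object's pixels are flagged
```

In a real system the flow field would come from a dense optical-flow estimator and the ego-motion model would need to account for rotation and perspective, but the residual-thresholding step conveys how object motion is separated from platform motion.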