Traditional approaches to calculating self-motion from visual information in artificial devices have generally relied on
object identification and/or correlation of image sections between successive frames. Such calculations are
computationally expensive, and real-time digital implementation requires powerful processors. In contrast, flies arrive at
essentially the same outcome, the estimation of self-motion, in a much smaller package using vastly less power. Despite
the potential advantages and a few notable successes, few neuromorphic analog VLSI (aVLSI) devices based on biological vision
have been employed in practical applications to date. This paper describes an aVLSI hardware implementation of our
recently developed adaptive model for motion detection. The chip integrates motion over a linear array of local motion
processors to give a single voltage output. Although the device lacks on-chip photodetectors, it includes bias circuits to
use currents from external photodiodes, and we have integrated it with a ring-array of 40 photodiodes to form a visual
rotation sensor. The ring configuration reduces pattern noise and, combined with the pixel-wise adaptive characteristic of
the underlying circuitry, permits a robust output that is proportional to image rotational velocity over a large range of
speeds and is largely independent of either the mean luminance or the spatial structure of the viewed image. In principle,
such devices could be used as an element of a velocity-based servo to replace or augment inertial guidance systems in
applications such as micro unmanned aerial vehicles (mUAVs).
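The scheme summarized above, local delay-and-correlate motion processors sampled around a ring and summed to a single rotation signal, can be sketched in software. The sketch below is a minimal simulation, not the chip circuitry: the 40-element ring matches the paper, but the sinusoidal grating stimulus, the first-order low-pass standing in for the delay stage, and the time constants are illustrative assumptions.

```python
import math

N = 40          # photodiodes in the ring (as in the sensor described above)
TAU = 0.02      # hypothetical delay-filter time constant (s)
DT = 0.001      # simulation time step (s)

def ring_rotation_response(omega, t_end=0.5):
    """Time-averaged sum of correlational EMD outputs around the ring
    for a grating rotating at angular velocity omega (rad/s)."""
    lp = [0.0] * N               # low-pass ("delayed") channel per photodiode
    acc, steps = 0.0, 0
    t = 0.0
    while t < t_end:
        # luminance of a drifting sinusoidal grating sampled by the ring
        sig = [1.0 + 0.5 * math.sin(4 * (2 * math.pi * i / N) - omega * t)
               for i in range(N)]
        total = 0.0
        for i in range(N):
            j = (i + 1) % N      # neighbouring photodiode
            # opponent delay-and-correlate:
            # delayed(i) * direct(j) - delayed(j) * direct(i)
            total += lp[i] * sig[j] - lp[j] * sig[i]
        # first-order low-pass acts as the delay stage
        for i in range(N):
            lp[i] += (DT / TAU) * (sig[i] - lp[i])
        if t > 0.1:              # skip the filter's settling transient
            acc += total
            steps += 1
        t += DT
    return acc / steps
```

The opponent subtraction makes the summed output signed: it is positive for one rotation direction and negative for the other, while the DC component of the luminance cancels out pairwise.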
The visual pathway leading from the retina to the tangential cells in the third optic ganglion of the fly is a sophisticated system for the detection of visual motion. The tangential cells, whose responses are thought to characterize the animal's state of egomotion, show a remarkable ability to encode velocity information about the optic flow patterns to which they are sensitive, largely independent of the structure and contrast of the viewed scenery. We describe a simulation study based on a model that accounts for key physiological features observed in the biological system and contains nonlinear features that we expect to contribute to this capability. One of these features is motion adaptation, a phenomenon on which recent research has shed new light. We conclude that our models significantly reduce the dependence of the response on variable natural scenery, although they still do not perform as well in this respect as the biological neurons. This biological system has inspired an implementation of visual motion processing in analog VLSI technology. The neuromorphic circuits are intended for eventual on- or near-focal-plane integration with photosensing. We describe the design approach and present results from preliminary versions of these circuits.
Visual detection and processing of motion in insects is thought to be based on an elementary delay-and-correlate operation at an early stage in the visual pathway. The correlational elementary motion detector (EMD) indicates the presence of moving stimuli on the retina and is directionally sensitive, but it is a complex spatiotemporal filter and does not inherently encode important motion parameters such as velocity. Additional processing, in combination with natural visual stimuli, may nevertheless allow computation of useful motion parameters. One such processing step is adaptation in response to motion, until recently thought to occur through modification of the delay time constant but now shown to arise mainly from adjustment of contrast gain. This adaptation renders the EMD output less dependent on scene contrast and enables it to carry some velocity information. We describe an ongoing effort to characterize this system in engineering terms and to implement an analog VLSI model of it. Building blocks for a correlational EMD, and a mechanism for computing and applying the adjustment of contrast gain, are described. This circuitry is intended as front-end processing for classes of higher-level visual motion computation also performed by insects, including estimation of egomotion from optic flow and detection of moving targets.