Foveal automatic target recognition using a neural network

Susan S. Young, Peter D. Scott, Cesar Bandera

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper proposes a method for identifying and classifying a target from its foveal imagery using a neural network. The criterion for identifying a target is the global minimum of an energy function that measures the match between the candidate target and a library of target models at several resolution levels of nonuniformly sampled foveal image data. For this purpose, a concurrent top-down and bottom-up matching procedure is implemented via a multi-layer Hopfield neural network. The corresponding energy function supports not only connections between cells at the same resolution level but also interconnections between sets of nodes at different resolution levels. The proposed method also uses feature analysis at the higher resolution levels of the target to relocate the center of the fovea to a more salient region of the target (gaze control). The results of an experimental foveal target recognition scenario are presented.
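
The paper's specific energy function is not reproduced in this record. As a rough, hypothetical illustration of the multi-layer idea only, the sketch below builds a generic two-level Hopfield network in Python/NumPy: the diagonal blocks of the weight matrix stand in for same-level connections, and the off-diagonal blocks for the interconnections between two resolution levels. All names (build_weights, relax), the bias terms, and the random weights are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def build_weights(W_coarse, W_fine, W_inter):
    """Stack two resolution levels into one symmetric weight matrix.

    Diagonal blocks hold intra-level connections; off-diagonal blocks
    hold the interconnections between the two resolution levels.
    """
    n_c = W_coarse.shape[0]
    n = n_c + W_fine.shape[0]
    W = np.zeros((n, n))
    W[:n_c, :n_c] = W_coarse
    W[n_c:, n_c:] = W_fine
    W[:n_c, n_c:] = W_inter
    W[n_c:, :n_c] = W_inter.T
    np.fill_diagonal(W, 0.0)  # Hopfield networks use no self-connections
    return W

def energy(v, W, b):
    """Standard Hopfield energy: E(v) = -1/2 v^T W v - b^T v."""
    return -0.5 * v @ W @ v - b @ v

def relax(v0, W, b, n_sweeps=50, seed=0):
    """Asynchronous binary updates. With symmetric W and zero diagonal,
    every accepted update lowers (or preserves) the energy, so the state
    settles into a local minimum, standing in here for the best match."""
    rng = np.random.default_rng(seed)
    v = v0.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(v)):
            v[i] = 1.0 if W[i] @ v + b[i] > 0 else 0.0
    return v

# Toy run: 3 coarse-level nodes and 4 fine-level nodes, random weights.
rng = np.random.default_rng(1)
Wc = rng.normal(size=(3, 3)); Wc = 0.5 * (Wc + Wc.T)
Wf = rng.normal(size=(4, 4)); Wf = 0.5 * (Wf + Wf.T)
W = build_weights(Wc, Wf, rng.normal(size=(3, 4)))
b = rng.normal(size=7)          # bias: per-node match-quality evidence
v0 = rng.integers(0, 2, size=7).astype(float)
v_star = relax(v0, W, b)
print(energy(v0, W, b), "->", energy(v_star, W, b))
```

In the paper's setting, the biases would presumably encode similarity between candidate and model features at each resolution level, and the minimum-energy state would indicate the matched target model.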

Original language: English (US)
Title of host publication: IEEE International Conference on Image Processing
Editors: Anon
Publisher: IEEE
Pages: 303-306
Number of pages: 4
Volume: 1
State: Published - Dec 1 1996
Externally published: Yes
Event: Proceedings of the 1996 IEEE International Conference on Image Processing, ICIP'96. Part 2 (of 3) - Lausanne, Switzerland
Duration: Sep 16 1996 - Sep 19 1996

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Electrical and Electronic Engineering
