Biologically Inspired Algorithms for Visual Navigation and Object Perception in Mobile Robotics

Persistent Link:
http://hdl.handle.net/10150/612074
Title:
Biologically Inspired Algorithms for Visual Navigation and Object Perception in Mobile Robotics
Author:
Northcutt, Brandon D.
Issue Date:
2016
Publisher:
The University of Arizona.
Rights:
Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Embargo:
Release after 06-Apr-2017
Abstract:
There is a large gap between the visual capabilities of biological organisms and those of autonomous robots. Even the simplest flying insects can fly within complex environments, locate food, avoid obstacles and elude predators with seeming ease. This stands in stark contrast to even the most advanced modern ground-based or flying autonomous robots, which are capable of autonomous navigation only within simple environments and fail spectacularly if the expected environment is modified even slightly. This dissertation provides a narrative of the author's graduate research into biologically inspired algorithms for visual perception and navigation in autonomous robotics applications. This research led to several novel algorithms and neural network implementations that provide improved visual sensing capabilities with exceedingly light computational requirements. A new, computationally minimal approach to visual motion detection was developed and demonstrated to provide obstacle avoidance without the need for directional specificity. In addition, a novel method of calculating sparse range estimates to visual object boundaries was demonstrated for localization, navigation and mapping using one-dimensional image arrays. Lastly, an assembly of recurrent inhibitory neural networks was developed to provide concurrent detection of multiple objects, visual feature binding, and internal neural representation of visual objects. These algorithms are promising avenues for future research and are likely to lead to more general, robust and computationally minimal systems of passive visual sensation for a wide variety of autonomous robotics applications.
Type:
text; Electronic Dissertation
Keywords:
neural networks; object perception; sparse mapping; visual navigation; Electrical & Computer Engineering; mobile robotics
Degree Name:
Ph.D.
Degree Level:
doctoral
Degree Program:
Graduate College; Electrical & Computer Engineering
Degree Grantor:
University of Arizona
Advisor:
Higgins, Charles M.
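
The abstract above describes, among other contributions, a computationally minimal, non-directional visual motion detector used for obstacle avoidance. The record gives no implementation details, so the following minimal Python sketch is only an illustration of that general idea, not the author's algorithm: every name, the half-field split, and the steering rule are assumptions. It sums absolute frame-to-frame intensity change over the left and right halves of a one-dimensional image array and steers away from the side with more apparent motion, on the premise that nearby obstacles produce larger image motion during forward travel.

import numpy as np

def motion_energy(prev_frame: np.ndarray, frame: np.ndarray) -> np.ndarray:
    # Non-directional motion cue: absolute temporal intensity change
    # between two 1-D image arrays (no direction of motion is recovered).
    return np.abs(frame.astype(float) - prev_frame.astype(float))

def steering_command(prev_frame: np.ndarray, frame: np.ndarray) -> float:
    # Hypothetical steering rule (not from the dissertation): return a
    # turn command in [-1, 1], positive meaning "turn right", computed
    # from the imbalance of motion energy between the two half-fields.
    energy = motion_energy(prev_frame, frame)
    mid = energy.size // 2
    left, right = energy[:mid].sum(), energy[mid:].sum()
    total = left + right
    if total == 0.0:
        return 0.0  # no detectable motion; hold course
    return (left - right) / total  # more motion on the left -> steer right

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.integers(0, 256, size=64)
    cur = prev.copy()
    cur[:16] = rng.integers(0, 256, size=16)  # simulate strong change on the left
    print(f"turn command: {steering_command(prev, cur):+.2f}")  # positive: steer right

A single subtraction, absolute value and two sums per frame keep the per-update cost linear in the number of photoreceptors, consistent with the abstract's emphasis on exceedingly light computational requirements.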

Full metadata record

DC Field | Value | Language
dc.language.iso | en_US | en
dc.title | Biologically Inspired Algorithms for Visual Navigation and Object Perception in Mobile Robotics | en_US
dc.creator | Northcutt, Brandon D. | en
dc.contributor.author | Northcutt, Brandon D. | en
dc.date.issued | 2016 | -
dc.publisher | The University of Arizona. | en
dc.rights | Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author. | en
dc.description.release | Release after 06-Apr-2017 | en
dc.description.abstract | There is a large gap between the visual capabilities of biological organisms and those of autonomous robots. Even the simplest flying insects can fly within complex environments, locate food, avoid obstacles and elude predators with seeming ease. This stands in stark contrast to even the most advanced modern ground-based or flying autonomous robots, which are capable of autonomous navigation only within simple environments and fail spectacularly if the expected environment is modified even slightly. This dissertation provides a narrative of the author's graduate research into biologically inspired algorithms for visual perception and navigation in autonomous robotics applications. This research led to several novel algorithms and neural network implementations that provide improved visual sensing capabilities with exceedingly light computational requirements. A new, computationally minimal approach to visual motion detection was developed and demonstrated to provide obstacle avoidance without the need for directional specificity. In addition, a novel method of calculating sparse range estimates to visual object boundaries was demonstrated for localization, navigation and mapping using one-dimensional image arrays. Lastly, an assembly of recurrent inhibitory neural networks was developed to provide concurrent detection of multiple objects, visual feature binding, and internal neural representation of visual objects. These algorithms are promising avenues for future research and are likely to lead to more general, robust and computationally minimal systems of passive visual sensation for a wide variety of autonomous robotics applications. | en
dc.type | text | en
dc.type | Electronic Dissertation | en
dc.subject | neural networks | en
dc.subject | object perception | en
dc.subject | sparse mapping | en
dc.subject | visual navigation | en
dc.subject | Electrical & Computer Engineering | en
dc.subject | mobile robotics | en
thesis.degree.name | Ph.D. | en
thesis.degree.level | doctoral | en
thesis.degree.discipline | Graduate College | en
thesis.degree.discipline | Electrical & Computer Engineering | en
thesis.degree.grantor | University of Arizona | en
dc.contributor.advisor | Higgins, Charles M. | en
dc.contributor.committeemember | Tharp, Hal S. | en
dc.contributor.committeemember | Vasić, Bane | en
dc.contributor.committeemember | Bilgin, Ali | en
dc.contributor.committeemember | Higgins, Charles M. | en