HKUST Institutional Repository > Electronic and Computer Engineering > ECE Master Theses
|Title: ||Learning mobile robot control for obstacle avoidance based on motion energy neurons|
|Authors: ||Gao, Minqi|
|Issue Date: ||2009 |
|Abstract: ||Image motion due to self-motion is an important cue that biological systems use to gather information about the environment. The motion energy model is commonly used to model the responses of motion-selective neurons in the mammalian primary visual cortex. Here we investigate the hypothesis that these low-level responses are directly useful for navigation, which avoids the need to estimate a model of the environment and the delay incurred in computing it.
To discover the relationship between the neuron responses and the motor control required to avoid obstacles in the environment, we use reinforcement learning: a robot equipped with infrared depth sensors is trained to avoid objects using the outputs of simulated motion energy neurons, by maximizing the long-term average of the infrared signals it receives.
Our experiments with a Koala robot indicate that the responses of the motion energy neurons are reliable indicators of image motion and can effectively trigger obstacle avoidance behaviors. Furthermore, we demonstrate that obstacle avoidance can emerge via reinforcement learning. It is therefore possible for a mobile robot to behave intelligently in its environment based on the responses of motion energy neurons without constructing an explicit representation of the environment; rather, a low-level, neurally inspired representation of visual motion can drive avoidance behavior directly.
Keywords: image motion, self-motion, motion energy model, reinforcement learning, robot navigation|
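The motion energy model referenced in the abstract combines the responses of quadrature-pair (90°-phase-offset) filters so that the result is direction-selective but phase-invariant. The sketch below is a minimal two-frame illustration of that idea, following the classic Adelson–Bergen construction; all filter sizes, frequencies, and parameter values are illustrative assumptions, not the configuration used in the thesis.

```python
import numpy as np

def gabor_quadrature(size=16, freq=0.25, theta=0.0):
    """Spatial Gabor quadrature pair (even- and odd-phase filters).

    Illustrative parameters: `size` pixels square, `freq` cycles/pixel,
    orientation `theta` in radians. Not the thesis's actual filter bank.
    """
    sigma = size / 6.0
    xs = np.arange(size) - (size - 1) / 2.0
    x, y = np.meshgrid(xs, xs)                      # x varies along axis 1
    xr = x * np.cos(theta) + y * np.sin(theta)      # coordinate along the grating
    env = np.exp(-(x**2 + y**2) / (2 * sigma**2))   # Gaussian envelope
    even = env * np.cos(2 * np.pi * freq * xr)
    odd = env * np.sin(2 * np.pi * freq * xr)
    return even, odd

def motion_energy(frame_t0, frame_t1, even, odd):
    """Opponent motion energy from two consecutive frames.

    Filter responses at the two time steps are combined into rightward
    and leftward energies; their difference signals the direction of
    image motion independently of stimulus phase.
    """
    e0, o0 = np.sum(frame_t0 * even), np.sum(frame_t0 * odd)
    e1, o1 = np.sum(frame_t1 * even), np.sum(frame_t1 * odd)
    right = (e0 + o1) ** 2 + (o0 - e1) ** 2   # energy for rightward motion
    left = (e0 - o1) ** 2 + (o0 + e1) ** 2    # energy for leftward motion
    return right - left                        # >0: rightward, <0: leftward
```

For a drifting grating, the opponent energy comes out positive for rightward motion and negative for leftward motion, which is the phase-invariant direction signal the phrase "motion energy neuron" refers to.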
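The abstract also describes learning the mapping from neuron responses to motor commands by maximizing a long-run average reward. The toy loop below sketches that training scheme with tabular Q-learning; the three-state discretization of the motion energy pattern, the steering actions, and the simulated reward standing in for the infrared proximity signal are all hypothetical simplifications, not the thesis's actual algorithm or environment.

```python
import numpy as np

rng = np.random.default_rng(0)
N_STATES, N_ACTIONS = 3, 3           # states: obstacle-left / clear / obstacle-right
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1    # actions: turn-left / straight / turn-right

def step(state, action):
    """Toy world: steering away from the obstacle keeps the (simulated)
    infrared reward high; driving toward it yields no reward."""
    if state == 1:                        # path ahead is clear
        reward = 1.0 if action == 1 else 0.5
    else:
        away = 2 if state == 0 else 0     # action that turns away from the obstacle
        reward = 1.0 if action == away else 0.0
    return int(rng.integers(N_STATES)), reward   # next obstacle layout is random

Q = np.zeros((N_STATES, N_ACTIONS))
state = 1
for _ in range(5000):
    # Epsilon-greedy exploration over the current value estimates.
    if rng.random() < EPS:
        action = int(rng.integers(N_ACTIONS))
    else:
        action = int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    # Standard Q-learning update toward the bootstrapped return.
    Q[state, action] += ALPHA * (reward + GAMMA * np.max(Q[next_state]) - Q[state, action])
    state = next_state
```

After training, the greedy policy `np.argmax(Q, axis=1)` turns away from whichever side the obstacle is on, mirroring (in miniature) how avoidance behavior can emerge from reward maximization alone.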
|Description: ||Thesis (M.Phil.)--Hong Kong University of Science and Technology, 2009|
x, 49 p. : ill. ; 30 cm
HKUST Call Number: Thesis ECED 2009 Gao
|Appears in Collections:||ECE Master Theses|
All items in this Repository are protected by copyright, with all rights reserved.