Abstract
We have developed a learning mechanism that allows robots to discover the conditional effects of their actions. Working from sensorimotor experience, the mechanism lets a robot explore its environment and observe the effects of its actions. These observations are used to learn a context operator difference table, a structure that relates circumstances (context) and actions (operators) to effects on the environment. From the context operator difference table, one can extract a relatively small set of state variables, which simplifies the problem of learning policies for complex activities. We demonstrate results with the Pioneer 1 mobile robot.
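The paper does not include code, but a minimal sketch may help convey the idea of a context operator difference table as described in the abstract: a structure keyed by (context, operator) pairs that accumulates the differences (effects) observed after each action. The class and method names below (`CODTable`, `record`, `effects_for`) and the example contexts and effects are hypothetical illustrations, not taken from the paper.

```python
from collections import defaultdict, Counter

class CODTable:
    """Hypothetical sketch of a context-operator-difference table.

    Each entry maps a (context, operator) pair to a tally of the
    differences (effects) observed after executing that operator
    in that context.
    """

    def __init__(self):
        self.table = defaultdict(Counter)

    def record(self, context, operator, difference):
        """Record one observed effect of an action taken in a given context."""
        self.table[(context, operator)][difference] += 1

    def effects_for(self, context, operator):
        """Return the observed effects and their relative frequencies."""
        counts = self.table[(context, operator)]
        total = sum(counts.values())
        return {diff: n / total for diff, n in counts.items()} if total else {}


# Illustrative use: a robot moving forward in two different contexts.
cod = CODTable()
cod.record(context="near-wall", operator="move-forward", difference="bump-sensor-on")
cod.record(context="open-space", operator="move-forward", difference="translational-velocity-up")
print(cod.effects_for("near-wall", "move-forward"))
```

Aggregating effect frequencies per (context, operator) pair is one plausible reading of how such a table could be used to identify which context variables are relevant to an action's outcome.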
| Original language | English (US) |
|---|---|
| Pages | 247-253 |
| Number of pages | 7 |
| State | Published - 1998 |
| Externally published | Yes |
| Event | Proceedings of the 1998 2nd International Conference on Autonomous Agents, Minneapolis, MN, USA, May 9-13, 1998 |
ASJC Scopus subject areas
- General Engineering