
Soccer without intelligence

Bibliographic Details
Main Authors: Mericli, T., Akin, H.L.
Format: Conference Proceeding
Language: English
Description
Summary: Robot soccer is an excellent testbed for exploring innovative ideas and testing algorithms in multi-agent systems (MAS) research. A soccer team should play in an organized manner in order to score more goals than the opponent, which requires well-developed individual and collaborative skills such as dribbling the ball, positioning, and passing. However, none of these skills needs to be perfect, and they do not require highly complicated models to give satisfactory results. This paper proposes an approach inspired by ants modeled as Braitenberg vehicles: those skills are implemented as combinations of very primitive behaviors, without explicit communication or role-assignment mechanisms, and reinforcement learning is applied to construct the optimal state-action mapping. Experiments demonstrate that a team of robots can indeed learn to play soccer reasonably well without complex environment models and state representations. After very short training sessions, the team began scoring more goals than opponents that use complex behavior code, and because of its very simple state representation, the team could adapt to the opponents' strategies during the games.
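
The abstract describes learning a state-action mapping over a small set of primitive behaviors via reinforcement learning. The following is only a minimal illustrative sketch of that general idea (tabular Q-learning choosing among primitive behaviors), not the authors' actual method or code; the state encoding, behavior names, and parameters are hypothetical assumptions.

```python
import random
from collections import defaultdict

# Hypothetical primitive behaviors a player can execute; names are illustrative,
# not taken from the paper.
ACTIONS = ["approach_ball", "dribble_toward_goal", "support_position"]

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount factor, exploration rate

# Q-table over (state, action) pairs; states are assumed to be small hashable
# tuples such as (has_ball, ball_region), reflecting a very simple representation.
q_table = defaultdict(float)

def choose_action(state):
    """Epsilon-greedy selection over the primitive behaviors."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_table[(state, a)])

def update(state, action, reward, next_state):
    """Standard Q-learning update toward reward + discounted best next value."""
    best_next = max(q_table[(next_state, a)] for a in ACTIONS)
    q_table[(state, action)] += ALPHA * (reward + GAMMA * best_next - q_table[(state, action)])
```

With a coarse state and only a handful of behaviors, the table stays tiny, which is consistent with the abstract's claim that simple representations allow fast training and adaptation.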
DOI: 10.1109/ROBIO.2009.4913322