Situation-specific learning for ego-vehicle behavior prediction systems
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: We present a system able to predict the future behavior of the ego-vehicle in an inner-city environment. Our system learns the mapping between the currently perceived scene (information about the ego-vehicle and the preceding vehicle, as well as information about possible traffic lights) and the future driving behavior of the ego-vehicle. We improve prediction accuracy by estimating the prediction confidence and discarding unconfident samples. The behavior of the driver is represented as a sequence of elementary states termed behavior primitives, which are abstractions from the raw actuator states. Behavior prediction is therefore treated as a multi-class learning problem. In this contribution, we explore the possibilities of situation-specific learning. We show that decomposing the perceived complex situation into a combination of simpler ones, each with a dedicated predictor, allows the system to reach a performance equivalent to that of a system without situation-specificity. We believe this is advantageous for scaling the approach to the number of possible situations the driver will encounter. The system is tested on a real-world scenario, using streams recorded in inner-city scenes. The prediction is evaluated for a prediction horizon of 3 s into the future, and its quality is measured using established evaluation methods.
ISSN: 2153-0009, 2153-0017
DOI: 10.1109/ITSC.2011.6083108
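To make the approach described in the summary above more concrete, the following is a minimal, hypothetical sketch of situation-specific, confidence-gated behavior-primitive prediction. It is not the authors' implementation: the classifier choice (logistic regression), the label set, the feature layout, the situation names, and the 0.6 confidence threshold are all assumptions introduced purely for illustration.

```python
# Hypothetical sketch of situation-specific, confidence-gated behavior prediction.
# Not the paper's code: model, features, labels, and threshold are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Assumed set of behavior primitives (abstractions of raw actuator states).
BEHAVIOR_PRIMITIVES = ["accelerate", "keep_velocity", "decelerate", "stop"]


class SituationSpecificPredictor:
    """One multi-class classifier per simple situation (e.g. car following,
    approaching a traffic light), plus confidence-based sample rejection."""

    def __init__(self, situations, confidence_threshold=0.6):
        # Hypothetical choice of model and rejection threshold.
        self.models = {s: LogisticRegression(max_iter=1000) for s in situations}
        self.threshold = confidence_threshold

    def fit(self, situation, features, labels):
        # features: scene descriptors (ego state, preceding vehicle, traffic light);
        # labels: indices into BEHAVIOR_PRIMITIVES observed 3 s later.
        self.models[situation].fit(features, labels)

    def predict(self, situation, features):
        # Returns (primitive, confidence), or (None, confidence) when the sample
        # is discarded as unconfident.
        model = self.models[situation]
        proba = model.predict_proba(features.reshape(1, -1))[0]
        best = int(np.argmax(proba))
        if proba[best] < self.threshold:
            return None, float(proba[best])
        return BEHAVIOR_PRIMITIVES[int(model.classes_[best])], float(proba[best])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))                       # toy scene features
    y = rng.integers(0, len(BEHAVIOR_PRIMITIVES), 200)  # toy primitive labels
    predictor = SituationSpecificPredictor(["car_following", "traffic_light"])
    predictor.fit("car_following", X, y)
    print(predictor.predict("car_following", X[0]))
```

Running the example prints either a predicted behavior primitive with its confidence, or None when the sample would be discarded as unconfident, mirroring the decomposition into per-situation predictors described in the summary.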