A Neural Network Approach for Predicting Collision Severity
Format: Report
Language: English
Summary: The development of a collision severity model can serve as an important tool in understanding the requirements for devising countermeasures to improve occupant safety and traffic safety. Collision type, weather conditions, and driver intoxication are some of the factors that may influence motor vehicle collisions. The objective of this study is to use artificial neural networks (ANNs) to identify the major determinants of, or contributors to, fatal collisions based on various driver, vehicle, and environment characteristics obtained from collision data from Transport Canada. The developed model has the capability to predict similar collision outcomes based on the variables analyzed in this study. A multilayer perceptron (MLP) neural network with a feed-forward back-propagation architecture is used to develop a generalized model for predicting collision severity. The model output, collision severity, is divided into three categories: fatal, injury, and property damage only. Because the data are qualitative in nature, particular care is taken to ensure the development of a reliable model with a considerable level of accuracy. Once the neural network model is developed, a sensitivity analysis is conducted to determine the relationship between the input and output variables. This paper presents techniques that allow the development of a generalized model for conducting a comprehensive analysis using qualitative data and artificial neural networks.
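The approach the abstract describes (a feed-forward MLP trained with back-propagation on one-hot-encoded qualitative inputs, three severity classes, and a post-hoc sensitivity analysis) can be sketched as follows. This is a minimal illustration only: the actual Transport Canada data, network topology, training settings, and sensitivity method are not given in the record, so the feature encoding, layer sizes, hyperparameters, and synthetic data below are all assumptions.

```python
# Illustrative sketch (NOT the paper's model): a one-hidden-layer MLP with
# back-propagation, softmax output over three severity classes, and a simple
# toggle-based sensitivity analysis. Data, sizes, and learning rate are invented.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-hot-encoded qualitative inputs (e.g. collision type,
# weather, driver intoxication) and three classes: fatal, injury, PDO.
n_features, n_hidden, n_classes = 9, 12, 3
X = rng.integers(0, 2, size=(200, n_features)).astype(float)
y = rng.integers(0, n_classes, size=200)       # synthetic labels
Y = np.eye(n_classes)[y]                       # one-hot targets

W1 = rng.normal(0, 0.1, (n_features, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_classes)); b2 = np.zeros(n_classes)

def forward(X):
    H = np.tanh(X @ W1 + b1)                   # hidden layer
    Z = H @ W2 + b2
    Z = Z - Z.max(axis=1, keepdims=True)       # numerical stability
    P = np.exp(Z); P /= P.sum(axis=1, keepdims=True)  # softmax probabilities
    return H, P

# Back-propagation: gradient descent on the cross-entropy loss.
lr = 0.5
for epoch in range(300):
    H, P = forward(X)
    dZ = (P - Y) / len(X)                      # softmax + cross-entropy gradient
    dW2 = H.T @ dZ; db2 = dZ.sum(axis=0)
    dH = dZ @ W2.T * (1 - H**2)                # tanh derivative
    dW1 = X.T @ dH; db1 = dH.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, P = forward(X)
acc = (P.argmax(axis=1) == y).mean()

# Simple sensitivity analysis: toggle each binary input and measure the mean
# absolute change in the predicted class probabilities.
sensitivity = np.zeros(n_features)
for j in range(n_features):
    Xp = X.copy(); Xp[:, j] = 1 - Xp[:, j]
    _, Pp = forward(Xp)
    sensitivity[j] = np.abs(Pp - P).mean()

print("training accuracy:", round(float(acc), 3))
print("per-feature sensitivity:", np.round(sensitivity, 4))
```

Features with larger sensitivity values perturb the predicted severity distribution the most, which is the kind of input-output relationship a sensitivity analysis over a trained network is meant to surface.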
ISSN: 0148-7191, 2688-3627
DOI: 10.4271/2014-01-0569