Multi-agent deep-learning based comparative analysis of team sport trajectories
Published in: IEEE Access, 2023-01, Vol. 11, p. 1-1
Main Authors: , , ,
Format: Article
Language: English
Summary: Computational analysis of multi-agent trajectories is a fundamental issue in the study of real-world biological agents. For trajectory analysis, combining movement data with labels (e.g., whether a team scores in a ball game) can provide additional insights compared with relying only on trajectory data. However, existing deep-learning-based methods consider only single-agent animal trajectories and cannot be directly applied to multi-agent trajectories in sports. In this paper, a comparative analysis method for multi-agent trajectories in ball games is proposed. A neural network approach using multi-agent motion characteristics (e.g., the distances between agents and objects) as the input is adopted, based on an attention mechanism designed to automatically detect the segments of trajectories that are characteristic of a given class. This enables us to understand differences between classes by highlighting the trajectory segments and variables that correlate with the given labels. The effectiveness of our approach was verified by comparing it with various baselines on effective/ineffective attack labels and goal/non-goal labels using datasets of different sizes. The effectiveness of our method is also demonstrated through a use case analyzing the attacking plays in an NBA dataset. (A minimal illustrative sketch of such an attention-based trajectory classifier follows this record.)
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3269287
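
The abstract describes an attention-based neural network that classifies labeled multi-agent trajectories from hand-crafted motion characteristics and uses temporal attention weights to highlight the segments characteristic of each class. The PyTorch sketch below is a rough illustration of such a temporal-attention classifier, not the authors' implementation; the class name `AttentionTrajectoryClassifier`, the GRU encoder, the feature count, and all hyperparameters are hypothetical assumptions rather than details taken from the paper.

```python
# Minimal sketch (assumptions only): a temporal-attention classifier over
# per-timestep multi-agent motion features, e.g. player-player and
# player-ball distances, with input shape (batch, timesteps, features).
import torch
import torch.nn as nn


class AttentionTrajectoryClassifier(nn.Module):
    def __init__(self, num_features: int, hidden_dim: int = 64, num_classes: int = 2):
        super().__init__()
        # Encode the per-timestep feature vectors with a bidirectional GRU.
        self.encoder = nn.GRU(num_features, hidden_dim,
                              batch_first=True, bidirectional=True)
        # One attention score per timestep; a softmax over time yields weights
        # indicating which segments are characteristic of the predicted class.
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x: torch.Tensor):
        # x: (batch, timesteps, num_features)
        h, _ = self.encoder(x)                                   # (B, T, 2H)
        scores = self.attn_score(h).squeeze(-1)                  # (B, T)
        weights = torch.softmax(scores, dim=1)                   # temporal attention
        context = torch.sum(weights.unsqueeze(-1) * h, dim=1)    # weighted pooling
        logits = self.classifier(context)                        # (B, num_classes)
        return logits, weights                                   # weights flag key segments


if __name__ == "__main__":
    # Toy example: 8 trajectories, 120 frames, 45 hypothetical distance features,
    # with binary labels (e.g., goal / non-goal).
    x = torch.randn(8, 120, 45)
    labels = torch.randint(0, 2, (8,))
    model = AttentionTrajectoryClassifier(num_features=45)
    logits, attn = model(x)
    loss = nn.functional.cross_entropy(logits, labels)
    print(logits.shape, attn.shape, float(loss))
```

In this kind of design, inspecting the returned attention weights for each trajectory is what allows characteristic segments to be highlighted and compared across classes, which is the analysis use the summary describes.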