
Learning Scene-Independent Group Descriptors for Crowd Understanding

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, June 2017, Vol. 27 (6), pp. 1290-1303
Main Authors: Jing Shao, Chen Change Loy, Xiaogang Wang
Format: Article
Language: English
Description
Summary: Groups are the primary entities that make up a crowd. Understanding group-level dynamics and properties is thus scientifically important and practically useful in a wide range of applications, especially for crowd understanding. In this paper, we show that fundamental group-level properties, such as intra-group stability and inter-group conflict, can be systematically quantified by visual descriptors. This is made possible through learning a novel collective transition prior, which leads to a robust approach for group segregation in public spaces. From the segregated groups, we further devise a rich set of group-property visual descriptors. These descriptors are scene-independent and can be effectively applied to public scenes with a variety of crowd densities and distributions. Extensive experiments on hundreds of public-scene video clips demonstrate that these property descriptors are complementary to each other, scene-independent, and convey critical information on the physical states of a crowd. The proposed group-level descriptors show promising results and potential in multiple applications, including crowd dynamic monitoring, crowd video classification, and crowd video retrieval.
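As a rough illustration of what an "intra-group stability" measure could look like (this is a hypothetical sketch, not the authors' actual descriptor), the following Python snippet scores how consistently each group member keeps the same k nearest neighbours across tracked frames; the tracks array, the choice of k, and the neighbour-overlap scoring are all illustrative assumptions.

import numpy as np

def intra_group_stability(tracks, k=5):
    # Hypothetical sketch: score in [0, 1] of how well each member of a
    # group keeps the same k nearest neighbours over time (higher = more
    # stable). tracks has shape (T, N, 2): N member positions over T
    # frames. Assumes T >= 2 and N > k.
    T, N, _ = tracks.shape
    k = min(k, N - 1)

    def knn_sets(points):
        # Pairwise distances between members; exclude self-distances.
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        return [set(np.argsort(d[i])[:k]) for i in range(N)]

    ref = knn_sets(tracks[0])
    overlaps = []
    for t in range(1, T):
        cur = knn_sets(tracks[t])
        # Fraction of first-frame neighbours still among the k nearest.
        overlaps.append(np.mean([len(ref[i] & cur[i]) / k for i in range(N)]))
    return float(np.mean(overlaps))

A group moving coherently (e.g. pedestrians walking together) would keep its neighbour sets largely intact and score near 1, while a dispersing or conflicting group would see its neighbour sets churn and score lower.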
ISSN: 1051-8215, 1558-2205
DOI: 10.1109/TCSVT.2016.2539878