Tracking Interacting Objects Using Integer Programming

In this work, we show that tracking different kinds of interacting objects can be formulated as a network-flow Mixed Integer Program. This is made possible by tracking all objects simultaneously and by expressing, in terms of linear flow constraints, the fact that one object can appear or disappear only at locations where another is present.

We demonstrate the power of our approach on scenes involving cars and pedestrians, bags being carried and dropped by people, and balls being passed from one player to the next in team sports. In particular, we show that by estimating the trajectories of different types of objects jointly and globally, the presence of objects that could not initially be detected from image evidence alone can be inferred from the detections of the others.
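To make the formulation concrete, here is a minimal, self-contained sketch of the two kinds of constraints described above: flow conservation for one object type, and a linear coupling constraint that lets a second object type (a bag) appear only where the first (a person) is present. All locations, scores, and variable names are illustrative toy values, not the paper's actual data, and brute-force enumeration stands in for the MIP solver (in practice the program is handed to GUROBI):

```python
from itertools import product

# Toy instance: one time step of motion between 2 locations (A, B).
# Binary flow variables for a person on spatio-temporal edges, plus a
# binary "bag appears at location l" variable coupled to the person.
# Scores below are illustrative detection log-likelihood ratios.
person_edges = {
    ("A", "A"): 2.0,   # person stays at A
    ("A", "B"): 1.5,   # person moves from A to B
    ("B", "A"): -1.0,
    ("B", "B"): 0.5,
}
bag_score = {"A": 1.0, "B": 3.0}  # score for a bag appearing at A or B

edge_keys = list(person_edges)
best = None
for flows in product([0, 1], repeat=len(edge_keys)):
    f = dict(zip(edge_keys, flows))
    # Flow conservation with a single person: at most one unit of flow
    # leaves the (implicit) source, so at most one edge is active.
    if sum(f.values()) > 1:
        continue
    for bag_loc in (None, "A", "B"):
        if bag_loc is not None:
            # Coupling constraint (linear): b_l <= sum of person flows
            # arriving at l, i.e. a bag may only appear where a person is.
            arriving = sum(v for (u, w), v in f.items() if w == bag_loc)
            if arriving < 1:
                continue
        score = sum(person_edges[e] * v for e, v in f.items())
        if bag_loc is not None:
            score += bag_score[bag_loc]
        if best is None or score > best[0]:
            best = (score, f, bag_loc)

print(best)  # optimum: person moves A -> B, bag appears at B, score 4.5
```

Note that the lone bag score at B (3.0) would be attainable only jointly with a person flow into B; this is how the coupling constraint lets the detections of one object class explain the appearance of another.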

FIBA dataset
People Car dataset
PETS2006 dataset

Source Code

Our code is available under the terms of version 3 of the GNU General Public License (GPLv3), as published by the Free Software Foundation.

Please note that this tracker builds on POM and KSP, whose code is also publicly available. To run our tracker, you first need a GUROBI license.


ECCV’14 Version Password: DNECCV14

PAMI’16 Version  Password: XWPAMI16


The People-Car dataset used in this project can be downloaded from the link below. Because there are many frames, we annotated only a subset of them. If you would like to contribute additional annotations, please contact us.


People-Car Dataset

If you use this dataset for publication purposes, please cite the papers listed below.


X. Wang, E. Türetken, F. Fleuret, P. Fua: Tracking Interacting Objects Using Intertwined Flows. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2016. DOI: 10.1109/TPAMI.2015.2513406.
X. Wang, E. Türetken, F. Fleuret, P. Fua: Tracking Interacting Objects Optimally Using Integer Programming. European Conference on Computer Vision (ECCV), Zurich, Switzerland, September 6-12, 2014, pp. 17-32.


Xinchao Wang [email]