|
27 | 27 | - [Path Track](#path-track)
|
28 | 28 | - [TAO](#tao)
|
29 | 29 | - [GMOT-40](#gmot-40)
|
| 30 | + - [TAO-OW](#tao-ow)
30 | 31 | - [KITTI-Tracking](#kitti-tracking)
|
31 | 32 | - [APOLLOSCAPE](#apolloscape)
|
32 | 33 | - [APOLLO Detection/Tracking](#apollo-dectiontracking)
|
@@ -79,6 +80,8 @@ Simple Cues Lead to a Strong Multi-Object Tracker [[code]()] [[paper](https://ar
|
79 | 80 |
|
80 | 81 | **OC-SORT**: Observation-Centric SORT: Rethinking SORT for Robust Multi-Object Tracking [[code](https://github.com/noahcao/OC_SORT)] [[paper](https://arxiv.org/pdf/2203.14360.pdf)]
|
81 | 82 |
|
| 83 | +**GTR**: Global Tracking Transformers [[code](https://github.com/xingyizhou/GTR)] [[paper](https://arxiv.org/pdf/2203.13250.pdf)] **CVPR 2022**
| 84 | +
82 | 85 | **StrongSORT**: StrongSORT: Make DeepSORT Great Again [[code](https://github.com/dyhBUPT/StrongSORT)] [[paper](https://arxiv.org/pdf/2202.13514.pdf)]
|
83 | 86 |
|
84 | 87 | **MAA**: Modelling Ambiguous Assignments for Multi-Person Tracking in Crowds [[code]()] [[paper](https://openaccess.thecvf.com/content/WACV2022W/HADCV/papers/Stadler_Modelling_Ambiguous_Assignments_for_Multi-Person_Tracking_in_Crowds_WACVW_2022_paper.pdf)]
|
@@ -469,6 +472,15 @@ CMU et al. proposed a new large-scale MOT dataset this year, TAO (Tracking Any O
|
469 | 472 | 
|
470 | 473 | Unfortunately, the authors will not release the dataset until the paper has been accepted.
|
471 | 474 |
|
| 475 | +<a id="markdown-tao-ow" name="tao-ow"></a>
| 476 | +### TAO-OW
| 477 | +
| 478 | +TAO-OW is an open-world multi-object tracking effort proposed by the MOTChallenge team together with other collaborators. It provides a dataset, a benchmark submission website, and newly proposed evaluation metrics, advancing the open-world tracking task in MOT.
| 479 | +
| 480 | +
| 481 | +
| 482 | +The project page is at [https://openworldtracking.github.io/](https://openworldtracking.github.io/).
| 483 | +
472 | 484 | ---
|
473 | 485 |
|
474 | 486 | The following are datasets for driving scenarios.
|
|