
This is the code for the papers "Gait Recognition in the Wild with Dense 3D Representations and A Benchmark" (CVPR 2022), "Gait Recognition in the Wild with Multi-hop Temporal Switch", "Parsing is All You Need for Accurate Gait Recognition in the Wild", and "It Takes Two: Accurate Gait Recognition in the Wild via Cross-granularity Alignment".


Gait3D-Benchmark

This repository contains the code and models for our CVPR 2022, ACM MM 2022, 2023, and 2024 papers. The Gait3D-Benchmark project is now maintained by Jinkai Zheng and Xinchen Liu. Thanks to all of our co-authors for their help, as well as the great repositories that we list in the Acknowledgement.

  • Gait3D (SMPLGait): Gait Recognition in the Wild with Dense 3D Representations and A Benchmark (CVPR 2022) [Project Page] [Paper]
  • MTSGait: Gait Recognition in the Wild with Multi-hop Temporal Switch (ACM MM 2022) [Paper]
  • Gait3D-Parsing (ParsingGait): Parsing is All You Need for Accurate Gait Recognition in the Wild (ACM MM 2023) [Project Page] [Paper]
  • XGait: It Takes Two: Accurate Gait Recognition in the Wild via Cross-granularity Alignment (ACM MM 2024) [Paper]

What's New

Model Zoo

Results and models are available in the model zoo.

Requirement and Installation

The requirement and installation procedure can be found here.

Data Downloading

Please download the Gait3D dataset by signing this agreement.

Please download the Gait3D-Parsing dataset by signing this agreement.

We ask for your information only to make sure the dataset is used for non-commercial purposes. We will not give it to any third party or publish it publicly anywhere.

Data Pretreatment

The data pretreatment can be found here.

Train

Run the following command:

sh train.sh

Test

Run the following command:

sh test.sh
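
For reference, in OpenGait-style repositories such wrapper scripts usually launch a distributed training or evaluation run of the main entry point. The sketch below is a hypothetical fragment, not the actual contents of train.sh/test.sh in this repository: the entry-point path, config file name, and GPU ids are all placeholder assumptions — check the real scripts and your installed setup before adapting it.

# Hypothetical content of train.sh (entry point, config path, and GPU ids are placeholders)
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nproc_per_node=2 \
    opengait/main.py --cfgs ./configs/smplgait/smplgait.yaml --phase train

A test.sh counterpart would typically differ only in passing --phase test and pointing at a saved checkpoint via the config.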

Citation

Please cite these papers in your publications if they help your research:

@inproceedings{zheng2022gait3d,
  title={Gait Recognition in the Wild with Dense 3D Representations and A Benchmark},
  author={Jinkai Zheng and Xinchen Liu and Wu Liu and Lingxiao He and Chenggang Yan and Tao Mei},
  booktitle={IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022}
}

@inproceedings{zheng2022mtsgait,
  title={Gait Recognition in the Wild with Multi-hop Temporal Switch},
  author={Jinkai Zheng and Xinchen Liu and Xiaoyan Gu and Yaoqi Sun and Chuang Gan and Jiyong Zhang and Wu Liu and Chenggang Yan},
  booktitle={ACM International Conference on Multimedia (ACM MM)},
  year={2022}
}

@inproceedings{zheng2023parsinggait,
  title={Parsing is All You Need for Accurate Gait Recognition in the Wild},
  author={Jinkai Zheng and Xinchen Liu and Shuai Wang and Lihao Wang and Chenggang Yan and Wu Liu},
  booktitle={ACM International Conference on Multimedia (ACM MM)},
  year={2023}
}

@inproceedings{zheng2024xgait,
  title={It Takes Two: Accurate Gait Recognition in the Wild via Cross-granularity Alignment},
  author={Jinkai Zheng and Xinchen Liu and Boyue Zhang and Chenggang Yan and Jiyong Zhang and Wu Liu and Yongdong Zhang},
  booktitle={ACM International Conference on Multimedia (ACM MM)},
  year={2024}
}

Acknowledgement

Here are some great resources we benefit from:

  • The codebase is based on OpenGait.
  • The 3D SMPL data is obtained by ROMP.
  • The 2D silhouette data is obtained by HRNet-segmentation.
  • The 2D parsing data is obtained by CDGNet.
  • The 2D pose data is obtained by HRNet.
  • The ReID feature used to make Gait3D is obtained by FastReID.
