
Baseline Configurations

We provide code to reproduce the baselines that ablate the predictor, the decoder, and the encoder, respectively.

The baselines are configured as follows (non-default flags in bold):

| method | mod | graph | gconv_unit_type | encoder | decoder |
| --- | --- | --- | --- | --- | --- |
| CVP (default) | cvp | fact_gc | n2e2n | traj | comb_late |
| No-Factor [23] | **noFactor** | **fact_fc** | - | traj | **noFactor** |
| No-Edge | cvp | fact_gc | **noEdge** | traj | comb_late |
| Early-Feat | cvp | fact_gc | n2e2n | traj | **comb_early** |
| Mid-Feat | cvp | fact_gc | n2e2n | traj | **comb_mid** |
| Pixel | cvp | fact_gc | n2e2n | traj | **cPix** |
| No-Z | cvp | fact_gc | n2e2n | **noZ** | comb_late |
| FP [6] | cvp | fact_gc | n2e2n | **fp** | comb_late |
| LP [6] | **lp** | fact_gc | n2e2n | **lp** | comb_late |
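The mapping from baseline names to flags can also be written down programmatically. The helper below is an illustrative sketch (not part of the repo): it records only the non-default flags from the table and assembles the corresponding train.py command line.

```python
# Illustrative helper (not part of the repo): map each baseline from the
# table above to the non-default train.py flags it requires.
BASELINE_FLAGS = {
    "CVP":        {},  # the default configuration trains CVP
    "No-Factor":  {"mod": "noFactor", "graph": "fact_fc", "decoder": "noFactor"},
    "No-Edge":    {"gconv_unit_type": "noEdge"},
    "Early-Feat": {"decoder": "comb_early"},
    "Mid-Feat":   {"decoder": "comb_mid"},
    "Pixel":      {"decoder": "cPix"},
    "No-Z":       {"encoder": "noZ"},
    "FP":         {"encoder": "fp"},
    "LP":         {"mod": "lp", "encoder": "lp"},
}

def build_command(baseline: str, gpu: int = 0) -> str:
    """Assemble the train.py invocation for a named baseline."""
    flags = "".join(f" --{k} {v}" for k, v in BASELINE_FLAGS[baseline].items())
    return f"python train.py --gpu {gpu}{flags}"

print(build_command("No-Edge"))
# -> python train.py --gpu 0 --gconv_unit_type noEdge
```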

The default configuration trains our CVP model; to run a baseline, simply set the flags shown in bold to the corresponding non-default values.

For convenience, the rest of this document lists the exact commands that reproduce each configuration in the table:

Entity Predictor

  • No-Factor:
python train.py --gpu ${GPU_ID} --mod noFactor --graph fact_fc --decoder noFactor
  • No-Edge:
python train.py --gpu ${GPU_ID} --gconv_unit_type noEdge

Frame Decoder

  • Early-Feat:
python train.py --decoder comb_early
  • Mid-Feat:
python train.py --decoder comb_mid
  • Late-Feat (ours):
python train.py --decoder comb_late
  • Pixel:
python train.py --decoder cPix
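To compare all four decoder variants, one can sweep the --decoder flag in a shell loop. The sketch below only echoes each command as a dry run; remove the echo to actually launch training:

```shell
# Sketch: print (dry run) a training command for each decoder ablation.
# Remove the leading echo to launch the runs for real.
for dec in comb_early comb_mid comb_late cPix; do
  echo "python train.py --decoder ${dec}"
done
```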

Latent Representations and Encoder

The --encoder flag can be traj (ours), noZ, fp, or lp.

  • No-Z baseline:
python train.py --encoder noZ
  • FP baseline:
python train.py --encoder fp
  • LP baseline:
python train.py --mod lp --encoder lp

Testing

All models can be evaluated directly by running:

python test.py --checkpoint ${PATH_TO_MODEL} --test_mod best_100
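If you have trained several baselines, a small shell loop can evaluate each one in turn. The checkpoint filenames below are placeholders, and the loop echoes each command as a dry run; remove the echo to actually run the evaluation:

```shell
# Hypothetical dry run: print the evaluation command for several checkpoints.
# The checkpoint filenames are placeholders, not files shipped with the repo.
for ckpt in cvp_best.pth noEdge_best.pth noZ_best.pth; do
  echo "python test.py --checkpoint ${ckpt} --test_mod best_100"
done
```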