For more information, please visit our project page.
Check INSTALL.md for installation details.
You can refer to DATA.md to download the original datasets and preprocess the data from scratch.
Alternatively, you can download our preprocessed datasets (to be released soon).
- Generate scenes (saved as JSON files) and evaluate CKL and the physical metrics.
sh run/test_livingroom.sh exp_dir
- Load the JSON files to generate images.
sh run/test_livingroom_gen_image.sh exp_dir
- Evaluate SCA, KID, and FID.
# SCA
python synthetic_vs_real_classifier.py --path_to_real_renderings data/preprocessed_data/LivingRoom/ --path_to_synthesized_renderings your/generated/image/folder
# KID and FID
python compute_fid_scores.py --path_to_real_renderings data/preprocessed_data/LivingRoom/ --path_to_synthesized_renderings your/generated/image/folder
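For intuition, the FID reported by `compute_fid_scores.py` is the Fréchet distance between Gaussian fits of image features from the real and synthesized renderings. The sketch below only illustrates that final formula, FID = ||mu_r - mu_s||^2 + Tr(C_r + C_s - 2 (C_r C_s)^{1/2}); it assumes you already have two (N, D) feature arrays (the repo's script handles the Inception feature extraction itself).

```python
# Minimal sketch of the Frechet distance underlying FID, given precomputed
# feature arrays. Not the repo's implementation; feature extraction omitted.
import numpy as np
from scipy import linalg


def frechet_distance(feats_real, feats_synth):
    mu_r, mu_s = feats_real.mean(0), feats_synth.mean(0)
    cov_r = np.cov(feats_real, rowvar=False)
    cov_s = np.cov(feats_synth, rowvar=False)
    covmean, _ = linalg.sqrtm(cov_r @ cov_s, disp=False)
    if np.iscomplexobj(covmean):
        # Numerical noise can introduce tiny imaginary parts; drop them.
        covmean = covmean.real
    diff = mu_r - mu_s
    return diff @ diff + np.trace(cov_r + cov_s - 2.0 * covmean)


rng = np.random.default_rng(0)
feats = rng.normal(size=(512, 8))
print(frechet_distance(feats, feats))  # near zero for identical feature sets
```

Identical feature sets give a distance near zero; the further apart the two feature distributions are, the larger the score.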
We also provide scripts for generating scenes from an unseen floor plan, such as a room in ProcTHOR. These scripts generate scene layouts without relying on any dataset, offering a lightweight solution for users who want to apply PhyScene to their own furniture datasets. See Procthor.md for more details.
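As a toy illustration of dataset-free layout generation (this is not PhyScene's actual method, and all room and furniture sizes below are made up), one can rejection-sample axis-aligned furniture footprints inside a room and keep only collision-free placements:

```python
# Toy dataset-free layout sketch: rejection-sample non-overlapping
# axis-aligned furniture footprints (w, h) inside a rectangular room.
# Illustration only; PhyScene's real pipeline is diffusion-based.
import random


def overlaps(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


def place_furniture(room_w, room_h, footprints, tries=1000, seed=0):
    rng = random.Random(seed)
    placed = []
    for w, h in footprints:
        for _ in range(tries):
            x = rng.uniform(0, room_w - w)
            y = rng.uniform(0, room_h - h)
            box = (x, y, w, h)
            if not any(overlaps(box, p) for p in placed):
                placed.append(box)
                break
    return placed


# Hypothetical 5 m x 4 m room with three furniture footprints.
layout = place_furniture(5.0, 4.0, [(2.0, 0.9), (1.2, 0.6), (0.5, 0.5)])
```

A real system would additionally respect walls, doors, reachability, and object semantics, which is exactly what the provided scripts handle.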
- Base Model
- Training scripts
- Preprocessed datasets
- Pretrained models
- Tutorial.ipynb
If you find our work useful in your research, please consider citing:
@inproceedings{yang2024physcene,
  title={PhyScene: Physically Interactable 3D Scene Synthesis for Embodied AI},
  author={Yang, Yandan and Jia, Baoxiong and Zhi, Peiyuan and Huang, Siyuan},
  booktitle={Proceedings of Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2024}
}
Most of the code is borrowed from ATISS and DiffuScene. We thank the authors for their great work and repositories.
We thank Ms. Zhen Chen from BIGAI for refining the figures, and all colleagues from the BIGAI TongVerse project for fruitful discussions and help with simulation development.