Merge pull request #128 from fxia22/move_examples
Move examples to gibson2 package
fxia22 authored Dec 24, 2020
2 parents 1c3b00d + b38b326 commit 4cf34dc
Showing 223 changed files with 97 additions and 163 deletions.
26 changes: 13 additions & 13 deletions Jenkinsfile
@@ -27,24 +27,24 @@ pipeline {
stage('Test') {
steps {
sh 'mkdir result'
-sh 'pytest test/test_binding.py --junitxml=test_result/test_binding.py.xml'
-sh 'pytest test/test_render.py --junitxml=test_result/test_render.py.xml'
-sh 'pytest test/test_pbr.py --junitxml=test_result/test_pbr.py.xml'
-sh 'pytest test/test_object.py --junitxml=test_result/test_object.py.xml'
-sh 'pytest test/test_simulator.py --junitxml=test_result/test_simulator.py.xml'
-sh 'pytest test/test_igibson_env.py --junitxml=test_result/test_igibson_env.py.xml'
-sh 'pytest test/test_scene_importing.py --junitxml=test_result/test_scene_importing.py.xml'
-sh 'pytest test/test_robot.py --junitxml=test_result/test_robot.py.xml'
-sh 'pytest test/test_igsdf_scene_importing.py --junitxml=test_result/test_igsdf_scene_importing.py.xml'
-sh 'pytest test/test_sensors.py --junitxml=test_result/test_sensors.py.xml'
-sh 'pytest test/test_motion_planning.py --junitxml=test_result/test_motion_planning.py.xml'
+sh 'pytest gibson2/test/test_binding.py --junitxml=test_result/test_binding.py.xml'
+sh 'pytest gibson2/test/test_render.py --junitxml=test_result/test_render.py.xml'
+sh 'pytest gibson2/test/test_pbr.py --junitxml=test_result/test_pbr.py.xml'
+sh 'pytest gibson2/test/test_object.py --junitxml=test_result/test_object.py.xml'
+sh 'pytest gibson2/test/test_simulator.py --junitxml=test_result/test_simulator.py.xml'
+sh 'pytest gibson2/test/test_igibson_env.py --junitxml=test_result/test_igibson_env.py.xml'
+sh 'pytest gibson2/test/test_scene_importing.py --junitxml=test_result/test_scene_importing.py.xml'
+sh 'pytest gibson2/test/test_robot.py --junitxml=test_result/test_robot.py.xml'
+sh 'pytest gibson2/test/test_igsdf_scene_importing.py --junitxml=test_result/test_igsdf_scene_importing.py.xml'
+sh 'pytest gibson2/test/test_sensors.py --junitxml=test_result/test_sensors.py.xml'
+sh 'pytest gibson2/test/test_motion_planning.py --junitxml=test_result/test_motion_planning.py.xml'
}
}

stage('Benchmark') {
steps {
-sh 'python test/benchmark/benchmark_static_scene.py'
-sh 'python test/benchmark/benchmark_interactive_scene.py'
+sh 'python gibson2/test/benchmark/benchmark_static_scene.py'
+sh 'python gibson2/test/benchmark/benchmark_interactive_scene.py'
}
}

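The Jenkinsfile change is purely mechanical: every test path gains a `gibson2/` prefix while the `--junitxml` report path stays put. A hypothetical Python sketch of that rewrite rule (the function name and shape are illustrative, not part of the repository):

```python
import os

def rewrite_pytest_step(step: str, prefix: str = "gibson2") -> str:
    """Prepend a package prefix to the test path of a 'pytest <path> --junitxml=...'
    command, leaving the report path unchanged."""
    parts = step.split()
    # parts: ['pytest', 'test/test_binding.py', '--junitxml=...']
    parts[1] = os.path.join(prefix, parts[1])
    return " ".join(parts)

old = "pytest test/test_binding.py --junitxml=test_result/test_binding.py.xml"
print(rewrite_pytest_step(old))
# pytest gibson2/test/test_binding.py --junitxml=test_result/test_binding.py.xml
```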
2 changes: 1 addition & 1 deletion docker/base/Dockerfile
@@ -45,4 +45,4 @@ RUN pip install -e .
RUN python -m gibson2.utils.assets_utils --download_assets
RUN python -m gibson2.utils.assets_utils --download_demo_data

-WORKDIR /opt/igibson/examples/demo
+WORKDIR /opt/igibson/gibson2/examples/demo
2 changes: 1 addition & 1 deletion docker/headless-gui/Dockerfile
@@ -21,5 +21,5 @@ COPY entrypoint.sh /opt/misc

ENV QT_X11_NO_MITSHM=1
ENV DISPLAY=:0
-WORKDIR /opt/igibson/examples/demo
+WORKDIR /opt/igibson/gibson2/examples/demo
ENTRYPOINT ["/opt/misc/entrypoint.sh"]
2 changes: 1 addition & 1 deletion docs/assets.md
@@ -38,7 +38,7 @@ assets
├── test
│ └── mesh
└── example_configs
-    └── {}.yaml
+    └── {}.yaml (deprecated, will be removed)
```
`models` contains robot models and interactive objects, `networks` contains the neural network filler used in Gibson V1, and `test` contains files for verifying the installation.
4 changes: 2 additions & 2 deletions docs/conf.py
@@ -25,8 +25,8 @@


project = 'iGibson'
-copyright = 'Stanford University 2018-2020'
-author = 'Fei Xia, William B. Shen, Chengshu Li, Micael Tchapmi, Lyne Tchapmi, Noriaki Hirose, Kevin Chen, Junyoung Gwak Priya Kasimbeg, Alexander Toshev, Amir R. Zamir, Roberto Martín-Martín, Li Fei-Fei, Silvio Savarese'
+copyright = 'Stanford University 2018-2021'
+author = 'Bokui Shen*, Fei Xia*, Chengshu Li*, Roberto Martín-Martín*, Linxi Fan, Guanzhi Wang, Shyamal Buch, Claudia DArpino, Sanjana Srivastava, Lyne P. Tchapmi, Micael E. Tchapmi, Kent Vainio, Li Fei-Fei, Silvio Savarese (*Equal Contribution)'

github_doc_root = 'https://github.com/StanfordVL/iGibson'

8 changes: 3 additions & 5 deletions docs/environments.md
@@ -194,7 +194,7 @@ In this example, we show how to instantiate `iGibsonEnv` and how to step through
- `reward`: a scalar that represents the current reward
- `done`: a boolean that indicates whether the episode should terminate
- `info`: a Python dictionary for bookkeeping purposes
-The code can be found here: [examples/demo/env_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/env_example.py).
+The code can be found here: [gibson2/examples/demo/env_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/env_example.py).

```python
from gibson2.envs.igibson_env import iGibsonEnv
@@ -206,9 +206,7 @@ import logging
def main():
-    config_filename = os.path.join(
-        os.path.dirname(gibson2.__file__),
-        '../examples/configs/turtlebot_demo.yaml')
+    config_filename = os.path.join(gibson2.example_config_path, 'turtlebot_demo.yaml')
env = iGibsonEnv(config_file=config_filename, mode='gui')
for j in range(10):
env.reset()
@@ -228,4 +226,4 @@ if __name__ == "__main__":
```
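The four return values listed above follow the standard Gym-style step contract. A minimal self-contained sketch with a dummy environment (not the real `iGibsonEnv`; all names here are illustrative) shows the loop shape:

```python
import random

class DummyEnv:
    """Stand-in with the same step() contract as a Gym-style environment."""
    def __init__(self, max_steps=5):
        self.max_steps = max_steps
        self.t = 0

    def reset(self):
        self.t = 0
        return {"rgb": [0.0] * 4}           # observation (sensor readings)

    def step(self, action):
        self.t += 1
        obs = {"rgb": [random.random() for _ in range(4)]}
        reward = 1.0                         # scalar reward
        done = self.t >= self.max_steps      # episode-termination flag
        info = {"episode_step": self.t}      # bookkeeping dictionary
        return obs, reward, done, info

env = DummyEnv()
obs = env.reset()
done = False
while not done:
    obs, reward, done, info = env.step(action=None)
print(info["episode_step"])  # 5
```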

#### Interactive Environments
-In this example, we show how to instantiate `iGibsonEnv` with a fully interactive scene `Rs_int`. In this scene, the robot can interact with all the objects in the scene (chairs, tables, couches, etc.). The code can be found here: [examples/demo/env_interactive_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/env_interactive_example.py).
+In this example, we show how to instantiate `iGibsonEnv` with a fully interactive scene `Rs_int`. In this scene, the robot can interact with all the objects in the scene (chairs, tables, couches, etc.). The code can be found here: [gibson2/examples/demo/env_interactive_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/env_interactive_example.py).
2 changes: 1 addition & 1 deletion docs/installation.md
@@ -33,7 +33,7 @@ iGibson's simulator can be installed as a python package using pip:
```bash
pip install gibson2 # This step takes about 4 minutes
# run the demo
-python -m gibson2.scripts.demo_static
+python -m gibson2.examples.demo.demo_static
```

Note: iGibson supports a custom pybullet build that speeds up the physics simulation. To get the speedup, perform the following steps after installation:
2 changes: 1 addition & 1 deletion docs/objects.md
@@ -24,7 +24,7 @@ Instructions can be found here: [External Objects](https://github.com/StanfordVL/


### Examples
-In this example, we import three objects into PyBullet, two of which are articulated objects. The code can be found here: [examples/demo/object_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/object_example.py).
+In this example, we import three objects into PyBullet, two of which are articulated objects. The code can be found here: [gibson2/examples/demo/object_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/object_example.py).

```python
from gibson2.objects.ycb_object import YCBObject
2 changes: 1 addition & 1 deletion docs/physics_engine.md
@@ -8,7 +8,7 @@ Typically, we use `p.createMultiBody` and `p.loadURDF` to load scenes, objects and robots.
More info can be found in here: [PyBullet documentation](https://docs.google.com/document/d/10sXEhzFRSnvFcl3XxNGhnD4N2SedqwdAvK3dsihxVUA).

### Examples
-In this example, we import a scene, a robot and an object into PyBullet and step through a few seconds of simulation. The code can be found here: [examples/demo/physics_engine_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/physics_engine_example.py).
+In this example, we import a scene, a robot and an object into PyBullet and step through a few seconds of simulation. The code can be found here: [gibson2/examples/demo/physics_engine_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/physics_engine_example.py).

```python
import pybullet as p
12 changes: 4 additions & 8 deletions docs/quickstart.md
@@ -4,8 +4,7 @@
Assuming you have finished the installation and downloaded the assets, let's get our hands dirty and see iGibson in action.

```bash
-cd examples/demo
-python env_example.py
+python -m gibson2.examples.demo.env_example
```
You should see something like this:
![quickstart.png](images/quickstart.png)
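Note the pattern of this change, repeated throughout the PR: `cd <dir> && python <script>.py` becomes `python -m gibson2.examples.demo.env_example`. Once the examples live inside the `gibson2` package, `python -m` resolves them through the import system of the installed package, so they run from any working directory. A quick stdlib illustration of that resolution (using `json.tool` only because it ships with Python):

```python
import importlib.util

# `python -m pkg.mod` locates the module through the import system,
# not the current working directory.
spec = importlib.util.find_spec("json.tool")
print(spec is not None)  # True: runnable anywhere as `python -m json.tool`
print(spec.name)         # json.tool
```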
@@ -55,8 +54,7 @@ simulation framerate in iGibson.

### Benchmark static scene (Gibson scenes)
```bash
-cd test/benchmark
-python benchmark_static_scene.py
+python -m gibson2.test.benchmark.benchmark_static_scene
```

You will see output similar to:
@@ -76,8 +74,7 @@ Rendering normal, resolution 512, render_to_tensor False: 265.70666134193806 fps
### Benchmark physics simulation in interactive scenes (iGibson scene)

```bash
-cd test/benchmark
-python benchmark_interactive_scene.py
+python -m gibson2.test.benchmark.benchmark_interactive_scene
```

It will generate a report like below:
@@ -90,8 +87,7 @@ It will generate a report like below:
To run a comprehensive benchmark for all rendering in all iGibson scenes, you can execute the following command:

```bash
-cd test/benchmark
-python benchmark_interactive_scene_rendering.py
+python -m gibson2.test.benchmark.benchmark_interactive_scene_rendering
```

It benchmarks two use cases, one for training visual RL agents (low resolution, shadow mapping off), another one for
14 changes: 6 additions & 8 deletions docs/renderer.md
@@ -8,7 +8,7 @@ We developed our own MeshRenderer that supports customizable camera configuratio

#### Simple Example

-In this example, we render an iGibson scene with a few lines of code. The code can be found in [examples/demo/mesh_renderer_simple_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/mesh_renderer_simple_example.py).
+In this example, we render an iGibson scene with a few lines of code. The code can be found in [gibson2/examples/demo/mesh_renderer_simple_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/mesh_renderer_simple_example.py).

```python
import cv2
@@ -51,10 +51,9 @@ For `Rs` scene, the rendering results will look like this:
In this example, we show an interactive demo of MeshRenderer.

```bash
-cd examples/demo
-python mesh_renderer_example.py
+python -m gibson2.examples.demo.mesh_renderer_example
```
-You may translate the camera by pressing "WASD" on your keyboard and rotate the camera by dragging your mouse. Press `Q` to exit the rendering loop. The code can be found in [examples/demo/mesh_renderer_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/mesh_renderer_example.py).
+You may translate the camera by pressing "WASD" on your keyboard and rotate the camera by dragging your mouse. Press `Q` to exit the rendering loop. The code can be found in [gibson2/examples/demo/mesh_renderer_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/mesh_renderer_example.py).

#### PBR (Physics-Based Rendering) Example

@@ -63,8 +62,7 @@ You can test the physically based renderer with the PBR demo. You can render any
obj files in the folder.

```bash
-cd examples/demo
-python mesh_renderer_example_pbr.py <path to ig_dataset>/objects/sink/sink_1/shape/visual
+python -m gibson2.examples.demo.mesh_renderer_example_pbr <path to ig_dataset>/objects/sink/sink_1/shape/visual
```
![pbr_renderer.png](images/pbr_render.png)

@@ -73,12 +71,12 @@ You will get a nice rendering of the sink, and should see the metal parts have s


#### Velodyne VLP-16 Example
-In this example, we show a demo of 16-beam Velodyne VLP-16 LiDAR placed on top of a virtual Turtlebot. The code can be found in [examples/demo/lidar_velodyne_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/lidar_velodyne_example.py).
+In this example, we show a demo of 16-beam Velodyne VLP-16 LiDAR placed on top of a virtual Turtlebot. The code can be found in [gibson2/examples/demo/lidar_velodyne_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/lidar_velodyne_example.py).

The Velodyne VLP-16 LiDAR visualization will look like this:
![lidar_velodyne.png](images/lidar_velodyne.png)

#### Render to PyTorch Tensors

-In this example, we show that MeshRenderer can directly render into a PyTorch tensor to maximize efficiency. PyTorch installation is required (otherwise, iGibson does not depend on PyTorch). The code can be found in [examples/demo/mesh_renderer_gpu_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/mesh_renderer_gpu_example.py).
+In this example, we show that MeshRenderer can directly render into a PyTorch tensor to maximize efficiency. PyTorch installation is required (otherwise, iGibson does not depend on PyTorch). The code can be found in [gibson2/examples/demo/mesh_renderer_gpu_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/mesh_renderer_gpu_example.py).

12 changes: 6 additions & 6 deletions docs/robots.md
@@ -41,7 +41,7 @@ Note that `robot_action` is a normalized joint velocity, i.e. `robot_action[n] =
Most of the code can be found here: [gibson2/robots](https://github.com/StanfordVL/iGibson/blob/master/gibson2/robots).
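The note above says `robot_action` is a normalized joint velocity. The exact definition is truncated in this hunk, so the sketch below only illustrates the usual convention — an action in `[-1, 1]` scaled by each joint's maximum velocity — which is an assumption, not the repository's code:

```python
def denormalize_velocity(action, max_joint_velocities):
    """Map a normalized action in [-1, 1] to per-joint velocity commands.
    Assumption (illustrative): command = clip(action, -1, 1) * max velocity."""
    out = []
    for a, v_max in zip(action, max_joint_velocities):
        a = max(-1.0, min(1.0, a))   # clip to the normalized range
        out.append(a * v_max)
    return out

print(denormalize_velocity([0.5, -1.0, 2.0], [1.0, 2.0, 3.0]))
# [0.5, -2.0, 3.0]  (the out-of-range 2.0 is clipped to 1.0 first)
```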

### Examples
-In this example, we import four different robots into PyBullet. We keep them still for around 10 seconds and then move them with small random actions for another 10 seconds. The code can be found here: [examples/demo/robot_example.py](https://github.com/StanfordVL/iGibson/blob/master/examples/demo/robot_example.py).
+In this example, we import four different robots into PyBullet. We keep them still for around 10 seconds and then move them with small random actions for another 10 seconds. The code can be found here: [gibson2/examples/demo/robot_example.py](https://github.com/StanfordVL/iGibson/blob/master/gibson2/examples/demo/robot_example.py).

```python
from gibson2.robots.locobot_robot import Locobot
@@ -54,7 +54,7 @@ import time
import numpy as np
import pybullet as p
import pybullet_data

+import gibson2

def main():
p.connect(p.GUI)
@@ -65,19 +65,19 @@ def main():
p.loadMJCF(floor)

robots = []
-    config = parse_config('../configs/fetch_reaching.yaml')
+    config = parse_config(os.path.join(gibson2.example_config_path, 'fetch_reaching.yaml'))
fetch = Fetch(config)
robots.append(fetch)

-    config = parse_config('../configs/jr_reaching.yaml')
+    config = parse_config(os.path.join(gibson2.example_config_path, 'jr_reaching.yaml'))
jr = JR2_Kinova(config)
robots.append(jr)

-    config = parse_config('../configs/locobot_point_nav.yaml')
+    config = parse_config(os.path.join(gibson2.example_config_path, 'locobot_point_nav.yaml'))
locobot = Locobot(config)
robots.append(locobot)

-    config = parse_config('../configs/turtlebot_point_nav.yaml')
+    config = parse_config(os.path.join(gibson2.example_config_path, 'turtlebot_point_nav.yaml'))
turtlebot = Turtlebot(config)
robots.append(turtlebot)
