AirSim is a simulator for drones, cars and more, built on Unreal Engine. It is open-source, cross-platform, and supports hardware-in-loop with popular flight controllers such as PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment you want.
Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform independent way.
Check out the quick 1.5-minute demo.
Drones in AirSim
Cars in AirSim
By default AirSim spawns a multirotor. You can easily change this to a car and use all of AirSim's goodies. Please see the using car guide.
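For example, switching the default vehicle to a car is done through the `settings.json` file that AirSim reads at startup (typically found in your `Documents/AirSim` folder). A minimal sketch, assuming a recent settings schema; check the settings documentation for the version that matches your build:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Car"
}
```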
If you have a remote control (RC) as shown below, you can manually fly the drone in the simulator. For cars, you can use the arrow keys to drive manually.
AirSim exposes APIs so you can interact with the vehicle in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on. The APIs are exposed through RPC and are accessible from a variety of languages including C++, Python, C# and Java.
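As a minimal sketch using the Python client (assuming the `airsim` package from the PythonClient folder is installed and the simulator is already running; exact method names may differ slightly between AirSim releases):

```python
import airsim

# Connect to the simulator running on localhost.
client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)   # take control away from RC/keyboard
client.armDisarm(True)

# Fly to a point 5 m ahead and 10 m up (NED coordinates, so z is negative upward).
client.takeoffAsync().join()
client.moveToPositionAsync(5, 0, -10, 3).join()

# Retrieve one PNG-compressed RGB image from camera "0".
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, True)
])
print("got image of", len(responses[0].image_data_uint8), "bytes")

# Query vehicle state (kinematics, GPS, collision info, ...).
state = client.getMultirotorState()
print("position:", state.kinematics_estimated.position)
```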
These APIs are also available as part of a separate, independent cross-platform library, so you can deploy them on a companion computer on your vehicle. This way you can write and test your code in the simulator and later execute it on real vehicles. Transfer learning and related research is one of our focus areas.
There are two ways you can generate training data from AirSim for deep learning. The easiest way is to simply press the record button in the lower right corner. This will start writing pose and images for each frame. The data logging code is pretty simple and you can modify it to your heart's desire.
A better way to generate training data exactly the way you want is by accessing the APIs. This allows you to be in full control of how, what, where and when you want to log data.
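As a rough sketch of what API-driven logging can look like (the output layout, file names and camera name "0" here are illustrative choices, not AirSim defaults):

```python
import os
import airsim

out_dir = "training_data"          # hypothetical output folder
os.makedirs(out_dir, exist_ok=True)

client = airsim.MultirotorClient()
client.confirmConnection()

for i in range(100):
    # Request a PNG-compressed scene image and a floating-point depth image together.
    scene, depth = client.simGetImages([
        airsim.ImageRequest("0", airsim.ImageType.Scene, False, True),
        airsim.ImageRequest("0", airsim.ImageType.DepthPerspective, True, False),
    ])

    # Save the frames and record the ground-truth pose alongside them.
    airsim.write_file(os.path.join(out_dir, "rgb_%05d.png" % i), scene.image_data_uint8)
    airsim.write_pfm(os.path.join(out_dir, "depth_%05d.pfm" % i),
                     airsim.get_pfm_array(depth))

    pose = client.simGetVehiclePose()
    with open(os.path.join(out_dir, "poses.txt"), "a") as f:
        p = pose.position
        f.write("%d %f %f %f\n" % (i, p.x_val, p.y_val, p.z_val))
```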
Yet another way to use AirSim is the so-called "Computer Vision" mode. In this mode, you don't have vehicle physics and dynamics, but you can use the keyboard to move around and use the APIs to position the vehicle in any arbitrary pose and capture images such as depth, disparity, surface normals or object segmentation.
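A possible sketch of this workflow, assuming `"SimMode": "ComputerVision"` is set in `settings.json` (image type names such as `DepthPlanar` have shifted slightly between AirSim versions):

```python
import airsim

client = airsim.VehicleClient()
client.confirmConnection()

# Teleport the camera rig to an arbitrary pose: position in metres (NED),
# orientation given as (pitch, roll, yaw) in radians.
pose = airsim.Pose(airsim.Vector3r(10, 0, -2),
                   airsim.to_quaternion(0.0, 0.0, 1.57))
client.simSetVehiclePose(pose, True)

# Request several modalities from camera "0" in a single call.
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, True),
    airsim.ImageRequest("0", airsim.ImageType.DepthPlanar, True, False),
    airsim.ImageRequest("0", airsim.ImageType.DisparityNormalized, True, False),
    airsim.ImageRequest("0", airsim.ImageType.Segmentation, False, True),
    airsim.ImageRequest("0", airsim.ImageType.SurfaceNormals, False, True),
])
for r in responses:
    print("image type", r.image_type, "size", r.width, "x", r.height)
```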
- We now have the car model.
- No need to build the code. Just download binaries and you are good to go!
- A tutorial for using off-the-shelf environments with AirSim.
- A reinforcement learning example with AirSim.
- A new built-in flight controller called simple_flight that "just works" without any additional setup. It is also now the default.
- AirSim now also generates depth images in the camera plane as well as disparity images.
- We also have an official Linux build now! If you have been using AirSim with PX4, you might want to read the release notes.
More technical details are available in the AirSim paper (FSR 2017 Conference). Please cite it as:
@inproceedings{airsim2017fsr,
author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor},
title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles},
year = {2017},
booktitle = {Field and Service Robotics},
eprint = {arXiv:1705.05065},
url = {https://arxiv.org/abs/1705.05065}
}
We welcome contributions to help advance research frontiers. Please take a look at the open issues and the Trello board if you are looking for areas to contribute to.
We maintain a list of a few projects, people and groups that we are aware of. If you would like to be featured in this list, please add a request here.
Join the AirSim group at Facebook to stay up to date or ask any questions.
If you run into problems, check the FAQ and feel free to post issues on the AirSim GitHub.
This project is released under the MIT License. Please review the License file for more details.