
Project: Drone demo #16

Open
elpiel opened this issue May 23, 2020 · 1 comment
Labels: roadmap-2022 (Roadmap for 2022)

Comments

@elpiel (Member) commented May 23, 2020

Non-Critical Aeronautics

Drones

Aerorust Drones

Current work

Many of you may already know that we are currently working on Parrot drones and building a Rust SDK, arsdk-rs, for connecting to, controlling, and using the other features of the drone from Rust.

Proposal

With this in mind, I propose creating a demo project that integrates the different components into a single system, showcasing what is currently possible with Rust and open source.

Social part

The other aspect of the project is the opportunity to connect and form bonds with other communities, such as the Rust-ML (machine learning) WG and Rust-CV (Computer Vision).

Tech stack

Many components are already in place that we can integrate to build this project.

Components:
Legend:

  • ✔️ - we already have this component
  • 🛠️ - Work in progress
  • ❓ - Ideas are welcome

✔️ Sphinx Simulation provided by Parrot

🛠️ Arsdk-rs - Rust SDK for sending/receiving commands (i.e. controlling the drone)

❓ Computer Vision

Rust-ML (machine learning) WG or Rust-CV (Computer Vision) components that we can integrate with the Drone

🛠️ VR - o0Ignition0o/airsim-rs#6

This could be useful to monitor or perhaps even control the drone

🛠️ ❓ Microsoft AirSim - https://github.com/o0Ignition0o/airsim-rs

Check the project https://github.com/microsoft/AirSim for more information.

Integrating airsim-rs would enable much more complete and advanced simulations.
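To make the arsdk-rs component concrete, here is a minimal sketch of the kind of discovery handshake the Parrot ARSDK is commonly described as using: the controller sends a small JSON request to the drone before command traffic starts. The field names and the `discovery_request` helper below are assumptions for illustration, not taken from the actual arsdk-rs API.

```rust
/// Hypothetical sketch of building the JSON discovery request for the
/// Parrot ARSDK handshake. Field names are assumptions, not arsdk-rs API.
fn discovery_request(controller_name: &str, d2c_port: u16) -> String {
    format!(
        r#"{{"controller_type":"computer","controller_name":"{}","d2c_port":{}}}"#,
        controller_name, d2c_port
    )
}

fn main() {
    // In a real client this string would be sent to the drone's
    // discovery port before opening the command/telemetry channels.
    let req = discovery_request("aerorust-demo", 43210);
    println!("{}", req);
}
```

The actual handshake details (ports, extra fields) should be taken from arsdk-rs itself; this only illustrates where such a component sits in the stack.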

Objectives

  • An integrated project that serves as a portfolio piece and showcases the Rust community to other communities and domains
  • Talks about the project, for better exposure
  • Workshops, for exposure, finding sponsors, and FUN 🎉
@elpiel elpiel added the roadmap-2022 Roadmap for 2022 label May 23, 2020
@elpiel elpiel added this to the Roadmap 2020 milestone May 23, 2020
@elpiel elpiel changed the title Project: Integrated Drone demo Project: Drone demo May 23, 2020
@vadixidav
@elpiel asked me to write up what capabilities we have today at Rust CV. You can see that now in our goals section.

I think the simplest way you can utilize Rust CV today to do something interesting with the drone is the technique found here: https://github.com/rust-cv/vslam-sandbox/blob/0a0bd760ceee2da38f0626a8a8678b9e98a657e1/src/main.rs. This code lets you perform some very rudimentary indirect visual odometry. We can make it a bit more robust by allowing it to use older frames in case of failure. This will let you approximate how much the drone has moved and rotated on each frame.
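The core idea of visual odometry is that each frame yields a small relative motion, and the global pose is the composition of all of them. A minimal sketch, using a planar 2D pose purely for illustration (real VO works with full 3D SE(3) transforms, but the composition rule is analogous):

```rust
// Assumption: 2D pose (x, y, heading) to illustrate how per-frame
// relative motions from visual odometry accumulate into a global pose.
#[derive(Debug, Clone, Copy)]
struct Pose2D {
    x: f64,
    y: f64,
    theta: f64,
}

impl Pose2D {
    fn identity() -> Self {
        Pose2D { x: 0.0, y: 0.0, theta: 0.0 }
    }

    /// Compose this pose with a relative motion expressed in the current
    /// body frame: rotate the translation into the world frame, then add.
    fn compose(self, rel: Pose2D) -> Pose2D {
        let (s, c) = self.theta.sin_cos();
        Pose2D {
            x: self.x + c * rel.x - s * rel.y,
            y: self.y + s * rel.x + c * rel.y,
            theta: self.theta + rel.theta,
        }
    }
}

fn main() {
    // Two frames, each estimated as "1 m forward, 90° turn to the left".
    let step = Pose2D { x: 1.0, y: 0.0, theta: std::f64::consts::FRAC_PI_2 };
    let pose = Pose2D::identity().compose(step).compose(step);
    // After two such steps the drone ends up at roughly (1, 1), facing
    // backwards (theta ≈ π).
    println!("x={:.2} y={:.2} theta={:.2}", pose.x, pose.y, pose.theta);
}
```

The "use older frames in case of failure" idea above amounts to composing against the last successfully tracked pose instead of the immediately previous frame.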

Since the drone probably already has basic equipment on board to detect motion, this could be used in a different way. You could use a reference image of a particular object with some features on it, such as a large sheet of paper (or, since this is a simulation, just a flat rendering of an image). The drone could then continuously estimate its pose relative to that object. Alternatively, you could trigger the drone to hold steady, and it would remember a snapshot from that location. From that point on, it could hold position there, since it can work out how its position has changed by comparing the current frame against that snapshot. This would require a human trigger to tell the drone "hold position". It would provide interesting value, since the onboard sensors (accelerometers and gyroscopes) can't tell you absolute position, but computer vision can, as long as objects in the scene do not move (and even if they do, that likely won't throw it off unless the change is large).

There are also other interesting things you could do by using other algorithms, but I think that this is the simplest way to integrate computer vision into this simulation today. The easiest of the above solutions is likely the "hold in place" concept, since no data needs to be prepared in advance, and all the information comes just from the drone camera.
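The "hold in place" concept can be sketched as a small control step: take the pose drift estimated between the snapshot frame and the current frame, and emit a proportional correction pushing the drone back. The `Correction` struct, axis conventions, and gain below are illustrative assumptions, not the arsdk-rs command format:

```rust
// Hypothetical sketch of the "hold position" correction step.
// Assumptions: +dx = drifted forward, +dy = drifted right, and the
// correction command uses normalized pitch/roll/yaw values.
#[derive(Debug)]
struct Correction {
    pitch: f64,
    roll: f64,
    yaw: f64,
}

/// Proportional controller: command motion opposite to the estimated
/// drift since the snapshot was taken.
fn hold_correction(dx: f64, dy: f64, dyaw: f64, gain: f64) -> Correction {
    Correction {
        pitch: -gain * dx, // drifted forward -> pitch back
        roll: -gain * dy,  // drifted right -> roll left
        yaw: -gain * dyaw, // rotated left -> yaw right
    }
}

fn main() {
    // Vision says: drifted 0.2 m forward and 0.1 m right since snapshot.
    let c = hold_correction(0.2, 0.1, 0.05, 0.5);
    println!("{:?}", c);
}
```

In the real demo, the drift estimate would come from the snapshot-vs-current-frame pose comparison described above, and the correction would be sent through arsdk-rs on each frame.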

Let me know if you are interested in me writing up a demo of this capability, and I can do that right away.
