[OpenRelief Developer] you may find some CanberraUAV code useful
tridge at samba.org
Sat Jun 16 00:27:55 BST 2012
> This is an area where we can hopefully stand on your shoulders to
> reduce development and testing. Paul Gardner is our autopilot guy, and
> hopefully he can look into those kernel patches (etc) for us. I'm
> assuming we will find this code under https://github.com/tridge/cuav
yes. Note that these are actually patches to the Linux kernel you run on
the RPi, not to the APM code.
> These are all very useful indeed. Obviously we want to minimize our
> time to code and deploy. In short, we are going to review your code,
> and aim to use as much as possible. Our policy is upstreaming all the
> way, so anything we learn, we will send right back to you.
thanks. One of the big gaps in all this is documentation. We have a lot
of code you could use, but no docs to go with it.
> Right now we are looking at ArduPilot Mission Planner as the basis of
> our first GCS. The plan is to extend it to handle multiple drones and
> allow it to be utilized by RESTful API. It will be called by a little
> OpenRelief application and web server which distributes data/interacts
> with parties around the place.
I can understand the reason behind that strategy. For ourselves, we use
the APM mission planner for things like waypoint editing, but not much
more than that. Instead we're extending mavproxy to gain the features we
need. I decided early on to standardise on python as the core of our
code. Trying to interface that in a sane manner to a .Net executable
running under mono on Linux is a bit of a mess. However it's easy to
have MP up as an extra GCS display, as mavproxy is also a MAVLink proxy
(thus the original name). So it can forward MAVLink streams to any
number of other MAVLink aware programs, including the APM planner. So
you can run mavproxy on a laptop connected to a mavproxy in the plane,
and have the laptop mavproxy forward the mavlink stream to APM MP
running under mono on the same laptop. You then have full control of the
plane via both GCS interfaces at the same time (mavproxy is a MAVLink
proxy, after all). I know this may not make a lot of sense to you at the
moment, which is
why I think I need to give you or someone else a demo.
At its core mavproxy is a CLI GCS, but the key to its flexibility is its
modules system. You can load additional python modules inside mavproxy
which extend its functionality a lot. For example, there are modules for
realtime graphing, showing a satellite imagery map with overlays,
controlling bottle drops, controlling cameras etc. Have a look in the
modules/ directory of MAVProxy.
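The module idea can be sketched in a few lines. This is illustrative only: the hook name and the stub message class here are assumptions, not the real MAVProxy API; see the actual examples in the modules/ directory for how modules are really written.

```python
class StubMessage:
    """Stand-in for a pymavlink message (real ones come off the link)."""
    def __init__(self, msg_type):
        self._type = msg_type

    def get_type(self):
        return self._type


class HeartbeatCounter:
    """A toy 'module' that counts HEARTBEAT messages passing through."""
    def __init__(self):
        self.heartbeats = 0

    def mavlink_packet(self, m):
        # mavproxy hands each decoded message to loaded modules; a real
        # module would react here (update a map, trigger a camera, ...).
        if m.get_type() == 'HEARTBEAT':
            self.heartbeats += 1


mod = HeartbeatCounter()
for t in ('HEARTBEAT', 'ATTITUDE', 'HEARTBEAT'):
    mod.mavlink_packet(StubMessage(t))
```

The point is that a module sees every message on the stream without the core GCS needing to know anything about what the module does with it.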
> If I understand you correctly, we can use your code on the plane to
> talk to the ground GCS, thus filling in a blank we had ("how does the
> drone computer talk to the autopilot most effectively?").
yes, you can run mavproxy on the RPi in the plane. It talks to the APM,
and proxies the MAVLink stream to the GCS, while also running in-plane
processing on that stream (eg. controlling flight planning, cameras,
computer vision etc).
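The proxy role reduces to a simple fan-out: one input from the autopilot, copied to any number of outputs. In practice mavproxy does this over serial and UDP endpoints; the class and method names below are hypothetical, just to show the shape of it.

```python
class MavlinkFanout:
    """Toy sketch of mavproxy's proxy role: one input, many outputs."""
    def __init__(self):
        self.outputs = []          # callables that accept raw bytes

    def add_output(self, send_fn):
        self.outputs.append(send_fn)

    def handle_input(self, buf):
        # Each chunk read from the autopilot link is copied to every
        # registered ground station (APM Planner, another mavproxy, ...).
        for send in self.outputs:
            send(buf)


gcs_a, gcs_b = [], []
proxy = MavlinkFanout()
proxy.add_output(gcs_a.append)
proxy.add_output(gcs_b.append)
proxy.handle_input(b'\xfe\x09' + b'payload')   # a raw MAVLink frame
```

Because the forwarding happens at the byte level, every attached GCS sees the identical MAVLink stream and can act as a full ground station.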
The mavproxy modules aren't restricted to MAVLink however. For example,
our camera module sends python objects to the ground station via cPickle
and block_xmit.py, allowing complex state to be transferred from the
plane to the GCS. As an example have a look at modules/camera.py at the
ThumbPacket class. That is an example of a python object describing a
feature detection hit from our machine vision code that is sent from the
plane to the ground station for display on the map. You could extend
MAVLink to encode that sort of information, but it's so much easier to
just send the python object over the link.
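The pattern is just "pickle on one end, unpickle on the other". The field names below are illustrative stand-ins, not the real ThumbPacket definition from modules/camera.py, and python 3's pickle replaces the cPickle mentioned above.

```python
import pickle  # the 2012-era code uses cPickle; in python 3 it's pickle


class ThumbPacket:
    """Hypothetical stand-in for cuav's ThumbPacket -- field names here
    are illustrative, not the real class definition."""
    def __init__(self, frame_time, latlon, thumb, score):
        self.frame_time = frame_time   # when the frame was captured
        self.latlon = latlon           # estimated position of the hit
        self.thumb = thumb             # small image crop as bytes
        self.score = score             # detector confidence


pkt = ThumbPacket(1339804075.2, (-35.363, 149.165), b'\xff\xd8', 450)
wire = pickle.dumps(pkt)     # bytes you could hand to block_xmit
copy = pickle.loads(wire)    # the ground station gets the object back
```

No MAVLink message definitions, no code generation: any picklable python object crosses the link intact, which is why complex state like a detection hit is so easy to ship this way.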
> Depending on the extent of your code and UI modules, it may even
> replace Mission Planner in our development, though from the above it
> seems most of your code is primarily suitable for making the drone
> itself smarter, and may initially work best for us in that context,
> delivering data to the ground Mission Planner instance. Do correct me
> if I am wrong :)
the two are not exclusive. By using mavproxy as a connector you can run
any number of ground station interfaces at the same time. The key is
that they all talk the same language (MAVLink).
> That's excellent. We understand that the DIYDRONES community most
> often uses FlightGear (hence we intended to do the same), but I guess
> you had some reasons for taking another route? Glad to learn why.
FlightGear is great for displaying simulations, and can also be used to
replay real flights. See mavplayback.py in the mavlink repo for a script
that takes a MAVLink log and generates FlightGear fgFDM UDP packets so
that FlightGear replays a flight. You can then pause and step the
flight, allowing you to see exactly what happened.
What FlightGear is not good for is actually running the simulation. The
FlightGear code is too heavily synchronised between graphics and
simulation. If you use FlightGear for the simulation itself and not just
display then you will end up running the simulation at around 20Hz,
which is not nearly fast enough. Instead we use JSBSim at 1kHz to do the
aerodynamic simulation, and just use FlightGear to display the aircraft,
running FlightGear in a "display only" mode. Note that FlightGear
actually uses JSBSim internally when used in its full sim mode, so it's
the same sim, but if you try to interface to FlightGear from APM
directly the graphics sync kills the realism of the simulation.
So we have a python script (runsim.py) that runs JSBSim directly at
1kHz, and maps its outputs into the SITL (software in the loop)
simulation of APM. That allows you to run a full mission simulation with
nothing more than a Linux box. No special hardware needed.
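The key structural idea is a fixed-timestep physics loop decoupled from a much slower display rate. This sketch is illustrative only; the real version is runsim.py driving JSBSim and emitting fgFDM packets to FlightGear.

```python
def run_sim(seconds, physics_hz=1000, display_hz=20):
    """Decoupled loop: physics at a fixed 1ms step, display much slower."""
    steps = int(seconds * physics_hz)
    per_frame = physics_hz // display_hz   # physics steps per display frame
    frames = 0
    state = 0.0
    for i in range(steps):
        state += 1.0 / physics_hz          # stand-in for one JSBSim step
        if i % per_frame == 0:
            frames += 1                    # stand-in for one fgFDM packet
    return steps, frames


steps, frames = run_sim(1.0)   # one simulated second
```

One simulated second costs 1000 physics steps but only 20 display updates, so the aerodynamics never wait on the graphics, which is exactly the synchronisation problem that running FlightGear as the simulator creates.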
Again, I'm not expecting you to follow all of this just yet. There are a
lot of pieces to learn. What you need is for someone on your team to
take on the task of learning all of these tools then working out which
of them can be used in your project. Ideally it would be someone who
knows enough python to follow along the scripts.