The Jumping Janus.
February 4, 2015 at 12:05 pm
The guys from webRTCfest held a hackathon in December – one of the challenges was to pilot a Parrot Jumping Sumo remotely over webRTC.
WebRTC is a good fit for this:
- low-latency communication
- realtime video
- encrypted media
- NAT traversal
While there isn’t a direct use-case for the Sumo (apart from fun) – one can imagine future drones being used to allow experts to do remote inspections, or just to play with your pet from a hotel room far away!
Neil Stratford and I discussed giving it a go, but the deadline conflicted with our other webRTC consulting work, so we gave it a miss. Then we noticed that no one had won that challenge and that Parrot had extended the deadline to the end of January…
Temptation proved too much.
I went out and bought a white drone:
I had a quick play with it from my android tablet – then down to work to enable webRTC use.
We’d discussed the possible architecture before the purchase and we’d decided that the Janus gateway from Meetecho was the best webRTC tool for this job. One of the main criteria was that the ARDrone SDK had a C API and some simple(ish) examples in the same language. So we felt that Janus would be the easiest webRTC gateway to interface to it.
We had 2 design problems:
- The Sumo acts as a wifi access point – so we would need to run a local gateway with 2 interfaces, one talking to the Sumo over wifi and the other plugged into my internet connection.
- The Sumo’s video stream is in the mjpeg format – which isn’t a supported webRTC codec.
In the end we had an architecture that looked like this:
We built the ARDrone SDK and forked Janus (both on github), and Neil started coding with some encouragement from me. Meanwhile I set up my Aleutia r50 to run Ubuntu and installed all the requisite packages (actually we started off on a fitPC, but the atom processor hadn’t got the horsepower for the job).
We hit 2 problems along the way:
- The transcoding was expensive – we maxed out the CPU on the Aleutia transcoding the video stream from mjpeg to vp8. For the curious, here’s the gstreamer pipeline:
gst-launch-0.10 udpsrc port=5003 ! jpegparse ! jpegdec ! vp8enc ! rtpvp8pay ! udpsink sync=false host=127.0.0.1 port=5004
Fortunately mjpeg is a format that lets you drop any frame and all it impacts is the frame-rate, not the image quality, so we cope by just dropping frames when we are busy. Unfortunately this results in more latency than is strictly necessary; we’d re-do this if it were anything other than a rushed hack.
- We didn’t read the ARDrone docs. This meant we spent quite a bit of time trying to work out why the Sumo would sulk after a while. It turned out we had not started the thread that sent the ACKs to the drone. We also managed to break the ‘jump’ mechanism by jumping it on a soft surface – so this white drone can’t jump.
Here is the drone in action, remotely controlled over webRTC – before the jump broke.
I’d like to add that the Janus gateway’s API was perfect for this task; we basically just adapted one of their plugins by adding calls to the ARDrone SDK. I particularly liked the declarative way that if you say (in javascript) that a connection will need a DataChannel, one just becomes available to the C gateway code.
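To make that concrete, here’s a rough browser-side sketch using janus.js. It’s illustrative rather than our actual code: the plugin name janus.plugin.sumo, the server URL and the "start" message are placeholders, and the callbacks are trimmed down to the essentials.

// Sketch: attach to a (hypothetical) sumo plugin and declare that we want a DataChannel.
// Because the offer says data: true, the C plugin on the gateway side gets a
// DataChannel to read from without any extra setup.
var sumo = null;
Janus.init({ debug: "all", callback: function() {
  var janus = new Janus({
    server: "wss://gateway.example.com/janus",   // placeholder URL
    success: function() {
      janus.attach({
        plugin: "janus.plugin.sumo",             // placeholder plugin name
        success: function(pluginHandle) {
          sumo = pluginHandle;
          sumo.createOffer({
            media: { audioSend: false, videoSend: false, data: true },
            success: function(jsep) {
              sumo.send({ message: { request: "start" }, jsep: jsep });
            },
            error: function(err) { console.error("WebRTC error", err); }
          });
        },
        onremotestream: function(stream) {
          // the vp8 video transcoded from the Sumo's mjpeg ends up here
          Janus.attachMediaStream(document.getElementById("video"), stream);
        },
        ondataopen: function() { console.log("DataChannel is up"); }
      });
    }
  });
}});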
We used the datachannel to send the commands from the browser to the gateway, ensuring a realtime response from the drone.
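As a sketch of what that might look like (assuming the sumo handle from the snippet above – the JSON command format here is just an illustration, not necessarily what the bitbucket code sends):

// Map arrow keys to drive commands and push them over the DataChannel.
document.addEventListener("keydown", function(ev) {
  var cmd = { ArrowUp: "forward", ArrowDown: "back",
              ArrowLeft: "left", ArrowRight: "right" }[ev.key];
  if (!cmd || !sumo) return;
  // pluginHandle.data() sends text over the negotiated DataChannel, so the
  // gateway's C code can translate it into ARDrone SDK calls.
  sumo.data({
    text: JSON.stringify({ command: cmd, speed: 50 }),
    error: function(err) { console.error("send failed", err); }
  });
});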
Our quick-hack code is available on bitbucket.
We’ve had it remotely controlled from France, Italy and even Cambridge!
Thanks to Parrot for sponsoring the challenge and to WebRTCfest for naming us the winners!
1. John Balogh | February 4, 2015 at 4:20 pm
Your demo encourages me to use WebRTC for remote controlling model trains. More to think on before plugging in the bits though…
2. Dean Elwood | February 4, 2015 at 10:01 pm
Fantastic!