Premiere at the Festival MONSTRA 2014 in Lisbon

 

For the length of one battery charge the AR.Drone took flight and engaged in a memorable dialogue with human and artificial performers at the festival MONSTRA. Enjoy the trailer!


Drone Talk

 

It is not contemporary falconry you see in the first part of the video (although the idea has a lot of potential 🙂).
Sandro and Simão connected their machines over the local network and had LabView transmit flight data via Open Sound Control (OSC) to Max. The result: the drone started to talk…
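For readers curious about the plumbing: in our setup LabView is the sender and Max the receiver, but the OSC side can be sketched in a few lines of Python with the python-osc library. The addresses, port and telemetry values below are placeholders for illustration, not the ones from our actual patch.

```python
# Rough sketch of streaming flight data over OSC with the python-osc library.
# Addresses, port and values are placeholders; in our pipeline LabView sends
# the data and a Max patch turns it into sound.
import time

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical Max host/port

while True:
    # In the real setup these values come from the AR.Drone's navdata stream.
    telemetry = {
        "/drone/altitude": 0.85,  # meters (placeholder)
        "/drone/pitch": -2.3,     # degrees (placeholder)
        "/drone/yaw": 41.0,       # degrees (placeholder)
    }
    for address, value in telemetry.items():
        client.send_message(address, value)
    time.sleep(0.05)  # roughly 20 Hz update rate (assumed)
```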

 

Generating flight behaviors

 

This rehearsal video shows our programmer Sandro Fioravanti piloting the drone, recording the flight data in LabView, and reproducing the flight. In the second part you can see side-by-side comparisons between the piloted and the reproduced flights. Simple flight maneuvers are reproduced almost perfectly, while more complex flight maneuvers show differences in space and time (check the starting mark and the tiles on the floor to compare).
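The record-and-replay workflow itself lives entirely in LabView, but the underlying idea is simple enough to sketch in Python. Every name and the command format below are hypothetical stand-ins, not our actual implementation.

```python
# Conceptual sketch of record/replay: log timestamped control commands during
# a piloted flight, then re-issue them with the same timing. All names are
# hypothetical; our actual implementation is a LabView application.
import time

def record_flight(read_pilot_command, duration_s=30.0):
    """Sample the pilot's commands and store them with time offsets."""
    log, start = [], time.time()
    while time.time() - start < duration_s:
        command = read_pilot_command()          # e.g. (roll, pitch, yaw, gaz)
        log.append((time.time() - start, command))
        time.sleep(0.03)                        # ~33 Hz sampling (assumed)
    return log

def replay_flight(log, send_command):
    """Re-issue the recorded commands at their original time offsets."""
    start = time.time()
    for offset, command in log:
        while time.time() - start < offset:
            time.sleep(0.001)
        send_command(command)
```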

Through the workflow described above we can now generate controlled and semi-autonomous flight behaviors, and at the same time we are facing exciting new challenges: creating and editing a database of flight behaviors; analyzing and minimizing the variations in complex flight maneuvers; and contextualizing these behaviors dramaturgically within our virtual ecology.
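As a first, rough idea for the behavior database, a flight behavior could simply be a named, timestamped command log saved to disk so it can be edited and recalled during rehearsal. The JSON layout and names in this Python sketch are assumptions, not the format of our LabView toolkit.

```python
# Sketch of a tiny flight-behavior library: named command logs saved as JSON
# so they can be edited, compared, and recalled during rehearsal.
# The file layout and field names are assumptions for illustration only.
import json
from pathlib import Path

LIBRARY = Path("flight_behaviors.json")

def save_behavior(name, log):
    """Add or overwrite a named behavior (a list of [offset, command] pairs)."""
    library = json.loads(LIBRARY.read_text()) if LIBRARY.exists() else {}
    library[name] = [[offset, list(command)] for offset, command in log]
    LIBRARY.write_text(json.dumps(library, indent=2))

def load_behavior(name):
    """Return the stored command log for a behavior, ready to replay."""
    library = json.loads(LIBRARY.read_text())
    return [(offset, tuple(command)) for offset, command in library[name]]
```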

Shell Design

 

Yet another essential dimension of I-Care-Us is experimentation with the design of the AR.Drone 2.0 shell. Technical issues (the limit of the load that can be transported, the influence on flight behavior, etc.) arise immediately, but, more importantly, each iteration of a given shell design also dramatically alters the perception of the aerial robot – the creature – and of its interaction intentions and characteristics. An important part of our artistic research will address the interrelatedness of shell design and the social interaction capacities of the drone.

AR.Drone 2.0 live performance app

vertical camera

Sandro has so far succeeded in building a stand-alone (Windows) application for the AR.Drone 2.0 that allows precise flight control with a Logitech gamepad, including 20 different flight animations; LED animations; receiving the video stream with little latency; and logging flight data. As I am writing this post, Sandro is implementing the OSC communication protocol to stream flight data to another application and to receive data in return. The stand-alone app is a big step for all of us, because in LabView it can always be updated to our needs, even during a rehearsal!
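To give an idea of what such an OSC receiver might look like on the other end (our actual receivers are Max patches and LabView applications; the port and addresses here are assumptions), a minimal listener in Python with python-osc could be:

```python
# Minimal OSC listener sketch (python-osc): print whatever flight data arrives.
# Port and addresses are assumptions; the real receivers in the project are
# Max patches and LabView applications.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_flight_data(address, *values):
    # e.g. address "/drone/altitude" with a single float argument
    print(address, values)

dispatcher = Dispatcher()
dispatcher.map("/drone/altitude", on_flight_data)
dispatcher.map("/drone/pitch", on_flight_data)
dispatcher.set_default_handler(on_flight_data)  # catch any other address

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()  # blocks; stop with Ctrl+C
```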

rehearsal 2.0

 

Lately we have been exploring aerial silk techniques to expand the inter-species dialogue between the quadrotor and the performer(s) into the ether. In this rehearsal video Miguel and Diana (his acrobatics coach and guest dancer for the session) explore a whole range of gestures and body motions to engage with Sandro’s delicate flight maneuvers. Sandro is a talented young Italian programmer who has joined our team to research what it takes to develop a custom-built toolkit for the AR.Drone 2.0 in the scientific programming environment LabView.

the project

 

I-CARE-US is a performance for dancers and quadrotors by Fernando Nabais and Stephan Jurgens, to be premiered in 2013. This rehearsal video shows the first stages of experimentation with improvisational strategies for this “inter-species” dialogue, starting points for programming the quadrotor’s behaviors, and ideas for the use of the live video feed provided by the on-board cameras.