Hybrid Futures: Jules LaPlace Introduction

The holidays are nearing, and so is our last event of the year: "Hybrid Futures. Between Art and Science". And what better place to speak about what's to come than at the new Futurium? The future is right around the corner; don't miss it. This time, our weeknote is dedicated to one of the two AI experts joining us: Jules LaPlace.

American-born Jules LaPlace is a web developer and coder based in Berlin. His current focus is on building new tools for artistic expression while exploring the potential they offer musicians, guided by questions like "what are we hearing when we listen to neural automata?". In the past he led the development of interactive and viral websites for clients such as Google, Nike and the Metropolitan Museum of Art. Software he built for human rights researchers at VFRAME received an Award of Distinction in Artificial Intelligence at Ars Electronica in 2019.

VFRAME started as an open-source prototype that uses computer vision object detection to locate illegal munitions within vast amounts of video from conflict zones in Syria. For the project, the team developed a visual search engine and custom algorithms for processing million-scale video datasets.
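To make the approach concrete: at its core, such a pipeline samples frames from video and runs an object detector over each one, flagging confident hits for human review. The sketch below is not VFRAME's actual code; it stands in with an off-the-shelf torchvision detector, a hypothetical input file, and arbitrary sampling rate and confidence threshold.

```python
import cv2
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Off-the-shelf detector as a stand-in for VFRAME's custom munition models.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

cap = cv2.VideoCapture("conflict_footage.mp4")  # hypothetical input file
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % 25 == 0:  # sample roughly one frame per second
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
        with torch.no_grad():
            detections = model([tensor])[0]
        for box, score in zip(detections["boxes"], detections["scores"]):
            if score > 0.8:  # keep only confident hits for review
                print(frame_idx, [round(v) for v in box.tolist()], float(score))
    frame_idx += 1
cap.release()
```

At million-video scale the real system needs indexing and deduplication on top of this, but the frame-sample-detect-flag loop is the conceptual core.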

Today, LaPlace primarily builds systems using neural networks for the artists Hito Steyerl and Holly Herndon. For Herndon, the team tested and deployed numerous models trained on excerpts of her voice, eventually creating a vocal synthesizer that can take nearly any audio input and produce output that sounds like Herndon. This voice model appeared on her 2019 album "Proto".
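The architecture behind "Proto" isn't documented here, so the toy sketch below only illustrates the general idea of such a voice model: a decoder trained solely on one singer's voice, forced by a narrow bottleneck to render whatever comes in using the timbre it learned. All names, shapes and training details are illustrative assumptions.

```python
import torch
import torch.nn as nn

class VoiceAutoencoder(nn.Module):
    """Toy timbre-transfer sketch, not the actual "Proto" system."""

    def __init__(self, n_mels=80, bottleneck=16):
        super().__init__()
        # The narrow bottleneck keeps "what is said" but makes the
        # decoder fill in "how it sounds" from its training voice.
        self.encoder = nn.GRU(n_mels, bottleneck, batch_first=True)
        self.decoder = nn.GRU(bottleneck, n_mels, batch_first=True)

    def forward(self, mel):  # mel: (batch, time, n_mels)
        z, _ = self.encoder(mel)
        out, _ = self.decoder(z)
        return out

model = VoiceAutoencoder()
# Training would reconstruct mel spectrograms of the target voice only:
#   loss = mse_loss(model(target_voice_mel), target_voice_mel)
# Inference then feeds in a mel spectrogram of any audio:
#   herndon_like_mel = model(arbitrary_input_mel)
```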

He has collaborated with Hito Steyerl on several projects. For "The City of Broken Windows", a machine learning model synthesized the sound of breaking glass: a network trained on just three minutes of glass-breaking noise proved enough to generate similar sounds on its own. In the film, these sounds are juxtaposed with employees of a security company who train an AI to hear when windows break, so it can determine whether a house is being broken into.
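A minimal sketch of that idea: overfit a small autoregressive network on one short recording, then let it generate similar audio on its own. The project's actual architecture isn't described in this text; the sample-level LSTM below and the file name "glass.wav" are illustrative stand-ins.

```python
import torch
import torch.nn as nn
import torchaudio

wave, sr = torchaudio.load("glass.wav")  # hypothetical ~3 min mono recording
x = torchaudio.functional.mu_law_encoding(wave[0], 256)  # ints in [0, 256)

class SampleLSTM(nn.Module):
    def __init__(self, classes=256, hidden=512):
        super().__init__()
        self.embed = nn.Embedding(classes, 64)
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, classes)

    def forward(self, seq, state=None):
        h, state = self.lstm(self.embed(seq), state)
        return self.head(h), state

model = SampleLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Teacher-forced training on random chunks of the one recording.
for step in range(1000):
    i = torch.randint(0, len(x) - 1025, (1,)).item()
    chunk = x[i : i + 1025].unsqueeze(0)
    logits, _ = model(chunk[:, :-1])
    loss = loss_fn(logits.reshape(-1, 256), chunk[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()

# Generation: feed the model's own output back in, one sample at a time.
seq, state = x[:1].unsqueeze(0), None
out = []
for _ in range(sr):  # one second of new "glass" sound
    logits, state = model(seq, state)
    seq = torch.multinomial(logits[0, -1].softmax(-1), 1).unsqueeze(0)
    out.append(seq.item())
audio = torchaudio.functional.mu_law_decoding(torch.tensor(out), 256)
torchaudio.save("generated_glass.wav", audio.unsqueeze(0), sr)
```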

"This is the Future" uses machine learning algorithms to look 1/24th of a second into the future. A neural network is trained to memorize a short video and the output is fed back into itself, resulting in a feedback loop in the form of a psychedelic animation of the past, present and future all at once. Here, Artificial Intelligence is being used as a new form of divination.

He says that all these projects are connected by the central role that data itself plays: the systems aren't programmed directly; instead, the data is the new form of code.

Ultimately, he and his collaborators attempt to resist and challenge these processes from below by making software for small teams, using small datasets, and using these to think critically about big problems.

We're excited to see where he thinks our future is heading and can't wait to find out more about his practice. If you are too, head on over to the Futurium on December 12th at 8pm and join us.