Artists Make DeepDream Dream Deeper With Secret 'DeepUI' Tool
Image: Michael Tyka/Google

The makers of a DeepDream music video think neural networks could have a big place in pop culture.

It was 1968 when Philip K. Dick posited the question, "Do androids dream of electric sheep?" While we still can't answer that with unequivocal certainty, we took one step closer this July, when Google unveiled its DeepDream project.

DeepDream uses a convolutional neural network to augment imagery: the network, an artificial intelligence program that views images in order to categorise them for Google, is turned back on itself, so that instead of labelling a picture it amplifies whatever patterns it thinks it sees in one. In the most basic terms, it's a computer trying to bring order to the chaos of the mundane.
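To make that concrete, here is a minimal sketch of the DeepDream technique, not Google's released code: take a pretrained image classifier (a torchvision VGG16 and an arbitrary layer choice are assumed here purely for illustration), run a picture through it, and nudge the pixels by gradient ascent so the chosen layer's activations grow stronger.

```python
# Illustrative sketch of the DeepDream idea. Assumptions: PyTorch/torchvision,
# a pretrained VGG16, and an arbitrary layer index -- not Google's actual code.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# The classifier whose "visions" get amplified; any pretrained convnet would do.
model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()
LAYER = 20  # which layer's activations to boost (an arbitrary choice)

def deep_dream(image_path, steps=20, lr=0.02):
    img = T.ToTensor()(Image.open(image_path).convert("RGB")).unsqueeze(0)
    img.requires_grad_(True)
    for _ in range(steps):
        x = img
        for i, layer in enumerate(model):   # forward pass, stop at the chosen layer
            x = layer(x)
            if i == LAYER:
                break
        loss = x.norm()      # how strongly does this layer respond to the image?
        loss.backward()      # gradient of that response with respect to the pixels
        with torch.no_grad():
            img += lr * img.grad / (img.grad.abs().mean() + 1e-8)  # gradient ascent
            img.grad.zero_()
            img.clamp_(0.0, 1.0)
    return T.ToPILImage()(img.detach().squeeze(0))
```

A production pipeline adds input normalisation, multiple scales ("octaves") and jitter, but the core loop is just this: measure how strongly a layer responds, then push the image in the direction that makes it respond more.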

But it's sexier than it sounds, because in searching for patterns these computational visions create a kaleidoscopic pareidolia: ordinary images transformed by vivid colours, phantasmagoric pagodas, flora and fauna, hundreds of terrifying eyes and, err… lots of abstract dog snouts, something that feels closer to the visions and hallucinations Dick experienced in his own life than any he wrote about in his fiction.

But, beyond altering your Facebook profile picture into something vaguely freaky, what does the future hold for this technology?

The first step towards answering that question comes from an unlikely source: UK pop trio and BBC "Sound of 2015" winners Years & Years. Or rather, from the band's most recent music video, a remix of their best-selling single "Desire," which was the brainchild of US-based director Brian Harrison and marks the first commercially released project to incorporate DeepDream.

Harrison has long been concerned with heightened states of perception, and his work is strongly influenced by the writings of author, intellectual and renowned psychonaut Terence McKenna and his colleagues Ralph Abraham and Rupert Sheldrake (to whom the film is dedicated). "Most of my writing, directing and creative energies are focused on the mysteries of consciousness and the psyche," he explained.

The idea of blending DeepDream with film came to Harrison after he watched a clip from Fear and Loathing In Las Vegas online that had been augmented with DeepDream to hallucinogenic effect, uploaded by Roelof Pieters, a data science consultant and PhD candidate in Deep Learning at the Royal Institute of Technology in Stockholm. It left Harrison feeling compelled to utilize the technology, and not simply because of its unique cinematic potential. "When I saw the Fear and Loathing clip I was excited, not only by its visual implications, but by the idea of machines being birthed into consciousness through dreaming… and that their consciousness seemed completely psychedelic," he said.

Pieters created the video with an open-source DeepDream animation tool, which he built with collaborative partner Samim Winiger, whose work also explores the intersection between creativity and machine learning.

The pair, drawn together by a mutual appreciation for "experimentation, generative systems and ethical computing," met serendipitously online, and immediately saw the value in DeepDream's future. "Its release was a seminal moment," said Winiger. "It brought creative AI and generative tech into the consciousness of the general public."

"And besides the fun imagery, visualizing what a deep neural network learns is important for research," he continued. "Because it helps us develop a better understanding of machine learning processes and build better models."

Harrison contacted the duo not long after the release of DeepDream, and they immediately saw potential for collaboration on "Desire." "Pop culture is an important tool to drive interest and understanding of machine learning processes," explained Winiger. "And DeepDream combines machine learning and pop culture in a way not seen since fractals were popularized in the 80s."

But if you thought it was as easy as simply overlaying DeepDream on moving images, think again. It was a laborious process, involving months of work and four days of shooting in locations across California. "We went through thousands of ideas and initial inputs before we even began the process," Harrison said. Meanwhile, Winiger and Pieters were coding editing software that allowed Harrison's hours of real-time footage to be pulled together and amalgamated with DeepDream.
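Neither that editing software nor the band's full pipeline has been published, but the general shape of frame-by-frame video processing is easy to sketch. The snippet below is a generic OpenCV illustration, not Winiger and Pieters' tool; dream_fn stands in for any DeepDream-style image transform, such as the deep_dream sketch above adapted to work on arrays.

```python
# Generic frame-by-frame video processing (assumptions: OpenCV is installed and
# "dream_fn" is any image -> image transform that preserves the frame size).
# This is an illustration only, not Winiger and Pieters' editing software.
import cv2

def dream_video(in_path, out_path, dream_fn):
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    ok, frame = cap.read()
    while ok:
        writer.write(dream_fn(frame))  # transform each frame before writing it out
        ok, frame = cap.read()
    cap.release()
    writer.release()
```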

They were aided by some still-under-wraps tech called "DeepUI," a tool designed by Winiger and Pieters to edit the seemingly random dreamscape of DeepDream. For now, the pair remain tight-lipped on its wider ramifications. "The DeepUI was developed in the context of this project," said Winiger. "And while it will be released as open source soon, it's ongoing, so we're not discussing it in detail publicly."

"As engineers and researchers continue to move this tech forward, you'll start to see it more and more in all forms of the visual arts"

What they will say is that advances in creative AI are moving forward at breakneck speed: "New approaches and technologies come out practically every week," explained Winiger. "We see these approaches as an emerging revolution for creativity, which allows machines and humans to collaborate as equal partners."

Where they take it next will be under the banner of Artificial Experience, an agency designed to bridge the technological and creative industries, which is currently in its preparation stages but promises big things for 2016. "While Artificial Intelligence gets the headlines, Artificial Experience (as a medium that patterns your thinking) is often invisible," said Winiger. "AE is a distributed team, focused on artificial experience design that will bring applied machine learning to creative industries."

Whether the wild and weird imagery of DeepDream continues to captivate us, or ends up as tomorrow's digital equivalent of tie-dye, the concepts behind machine learning and their influence on the creative arts continue to grow apace. And as a filmmaker, Harrison remains excited for the future possibilities. "As engineers and researchers continue to move this tech forward, you'll start to see it more and more in all forms of the visual arts," he said. "It just amazes the mind to think that you are watching the embryonic stage of true machine consciousness: of man and machine interfacing in art."

@tomdefeat