
Hungry Like An Optoelectronic Wolf: Downloading Animal Senses Into Human Reality

Janus Rose
New York, US

Tucked away in our pockets, the tiny camera-computers we’ve taken to carrying around with us have gotten fairly good at enhancing our otherwise offline natural world. But while this augmented portrait of our surroundings allows us to easily locate the nearest Arby’s and find the fastest route to grandma’s house, it’s still very much incomplete.

What of the multitude of layers that lie outside of the human experience, those realms of data which flow so naturally through the biological senses of our fellow earth inhabitants? What if those pocket CPUs could overlay a bird’s ability to sense Earth’s magnetic fields, or a dog’s knack for smelling approaching intruders?


Enter Simone Ferracina’s Theriomorphous Cyborg, a game-like augmented reality experiment that draws on the idea of “infinite perceptual worlds.” That is to say, the world as we see it is only a tiny slice of an enormous panoply of sensory possibilities, limited only by the size of the universe’s collective biomass.

Against the illusion of a fixed common environment, [zoologist Jakob von] Uexküll advanced the notion of Umwelten, life-worlds specific to each individual animal and comprised of the operational and perceptual cues required to form—with varying degrees of complexity—complete functional cycles. The illustrations in Uexküll's books allow us to peer at the world as it would appear to a tick, a snail, a jackdaw, or a bee. "This experiment is useful"—writes philosopher Giorgio Agamben—"for the disorienting effect it produces in the reader, who is suddenly obliged to look at the most familiar places with non-human eyes."

How do we then begin incorporating the individual worlds of other perception-capable organisms? Ferracina splits them into various “LEVELS,” each augmenting or distorting the user’s perception based on animal traits. One re-draws surroundings with directional patches of color based on Earth’s magnetic fields, as mentioned above. Another garbles auditory input — including the user’s own voice — into a cacophony of animal sounds. And to simulate perceptual differences in the flow of time, one level approximates temporal displacement by overlaying a second, delayed video signal. Clearly, having a pair of AR goggles would come in handy.
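That time-delay level lends itself to a quick illustration. The sketch below is not from Ferracina’s project — it is a minimal stand-in for the idea of blending a live frame with a delayed copy of itself, treating each frame as a flat list of pixel intensities; the function name and the `delay` and `alpha` parameters are hypothetical:

```python
from collections import deque

def delayed_overlay(frames, delay=2, alpha=0.5):
    """Blend each frame with a copy of itself from `delay` frames ago.

    `frames` is a sequence of equal-length lists of pixel values;
    `alpha` weights the live frame against the delayed one.
    """
    buffer = deque(maxlen=delay + 1)  # holds the current frame plus `delay` past frames
    out = []
    for frame in frames:
        buffer.append(frame)
        past = buffer[0]  # oldest available frame (the live frame until the buffer fills)
        out.append([alpha * now + (1 - alpha) * then
                    for now, then in zip(frame, past)])
    return out
```

At startup the buffer has no past to draw on, so the first frames blend with themselves; once `delay` frames have elapsed, each output mixes the present with a fixed-lag echo of it.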


A similar project, Animal Superpowers, takes a more hands-on approach to optoelectronic animal metamorphosis, using wearable computer augments to simulate specific traits of unique animal perspectives. A helmet equipped with a camera, video screen, and voice-changing mechanism makes the wearer a “giraffe” simply by altering their speech and raising their line of sight by 30cm. A red helmet affixed to two hand-worn “feelers” approximates an ant’s antennae-based perception using zoomed-in microscope cameras in each of the hand units.

By electronically blending these realities, we stand the chance not only of gaining a more complete perceptual understanding of the natural world and the various organisms we share it with, but of transcending humanity entirely, merging all flavors of consciousness into a single interface of trans-biologic perception. Handheld digital devices will initialize our connection to this sensory social network, portals into the invisible alien worlds lurking within our own.

Connections:

via BLDGBLOG