
    GDC 11: Valve Talks Biofeedback, Biologic Inputs For Videogames

    Written by

    Joshua Kopstein

    Photos: Emi Spicer

    Imagine you’re playing a nerve-wracking round of Left 4 Dead and suddenly, right when you think you’ve found a moment of sanctuary from the undead hordes, a king-size mutated zombie charges out of the darkness, grabs you, and throws you 40 feet into a nearby wall. Your pulse races and a wave of anxiety washes over you as adrenaline floods your entire body in an instant.

    Now, imagine that everything going on in your body — the sweat on your forehead, the nervous tingle in your hands — is captured as data and used to enhance the game’s ability to scare the living shit out of you.

    That’s exactly what Valve has been playing around with recently, and their in-house experimental psychologist Mike Ambinder was on hand at the Game Developers Conference this week to show just how powerful incorporating biometric data into videogames can be.

    There’s a problem, Ambinder says, with the way traditional control schemes let us interact with games. It’s a shallow, one-way relationship: it translates player intent into onscreen action, but it ignores everything else going on in the player’s head, from mood to other emotional cues.

    Think about it: every development in videogame control interfaces up to now, from force feedback to pressure-sensitive buttons, has been an attempt to approximate these human responses through ordinary physical inputs. Ambinder’s talk suggests we’re almost ready to go one step further.

    Measuring the valence and arousal levels of a player’s emotional responses, he says, can add entirely new dimensions to the resulting gameplay. Using an improvised hardware device (seen above), Ambinder and his team were able to track and record some of these biological cues during sessions of Left 4 Dead by measuring skin conductance levels (SCL) in a player’s hand. The result: a neat map charting player responses to specific events in the game. Processed in real time, that data could feed L4D’s already impressive A.I. Director with even more information to shape the gameplay.
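    To make that pipeline a little more concrete, here’s a rough sketch, in Python, of how a stream of skin conductance readings might be turned into an arousal score and handed to a director-style pacing system. None of this is Valve’s code: the sensor stream, the spike threshold and the Director object are stand-ins for whatever the real system uses.

```python
# Rough sketch of the idea above: compare each skin conductance (SCL) sample
# to a rolling baseline and let a hypothetical director-style system react to
# arousal spikes. All names here are illustrative stand-ins, not Valve's
# actual implementation.

from collections import deque

class ArousalTracker:
    def __init__(self, window=30, spike_threshold=1.5):
        self.samples = deque(maxlen=window)      # recent SCL readings (microsiemens)
        self.spike_threshold = spike_threshold   # ratio over baseline that counts as a spike

    def update(self, scl):
        """Add a new SCL sample and return arousal relative to the recent baseline."""
        self.samples.append(scl)
        baseline = sum(self.samples) / len(self.samples)
        return scl / baseline if baseline else 1.0

def drive_director(sensor_stream, director):
    """Consume (game_event, scl_reading) pairs and nudge the pacing system."""
    tracker = ArousalTracker()
    for event, scl in sensor_stream:
        arousal = tracker.update(scl)
        if arousal > tracker.spike_threshold:
            director.ease_off(event)    # player is already stressed: back off
        else:
            director.ramp_up(event)     # player is calm: send in the horde
```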

    But biofeedback can go further than passively enhancing the games we play. Ambinder shows a demo of the upcoming Portal 2 played with eye-tracking tech, and the difference is immediately visible. By letting the game know exactly where the player is looking, it opens up all sorts of insight into player intent beyond the physical button press.
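    The gaze-to-intent step can be sketched just as simply: given a gaze point from an eye tracker and the on-screen bounds of a few objects, work out what the player is looking at so the game can respond. The tracker output and object list below are made up for illustration; they aren’t taken from the Portal 2 demo.

```python
# Illustrative sketch: map a gaze point (screen coordinates from an eye
# tracker) to whichever on-screen object the player is looking at.
# The objects and coordinates are hypothetical examples.

from dataclasses import dataclass

@dataclass
class ScreenObject:
    name: str
    x: float   # left edge in screen pixels
    y: float   # top edge
    w: float   # width
    h: float   # height

def object_under_gaze(gaze_x, gaze_y, objects):
    """Return the first object whose bounds contain the gaze point, or None."""
    for obj in objects:
        if obj.x <= gaze_x <= obj.x + obj.w and obj.y <= gaze_y <= obj.y + obj.h:
            return obj
    return None

# Example: the player hits the portal button while looking at a wall panel.
scene = [ScreenObject("wall_panel", 400, 200, 120, 120),
         ScreenObject("turret", 900, 500, 60, 90)]
target = object_under_gaze(455.0, 260.0, scene)
if target:
    print(f"Player intent inferred: interact with {target.name}")
```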

    So if things like eye tracking, facial mapping and SCL tracking are possible, what’s the hold-up? According to Ambinder, it’s not that easy. These technologies present a few problems: they’re costly, and they require extensive analysis on the software side. But most importantly, they’re intrusive to the player. The idea of attaching a bunch of electrodes to your face or strapping on a blood pressure monitor doesn’t sit well with most humans, and even for players who are fine with it, knowing they’re being tracked can significantly bias their eye movements and other natural responses.

    These problems have answers, though, Ambinder says, and Valve, a company with a reputation for pushing gameplay immersion through experimentation, seems to have a head start on rolling out technology that could change the way we interact with software forever.
