
The Internet of Things Will Be a Giant Persuasion Machine

The internet is a giant lab for companies to poke and prod us, and the internet of things will be no different.
Image: Wiki

It's 2020, and you've signed up for a weight-loss program. You're alone on a Saturday night because your significant other is out of town, none of your friends are available to catch a flick at the theatre, and a pizza is sounding pretty good. You Google the address for your favorite joint and walk over instead of ordering drone delivery because it's cheaper, and you're old fashioned like that. Just before you order at the counter, however, your friend calls you to ask if you want to see a movie after all.


How did this happen? An algorithm analyzing your communications and monitoring your friend networks learned that your partner was away, and your Facebook posts revealed that you wanted to see a movie but nobody was available. Sentiment analysis of your tweets suggested that you were feeling alone and a little sad.

Your networked fridge detected that you were out of kale. Your phone's GPS and Google Maps decided that you were likely moving towards the location of the pizza shop you googled. The weight-loss program you're a part of collected all these disparate points of data and notified your friend, who is also part of the program, and asked them to call you, suspecting that you might be experiencing a lapse of willpower. In other words, data has flowed through your social networks and gadgets to influence your routine behavior.
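To make the mechanics of that scenario a little more concrete, here is a minimal, purely hypothetical sketch of how such a system might fuse a handful of weak signals into a peer nudge. Nothing here comes from the article or the IEEE papers: `Signals`, `crude_sentiment`, and `should_nudge_friend` are invented for illustration, and the keyword-counting "sentiment analysis" is a toy stand-in for the real thing.

```python
# Hypothetical sketch of a "persuasion machine" fusing disparate signals
# into a peer intervention. All names and thresholds are illustrative.

from dataclasses import dataclass

NEGATIVE_WORDS = {"alone", "bored", "sad", "lonely"}


def crude_sentiment(posts: list[str]) -> float:
    """Toy sentiment score: fraction of posts containing a negative keyword."""
    if not posts:
        return 0.0
    hits = sum(any(word in post.lower() for word in NEGATIVE_WORDS) for post in posts)
    return hits / len(posts)


@dataclass
class Signals:
    partner_away: bool          # inferred from communications / calendar
    fridge_out_of_kale: bool    # networked fridge inventory
    near_pizza_shop: bool       # GPS plus maps search history
    recent_posts: list[str]     # social feed text


def should_nudge_friend(s: Signals, sentiment_threshold: float = 0.3) -> bool:
    """Trigger a peer intervention when several weak signals of a willpower lapse line up."""
    mood = crude_sentiment(s.recent_posts)
    risk_factors = [
        s.partner_away,
        s.fridge_out_of_kale,
        s.near_pizza_shop,
        mood >= sentiment_threshold,
    ]
    return sum(risk_factors) >= 3


if __name__ == "__main__":
    saturday_night = Signals(
        partner_away=True,
        fridge_out_of_kale=True,
        near_pizza_shop=True,
        recent_posts=[
            "nobody is around to see a movie tonight",
            "feeling kind of bored and alone",
        ],
    )
    if should_nudge_friend(saturday_night):
        print("Notify a friend in the program: suggest they call and propose a movie.")
```

In a real deployment each of those booleans would itself be the output of a messy inference pipeline, which is exactly the "large-scale integration" problem the researchers describe below.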

This is the future of behavior modification, according to a series of papers in a recent special issue of the Proceedings of the IEEE, titled "The Digital Age and the Future of Social Network Science and Engineering." As internet-connected devices continue to proliferate, forming the backbone of the internet of things, the strategies of persuasion in the online world will bleed into the offline one.


The key, in this vision of the future, is to take the insights into our everyday lives gleaned from the streams of data our interconnected devices emit and put them in the hands of an actual human who can act on them.


"The technologies are already here," Dylan Walker, author of a paper in the issue titled "Design of Randomized Experiments in Networks," wrote in an email. "What is missing is the large-scale integration of these technologies into unified systems and platforms that collect digital signals of our individual behaviors and social interactions, transform these signals into meaningful metrics in the context of desired behavioral changes, and mediate social interventions (with or without humans directly 'in the loop')."

"Such systems will likely not emerge or be sustained in the absence of explicit buy-in and trust from their users," he continued.

It's well-known that the internet is already essentially a giant, data-rich laboratory for corporations to poke and prod us, sometimes just to see how we react. After it was discovered that Facebook had been manipulating users' news feeds to track the spread of "emotional contagions" across the social network—in other words, does seeing more sad stuff make you sad?—a veritable shitshow of public complaints rolled in. But Facebook is still collecting massive amounts of data, and the algorithm that governs their news feed is still a secret.

Facebook isn't the only company performing experiments on its users. After the Facebook experiment's existence came to light, OkCupid came to the social network's defense, outlining some of its own user experiments. One involved removing all photos from the site to see how users would react, and another sent mismatched people on dates together. Every day, academics and researchers are figuring out ways to pull more information from our social feeds—what we like, how we express emotion, and who our friends are.


These trends will only continue, argue Walker and the authors of the other papers in the special issue. Just this week, Facebook moved a little closer to this vision with a suicide prevention initiative that asks friends to report posts indicating an imminent threat of self-harm so that the site can direct the person to help.

But is this really anything new, given that technologies have been used to persuade us for centuries?

"History also alleviates dystopian concerns: Broadcast television and the telephone have already brought persuasion en masse into our homes," wrote Walker. "That persuasion seems to be most effective when mediated or initiated by close and trusted peers is a strong indication that, even in a world of digital ubiquity, persuasion it is still subject to social checks and balances."

This all sounds somewhat innocuous, but it's worth noting that this vision depends heavily on the companies with the infrastructure and products to collect all the data needed to put a human in the feedback loop in the first place.

A tremendous amount of trust is involved in that relationship: Can we trust that the companies that control our data will take adequate measures to avoid being hacked? Can we trust that these companies won't use our data to surveil us in partnership with government agencies? In the example that opened this article, Google, the weight-loss program, Apple, Facebook, Twitter, and the company that owns your smart fridge all have a stake in the data you provide. "If you want society to benefit overall from the development of online and offline persuasion," wrote Walker, "you might pay attention to who's holding the keys."