Last year, a former flame of mine got married. I was happy for her—the wedding looked beautiful—but I wasn’t entirely happy that a photo of the event remained stuck to the very top of my Facebook newsfeed. Every single time I opened the website, there it was, for a week straight. By the end it felt like my browser was out to haunt me.
I have Facebook’s newsfeed algorithm to thank for the haunting. It’s the equation that determines which stories pop up for my account each day, and it’s supposed to select only things I will find compelling. According to the University of Georgia, it combines my estimated degree of personal interest, how well the post is doing with other users, the type of post, and its timeliness, among other variables.
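What does that weighted combination look like in practice? Here is a toy, EdgeRank-style ranker in Python; the factor names, weights, and decay rate are my own illustrative assumptions, not Facebook’s actual formula:

```python
import math

# Toy feed ranker, NOT Facebook's real code: combines personal interest
# ("affinity"), popularity ("engagement"), post type ("weight"), and age.
def score(affinity, engagement, weight, age_hours, decay_rate=0.01):
    freshness = math.exp(-decay_rate * age_hours)  # older posts count less
    return affinity * engagement * weight * freshness

posts = [
    {"id": "wedding-photo", "affinity": 0.9, "engagement": 450,
     "weight": 1.5, "age_hours": 120},
    {"id": "news-link", "affinity": 0.2, "engagement": 80,
     "weight": 1.0, "age_hours": 3},
]
ranked = sorted(posts, key=lambda p: score(p["affinity"], p["engagement"],
                                           p["weight"], p["age_hours"]),
                reverse=True)
print([p["id"] for p in ranked])  # ['wedding-photo', 'news-link']
```

With a slow enough decay, a heavily liked post can sit at the top of the feed for a week, which is exactly the behavior I was living with.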
In my case, the algorithm failed. But others have fared far worse. Facebook’s Year in Review feature at the end of 2014 was supposed to collect heartwarming memories into a video slideshow by selecting the posts that got the most engagement. Unfortunately, the algorithm dredged up photos of the deceased, pets that had passed away, and even apartments engulfed in flames. Writing in Slate, Eric Meyer, whose slideshow surfaced a photo of his six-year-old daughter, who had died of brain cancer earlier that year, called the phenomenon “algorithmic cruelty.”
From writer Caroline O’Donovan’s coinage of “algorithm fatigue” at Nieman Journalism Lab to technology critic Ian Bogost’s complaint that algorithms have been “allowed to replace gods in [users’] minds,” it’s clear we’re developing a kind of paranoia about what these mysterious programs can do. The problem is that we’ve grown disgusted with algorithms before we even recognize what they really are, and we fail to realize just how much they actually help us.
Today, algorithms drive our Spotify music recommendations, the search results we get from Google, and the high-frequency trading programs that may have incited the “flash crash” of 2010. They’re behind Amazon’s variable pricing strategies and the NSA surveillance triggers that trawl emails for keywords. But they started out very simply indeed.
The word itself derives from Algoritmi, the Latinized name of Muhammad ibn Musa al-Khwarizmi, the ninth-century Persian mathematician working in Baghdad whose treatise on al-jabr gave algebra its name. “Algorism” became the practice of doing arithmetic with the Hindu-Arabic decimal numerals his work popularized. The modern usage came to the fore in the mid-19th century with the work of Ada Lovelace, the British mathematician widely credited with writing the first computer program, an algorithm for calculating Bernoulli numbers, designed for Charles Babbage’s proposed Analytical Engine.
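No modern language resembles the table of operations Lovelace laid out in her notes, but the calculation itself is compact. A minimal sketch in Python, using the standard Bernoulli recurrence rather than her actual method:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0 through B_n as exact fractions."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Standard recurrence: B_m = -1/(m+1) * sum_{j<m} C(m+1, j) * B_j
        B[m] = -sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1)
    return B

print(bernoulli(8))  # B_1 = -1/2 in this convention; higher odd B_n are 0
```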
Algorithms are just equations—extremely specific equations based on proprietary data and user feedback, but equations nonetheless. Yet they form our only real interface with the mammoth amounts of data that the Internet collects. “I think algorithms were introduced to the user experience of computers and Internet services to sort, curate and deliver the Web we want,” says Matthew Plummer-Fernandez, an artist whose work, like the recent Novice Art Blogger, explores the boundaries of algorithmic sensitivity. “We have these pre-configured, needy customer expectations. If you order soup at a restaurant, you expect soup, not something that closely resembles soup.”
It’s this uncanny valley of feedback that makes algorithms so uncomfortable. We have been taught that the Internet exists to deliver us exactly what we want, when we want it. We expect it to anticipate our needs before we even know what we’re looking for. So we only notice the existence of algorithms when they fail spectacularly. If the wrong Facebook story turns up or Google Maps tries to send us into the ocean, we’re reminded that we have little agency in the face of all this information. The Internet becomes the lamp’s genie, rebelling by taking our wishes too literally, and ordinary users are powerless to make it do just what they want.
Even getting served exactly the right result is a reminder that our browsing habits are being tracked all the time. “Developers and engineers are technological solutionists—they think these targeted streams of information are helping us,” says O’Donovan. “But for the user, yet another algorithm springing up from the digital abyss can feel prying.” Looking up a single product on Amazon and then having that jacket or bag follow me around in every Google Ads sidebar for days afterward certainly inspires a case of algorithm fatigue.
Paradoxically, even when algorithms malfunction, they have a way of humanizing our online experience by reminding us that there are, well, humans behind virtual space.
“Every algorithm betrays the myth of unitary simplicity and computational purity,” Bogost writes. Rather than perfect digital servants, “mostly they are metaphors.” So if these equations are less about the perfection of technology than the imperfections of the humans who develop them, then perhaps the utility of the algorithm lies less in showing us what we want than in giving us an idea of what we might actually need.
Such is the mission of Pplkpr (“People Keeper”), a project by artists Kyle McDonald and Lauren McCarthy that shows how algorithms might help us manage our own lives a little better. Pplkpr pairs a wrist-worn heart-rate monitor with an app that constantly measures users’ emotions in conjunction with the people they are around.
“Heart rate variability drives the algorithm that we’re using to extract stress levels from heart rate data,” McDonald tells me. “Your heart beats at irregular intervals, but when you’re stressed, running six miles, or having a difficult conversation, your heart will beat faster and more evenly.” If your heart rate shows you’re particularly stressed around one person, the app will delete their contact from your phone in a not-so-subtle hint to spend less time with them. “There were definitely a couple relationships that were ended because of this app, and some that were started,” McDonald says.
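A rough sketch of the signal McDonald describes: heart-rate variability is commonly summarized as RMSSD, the root mean square of successive differences between beats, with low variability plus a fast rate reading as stress. The function names and thresholds below are my own illustrative guesses, not Pplkpr’s actual code:

```python
import statistics

def rmssd(rr_ms):
    """Root mean square of successive differences between heartbeats (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def looks_stressed(rr_ms, hrv_floor_ms=20.0, rate_ceiling_bpm=100.0):
    """Heuristic: fast AND unusually even beats suggest stress."""
    bpm = 60000.0 / statistics.mean(rr_ms)  # intervals in ms -> beats/minute
    return bpm > rate_ceiling_bpm and rmssd(rr_ms) < hrv_floor_ms

calm = [820, 850, 790, 870, 810, 845]    # slow, irregular beats
tense = [540, 545, 542, 548, 541, 544]   # fast, metronome-even beats
print(looks_stressed(calm), looks_stressed(tense))  # False True
```

A real product would smooth over minutes of readings and calibrate to each wearer; the point is only that “faster and more evenly” is something a wristband can measure.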
Rather than controlling your social life outright, the app is a way of surfacing emotions that we might not even consciously recognize and then prompting us to act on them. The net effect is much more human than technological. “The algorithm only goes so far, but it’s met halfway by our willingness to trust and buy into these things, and make them real and work,” McCarthy explains. Like all gadgets, it requires the cooperation of the user.
Pplkpr confronts the basic paradox behind algorithm paranoia: We can’t live with them, but we can’t live without them. “What do you do with those feelings of ‘this thing is terrible or evil’ if your life is getting better from it?” McCarthy says.
To call an algorithm “cruel” is to blame the robot rather than its creator. The algorithm itself is neither good nor evil; it is simply a product of human ingenuity, carrying all the biases and errors of the fallible engineers who built it in an office in California.
Facebook uses a very human review team to oversee the day-to-day functioning of algorithms like the ones that determine its trending topics. The Year in Review product was likewise tested on a handful of employees, but its algorithm wasn’t designed to steer around sensitive subjects like breakups, deaths, or house fires. The company ended up apologizing publicly for the algorithmic accidents and sent Meyer a personal apology.
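What would steering around those subjects have looked like? At its crudest, a keyword screen applied before the highlights are chosen. This is a hypothetical sketch, not Facebook’s approach, and its bluntness hints at why the problem is hard:

```python
# Hypothetical guardrail of the kind Year in Review lacked: drop candidate
# posts that mention obviously painful topics, however popular they were.
SENSITIVE_TERMS = {"passed away", "funeral", "breakup", "in memory", "fire"}

def safe_highlights(posts):
    """Filter highlight candidates with a crude keyword blocklist."""
    return [p for p in posts
            if not any(term in p["text"].lower() for term in SENSITIVE_TERMS)]

posts = [
    {"text": "Best vacation ever!", "likes": 120},
    {"text": "Our apartment fire, one year later", "likes": 300},
]
print(safe_highlights(posts))  # keeps the vacation, drops the fire post
```

A blocklist misfires in both directions, flagging a harmless “campfire singalong” while missing grief that never uses the keywords; the missing ingredient is context, which is precisely what these algorithms lack.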
Over the next decade, algorithms are likely to become even more invisible, subtly shaping our behaviors online and off. The problem we’re facing today isn’t so much that algorithms exist—they’re necessary to deal with our sprawling informational landscape—but that we haven’t done enough work on the human side of the equation. As Kyle McDonald points out, “We just have bad algorithms right now.”
A world with better algorithms would help us coexist with our technology more seamlessly than ever. That imaginary future is full of self-driving cars that never crash, coffee bars that auto-brew each cup according to the drinker’s habits, and wearable therapists that alert us whenever we’re feeling stressed out, then tell us how to fix it. It’s a world of customization and on-demand, omnipresent technology that we might already desire, but we’re still learning what it will take to get there. This endless fulfillment, however, also inspires a kind of deep ambivalence. We wish we didn’t want what algorithms have to offer us, but the extent to which we do freaks us out.
Creating technology that’s as responsive as our own brains is too tempting a prospect to resist. But building that future will require an evolution in ourselves as much as in our devices.
Kyle Chayka is a writer in Brooklyn. He covers technology and culture for publications including Newsweek, Gizmodo, and Matter.