
You're Addicted to Your Smartphone. This Company Thinks It Can Change That


New apps can push better habits, more transparency

Illustration by Martin Gee for TIME
By Haley Sweetland Edwards
Updated: April 13, 2018 10:28 AM ET | Originally published: April 12, 2018
The headquarters of Boundless Mind looks as if it were created by a set designer to satisfy a cultural cliché. The tech startup is run out of a one-car garage a few blocks from California's Venice Beach.
On the morning I visited, in March, it was populated by a dozen screens–phones, tablets, monitors–and half as many 20-something engineers, all of whom were male and bearded, and one of whom wore a cowboy hat. Someone had written in blue marker across the top of a whiteboard in all caps: You're building amazing sh-t.
But that, more or less, is where the Silicon Valley stereotypes end. Ramsay Brown, 29, and T. Dalton Combs, 32, the co-founders of Boundless Mind, are hardly the college dropouts of tech lore; they're trained neuroscientists. And unlike most tech entrepreneurs, they are not trying to build the next big thing that will go viral. In fact, Boundless Mind's mission is almost the opposite. The company wants to disrupt America's addiction to technology. "It used to be that pathogens and cars were killing us," Brown says. "Now it's cheeseburgers and social media. It's our habits and addictions."
Every day, we check our phones an average of 47 times–every 19 minutes of our waking lives–and spend roughly five hours total peering at their silvery glow. There's no good consensus about what all this screen time means for children's brains, adolescents' moods or the future of our democratic institutions. But many of us are seized these days with a feeling that it's not good. Last year, the American Psychological Association found that 65% of us believe that periodically unplugging would improve our mental health, and a 2017 University of Texas study found that the mere presence of our smartphones, face down on the desk in front of us, undercuts our ability to perform basic cognitive tasks. New York University psychologist Adam Alter describes the current state of tech obsession as a "full-blown epidemic."
The problem, critics agree, begins with Silicon Valley's unique business model, which relies on keeping us in the thrall of our screens. The longer we are glued to an app–a value nicknamed eyeball time–the more money its creators make by selling our attention and access to our personal data to advertisers and others. You and I are not customers of Facebook or Google; we are the product being sold.
Boundless Mind, founded in 2015, has 10 employees and operates out of a one-car garage near Venice Beach
Scott Witter for TIME
This business model has driven an explosion of interest in what's known as persuasive technology, a relatively new field of research that studies how computers can be used to control human thoughts and actions. The field, which draws on advances in neuroscience and behavioral psychology, has fueled the creation of thousands of apps, interfaces and devices that deliberately encourage certain human behaviors (keep scrolling) while discouraging others (convey thoughtful, nuanced ideas). "If, 20 years ago, I had announced that we would soon be creating machines that control humans, there would have been an uproar," wrote B.J. Fogg, a Stanford University behavior scientist who was one of the first academics to seriously study how computers influence human behavior. But now, he notes, "we are surrounded by persuasive technologies."
Every major consumer tech company operating today–from behemoths like Amazon to the lone programmer building the next Candy Crush–uses some form of persuasive technology. Most of the time, the goal is unambiguous: the companies want to get us to spend as much time as possible on their platforms. Facebook's platform, for example, is not neutral. Its designers determine which videos, news stories and friends' comments appear at the top of your feed, as well as how often you're informed of new notifications. Snapchat's interface distributes badges to users who maintain daily streaks–a nifty system built in part on humans' well-studied psychological need to bank progress. "Your kid is not weak-willed because he can't get off his phone," Brown says. "Your kid's brain is being engineered to get him to stay on his phone."
In the past year, Silicon Valley insiders have raised the alarm about the real-world impact of all this persuasive tech. Former Google employee Tristan Harris and early Facebook investor Roger McNamee have accused the tech giants of deliberately creating addictive products, without regard for human or social health, and this year, two major Apple shareholders publicly called on the company to design a less-addictive iPhone. Others have championed the idea of tech detox. In San Francisco, "technology mindfulness" conferences, like Wisdom 2.0, have sprung up alongside tech-free private schools, tech-free meet-ups, and apps like Moment and Onward, which are designed to help people curb their phone use. In Germany, a growing number of corporations, including Volkswagen and BMW, have begun restricting how employees can send or receive nonemergency emails after hours, and in Brooklyn, a tiny device manufacturer, Light, is promoting a new "dumb phone" that does little more than make calls. It's been marketed as a phone that should be used as little as possible.
Brown and Combs are sympathetic to this backlash, but they're also deeply skeptical of the proposed solutions. "We're not getting rid of this stuff–there's no way," Brown says. "No piece of technology, once adopted, ever gets put back in the box." Instead, he and Combs propose a different tactic, born of the relentless optimism of Silicon Valley: fight fire with fire. Why not harness those same, powerful persuasive technologies that Big Tech has in its arsenal but, instead of deploying them to maximize eyeball time, use them to promote a healthy, democratic society?
Scientists don't yet know how using smartphones for hours every day affects our brains
Gijsbert van der Wal
Boundless Mind, founded in 2015 as Dopamine Labs, has raised $1.5 million and boasts just 10 employees and 14 customers. But its business model has the benefit of being provocative. "We're talking about mind control–oh my God, right?" Brown says, his eyes widening in mock disbelief. "But what if we sell you those mind-control tools to help people get off opioids? Or to communicate with each other on a more meaningful level?" Brown gestures to my phone, which sits like an arbiter between us. "We already know how to engineer your brain to be a good little social-media user," he says. "Why can't we engineer your brain to be who you want to be?"
The founders of Boundless Mind are in some ways a study in opposites. Brown, the more garrulous of the two, is fluent in the unself-conscious informality of the West Coast tech scene. He signs his emails with emojis–a bear, a red heart, a sun–describes himself on the company's website as an "escaped circus bear" and favors collared shirts unbuttoned to the sternum, revealing a tan wilderness of chest hair. Combs, who takes a backseat to Brown as the company's de facto spokesperson, tends to answer questions with numbers and data, his hands twitching toward a tablet nearby. On the two occasions we met, he wore a fleece, zipped all the way up. But both share a deep conviction that in a world saturated with interactive technology, our brains, however complex, can be hijacked and programmed–for better or worse.
The two met as graduate students in the neuroscience program at the University of Southern California. (Brown later received a master's degree in neuroinformatics, Combs a Ph.D. in neuroeconomics.) Their friendship was born over beers and a mutual disappointment in what are known as behavior-change apps–tools designed to help people commit to certain actions, like dieting or quitting smoking. It was clear to them as computational neuroscientists that despite any good intentions, those products were ignoring rich neurobiological research showing how our brains form new habits. This failure, they thought, was a market opportunity. "We realized that we have an uncommon understanding of where human behavior comes from and how to change it," Brown says. "Not just at the level of some New York Times best seller–'Do something for 30 days, it'll stick!'–but at a fundamental, academic level."
One day at their office, Brown walked over to a whiteboard, drew an outline of a human brain in orange marker and turned to face Combs and me. The brain, he explained, sounding like the graduate teaching assistant he once was, has two basic neural pathways for controlling behavior. One is structurally weak but helps us make conscious, intentional decisions to serve our long-term goals. The other is more automatic and easily suggestible. Brown drew an orange swirl in the middle of the brain: the basal ganglia. When the brain registers an external cue that often precedes a reward, like the ding of a Facebook notification, the basal ganglia receive a burst of dopamine, a powerful neurotransmitter linked to the anticipation of pleasure. That three-part process–trigger, action, reward–undergirds the brain's basic habit-forming loop, he said.
That loop is just the beginning, Combs added, jumping in. If you're trying to get someone to establish a new behavior–"to really glue it in tight"–computer engineers can draw on different kinds of positive feedback, like social approval or a sense of progress, to build on that loop. One simple trick is to offer users a reward, like points or a cascade of new likes from friends, at unpredictable times. The human brain produces more dopamine when it anticipates a reward but doesn't know when it will arrive, Combs explained. Psychologists refer to this as a variable reward schedule. Combs and Brown call it engineering "surprise and delight."
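To make the mechanism Combs describes concrete, here is a minimal, hypothetical Python sketch of a trigger-action-reward loop in which the reward arrives on an unpredictable schedule. The names and the 30% reward rate are assumptions for illustration; this is not Boundless Mind's code or any company's actual implementation.

import random

class VariableRewardLoop:
    """Hypothetical sketch of a trigger-action-reward loop whose rewards
    arrive unpredictably ("surprise and delight"). Illustrative only."""

    def __init__(self, reward_probability=0.3):
        # On average roughly one action in three earns a reward,
        # but the user never knows which one.
        self.reward_probability = reward_probability

    def on_trigger(self):
        # Trigger: e.g., a notification ding that invites an action.
        return "You have new activity!"

    def on_action(self):
        # Action: the user opens the app or refreshes the feed.
        # The reward is delivered on a variable schedule.
        if random.random() < self.reward_probability:
            return {"reward": True, "payload": "12 new likes"}
        return {"reward": False, "payload": None}

loop = VariableRewardLoop()
print(loop.on_trigger())
for _ in range(5):
    print(loop.on_action())

Because the user cannot predict which check of the app will pay off, every check carries the anticipatory jolt Combs describes.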
Most of the alluring apps and websites in wide use today were engineered to exploit this habit-forming loop. Snapchat, for example, which relies heavily on the trigger-action-reward triumvirate, also uses a powerful trick to get users to open the app daily. When two people send and receive Snaps with each other for days on end, both receive emoji flames next to their names, alongside a number, which ticks up every 24 hours, indicating how long the two have maintained their connection. If either misses a day, both lose their flame. That interface, while playful, capitalizes on what psychologists call the endowed progress effect. Fearful of zeroing out their banked progress, teenagers have handed over their log-in information to friends before vacations.
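A streak mechanic of this kind takes only a few lines to build. The sketch below is a generic, hypothetical Python version, not Snapchat's actual code: the banked count resets to zero the moment a day is skipped, which is exactly what makes the progress feel too valuable to abandon.

from datetime import date, timedelta

class Streak:
    """Hypothetical streak counter illustrating the endowed progress
    effect: days of banked progress that vanish after one missed day."""

    def __init__(self):
        self.count = 0
        self.last_day = None

    def record_exchange(self, today: date):
        if self.last_day == today:
            return self.count                      # already counted today
        if self.last_day is not None and today - self.last_day > timedelta(days=1):
            self.count = 0                         # one missed day wipes the streak
        self.count += 1
        self.last_day = today
        return self.count

    def badge(self):
        # A flame-style badge appears only once some progress is banked.
        return f"streak: {self.count} days" if self.count >= 3 else ""

s = Streak()
for d in range(5):
    s.record_exchange(date(2018, 4, 1) + timedelta(days=d))
print(s.badge())   # streak: 5 days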
Pinterest, one of the first Silicon Valley firms to hire behavioral psychologists to work alongside designers, plays on our psychology in a different way. Its interface, which features an endless scroll of pictures arranged in a staggered, jigsaw-like pattern, is human catnip. It ensures that users always see a partial image of what comes next, which tantalizes our curiosity and deprives us of any natural stopping point, while simultaneously offering an endless well of new content. Brown and Combs refer to this as "bottomless bowl" design, a reference to a 2005 Cornell University study that found that participants ate 73% more soup when their bowls secretly self-refilled. Dozens of other apps employ similar interfaces. No matter how long you scroll down on Facebook, Instagram or Twitter, and no matter how many hours you spend watching YouTube or Netflix, there is always more content cued up to auto-play.
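The "bottomless bowl" pattern is just as simple at the code level. The following hypothetical Python sketch, with a made-up fetch_page helper standing in for a real backend, shows a feed that silently appends the next page whenever the current one runs out, so the loop never offers a stopping point.

import itertools

def bottomless_feed(fetch_page):
    # A feed with no terminating condition: when one page of items is
    # exhausted, the next page is appended without any visible break.
    page = 0
    while True:
        for item in fetch_page(page):
            yield item
        page += 1

def fetch_page(page):
    # Stand-in backend: each "page" holds ten placeholder posts.
    return [f"post {page * 10 + i}" for i in range(10)]

# The client simply keeps pulling; 25 items here, but it could go on forever.
for post in itertools.islice(bottomless_feed(fetch_page), 25):
    print(post)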
These psychological sleights of hand aren't all new, of course. Advertisers, studio producers, magicians and salesmen, to name just a few who have traditionally made their living through persuasion, have long relied on vulnerabilities in the human psyche. It should be no secret, for example, that casinos, which have no clocks or outside windows, are designed to eliminate external stopping cues. Or that slot machines are programmed with gamblers' dopamine receptors in mind.
What's going on today is different, experts say, for the simple reason that we've never had technology like smartphones before. Unlike TV commercials or billboards, these pocket-size supercomputers are with us constantly–at work, in bed, at our kids' games. And unlike older media, which were essentially passive, our smartphones actively surveil us; they track our steps, log our GPS locations, note nearby devices and file away our clicks, likes and comments. Those digital bread crumbs amass over time, equipping tech companies with staggeringly precise information about each of us. Product designers then use that data, alongside machine-learning tools, to study how we react to certain interfaces, rewards and inputs, and to identify patterns in our behaviors. That allows them to predict, fairly precisely, Brown says, how we'll react in the future.
When the game company Zynga first launched FarmVille, the popular social-network game, in 2009, its designers closely studied how it was being played, says Gabe Zichermann, one of the pre-eminent experts on gamification. They analyzed users' data to determine, for example, how long it took players to run out of patience or to beat a level, he says. They then tweaked the interface to reflect those findings, making it alternately more frustrating–so that users would pay to skip a level–or rewarding, doling out freebies to users in danger of giving up.
That same process still happens today, only now–nearly a decade later–it's much more precise, Zichermann says. As cloud computing has gotten cheaper and machine-learning tools have gotten easier to use, even small tech companies can now analyze their users' behavior at a granular level. That allows them to identify not only which factors affect engagement by a typical user but also which factors most affect each user personally. In other words, apps today are often highly adaptive, deploying a unique set of rewards and feedback for each user, based on what has worked in the past. "It's pretty incredible how effective it can be," Combs says. "If you're setting the consequences of someone's behaviors and you tie those consequences to learning machines so that the consequences shift according to individual markers, you really do have exquisite control over shaping that individual's behavior–over how he spends his time."
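What Combs is describing is, at bottom, a reinforcement-learning loop. The sketch below is a deliberately simplified, hypothetical illustration, an epsilon-greedy bandit rather than any company's production system: for each user it tracks which kind of feedback has produced the longest sessions in the past, mostly serves that, and occasionally experiments with the alternatives.

import random
from collections import defaultdict

class PerUserRewardSelector:
    """Hypothetical epsilon-greedy sketch: per user, favor the feedback
    type that has historically produced the longest sessions."""

    FEEDBACK_TYPES = ["badge", "like_burst", "progress_bar", "streak_reminder"]

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        # Running average engagement (minutes) per (user, feedback type).
        self.value = defaultdict(lambda: {f: 0.0 for f in self.FEEDBACK_TYPES})
        self.counts = defaultdict(lambda: {f: 0 for f in self.FEEDBACK_TYPES})

    def choose(self, user_id):
        # Mostly exploit what has worked for this user; sometimes explore.
        if random.random() < self.epsilon:
            return random.choice(self.FEEDBACK_TYPES)
        values = self.value[user_id]
        return max(values, key=values.get)

    def update(self, user_id, feedback, engagement_minutes):
        # Incremental running-average update for this user/feedback pair.
        self.counts[user_id][feedback] += 1
        n = self.counts[user_id][feedback]
        old = self.value[user_id][feedback]
        self.value[user_id][feedback] = old + (engagement_minutes - old) / n

Here choose(user_id) picks which feedback to show, and update(user_id, feedback, minutes) feeds the observed session length back in, so over time the choices drift toward whatever keeps that particular user on the app longest.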
Fogg, the behavior scientist who helped pioneer the study of computer-based persuasive technology in the mid-'90s, warned in his 1998 Stanford doctoral thesis about potential ethical problems arising from this work. But over the years, many outsiders have come to regard his research as something of an instruction manual for how to create addictive apps. One former student, Instagram founder Mike Krieger, came up with his design for the famously sticky photo-sharing app while enrolled in Fogg's program. Another young entrepreneur who took Fogg's professional training, Nir Eyal, the author of Hooked: How to Build Habit-Forming Products, now runs an annual Habit Summit in San Francisco. Participants, who pay up to $1,700 for the three-day conference, are given "practical steps" on how to design "habit-forming products."
This idea–that app developers are competing with one another to create ever more addictive products–isn't so much an embarrassing secret as a starting point, says Zichermann. "People joke all the time about trying to build a 'diaper product,'" he says. "The idea is, 'Make something so addictive, they don't even want to get up to pee.'" On an earnings call in April last year, Netflix CEO Reed Hastings told investors that his company's main competition was customers' sleep. "When you watch a show from Netflix and you get addicted to it, you stay up late at night," he said, adding, "We're competing with sleep, on the margin. And so, it's a very large pool of time."
Brown and Combs have no problem with persuasive technology. It's their bread and butter. Their objection is to how it is being used primarily by tech giants to boost eyeball time. What's good for corporate profits is not necessarily what's good for human health or society, Brown points out, adding, "And that's where this conversation has to start."
Boundless Mind's business model is to develop new versions of the same persuasive tools, combined with machine learning, that big tech firms already use–and then to sell them to nonprofits and companies promoting education, health or social welfare. Boundless Mind charges nonprofits and new startups $99 a month; larger companies' fees begin at $499 a month. One of Boundless Mind's new clients, AppliedVR, provides virtual-reality therapy to patients with chronic and acute pain at 190 hospitals nationwide. One of its products is a virtual game that helps patients manage post-operative pain by challenging them to shoot little red balls at bears in a virtual world. In order for the therapy to work, explained AppliedVR co-founder Matthew Stoudt, patients must ultimately find the interface addictive, at least on some level, so that "they want to keep coming back." Boundless Mind's technology will help AppliedVR learn from patients' past behavior in order to personalize the interface, making it uniquely rewarding for each user.
Before Boundless Mind takes on a new customer, Brown and Combs debate with their team the ethics of how a potential client will use its tools. They posted six questions on a blog–including "Are the actions that drive value for the publisher the same actions that drive value for the user?"–in part, they said, to keep themselves accountable. Last year, they turned down a horse-betting website as a client, a decision that Esther Dyson, a New York–based venture capitalist who funds Boundless Mind, applauded. While the company is still small–it has a valuation of just about $5 million–Dyson and the other investors are willing to leave cash on the table if it means "doing the right thing," she said. "They need to resist the temptation to use their technology for the wrong purposes."
That's easier said than done. As I was on the phone with Dyson, Facebook's beleaguered CEO, Mark Zuckerberg, posted his first public statement since news broke that the data firm Cambridge Analytica had used millions of people's personal Facebook data, without their permission, to aid the 2016 Trump campaign. (The firm was said initially to have lifted 50 million profiles; Facebook has since revised the number to 87 million.) On the surface, the Facebook scandal is about the exploitation of personal data. But viewed another way, it's about the intentional, aggressive cultivation and harvesting of that data through persuasive technology.
Since its launch a decade and a half ago, Facebook has been second to none at exploiting eyeball time. By 2016, users were spending an average of 50 minutes per day, a staggering portion of the average person's leisure time, on three of its platforms: Facebook, Instagram and Messenger. With each interaction, users have left digital traces of themselves, which together create detailed portraits of who they are, as individuals. Facebook sells that microtargeted access to advertisers, political campaigns and others.
In recent months, as Facebook has come under pressure, Zuckerberg has said the company's focus has changed. "I view our responsibility as not just building services that people like but building services that are good for people and good for society as well," he said April 10 during his Senate testimony. A Facebook spokesperson did not respond to questions from TIME about the use of persuasive technology on the platform. But she highlighted a number of recent tweaks to the company's carefully tuned interface, which have had a profound effect on our behavior. The company, which employs a bevy of social psychologists, now demotes viral videos, for example, a change that cut the time users spent on the site by roughly 50 million hours per day in the last quarter of 2017.
When it comes to Facebook's impact on our lives, those tweaks may be a good thing. But they don't solve the basic problem–that tech firms, both big and small, now wield extraordinary control over what billions of us see and hear, how we communicate and ultimately how we behave. Andrew Przybylski, a psychologist at Oxford University, notes that we don't yet have robust, peer-reviewed studies on whether screen time is linked to depression or how children's brains are affected by tech. That's largely because those vast databases of user behavior owned by big tech firms like Facebook are proprietary. "They own the richest social database that has ever existed, and we can't touch it," Przybylski says. "We spend many hours engaged with them, but all the analysis of us happens behind closed doors."
Brown and Combs hope that Boundless Mind provides something of a counterbalance. By developing persuasive-technology tools "and then releasing them to everybody," Brown says, they want to level the playing field. "Otherwise, it's just trapped inside Facebook, and only they get to use it." As virtual reality becomes more ubiquitous, persuasive technologies will become increasingly precise, personalized and effective, Brown and Combs say. While many see that imminent future as something of a dystopia, they see it as promising. It means we have the power to engineer the society we want, Brown says. "We have the power to control our minds," he said. "That's quite a gift."
Correction: The original version of this story misstated the name of an Oxford University psychologist. He is Andrew Przybylski, not Adam Przybylski.
This appears in the April 23, 2018 issue of TIME.
