Staying Human in the Age of Big Tech
Tyler Berg, Former Senior Product Designer
How can we use design to prioritize human interaction and spend less time on our screens?
Our lives are completely saturated with technology. From the moment we wake up, we’re drawn to pick up our phones and check our email, Slack, or social media, or read some article a friend sent us. After work, the urge to watch that extra episode on Netflix kicks in, especially when the next episode plays automatically. Alexa’s there to turn the lights off when you go to bed. When you leave for work the next day, your smart home camera watches the house while you’re gone. You’re reading this article on some form of technology.
For better or for worse, our society is driven by smartphone technology.
While the explosion of technology centered around smartphones has been exciting overall, the degree to which it has reached into our lives is staggering. Just last month, Amazon executive Dave Limp introduced a new line of Alexa products, including a “smart ring” and “smart eyeglasses.” One particular line stuck with me as he described Amazon’s new Alexa rollouts:
“There is no reason not to put them everywhere in your house.”
There’s a lot to unpack in that statement. Why would we possibly need an Alexa device in every room of our house? Alexa devices are always “on.” Even if you don’t use a “wake word” to summon the device, an Alexa can still turn on and start speaking to you unsolicited. And because Amazon uses cloud computing to process the words you say to Alexa, your conversations can be streamed in real time to a team of analysts at Amazon who manually examine the data in its non-anonymized state, traceable back to your device. In short: more Alexa devices means even less privacy.
Are we really using design and technology for the right reasons? Just because we have the capability to create something, does that mean we should? As technology companies continue to pursue maximum efficiency and maximum profit, there’s something to be said for subtlety. More isn’t always better.
User experience design is a young industry, having come to life in the early 1990s as personal computers became commonplace. UX pioneers like Don Norman (co-founder of Nielsen Norman Group) sought to define how people physically interact with digital interfaces, just as websites and applications were becoming woven into the heartbeat of modern society. As the industry has evolved over the past 25 years, and technology companies have taken ever larger roles in our lives, it’s important that we evaluate their continued development with a critical eye and start to define what overreach is, and isn’t. We need to be cognizant of the impact of our design decisions, and it’s critical that we approach our work with a human-first mindset.
Designers walk a delicate ethical line.
But human-first design doesn’t just mean designing things of value for other people. Designers have already thought – and have written plenty – about the responsibility of designing things that have value. And depending on who you ask, “value” can mean a lot of different things.
The distinction here is that we need to approach design with more nuance. More relatability. More restraint. More empathy. Have we thought critically enough about building limits into addictive digital products? How do we ensure that users won’t spend more time in an application than they do interacting with real people? More importantly, were they even given a choice between technology consumption and human interaction to begin with?
Let’s take a look at the concept of infinite scroll in the Facebook, Twitter, and Instagram news feeds. When users check their feed, there is no end in sight: they can keep scrolling for as long as new or relevant posts keep appearing. And for folks who follow hundreds or thousands of people, that supply is effectively endless. Now think about someone ordering a glass of wine at dinner. Unless you’re continually brought more wine, there’s an obstacle to continuous consumption. When faced with that impediment, you can decide whether to order another. Or you can stop there. You are given that choice.
In 2005, Cornell researcher Brian Wansink led an experiment on overeating in which one group of participants was given bowls of soup that automatically refilled as they ate, while the other group had normal bowls and had to ask if they wanted more soup. As you might expect, the group with the automatically refilling bowls consumed more soup.
The same principle applies to news feeds, where infinite scroll makes the consumption decision for you. There’s no obstacle to continuous consumption: the amount of time users spend in the application is the priority, not their own decision-making at an individual level.
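To make the mechanism concrete, here’s a minimal sketch of how infinite scroll is typically wired up in a web feed, using the browser’s standard IntersectionObserver API. The /api/feed endpoint, the response shape, and the element IDs are hypothetical, not any particular platform’s actual code:

```typescript
// Minimal infinite-scroll sketch. The /api/feed endpoint, response shape,
// and element IDs are hypothetical; IntersectionObserver is a standard browser API.
const feed = document.getElementById("feed")!;
const sentinel = document.getElementById("sentinel")!; // invisible marker below the last post
let cursor: string | null = null;

async function loadMorePosts(): Promise<void> {
  const res = await fetch(`/api/feed?cursor=${cursor ?? ""}`);
  const { posts, nextCursor } = await res.json();
  for (const post of posts) {
    const item = document.createElement("article");
    item.textContent = post.text;
    feed.appendChild(item);
  }
  cursor = nextCursor;
}

// Whenever the sentinel scrolls into view, silently fetch the next page.
// There is no stopping point: the user never has to ask for another 'bowl.'
const observer = new IntersectionObserver((entries) => {
  if (entries.some((e) => e.isIntersecting)) {
    void loadMorePosts();
  }
});
observer.observe(sentinel);
```

What’s striking is how small the humane alternative would be: swap the observer for an explicit “Load more” button, or stop after a fixed number of pages with a “You’re all caught up” message, and the consumption decision returns to the reader.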
The same can be seen on Netflix, where the next episode plays automatically – increasing your likelihood of bingeing and watching that fourth consecutive episode of Mindhunter late into the night. What if TV streaming applications had a feature that let you know when the next episode would end, and helped make sure you still got to sleep on time? (Shout out to my coworker Dave, who came up with this idea.)
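To sketch Dave’s idea concretely: before autoplaying, the player could check whether the next episode would run past a bedtime the viewer has set. Everything here – the Episode shape, the bedtime setting, the play and promptViewer helpers – is a hypothetical illustration, not any streaming service’s actual API:

```typescript
// Hypothetical bedtime-aware autoplay check; not any real streaming API.
interface Episode {
  title: string;
  durationMinutes: number;
}

// Stand-ins for the player's real playback and prompt mechanisms.
function play(episode: Episode): void {
  console.log(`Now playing: ${episode.title}`);
}
function promptViewer(message: string): void {
  console.log(message);
}

function minutesUntil(bedtime: Date, now: Date = new Date()): number {
  return (bedtime.getTime() - now.getTime()) / 60_000;
}

// Example: with an 11:00 pm bedtime, a 52-minute episode ending at 10:30 pm
// would trigger the prompt instead of autoplaying.
function onEpisodeEnd(next: Episode, bedtime: Date): void {
  if (minutesUntil(bedtime) >= next.durationMinutes) {
    play(next); // the viewer can finish before bedtime
  } else {
    // Surface the choice instead of making it for the viewer.
    promptViewer(`"${next.title}" would end after your bedtime. Keep watching, or call it a night?`);
  }
}
```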
What if we introduced limits on this kind of addictive technology use? We should shift the focus to put decision-making in the hands of users, and make sure they are fully informed about how much they decide to consume. Applications should be transparent with users about their usage habits. Apple is making progress here with the “Screen Time” feature in newer iOS releases. f.lux offers something similar, adapting your screen’s color temperature to the time of day based on your sleep schedule. We should not be trying to create dependencies; rather, we should be doing everything we can to alleviate them.
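In the same spirit as Screen Time, a web application could track its own usage and be honest with the user about it. A minimal sketch, assuming a browser environment with localStorage; the storage key and the 30-minute limit are assumptions for illustration:

```typescript
// Sketch of a self-imposed daily usage limit, in the spirit of Screen Time.
// The storage key and the 30-minute limit are assumptions for illustration.
const LIMIT_MINUTES = 30;
const todayKey = `usage:${new Date().toISOString().slice(0, 10)}`;

let sessionStart = Date.now();
let warned = false;

// Fold the elapsed session time into today's running total.
function recordUsage(): number {
  const total =
    Number(localStorage.getItem(todayKey) ?? "0") +
    (Date.now() - sessionStart) / 60_000;
  localStorage.setItem(todayKey, String(total));
  sessionStart = Date.now();
  return total;
}

// Once a minute, check the total. When the limit is reached, say so plainly
// and only once: inform the user, then let them decide what to do.
setInterval(() => {
  const used = recordUsage();
  if (used >= LIMIT_MINUTES && !warned) {
    warned = true;
    alert(`You've spent ${Math.round(used)} minutes here today. Time for a break?`);
  }
}, 60_000);
```

The point isn’t the implementation; it’s that the application reports honestly on itself and leaves the decision with the person using it.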
This is undoubtedly an ethical grey area. My coworkers Brandon and Curt have written some insightful stuff on this, too – and inspire a lot of our discussions on tech ethics at Viget.
Avoiding tech for tech’s sake.
When adapting technology to new mediums, it’s important that we keep two things in mind.
One: Are we creating technology for technology’s sake?
In other words, what problem is this design solving? More specifically, would we be better off without this? Just because we can design something doesn’t mean we should.
And two: How does this impact current societal habits?
Are we being thoughtful about merging people’s current physical traditions with new digital ones?
Just take a look at the “Smart Fridge”: a normal refrigerator with a digital tablet on the front, which is an odd fit to say the least. Viget has actually explored this idea of “digitizing” the refrigerator, and we ultimately concluded that the refrigerator’s surface shouldn’t be digitized. The refrigerator is a messy collection of the kinds of things that will always be physical: souvenir magnets, cherished photographs, children’s drawings, printed recipes. There’s a charm to these things, and nothing is gained by removing it. Ultimately, a digital interface’s form should be directly related to its function – an object’s UI should closely correlate to its purpose and environment.
What does this mean for agencies?
Agencies work on a sizable number of projects in the course of a year. As technology and design continue to grow and adapt, it’s more important now than ever that we approach all of our work with a discerning eye – and ensure that we’re creating work that is thoughtful and cognizant of the nuances of everyday life.
Valuing a balance between physical and digital realities – embracing, rather than attempting to replace, the meaningful traditions of physical life – will help us avoid negligence in design. It will help us create the work that brings the most impact and meaning to people.