Posted on May 24, 2019 by Matt Stewart
It’s natural for consumers to be curious about what’s going on behind the scenes of the internet machine, especially these days, when data breaches, violations of user privacy, and social media infiltration campaigns are daily news.
We’re being bombarded by news reports detailing how websites, government agencies, and mega-corporations are buying and selling our personal information and leveraging what they find to push their own agendas.
There’s a lot for users to be worried about. And to make matters worse, even seemingly innocent design choices are beginning to take their toll.
Dark Days for the Internet
For example, consider Facebook’s recently released “dark” update to its messenger app layout. This option supposedly reduces eye strain while browsing, making it easier for users to stay on-site longer and consume more content without getting tired.
A design choice like this seems natural from Facebook’s perspective, and on the surface it looks harmless enough. But then you consider the many studies linking extensive social media use to increased depression and loneliness, and Facebook’s push to keep our eyes on the screen takes on a darker tone.
This is the dilemma designers face in these troubled days.
By all accounts, it’s not Facebook’s job to manage our mental health. Its job is to provide a service that we can choose to partake in. But where’s the line between natural design choices that play into a healthy, ethical UX and underhanded design choices that manipulate us into engagement?
Captology and Persuasive Technology
Let’s be clear, there are right ways and wrong ways to structure your UX design (which we’ll review in a minute). But before we get into the dark side, we need to take a step back and put these issues into context.
Every discussion of UX comes back to captology: the study of computers as persuasive technologies. This area of study encompasses every aspect of website design, including research, analysis, the design itself, and any ethical implications that follow from your choices.
It’s not a new concept by any means; it’s been studied for years, harkening back to the Apple/Microsoft arms race of the ’80s and ’90s, when each new device was purported to be better, faster, and easier to use than the one before it.
But we live in an advanced age of personal computing, and captology discussions have evolved in turn. These days, UX research goes beyond basic psychological elements to identify specific behaviors, traits, and characteristics that encourage engagement.
There’s nothing wrong with this concept in theory. The problems come when companies take the above considerations too far—effectively turning to the “dark side” of UX design.
Dirty Design and the Dark UX
An entire subset of UX design exists that honest, legitimate companies may never have heard of: the dark UX.
In the simplest terms, a “dark UX” refers to web design choices that intentionally mislead, deceive, or otherwise manipulate users into taking specific actions. Common examples include:
- Automatic opt-ins when downloading software (for example, when software installers default to installing a third-party Bing or Google toolbar unless you uncheck the box)
- Forced subscription continuity, often seen in streaming services that automatically re-subscribe you without your knowledge
- Burying “unsubscribe” links in email communications, usually in small print or low contrast fonts
- Surreptitious posting—usually done on social media sites that automatically post or publish messages from users or apps without the account owner’s consent
- In the wake of the EU’s new data regulation, making it easy for users to “agree” to data collection policies via pop-up window—but offering no information on how to opt-out
Note that these actions aren’t illegal, but they’re certainly unethical. They’re examples of how companies deceive users to get the marketing results they want rather than earning the consumer’s business through honest UX design.
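To make the first pattern concrete, here’s a minimal sketch of the opt-in trick. The names and the simulated installer are hypothetical; the point is that the only difference between the dark and honest versions is the default value of a single flag, which decides what happens to a user who clicks through without reading:

```typescript
interface InstallOptions {
  installToolbar: boolean; // the bundled third-party extra
}

// Dark pattern: the extra toolbar is pre-selected,
// so the user must notice the box and opt OUT.
function darkDefaults(): InstallOptions {
  return { installToolbar: true };
}

// Honest design: nothing extra happens unless the user opts IN.
function honestDefaults(): InstallOptions {
  return { installToolbar: false };
}

// Simulate a user who clicks "Next" without reading the fine print:
// whatever the defaults say is what they get.
function clickThrough(defaults: InstallOptions): InstallOptions {
  return defaults;
}

console.log(clickThrough(darkDefaults()).installToolbar);   // true  — toolbar installed anyway
console.log(clickThrough(honestDefaults()).installToolbar); // false — nothing installed
```

The deception isn’t in any single line of code; it’s in choosing a default that exploits inattention.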
Malicious or Merely Negligent?
These nasty design choices are obviously underhanded and easy to condemn. But what about more innocent examples, such as the Facebook dark layout? Facebook’s layout certainly isn’t a type of malicious design, but it may have unintended consequences (which Facebook may or may not care about).
Having users stay on-site for longer is good for Facebook, but it may be bad for us—and in these cases, it’s hard to tell who’s to blame. After all, nobody’s holding a gun to users’ heads and forcing them to stay logged in. But when you hear stories about how these social giants have control rooms full of people analyzing data and uncovering new ways to keep users glued to the screen, it’s clear that we’re being influenced in ways we have no way of understanding.
So, moving forward with the supposition that more screen time = bad, are these actions malicious? Or are they merely negligent?
As we weren’t on the development team that made these decisions, there’s no way of saying for sure what Facebook’s intentions were. You could make a case for both. But in truth, the answer doesn’t matter; that viewpoint was expressed well by Stanford’s David Rosenthal in a 2016 Pew Research Center study:
“The digital economy is based upon competition to consume humans’ attention […] Economies of scale and network effects have placed control of these tools in a very small number of exceptionally powerful companies. These companies are driven by the need to consume more and more of the available attention to maximise profit.”
Ethical Design in the Attention Economy
In short, businesses tend to act like businesses, putting their bottom lines above all else. Their goal isn’t to make you happy—it’s to create a network that you can’t help but use. And as Facebook, LinkedIn, and countless websites have shown, many are more than willing to sell their customers out for increased profit.
This is the key distinction between dark design and honest design. There’s nothing wrong with researching your users, learning about their behaviors, and designing a website that they’ll love. It’s how you collect this data and what you do with it afterward that really matters. Don’t try to trick users into providing information, and don’t think of them as mere tools for data collection.
As long as you’re being transparent with how your site operates and treating your users as human beings, you’ll have no problem creating an effective, honest UX design that people love.
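One small, practical way to act on that transparency is to audit your design choices against objective measures. As a sketch, the WCAG 2.1 contrast-ratio formula can flag exactly the kind of low-contrast text used to bury unsubscribe links; the function names here are my own, but the math follows the WCAG definition of relative luminance:

```typescript
// Convert one 8-bit sRGB channel to its linearized value (WCAG 2.1).
function channel(c8: number): number {
  const c = c8 / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an sRGB color.
function luminance(r: number, g: number, b: number): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio between foreground and background, from 1:1 to 21:1.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = luminance(...fg);
  const l2 = luminance(...bg);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black text on white easily clears the WCAG AA threshold of 4.5:1
// for body text...
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
// ...while a pale gray like #777 on white falls just short.
console.log(contrastRatio([119, 119, 119], [255, 255, 255]) >= 4.5); // false
```

A check like this won’t make a design ethical on its own, but it does turn “is this link deliberately hard to read?” from a judgment call into a number you can act on.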