Dark Design Patterns

2021-11-16

Originally posted on Medium by M. O. Fouda


In this article [1], the author talks about what he describes as digital sin: the harmful ways we use digital products today. The author, however, steers clear of placing the blame for this sin entirely on us (the users) and instead puts it on the designers and business owners (such as Facebook founder Mark Zuckerberg).

It’s no secret that a lot of products today are designed for maximum user engagement, especially products that rely on their users as part of their revenue. An example is Facebook, where user data drives the ad business that generates most of the company’s revenue. The problem, as the author puts it, is that the company (represented by its executives and designers) designs its products to take advantage of psychological vulnerabilities built into every human, essentially turning its users into addicts of its own products. This is harmful not only because it alters human behaviour in negative ways in the short term (the author describes people sleeping with their phones under their pillows despite knowing full well about the radiation the phone emits, or a father not paying attention to his daughter because he spends too much time on Buzzfeed), but also because, as the author warns, the real harm comes from the long-term chemical and neuroplastic effects these deliberate design choices will have on humanity.

The article offers the names of multiple researchers and books that discuss the problem in depth. One of the books I have read (outside of this course) that looks at how these design choices are made from the inside is Hooked: How to Build Habit-Forming Products. It describes the tactics psychologists and designers use to “hook” the brain on a product, essentially trapping the user in one hook after another in order to maximize engagement [3]. Some of these design choices include the never-ending scroll (implemented widely across most apps today, such as Instagram, TikTok, and YouTube), swipe-to-refresh, and even the choice of colours for text and buttons. This is also touched on in Nerdwriter1’s video on dark patterns in design [2]. In the video he gives examples of how Amazon makes it almost impossible for users to close their accounts, how some companies blend their unsubscribe links into surrounding text to lower the chance of people unsubscribing, and how the developer of a game called TwoDots uses colour to trick users into giving the app money.

Honestly, this topic is very close to my heart as someone working in the field of UX, and I always wonder: can we not use the same “trickery” to do good in the world? Can we not design apps that (with the consent of the user) trick them into reaching their goals? Or into exercising? Or pulling themselves out of poverty? Or increasing voter turnout? Or increasing student engagement in education?

Like, given the boredom and inattention we all suffer as students in the education system, how about we use the same tactics from Hooked [3] to trap students in an endless loop of learning?

But can design patterns really make a difference in real-world issues?

There are things we know can make voting a better user experience and thus increase turnout. For example, letting people know easily when important elections are happening (as opposed to only making big news about municipal and federal elections), or giving people time off to go vote (make it a vacation!) so they don’t have to worry about work or school, commuting between towns, and making it to the polls on time. And so on.

On the opposite side, you have strategies applied by the Republican Party down south and many tactics applied in my home country of Egypt, such as making it nearly impossible to vote by mail or forcing people to wait in line for hours in order to suppress votes (in the States); or imprisoning people who vote for certain candidates while paying others to vote for a specific candidate, encouraging voter fraud (in Egypt).

  • I think this raises a really good point on transparency and trickery.
  • Steve Jobs (not a saint himself!) is often quoted as saying, about data privacy, something along the lines of: “Ask people! Ask people every time. And if they say no, then it is a no.”
  • I believe there are tricks that can manipulate the brain, but I don’t think they need to be sneaky or hidden from the users. An example is giving yourself a treat after exercising to make the habit stick, “tricking” your brain into feeling rewarded every time you exercise. User experience and design techniques stem from the study of psychology and human behaviour, so as long as you transparently tell your users what the design is and how it works, and they make a conscious decision to use it, there is no problem.
  • I almost think of the role of the designer as a “mentor” or a “coach” who teaches someone just getting into exercise all the good moves and their effects on the body, but who leaves that person free to exercise differently or walk away without being forced to stay. However, that person can transparently agree that, in their workouts, they are the type who needs to be “pushed” or tricked into finishing (such as being told “keep pushing, only 1 min left” when in reality there are 10 minutes left).

References:

[1] The reckoning: Silicon Valley confronts its digital sins — The Globe and Mail
[2] How Dark Patterns Trick You Online — YouTube
[3] Hooked: How to Build Habit-Forming Products