A normal psychological terror?
In search of growth opportunities, technology companies are increasingly turning to big data: collecting user data in order to place more accurate advertisements and thus earn more money per user. But the data has to be obtained first, and at this point a company must make a decision: how far are we willing to go for this data? Do we take only what is freely given, or do we take whatever we can get, by whatever means? Many companies opt for the latter and thereby commit to dark patterns: user-interface design techniques that trick users into surrendering their data. They look like a brilliant move, but they carry consequences that are not immediately apparent. These consequences are primarily financial, but they can also do lasting damage to the customer relationship. It turns out that companies relying on dark patterns can multiply their revenue, but in doing so they antagonize their customers and prompt governments to act. Some of these customers file lawsuits that can result in fines running into the millions, and governments are drafting regulations to ban companies from using dark patterns. These negative consequences are negligible at the moment, but they must not be ignored entirely, or lasting damage to the company could follow in the future.
Anyone who uses digital services and the Internet today has inevitably stumbled upon terms such as “cookies”, “relevant advertising” and “personalized experiences”, often alongside “the right to privacy” and “data protection”. Most people also know that these terms mean a user’s behavior is analyzed and processed in order to display advertisements. Which data is being collected, however, is far less obvious. Few people realize that the decision about how this data is used does not lie entirely with the users: companies that earn their money by evaluating the data make this decision for users long in advance. Of course, everyone has the option and the right to refuse data collection and processing, but only a fraction of people do so. The reason for this behavior lies primarily in so-called dark patterns. These describe a methodology, a kind of user-experience design, for steering users in a predefined direction. The means by which this happens, why this particular approach is problematic, and what the possible effects of dark patterns are, is the subject of this work.
The term “dark patterns” refers to design practices used to trick users. They are called dark because they are often invisible to the user. The term was first used by Harry Brignull in 2010. To understand them, one first has to analyze the basis of the user’s decision. The relevant concept here is “nudging”, which today most commonly stands for influenced user decisions. Nudging comes from behavioral psychology and describes how customers or users can be moved from a rational decision to an irrational one by exploiting their psychological tendencies. This usually happens without the customer or user noticing. In the digital space, known as digital nudging, it comes into play whenever users of a piece of software or a website have to make a decision or an assessment. It builds on so-called user-centered design, which specifies methods and guidelines for designing an interface so as to predict the users’ intentions as well as possible and incorporate their wishes and needs into the design in the best possible way.
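The mechanics of such a nudge can be sketched in code. The following is a minimal, hypothetical Python model (not taken from the paper) of an obstruction-style dark pattern: a consent banner where accepting takes one click and rejecting takes several, so the pre-selected path of least resistance is acceptance. All names and click counts are illustrative assumptions.

```python
# Hypothetical sketch of an obstruction dark pattern in a consent dialog.
# The labels, click counts and helper names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ConsentOption:
    label: str
    clicks_required: int   # interaction cost the design imposes on this choice
    is_default: bool       # whether the choice is pre-selected (default effect)

def build_dark_banner() -> list[ConsentOption]:
    """A deliberately asymmetric dialog: one-click accept, buried reject."""
    return [
        ConsentOption("Accept all", clicks_required=1, is_default=True),
        # "Reject" is hidden behind a settings page plus a confirmation step.
        ConsentOption("Reject all", clicks_required=3, is_default=False),
    ]

def path_of_least_resistance(options: list[ConsentOption]) -> ConsentOption:
    """Models the nudge: most users take the cheapest, pre-selected path."""
    return min(options, key=lambda o: (o.clicks_required, not o.is_default))

banner = build_dark_banner()
print(path_of_least_resistance(banner).label)  # prints "Accept all"
```

The point of the sketch is that no single element is deceptive on its own; the steering effect comes from the asymmetry between the two choices, which is exactly what user-centered design knowledge makes possible.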
Not all dark patterns are created equal. Nevertheless, every pattern has the potential to produce a bad user interface or a bad user experience. The effects of dark patterns on a company are complex and touch different areas, which this research treats together. To quantify the impact, several data points around well-known companies such as Facebook and Google were used, supported by statistics, scientific reports and articles. The goal was to find out whether relying on dark patterns pays off for a company or whether the negative consequences predominate (…). Read more in the paper (download below the article).
“[Trust is a] state of willingness to accept vulnerability based upon positive expectations of the intentions or behavior of the trustee in matters important to the trustor.”
First of all, it has to be noted that different standards apply in the digital space than in physical customer relationships. A company must bear in mind that, as a digitally operating service provider, it is “faceless” and hard for the customer to grasp. Trust is therefore built on a weaker foundation and is more easily destroyed. If dark patterns are used, the customer’s trust in the company is continuously eroded. Since a purchase decision is significantly influenced by how customers perceive a company, falling sales and other damage follow. These effects interact with the aforementioned loss aversion: if users already feel that they are no longer in control of their data or privacy, the effects of eroded trust could be significantly mitigated. There also appears to be a connection between how customers perceive a company’s charitable efforts and their trust in its products: the “better” a company is perceived to be, the more its products are trusted, which may soften the impact of dark patterns. Another factor is the frustration customers build up from the unexpected consequences of their decisions. Some wonder whether they did something wrong or whether the system is to blame, and this frustration can become associated with the product or the company. In the long run, the company’s image suffers and customers are no longer willing to keep using its products, or they switch to a new one. For a company pursuing an up-selling strategy, this would be a critical obstacle. All of this contributes to the steady decline of trust in companies in the digital sector: in 2020, trust in these companies stood at 75%, down from 79% the year before. [See (Edelman, 2020)]
The erosion of trust and the associated treatment of users and customers inevitably raises ethical questions. Organizations and governments are increasingly discussing to what extent dark-pattern practices are ethically justifiable. As a result, dark patterns are becoming more widely recognized as a problem by the public and by governments around the globe. One route governments and organizations take is to lay down rules and laws for the design of interfaces and experiences that restrict companies’ room to maneuver in this area. Conversely, this means that companies using highly misleading dark patterns may be standing in their own way. While dark patterns seem very profitable at first glance, the repercussions from organizations and governments should not be underestimated.
The effects of dark patterns are complex and differ from company to company. Nevertheless, the result of this investigation is clear: the effects of dark patterns are primarily financial. By exploiting human behavior, dark patterns enable a company to collect more data and generate many times the revenue it would generate without them. At the time of writing, the financial benefits of dark patterns far outweigh the disadvantages: lawsuits are rare, fines are low relative to sales, and changes in the law are slow or non-existent, all of which favors their use. The loss of customer trust is the only thing that could harm a company in the long run. At the same time, many customers see this behavior as a necessary evil, since from their point of view the advantages outweigh the risks. In the future, dark patterns will become ever more expensive and more publicly visible, which could lead to serious customer losses and damage to a company’s image. Companies should therefore start thinking today about removing dark patterns from their practices.
Download Full Version [German]
Scientific work prepared for: New Design University / St.Pölten, Austria
Author & Intellectual Property (C) of David Breyer / September 2021