
Persuasive Technology: Ethical Perspectives


Visual by Vednt


Have you ever noticed design choices that push you toward a particular option? For example, the free plan rendered in dull, small type while the paid options are bold and prominent? This is a dark pattern called misdirection: a tactic that deliberately focuses your attention on one thing in order to distract it from another. This is persuasive technology in action.

Persuasive technology is any interactive computing system designed to persuade people, that is, to change their behaviours and/or attitudes (1). It is widely used today in many different contexts, and its use in mobile devices in particular is tremendously powerful. The ethical implications are vast, because persuasion and technology are each morally contested domains in their own right, with their own histories of philosophical debate.

Persuading others stands on somewhat “uneasy ethical ground” (2), in the sense that it complicates the distribution of responsibility between two parties. Ethicists have argued for decades over how moral responsibility is shared between the persuader and the persuadee. When you convince someone to do something, say, to commit a crime, on whom does the responsibility fall? Do you share responsibility for the harm caused? Do both parties bear it equally? These are crucial questions from both a moral and a legal point of view. One work among many that discusses the ethics of persuasion is Andersen’s Persuasion: Theory and Practice.

When you add technology to the mix, things get predictably more complicated. For one thing, technology dramatically amplifies the persuading power of an individual or corporation, even with a device as simple as a megaphone (3). Though persuasive technology has existed for centuries, when incorporated into the UX of mobile applications it becomes powerful on an unforeseen scale. McNamee discusses this in more detail in his book Zucked: Waking Up to the Facebook Catastrophe: “User count and usage exploded, as did the impact of persuasive technologies, enabling widespread addiction” (4). Ethical consideration of persuasive technology has never been more relevant, as its influencing power grows by leaps and bounds along with technological advancement and innovation. This article, therefore, serves as a short introduction to the ethics of persuasive technology.

The first important work in this domain, and still the standard in the field, is Stanford professor B.J. Fogg’s book Persuasive Technology. The book details the twentieth-century psychological techniques and manipulation tools that programmers incorporate into social media. For example, Fogg describes how designers borrow techniques from slot machines, such as variable rewards, whose unpredictability most users find irresistible because it is tied to the human social need for approval and validation (1).
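The variable-rewards mechanism can be made concrete with a small simulation. The sketch below is not taken from Fogg’s book; it is an illustrative model (function names and parameters are the author’s own) contrasting a variable-ratio schedule, where each action pays off unpredictably as in a slot machine or a social feed refresh, with a fixed schedule that pays out at the same average rate.

```python
import random

def variable_ratio_rewards(n_actions: int, p: float, rng: random.Random) -> list[bool]:
    """Variable-ratio schedule: each action (e.g. a pull-to-refresh)
    yields a reward with probability p, so the user can never predict
    which action will pay off."""
    return [rng.random() < p for _ in range(n_actions)]

def fixed_ratio_rewards(n_actions: int, every: int) -> list[bool]:
    """Fixed schedule for contrast: every `every`-th action is rewarded."""
    return [(i + 1) % every == 0 for i in range(n_actions)]

rng = random.Random(0)
variable = variable_ratio_rewards(100, 0.25, rng)
fixed = fixed_ratio_rewards(100, 4)

# Both schedules reward at roughly the same average rate (about 1 in 4),
# but only the variable one is unpredictable -- the property behavioural
# psychology links to compulsive checking.
print(sum(variable), sum(fixed))
```

The design point is that the two schedules are economically equivalent for the platform, yet the unpredictable one is far more habit-forming for the user.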

Then there are the ethicists Berdichevsky and Neuenschwander, who focus directly on the ethics of persuasive technology, outlining eight principles for ethical design and redesign (3). They examine persuasion itself, along with the motivation and intent behind a design, and discuss principles such as dual privacy, disclosure, and accuracy. Inspired by John Rawls’ A Theory of Justice, they also propose a golden principle: the creators of a persuasive technology should never attempt to persuade others of anything they themselves would not consent to be persuaded of. Their principles are laid out in their very accessible article Toward an Ethics of Persuasive Technology.

UX designer Harry Brignull coined the term “dark pattern” to call out this practice of persuasive technology as unethical. In 2010 he registered a website to chronicle and call out real-life examples from across the internet. His site defines dark patterns as “tricks used in websites and apps that make you do things that you didn't mean to, like buying or signing up for something” (5).

Tristan Harris, a former design ethicist at Google, is another activist working on this issue. In his recent testimony before Congress (6) and his illuminating TED Talk, he compared social media companies to private firms building nuclear plants that are melting down while leaving the public to supply its own hazmat suits. “The responsibility (of harm) has to be with the builders,” says Harris, who also coined the phrase “brain hacking” for this attempt to capitalise on weaknesses in human psychology.

Though the majority of design ethicists agree that moral responsibility for this harm rests with the creators and owners of the technology, they also argue that the consumers and users, the persuadees, ultimately bear the practical responsibility for protecting themselves. This is especially true in the present scenario, where the technology’s stakeholders have aims and intentions of their own and no strict or enforceable legal obligation to safeguard the welfare of users and consumers. In the words of Edward Snowden, “Technology doesn’t have a Hippocratic oath” (7).


References

  1. Fogg, B.J. (2003). Persuasive Technology: Using Computers to Change What We Think and Do. Morgan Kaufmann Publishers.

  2. Andersen, K. (1971). Persuasion: Theory and Practice. Allyn and Bacon.

  3. Berdichevsky, D., & Neuenschwander, E. (1999). Toward an ethics of persuasive technology. Communications of the ACM, 42(5), 51-58.

  4. McNamee, R. (2019). Zucked: Waking Up to the Facebook Catastrophe. Random House.

  5. Jaiswal, A. (2018, August 15). Dark patterns in UX: How designers should be responsible for their actions. Medium.

  6. Harris, T. (2020, January 8). Congressional hearing: statement plus highlights.

  7. Snowden, E. (2020). Permanent Record. Picador.

  8. Page, R.E., & Kray, C. (2010). Ethics and Persuasive Technology: An Exploratory Study in the Context of Healthy Living.

  9. Nyström, T., & Stibe, A. (2020). When Persuasive Technology Gets Dark? EMCIS.

  10. Botes, M. (2022). Autonomy and the social dilemma of online manipulative behavior. AI and Ethics. https://doi.org/10.1007/s43681-022-00157-5

  11. Guerini, M., Pianesi, F., & Stock, O. (2015). Is It Morally Acceptable for a System to Lie to Persuade Me? AAAI Workshop: AI and Ethics.



Simran is a philosophy student and language teacher, with a Bachelor’s in Philosophy from Fergusson College, Pune and a Master’s in Cognitive Science from IIT Gandhinagar. Their recent research work in feminist phenomenology culminated in a thesis titled "The Impact of Sexual Violence on Female Embodiment". Simran’s current research interests are feminist phenomenology, feminist linguistics, ethics, and queer philosophy, with a focus on sexual violence and sexuality.



