e-Ethics Developer - Lesson 2

 


 

Lesson #2: Ethics of Persuasive Technology

Introduction
Persuasion comes in a variety of forms, and its intent may be innocent or more nefarious. Throughout this lesson, we'll examine the ethics of persuasive technology. [3] Note that captology is the study of persuasive technology. Take a moment to examine the graphic below:

[Graphic: the convergence of ethics, persuasion, and technology]

To explore ethical issues in persuasive technology in a compelling way, educator Daniel Berdichevsky and researcher Erik Neuenschwander solicit "dark side" designs from students, that is, applications of persuasive technology with troubling ethical implications. Consider the following example:

[Image: an example of a "dark side" design]

Persuasion can be viewed as an intentional effort to change attitudes or behavior, and technology as the directed application of abstract ideas. Passive technological media, such as megaphones and billboards, facilitate persuasion without altering their pattern of interaction in response to the characteristics or actions of the persuaded party. Active persuasive technologies, by contrast, are to some degree under the control of, or at least responsive to, the persuaded party. Or they should be: the appearance of control may suffice for creating a persuasive experience, but if that appearance is not backed up by reality, the designer runs afoul of our accuracy principle.



Persuasive Technology [3]
Analyzing the ethics of any specific persuasive act requires a systematic approach, beginning with a breakdown of standard persuasion and eventually encompassing persuasive technologies as needed. To support this approach, we propose a framework for analyzing acts of persuasion according to their motivations, methods, and outcomes, both intended and unintended. Our framework begins with the basic relationship of a persuader and a person being persuaded (see Figure 3).

Figure 3. Framework for evaluating the ethics of a persuasive interaction in a traditional persuasive context.

In these instances, while a persuader may still use technologies like megaphones and billboards to convey the persuasive message, we ultimately look only at the two parties when distributing responsibility. Our focus here, however, is on technologies created with the intention to persuade, sometimes called "endogenously" persuasive technologies. They differ from technological media in that they are actively persuasive intermediaries between the persuader and the persuaded person. Unlike billboards, they interact dynamically with the objects of their persuasion (see Figure 4).

Figure 4. Framework for evaluating the ethics of the more complex interaction of persuader, persuasive technology, and the party or parties being persuaded.

The framework of motivations, methods, and outcomes can be applied in evaluating the ethics of a persuasive act in either case, but the introduction of an actively persuasive technology requires attributing the motivations to the designer and the persuasive intent to the technology separately. Oddly, but meaningfully, the technology is both a method and the direct executor of persuasive methods. We must also consider whether technology alters, or even shares in, the distribution of responsibility for the intent, methods, and end result of a persuasive act. To date, computers have demonstrated neither the capacity to form their own intentions nor the ability to make their own choices. By any sensible standard, therefore, they are not free moral agents [4]. So when computers make serious mistakes, their programmers are often the first people blamed, users second, and Mother Nature third; the computer itself gets off easy. Similarly, we cannot realistically attribute responsibility for the persuasive act to the persuasive technology. The major difference between persuasion through active technology and persuasion through traditional person-to-person interaction is not motivation: the persuader still intends to persuade, presumably for the same reason and toward the same outcome, and the persuaded person still undertakes or experiences that outcome. Our ethical scrutiny of persuasive technology must therefore center on the methods employed in the persuasion itself.
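To make this framework concrete for developers, here is a minimal sketch in Python that models a persuasive act along these lines. All of the names (Persuader, PersuasiveTechnology, PersuasiveAct) are our own illustrative assumptions, not part of the cited framework; the sketch simply encodes the point that motivations belong to the human designer, methods to the technology, and responsibility to people rather than machines.

# Illustrative sketch only; these types are hypothetical names invented
# for this lesson, not an API from the cited paper.
from dataclasses import dataclass

@dataclass
class Persuader:
    name: str
    motivations: list[str]   # why this person wants to persuade at all

@dataclass
class PersuasiveTechnology:
    name: str
    methods: list[str]       # e.g. "flattery", "conditioning", "simulation"

@dataclass
class PersuasiveAct:
    persuader: Persuader                  # the moral agent
    technology: PersuasiveTechnology      # executes methods; not a moral agent
    intended_outcomes: list[str]
    predictable_outcomes: list[str]       # the designer answers for these as well

    def responsible_party(self) -> str:
        # Computers form no intentions and make no choices, so
        # responsibility always traces back to people.
        return self.persuader.name

Note how responsible_party never returns the technology: the data model itself reflects the claim that computers are not free moral agents.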



Motivations vs. Intent
The motivations underlying a persuasive act and the intent of that persuasive act are not the same. To uncover a persuader's motivation, ask yourself: why is this person persuading me, or someone else, to do something? The methods employed by persuasive technology are similar to those employed by persuasive people. For example, humans can persuade through flattery; recent research has shown that computers can flatter too [3]. Humans can also persuade through conditioning, rewarding desirable behaviors and punishing undesirable ones. So can computers. However, technologies embed these methods in a new and compelling context. For instance, while humans can persuade through basic role playing, computers permit simulations of unprecedented complexity, realism, and persuasive potential (see Khaslavsky et al.'s "Understanding the Seductive Experience" in this issue). Such differences are why we must reconsider the ethical implications of traditional persuasive methods when technologies, rather than humans, carry them out. We must also evaluate the ultimate outcome of the persuasive act: the ethics of what the persuaded person is persuaded to do or think. If something is unethical for you to do of your own volition, it is equally unethical when someone persuades you to do it. We assert that designers of persuasive technologies should be held responsible only for reasonably predictable outcomes (see Figure 5).

[Figure 5: Persuasive Technology]
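As a deliberately simple illustration of conditioning as a software method, the hypothetical function below rewards a desired behavior with flattery and withholds the reward otherwise. The function name, threshold, and messages are invented for this lesson, not drawn from the cited research.

# Hypothetical sketch of software-driven conditioning: reward the desired
# behavior, withhold the reward otherwise. Real products use far more
# elaborate schedules (streaks, variable rewards), which is exactly why
# the method deserves ethical scrutiny.
def conditioning_nudge(days_active_in_row: int) -> str:
    if days_active_in_row >= 3:
        # Reward: flattery reinforcing the behavior the designer wants.
        return f"Amazing! A {days_active_in_row}-day streak -- you're a natural."
    # Punishment by omission: no praise, plus a prompt to return.
    return "Your streak ended. Come back today to start a new one."

print(conditioning_nudge(5))
print(conditioning_nudge(0))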

How sensitive should designers and programmers be to the ethics of the persuasive technology they design? Given this framework of motivations, methods, and outcomes, we can establish the first three of our principles for future persuasive-software design (a minimal checklist sketch follows the list):

  • The intended outcome of any persuasive technology should never be one that would be deemed unethical if the persuasion were undertaken without the technology or if the outcome occurred independent of persuasion.
  • The motivations behind the creation of a persuasive technology should never be such that they would be deemed unethical if they led to more traditional persuasion.
  • The creators of a persuasive technology must consider, contend with, and assume responsibility for all reasonably predictable outcomes of its use.
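Here is a minimal sketch, assuming the three principles above, of how a team might encode them as a design-review checklist. The question wording and the function review_persuasive_design are our own illustration, not part of the source framework.

# Hypothetical design-review checklist derived from the three principles.
DESIGN_REVIEW_QUESTIONS = [
    "Would the intended outcome be ethical if achieved without the "
    "technology, or if it occurred without any persuasion at all?",
    "Would our motivations be ethical if they led to traditional, "
    "person-to-person persuasion instead?",
    "Have we listed every reasonably predictable outcome of use, "
    "and are we prepared to answer for each one?",
]

def review_persuasive_design(answers: list[bool]) -> bool:
    """Return True only if every principle is satisfied ('yes' to each)."""
    if len(answers) != len(DESIGN_REVIEW_QUESTIONS):
        raise ValueError("one answer per review question is required")
    return all(answers)

# Example: a design that fails the predictable-outcomes check.
print(review_persuasive_design([True, True, False]))  # False -> redesign

The design choice is deliberate: a single "no" fails the review, mirroring the principles' insistence that each condition must hold independently.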

We also propose two principles for the design of persuasive technologies with regard to the collection and handling of information about users. The first: the creators of a persuasive technology must ensure it regards the privacy of users with at least as much respect as they regard their own. The second complements it: we must consider whether personal information is shared with a third party or used exclusively by the technology itself (one way a developer might encode these appears in the sketch at the end of this lesson). Our intent throughout has been to persuade you to think critically about ethical issues at the convergence of technology and persuasion. Apply the framework to this act of persuasion itself: to analyze its motivation, put aside for a moment the intended outcome and ask, why intend that outcome in the first place? Why should we want to persuade you? Because in the near future, persuasive technologies will be commonplace, affecting many people in many ways. By initiating this dialog in the professional community and by proposing a first set of principles for persuasive design efforts, we hope to steer the field in a positive direction from the outset.
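As one hedged way to operationalize the two privacy principles, the sketch below models a default-deny data policy: information stays with the technology unless the user explicitly consents to sharing. The class and field names are illustrative assumptions for this lesson, not prescribed by the source.

# Sketch of the privacy principles as a default-deny data policy.
from dataclasses import dataclass

@dataclass(frozen=True)
class UserDataPolicy:
    used_only_by_this_technology: bool = True   # no third-party sharing by default
    user_consented_to_sharing: bool = False

    def may_share_with_third_party(self) -> bool:
        # Treat users' data with at least the respect you would want for
        # your own: sharing requires explicit, informed consent.
        return (not self.used_only_by_this_technology) and self.user_consented_to_sharing

policy = UserDataPolicy()
assert policy.may_share_with_third_party() is False  # default: keep it private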


Next Lesson