Duplex, Ickiness and the Moral Obligations of Bots

Or, why at least one former philosopher doesn’t see anything wrong with virtual assistants making phone calls.

jjosephmiller
5 min read · May 10, 2018
[Image: Bots pretending to be human? Clearly villains.]

Google’s new Duplex seems to have struck a nerve. Twitter has been awash in charges that the whole project is unethical.

The backstory: Duplex is an AI that can make phone calls for you. But it’s how it makes those calls that is raising eyebrows. See, Duplex very carefully imitates real human speech—including adding verbal tics like “um” and “uh.” Here’s a sample:

I’ll admit that that’s pretty uncanny. But I’m not sure I see how it’s unethical.

The Case Against Duplex

Most versions of the argument that Duplex is unethical take the following form:

  1. Duplex sounds like a human.
  2. Duplex does not reveal that it is a bot.
  3. Using a bot that fails to disclose that it is a bot is unethical.
  4. Therefore, Duplex is unethical.

Obviously most of the weight of the argument rests on (3). So let’s unpack that a little bit. I’m going to start by making the claim a little more formal.

P: In any transaction T between transactors A and B, A is morally required to disclose fact F about A to B.

The question, then, is for what values of F is P true?

Let’s start by agreeing that P cannot possibly be true for all values of F. For starters, ought implies can. That is, to say that I ought to do X implies that I can in fact do X. I can’t, for example, have a personal obligation to lift every single person out of poverty because it’s not possible for me to lift every single person out of poverty.

That’s important because there is an infinite number of facts that are true of transactor A. Those range from really big things (I’m the spouse of Caroline, the father of Matthew) to fairly trivial ones (my ring finger is longer than my index finger).

Since I cannot possibly disclose an infinite number of facts, it follows that there are at least some things that I’m not ethically required to disclose when I conduct a transaction. The question, then, is which camp being a bot falls into.

The Universe of Facts

As I said, I cannot disclose all values of F in a transaction because the set of all F’s is infinite. Think about how many things are true about me. I’m 6'2". I’m right-handed. I’m a white, heterosexual, cis-gendered male (that’s four facts right there). I’m wearing a Star Trek shirt and my favorite pair of jeans. I have poor eyesight. My office is 4.5 miles from my wife’s office on foot, and 6.6 miles by car.

None of these facts seem like things I’m obligated to disclose when I call to make my hair appointment.

But some things clearly are required. For example, suppose that I know that I cannot be at the salon until 4:15 but book a 4pm appointment anyway. Or suppose that I book a simple haircut knowing that once I arrive I plan to ask to have my hair dyed. Or suppose that I am actually booking an appointment for my 8-year-old nephew but give the stylist my name and information.

These do seem like facts it would be wrong to withhold. And they are wrong to withhold because they have a direct impact on the transaction itself. They are, in other words, facts that would change the nature of the agreement.

So let’s try again.

P1: In any transaction T between transactors A and B, A is morally required to disclose fact F about A to B if and only if knowing F would change B’s willingness to complete T.

Moral Permissibility

That’s starting to seem like a solid ethical principle. But let’s throw in one more wrinkle.

Now, as I said, I’m straight, white, cis-gendered, and male. I don’t exactly face a lot of discrimination. But I grew up a West Virginia hillbilly. Years of living outside WV (along with a speech coach who helped me get over crippling stage fright) have pretty much erased my Appalachian accent.

Suppose that my stylist had a grudge against people from Appalachia. And suppose that grudge were such that if I disclosed that I grew up in West Virginia, she would refuse to make an appointment. (So we’re clear, none of that is true; my stylist is a perfectly lovely person.)

In this case, my failure to disclose where I grew up violates P1: knowing that I’m a native West Virginian would change my stylist’s willingness to complete the transaction. But I don’t think any of us really believes that my failing to tell her is unethical. That’s true even if she could (correctly) complain that my speech coaching made it difficult for her to tell that I’m a West Virginian.

And that’s largely because we think that my place of origin isn’t a morally permissible reason for refusing to conduct a transaction with me.

One last time.

P2: In any transaction T between transactors A and B, A is morally required to disclose fact F about A to B if and only if knowing F would change B’s willingness to complete T and F is a morally relevant consideration for completing T.
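Stacked up in rough logical shorthand (my own notation: Req(A, F) for “A must disclose F to B in T,” Will(F) for “knowing F would change B’s willingness to complete T,” and Rel(F, T) for “F is morally relevant to completing T”), the progression looks like this:

$$
\begin{aligned}
P\colon\quad & \mathrm{Req}(A, F)\ \text{for every fact}\ F \\
P_1\colon\quad & \mathrm{Req}(A, F) \leftrightarrow \mathrm{Will}(F) \\
P_2\colon\quad & \mathrm{Req}(A, F) \leftrightarrow \mathrm{Will}(F) \wedge \mathrm{Rel}(F, T)
\end{aligned}
$$

P1 swaps P’s blanket requirement for a condition on B’s willingness; P2 conjoins the moral-relevance filter that the West Virginia case demands.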

Bots and the Ick Factor

It’s hard to see how disclosing “I am a bot” would be morally required under P2. The bothood (or lack thereof) of my assistant is completely irrelevant to the transaction. And “I don’t like people who use bot assistants” doesn’t seem, on its face, like a morally permissible reason to refuse to cut my hair.
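If it helps to see the bookkeeping, here’s a toy sketch that just applies P1 and P2 to the examples above. The true/false tags on each fact are my stipulations from the cases as I’ve described them, and all the names are mine; nothing here is computed about the world:

```python
# Toy model of P1 and P2. The two boolean tags per fact are stipulated
# from the examples in this post; they are inputs, not conclusions.

def p1_requires(changes_willingness: bool) -> bool:
    """P1: disclosure is required iff knowing F would change
    B's willingness to complete T."""
    return changes_willingness

def p2_requires(changes_willingness: bool, morally_relevant: bool) -> bool:
    """P2: disclosure is required iff knowing F would change B's
    willingness AND F is morally relevant to completing T."""
    return changes_willingness and morally_relevant

# Each fact is tagged (changes B's willingness?, morally relevant to T?).
cases = {
    "I can't arrive until 4:15": (True, True),
    "I grew up in West Virginia (Appalachia-averse stylist)": (True, False),
    "I am a bot (bot-averse stylist)": (True, False),
}

for fact, (changes_will, relevant) in cases.items():
    print(f"{fact}: P1 requires disclosure: {p1_requires(changes_will)}; "
          f"P2 requires disclosure: {p2_requires(changes_will, relevant)}")
```

P1 demands disclosure in all three cases, which gets the West Virginia case wrong. P2 demands only the 4:15 disclosure, which is exactly the verdict defended above.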

In other words, there is no ethical requirement for the Duplex bot to disclose its bot-ness. Or, perhaps more modestly, the argument that Duplex is unethical because it is deceptive seems not to work.

There is one other kind of argument I have seen offered. Let’s call it The Argument from Ickiness.

[Image: The boo/hurrah theory of morality. No, that’s a real thing.]

The Argument from Ickiness (philosophers call it emotivism) is a dangerously terrible argument. And while I’m picking on Matt here, it’s only because he’s given the most succinct version of something I’ve seen in a bunch of places.

What’s so bad about it?

Well, let’s start with the fact that it literally suggests using the punchline of a Stephen Colbert joke as our moral compass.

Remember when we made fun of the President for deciding things with his gut? Those were good times.

The bigger problem is that we often have wildly different emotional reactions to things. Plenty of people claim to experience revulsion at things like same-sex relationships, transgenderism, abortion, and even miscegenation. I grew up in a world of Baptist fundamentalists. Trust me when I say that people really do feel those emotions.

Those gut feelings are just wrong.

Our gut feelings can tell us a lot about our culture. But they are a notoriously unreliable guide to ethics.

I don’t disagree with Matt in one respect, though. Google probably could use some ethicists on staff. But in this case, I think those ethicists would have said that Duplex is perfectly fine.
