“Cheating on Machines: Consumers Cheat More on Machines (vs. Humans) Due to Reduced Guilt” by TaeWoo Kim, Hye Jin Lee and Adam Duhachek

Building on the burgeoning literature on consumer dishonesty, the current research examines whether consumers’ dishonest behaviors are amplified when interacting with non-human artificial agents. We hypothesize that consumers act more dishonestly when interacting with an artificial (vs. human) agent due to a reduction in the anticipatory guilt associated with engaging in unethical behavior. In support of this hypothesis, we found that consumers are more likely to cheat on artificial (vs. human) agents when an economic incentive for cheating is provided (e.g., when providing false reasons for a product return leads to the return being free) and that this effect is mediated by the reduced anticipatory guilt associated with the dishonest behavior (Study 1). Extending this finding, we hypothesized that consumers would be more likely to disclose guilt-laden personal experiences to an artificial (vs. human) agent, as disclosure to an artificial agent feels less emotionally taxing (e.g., less embarrassing). In support of this hypothesis, we found that consumers are more likely to reveal guilt-laden experiences in general episodic recall tasks (Study 2) and in marketing-related contexts (e.g., when a consumption experience made them feel guilty) (Study 3) when they believed they were interacting with an artificial (vs. human) agent. We reconcile these two seemingly contradictory findings – that individuals are more honest about their guilt-laden experiences when interacting with artificial agents, yet more likely to be dishonest to artificial agents when given an economic incentive – by attributing both observations to the attenuation of guilt when interacting with artificial agents.
