New study from Pega shows consumers don’t trust artificial intelligence
The recently published report points to the lack of empathy exhibited by AI as a driving force behind mistrust.
A new report from Pega examining consumer attitudes toward artificial intelligence indicates that despite the growing use of AI technologies, consumers lack an understanding of how they can benefit from AI and are more likely to trust a real person to help make decisions.
“Our study found that only 25% of consumers would trust a decision made by an AI system over that of a person regarding their qualification for a bank loan,” said Dr. Rob Walker, vice president, decisioning and analytics at Pega. “Consumers likely prefer speaking to people because they have a greater degree of trust in them and believe it’s possible to influence the decision, when that’s far from the case. What’s needed is the ability for AI systems to help companies make ethical decisions. To use the same example, in addition to a bank following regulatory processes before making an offer of a loan to an individual, it should also be able to determine whether or not it’s the right thing to do ethically.”
As a result of the survey’s findings, Pega announced the launch of its Customer Empathy Advisor, an AI tool that seeks to incorporate empathy and ethical decision-making into the framework of AI technologies.
Why we should care
AI and machine learning technologies are becoming increasingly familiar to digital marketers and consumers, but the lack of trust can have a negative impact on the customer’s digital experience and, ultimately, your brand’s reputation. Trust and transparency continue to be highly prioritized by consumers and touted by platforms, and AI’s imprint on the martech landscape is only going to grow. Marketers who can harness the capabilities of AI while integrating empathetic qualities and human-like characteristics should expect to see success with consumers.
As AI becomes more accessible to digital marketers, it will become more widely used by consumers. Thanks to tools like display ads featuring AI-enabled chatbots, digital marketers have more opportunities to drive personalized interactions. For teams considering implementing AI, giving customers the opportunity to choose whether they prefer an AI-based or human-driven experience to resolve their inquiry could be a step toward building trust and transparency.
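For teams weighing that opt-in approach, the idea can be sketched in a few lines. This is a minimal, hypothetical illustration, not Pega's implementation: the `Inquiry` class, the `prefers_human` flag, and the queue names are all invented for the example.

```python
from dataclasses import dataclass


@dataclass
class Inquiry:
    """A customer support inquiry (illustrative structure)."""
    customer_id: str
    question: str
    prefers_human: bool  # captured from an explicit opt-in prompt


def route_inquiry(inquiry: Inquiry) -> str:
    """Route the inquiry to a human queue or an AI assistant
    based on the customer's stated preference."""
    if inquiry.prefers_human:
        return "human_agent_queue"
    return "ai_chatbot"


# A customer who opted for a human is routed to the agent queue;
# one who accepted the AI experience goes to the chatbot.
print(route_inquiry(Inquiry("c-123", "Loan status?", prefers_human=True)))
print(route_inquiry(Inquiry("c-456", "Reset my password", prefers_human=False)))
```

The key design choice is simply that the preference is asked for up front and respected, rather than inferred, which is what makes the routing transparent to the customer.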
More on the news
- There are serious trust issues with AI: Fewer than half (40%) of respondents agreed that AI has the potential to improve the customer service of businesses they interact with, and fewer than one-third (30%) felt comfortable with businesses using AI to interact with them. Only 9% said they were “very comfortable” with the idea.
- Consumers are cynical about the companies they do business with: 68% of respondents said that organizations have an obligation to do what is morally right for the customer, beyond what is legally required. Despite this, 65% of respondents don’t trust that companies have their best interests at heart, raising significant questions about how much trust they place in the technology businesses use to interact with them.
- Many believe that AI is unable to make unbiased decisions: Over half (53%) of respondents said it’s possible for AI to show bias in the way it makes decisions.
- People still prefer the human touch: 70% of respondents prefer speaking to a human rather than an AI system or a chatbot when dealing with customer service, and 69% agree they would be more inclined to tell the truth to a human than to an AI system.
- Most believe that AI lacks morality and empathy: Only 12% of consumers agreed that AI can tell the difference between good and evil, and over half (56%) don’t believe it is possible to develop machines that behave morally. Only 12% believe they have ever interacted with a machine that has shown empathy.
Opinions expressed in this article are those of the guest author and not necessarily those of MarTech.