Over 70% of consumers trust generative AI despite its ethical concerns: Study
Consumers across age groups are highly satisfied with, and show trust in, the content generated by generative artificial intelligence tools, a study has found. On average, 73% of users said they trust content written by ChatGPT and other generative AI tools. This high trust is prevalent across age groups, led by Gen X-ers (74%).
The study was conducted by information technology company Capgemini, which surveyed 10,000 consumers over the age of 18 in 13 countries, including the US, UK, Australia, and Japan.
Major use cases for these tools are conversational chatbots, gaming, and search, but users have begun applying them to more everyday activities. Many users also indicated an interest in using these platforms for applications like financial planning, receiving a medical diagnosis, and advice on personal relationships or careers.
In the retail sector especially, buyers are inclined to factor in generative AI-based recommendations for new products and services. For about 70% of consumers, generative AI is a go-to source of recommendations. Over 65% are open to using generative AI for customised recommendations, particularly on fashion and home décor.
However, amid the excitement surrounding generative AI's capabilities, users may unwittingly expose themselves to risks, largely due to a lack of awareness. Close to 50% of consumers remain unconcerned about the use of generative AI to create fake news stories, and over 65% of surveyed users are not particularly worried about phishing attacks either. AI ethics also seem to rank low on the list of priorities: only a third of the interviewees said they are worried about copyright issues, and a little over one-fourth said they care about the use of generative AI algorithms to copy competitors' designs and formulas.
Notably, companies like Samsung, Apple, Intel, and Meta have either banned internal use of ChatGPT or developed their own versions of such tools to prevent confidential information from leaking outside the organisation.