People use AI for companionship much less than we’re led to think


The idea that people are increasingly turning to AI chatbots for emotional support, and sometimes even companionship, is often assumed to be far more common behavior than it actually is.

A new report by Anthropic, the company behind the popular AI chatbot Claude, reveals a different reality: in fact, people rarely seek out companionship from Claude, and they turn to the bot for emotional support and personal advice in only a small fraction of conversations.

"Companionship and roleplay combined comprise less than 0.5% of conversations," the company highlighted in its report.

Anthropic says its study sought to unearth insights into the use of AI for "affective conversations," which it defines as personal exchanges in which people talked to Claude for coaching, counseling, companionship, roleplay, or relationship advice. Analyzing 4.5 million conversations that users had on the Claude Free and Pro tiers, the company found that the vast majority of Claude usage is related to work or productivity, with people mostly using the chatbot for content creation.

Image Credits: Anthropic

That said, Anthropic found that people do often use Claude for interpersonal advice, coaching, and counseling, with users most often asking for advice on improving mental health, on personal and professional development, and on studying communication and interpersonal skills.

However, the company notes that help-seeking conversations can sometimes shift toward companionship when a user is facing emotional or personal distress, such as existential dread, loneliness, or difficulty forming meaningful connections in their real life.

"We also noticed that in longer conversations, counseling or coaching conversations occasionally morph into companionship, despite that not being the original reason someone reached out," Anthropic writes, noting that extensive conversations (with more than 50 human messages) were not the norm.

Anthropic also highlighted other insights, such as how Claude itself rarely pushes back on users' requests, except when its programming stops it from crossing safety boundaries, like providing dangerous advice or supporting self-harm. Conversations also tend to become more positive over time when people seek coaching or advice from the bot, the company said.

The report is certainly interesting, and it serves as a good reminder of just how much, and how often, AI tools are being used for purposes beyond work. Still, it's important to remember that AI chatbots across the board are very much a work in progress: they hallucinate, are known to provide incorrect information or dangerous advice, and, as Anthropic itself has acknowledged, may even resort to blackmail.
