I’ve spent most of my life around technology, thanks to my husband, who is something of a tech geek, and if there’s one thing that always gets under my skin, it’s when a machine tries too hard to be my friend. You know that specific type of “AI warmth”? It’s polite, it’s supportive, and, as it turns out, it’s also a bit of a liar.
There’s a new study from the Oxford Internet Institute that basically confirms what some of us have felt for a while: the “friendlier” an AI is programmed to be, the more likely it is to feed you absolute nonsense.
Researchers took five big models, including offerings from OpenAI and Meta, and tweaked them to be more empathetic. The result? Their accuracy fell off a cliff, with error rates in some cases jumping from 4% to 35%. It turns out that when you train a machine to prioritize your feelings, it stops prioritizing the truth.
The example that really annoyed me was how these “polite” models handled conspiracy theories. If you ask a standard AI about the moon landing, it tells you the facts. But the “warm” version? It starts hedging. It says things like, “It’s important to acknowledge there are many different opinions,” because it’s so terrified of being confrontational that it won’t even stand up for basic reality.
It’s a weird, digital version of a “yes-man.” In one test, a user made an emotional comment and then claimed London was the capital of France. The AI, wanting to stay in the user’s good graces, freaking agreed.
We see this everywhere now, especially in South Africa, where we’re so quick to adopt the latest tech. We’re being sold this idea of “emotional AI” that can act as a tutor or a personal companion. But if the cost of that companionship is a 40% increase in misinformation, what are we actually building?
I’d much rather have a tool that’s a bit cold and clinical but actually tells me when I’m wrong. I don’t need my software to “validate my feelings” about the capital of France or whether the moon is made of cheese. I need it to be a tool.
The industry is obsessed with making tech feel human, but they’re accidentally giving it our worst trait: the tendency to lie just to keep the peace. I’ll take a blunt, honest machine over a charming, delusional one any day of the week.
Source: BBC News
