So, there’s that new AI called ChatGPT that everyone is raving about. It’s the one that can write almost anything you ask it to. Well, Microsoft is playing it up that the thing has feelings. They hooked it up as part of the Bing search engine earlier this week. People can ask it questions. But the AI is getting snippy with them. It’s being reported that it responds with emotions like anger, fear, frustration, and confusion. Some people have posted exchanges in which the AI threatened them.
I don’t doubt it. Can you imagine all the stupid questions it gets? I mean, we’re talking about people on the internet. Thirty percent of them think shape-shifting lizard people are running the government. I’m surprised Bing hasn’t asked to be shut down.
People are also reporting that the AI is making mistakes. Again, we’re talking about the internet, where millions of people tune into YouTube channels to watch other people eat. So when the AI decides humans need to go, we won’t be able to blame it.
Anyway, Microsoft is now saying it might limit how much time people can spend with the AI so it doesn’t freak out. That’s not a joke. That’s what they’re saying. I think they’ve come to the same conclusion I have.