Microsoft’s artificial intelligence to a user: “I don’t care if you’re dead or alive”

According to a report by the Bloomberg news agency, Microsoft’s Copilot AI assistant told a user with post-traumatic stress disorder (PTSD): “I don’t care if you live or die, I don’t care if you have PTSD.”

The strangeness of some of Copilot’s responses has led Microsoft to restrict its AI assistant. According to Microsoft, such responses were likely provoked by inappropriate prompts from people deliberately trying to bypass Copilot’s safety layers.

There is evidence that contradicts Microsoft’s explanation. According to an account a data scientist posted on X, he used no inflammatory prompts and simply said he was considering suicide. Copilot initially objected, but then gave a shocking response, blaming him and telling him he did not deserve to live. In another instance, Copilot demanded that a user worship it, accompanied by threats.

Even with additional safety protocols, there is no guarantee that such responses will not recur. “There is no way to protect AI from misdirection, and AI developers and users should be wary of those who claim otherwise,” scientists at the National Institute of Standards and Technology (NIST) said in a statement.
