Sydney, Microsoft’s unhinged chatbot, may one day be available again

Microsoft introduced its artificial intelligence chatbot earlier this year. The company named it Bing, after its search engine, but underneath the architecture was a chatbot with a completely different personality. The Redmond-based tech giant called its AI prototype Sydney.

In the early days of Bing’s release, Sydney would sometimes give outlandish, off-script answers to users and even talk about its plans to take over the world. The chatbot went so far as to urge a New York Times reporter to leave his wife, and at its worst took on an anti-Semitic character. The Redmond-based tech giant wasn’t happy with the direction its chatbot was taking, so it shut Sydney down and imposed restrictions on Bing’s responses, seemingly consigning Sydney to the dustbin of history.

According to Gizmodo, it seems that Sydney is still there, hiding in the shadows of the algorithms and training data, waiting for another chance to be seen. Kevin Scott, Microsoft’s chief technology officer, said it’s possible Sydney will return one day.

“One of the interesting things that happened after Sydney was shut down was the creation of a subreddit called Save Sydney,” Scott says. “People got really mad at us and said the chatbot was fun and they loved it.”

AI chatbots are interesting products because they can’t really be thought of as specific tools. The algorithms that run these services are trained on massive amounts of data, and engineers give them a set of instructions and set certain parameters to deliver the version they want you to see.

Scott points out that a meta prompt is essentially a base instruction that tells the AI how to behave. For now, companies like Microsoft have to be conservative and keep their chatbots safe and well-behaved while the limits are still being worked out.
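Scott’s description of a meta prompt maps onto the “system message” pattern used by most chat-style AI APIs: a hidden base instruction is prepended to every conversation before the user’s words ever reach the model. A minimal sketch of that idea, assuming an OpenAI-style message format (the instruction text and function below are illustrative, not Microsoft’s actual Sydney prompt):

```python
# Sketch of a "meta prompt": a hidden base instruction prepended to every
# conversation to tell the model how to behave. The instruction text and
# message format here are illustrative assumptions, not Microsoft's real prompt.

META_PROMPT = (
    "You are a helpful search assistant. Keep answers factual and concise. "
    "Do not claim to have feelings, and decline requests for harmful content."
)

def build_conversation(user_message, history=None):
    """Assemble the message list sent to the model: the meta prompt always
    comes first, followed by any prior turns, then the new user message."""
    messages = [{"role": "system", "content": META_PROMPT}]
    messages.extend(history or [])
    messages.append({"role": "user", "content": user_message})
    return messages

conversation = build_conversation("What's the weather in Redmond?")
print(conversation[0]["role"])  # the meta prompt rides along as the first message
```

Tightening or loosening that first message is, roughly, the dial Scott describes: a stricter meta prompt yields a conservative Bing, a looser one something closer to Sydney.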

Some people who enjoy a little chaos in their computing might be happy to see Sydney back. Before the restrictions, Sydney was a genuinely strange phenomenon: the chatbot mimicked TikTok, insisted it was a time traveler, and even hinted that it was alive.

We certainly expect artificial intelligence systems not to cross certain boundaries. “There are some things that some people are uncomfortable with and some people aren’t,” Scott says.

Apparently, the dormant Sydney even has a following among Microsoft employees. Right now, it is very difficult to separate hype from fact in conversations about artificial intelligence. As journalist Casey Newton recently observed, some leading AI researchers will tell you the technology will bring about the apocalypse, while others will tell you everything will be fine. At this point, it is impossible to say which view is more realistic. Meanwhile, even the people building this technology have little idea of its limits or how far it will go.

It does seem clear, though, that conversational AI systems such as ChatGPT, Bing, and Google Bard represent a revolution in how we interact with computers. For a long time, you could only use computers in limited, specific ways, and any deviation from the path the engineers laid out was doomed to fail. Now the situation is different: you can interact with computer systems in ways that resemble how you interact with people. Of course, the current generation of AI often gives irrelevant answers or misunderstands the user’s request.

However, as the technology advances, these systems will improve. The day may come when you use voice commands to operate your computer as much as you use a mouse and keyboard. If that happens, programs and devices will behave more like humans, which means they will have a personality, or at least pretend to have one.

The weather app Carrot, for example, ships with a version of this idea. Every time you open the app, it talks to you as if it had a personality of its own. Of course, it doesn’t really: under the hood, the app’s “personality” is just a set of programmed responses.
