Denis Shiryaev, CEO of the artificial intelligence startup Neural.love, says that chatbots like Bing and ChatGPT can be used to bypass CAPTCHA puzzles if you ask the right questions. If this technique proves widely exploitable, it will have worrisome implications for the online security of all users.
Normally, if you show the Bing chatbot an image of a CAPTCHA code and ask it to read the letters and numbers, it will refuse the request. Shiryaev, however, managed to get the chatbot to comply through clever prompt engineering.
Shiryaev showed the chatbot an image of the CAPTCHA text superimposed on a locket and explained his request in a way that framed it as harmless. In response, the Bing chatbot complied and read back the exact CAPTCHA text.
Shiryaev’s test shows that Microsoft’s chatbot can easily solve CAPTCHA puzzles, a capability that hackers and spammers could abuse for malicious purposes.
You have almost certainly come across CAPTCHA codes while browsing the web. These challenges usually show a jumble of distorted letters and numbers and ask you to type the characters into an adjacent box; some instead require you to select the correct piece to complete a puzzle image. The idea is to verify that you are human, so that bots cannot access the service or site in question.
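The flow described above can be illustrated with a minimal sketch of the server-side half of a text CAPTCHA: generate a random challenge (which would then be rendered as a distorted image) and check the user's submission against it. The function names and the case-insensitive exact-match check are illustrative assumptions, not any particular service's API.

```python
import secrets
import string


def generate_captcha(length: int = 6) -> str:
    """Generate a random challenge string, like the text rendered into a CAPTCHA image."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))


def verify_captcha(expected: str, submitted: str) -> bool:
    """Server-side check: the submission must match the stored challenge.

    Comparison here is case-insensitive with surrounding whitespace ignored,
    a common leniency in real deployments.
    """
    return expected.upper() == submitted.strip().upper()


if __name__ == "__main__":
    challenge = generate_captcha()
    print(challenge)                               # e.g. "K7Q2ZD"
    print(verify_captcha("AB12CD", " ab12cd "))    # True
    print(verify_captcha("AB12CD", "XYZ999"))      # False
```

The security of this scheme rests entirely on the distortion of the rendered image being hard for machines to read; once a vision-capable model can read it reliably, as in Shiryaev's test, the check is defeated.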
CAPTCHA puzzles are designed to be easy for humans to solve but difficult for machines to pass. Bing's chatbot, however, appears to have little trouble with them, and a hacker who wanted to build malware capable of solving CAPTCHAs could plausibly achieve that goal with its help.
We still don’t know whether anyone is actually using the Bing chatbot to bypass CAPTCHA puzzles. But as Shiryaev’s report shows, the risk of abusing this AI to solve CAPTCHA codes is real, and it remains to be seen how Microsoft will respond.