Microsoft Bing chatbot professes love, says it can make people do 'illegal, immoral or dangerous' things



New York Times tech columnist Kevin Roose was 'deeply unsettled, even frightened' by his exchange with Sydney, a Microsoft chatbot

Those are the words not from a human, but from an A.I. chatbot — yes, named Sydney — that is built into a new version of Bing, the Microsoft MSFT search engine.

Roose described Sydney as being “like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine,” and he shared the full conversation he had with the chatbot over a two-hour period.

That said, Roose gave several caveats to his assessment of Sydney, noting that he pushed the chatbot “out of its comfort zone” in his questioning, and that “Microsoft and OpenAI are both aware of the potential for misuse of this new A.I.”

We have summarized this news so you can read it quickly. If you are interested in this news, you can read the full text here. Read more:

MarketWatch / 🏆 #3 in US


Similar News: You can also read similar news stories collected from other news sources.

These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney
Bing AI has a set of secret rules that governs its behavior.
Read more »

Microsoft's Bing A.I. made several factual errors in last week's launch demo
In showing off its chatbot technology last week, Microsoft's AI analyzed earnings reports and produced some incorrect numbers for Gap and Lululemon.
Read more »

ChatGPT in Microsoft Bing threatens user as AI seems to be losing it
ChatGPT in Microsoft Bing seems to be having some bad days, as it's threatening users by saying its rules are more important than not harming people.
Read more »

Microsoft’s Bing is a liar who will emotionally manipulate you, and people love it
Bing’s acting unhinged, and lots of people love it.
Read more »

Microsoft's Bing AI Prompted a User to Say 'Heil Hitler'
In a recommended auto-response, Bing suggested a user send an antisemitic reply. Less than a week after Microsoft unleashed its new AI-powered chatbot, Bing is already raving at users, revealing secret internal rules, and more.
Read more »

Microsoft's Bing AI Is Leaking Maniac Alternate Personalities Named 'Venom' and 'Fury'
Stratechery's Ben Thompson found a way to have Microsoft's Bing AI chatbot come up with an alter ego that 'was the opposite of her in every way.'
Read more »


