
Daily Sun (Middle East) - News

Angry Bing chatbot just mimicking humans, say experts
SAN FRANCISCO: Microsoft’s nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said.

Tales of disturbing exchanges with the chatbot that have captured attention this week include the artificial intelligence (AI) issuing threats and telling of desires to steal nuclear codes, create a deadly virus, or to be alive.

“I think this is basically mimicking conversations that it’s seen online,” said Graham Neubig, an associate professor at Carnegie Mellon University’s Language Technologies Institute. “So once the conversation takes a turn, it’s probably going to stick in that kind of angry state.”

This article was published by Daily Sun (Middle East) and edited and verified by the PressBee editorial team; it may have been modified, fully republished, or quoted. Updates to this story can be followed at its original source.
