OpenAI is warning that some people may become emotionally reliant on its lifelike ChatGPT voice mode. In a report released Thursday, OpenAI detailed the safety work the company conducted on ChatGPT, its popular artificial intelligence tool, and the new voice mode that sounds human.

OpenAI began rolling out the voice mode, built on its GPT-4o model, to paid customers last week, as CNN first reported. The company unveiled the technology during a demonstration in May, showing that it can translate between two speakers during a real-time conversation and detect a human's emotions from a selfie.

The company said the new audio technology presents "novel risks," including speaker identification and unauthorized voice generation.