Tech companies make collective statement on AI safety

Tech companies Microsoft, Amazon and OpenAI establish public boundaries
Microsoft AI (Photo credit: Shutterstock.com / Below the Sky)

In South Korea, major tech companies issued a collective statement committing to safety as AI development advances.

On May 21, at the Seoul AI Safety Summit, companies including Microsoft, Amazon, and OpenAI — representing countries such as the United States, China, Canada, the United Kingdom, France, South Korea, and the United Arab Emirates — reached an international agreement. The companies will make voluntary commitments to ensure the safe development of AI models.


“We must ensure the safety of AI to … protect the wellbeing and democracy of our society,” South Korean President Yoon Suk Yeol said, according to Reuters.

Yoon also noted his concerns over risks such as deepfakes.


Star actress Scarlett Johansson made headlines earlier this week when she took legal action against OpenAI over its refusal to stop using her voice for ChatGPT 4.0's "Sky" voice feature. She had received an offer from the company to serve as a voice for ChatGPT, but she declined. Months later, through friends and family, she learned the software was using a voice resembling hers in the film "Her," in which she voiced an AI assistant, Samantha, who formed an intimate relationship with a human.

In a statement, she said, “Two days before the ChatGPT 4.0 demo was released, [OpenAI CEO] Mr. [Sam] Altman contacted my agent, asking me to reconsider. Before we could connect, the system was out there … In a time when we are all grappling with deepfakes and the protection of our likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity. I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected.”
