Open letter seeks pause on AI experiments: What it says, who signed it

AI experts, technologists, and business leaders are among the more than 1,000 petitioners imploring AI labs to pause training of systems "more powerful than GPT-4."

The open letter was published on Wednesday by the Future of Life Institute, a non-profit dedicated to mitigating the risks of transformative technology. The list of signees includes Apple co-founder Steve Wozniak; SpaceX, Tesla, and Twitter CEO Elon Musk; Stability AI CEO Emad Mostaque; Executive Director of the Center for Humane Technology Tristan Harris; and Yoshua Bengio, founder of the AI research institute Mila.


"Advanced AI could represent a profound change in the history of life on Earth, and should be planned for and managed with commensurate care and resources," said the letter. "Unfortunately, this level of planning and management is not happening, even though recent months have seen AI labs locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control."

Companies like OpenAI, Microsoft, and Google have been charging full speed ahead with their generative AI models. Fueled by ambitions to corner the AI market, new advancements and product releases are announced on an almost daily basis. But the letter says it's all happening too fast for ethical, regulatory, and safety concerns to be considered.


"We must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization?" 

The open letter calls for a six-month pause on training systems more powerful than GPT-4, arguing that such systems "should be developed only once we are confident that their effects will be positive and their risks will be manageable." It recommends that AI labs use this time to collectively develop safety protocols that can be audited by third parties. If labs don't pause, the letter says, governments should step in and impose a moratorium.



At the same time, the petition calls on policymakers to step in: it asks for regulatory authorities dedicated to oversight and tracking of AI systems, tools for distinguishing real content from generated content, auditing and certification of AI models, legal accountability for "AI-caused harm," public funding for AI research, and institutions "for coping with the dramatic economic and political disruptions (especially to democracy) that AI will cause."

The letter isn't saying to pause all AI development, full stop. It's just saying to pump the brakes: societies need a quick time-out to build the proper infrastructure so the AI revolution can be safe and beneficial to all. "Let's enjoy a long AI summer," the letter concludes, "not rush unprepared into a fall."
