Tech Titans Demand Halt to AI Trials over Grave Threats to Society
Billionaire Elon Musk and a panel of experts called on Wednesday for a pause in the development of powerful artificial intelligence systems to allow time to ensure they are safe.

The open letter, signed by more than 1,000 people so far, including Musk and Apple co-founder Steve Wozniak, was prompted by the release of GPT-4 from the Microsoft-backed firm OpenAI.

The company says its latest model is far more powerful than the previous version, which was used to power ChatGPT, a bot capable of generating passages of text from even the briefest prompts.

“AI systems with human-competitive intelligence can pose profound risks to society and humanity,” reads the open letter, titled “Pause Giant AI Experiments.”

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” the letter said.

Musk was an original investor in OpenAI and spent years on its board of directors. His company Tesla develops artificial intelligence systems that, among other things, help with autonomous driving technologies.

The letter, hosted by the Musk-funded Future of Life Institute, was signed by prominent critics as well as competitors of OpenAI, such as Stability AI chief Emad Mostaque.

Canadian AI pioneer Yoshua Bengio, also a signatory of the letter, warned at a virtual press conference in Montreal that “society is not ready” for this powerful tool and its potential abuse.

“Let’s slow it down. Let’s make sure we develop better security measures,” he said, calling for a comprehensive international discussion on AI and its implications, “as we did with nuclear energy and nuclear weapons.”

The letter cited a blog post by OpenAI co-founder Sam Altman, who suggested that “at some point, it may be important to get independent review before starting to train future systems.”

“We agree. That point is now. Therefore, we call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4,” the authors of the open letter wrote.

They urged governments to intervene and impose a moratorium if the companies fail to reach an agreement.

The six months should be used to develop safety protocols and AI governance systems, and to refocus research on making AI systems more accurate, safe, “trustworthy, and loyal.”

The letter did not detail the specific dangers posed by GPT-4.

But researchers, including NYU’s Gary Marcus, who signed the letter, have long argued that chatbots are great liars and could become a widespread vehicle for disinformation.

Source: Science Alert
