Elon Musk, Steve Wozniak, and Many More Fear AI: Stop the Wild Race, a 6-Month Break to Draw Up Shared Rules


    Elon Musk and Steve Wozniak, but also employees, researchers, and managers at Microsoft, Google, Amazon, and Meta, are afraid. Afraid of what? Of artificial intelligence. More than 1,000 signatories have put their names to an open letter asking for a suspension of artificial intelligence experiments until shared rules are drawn up and validated by independent parties.

    “We call on all AI labs to immediately pause the training of AI systems more powerful than GPT-4 for at least 6 months” is the salient point of the letter.

    The race of the last few months to develop ever more powerful artificial intelligence is plain for everyone to see: the flourishing of ChatGPT, and the integration of artificial intelligence into the services of Microsoft and of Google, which is trying to recover lost ground. Not only that: more or less credible solutions with generative artificial intelligence at their core have popped up like mushrooms.

    It’s not just that these AI systems are a danger to many professions, because they are able to do work that hitherto required someone with great skills, but that they can pose “serious dangers to society and humanity, as evidenced by extensive research and recognized by top AI labs,” the letter reads.

    “Advanced artificial intelligence could represent a profound change in the history of life on Earth, and should be planned for and managed with care and adequate resources. Unfortunately, this level of planning and management has not happened: in recent months the labs have been engaged in an out-of-control race to develop and deploy increasingly powerful digital minds that no one, not even their creators, can understand, reliably predict, or control.”

    “Should we let machines flood our news channels with propaganda and lies? Should we automate away all jobs, including the fulfilling ones? Should we develop non-human minds that may eventually outnumber us, outsmart us, make us obsolete, and replace us? Should we risk losing control of our civilization? Such decisions should not be delegated to unelected technology leaders,” the text continues.

    AI can lead to historic changes, and not necessarily positive ones. So much so that the letter, in addition to requesting a pause for reflection that should involve everyone, states that “if this pause cannot be enacted quickly, governments should step in and impose a moratorium.”

    The downtime could be used to establish common security protocols for the design and development of advanced artificial intelligence. These protocols should ensure that systems adhering to them are safe beyond a reasonable doubt.

    “This does not mean a pause on the development of artificial intelligence in general, but simply a step back from the risky race toward ever-larger, unpredictable black-box models with emergent capabilities.” A black box is a system whose external behavior can be described on the basis of what it returns for given input data, but whose internal workings are largely unknown.

    “Humankind can enjoy a prosperous future with artificial intelligence,” the letter concludes. “Having succeeded in creating robust AI systems, we can now enjoy an AI summer in which we reap the rewards, designing these systems for the clear benefit of all and giving society a chance to adapt. Society has hit pause on other technologies with potentially catastrophic impacts (citing human cloning, human germline modification, gain-of-function research, and eugenics). We can do that in this case as well. Let’s enjoy a long summer of artificial intelligence, and let’s not fall unprepared into the abyss.”


