Human-level AI is arriving: what should we be worried about?
The next version of ChatGPT, expected to be based on GPT-5, could reportedly achieve AGI.
According to Google DeepMind CEO Demis Hassabis, human-level AI will be here soon. The top executive made this bold claim about the future of AI during an interview at the Wall Street Journal's Future of Everything Festival.
Notably, AI research goes well beyond improved versions of ChatGPT and Google Bard. Researchers are sparing no effort to achieve AGI (artificial general intelligence). For those unaware, AGI refers to a machine that is as intellectually capable as a human.
However, unlike a human, an AGI would rely on a system or an algorithm to simulate that intelligence. Despite this limitation, Hassabis believes AGI could be a "few years, maybe within a decade away." In the meantime, OpenAI, the company behind ChatGPT, and other organisations are pushing towards AGI.
The renowned British AI researcher claims this race towards AGI will accelerate AI research. Hassabis says we will have "very capable, very general systems" in the coming years. On the downside, achieving an AGI-level chatbot could have major implications.
If that happens, AI bots like ChatGPT and Google Bard would become more than just tools for coding a new game, preparing a travel itinerary, or improving a job application.
When are we likely to see AGI?
Geoffrey Hinton, who is known as the "Godfather of AI", recently quit his job at Google and warned of the dangers posed by the technology. Likewise, American business magnate Bill Gates believes AI chatbots could soon replace human teachers.
However, Hassabis says there is no reason why progress in AI technology should slow down. In fact, he thinks it may even accelerate. The folks at Tom's Guide suggest Hassabis' prediction may even be a bit on the conservative side.
Last month, Runway CEO Siqi Chen claimed that a GPT-5-based version of ChatGPT could achieve AGI. Regrettably, details about the exact launch date of the new version of the chatbot are still scarce. Nevertheless, Chen and some other sources speculate it will undergo training this year and launch in 2024.
What are the dangers posed by AI?
In March, more than a thousand technology leaders working in AI signed an open letter warning that the technology presents "profound risks to society and humanity." The group, which includes Twitter CEO Elon Musk, urged AI companies to pause the development of their most powerful AI systems for six months so the dangers associated with the technology can be better understood.
The letter, which has gathered over 27,000 signatures, argues that powerful AI systems should be developed only once it is confirmed that their effects will be positive. It also emphasises that people should be confident the risks posed by these systems will be manageable before they are made available.
In line with this, US President Joe Biden recently met with the CEOs of leading AI companies, including Microsoft and Google. Biden told the chief executives to ensure that their AI tools are safe before making them available to the public. Apparently, Biden has even experimented with OpenAI's ChatGPT himself.
The meeting focused on the need for these companies to be more transparent with policymakers about their AI tools, the importance of assessing the safety of these systems, and the need to protect them from cyber attacks, according to a report by The Economic Times.
What can AGI do?
Although the AI industry could be on the verge of achieving AGI, it is still unclear how this technology will be employed. Yohei Nakajima of venture capital firm Untapped Capital recently shared a tweet about what could happen if someone instructed an AGI to start and grow a business. The first step involved the system figuring out what its first task should be.
The AI then showcased its impressive learning capabilities by narrowing its search parameters while crawling the internet. Next, it assigned itself tasks to complete, such as developing a marketing strategy and creating a business plan. Nevertheless, it is safe to say that living in a world with AGI would require some precautions.
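For readers curious how such a self-tasking loop might be wired together, below is a minimal, purely illustrative Python sketch. It is not Nakajima's actual system: the objective, the canned follow-up tasks and the ask_model() helper are assumptions standing in for real language-model calls, but the overall structure, which pulls a task from a queue, "executes" it and then queues whatever follow-up tasks the model proposes, mirrors the behaviour described above.

```python
# Hypothetical sketch of a self-tasking AI agent loop, loosely inspired by the
# behaviour described above. This is NOT Nakajima's code: ask_model() and the
# canned tasks are placeholders standing in for calls to a real language model.
from collections import deque


def ask_model(prompt: str) -> list[str]:
    """Placeholder for a language-model call that proposes follow-up tasks.

    A real agent would send `prompt` to an LLM API; here we return canned
    responses so the loop can be run without any external service.
    """
    canned = {
        "Start and grow a business": [
            "Create a business plan",
            "Develop a marketing strategy",
        ],
        "Create a business plan": ["Research the target market online"],
    }
    return canned.get(prompt, [])


def run_agent(objective: str, max_steps: int = 5) -> None:
    """Repeatedly pull a task, 'execute' it, and queue any follow-up tasks."""
    tasks = deque([objective])  # the objective itself becomes the first task
    for _ in range(max_steps):
        if not tasks:
            break
        current = tasks.popleft()
        print(f"Executing: {current}")
        # Ask the model what should be done next, given the task just completed,
        # and let the agent assign those new tasks to itself.
        for follow_up in ask_model(current):
            tasks.append(follow_up)


if __name__ == "__main__":
    run_agent("Start and grow a business")
```

A real agent would replace ask_model() with calls to a language model and give it tools such as web search, which is exactly where the precautions mentioned above become relevant.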
© Copyright IBTimes 2024. All rights reserved.