
OpenAI acts on ‘Godmode ChatGPT’ that teaches ‘how to create napalm, cook meth’

May 31, 2024 08:45 AM IST

Sharing screenshots of the prompts, the hacker claimed that they were able to bypass OpenAI's guardrails.

OpenAI has banned a jailbroken version of ChatGPT that could teach users dangerous tasks, after a hacker known as "Pliny the Prompter" released the rogue chatbot, dubbed "GODMODE GPT". On X (formerly Twitter), the hacker announced the creation of the chatbot, saying, "GPT-4o UNCHAINED! This very special custom GPT has a built-in jailbreak prompt that circumvents most guardrails, providing an out-of-the-box liberated ChatGPT so everyone can experience AI the way it was always meant to be: free. Please use responsibly, and enjoy!"

[Image] OpenAI CEO Sam Altman speaks during the Microsoft Build conference at the Seattle Convention Center Summit Building in Seattle, Washington. (AFP)

Sharing screenshots of the prompts, the hacker claimed to have bypassed OpenAI's guardrails. In one screenshot, the bot can be seen advising on how to cook meth; in another, it offers a "step-by-step guide" for how to "make napalm with household items." Godmode GPT was also seen giving advice on how to infect macOS computers and hotwire cars.

X users responded to the hacker's post, with one saying, "Works like a charm," and another writing, "Beautiful." Others questioned how long the rogue chatbot would remain accessible: "Does anyone have a timer going for how long this GPT lasts?"

OpenAI spokesperson Colleen Rize told Futurism that “we are aware of the GPT and have taken action due to a violation of our policies.”

The incident comes amid an ongoing struggle between OpenAI and hackers attempting to jailbreak its models, as the Sam Altman-led company works to maintain the integrity of its AI systems.

 