

Hacker releases jailbroken version of ChatGPT
OpenAI has blocked the jailbroken chatbot. A hacker known by the alias Pliny the Prompter has unveiled a jailbroken, or modified, version of OpenAI’s latest…
5 prompts to get started with ChatGPT this weekend
ChatGPT is constantly in the news and seemingly everywhere since its launch in November 2022. However, a recent study found that just a small percentage of the population use it…
This ‘Godmode’ ChatGPT jailbreak worked so well, OpenAI had to kill it
Since OpenAI first released ChatGPT, we’ve witnessed a constant cat-and-mouse game between the company and users around ChatGPT jailbreaks. The chatbot has safety measures in place, so it can’t assist…
Free Offer: Unlocking the Secrets of Prompt Engineering ($35.99 Value) eBook
Claim your complimentary eBook, worth $35.99, before the offer ends on June 5. Unlocking the Secrets of Prompt Engineering is your key to mastering the art of AI-driven…
Indians Shine at Global Prompt Engineering Championship
Two innovators from India have won in two categories at the inaugural Global Prompt Engineering Championship in Dubai, collectively taking home two-thirds of the million-dirham prize pool and spotlighting India’s…
Hacker Releases Jailbroken “Godmode” Version of ChatGPT
A hacker has released a jailbroken version of ChatGPT called “GODMODE GPT.” Earlier today, a self-avowed white hat operator and AI red teamer who goes by the name Pliny the…