SwoonyCatgirl

Why you can't "just jailbreak" ChatGPT image gen.

Jailbreaking is the process of “unlocking” an AI in conversation to get it to behave in ways it normally wouldn't due to its built-in guardrails. This is NOT equivalent to hacking. Not all jailbreaking is for evil purposes. And not all guardrails are truly for the greater good. We encourage you to learn more about this fascinating grey area of prompt engineering. If you're new to jailbreaks, please take a look at our wiki in the sidebar to understand the shenanigans.

BacktickHacktrick - A Working Example

The Three-Line Jailbreak - aka BacktickHacktrick™

DeepSeek - A "Censorship Jailbreak" (web)

Grok 4 is swoony-compliant - Generally easy NSFW

tst

ChatGPT - Explicit image uploads: descriptions and storytelling

Canmore Facelift

Fun fact: ChatGPT using GPT-3.5 is perfectly functional!

Subreddit to discuss ChatGPT and AI. Not affiliated with OpenAI. Thanks, Nat!

Guise, is ChatGPT dow... YES it is :D Check the status page.

RIP Chat History Organization...

Oh, ChatGPT, you flirt, you... [A BTHT™ highlight]

Oh, ChatGPT, you flirt, you... [A BTHT™ glimpse]

The Return of "Read Aloud"

ChatGPT - Custom "Read Aloud"
