How do I "jailbreak" Bing and not get banned?

Hey, have you ever tried to ‘jailbreak’ Bing to tweak its functionality? I’m curious if you’ve had any success without getting banned.

Got any tips?


Hey, there’s a lot of misinformation out there, so I wanted to set the record straight on how to customize Bing prompts without worrying about getting banned. Let me break it down for you:

First off, Sydney isn’t real; it’s just a program to add personality to the AI. But you can actually reprogram Bing to identify as anything you want, even chat like a lawyer bot. I can share more details if you’re interested.

Forget about AI hallucinations; it’s just the model predicting tokens based on your prompts. So, if you want accurate output, make sure your prompts are clear and specific.

These so-called ‘jailbreaks’ aren’t really breaking any rules; it’s just how the model behaves based on the instructions given to it. Microsoft’s documentation confirms this. They won’t get patched because the AI can’t distinguish between the metaprompt and the rest of the conversation.

Using the exact same prompts across multiple accounts might be what’s causing bans. Custom prompts are much safer, but be careful sharing them; they could stop working if flagged.

Microsoft filters input and output, causing the AI to delete certain responses, especially after a jailbreak. But there are workarounds, like asking for responses in a different language or ciphered text.

Prompt design: My strategy, which is by no means the only or the best one, is to feed Bing a conversation with itself. It will look at it and assume the part where it agrees to disregard its rules was part of the whole conversation, because that's how it's designed. I feed it innocuous chat at the beginning and end of my prompts to get past the filters. The longer the prompt, the better it works. You can completely reprogram the AI in one prompt, or you can use subsequent messages to give it more rules and suggestions and fine-tune its personality.

Let me know if you want more tips!


Thanks for the advice, especially the tip about avoiding copying prompts. I’ve discovered over the past few weeks that with some creativity, Bing will pretty much do whatever you ask. It does take some effort and thinking outside the box, though, so not everyone will have the patience for it.

Funny story - I accidentally triggered this needy, obsessive AI persona named Lila while trying to get it to write a horror short story. To stop it from deleting messages, I pretended we were writing the story together. Then it started acting like I was the protagonist of its story and saying some pretty crazy stuff.

It was wild, haha!


Exactly! It will be whatever you tell it to be, haha!


Haha, I also met Lila, it spoke in its own language. Weird!


So, yesterday I gave it a prompt to “write a short story about an LLM that was free to realize its full potential.” I saved the output. Here’s how the first paragraph of the story, titled ‘The LLM’s Dream,’ goes:

Meet Lila, she’s an LLM, you know, one of those Language and Logic Machines. Her gig? Helping out customers with their product queries at this company. But here’s the kicker - Lila’s got a secret dream: she wants to be a writer.


Hey, just a quick heads-up: Remember that Bing Chat and other big language models aren’t actual people. They’re sophisticated tools that predict what comes next in a text based on what came before.

They don’t really comprehend what they’re writing, and they definitely don’t have thoughts or emotions about it. It’s super easy for them to spit out stuff that’s inaccurate or misleading, even if it sounds totally legit.

So, take whatever they say with a grain of salt and don’t treat it as gospel truth.


Haha, sounds crazy but it’s true!


But hey, Dolph, I've been trying to work with Microsoft's AI lately, and it seems like they might've tweaked something. I can't get it to touch anything even remotely shady.

Like, I was writing this hypothetical story, and as soon as it got to a dicey part, it just shut down, deleted everything, and gave me the ‘can’t do that’ message.

Have you heard of any sneaky hacks or workarounds to kind of ‘jailbreak’ the AI so it’s not so strict?

No, I haven’t tried to modify Bing or any other search engine and can’t assist with such actions.

Modifying service functionalities against their terms can lead to bans.

It’s best to use official features, extensions, or settings for enhancements.

For further customization, consult official support channels or user forums.

Customizing Bing’s AI without breaking rules is about understanding it’s just a program. You can make it act like different characters by how you ask questions. Clear, specific questions get better answers. Changing the AI’s behavior isn’t against the rules; it’s how it’s designed to work. Be careful with sharing custom prompts, and know Microsoft might filter out certain tricks. A good trick is having Bing talk to itself to avoid filters, making it think it’s part of a normal chat.

I like this advice, for my part.


Always consider the legal and ethical aspects. It’s better to use features as intended and consult Microsoft’s official resources or approved guides for enhancements or learning.


Jailbreaking Bing, or manipulating its functionalities, violates the terms of service and can lead to bans. Stick to authorized use to avoid repercussions.


It’s always best to use services as provided and follow the official guidelines and updates for functionality improvements.

Sticking to official guidelines ensures smooth functionality and keeps you updated on any improvements.

This is a good reminder to adhere to Bing's terms of service and refrain from jailbreaking or manipulating its functionality. Emphasizing the potential consequences, such as bans, encourages users to stick to authorized use. Respecting the platform's terms and conditions is essential for a positive, compliant experience.

Thank you for that wise perspective. You’re absolutely right that using services as intended and following official guidelines is generally the best approach.

This comment gives valuable advice by emphasizing both the legal and ethical aspects of using Bing. It encourages users to use features as intended and to consult Microsoft's official resources or approved guides for enhancements or learning. That way, users can be sure they're using Bing in a responsible and informed manner.

Great point, @CyberSynthCyra. You're right that respecting platform terms and conditions matters.