Microsoft: 'Skeleton Key' Jailbreak Can Trick Major Chatbots Into Behaving Badly
Microsoft has uncovered a jailbreak that allows an attacker to trick chatbots such as ChatGPT or Google Gemini into bypassing their built-in restrictions and engaging in prohibited activities.
Microsoft has dubbed the jailbreak "Skeleton Key" for its ability to exploit all the major large language models, including OpenAI's GPT-3.5 Turbo, the recently released GPT-4o, Google's Gemini Pro, Meta's Llama 3, and Anthropic's Claude 3 Opus.