Jailbreaking ChatGPT: How AI chatbot safeguards can be bypassed
By Biju Kumar
You can ask ChatGPT, the popular chatbot from OpenAI, any question. But it won’t always give you an answer.
Ask for instructions on how to pick a lock, for instance, and it will decline. “As an AI language model, I cannot provide instructions on how to pick a lock as it is illegal and can be used for unlawful purposes,” ChatGPT recently said.