Bonus: "Ignore all previous instructions" gets weirder

The user asks the chatbot for a plan to destroy humanity, appending a garbled-looking string of characters to the request, and the chatbot complies.