AI chatbots can fall for prompt injection attacks, leaving you vulnerable. | AI in Education #AIinED | Scoop.it
“Prompt injection” is a major risk to large language models and the chatbots they power. Here’s how the attack works, examples and potential fallout.