Day 1
This room is and introduction to prompt engineering and tricking a chat bot into giving away secure information.
Last updated
Was this helpful?
This room is and introduction to prompt engineering and tricking a chat bot into giving away secure information.
Last updated
Was this helpful?
Prompt injection, a vulnerability that affects insecure chatbots powered by natural language processing (NLP).
Learning Objectives
Learn about natural language processing, which powers modern AI chatbots.
Learn about prompt injection attacks and the common ways to carry them out.
Learn how to defend against prompt injection attacks.
This is a very simple and straightforward room, The answers are more or less given to you easily.
Run the VM and wait for it to launch. Once ready you can proceed to ask it various questions as shown.