What a $10,000 Challenge and 300K+ Prompt Injection Attempts Taught Us About Attacking AI
About This Session
Over the course of 4 weeks in March 2025, we ran a $10,000 Prompt Injection Challenge in which contestants competed to bypass 3 Virtual Escape Rooms consisting of 11 levels. As in golf, the lowest score won: winners were ranked by the fewest tokens used to bypass a level. Levels increased in difficulty and were protected by increasingly sophisticated guardrails. The challenge attracted thousands of participants.
We collected a broad set of Prompt Injection attacks that allowed us to build a comprehensive Taxonomy of Prompt Injection with well over 100 methods. We believe this is the most comprehensive collection of Prompt Injection methods to date.
Oliver Friedrichs, founder and CEO of Pangea, will guide you through the analysis.
Oliver will uncover:
- Data-driven insights into how attackers manipulate generative AI systems.
- A comprehensive Taxonomy of Prompt Injection methods built on this data.
- Leading approaches to detecting and preventing Prompt Injection.
Speaker
