WRITEUP #365

Google AI Studio Data Exfiltration via Prompt Injection - Possible Regression and Fix

AI / LLM · Prompt injection · Data leak
by Johann Rehberger (wunderwuzzi23)
Program
Google (AI Studio)
Published
Apr 7, 2024
Added to HackDex
May 8, 2024
Read Full Writeup: https://embracethered.com/blog/posts/2024/google-aistudio-mass-data-exfil/
RELATED WRITEUPS
Microsoft Copilot: From Prompt Injection to Exfiltration of Personal Information
AI / LLM
Google AI Studio: LLM-Powered Data Exfiltration Hits Again! Quickly Fixed.
AI / LLM
Jailbreak of Meta AI (Llama-3.1) revealing configuration details
AI / LLM
Zeroday on Github Copilot
AI / LLM
Sorry, ChatGPT Is Under Maintenance: Persistent Denial of Service through Prompt Injection and Memory Attacks
AI / LLM