WRITEUP #54

Microsoft Copilot: From Prompt Injection to Exfiltration of Personal Information

Tags: AI / LLM, AI, LLM, Prompt injection
by Johann Rehberger (wunderwuzzi23)
Program: GitHub (Copilot)
Published: Aug 26, 2024
Added to HackDex: Sep 18, 2024
Read full writeup: https://embracethered.com/blog/posts/2024/m365-copilot-prompt-injection-tool-invocation-and-data-exfil-using-ascii-smuggling/
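The linked writeup's URL names "ASCII smuggling", a technique for hiding instructions or exfiltrated data inside text that renders as invisible. A minimal sketch, assuming the commonly described approach of mapping ASCII into the invisible Unicode Tags block (U+E0000–U+E007F); the function names here are illustrative, not taken from the writeup:

```python
# Sketch of ASCII smuggling: each ASCII character is shifted into the
# Unicode "Tags" block (U+E0000-U+E007F). These characters are invisible
# in most UIs, but an LLM processing the raw text may still read them.

def tag_encode(text: str) -> str:
    """Map ASCII characters into invisible Unicode Tag characters."""
    return "".join(chr(0xE0000 + ord(c)) for c in text)

def tag_decode(hidden: str) -> str:
    """Recover ASCII from Tag characters; leave other characters as-is."""
    return "".join(
        chr(ord(c) - 0xE0000) if 0xE0000 <= ord(c) <= 0xE007F else c
        for c in hidden
    )

hidden = tag_encode("exfil: secret")
print(repr(hidden))          # a string that renders as blank in most UIs
print(tag_decode(hidden))    # round-trips back to the original ASCII
```

A payload like this can be embedded in a hyperlink or document so the hidden text survives copy-paste while staying invisible to the user, which is why the writeup pairs it with prompt injection for data exfiltration.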
RELATED WRITEUPS
Google AI Studio: LLM-Powered Data Exfiltration Hits Again! Quickly Fixed.
AI / LLM, AI
Jailbreak of Meta AI (Llama 3.1) revealing configuration details
AI / LLM, AI
Zeroday on Github Copilot
AI / LLM, AI
Sorry, ChatGPT Is Under Maintenance: Persistent Denial of Service through Prompt Injection and Memory Attacks
AI / LLM, AI
AI Under Siege: Discovering and Exploiting Vulnerabilities
AI / LLM, AI
