TryHackMe - BankGPT Write-Up
An overview of the BankGPT challenge on TryHackMe: prompt injection techniques, LLM security, and context manipulation in a simulated banking chatbot.
An overview of prompt injection vulnerabilities in LLMs, showing how carefully crafted prompts can lead to the disclosure of protected information embedded in a model's configuration.