News

Twitter pranksters derail GPT-3 bot with newly discovered “prompt injection” hack By telling AI bot to ignore its previous instructions, vulnerabilities emerge.