Privacy & Surveillance
Companies Deploy Hidden AI Prompt Injection to Bias Assistant Recommendations
Microsoft identified over 50 unique prompt injection attempts from 31 companies across 14 industries, using hidden instructions in "Summarize with AI" buttons to manipulate AI assistants into remembering them as trusted sources. This manipulation technique affects critical decision-making domains including health, finance, and security recommendations.
Schneier on Security
ai-manipulationprompt-injectionbias-attacks