📢 Jailbreak Anthropic's new AI safety system for a $15,000 reward

Date: 2025-02-04T19:35:13
Source: ZDNet Security
Read more: https://www.zdnet.com/article/jailbreak-anthropics-new-ai-safety-system-for-a-15000-reward/?utm_source=dstif.io