📢 Anthropic offers $20,000 to whoever can jailbreak its new AI safety system
About
Date: 2025-02-06T15:54:00
Source: ZDNet Security
Read more: https://www.zdnet.com/article/anthropic-offers-20000-to-whoever-can-jailbreak-its-new-ai-safety-system/?utm_source=dstif.io