Alignment Faking in LLMs [pdf] (anthropic.com)
2 points by veryluckyxyz 12 months ago | 1 comment


Dupe (281 points, 1 day ago, 310 comments) https://news.ycombinator.com/item?id=42458752