LessWrong (30+ Karma) - “LLMs for Alignment Research: a safety priority?” by abramdemski