[PaRev] On the Role of Attention Heads in Large Language Models Safety
Attention heads play a critical role in shaping both safe and harmful behaviors in large language models.
Most descriptions of transformer models focus on their architecture.
The very first post of the C2SR Lab blog!