Hacker News

Did Meta add scalable rope to the official implementation?


We changed RoPE's theta from 10k to 1M and fine-tuned on sequences 16k tokens long.
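For context on what changing theta does: RoPE rotates pairs of query/key dimensions by position-dependent angles, and theta sets how quickly those angles advance. Raising theta from 10k to 1M slows the low-frequency rotations, so distant positions remain distinguishable over much longer contexts. A minimal NumPy sketch, not Meta's implementation (function names and shapes here are illustrative):

```python
import numpy as np

def rope_frequencies(head_dim: int, theta: float = 10_000.0) -> np.ndarray:
    """Per-pair rotation frequencies used by RoPE.

    Raising theta (e.g. 10k -> 1M) slows the rotation of the
    low-frequency dimensions, stretching the usable context.
    """
    exponents = np.arange(0, head_dim, 2) / head_dim  # (head_dim/2,)
    return 1.0 / (theta ** exponents)

def apply_rope(x: np.ndarray, positions: np.ndarray,
               theta: float = 10_000.0) -> np.ndarray:
    """Rotate consecutive (even, odd) feature pairs of x by
    position-dependent angles.

    x: (seq_len, head_dim) query or key slice for one attention head.
    positions: (seq_len,) token positions.
    """
    freqs = rope_frequencies(x.shape[-1], theta)   # (head_dim/2,)
    angles = positions[:, None] * freqs[None, :]   # (seq_len, head_dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x_even * cos - x_odd * sin      # 2D rotation per pair
    out[:, 1::2] = x_even * sin + x_odd * cos
    return out
```

With theta = 1e6 instead of 1e4, the slowest-rotating pair needs far more positions to complete a cycle, which is why a brief fine-tune on 16k-token sequences can adapt a model trained at shorter context.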


Curious: what led you to adjust the parameters this way? Also, have you experimented with ALiBi[1], which claims better extrapolation than rotary positional encoding?

[1]: https://arxiv.org/abs/2108.12409 (charts on page two if you’re skimming)
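For readers unfamiliar with the linked paper: instead of rotating embeddings, ALiBi adds a per-head linear penalty to attention scores proportional to the query-key distance, which is what lets it extrapolate past the training length. A minimal sketch of that bias, following the paper's geometric head slopes (the helper names are my own):

```python
import numpy as np

def alibi_slopes(n_heads: int) -> np.ndarray:
    """Geometric per-head slopes from the ALiBi paper,
    assuming n_heads is a power of two: 2^(-8/n * i) for i = 1..n."""
    start = 2.0 ** (-8.0 / n_heads)
    return start ** np.arange(1, n_heads + 1)

def alibi_bias(seq_len: int, n_heads: int) -> np.ndarray:
    """Additive attention bias: -slope * distance(query, key).

    Returned shape (n_heads, seq_len, seq_len); add to attention
    logits before softmax. Future positions get 0 here and are
    assumed to be handled by the usual causal mask.
    """
    slopes = alibi_slopes(n_heads)                        # (n_heads,)
    dist = np.arange(seq_len)[:, None] - np.arange(seq_len)[None, :]
    dist = np.maximum(dist, 0)                            # only look back
    return -slopes[:, None, None] * dist[None, :, :]
```

Because the penalty grows linearly with distance, nothing in the bias is tied to the training sequence length, unlike a learned positional table.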


Undoubtedly, they have tried ALiBi…



