It is interesting that OpenAI isn't offering any inference for these models.


Makes sense to me. Inference on these models will be a race to the bottom, so hosting it themselves would be a waste of compute and dollars for them.



