
Very excited to see these kinds of techniques. I think getting a 30B-level reasoning model usable on consumer hardware is going to be a game changer, especially if it uses less power.


DeepSeek does reasoning on my home Linux PC, but I'm not sure how power hungry it is.


What variant? I’d considered DeepSeek far too large for any consumer GPU.


Some people run DeepSeek on CPU. With 37B active params it isn't fast, but it's passable.


Actual DeepSeek, or some Qwen/Llama reasoning fine-tune?


Actual DeepSeek. 500 GB of memory and a Threadripper works. Not a standard PC spec, but a common-ish home-brew setup for single-user DeepSeek.
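
For anyone curious what that looks like in practice, here's a minimal sketch of CPU-only inference with llama-cpp-python. The GGUF filename, thread count, and context size are placeholders, not a recommendation; you'd substitute whatever DeepSeek-R1 quant actually fits in your ~500 GB of RAM.

  # Rough sketch: CPU-only DeepSeek inference via llama-cpp-python.
  # Filename, n_threads, and n_ctx are placeholders for illustration only.
  from llama_cpp import Llama

  llm = Llama(
      model_path="deepseek-r1-q4_k_m.gguf",  # hypothetical GGUF quant that fits in RAM
      n_ctx=4096,
      n_threads=32,       # tune to your Threadripper's core count
      n_gpu_layers=0,     # keep everything on CPU
  )

  out = llm.create_chat_completion(
      messages=[{"role": "user", "content": "Why is the sky blue?"}],
      max_tokens=512,
  )
  print(out["choices"][0]["message"]["content"])

The reason this is usable at all is the MoE design: all the weights have to sit in RAM, but only the ~37B active params are computed per token, so generation speed is closer to a 37B dense model than to the full parameter count.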




