Hacker News
jarbus | 7 months ago | on: Gemma 3 QAT Models: Bringing AI to Consumer GPUs
Very excited to see these kinds of techniques. I think getting a 30B-level reasoning model usable on consumer hardware is going to be a game changer, especially if it uses less power.
apples_oranges | 7 months ago
DeepSeek does reasoning on my home Linux PC, but I'm not sure how power-hungry it is.
gcr | 7 months ago
What variant? I'd considered DeepSeek far too large for any consumer GPU.
scosman | 7 months ago
Some people run DeepSeek on CPU. With 37B active params it isn't fast, but it's passable.
danielbln | 7 months ago
Actual DeepSeek, or some Qwen/Llama reasoning fine-tune?
scosman | 7 months ago
Actual DeepSeek. 500 GB of memory and a Threadripper works. Not a standard PC spec, but a common-ish home-brew setup for single-user DeepSeek.
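A back-of-the-envelope sketch of why a ~500 GB setup works: as a mixture-of-experts model, only ~37B params are active per token (which sets the speed), but all weights must still sit in memory (which sets the RAM requirement). The 671B total-parameter figure below comes from DeepSeek's published model card, not from this thread, and the 4.5-bit width is just a typical quantization assumption:

```python
# Rough memory math for running a large MoE model (e.g. DeepSeek-V3/R1) on CPU.
# Assumed figures: 671B total / 37B active params, per the published model card.
TOTAL_PARAMS_B = 671   # total parameters, in billions (all must be resident)
ACTIVE_PARAMS_B = 37   # parameters used per token (governs tokens/sec, not RAM)

def weights_gib(params_billions: float, bits_per_param: float) -> float:
    """Approximate weight footprint in GiB at a given quantization width."""
    total_bytes = params_billions * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

fp16_gib = weights_gib(TOTAL_PARAMS_B, 16)   # unquantized: far over 500 GB
q4_gib = weights_gib(TOTAL_PARAMS_B, 4.5)    # ~4.5-bit quant: fits in 500 GB
print(f"fp16: {fp16_gib:.0f} GiB, ~4.5-bit quant: {q4_gib:.0f} GiB")
```

So a roughly 4-to-5-bit quantization is what brings the full model under the 500 GB figure mentioned above; the 37B active params are why CPU inference is merely slow rather than hopeless.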