calamari4065 on Dec 2, 2023 | on: Easy Stable Diffusion XL in your device, offline
Yeah, llama runs acceptably on my server, but buying a GPU and setting it all up seems really unfun. It's also well beyond my hobby budget.
brucethemoose2 on Dec 2, 2023
You don't need a big one; even an old 4GB GPU will massively accelerate prompt ingestion.
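For anyone wondering what partial offload looks like in practice, here's a minimal sketch using the llama-cpp-python bindings. The model path and layer count are just illustrative; tune n_gpu_layers to whatever fits in your VRAM:

  # Minimal sketch: partial GPU offload with llama-cpp-python
  # (assumes a CUDA/Metal-enabled build; model path is hypothetical)
  from llama_cpp import Llama

  llm = Llama(
      model_path="models/llama-2-7b.Q4_K_M.gguf",  # hypothetical path
      n_gpu_layers=8,   # offload a handful of layers; a few GB of VRAM is enough
      n_ctx=2048,
  )

  # Prompt processing is compute-bound, so even a small GPU
  # speeds up ingestion of long prompts considerably.
  out = llm("Q: What is the capital of France? A:", max_tokens=16)
  print(out["choices"][0]["text"])

Even when most layers stay on the CPU, the GPU still handles the batched prompt evaluation, which is where most of the wall-clock time goes for long contexts.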