
Can this use local LLMs?


Yes - you can use local LLMs through LiteLLM and Ollama. Would you like us to support anything else?


LM Studio?


Yes, because LM Studio is OpenAI-compatible. When you run rowboatx for the first time, it creates ~/.rowboat/config/models.json. You can then configure LM Studio there. Here is an example: https://gist.github.com/ramnique/9e4b783f41cecf0fcc8d92b277d...
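For anyone else setting this up, an entry in models.json might look roughly like the sketch below. This is a hypothetical example, not the exact RowboatX schema (check the gist above for the real field names): the provider key, "baseURL"/"apiKey" fields, and the model name "qwen2.5-7b-instruct" are all assumptions. LM Studio's local server listens on http://localhost:1234/v1 by default and ignores the API key, so any placeholder value works there.

```json
{
  "providers": {
    "lmstudio": {
      "baseURL": "http://localhost:1234/v1",
      "apiKey": "lm-studio",
      "models": ["qwen2.5-7b-instruct"]
    }
  }
}
```

The important part is pointing the base URL at LM Studio's OpenAI-compatible endpoint; from there it behaves like any other OpenAI-style provider.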



