Have you played around with hosting your own LLM? I've just started running oobabooga's text-generation-webui, which lets you download and host various LLMs. I've been working on setting it up so the AI provides the text for Piper and takes voice input from Whisper. An NVIDIA card is ideal, but it will also run on AMD or CPU. Its API would let you fetch text for Piper to read, and it's a lot more privacy-oriented than sending your queries off to ChatGPT. The larger models do take more CPU/RAM/VRAM to run, but perhaps a smaller fine-tuned model would suit your needs.
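As a rough sketch of the API-to-Piper idea: the webui can expose an OpenAI-compatible endpoint when started with its API flag, and Piper reads text from stdin. The port, endpoint path, and Piper model name below are assumptions about a typical setup, not a definitive recipe.

```python
# Sketch: ask a local text-generation-webui (oobabooga) instance for a reply,
# then pipe the text into Piper for speech. Endpoint URL and the Piper model
# file are assumptions -- adjust to match your installation.
import json
import subprocess
import urllib.request

API_URL = "http://127.0.0.1:5000/v1/chat/completions"  # assumes the webui was started with its API enabled

def build_payload(prompt, max_tokens=200):
    """Build an OpenAI-compatible chat payload for the webui's API."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def ask_llm(prompt):
    """POST the prompt to the local LLM and return the reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

def speak(text):
    """Hand the reply to Piper on stdin; the model path is a placeholder."""
    subprocess.run(
        ["piper", "--model", "en_US-lessac-medium.onnx",
         "--output_file", "reply.wav"],
        input=text.encode(),
        check=True,
    )

if __name__ == "__main__":
    speak(ask_llm("What's the weather usually like in October?"))
```

Whisper would slot in on the front end the same way, transcribing microphone audio into the prompt string, so the whole loop stays on your own hardware.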
this post was submitted on 24 Oct 2023
homeassistant
dude i love you. great idea!