1 comment

  • newsdeskx 2 hours ago
    does this work with purely local models through Ollama, or do you still need an Ollama server running on another machine? been looking for something that actually works offline for basic voice commands