Today, Ollama launched its desktop app for macOS and Windows. As someone who regularly uses Enchanted LLM, I gave the new Ollama app a spin. Here's where it stands out, and where it falls short:

What Ollama Gets Right:

  • Windows support — finally, a native desktop LLM app on both major platforms.
  • Model downloads — lets you run models locally without relying on a constant internet connection.
  • Context window slider — handy for tuning memory depth per session.
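
The context window setting isn't only a GUI feature; Ollama also exposes it through its local HTTP API as the `num_ctx` option. As a sketch of what the app's slider corresponds to under the hood (the model name and message are made up for illustration, and actually sending the request assumes a running Ollama server on its default port 11434):

```python
import json

# Build a request body for Ollama's local chat API.
# "llama3" is a placeholder -- use any model you've already pulled.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Summarize this article."}],
    "options": {"num_ctx": 8192},  # context window size, like the app's slider
    "stream": False,
}

body = json.dumps(payload)
print(body)

# To actually send it (requires a running Ollama instance):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

Handy if you want a bigger context for long documents without touching the UI each time.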

Where Ollama Falls Short (for now):

  • No chat export — there’s no built-in way to copy or save the conversation.
  • No system-level TTS/STT — unlike Enchanted, it doesn’t integrate with macOS speech-to-text or text-to-speech.
  • No global hotkey integration — can’t summon it for quick completions or interactions system-wide.
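
The missing export is the easiest gap to work around yourself, since the same local API returns plain message dicts. A minimal sketch of saving a transcript to Markdown (the conversation and file name here are hypothetical, just to show the format):

```python
from datetime import date

def export_chat(messages, path):
    """Write a chat transcript (list of {'role', 'content'} dicts) to Markdown."""
    lines = [f"# Chat transcript ({date.today().isoformat()})", ""]
    for m in messages:
        lines.append(f"**{m['role'].capitalize()}:** {m['content']}")
        lines.append("")
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

# Hypothetical conversation, just to demonstrate the output.
chat = [
    {"role": "user", "content": "What's new in the Ollama desktop app?"},
    {"role": "assistant", "content": "A context window slider, among other things."},
]
export_chat(chat, "chat.md")
```

Not a substitute for a built-in copy/save button, but it covers the basics until one ships.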

For now, I’ll probably keep both. Enchanted LLM still has features I rely on daily, but it’s been stagnant, with no updates in over a year. Ollama’s app feels more modern and actively developed, so I’m keeping an eye on how quickly it closes these gaps.