Ollama


Server

The Ollama server:

  • runs via Homebrew services on macOS [1]
  • runs on TCP port 11434 on localhost (127.0.0.1)
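
A minimal sketch of managing and checking the server, assuming the ollama formula is installed with Homebrew; the exact status output is an assumption:

  # Start (or restart) the Ollama server as a Homebrew service.
  brew services start ollama

  # Confirm the service is registered and started.
  brew services list | grep ollama

  # Check that the server answers on its default address and port;
  # it should reply with a short status message.
  curl http://127.0.0.1:11434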

Footnotes:

[1] To serve manually, execute ollama serve.
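
    A short sketch of serving manually; the OLLAMA_HOST override is an
    assumption, not something stated above:

      # Serve in the foreground on the default 127.0.0.1:11434.
      ollama serve

      # Assumption: OLLAMA_HOST overrides the bind address and port.
      OLLAMA_HOST=127.0.0.1:11500 ollama serve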

