My PC runs Ubuntu 24.04, and one of my favorite things to do on it is take notes and annotations in Obsidian. Even though it's a modest machine with a low-end graphics card, instead of using a paid cloud AI service I run local LLMs with Ollama. It can be slow, but it is nevertheless useful.
But I ran into a problem when trying to use Obsidian plugins that talk to local LLMs through Ollama. After running ollama serve in the terminal, the following error would show up:
Error: listen tcp 127.0.0.1:11434: bind: address already in use
I tried some tweaking here and there, like changing the port on the command line, but nothing worked the way I needed.
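Before tweaking anything, it helps to see for yourself that the port really is taken. A minimal check, assuming bash (it uses bash's built-in /dev/tcp pseudo-device, so no extra tools are needed; ss or lsof work too if you have them):

```shell
#!/usr/bin/env bash
# Return 0 (success) if something is listening on the given local TCP port.
# Uses bash's /dev/tcp pseudo-device, so it needs no extra packages.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

if port_in_use 11434; then
  echo "port 11434 is taken - the Ollama service is probably already running"
else
  echo "port 11434 is free"
fi

# Alternatively: sudo ss -ltnp | grep 11434   (also shows which process holds it)
```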
It turns out the solution is simpler than I thought. Here is how it works: when you install Ollama on Ubuntu 24.04, it installs itself as a systemd service.
Ollama runs in the background
Ubuntu 24.04 uses systemd, so Ollama starts automatically as a service when the PC boots. Think of it as a silent workhorse, always ready to process your AI requests.
No need to manually start the server
My mistake was typing ollama serve every time. There's no need: the server is already running on port 11434, and trying to start a second one fails because the port is already taken.
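A quick way to convince yourself the server is already up: Ollama answers plain HTTP on its port, and a healthy install replies to the root path with the text "Ollama is running" (assuming the default address 127.0.0.1:11434):

```shell
# Probe the already-running server; -s silences progress, -f fails on HTTP errors.
if curl -sf --max-time 2 http://127.0.0.1:11434/; then
  echo    # just a newline after Ollama's one-line reply
else
  echo "nothing answered on 127.0.0.1:11434"
fi
```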
Check Ollama's status as a systemd service
You can check the service's status from the terminal with sudo systemctl status ollama.service.
Need to tweak things?
Example: want to change the port Ollama listens on? Create an override with sudo systemctl edit ollama.service and add this line to the [Service] section:
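A sketch of such an override, using Ollama's documented OLLAMA_HOST environment variable (11500 is an arbitrary example port; pick any free one):

```ini
[Service]
Environment="OLLAMA_HOST=127.0.0.1:11500"
```

Running sudo systemctl edit ollama.service puts this in a drop-in file under /etc/systemd/system/ollama.service.d/ and reloads the unit definitions for you; if you edit a file there by hand instead, run sudo systemctl daemon-reload before restarting.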
Then, restart the service: sudo systemctl restart ollama.service
Now, let's talk Obsidian plugins
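One wrinkle specific to Obsidian: the desktop app calls Ollama from an app:// origin, and Ollama only accepts requests from origins it knows about, so a plugin can fail with a CORS error even though the server is running fine. If you hit that, Ollama's documented OLLAMA_ORIGINS variable can go in the same kind of [Service] override; the origin pattern below is the one commonly used for Obsidian, so treat it as an assumption to check against your plugin's docs:

```ini
[Service]
Environment="OLLAMA_ORIGINS=app://obsidian.md*"
```

Restart the service afterwards with sudo systemctl restart ollama.service.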
Learn more about Ollama
Learn how to install it, update it, and download LLMs.