Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around self-hosting, please include details to make it clear.
- Don't duplicate the full text of your blog or GitHub here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues on the community? Report it using the report flag.
Questions? DM the mods!
If you can SSH to the LLM machine, I'd probably recommend
ssh -L 127.0.0.1:11434:127.0.0.1:11434 <remote hostname>
If for some reason you don't have a firewall on your portable device, or you inadvertently bring it down, you don't want to be punching a tunnel from whatever can talk to your portable device through to the LLM machine. (Using 11434 instead of 3000, as it looks like that's ollama's port.)
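A minimal end-to-end check from the phone side, assuming Termux with the openssh and curl packages installed. The username and model name are placeholders, not details from the thread; the /api/tags and /api/generate endpoints are ollama's standard API:
# On the phone (Termux): open the tunnel, binding only to loopback locally
ssh -L 127.0.0.1:11434:127.0.0.1:11434 user@<remote hostname>
# In a second Termux session: hit ollama through the tunnel
curl http://127.0.0.1:11434/api/tags        # lists installed models
curl http://127.0.0.1:11434/api/generate -d '{"model": "llama3", "prompt": "hello", "stream": false}'
If the tunnel is up, both requests should answer from 127.0.0.1 on the phone even though ollama is only listening on loopback on the LLM machine.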
EDIT: OP, it's going to be hard to give a reliable step-by-step, because I have no idea what your network looks like. For example, it's possible to have your wireless access point set up so that devices can't talk to each other at all. You might have some kind of firewall on your LLM machine, so that even if the WAP lets the two devices talk, the firewall blocks traffic from your phone; you'd need to punch a hole in that. At least something (sshd for the example here, or ollama itself, if you expose it directly to the network) needs to be listening on an address your phone can route to. And as DrDystopia points out, we don't even know what OS the LLM machine is running (Linux?), so giving any kind of step-by-step is going to be hard there.
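Two quick things to check on the LLM machine, assuming it's Linux. ss ships with iproute2, and the ufw lines only apply if ufw happens to be the firewall in use; the 192.168.1.0/24 subnet is an assumption, so substitute your own LAN range:
# What's listening, and on which addresses? 127.0.0.1 means local-only; 0.0.0.0 means all interfaces.
ss -tlnp | grep -E ':22|:11434|:3000'
# If ufw is active, allow SSH from the LAN only (subnet is an assumption):
sudo ufw status
sudo ufw allow from 192.168.1.0/24 to any port 22 proto tcp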
Problem is, that doesn't say much. Like, it doesn't say what you've actually seen.
Do you know what the LAN IP address of your LLM machine is? Can you ping that IP address from Termux on your phone when both are on the same WiFi network?
$ ping <ip-address>
What OS is the LLM machine? If Linux, do you have sshd installed? It sounds like you do have ollama on it and that it works when you use it from the LLM machine itself? When you said that it didn't work, what did you try, and what errors or behavior did you see?
3000 is the OpenWebUI port. I never got it to work using either 127.0.0.1 or
localhost
, only 0.0.0.0. Ollama's port 11434 on 127.x worked fine, though.
Fair point.
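For what it's worth, that behavior is expected: a service bound to 127.0.0.1 or localhost only accepts connections from the same machine, while 0.0.0.0 binds every interface, so other LAN devices can reach it (firewall permitting). If you want ollama itself reachable without a tunnel, its documented OLLAMA_HOST environment variable controls the bind address; a sketch, assuming you run it by hand rather than via a service manager:
OLLAMA_HOST=0.0.0.0:11434 ollama serve
ss -tln | grep 11434    # should now show 0.0.0.0:11434 rather than 127.0.0.1:11434
The SSH tunnel above avoids needing this at all, since the tunnel's remote end connects to 127.0.0.1 on the LLM machine itself.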