Selfhosted

49406 readers
685 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues on the community? Report them using the report flag.

Questions? DM the mods!

founded 2 years ago
MODERATORS
1

Hello everyone! Mods here 😊

Tell us, what services do you selfhost? Extra points for selfhosted hardware infrastructure.

Feel free to take it as a chance to present yourself to the community!

🦎

2
150
submitted 1 day ago* (last edited 1 day ago) by sbeak@sopuli.xyz to c/selfhosted@lemmy.world

Today I set up my old laptop as a Debian server, hosting Immich (for photos), Nextcloud (for files), and Radicale (for calendar). It was surprisingly easy after reading the documentation and watching a couple of videos online! Tomorrow I might try hosting something like Linkwarden or Karakeep.

What else should I self-host, aside from HA (I don’t have a smart home), Calibre (physical books are my jam), and Jellyfin (I don’t watch too many movies + don’t have a significant DVD/Blu-ray collection)?

I would like to keep my laptop confined to my local network since I don’t trust it to be secure enough against the internet.

edit: I forgot, I'm also running Tailscale so I can access my local network remotely!
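
For anyone setting up something similar, the Tailscale part was roughly this (a rough sketch from memory rather than my exact commands):

    # Install Tailscale via the official convenience script (Debian 12 here)
    curl -fsSL https://tailscale.com/install.sh | sh

    # Bring the node onto the tailnet; it prints a login URL to authorise the machine
    sudo tailscale up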

3

A new open-source Single Sign-On (SSO) provider designed to simplify user and access management.

Features:

  • 🙋‍♂️ User Management
  • 🌐 OpenID Connect (OIDC) Provider
  • 🔀 Proxy ForwardAuth Domains
  • 📧 User Registration and Invitations
  • 🔑 Passkey Support
  • 🔐 Secure Password Reset with Email Verification
  • 🎨 Custom Branding Options

Screenshot of the login portal:

4
119
submitted 3 hours ago* (last edited 3 hours ago) by sailorzoop@lemmy.librebun.com to c/selfhosted@lemmy.world

Incoherent rant.

I've, once again, noticed Amazon and Anthropic absolutely hammering my Lemmy instance to the point of the lemmy-ui container crashing. Multiple IPs all over the US.

So I've decided to do some restructuring of how I run things. Ditched Fedora on my VPS in favour of Alpine, just to start with a clean slate. And started looking into different options on how to combat things better.

Behold, Anubis.

"Weighs the soul of incoming HTTP requests to stop AI crawlers"

From how I understand it, it works like a reverse proxy in front of each service. It took me a while to actually understand how it's supposed to integrate, but once I figured it out, all bot activity instantly stopped. Not a single one has gotten through yet.

My setup is basically just a home server -> tailscale tunnel (not funnel) -> VPS -> caddy reverse proxy, now with anubis integrated.

I'm not really sure why I'm posting this, but I hope at least one other goober trying to find a possible solution to these things finds this post.

Anubis Github, Anubis Website

Edit: Further elaboration for those who care, since I realized that might be important.

  • You don't have to use caddy/nginx/whatever as your reverse proxy in the first place, it's just how my setup works.
  • My Anubis instance sits inside my Caddy reverse proxy docker compose stack, between Caddy and my local server. When a request comes in, Caddy hands it to Anubis via its Caddyfile, and Anubis decides whether to forward the request to the service or stop it in its tracks (see the sketch after this list).
  • There are some minor issues, like it requiring javascript enabled, which might get a bit annoying for NoScript/Librewolf/whatever users, but considering most crawlbots don't do js at all, I believe this is a great tradeoff.
  • The most confusing part was the docs and understanding what it's supposed to do in the first place.
  • There's an option to apply your own rules via json/yaml, but I haven't figured out how to do that properly in docker yet. As in, there's a main configuration file you can override, but there's apparently also a way to add additional bots to block in separate files in a subdirectory. I'm sure I'll figure that out eventually.
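
For a rough picture of how the pieces fit together, here is the shape of my stack (a sketch from memory; the image path, env var names and ports are what I remember from the Anubis docs, so double-check them before copying anything):

    # docker-compose.yml: Caddy terminates TLS, hands everything to Anubis,
    # and Anubis forwards real visitors on to the actual service
    cat > docker-compose.yml <<'EOF'
    services:
      caddy:
        image: caddy:2
        ports: ["80:80", "443:443"]
        volumes:
          - ./Caddyfile:/etc/caddy/Caddyfile
      anubis:
        image: ghcr.io/techarohq/anubis:latest   # image path from memory
        environment:
          BIND: ":8080"                          # where Anubis listens inside the network
          TARGET: "http://lemmy-ui:1234"         # the upstream it protects (placeholder port)
    EOF

    # Caddyfile: point the site at Anubis instead of straight at the service
    cat > Caddyfile <<'EOF'
    lemmy.example.com {
        reverse_proxy anubis:8080
    }
    EOF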

Cheers and I really hope someone finds this as useful as I did.

5

I'm a good chemist, but not advanced in IT. I started using Debian out of the box last year on a mini PC, running only Jellyfin on that local machine. I don't understand coding, but I copy/paste terminal instructions from trusted sites. I have 1TB of music, films and documents, and I want to move all my photos off Google.

6

The title really says it all, but I'm self-hosting World of Warcraft: Wrath of the Lich King.

I’m just so shocked that it all works to be honest. It’s blowing my mind still.

I've always wanted to play classic WoW, but I play so infrequently that it's not worth paying for a subscription.

It never really occurred to me that I could just host my own server until ChatGPT recommended it while I was researching things to self-host.

It’s not public yet as my upload speeds are too slow.

I think I’m going to set the server up on my laptop so I can play wow while on my 14 hour flight coming up.

I’ve always played the game solo anyway due to my casualness.

7

I won a new grant (yaay!) and am dipping my toes into the role of PI at my university. For now, I will have a PhD student, a postdoc and a couple of master's students on my team.

In all my previous labs, everything was on paper and very poorly documented (...don't ask). I myself used to use LaTeX to keep a "neat" lab notebook. Obviously, that is not easy to collaborate on with others.

Any researchers here who have experience hosting their own e-lab book in their labs?

8
130
DietPi is great! (dietpi.com)
submitted 1 day ago* (last edited 8 hours ago) by Teppichbrand@feddit.org to c/selfhosted@lemmy.world

Do you guys know about DietPi? I use it on two Raspberry Pis and just installed it on a Wyse mini-PC, and I think it's really great:

Truly Optimised
DietPi is an extremely lightweight Debian OS, highly optimised for minimal CPU and RAM resource usage, ensuring your SBC always runs at its maximum potential.

Simple interface
DietPi programs use lightweight Whiptail menus. Spend less time staring at the command line, more time enjoying your Pi.

DietPi-Software
Quickly and easily install popular software "ready to run" and optimised for your system. Only the software you need is installed.

DietPi-Config
Quickly and effortlessly customise your device's hardware and software settings for your needs, including network connection and localisation setup.

DietPi-Backup
Quickly and easily backup or restore your DietPi system.

Logging System Choices
You decide how much logging you need. Get a performance boost with DietPi-RAMlog, or use rsyslog and logrotate for log-critical servers.

DietPi-Services Control
Control which installed software has higher or lower priority levels: nice, affinity, policy scheduler and more.

DietPi-Update System
DietPi automatically checks for updates and informs you when they are available. Update instantly, without having to write a new image.

DietPi-Automation
Allows you to completely automate a DietPi installation with no user input, simply by configuring dietpi.txt before powering on.
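
For reference, the automation works by editing dietpi.txt on the image's boot partition before first power-on. A rough sketch of the relevant keys (names from memory; the authoritative list is in the dietpi.txt that ships on the image, so verify before relying on it):

    # dietpi.txt, edited on the boot partition before first boot
    AUTO_SETUP_AUTOMATED=1                # run the entire first-boot setup unattended
    AUTO_SETUP_GLOBAL_PASSWORD=changeme   # initial password for the root/dietpi users
    AUTO_SETUP_SSH_SERVER_INDEX=-2        # -2 selects OpenSSH instead of the default Dropbear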

9

I've recently gotten into self hosting. I have a VPS and a domain name and decided to set up Pangolin as a reverse proxy to my local homelab.

During the installation, I was asked to provide an email address for "generating Let's Encrypt certificates". I don't have a clue what role my email address plays in this, nor what email I should provide for the setup, so I just gave one of my personal email addresses. Everything worked fine and the service was completely set up on the VPS.

However, when logging into the dashboard, I was informed by my browser that the website's certificate is self-signed and that visiting the page may be dangerous. Although I was later able to access the panel with HTTPS enabled, I felt this setup was not okay and decided I would need to fix it.

Unfortunately, I have no idea how certificate issuance works. I tried to search for a solution online, read the docs for Pangolin and Traefik, and rewatched the tutorial through which I set up Pangolin, but they either skip explaining the email thing or go into too much detail without explaining where to start. I also checked my inbox to see if the CA pinged me or something, but to no avail.

I feel like I'm missing something in my setup that was apparent to everybody else. I would really appreciate it if someone could ELI5 what the root cause of this 'email' problem is and how to fix it. I am willing to set up the service all over again or edit the config files if needed; I just need to know what to do.
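
From what I could piece together from the Traefik docs, the email simply ends up in an ACME certificate resolver in Traefik's static configuration, roughly like this (a generic sketch, not Pangolin's actual generated files, so treat the paths and key names as illustrative):

    cat > traefik-example.yml <<'EOF'
    certificatesResolvers:
      letsencrypt:
        acme:
          email: you@example.com           # Let's Encrypt only uses this for expiry/problem notices
          storage: /letsencrypt/acme.json  # where issued certificates get saved
          httpChallenge:
            entryPoint: web                # Let's Encrypt must be able to reach port 80 here
    EOF

My current guess is that the self-signed warning is Traefik serving its built-in fallback certificate because real issuance has not succeeded yet (for example if the domain or ports 80/443 are not reachable from the internet), but I would love confirmation.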

10
11

cross-posted from: https://slrpnk.net/post/24568506

Hi!

I'm supplying a small camp I'm participating in with internet/wifi, so I built an x86 OpenWRT router with an LTE modem... it took forever, but now it's working. (The camp is quite far out in the backcountry for ordinary wifi routers.) So now I thought: what if we could easily share files for... anything via the router, without setting up Samba on everyone's phones or whatever?

So I thought of services like Sharedrop, drop.lol, litterbox.moe, pastebin or whatever, and that it would be super convenient to share files without needing the internet.

There are a lot of self-hosted options available, but which ones will run on the 8GB OpenWRT router I set up? (Should be easy: that's a powerhouse of writeable drive space for a router.)

So: what's the best idea here? I can set up an HTTP server, but I guess an FTP server would work as well. Although it would be perfect if it worked with phones for ad-hoc file sharing (download and upload, preferably with QR-code generation).

I know about stuff like Magic Wormhole, LocalSend or Warp, but all of those are a bit of a hassle for noobs to set up (e.g. opening a firewall, which you shouldn't do if you don't know what you're doing). That's why I was thinking: host it on the router.

You got any ideas what I can run on my potato of a server/beefcake of a router?
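
The dumbest fallback I can think of is pointing OpenWRT's built-in uhttpd at a shared directory, so people can at least download over plain HTTP (a sketch; the uci option names are from memory, so check /etc/config/uhttpd), though that still leaves uploads unsolved:

    # second uhttpd instance serving /srv/share on port 8080
    uci set uhttpd.share=uhttpd
    uci set uhttpd.share.home='/srv/share'
    uci add_list uhttpd.share.listen_http='0.0.0.0:8080'
    uci commit uhttpd
    /etc/init.d/uhttpd restart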

12
13

Cross-posted from: https://programming.dev/post/33674513

Any general suggestions when getting started with headscale?

14

Hey! I have been using Ansible to deploy Docker containers for a few services on my Raspberry Pis for a while now and it's working great, but I want to learn MOAR and I need help...

Recently, I've been considering migrating to bare metal K3S for a few reasons:

  • To learn and actually practice K8S.
  • To have redundancy and to try HA.
  • My RPis are all already running MicroOS, so it kind of makes sense to me to try other SUSE stuff (?)
  • Maybe eventually being able to manage my two separate server locations with a neat K3s + Tailscale setup!

Here is my problem: I don't understand how things are supposed to be done. All the examples I find feel wrong. More specifically:

  • Am I really supposed to have a collection of small yaml files for everything that I apply with kubectl apply -f (the kind of thing sketched after this list)?? It feels wrong and way too "by hand"! Is there a more scripted way to do it? Should I stay with everything in Ansible??
  • I see little to no example on how to deploy the service containers I want (pihole, navidrome, etc.) to a cluster, unlike docker-compose examples that can be found everywhere. Am I looking for the wrong thing?
  • Even the official docs seem broken. Am I really supposed to run a bunch of helm commands (some of which just fail) and fight to get SSL certs just to have Rancher and its dashboard?!
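
To make the first point concrete, this is the kind of thing I keep running into: one hand-written manifest per app, applied one at a time (a rough sketch; Navidrome's real port is 4533, but treat the names and image tag as placeholders):

    cat > navidrome.yaml <<'EOF'
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: navidrome
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: navidrome
      template:
        metadata:
          labels:
            app: navidrome
        spec:
          containers:
            - name: navidrome
              image: deluan/navidrome:latest
              ports:
                - containerPort: 4533
    ---
    apiVersion: v1
    kind: Service
    metadata:
      name: navidrome
    spec:
      selector:
        app: navidrome
      ports:
        - port: 4533
    EOF
    kubectl apply -f navidrome.yaml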

I feel that having a K3S + Traefik + Longhorn + Rancher on MicroOS should be straightforward, but it's really not.

It's very much a noob question, but I really want to understand what I am doing wrong. I'm really looking for advice and especially configuration examples that I could try to copy, use and modify!

Thanks in advance,

Cheers!

15

i’m starting to think it’s the debian base of this container image. it may just be too out of date for my GPU.

i think i'm giving up on this for now.

thanks all!


hey all!

for the life of me, i cannot get VAAPI hardware accelerated encoding to work. i always get this error:

Error: ffmpeg exited with code 234: Device creation failed: -22.

Failed to set value '/dev/dri/renderD128' for option 'vaapi_device': Invalid argument

Error parsing global options: Invalid argument

at ChildProcess.<anonymous> (/app/node_modules/fluent-ffmpeg/lib/processor.js:180:22)
at ChildProcess.emit (node:events:524:28)
at ChildProcess._handle.onexit (node:internal/child_process:293:12)
  • AMD Radeon RX 9060 XT
  • the peertube vaapi transcoding plugin is installed
  • i have mesa-va-drivers and mesa-libgallium installed from bookworm backports.
  • the container is rootful.
  • /dev/dri is mapped
  • the render group id matches between host and container.
  • SELinux is set to allow containers access to devices.

no joy.

vainfo

error: XDG_RUNTIME_DIR is invalid or not set in the environment.

error: can't connect to X server!

libva info: VA-API version 1.17.0

libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/radeonsi_drv_video.so

libva info: Found init function __vaDriverInit_1_17

amdgpu: os_same_file_description couldn't determine if two DRM fds reference the same file description.

If they do, bad things may happen!

libva info: va_openDriver() returns 0

vainfo: VA-API version: 1.17 (libva 2.12.0)

vainfo: Driver version: Mesa Gallium driver 25.0.4-1~bpo12+1 for AMD Radeon Graphics (radeonsi, gfx1200, ACO, DRM 3.63, 6.15.4-1-default)

vainfo: Supported profile and entrypoints
      VAProfileH264ConstrainedBaseline: VAEntrypointVLD
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSlice
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSlice
      VAProfileHEVCMain               : VAEntrypointVLD
      VAProfileHEVCMain               : VAEntrypointEncSlice
      VAProfileHEVCMain10             : VAEntrypointVLD
      VAProfileHEVCMain10             : VAEntrypointEncSlice
      VAProfileJPEGBaseline           : VAEntrypointVLD
      VAProfileVP9Profile0            : VAEntrypointVLD
      VAProfileVP9Profile2            : VAEntrypointVLD
      VAProfileAV1Profile0            : VAEntrypointVLD
      VAProfileAV1Profile0            : VAEntrypointEncSlice
      VAProfileNone                   : VAEntrypointVideoProc

i've also tried updating the packages from trixie and sid, and installing the firmware-linux-nonfree.

i've tried disabling SELinux. i've tried making the container permissive.

no change.
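
for reference, a bare ffmpeg test like this (run inside the container) should reproduce the issue without peertube in the loop; just a sketch, testsrc generates synthetic video so no input file is needed:

    ffmpeg -vaapi_device /dev/dri/renderD128 \
           -f lavfi -i testsrc=duration=5:size=1280x720:rate=30 \
           -vf 'format=nv12,hwupload' \
           -c:v h264_vaapi -f null -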

any help is appreciated! thank you!

16
71
Torrent for books (lemmy.world)
submitted 2 days ago* (last edited 1 day ago) by shanedawkins122@lemmy.world to c/selfhosted@lemmy.world

Looking for book torrents - anything really. I've come across a number of sites from other forums, but I'm not sure if they work or are safe to use: https://annas-archive.org/ and https://x1337x.cc/. Anyone know any more?

17

Figured I'd ask here, as I thought self-hosters would care most about looking after their photos.

What do you do with friends' photos you'd like to keep hold of? Maybe there's a pic on a chat app or they've sent you a link to an album on google photos.

Would you just throw them into your own pile of photos, or do you carefully adjust metadata to indicate who took them? Just use dirs to separate them from your own? Interested in any and all thoughts.
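
For the metadata route, what I had in mind is just stamping the photographer into the EXIF with exiftool, something like this (a sketch; the name is obviously a placeholder):

    # write the EXIF Artist tag; exiftool keeps "_original" backup copies by default
    exiftool -Artist="Alice Example" friends-album/*.jpg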

18

I think a lot of people have heard of OpenAI’s local-friendly Whisper model, but I don’t see enough self-hosters talking about WhisperX, so I’ll hop on the soapbox:

Whisper is extremely good when you have lots of audio with one person talking, but fails hard in a conversational setting with people talking over each other. It’s also hard to sync up transcripts with the original audio.

Enter WhisperX: WhisperX is an improved whisper implementation that automatically tags who is talking, and tags each line of speech with a timestamp.

I’ve found it great for DMing TTRPGs — simply record your session with a conference mic, run a transcript with WhisperX, and pass the output to a long-context LLM for easy session summaries. It’s a great way to avoid slowing down the game by taking notes on minor events and NPCs.

I’ve also used it in a hacky script pipeline to bulk download podcast episodes with yt-dlp, create searchable transcripts, and scrub ads by having an LLM sniff out timestamps to cut with ffmpeg.

Privacy-friendly, modest hardware requirements, and good at what it does. WhisperX, apply directly to the forehead.
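
If you want to try it, the CLI is the easiest route. Roughly (a sketch from memory, so check whisperx --help for the exact flags; diarization needs a free Hugging Face token for the pyannote models):

    pip install whisperx

    # transcribe, align timestamps, and tag speakers; writes transcripts to the output dir
    whisperx session-01.wav --model large-v2 --diarize --hf_token "$HF_TOKEN" --output_dir transcripts/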

19

Background: I've been writing a new media server like Jellyfin or Plex, and I'm thinking about releasing it as an OSS project. It's working really well for me already, so I've started polishing up the install process, writing getting started docs, stuff like that.

I'm interested in how other folks have set up their media libraries. Especially the technical details around how files are encoded and organized.

My media library currently has about 1,100 movies and just shy of 200 TV shows. I've encoded everything as high quality AV1 video with Opus audio, in a WebM container. Subtitles and chapters are in a separate WebVTT file alongside the video. The whole thing is currently about 9TB. With few exceptions, I sourced everything directly from Blu-ray or DVD using MakeMKV. It's organized pretty close to how Jellyfin wants it.
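
For the curious, the encodes are roughly this shape (a sketch; the quality settings vary per title, so don't read the numbers as my exact values):

    # Blu-ray/DVD remux in, AV1 + Opus in a WebM container out
    ffmpeg -i movie-remux.mkv \
        -map 0:v:0 -map 0:a:0 \
        -c:v libsvtav1 -crf 28 -preset 5 \
        -c:a libopus -b:a 160k \
        movie.webm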

What about you?

20

Hi,

I really miss the old version of the website WeatherSpark. It had an absolutely fantastic weather dashboard.

It was an old Google-Finance-like graph that you could scroll to zoom in and out and change the timescale, or drag left and right to shift the date at the center of the graph.

The lines on the graph showed the current temperature, the historical temperature and the predicted temperature.

There were bands around the temperature line indicating the averages and records for the date range on screen.

If there was precipitation, the amounts showed up as another line.

In that one simple graph you could get a sense of the local weather, what it had been and what it would be, this week, this month or 25 years ago, with just your mouse.

With all the weather data being collected by governments and available on public APIs, is there any open-source, self-hosted software with an interactive data-visualization interface as effective as WeatherSpark of old?

Here is what it looked like, all in a single graph with NO page loads!
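
The data side at least seems tractable. Open-Meteo, for example, exposes both forecasts and a multi-decade historical archive over plain HTTP with no API key (a sketch; parameter names as I remember them from their docs):

    # hourly temperature forecast for a coordinate
    curl 'https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&hourly=temperature_2m'

    # the historical archive endpoint serves the same variables going back decades
    curl 'https://archive-api.open-meteo.com/v1/archive?latitude=52.52&longitude=13.41&start_date=2000-01-01&end_date=2000-12-31&hourly=temperature_2m'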

21

Nice big old port scan. Brand new server too, just a few days old, so there is nothing to find. Don't worry, I contacted AWS. Stay safe out there.

22

Hi everyone, I have TrueNAS CORE running on an old desktop at home. For the past year everything has been going great. However, for the past month or so I started to notice that my SMB share for Jellyfin was getting reset to some 6 movies. All of my home videos, music, TV shows, and other movies were gone.

I have a backup drive where I store all this data, so I copied the files back and Jellyfin saw them immediately again... but by the next day all were gone again. At first I thought someone had deleted them by accident, so I checked the user access of my family members: no one except me has access to delete movies, and the logs don't show any media deletion that I wasn't aware of (I found 2 duplicated movies). The server isn't exposed to the internet in any way except for a VPN connection that I keep close tabs on.

I checked the permissions of Jellyfin's container on my TrueNAS server and both UID and GID are set to 568, per the internet's recommendation when I set it up originally. My server is running on two 4TB HDDs configured in a mirror. I checked the health of both drives and both show healthy and without errors. I still suspect that when the server syncs the data, the data from the wrong drive somehow overwrites the data on the drive with the media and deletes it, but I don't know enough about how to check.

Please let me know if you need more information and how to get it, as I am a complete noob when it comes to servers but I am trying to learn.

Thanks y'all

UPDATE: Some have suggested that my Syncthing server could have been the issue; however, turning off the server has not fixed it, and once again files got deleted.

UPDATE 2: I looked at the SMB logs and noticed a bunch of "Unlink" events that seem to run every morning around 7am and go over every one of my media files. I am assuming "Unlink" means delete somehow, but I can't determine what is calling it.

23

Like how on Debian's website you can find their ISOs and other related files in a very simple file browser layout. It looks kind of old, but that's what I want. Do you know of any projects or ways to set something like that up? The modern self-hosted stuff just does not seem simple enough, and both aesthetically and functionally I would like something like what Debian does with their own files.

I also want it to be reliable. For some reason, with both Immich and Nextcloud, a relative of mine was unable to download a lot of photos: on Nextcloud the download would not even start, and on Immich it stopped 30% of the way through. If reliable downloads necessitate a desktop app with its own file-exchange protocol, I would be okay with that too (willing to compromise on the desired aesthetic and minimalist design).

The ideal thing is the thing here: https://cdimage.debian.org/debian-cd/
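
From what I can tell, that Debian page is just the web server's automatic directory index, so maybe the answer is as simple as pointing a plain web server at a folder. With Caddy, for example, it is only a couple of lines (a sketch; domain and path are placeholders), though I am not sure it solves the reliability side:

    cat > Caddyfile <<'EOF'
    files.example.com {
        root * /srv/files
        file_server browse    # "browse" enables the plain directory listing
    }
    EOF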

24

I'm just using the Cosmic Terminal that's part of the Pop!_OS Cosmic Alpha, but I ran into similar issues with Gnome terminal and even with Termius.

Scenario: I'm currently working on leveraging a VPS to act as the gateway to my homelab, so I have one SSH session to my Unraid server and one to the VPS, one in each tab. Obviously each tab's name shows up as the session's username@servername. But I keep getting tripped up and sometimes try to do something from the wrong machine. Once I even failed to realize that the SSH session to one of them had cut out and I was back on my desktop, and it took me an embarrassingly long time to realize why stuff was failing.

So what are y'all using to keep that organized in your workflow? Separate terminal windows instead of tabs? Some shell customizations to make them look different from one another? Or is it just so ingrained in your brain that you never have this problem?
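
By shell customizations I mean something as simple as giving each host its own prompt colour in its ~/.bashrc, so the wrong machine is obvious at a glance (a sketch, assuming bash on both ends):

    # on the VPS: red user@host
    PS1='\[\e[1;31m\]\u@\h\[\e[0m\]:\w\$ '

    # on the Unraid box: green
    PS1='\[\e[1;32m\]\u@\h\[\e[0m\]:\w\$ '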

EDIT: Thanks, everyone! Sounds like a terminal multiplexer is the ticket for me.

25

I'm kind of surprised I've struggled with this for so long. Right now I'm using Nextcloud camera upload, and mostly it works okay, but it shits the bed once in a while without me noticing, and then I need to spend time fixing it, and it's never as simple as turning it off and on again.

I recently tried Syncthing, and while it works, it frequently crashes and gets stuck in a state where it says it's on and working but my destination folder shows as disconnected, the options to restart Syncthing from the side menu are greyed out, and the only way to make it work again is to force-close and reopen it.

I'm running vanilla stock Google Android and TrueNAS. Does anyone have a better solution?
