Back up your git service/repositories to offline storage.
As others have said, a traditional off-site backup will work. How do you plan to perform a restore, though? If you need the self-hosted source repo, it won't be available until the infrastructure is stood up, which creates a circular dependency.
I'm still in the early stages of exploring this, too. My solution is to keep a local filesystem git clone of the "main" repo and drive it with a Taskfile that builds a docker image, from which the ansible infrastructure build runs. It is somewhat manual, but I have performed a full rebuild a few times after some Big Mistakes.
You pretty much got it. I need a quick way to restore the repo and ideally have git do a self-backup. Seems like a cheap VPS may be the way to go.
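If you go the VPS route, git can more or less back itself up with a bare mirror and a cron push. A sketch, where "backup-vps" and the paths are placeholders:

```sh
# Hypothetical setup: keep a bare mirror of each repo on a cheap VPS.

# One-time, on the VPS:
ssh backup-vps 'git init --bare /srv/git-backup/homelab.git'

# On the git host, add the VPS as an extra remote:
cd /srv/git/homelab.git
git remote add offsite backup-vps:/srv/git-backup/homelab.git

# A nightly cron entry keeps it in sync:
# 0 3 * * * cd /srv/git/homelab.git && git push --mirror offsite
git push --mirror offsite
```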
For my own curiosity, how do you perform a build? Is it all done in pipelines, kicked off on change? Do you execute the whole infra build each time you release an update?
Honestly, I just run it from the CLI myself.
I’ve wasted so much time fighting with CI and automation that when I migrated to Forgejo I didn’t bother setting it up again.
Same. I have spent way more time troubleshooting a pipeline than it saves. I like the idea of automation but laziness prevails.
Borgbackup in addition to git. Since there's probably not much data, any cheap VPS could act as storage.
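Something like this is all it takes; the host, paths, and retention numbers below are placeholders, not a finished setup:

```sh
# One-time: create an encrypted Borg repository on the VPS.
borg init --encryption=repokey ssh://borg@backup-vps/./repos

# Nightly: archive the git data directory (plus configs, if you like).
borg create --stats --compression zstd \
  ssh://borg@backup-vps/./repos::git-{now:%Y-%m-%d} \
  /srv/git /etc/forgejo

# Keep the repository from growing forever.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
  ssh://borg@backup-vps/./repos
```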
I would configure a Backblaze B2 bucket and copy your repos and configs there; it should be dirt cheap compared to a VPS and very durable.
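For example with rclone, assuming you've already set up a B2 remote via `rclone config` (the remote name "b2" and the bucket name here are just examples):

```sh
# Push bare repos and config files to the bucket.
rclone sync /srv/git b2:my-git-backups/git --fast-list
rclone sync /etc/forgejo b2:my-git-backups/config

# Spot-check what actually landed in the bucket.
rclone ls b2:my-git-backups | head
```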
I would set aside a dedicated device that acts as a sort of "provisioner" and admin node. It can be something like a Raspberry Pi or a desktop computer.
From a backup perspective I would evaluate risk vs. cost/effort. If you lost your home, would it really matter that you lost some config files?
What I did is set up a NAS at my parents' house, which I can log into as well, for near-zero-cost offsite backups.
And at home I have a couple of local drives with Borg backups.
My parents have a NAS! Maybe I'll set up Tailscale and send it over there…
Although they live 3 streets away from me, so I worry it's not remote enough in case of flood etc.
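If it helps, once both ends are on the same tailnet the NAS is just another SSH host, so it could be as simple as this (the machine name and destination path are made up):

```sh
# Hypothetical: push the git data to the parents' NAS over Tailscale.
rsync -az --delete /srv/git/ parents-nas:/volume1/backups/git/
```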
Codeberg, and make sure you don't leak secrets. Or back it up at your buddy's house on his homelab.
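Before pushing anywhere public it's worth scanning the history for secrets; gitleaks is one option. The repo path and Codeberg URL below are examples only:

```sh
# Scan the full history for leaked credentials.
gitleaks detect --source ~/src/homelab

# Then add Codeberg as a mirror target and push everything.
git -C ~/src/homelab remote add codeberg git@codeberg.org:yourname/homelab.git
git -C ~/src/homelab push --mirror codeberg
```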