I see many posts asking about what other lemmings are hosting, but I’m curious about your backups.

I’m using duplicity myself, but I’m considering switching to borgbackup when 2.0 is stable. I’ve had some problems with duplicity: mainly, the initial sync took incredibly long, and at one point a few directories got corrupted (gpg could no longer decrypt them).

I run a daily incremental backup and send the encrypted diffs to a cloud storage box. I also use SyncThing to share some files between my phone and other devices, so those get picked up by duplicity on those devices.
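For reference, a daily run like that can be driven from cron. This is only a sketch; the GPG key ID, paths, and storage-box URL are placeholders, not my actual setup:

```shell
# Example crontab entry (key ID, paths, and host are made up):
# run daily at 03:00, encrypt with a GPG key, force a new full backup monthly
0 3 * * * duplicity --encrypt-key ABCD1234 --full-if-older-than 1M \
    /home/user rsync://user@storagebox.example.com/backups/home
```

duplicity figures out on its own whether an incremental chain exists, so the same line handles both the initial full and the daily diffs.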

@gadgetzombie@feddit.nl

For the 14 PCs (~8 regularly used) in my house, I run daily backups with Synology Active Backup to a spinning-disk DiskStation, plus file sync of the User directory using Synology Drive to an SSD DiskStation (also backed up to the HDD DS). That data is all deduplicated. Additionally, I’ve got a few custom scripts that keep programs up to date using Chocolatey and winget, and then export the list of installed programs, ready to be reinstalled on a new machine.

This allows me to either do full device restores or clean installs where the reinstall of the relevant programs is handled automatically and then it’s just setting up sync/backup/office activation and we’re off to the races.
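A script along those lines might update everything and then dump the installed-package lists. This is a PowerShell-style sketch based on the Chocolatey/winget commands, not the poster’s actual scripts, and the output paths are invented:

```shell
# sketch only: update all packages, then export the installed-program lists
choco upgrade all -y
winget upgrade --all --silent
# export lists ready for reinstall on a new machine (paths are examples)
choco export C:\backup\packages.config
winget export -o C:\backup\winget-packages.json
```

On a fresh machine, `choco install packages.config` and `winget import` can then replay those lists.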

Backing up to backblaze with duplicacy

ptman

I’m moving from rsync+duplicity+borg towards bupstash

@kabouterke@lemmy.world

In short: crontab, rsync, a local and a remote Raspberry Pi, and cryptfs on USB sticks.
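As a sketch, the moving parts of a setup like that can be as small as two crontab lines (hostnames and paths here are invented for illustration):

```shell
# Example crontab: nightly rsync to a local Pi and an off-site Pi,
# each writing onto an already-mounted encrypted USB stick
30 2 * * * rsync -a --delete /srv/data/ backup@pi-local:/mnt/usbstick/backup/
30 3 * * * rsync -a --delete /srv/data/ backup@pi-remote:/mnt/usbstick/backup/
```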

Amius

Holy crap. Duplicity is what I’ve been missing my entire life. Thank you for this.

💡dim

All Nextcloud data gets mirrored with rsync to a second drive, so it’s in three places: the original source and twice on the server.

Databases are backed up nightly by Webmin to the second drive.

Then installations, databases, etc. are sent to Backblaze storage with Duplicati.
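The mirror step in a setup like this can be a single cron-driven rsync; the paths below are examples only:

```shell
# mirror the Nextcloud data directory to the second drive
# -aAX preserves permissions, ACLs, and xattrs; --delete keeps an exact mirror
rsync -aAX --delete /srv/nextcloud/data/ /mnt/second-drive/nextcloud-data/
```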

@cwiggs@lemmy.world

My important data is backed up via Synology DSM Hyper backup to:

  • Local external HDD attached via USB.
  • Remote to Backblaze (costs about $1/month for ~100 GB of data)

I also have Proxmox Backup Server back up all the VMs/CTs every few hours to the same external HDD used above. These backups aren’t crucial; they would just be helpful for rebuilding if something went down.


conrad82

I use Syncthing to sync files between my phone, PC, and server.

The server runs Proxmox, with Proxmox Backup Server in a VM. A Raspberry Pi pulls the backups to a USB SSD and also rclones them to Backblaze.

Syncthing is nice. I don’t back up my PC, as that’s handled by the server. Reinstalling the PC requires almost no preparation; just set up Syncthing again.
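The Pi’s pull-and-upload step might look roughly like this; the remote name, hosts, and paths are assumptions:

```shell
# pull the PBS datastore to the local USB SSD, then mirror it to Backblaze B2
rsync -a pbs@proxmox.lan:/mnt/datastore/backups/ /mnt/usb-ssd/pbs/
rclone sync /mnt/usb-ssd/pbs/ b2:example-bucket/pbs --transfers 4
```

Pulling from the Pi (rather than pushing from the server) has the nice property that a compromised server can’t overwrite the Pi’s copies.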

For PCs: daily incremental backups to local storage, daily syncs to my main unRAID server, and weekly off-site copies to a Raspberry Pi with a large external HDD running at a family member’s place. The unRAID server itself has its config backed up, and all the local docker stores also go to the off-site Pi. The most important stuff (pictures, recovery phrases, etc.) is further backed up in Google Drive.

@craftymansamcf@lemmy.world

For smaller backups (<10 GB each) I run a three-phase approach:

  • rsync to a local folder /srv/backup/<service>
  • rsync that to a remote nas
  • rclone that to a b2 bucket

These scripts run from cron, and I log the output to a file using the --log-file option of rsync/rclone so I can do spot checks of the results.
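Put together, one service’s script might look like this sketch (the service name, NAS host, and bucket are placeholders):

```shell
#!/bin/sh
# three-phase backup for one service; every name below is an example value
SERVICE=myservice

# phase 1: rsync to a local backup folder
rsync -a --log-file="/var/log/backup/${SERVICE}-local.log" \
    "/opt/${SERVICE}/data/" "/srv/backup/${SERVICE}/"

# phase 2: rsync that to the remote NAS
rsync -a --log-file="/var/log/backup/${SERVICE}-nas.log" \
    "/srv/backup/${SERVICE}/" "backup@nas.lan:/volume1/backup/${SERVICE}/"

# phase 3: rclone that to a B2 bucket
rclone sync --log-file="/var/log/backup/${SERVICE}-b2.log" \
    "/srv/backup/${SERVICE}/" "b2:example-bucket/${SERVICE}"
```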

This way I have access to the data locally if the network is down, remotely on a different networked machine for any other device that can browse it, and finally an offsite cloud backup.

Doing this setup manually with rsync/rclone has been important for building the domain knowledge to think about the overall process: scheduling multiple backups at different times overnight so they don’t overload the drive and network, ensuring versioning is kept for files that might require it, and making sure I’m not using too many API calls on B2.

For large media backups (>200 GB) I only use the rclone script, set to run for 3 hours every night after all the more important backups have finished. It’s not important that it gets done ASAP; a steady drip of any changes up to B2 matters more.

My next step is to figure out a process to email the backup logs every so often, or to look into a full application with better error-catching capabilities.

For any service/process that has a backup this way, I try to document a spot-testing process to confirm it works every 6 months:

  • For my important documents, I add an entry to my KeePass DB, run the backup, navigate to the cloud service, download the new version of the DB, and confirm the recently added entry is present.
  • For an application, I run through a restore process and confirm that certain config or data is present in the newly deployed app. This also forces me to keep a fast restore script I can follow for any app, since I need to do this every 6 months.
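The document spot check boils down to "round-trip a known change and compare". Here is a toy, self-contained version of that idea, using plain coreutils as a stand-in for the real backup/restore and cloud steps:

```shell
# toy spot check: make a change, "back it up", restore it, verify byte-for-byte
mkdir -p /tmp/spotcheck/src /tmp/spotcheck/restore
echo "new keepass entry $(date +%s)" >> /tmp/spotcheck/src/db.kdbx  # the change
cp /tmp/spotcheck/src/db.kdbx /tmp/spotcheck/restore/db.kdbx  # backup+restore stand-in
cmp /tmp/spotcheck/src/db.kdbx /tmp/spotcheck/restore/db.kdbx \
    && echo "spot check passed"
```

In the real workflow, the cp line is replaced by the actual backup run plus a download from the cloud service.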
idunnololz

Are cyanide tablets a backup strategy?

poVoq

btrfs and btrbk work very well, tutorial: https://mutschler.dev/linux/fedora-btrfs-35/
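A minimal btrbk.conf for that kind of setup might look like the sketch below; the pool and target paths are assumptions, and the retention values are just examples:

```
# /etc/btrbk/btrbk.conf : example values only
timestamp_format        long
snapshot_preserve_min   2d
snapshot_preserve       14d
target_preserve_min     no
target_preserve         20d 10w

volume /mnt/btr_pool
  snapshot_dir btrbk_snapshots
  subvolume home
    target send-receive /mnt/backup/btrbk
```

Run from cron, btrbk then takes cheap btrfs snapshots locally and ships them to the target with btrfs send/receive.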

Bdking158

Can anyone ELI5 or link a decent reference? I’m pretty new to self-hosting, and now that I’ve finally got most of my services running the way I want, I live in constant fear of my system crashing.
