On Wed, Sep 20, 2023 at 10:57:00PM +0100, Victor Ivanov wrote:
> On Wed, 20 Sept 2023 at 22:29, Grant Edwards wrote:
> >
> > That depends on how long it takes me to decide on tar vs. rsync and
> > what the appropriate options are.
>
> I've done this a number of times for various reasons over the last 1-2
> years, most recently a few months ago due to hard drive swap, and I
> find tar works just fine:
>
> $ tar -cpf /path/to/backup.tar --xattrs --xattrs-include='*.*' -C / .

Does that stop at file system boundaries (because you tar up '/')? I
think it must, otherwise you wouldn't use it that way.

When copying a root file system, out of habit I first bind-mount it in a
subdirectory and tar/rsync from there instead. That also makes files
visible which might otherwise be hidden under an active mount. It isn't
necessary if you do it from a live system, but then you wouldn't be
tarring up / in the first place.

> Likewise to extract, but make sure "--xattrs" is present
>
> Provided backup space isn't an issue, I wouldn't bother with
> compression. It could be a lot quicker too depending on the size of
> your root partition.

Or not, depending on the speed of the backup device. ;-)

LZO compression (or zstd with a low setting) has negligible CPU cost,
but can lower the file size quite nicely, especially with large
binaries or debug files.

--
Grüße | Greetings | Salut | Qapla'
Please do not share anything from, with or about me on any social network.
Do you steal taglines, too?
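
For reference: GNU tar crosses file system boundaries by default, so a
plain "-C / ." descends into whatever is mounted under /; the
--one-file-system option keeps it on the root file system. A minimal
sketch of that option and of the bind-mount habit described above, with
/mnt/rootsnap as a hypothetical scratch mount point:

# stay on the root file system while archiving
$ tar -cpf /path/to/backup.tar --xattrs --xattrs-include='*.*' \
      --one-file-system -C / .

# or: archive from a non-recursive bind mount of /
$ mount --bind / /mnt/rootsnap
$ tar -cpf /path/to/backup.tar --xattrs --xattrs-include='*.*' \
      -C /mnt/rootsnap .
$ umount /mnt/rootsnap

A plain (non-recursive) bind mount of / exposes only the root file
system itself, so anything mounted on top of it (/proc, /dev, a
separate /home, ...) stays out of the archive without extra tar
options, and files hidden under those mounts become visible.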
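
A sketch of the low-effort compression variant, assuming GNU tar built
with zstd support (the --zstd switch, tar >= 1.31); zstd's default
level is already on the fast end, so this roughly matches the "low
setting" mentioned above:

# create a zstd-compressed archive
$ tar --zstd -cpf /path/to/backup.tar.zst --xattrs --xattrs-include='*.*' -C / .

# extract it again, keeping permissions and xattrs
$ tar --zstd -xpf /path/to/backup.tar.zst --xattrs -C /path/to/restore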