Michael wrote:
On Thursday 5 September 2024 11:53:16 BST Dale wrote:

I made my backups last weekend.  I'm sure it was working fine then. 
After all, it would have failed to compile packages if it was bad.  I'm
thinking about checking against that copy like you mentioned but I have
other files I've added since then.  I figure if I remove the delete
option, that will solve that.  It can't compare the new files, but it can leave them be. 
Use rsync with:

 --checksum

and

 --dry-run 

Then it will compare files in situ without doing anything else.

If you have a directory or only a few files it is easy and quick to run.

You can also run find to identify which files were changed during the period 
you were running with the dodgy RAM.  Thankfully you didn't run for too long 
before you spotted it.
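That find pass might look like this (GNU find assumed; the path and the two dates are placeholders for wherever your data lives and the window the dodgy RAM was installed):

```shell
# Minimal sketch: list regular files modified during the period the bad
# RAM was in use.  Path and dates are hypothetical -- adjust both.
find /home/dale/data -type f \
     -newermt '2024-09-01' ! -newermt '2024-09-05'
```

Anything that falls outside that window was last written with good RAM, which shrinks the set of files worth checksumming.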


I have just shy of 45,000 files in 780 directories or so, and almost 6,000 more in another directory.  Some files are small, some are several GBs.  Thing is, the backups all go from a single parent directory, if you will.  Plus, I'd want to compare them all anyway, just to be sure.

I also went back and got QB to do a manual file test.  It seems to be doing better.  There's over 4,000 torrents, some 32TB of data, so I think it's going to take a while.  o_^  As it is, I set the speed to tiny amounts until I get this sorted.  Don't want to accidentally share a bad file. 

Dale

:-)  :-) 

P.S.  My trees need some rain today.  It's getting very dry, so I've been watering some trees.  My Swamp Chestnut trees are massive.  I'd hate to lose those things.  Likely 100 years old, according to my tree guru.  In the fall, I wear a construction helmet; those things hurt when they fall and hit my head.