On 02/06/2024 20.24, Ulrich Mueller wrote:
> Installing another file just for the sake of avoiding "docompress -x"
> doesn't solve the problem but makes it worse, IMHO. Rather don't
> compress README.gentoo then.

Both are fine with me.

That said, many filesystems support inline data. If I am not mistaken, it is even enabled by default for xfs (which we recommend in the handbook) and btrfs. Also, some README.gentoo files become suitable for inlining after compression (btrfs' limit is 2048 bytes). Considering this, the 4-byte hash file is superior to excluding README.gentoo from compression under the right circumstances. And I could imagine that the circumstances are right for many of our users.

> (Also, I still don't understand the argument about different compress
> programs. That's not likely to happen very often. You could go for best
> effort there, and if it fails, consider the files as different. There's
> no need for exact science when the only thing that can happen is
> additional output.)

I fear this would open a can of worms: Which compression algorithms do you support? What if someone wants to add another compression algorithm? What if there are multiple candidates that can be used to decompress the file? How large do we want to allow the opportunistic decompression function to grow?

Last but not least, opportunistic approaches are fine if you have no other choice. But here we have the option to make it reliable.

- Flow
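
P.S. As a rough illustration of the inlining point above (the 2048-byte limit is btrfs' inline-data limit as mentioned in this mail; gzip is just a stand-in for whatever the package manager actually compresses with, and the file contents here are made up):

```shell
# Sketch: would this README fit btrfs' inline-data limit after compression?
printf 'Example README.gentoo contents for illustration only.\n' > README.gentoo

# Compressed size in bytes (gzip as a stand-in compressor).
size=$(gzip -c README.gentoo | wc -c)

if [ "$size" -le 2048 ]; then
    echo "fits inline after compression ($size bytes)"
else
    echo "too large to inline ($size bytes)"
fi
```

For small README.gentoo files the compressed size easily stays under the limit, so compression can turn a file that needs its own data extent into one btrfs stores inline with the inode.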