CLI: parallel compression FTW
Posted by gregster on 28 Jul 2019 in Announcements
I have a directory with 350,000 XML files and I want to archive it in a compressed format. I tried the usual methods, but they're slooooww - apparently because the standard compressors (gzip, bzip2) only use a single CPU core. I found a thread on superuser.com that recommends pigz (a parallel implementation of gzip). It was so much faster I assumed it had cocked up somewhere, but apparently not. It turned my 4 GB dir into a gzipped tar file of less than 1 GB in just a few minutes. Very impressive.
You run it like this:
tar -c --use-compress-program=pigz -f tar.file dir_to_zip
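A couple of related invocations I find handy - a minimal sketch, assuming GNU tar and pigz on your PATH, and reusing the placeholder names tar.file and dir_to_zip from the command above:

# Compress by piping tar into pigz directly; -p 8 caps pigz at 8 threads
# (by default it uses every available core).
tar -cf - dir_to_zip | pigz -p 8 > tar.file

# Extract the archive again, letting pigz handle the decompression
# (tar passes it the -d flag automatically).
tar -x --use-compress-program=pigz -f tar.file

The pipe form is also useful on older tars that won't accept extra options inside --use-compress-program.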