# Parallel copy
# Because we are copying the current directory, make sure $dst is correct
# (the echo turns this into a dry run that only prints each cp command; remove it to actually copy)
find . -maxdepth 1 -mindepth 1 -print0 | parallel -q0 -j4 echo cp -avn {} $dst

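# Alternatively, parallel's --dry-run flag prints each job without running it
find . -maxdepth 1 -mindepth 1 -print0 | parallel -q0 -j4 --dry-run cp -avn {} $dst
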
# Check MP3s in parallel, many arguments per invocation
parallel -m mp3check -Sqe ::: *mp3

# Run the commands listed in a file in parallel
parallel < commands.txt

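# For long runs, --joblog records each job's runtime and exit status; adding --resume
# lets an interrupted run be restarted, skipping jobs already in the log
# (jobs.log and -j4 are example values)
parallel -j4 --joblog jobs.log --resume < commands.txt
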
# Parallel equivalent of
# `find $DIRECTORY -type f -exec sha1sum '{}' \; > $DIRECTORY.sha1`
find $DIRECTORY -type f -print0 | parallel -q0 -k sha1sum > $DIRECTORY.sha1

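# The checksum file can be verified later with sha1sum -c; run it from the same
# directory the checksums were generated in so the recorded paths resolve
sha1sum -c $DIRECTORY.sha1
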
# Quickly get a hash of a large file: read in 1G chunks on multiple CPUs, output one hash
parallel --block=1G --pipepart -a $LARGE_FILE --progress --recend '' -k sha1sum | sha1sum

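# Note: the result is a hash of the per-chunk hashes, not what `sha1sum $LARGE_FILE`
# would print, so it is only comparable to hashes computed the same way (same --block size).
# A sketch for comparing two large files this way; hash_chunked and the file names
# are made-up example names:
hash_chunked() {
  parallel --block=1G --pipepart -a "$1" --recend '' -k sha1sum | sha1sum
}
[ "$(hash_chunked big_a.img)" = "$(hash_chunked big_b.img)" ] && echo "files match"
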
## useful options

* `-X`: Multiple arguments per job, distributed evenly (see the example below)
* `--xargs`: Emulates xargs behavior; as many arguments per job as possible
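
For example, with ten small arguments and two job slots (`-j2` is just an illustrative value), `-X` spreads the arguments across the slots while `--xargs` packs them onto as few command lines as possible:

seq 10 | parallel -j2 -X echo     # typically two echo calls, about five arguments each
seq 10 | parallel --xargs echo    # a single echo call with all ten arguments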