# Parallel copy
# Because we are copying the current directory, make sure $dst is correct
parallel --bar -j4 cp -avn -- "{}" "$dst" ::: *
# Old idea, much slower (the leading `echo` only prints the commands; remove it to copy)
find . -maxdepth 1 -mindepth 1 -print0 | parallel -q0 -j4 echo cp -avn "{}" "$dst"

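# Aside: GNU parallel's --dry-run likewise previews the generated commands
# without running them, a built-in alternative to the `echo` trick above
parallel --dry-run -j4 cp -avn -- "{}" "$dst" ::: *
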
# Check MP3s in parallel, many arguments per invocation (see `-X` under useful options below)
parallel -X mp3check -Sqe ::: *.mp3

# Run the commands listed in a file in parallel, one command per line
parallel < commands.txt

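# commands.txt might contain, e.g. (hypothetical contents):
#   gzip -9 app1.log
#   gzip -9 app2.log
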
# Using a file of arguments (one per line); `::::` reads arguments from a file
# (plain `:::` would pass the literal string "list-of-arguments.txt" as the only argument)
parallel 'some-command {}' :::: list-of-arguments.txt
# Equivalent: parallel -a list-of-arguments.txt 'some-command {}'

# Parallel equivalent of
# `find $DIRECTORY -type f -exec sha1sum '{}' \; > $DIRECTORY.sha1`
# (-k keeps the output in input order; -q0 handles filenames with special characters)
find "$DIRECTORY" -type f -print0 | parallel -q0 -k sha1sum > "$DIRECTORY.sha1"

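# Verify later with coreutils sha1sum (paths are checked exactly as stored in the list)
sha1sum -c "$DIRECTORY.sha1"
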
# Quickly get one combined hash of a large file: reads 1G chunks on multiple CPUs,
# hashes each chunk, and hashes the ordered chunk hashes into a single value.
# Note: this is a hash of per-chunk hashes, not the same as `sha1sum $LARGE_FILE`.
parallel --block=1G --pipepart -a "$LARGE_FILE" --progress --recend '' -k sha1sum | sha1sum
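
# Hypothetical helper: compare two large files by their chunked hashes
# (both invocations must use the same --block size for results to be comparable)
chunked_sha1() { parallel --block=1G --pipepart -a "$1" --recend '' -k sha1sum | sha1sum; }
[ "$(chunked_sha1 fileA)" = "$(chunked_sha1 fileB)" ] && echo "chunked hashes match"
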
## useful options

* `-X`: Multiple arguments per job, spread evenly across jobs, with context replace applied to `{}` (demo below).
* `-m`: Multiple arguments per job, spread evenly across jobs; prefer `-X`.
* `--xargs`: Multiple arguments per job, emulating xargs behavior: as many arguments per job as possible.
* `-n`: Limits the number of arguments per job; use with `-X`, `--xargs`, or `-m`.
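
A quick sketch of the difference between `-X` and `-m` (`-n3` caps each job at three arguments; expected output shown in comments):

parallel -k -X -n3 echo pre-{} ::: a b c d e f
# -> pre-a pre-b pre-c
#    pre-d pre-e pre-f
parallel -k -m -n3 echo pre-{} ::: a b c d e f
# -> pre-a b c
#    pre-d e f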