# Parallel copy
# Because we are copying the current directory, make sure $dst is correct
parallel --bar -j4 cp -avn -- "{}" "$dst" ::: *
# Old idea, much slower
find . -maxdepth 1 -mindepth 1 -print0 | parallel -q0 -j4 cp -avn "{}" "$dst"

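# Optional preview (sketch): prefix cp with echo so parallel prints each copy
# command instead of running it; drop the echo to perform the real copy
parallel -j4 echo cp -avn -- "{}" "$dst" ::: *
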
# Check MP3s in parallel, many arguments per invocation
parallel -X mp3check -Sqe ::: *mp3

# Run commands from a file in parallel
parallel < commands.txt

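# Sketch (commands.txt is a placeholder): each line of the file is one complete
# shell command; add -j to cap how many run at once
parallel -j4 < commands.txt
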
# Using a file of arguments (one per line); :::: reads the arguments from the file
parallel 'some-command {}' :::: list-of-arguments.txt

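# Equivalent form (sketch): -a/--arg-file also reads one argument per line
parallel -a list-of-arguments.txt 'some-command {}'
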
# Parallel equivalent of
# `find $DIRECTORY -type f -exec sha1sum '{}' \; > $DIRECTORY.sha1`
find $DIRECTORY -type f -print0 | parallel -q0 -k sha1sum > $DIRECTORY.sha1

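# Variant (sketch): -X passes many files to each sha1sum invocation, which can
# be faster for directories with many small files; -k keeps the output in order
find $DIRECTORY -type f -print0 | parallel -0 -X -k sha1sum > $DIRECTORY.sha1
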
# Quickly get a hash of a large file: reads in 1G chunks on multiple CPUs and
# outputs 1 combined hash (a hash of the per-chunk hashes, so it will not match
# plain `sha1sum $LARGE_FILE`, but it is consistent for comparing files)
parallel --block=1G --pipepart -a $LARGE_FILE --progress --recend '' -k sha1sum | sha1sum
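# Example use (sketch; file1 and file2 are placeholders): compare two large
# files by their chunked fingerprints
a=$(parallel --block=1G --pipepart -a file1 --recend '' -k sha1sum | sha1sum)
b=$(parallel --block=1G --pipepart -a file2 --recend '' -k sha1sum | sha1sum)
[ "$a" = "$b" ] && echo "files match"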

  • -X: Multiple arguments per job, spread evenly across the jobs, with context replace: the text surrounding {} is repeated for each argument (see the example after this list).

  • -m: Multiple arguments per job, spread evenly across the jobs. Prefer -X.

  • --xargs: Multiple arguments per job, emulating xargs behavior: as many arguments per job as will fit.

  • -n: Limits the number of arguments per job; use with -X, --xargs, or -m.
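
A rough sketch of the difference (echo and the letters a through f are just placeholder arguments):

parallel -X echo pre-{} ::: a b c d e f        # context replace: pre- is repeated for each argument in a job
parallel -m echo pre-{} ::: a b c d e f        # no context replace: the job's arguments are inserted as one block
parallel -n 2 -X echo pre-{} ::: a b c d e f   # at most 2 arguments per job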
