[GRASS-dev] Pipeline efficiency in bash shell scripts
Patton, Eric
epatton at nrcan.gc.ca
Wed Oct 10 13:11:02 EDT 2007
I'll keep this short and provide more details if people need them. In a for loop within a bash script, where several commands perform text manipulation, is it more efficient to pipe the commands together in one long pipeline (Case 1), or to dump the output of one program into a text file and redirect that file into the next command (Case 2)?
Case 1
======
for FILES in *.extension ; do
    mbnavlist -I"$FILES" -OJXY -N0 | awk '{lots of awk text manipulation goes here}' | v.in.ascii
done
Case 2
======
for FILES in *.extension ; do
    mbnavlist -I"$FILES" -OJXY -N0 > TMP.txt
    awk '{slicing and dicing commands go here}' < TMP.txt | v.in.ascii
done

(or have awk write to a second file, TMP2.txt, followed by v.in.ascii < TMP2.txt)
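If the temporary-file route wins out, here is a minimal sketch of Case 2 using mktemp (same placeholder commands as above; v.in.ascii options omitted as in the examples), so each pass gets its own scratch file and cleans up after itself:

for FILES in *.extension ; do
    TMP=$(mktemp) || exit 1                  # per-iteration scratch file
    mbnavlist -I"$FILES" -OJXY -N0 > "$TMP"  # dump navigation records to the scratch file
    awk '{slicing and dicing commands go here}' < "$TMP" | v.in.ascii
    rm -f "$TMP"                             # remove the scratch file before the next pass
done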
In both cases, the for loop is calling the mbnavlist program (from the free and open source bathymetry processing software MBTools) and awk together thousands of times. I wasn't sure whether there are any general benefits to piping versus writing out to a file and then redirecting input.
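In case measuring is easier than guessing, a rough sketch for timing both approaches on a small, representative subset of files before committing to the full run (the sample_*.extension glob and the awk program '{print}' are placeholders):

# bash's built-in time keyword can wrap a whole loop;
# compare wall-clock times for the piped and file-based versions.
time for FILES in sample_*.extension ; do
    mbnavlist -I"$FILES" -OJXY -N0 | awk '{print}' | v.in.ascii
done

time for FILES in sample_*.extension ; do
    mbnavlist -I"$FILES" -OJXY -N0 > TMP.txt
    awk '{print}' < TMP.txt | v.in.ascii
done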
Any suggestions?
~ Eric.