shell bash
David
I have a set of ImageMagick `convert` routines to run on batches of PNG files. It involves these steps:[^1]

```
$ for p in *.png; do convert -fx '(r+g+b)/3' "$p" g"$p"; done

$ for p in g*.png; do convert -level 40%,95% "$p" l"$p"; done

$ for p in l*.png; do convert -brightness-contrast 0x33 "$p" c"$p"; done
```

I'm thinking there must be a way of combining those into a single bash line. I've found ways of running multiple commands with `&&`, but those are for rather simple things, not "looping" through a set of file conversions.

What would be the best (most efficient, safest, etc.) way of combining my three steps into one?

[^1]: If anyone cares, those conversions: (1) convert to greyscale; (2) adjust levels; (3) adjust contrast. This is on page scans that are RGB and dull. This brightens the PNGs considerably, and the background is near enough `#fff` by the time it's all done.
Top Answer
PeterVandivier
Apropos of very little, I would rewrite your sample code in my own idiosyncratic way. I don't know if it makes it "safer" out of the gate. Almost certainly it doesn't make it "more efficient", but I do think it provides some improvements. If you'll bear with me...

You have 3 batch commands (two-and-a-half, really...) that you execute over `all $files in $a_directory`. It seems like perhaps you want the various transforms to execute file-by-file as an atomic action. For example: is it helpful to set garden.png to greyscale if a subsequent file, goodtimes.png, causes an error and exits the routine? You've then failed to adjust the levels of garden.png, and over a batch of many files, how do you know that happened or that you need to fix it?

Instead consider having a single function that performs all-or-nothing edits as needed on files one-by-one. For example - if you have a file `make_greyscale.sh` of the form...

```bash
make_greyscale() {
    local p=$1
    if [ -f "$p" ]
    then
        pushd "$(dirname "$p")" > /dev/null
        p=$(basename "$p")
    else
        echo "File '$p' not found!" 1>&2
        return 1  # 'exit' would kill the calling shell when this file is sourced
    fi

    convert -fx '(r+g+b)/3' "$p" g"$p"

    local first_letter=${p:0:1}

    if [ "$first_letter" = 'p' ]
    then
        convert -level 40%,95% "$p" l"$p"
    fi
    if [ "$first_letter" = 'g' ]
    then
        convert -brightness-contrast 0x33 "$p" c"$p"
    fi
    popd > /dev/null
}
```

...you might then source the file and execute your series-of-commands per-file with something like the following:

```bash
source ./make_greyscale.sh

for p in *.png
do
    make_greyscale "$p"
done
```

---

By isolating your series-of-commands into a per-file function, you allow each file to be transformed as an all-or-nothing (atomic) operation (given other caveats and addenda).

There are _more_ individual operations in the declared function than you had originally. It _does not_ make the transformation faster. It _does_ (in my opinion) make it more human-understandable and more portable.
Answer #2
Jack Douglas
The literal equivalent on one line would be to just separate the commands with semicolons, but I assume you want to stop on error, in which case you can group your commands with curly brackets:

``` bash
$ { for p in *.png; do convert -fx '(r+g+b)/3' "$p" g"$p"; done; } && { for p in g*.png; do convert -level 40%,95% "$p" l"$p"; done; } && { for p in l*.png; do convert -brightness-contrast 0x33 "$p" c"$p"; done; }
```

However, even that probably isn't quite what you want because [for bash loops](https://www.gnu.org/software/bash/manual/html_node/Looping-Constructs.html):

> The return status is the exit status of the last command that executes.
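
A minimal loop (with `false` standing in for a failing `convert`) shows the effect:

```shell
# The loop's overall exit status is that of the last command executed,
# so the failure in the middle iteration is silently discarded.
for i in 1 2 3; do
    if [ "$i" = 2 ]; then false; else true; fi
done
echo "exit status: $?"   # prints "exit status: 0" despite the failure
```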

It seems likely you want something more like this:

``` bash
for p in *.png; do convert -fx '(r+g+b)/3' "$p" g"$p" && convert -level 40%,95% g"$p" lg"$p" && convert -brightness-contrast 0x33 lg"$p" clg"$p"; done;
```
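
As an aside, `convert` applies its operators in sequence within a single invocation, so the three steps can also be chained per file without writing out the intermediate `g*`/`lg*` images. A sketch, untested on the actual scans:

```shell
# Sketch: chain all three operators in one convert call per file,
# skipping the intermediate g*/lg* images entirely.
for p in *.png; do
    convert "$p" -fx '(r+g+b)/3' -level 40%,95% -brightness-contrast 0x33 clg"$p" \
        || echo "failed: $p" >&2
done
```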
