> Boyd Stephen Smith Jr. wrote:
>> In <[🔎] 4A01AC7B.firstname.lastname@example.org>, Barclay, Daniel wrote:
>>> I have some scripts that I run both manually and as cron jobs. The
>>> scripts generate stdout/stderr output reporting what they're doing.
>>> I want to see the output when I run the scripts manually. However, when
>>> the scripts are run by cron, normally I don't want cron to e-mail me
>>> bulky output for each run--I want the output (the full output) only when
>>> the output is different than expected (e.g., if something has gone wrong
>>> or has changed, which I want to notice).
>> For maximum utility, in particular use with cron, your scripts should
>> follow two Unix philosophies: (1) If everything is going as expected,
>> no output is required. (2) If the task was not completed, exit with a
>> non-zero value.
>> So, my first advice is to fix your scripts so they act like Unix tools.
> That's what I'm trying to do by wrapping them in a script that does what I
> described--that suppresses all output when output from the underlying
> script is as expected, but which generates output (all output from the
> underlying script), and possibly returns a different exit code, when the
> output of the underlying script isn't quite as expected.
Hmm. It looks like there's another reason to fix the scripts _internally_
to get them to work the Unix/Linux/cron way:
I was going to write one general wrapper script, have it take the command
(other script) to run and the name of an expected-patterns file, and call
that wrapper script in crontab lines.
Of course, that means that the subject line from cron would contain the
very long command invoking the general wrapper script, and not just the
specific underlying script command.
So it looks like even if I implement the output suppression with crude
output-string filtering, it should still be done inside each script
(as recommended above).
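For what it's worth, a minimal sketch of that kind of internal check might
look like the following. All names here are hypothetical (the pattern file
uses egrep -f syntax); it's a sketch of the idea, not anyone's actual script:

```shell
#!/bin/sh
# Sketch: run a command, discard its output if every line matches a
# pattern in an "expected patterns" file; otherwise print the full
# output and return non-zero, so cron mails it.

run_quietly() {  # usage: run_quietly expected.grep command [args...]
    patterns=$1; shift
    out=$(mktemp) || return 1

    "$@" >"$out" 2>&1
    status=$?

    # A non-zero exit, or any output line NOT matching an expected
    # pattern, makes the run "interesting": show everything.
    if [ "$status" -ne 0 ] || grep -Evq -f "$patterns" "$out"; then
        cat "$out"
        rm -f "$out"
        return 1
    fi
    rm -f "$out"
    return 0
}
```

Called at the top of each script (or wrapped around its main body), this
keeps cron silent on expected runs and mails the full output otherwise.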
(Plain text sometimes corrupted to HTML "courtesy" of Microsoft Exchange.)