Daniel Shahaf wrote:
> Julian Foad wrote on Mon, Dec 10, 2018 at 18:23:00 +0000:
> > Robustness: teach it to only accumulate new entries, and not to wipe the file when there's a network glitch.
>
> That's easy enough: just write 'set -e' or use double ampersands.
I have done that, but I think the script will still generate bad output if a 'curl' command fails; see the TODO list below.
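For context, 'set -e' (and '&&' chaining) only sees a pipeline's overall exit status, which in a POSIX shell is the status of the last command in the pipeline. A minimal illustration, with a placeholder URL and perl filter standing in for the real ones:

  set -e
  # This pipeline exits 0 whenever perl succeeds, even if curl fails,
  # so 'set -e' does not stop the script and bad output can be written.
  curl --fail "$url" | perl -pe 's/foo/bar/'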
> Another nice to have improvement would be making the 'svn ci' invocation
> silent — by adding --quiet, or by running it under chronic(1) (from package
> 'moreutils') — so the people on the crontab's MAILTO list don't get emails for
> successful commits.
Added '-q', etc.
I also made it:
* not commit if the only change is in the "# Generated ... <date>" line.
* search only in the "publish" subtree, so it doesn't pick up the example in its own script.
New version:
# Update our Haxx-URL-to-Message-Id map (a manual cron entry, for now)
0 4 * * * fn=publish/.message-ids.tsv; cd ~/src/svn/site && svn up -q && tools/haxx-url-to-message-id.sh publish > $fn.tmp && if diff -q -I'^# Generated' $fn $fn.tmp > /dev/null; then rm $fn.tmp; else mv $fn.tmp $fn && svn ci -q -m "* $fn: Automatically regenerated" $fn; fi
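For readability, here is the same one-liner expanded into script form (the crontab entry itself has to stay on a single line):

  #!/bin/sh
  # Same logic as the crontab entry above, expanded for readability.
  fn=publish/.message-ids.tsv
  cd ~/src/svn/site &&
    svn up -q &&
    tools/haxx-url-to-message-id.sh publish > $fn.tmp &&
    if diff -q -I'^# Generated' $fn $fn.tmp > /dev/null; then
      # Only the "# Generated" timestamp changed: nothing worth committing.
      rm $fn.tmp
    else
      mv $fn.tmp $fn &&
        svn ci -q -m "* $fn: Automatically regenerated" $fn
    fi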
TODO:
* only add new entries (actually it happens to work this way already, because it finds its own previous output file as part of the scan);
* only query 'haxx.se' for new entries, not all the existing entries again each time;
* the 'curl | perl' part of the script needs a way to fail if the 'curl' fails; Bash has 'set -o pipefail' for this (a sketch follows).
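A minimal sketch of that last point, again with a placeholder URL and perl filter in place of the script's real ones:

  #!/bin/bash
  set -e -o pipefail  # pipefail: a failing 'curl' now fails the whole pipeline
  # '--fail' makes curl exit non-zero on HTTP errors instead of writing
  # the server's error page into the output.
  curl --fail --silent "$haxx_url" | perl -ne 'print if /^Message-Id:/i'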
--
- Julian