Shell scripts: drawing the line

At geek brekkie yesterday P— mentioned the idea of archiving the links you use with the Internet Archive. This seemed like a great idea to fold into deploying my blog. I’ve wanted to add a general link checker to look for broken links. This isn’t quite the same thing, but it would be an option for remediating link rot once it’s found. Plus it seemed simple to do. My proof of concept also provides an excellent answer to a common question: when has a shell script gone too far, and when should you switch to a “real language”? This script has gotten past that line, so I thought I’d share it.
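The core of the idea is just asking the Wayback Machine to save each URL. A rough sketch of that part (the links.txt file and the exact curl invocation are illustrative, not the real proof of concept):

```sh
#!/bin/sh
# Feed each link (one URL per line in a hypothetical links.txt)
# to the Internet Archive's Save Page Now endpoint.
set -e

while read -r url; do
    # Appending a URL after /save/ asks the archive to capture it.
    curl -s -o /dev/null -w "%{http_code} $url\n" "https://web.archive.org/save/$url"
    sleep 5   # be gentle; the service rate-limits busy clients
done < links.txt
```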

Using batch to avoid problems

I use batch to rerun failed cron jobs. I also use it for webhooks. There are three reasons for doing this, and why eventually I’ll end up changing even first runs of cron jobs to use batch: it handles load issues, it handles locking issues, and it returns quickly. The batch command is usually implemented as part of cron and at, but it runs jobs at a certain load, not at a certain time. The threshold can be set to a different load when the system is configured, but the idea is that batch runs jobs one at a time when the system load is “low”.
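To make that concrete, here’s roughly what handing a job to batch looks like; the script path is a placeholder rather than one of my real jobs:

```sh
# batch reads the commands to run from stdin, queues them, and returns
# immediately; atq shows what is still waiting in the queue.
echo "/usr/local/bin/nightly-report.sh" | batch
atq
```

The same pattern works from a crontab entry or a webhook handler, which is what makes the quick return so useful.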

Monitoring cron jobs

Cron jobs sometimes fail, and the old way of getting emails from the cron daemon doesn’t really scale. For instance you might have a job that fails from time to time and that’s ok, but if it fails for too long it’s a problem. Generally, email as an alerting tool is a Bad Thing and should be avoided. Since I have Prometheus set up for everything, the easiest thing is to use the textfile collector from the node exporter to dump some basic stats.
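The shape of it is a small wrapper: run the job, then drop a couple of metrics into the textfile collector directory for the node exporter to pick up. A sketch, where the directory, metric names and job name are all stand-ins for whatever the real setup uses:

```sh
#!/bin/sh
# Run a cron job and record basic stats for the node exporter's
# textfile collector. Paths and metric names are placeholders.
TEXTFILE_DIR=/var/lib/node_exporter/textfile_collector
JOB=nightly-backup

"$@"; STATUS=$?

# Write to a temp file in the same directory and rename it so the
# exporter never reads a half-written file.
TMP=$(mktemp "$TEXTFILE_DIR/$JOB.XXXXXX")
{
    echo "cron_job_exit_code{job=\"$JOB\"} $STATUS"
    echo "cron_job_last_run_seconds{job=\"$JOB\"} $(date +%s)"
} > "$TMP"
mv "$TMP" "$TEXTFILE_DIR/$JOB.prom"

exit $STATUS
```

An alert on a stale cron_job_last_run_seconds, or an exit code that stays non-zero, covers the “fails for too long” case.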

How this blog gets deployed (part 2)

Yesterday I covered the overview of how this gets deployed. Now for the detail. The script for testing is more aspirational than functional. It runs a spellcheck on the blog, but doesn’t die if it finds anything. I’m supposed to read the output. I never do. I should though. Someday I’ll add consequences to the spellcheck run, but for now at least it tries.

```sh
#!/bin/sh
set -e
make spell
```

Next up is the script that does the build and deployment to pages.

How this blog gets deployed (part 1)

This website is maintained with hugo, which is a static site generator. That means the source is parsed and all of the HTML, CSS and JavaScript are generated and saved as files, so deployment just requires a plain, basic web server. But it still does need to be deployed. You could deploy straight to S3, but I already have my own server, so I just deploy it there.
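As a sketch of the general shape of such a deploy (the host and paths here are made up, and the real script differs), it boils down to a build followed by an rsync:

```sh
#!/bin/sh
# Build the site and push the generated files to the web server.
# blog.example.com and /var/www/blog are placeholders.
set -e

hugo --minify                                   # renders the site into ./public
rsync -av --delete public/ blog.example.com:/var/www/blog/
```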

SSH Crypt

For some reason a few weeks back I was wondering about using ssh keys to encrypt/decrypt files. Seems like a thing that should be possible, why not? And sure enough, it’s been done. This won’t be as good as using gpg keys. Specifically, without the web of trust it can be hit with MITM attacks, but I think it would be “good enough” for most people in most uses. And in my experience getting people to use gpg is like pulling teeth.
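For a concrete taste of the idea, age can encrypt to an SSH public key and decrypt with the matching private key. This isn’t necessarily the tool the approach I found uses, just an illustration that the technique exists:

```sh
# Encrypt a file to whoever holds the matching SSH private key.
# age accepts ssh-ed25519 and ssh-rsa public keys as recipients.
age -R ~/.ssh/id_ed25519.pub -o secrets.txt.age secrets.txt

# The recipient decrypts with their SSH private key.
age -d -i ~/.ssh/id_ed25519 -o secrets.txt secrets.txt.age
```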

Blocking exact tweets

While playing with the Twitter API via a Go lib I saw someone call on people to troll reply to a tweet with a specific piece of text. This seemed like a really easy thing to search for and block. So I modified the little Go program from “starting with ephemeral” and put this in as the function run in main(). It searches for the string it’s given, and if a tweet consists solely of that text (compared by lowercasing both strings and stripping out everything that’s not a letter) it blocks the user.
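The interesting bit is the comparison. The program itself is Go, but the same normalization (lowercase everything, strip anything that isn’t a letter, then compare) sketches out easily in shell:

```sh
#!/bin/sh
# The comparison described above: lowercase the text and strip
# everything that isn't a letter before comparing.
normalize() {
    printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | tr -cd '[:alpha:]'
}

target=$(normalize "Some target text!")
tweet=$(normalize "SOME target text")

# A tweet that consists solely of the target text matches exactly.
[ "$tweet" = "$target" ] && echo "block this user"
```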

Go utilities in ~/bin

One nice side effect of using vcsh was developing more complex scripts to help me do things. I didn’t have to worry that a script or tool would get lost when a machine inevitably died. However, before writing a script, sometimes it’s not a bad idea to check and see if someone else already has. Lately many of those that I’ve found have been in Go. Originally I handled these with update, but that made update take a long time to run and it would sometimes die if a rarely used Go util was broken.
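A sketch of what installing a set of Go utilities into ~/bin can look like; the import paths are made up, and the real list and mechanism may differ:

```sh
#!/bin/sh
# Install or update the Go utilities kept in ~/bin.
# The import paths below are placeholders, not the actual tools.
set -e
export GOBIN="$HOME/bin"

for tool in \
    example.com/some/tool@latest \
    example.com/another/util@latest
do
    go install "$tool"
done
```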

Moving home - updates to vcsh usage

A while back I switched to vcsh. I’ve written a few articles on using it, but since then I’ve migrated machines a number of times. The big issue I’ve found is having to manually install software on each machine. There are things my scripts depend on and things I just expect to have, and setting them up by hand each time is annoying. So the solution, obviously, is a script. It actually gets used all the time, since I might add new dependencies or find new tools I need, and I want those installed on all machines.
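In spirit the script is just a list of things I expect to exist, installed if they’re missing. A rough sketch, assuming a Debian-ish machine; the package list is illustrative rather than the real one:

```sh
#!/bin/sh
# Bootstrap a fresh machine: make sure the tools my scripts depend on exist.
# The package list is an example, not the actual set.
set -e

for pkg in git vcsh jq curl rsync; do
    if ! command -v "$pkg" >/dev/null 2>&1; then
        sudo apt-get install -y "$pkg"
    fi
done
```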

Ephemeral-ing Twitter

So now ephemeral was working on my current tweets, but not handling the “historical” ones, which Twitter deems to be anything older than the 3,200 most recent. There are APIs you can use to access those, but they require a paid license. So instead I went into my settings on Twitter and requested an archive of my tweets. This takes a number of hours, but eventually you get a download link, and a while after using that you end up with a huge zip file.