Using the NIST macOS_security project with Jamf Pro

I have just completed a project to update the security posture for a Mac fleet and wrote up a HOWTO on using the NIST project with Jamf Pro. It is an impressive tool, well maintained by a number of people.

My HOWTO details policies, profiles, extension attributes and scripts. I hope people find it useful. Check it out at

Finding A Good Fit

Excuse me for a minute, I’m going to do some shameless self-promotion.

I’m currently hard at work looking for my next role. I think my time at my current employer is coming to an end, and to become a better engineer I need to move somewhere new. I need a place that is a good fit.

If you have read my blog, visited my GitHub repositories, or seen some of my presentations you already know a bit about me. I’m a decent engineer with pretty good skills at Python and the shell and I get the job done. I’m a quick learner and a voracious reader of tech sites.

But let me quote from my CV.

“I design and build systems that allow organisations to deploy and secure Macintosh computers, iPhones and iPads.”

“My personal mission is to build a back end that supports well designed, well built, and fuss free devices for the end user. I’d like to work for an organisation that has the same goal and believes Apple devices are a great way of delivering that.”

If you know of an organisation looking for a Client Platform Engineer or Device Management Engineer then I’d love you to drop me a note at. If you want my CV to pass along, you can grab it at

Thanks for letting me advertise to you.

Designing Scriptorium

I just released Scriptorium, a small console program. Here are some notes on how I used argparse to do that.

We need a function to parse our arguments. Parsing is taking the line of words from the command line and processing them to extract the structure and meaning. The term ‘word’ can be fraught with complexity on the shell command line but a simple definition is any set of characters delimited by a space or matching quotes.
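As a rough illustration of that word-splitting, Python’s shlex module tokenises a line the way a POSIX shell would, so we can see how matching quotes keep a phrase together as one word (the command line here is just an example):

```python
import shlex

# shlex.split() splits on whitespace, the way a POSIX shell does,
# but a quoted phrase stays together as a single word
words = shlex.split('scriptorium rename "Old Script Name"')
print(words)  # ['scriptorium', 'rename', 'Old Script Name']
```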

Scriptorium has a simple structure: scriptorium <command> [<argument>]. Some commands don’t have any arguments; for some, arguments are entirely optional; and for others, arguments are required. But let’s start building that parser.

    """ build our command line parser """
    parser = argparse.ArgumentParser(
        epilog="for command help: `scriptorium <command> -h`"
    )
    subparsers = parser.add_subparsers(description="", required=True)

You can also see we are starting to build some help in ‘epilog’ – this is the final line printed when you run scriptorium --help. Then we need some code to parse each individual command’s arguments. That is the job of a subparser.
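As a sketch of what one of those subparsers can look like — the ‘rename’ command and its argument names here are illustrative, not necessarily Scriptorium’s real interface:

```python
import argparse

parser = argparse.ArgumentParser(
    epilog="for command help: `scriptorium <command> -h`"
)
subparsers = parser.add_subparsers(description="", required=True, dest="command")

# each command gets its own subparser, which handles that command's
# arguments; 'rename' is a hypothetical example command
rename = subparsers.add_parser("rename", help="rename a script")
rename.add_argument("name", help="current name of the script")
rename.add_argument("new_name", help="new name for the script")

args = parser.parse_args(["rename", "old.sh", "new.sh"])
print(args.command, args.name, args.new_name)  # rename old.sh new.sh
```

Running with -h now prints a help screen listing the commands, with the epilog as its final line, and each subparser supplies its own per-command help.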


Scriptorium – a better way for Jamf Pro scripts

A little while ago I became extremely annoyed by a bunch of scripts in my Jamf Pro instance that weren’t properly named and didn’t have proper author comments.

What’s an easy way to rename a dozen scripts and edit twenty? There really isn’t one, so instead of doing it by hand I decided I wanted a software system that made it easy. Yes, I know, spending a chunk of your spare time writing 750 lines of Python (it was actually 950 until I did some serious refactoring) rather than a day doing it manually might seem a little, well, silly. I know, I have a problem, I’m working on it.

Scriptorium is the result: a Python script that uses a combination of two directories and two git repositories to provide versioning, tracking, and backup, while adding an easier-to-use interface for editing the scripts.

I’d never built a script with an extensive command line interface before. Python’s argparse library makes it incredibly easy. Not only does it allow you to quickly put together the commands and options, in the process it builds your help system and structures your code, with each command requiring its own function.
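That one-function-per-command pattern can be sketched with argparse’s set_defaults, which lets each subparser carry the function that implements its command (the command names here are made up for illustration):

```python
import argparse

def do_list(args):
    """Handler for the 'list' command."""
    return "listing scripts"

def do_rename(args):
    """Handler for the 'rename' command."""
    return f"renaming {args.name}"

parser = argparse.ArgumentParser()
subparsers = parser.add_subparsers(required=True, dest="command")

list_cmd = subparsers.add_parser("list")
list_cmd.set_defaults(func=do_list)     # each command carries its own function

rename_cmd = subparsers.add_parser("rename")
rename_cmd.add_argument("name")
rename_cmd.set_defaults(func=do_rename)

args = parser.parse_args(["rename", "install.sh"])
print(args.func(args))  # dispatch to the chosen command's function
```

The main routine then never needs a chain of if/elif tests on the command name; it just calls args.func(args).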

As Scriptorium leverages git, adding a git module to zsh and the GitLens extension to Visual Studio Code adds even further benefits.

I’ve tried to make the README file as comprehensive as possible, so go give that a read and grab a copy on GitHub

You might also find me speaking about it at JNUC2021.

Another PatchBot update

Since the last time I wrote about PatchBot I’ve made a few improvements to the Production processor.

  • Moved the decision logic from into the processor.
  • Added the recipe variable delta to specify the days between test and production.
  • Added the recipe variable deadline to set the Self Service deadline.
  • Added defaults for delta and deadline to the top of to ease customization.

Both recipe variables are optional. It’s now possible to use AutoPkg’s -k option to quickly move a package from test into production with a short Self Service deadline if you need to:

autopkg run -k "delta=-1" -k "deadline=2"

The above will immediately move the Firefox package into the production patches and set a deadline of two days. We set a delta of -1 because the way command line arguments are handled doesn’t let us distinguish the zero we get when no argument was supplied from the zero we get when it is explicitly set as the variable value. A value of -1 has the same effect as zero: if the number of days between the date the package was put into test and today is zero, then that number is both greater than or equal to zero and greater than or equal to -1.
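The comparison can be sketched like this — a simplified stand-in for what the processor does, with an illustrative function name, not the actual code:

```python
from datetime import date

def ready_for_production(test_date, today, delta):
    """Return True when the package has been in test at least delta days.

    A delta of 0 or -1 both mean 'move it now': a package that went into
    test today has been there 0 days, and 0 >= 0 and 0 >= -1 are both true.
    """
    days_in_test = (today - test_date).days
    return days_in_test >= delta

today = date(2021, 6, 10)
print(ready_for_production(date(2021, 6, 10), today, -1))  # True: move immediately
print(ready_for_production(date(2021, 6, 8), today, 7))    # False: only 2 days in test
```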

I’ve also added a tool to the #patchbot channel in the MacAdmins Slack that posts to the channel whenever I push new code to the GitHub repository.

PatchBot Update

So the inevitable happened after I published my blog posts about PatchBot. I found some small bugs, now fixed.

But also inevitable was somebody telling me there was a better way to do something.

It turns out that since JPCImporter only needs the pkg_path variable, it can be used as a post-processor when calling the package build recipe. That means you don’t need to alter the .pkg recipe override at all. That’s a whole bunch of recipes we can just forget about. Thanks to Graham Pugh for the tip.

The first AutoPkg call needs to be changed:

# run the package build
/usr/local/bin/autopkg run --recipe-list=/Users/"$(whoami)"/Documents/PatchBotTools/packages.txt \
    --post com.honestpuck.PatchBot/JPCImporter \
    --report-plist=/Users/"$(whoami)"/Documents/JPCImporter.plist

There were also some bug fixes to and

PatchBot #4

Here is my fourth post about PatchBot.

In the first post I gave a short summary of how the system works and introduced JPCImporter, the first AutoPkg custom processor.

In the second post I introduced patch management and the second custom processor.

In the third post I showed the third custom processor and the code to run it at the right time.

In the first three blog posts I explained (in great detail) how my system, PatchBot, works.

Today I am going to cover how to take the pieces and put them together into a complete system.


PatchBot #3

Welcome to my third post about PatchBot.

In the first post I gave a short summary of how the system works and introduced JPCImporter, the first AutoPkg custom processor.

In the second post I introduced patch management and the second custom processor.

In this post we will look at the Python script that decides when to move a package into production and the custom processor that does all the work. The script is fairly simple. When you strip off the first 50 lines as housekeeping, you’re left with a function loop that does all the work. It loops through every patch policy on the server and, if it’s an enabled test patch policy, looks for a date more than six days ago in its Self Service description.
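A sketch of that decision, using a date embedded in the Self Service description — the description format and function name here are illustrative assumptions, not the script’s actual code:

```python
from datetime import date, datetime

def ready_to_move(description, today, days_in_test=6):
    """Decide whether a test patch policy is old enough to promote.

    Assumes the date appears as the last word of the Self Service
    description in ISO format (YYYY-MM-DD); the real script's
    format may differ.
    """
    test_date = datetime.strptime(description.split()[-1], "%Y-%m-%d").date()
    return (today - test_date).days > days_in_test

print(ready_to_move("Version 89.0 in test since 2021-06-01", date(2021, 6, 10)))  # True: 9 days
print(ready_to_move("Version 89.0 in test since 2021-06-07", date(2021, 6, 10)))  # False: 3 days
```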

It uses that to build a command that gets used in a subprocess call. I keep on looking at this and thinking it would be nice to just call the right function in AutoPkg and pass it the list but that’s probably even more fragile than the current way.
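That pattern — assembling an autopkg invocation as an argument list and handing it to subprocess — looks roughly like this (the recipe names are illustrative):

```python
import subprocess

def build_autopkg_command(recipes):
    """Assemble the argument list for an autopkg run over the given recipes."""
    cmd = ["/usr/local/bin/autopkg", "run"]
    cmd.extend(recipes)
    return cmd

cmd = build_autopkg_command(["Firefox.prod", "GoogleChrome.prod"])
print(cmd)
# to actually run it: subprocess.run(cmd, check=True)
```

Passing a list rather than a single string avoids any shell quoting problems with recipe names.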

You will also see that after AutoPkg it calls a final script to send more messages to Teams. This could have been done in but I liked having the code separate during development, and it makes it easier for you to use my code.

Notice that we don’t do anything in, for that we rely on our final custom processor,
