Another PatchBot update

Since the last time I wrote about PatchBot I’ve made a few improvements to the Production processor.

  • Moved decision logic from Move.py into the processor.
  • Added the recipe variable delta to specify days between test and production.
  • Added the recipe variable deadline to set the Self Service deadline.
  • Added defaults for delta and deadline to the top of Production.py to ease customization.

Both recipe variables are optional. It’s now possible to use the -k option of AutoPkg to quickly move a package from test into production with a short Self Service deadline if you need to:

autopkg run Firefox.prod -k "delta=-1" -k "deadline=2"

The above will immediately move the Firefox package into the production patches and set a Self Service deadline of two days. We set a delta of -1 because the way command line variables are handled doesn’t let us tell apart the zero we get when no value was supplied from a zero we set deliberately. A value of -1 has the same effect as zero: if the package went into test today, the difference in days is zero, and zero is greater than or equal to both zero and -1, so the package moves immediately.
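To make that concrete, here is a minimal, hypothetical sketch of the comparison; the function name, the default value, and the way the variable arrives are my assumptions, not the actual code in Production.py.

# Hypothetical sketch of the delta comparison described above,
# not the actual code from Production.py
DEFAULT_DELTA = 7  # assumed default: days a package sits in test

def ready_for_production(days_in_test, delta=0):
    """Return True when the package has been in test long enough.

    delta == 0 means no value was supplied, so fall back to the default;
    delta == -1 stands in for zero days, so any days_in_test >= 0
    passes immediately.
    """
    effective_delta = delta if delta else DEFAULT_DELTA
    return days_in_test >= effective_delta

# ready_for_production(0, -1)  -> True   (move straight away)
# ready_for_production(3)      -> False  (still inside the default window)
# ready_for_production(8)      -> True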

I’ve also added a tool that posts to the #patchbot channel in the MacAdmins Slack whenever I push new code to the GitHub repository.

PatchBot Update

So the inevitable happened after I published my blog posts about PatchBot: I found some small bugs, now fixed.

But also inevitable was somebody telling me there was a better way to do something.

It turns out that since JPCImporter only needs the pkg_path variable, it can be used as a post-processor when calling the package build recipe. That means you don’t need to alter the .pkg recipe override at all, which is a whole bunch of recipes we can just forget about. Thanks to Graham Pugh for the tip.

The first AutoPkg call needs to be changed:

# run the package build
/usr/local/bin/autopkg run --recipe-list=/Users/"$(whoami)"/Documents/PatchBotTools/packages.txt \
    --post com.honestpuck.PatchBot/JPCImporter \
    --report-plist=/Users/"$(whoami)"/Documents/JPCImporter.plist \
    -k FAIL_RECIPES_WITHOUT_TRUST_INFO=yes

There were also some bug fixes to Move.py and ProdTeams.py.

PatchBot #4

Here is my fourth post about PatchBot.

In the first post I gave a short summary of how the system works and introduced JPCImporter, the first AutoPkg custom processor.

In the second post I introduced patch management and the second custom processor.

In the third post I showed the third custom processor and the code to run it at the right time.

In the first three blog posts I explained (in great detail) how my system, PatchBot, works.

Today I am going to cover how to take the pieces and put them together into a complete system.


PatchBot #3

Welcome to my third post about PatchBot.

In the first post I gave a short summary of how the system works and introduced JPCImporter, the first AutoPkg custom processor.

In the second post I introduced patch management and the second custom processor.

In this post we will look at the Python script that decides when to move a package into production and the custom processor that does all the work.

Move.py

Move.py is fairly simple. When you strip off the first 50 lines of housekeeping you’re left with a loop that does all the work. It goes through every patch policy on the server and, if it’s an enabled test patch policy, looks for a date more than six days in the past in its Self Service description.

It uses that to build a command that gets run in a subprocess call. I keep looking at this and thinking it would be nice to just call the right function in AutoPkg and pass it the list, but that’s probably even more fragile than the current approach.
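As a rough illustration only, the heart of the loop might look something like the sketch below; the policy field names, the date format, and the .prod recipe naming are my assumptions rather than the real Move.py.

# Hypothetical sketch of the Move.py loop described above, not the real code.
# The policy fields, date format and .prod recipe naming are assumptions.
import subprocess
from datetime import datetime, timedelta

def move_due_packages(patch_policies, days=6):
    """Run the .prod recipe for every enabled test policy that is old enough."""
    due = []
    for policy in patch_policies:
        if not policy["enabled"] or "Test" not in policy["name"]:
            continue
        # the date the package went into test, stored in the
        # Self Service description, e.g. 2020-08-14
        went_in = datetime.strptime(policy["self_service_description"], "%Y-%m-%d")
        if datetime.now() - went_in > timedelta(days=days):
            due.append(policy["title"] + ".prod")
    if due:
        subprocess.run(["/usr/local/bin/autopkg", "run"] + due, check=False)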

You will also see that after running AutoPkg it calls a final script to send more messages to Teams. This could have been done in Move.py, but I liked having the code separate during development and it makes it easier for you to use my code.
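If you are curious what sending a message to Teams involves, a bare-bones post to an incoming webhook is only a few lines. This is a sketch, not the actual Teams script, and the webhook URL is something you create yourself in Teams.

# Bare-bones sketch of posting to a Teams incoming webhook;
# the real script builds richer messages than this plain text one.
import json
from urllib.request import Request, urlopen

def post_to_teams(webhook_url, text):
    """POST a plain text message to the given incoming webhook URL."""
    payload = json.dumps({"text": text}).encode("utf-8")
    req = Request(webhook_url, data=payload,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return resp.status

# post_to_teams(my_webhook_url, "Firefox 80.0 moved to production")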

Notice that we don’t actually move anything in Move.py; for that we rely on our final custom processor, Production.py.


PatchBot – Zero Touch Patch Management #2

Last post I detailed the first steps taken by PatchBot, building and uploading a new version of an application package.

In this post I will explain the next step: updating the testing patch policy.

The first thing I should explain is why we don’t do this when we build and upload the package. It boils down to the reliability of our patch definition feed. If the patch definition feed was updated at exactly the same moment a new version became available, we could have done it all in JPCImporter. Unfortunately the patch definitions are only updated every 12 hours (I think), and that’s enough of a window for Murphy. Kinobi keep on decreasing the window, but no matter how narrow it gets Murphy will have his way, so defensive design and coding it is.


PatchBot – Zero-Touch Packaging and Patch Management

A little over a year ago I set out to build a system that would deliver application patches to my users without me doing a thing.

I have leveraged AutoPkg, the Jamf patch management system, and its API to build a total solution where almost all of my applications are automatically patched on my fleet without me touching a thing.

I call it PatchBot. This will be a series of four blog posts explaining the system and how to get it working. All the code and tools are published on GitHub.

When a new version of an application is available, a package is built and a patch is sent to a test group. After a set delay the package is moved into production, where everyone with the app installed gets a patch to update it and our application install policy is updated.

Two LaunchAgents automatically run AutoPkg with some custom processors and scripts to perform all the work.

While it does take some setting up for each application, the process requires manual intervention only to stop a patch package going into production when a problem is found in it, or to speed things up when you need to deploy a security patch quickly.

Patch levels across the fleet have improved dramatically.


Doing Some Cleaning

Tidying up my toys. (Image from Wikimedia)

My distribution point was gaining some bloat, with a huge number of packages that were superseded or deprecated. I needed to do a clean out, so I needed a tool that would check which packages are still used and list the rest.

I had a look at Spruce but it has one fatal flaw: it doesn’t look at patch policies. Given that my automation around patch management means I use patch policies a great deal, that’s a problem. If my system were working perfectly then every package used by a patch policy would also be used in an ordinary policy, so it wouldn’t matter, but since it doesn’t always work perfectly I’m not prepared to risk removing a required package.
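What I want boils down to a simple set difference once you have the lists of packages; something along these lines, with the fetching of the three lists from the Jamf API left out.

# Hypothetical sketch of the check I want; fetching the three lists
# from the Jamf API is left out here.
def unused_packages(all_packages, policy_packages, patch_policy_packages):
    """Return the packages that no policy of either kind references."""
    in_use = set(policy_packages) | set(patch_policy_packages)
    return sorted(set(all_packages) - in_use)

# unused_packages(["Firefox-79.pkg", "Firefox-80.pkg"], ["Firefox-80.pkg"], [])
#   -> ["Firefox-79.pkg"]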


AutoPkg Repo List Fiddling Again

After my last post Graham Pugh mentioned that the AutoPkg repository list is stored in the AutoPkg preference file as RECIPE_REPOS with the search order in RECIPE_SEARCH_DIRS.

He suggested doing a while loop on the defaults read output, but I thought it was just fiddly enough a task in the shell that I might resort to a few lines of Python. So here it is: a Python script to dump out your repository list in search order. Tiny, but it does the job.

(Thanks to Graham for taking the time to comment on the previous post, it was just what I needed to get me to spend the few minutes doing this.)

#!/usr/bin/env python3

# repos.py
# print the list of AutoPkg repos in search order
# NOTE: Totally lacking in any error checking or handling

import plistlib
from os import path

plist = path.expanduser('~/Library/Preferences/com.github.autopkg.plist')
with open(plist, 'rb') as fp:
    prefs = plistlib.load(fp)
search = prefs['RECIPE_SEARCH_DIRS']
repos = prefs['RECIPE_REPOS']

# skip the first three entries, which are the built-in search directories
for directory in search[3:]:
    print(repos[directory]['URL'])