PatchBot – Zero-Touch Packaging and Patch Management

A little over a year ago I set out to build a system that would deliver application patches to my users without me doing a thing.

I have leveraged AutoPkg, the Jamf Pro patch management system, and the Jamf Pro API to build a complete solution where almost all of my applications are automatically patched across my fleet without me touching a thing.

I call it PatchBot. This will be a series of four blog posts explaining the system and how to get it working. All the code and tools are published on GitHub.

When a new version of an application is available, a package is built and a patch is sent to a test group. After a set delay the package is moved into production, where everyone with the app installed gets a patch to update it, and our application install policy is updated.

Two LaunchAgents automatically run AutoPkg with some custom processors and scripts to perform all the work.

While it does take some setting up for each application, the process requires manual intervention only to stop an application patch package going into production when a problem is found, or to speed it up if you need to deploy a security patch quickly.

Patch levels across the fleet have improved dramatically.

AutoPkg

AutoPkg is, to quote its website, "an automation framework for macOS software packaging and distribution, oriented towards the tasks one would normally perform manually to prepare third-party software for mass deployment to managed clients".

At its core it is used to build packages; however, people have written add-ons to perform other tasks, such as integrating with Munki or uploading to a Jamf repository.

The existing add-on (or processor, in AutoPkg parlance) for integrating with a Jamf repository is jss-importer. I've written a replacement for two reasons. The first is that when I set out to build my first management system jss-importer could not upload to a cloud repository. The second is that jss-importer was designed and built around a system of policies and smart groups to deliver patches to users, and Jamf now has patch management to do that more easily with less reliance on groups. Patch management also includes some nice version tracking across the fleet.

A final note before I delve into details. I am probably doing things in a way that horrifies some people. I’m not going to say that my method is perfect, just that it works for me and I hope you can find my efforts useful in building your own system. I’m also going to spend a great deal of time explaining my code, what it does and why it’s built that way.

Roughly, How Does It All Work?

The first thing PatchBot does is build the packages and upload them to Jamf Pro. At the same time it saves the package details in a policy called TEST-<title>. In a previous version this policy delivered the test version to the testers, but now it's just a database record.

PatchBot then runs a script that takes the report plist from AutoPkg and uses it to send messages to a special channel in Teams. That’s so humans can know what’s going on.

Once packages are uploaded it's time to start patch management. This requires a high quality patch definition feed for the Jamf Pro patch management system. I buy Kinobi from Mondada and believe it's easily worth the money. Seriously, I cannot overstate how well that bunch of Aussies does it. Building patch definitions is incredibly finicky and tedious, and throwing not much money at somebody else who does such a good job is incredibly appealing. There is an open source community alternative that I'm sure works fine for some.

The first step in patch management is to find the version definition for our new package and point it at the package, then update a patch policy, Test <title>. This patch policy is scoped to a single group regardless of the application; I call mine Package Testers. The patch policy has a self service deadline of two days. PatchBot also tells us the results with another set of messages to Teams.

The second step in patch management is to move a package from test into production. This is done seven days after it is moved into test, using a production patch policy called Stable <title>, scoped to all computers, with a self service deadline of seven days. Both the delay before moving patches into production and the self service deadlines are easily changed.
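
As an illustration (my sketch, not PatchBot's actual code), the date arithmetic behind that delay is simple, and the seven-day figure is just a constant you could change:

```python
import datetime

# hypothetical sketch: when does a package that entered test move to production?
MOVE_DELAY = datetime.timedelta(days=7)  # easily changed, per the text


def production_date(moved_to_test: datetime.date) -> datetime.date:
    """Return the date a test package becomes eligible for production."""
    return moved_to_test + MOVE_DELAY


# a package that entered test on 1 July 2020 moves on 8 July 2020
print(production_date(datetime.date(2020, 7, 1)))  # → 2020-07-08
```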

At this point PatchBot updates the install policy for the application so it uses the new version. I’m sure you’re not surprised it has a third script to send the results to our Teams channel.

It's now done. We have a patch package in production and an updated install policy. At no stage have we had to do a thing. The only human intervention we might need is halting the shift from test to production if our testers discover a broken package. That's as easy as editing the self service description for the Test patch policy.

Now for some details. Today I will go over the first step, building and uploading the package.

Building & Uploading Packages

AutoPkg is controlled by recipes, so every package we build needs a recipe, called <title>.pkg.recipe; we either find these online or write them ourselves.

AutoPkg includes a security system for recipes that makes sure nobody can change a recipe without us knowing. It does this by saving a special recipe, called a recipe override, with a hash of the original recipe. Rather than have a separate recipe to run our custom processor, JPCImporter, I have chosen to add an extra block to the override. You can see an example of this block below. This is not really the approved way of handling it; I should use a different recipe for our custom processor, but the overrides have to be there (for security, if nothing else) and it reduces the number of files I'm handling.

Let’s have a look at an example:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Identifier</key>
    <string>local.pkg.Abstract</string>
    <key>Input</key>
    <dict>
        <key>NAME</key>
        <string>Abstract</string>
    </dict>
    <key>ParentRecipe</key>
    <string>com.github.dataJAR-recipes.pkg.Abstract</string>
    <key>ParentRecipeTrustInfo</key>
    <dict>
        <key>non_core_processors</key>
        <dict/>
        <key>parent_recipes</key>
        <dict>
            <key>com.github.dataJAR-recipes.download.Abstract</key>
            <dict>
                <key>git_hash</key>
                <string>307367f4291653873cb643e1a37660a7a919b716</string>
                <key>path</key>
                <string>~/Library/AutoPkg/RecipeRepos/com.github.autopkg.dataJAR-recipes/Abstract/Abstract.download.recipe</string>
                <key>sha256_hash</key>
                <string>b9dfab745c6a08c79d6b7d9f8d62dd813a29b368503e065f80ccb859a5fe2ba8</string>
            </dict>
            <key>com.github.dataJAR-recipes.pkg.Abstract</key>
            <dict>
                <key>git_hash</key>
                <string>6e3ce6ea55174a82101f629b0e0903b0510d4486</string>
                <key>path</key>
                <string>~/Library/AutoPkg/RecipeRepos/com.github.autopkg.dataJAR-recipes/Abstract/Abstract.pkg.recipe</string>
                <key>sha256_hash</key>
                <string>87c341b82ba7654083e19c7b5cac31fe5a7466b0a6b75d591fdf8228891d91bb</string>
            </dict>
        </dict>
    </dict>
    <key>Process</key>
    <array>
        <dict>
            <key>Arguments</key>
            <dict>
                <key>pkg_path</key>
                <string>%RECIPE_CACHE_DIR%/%NAME%-%version%.pkg</string>
            </dict>
            <key>Processor</key>
            <string>JPCImporter</string>
        </dict>
    </array>
</dict>
</plist>
Abstract.pkg.recipe

You can see the block added for the custom processor call at the end of the recipe override, in the Process array. Further up is the recipe trust info with the hashes of the parent recipes.

So how does JPCImporter do its work?

Before it starts we have to do some stuff for our automation. Number one is to make sure our package is named according to a standard format, <title>-<version>.pkg, where <title> is the name of the application with no periods or "-" characters. I prefer no spaces, but the system works if they're there. It doesn't work with underscores between the application name and version, such as the packages built by Rich Trouton's recipes; for those I have to add a separate PkgCopier step to the recipe override to rename the package.
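
A quick sketch (mine, not part of PatchBot) of why the naming convention matters: everything downstream splits the file name on the first "-", which only works if the title itself contains no dashes.

```python
from os import path


def split_pkg_name(pkg_path: str) -> tuple:
    """Split a <title>-<version>.pkg path into (title, version).

    Assumes the PatchBot naming convention: no '-' in the title.
    """
    pkg = path.basename(pkg_path)
    name = pkg[:-len(".pkg")]            # strip the extension
    title, version = name.split("-", 1)  # title may not contain '-'
    return (title, version)


print(split_pkg_name("/tmp/Firefox-78.0.2.pkg"))  # → ('Firefox', '78.0.2')
```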

The second thing is to create a test policy called TEST-<title> that is scoped to nobody and not enabled. We are simply using the Jamf Pro policy list as a database record. We read it later to track the latest version uploaded.
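
To show what "using a policy as a database record" means, here is a hedged sketch of parsing the kind of XML the Jamf Classic API returns for such a policy. The XML below is a trimmed, made-up example, and the parsing is mine, not PatchBot's.

```python
import xml.etree.ElementTree as ET

# trimmed, hypothetical response of the kind
# GET /JSSResource/policies/name/TEST-Firefox might return
policy_xml = """<policy>
  <general><id>42</id><enabled>false</enabled></general>
  <package_configuration>
    <packages><package><id>7</id><name>Firefox-78.0.2.pkg</name></package></packages>
  </package_configuration>
</policy>"""

root = ET.fromstring(policy_xml)
# the package name recorded in the policy tells us the latest uploaded version
pkg = root.findtext("package_configuration/packages/package/name")
version = pkg[:-len(".pkg")].split("-", 1)[1]
print(version)  # → 78.0.2
```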

Finally, we need a way for AutoPkg to find our custom processors; the method is detailed on the AutoPkg wiki. Basically I have a folder called PatchBotProcessors in my recipe folder containing the processors and a special recipe.

Here’s the special recipe.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Description</key>
    <string>This is an example of a recipe that can be used by another
recipe outside of this directory or repo, to refer to
a processor in this directory.
Instead of setting the 'Processor' key to a processor name
only, we separate the recipe identifier and the processor
name with a slash:
&lt;dict&gt;
&lt;key&gt;Processor&lt;/key&gt;
&lt;string&gt;com.honestpuck.PatchBot/JPCImporter.py&lt;/string&gt;
&lt;/dict&gt;
..assuming that this recipe is in one of AutoPkg's search dirs.
    </string>
    <key>Identifier</key>
    <string>com.honestpuck.PatchBot</string>
    <key>Input</key>
    <dict/>
    <key>MinimumVersion</key>
    <string>0.4.0</string>
    <key>Process</key>
    <array/>
</dict>
</plist>
PatchBot.recipe

JPCImporter

Let’s have a look at the code.

#!/usr/bin/env python3
#
# JPCImporter v2.1
#
# Tony Williams 2019-07-03
#
# ARW 2019-07-18 Many bug fixes
# ARW 2020-03-12 Version 2 with changes for new workflow
# ARW 2020-06-09 Some changes to log levels and cleaning up code
# ARW 2020-06-24 Final tidy before publication

"""See docstring for JPCImporter class"""

from os import path
import subprocess
import plistlib
import xml.etree.ElementTree as ET
import datetime
import logging
import logging.handlers
from time import sleep

import requests

from autopkglib import Processor, ProcessorError

APPNAME = "JPCImporter"
LOGLEVEL = logging.DEBUG
LOGFILE = "/usr/local/var/log/%s.log" % APPNAME

__all__ = [APPNAME]


class JPCImporter(Processor):
    """Uploads a package to JPC and updates the test install policy"""

    description = __doc__
    input_variables = {
        "pkg_path": {
            "required": True,
            "description": "Path to the package to be imported into Jamf Pro",
        },
    }
    output_variables = {
        "jpc_importer_summary_result": {"description": "Summary of action"}
    }

    def setup_logging(self):
        """Defines a nicely formatted logger"""
        self.logger = logging.getLogger(APPNAME)
        self.logger.setLevel(LOGLEVEL)
        # we may be the second or subsequent iteration of JPCImporter
        # and already have a handler
        if len(self.logger.handlers) > 0:
            return
        handler = logging.handlers.TimedRotatingFileHandler(
            LOGFILE, when="D", interval=1, backupCount=7
        )
        handler.setFormatter(
            logging.Formatter(
                "%(asctime)s %(levelname)s %(message)s",
                datefmt="%Y-%m-%d %H:%M:%S",
            )
        )
        self.logger.addHandler(handler)

    def load_prefs(self):
        """Load the preferences from file"""
        # which pref format to use, autopkg or jss_importer
        autopkg = False
        if autopkg:
            plist = path.expanduser(
                "~/Library/Preferences/com.github.autopkg.plist"
            )
            prefs = plistlib.load(open(plist, "rb"))
            url = prefs["JSS_URL"]
            auth = (prefs["API_USERNAME"], prefs["API_PASSWORD"])
        else:
            plist = path.expanduser("~/Library/Preferences/JPCImporter.plist")
            prefs = plistlib.load(open(plist, "rb"))
            url = prefs["url"]
            auth = (prefs["user"], prefs["password"])
        return (url, auth)

    def upload(self, pkg_path):
        """Upload the package `pkg_path` and return the test policy ID"""
        self.logger.info("Starting %s", pkg_path)
        # do some set up
        (server, auth) = self.load_prefs()
        hdrs = {"Accept": "application/xml", "Content-type": "application/xml"}
        base = server + "/JSSResource/"
        pkg = path.basename(pkg_path)
        title = pkg.split("-")[0]
        # check to see if the package already exists
        url = base + "packages/name/{}".format(pkg)
        self.logger.debug("About to get: %s", url)
        ret = requests.get(url, auth=auth)
        if ret.status_code == 200:
            self.logger.warning("Found existing package: %s", pkg)
            return 0
        # use curl for the file upload as it seems to work nicer than requests
        # for this ridiculous workaround for file uploads
        curl_auth = "%s:%s" % auth
        curl_url = server + "/dbfileupload"
        command = ["curl", "-u", curl_auth, "-s", "-X", "POST", curl_url]
        command += ["--header", "DESTINATION: 0"]
        command += ["--header", "OBJECT_ID: -1"]
        command += ["--header", "FILE_TYPE: 0"]
        command += ["--header", "FILE_NAME: {}".format(pkg)]
        command += ["--upload-file", pkg_path]
        self.logger.debug("About to curl: %s", pkg)
        # self.logger.debug("Auth: %s", curl_auth)
        self.logger.debug("pkg_path: %s", pkg_path)
        # self.logger.debug("command: %s", command)
        ret = subprocess.check_output(command)
        self.logger.debug("Done, ret: %s", ret)
        packid = ET.fromstring(ret).findtext("id")
        if packid == "":
            raise ProcessorError("curl failed for url: {}".format(curl_url))
        self.logger.debug("Uploaded and got ID: %s", packid)
        # build the package record XML
        today = datetime.datetime.now().strftime("%d-%b-%Y")
        data = "<package><id>{}</id>".format(packid)
        data += "<category>Applications</category>"
        data += "<notes>Built by Autopkg. "
        data += "Uploaded {}</notes></package>".format(today)
        # we use requests for all the other API calls as it codes nicer
        # update the package details
        url = base + "packages/id/{}".format(packid)
        # we set up some retries as sometimes the server
        # takes a minute to settle with a new package upload
        # (can we have an API that allows an upload and
        # setting all this in one go?)
        count = 0
        while True:
            count += 1
            self.logger.debug("package update attempt %s", count)
            ret = requests.put(url, auth=auth, headers=hdrs, data=data)
            if ret.status_code == 201:
                break
            if count > 5:
                raise ProcessorError(
                    "Package update failed with code: %s" % ret.status_code
                )
            sleep(15)
        # now for the test policy update
        policy_name = "TEST-{}".format(title)
        url = base + "policies/name/{}".format(policy_name)
        ret = requests.get(url, auth=auth)
        if ret.status_code != 200:
            raise ProcessorError(
                "Test Policy %s not found: %s" % (url, ret.status_code)
            )
        self.logger.warning("Test policy found")
        root = ET.fromstring(ret.text)
        self.logger.debug("about to set package details")
        root.find("package_configuration/packages/package/id").text = str(
            packid
        )
        root.find("general/enabled").text = "false"
        root.find("package_configuration/packages/package/name").text = pkg
        url = base + "policies/id/{}".format(root.findtext("general/id"))
        data = ET.tostring(root)
        ret = requests.put(url, auth=auth, data=data)
        if ret.status_code != 201:
            raise ProcessorError(
                "Test policy %s update failed: %s" % (url, ret.status_code)
            )
        pol_id = ET.fromstring(ret.text).findtext("id")
        self.logger.debug("got pol_id: %s", pol_id)
        self.logger.info("Done Package: %s Test Policy: %s", pkg, pol_id)
        return pol_id

    def main(self):
        """Do it!"""
        self.setup_logging()
        # clear any pre-existing summary result
        if "jpc_importer_summary_result" in self.env:
            del self.env["jpc_importer_summary_result"]
        pkg_path = self.env.get("pkg_path")
        if not path.exists(pkg_path):
            raise ProcessorError("Package not found: %s" % pkg_path)
        pol_id = self.upload(pkg_path)
        self.logger.debug("Done: %s: %s", pol_id, pkg_path)
        if pol_id != 0:
            self.env["jpc_importer_summary_result"] = {
                "summary_text": "The following packages were uploaded:",
                "report_fields": ["policy_id", "pkg_path"],
                "data": {"policy_id": pol_id, "pkg_path": pkg_path},
            }


if __name__ == "__main__":
    PROCESSOR = JPCImporter()
    PROCESSOR.execute_shell()
JPCImporter.py

The code starts with housekeeping before we define our class. The class then sets up logging and the input and output variables before the first function definition. You will notice I have set the logs to rotate daily and keep seven; that's because I run my code at a high log level and want short logs.

Speaking of debugging, you'll notice that when we come to a grinding halt due to some sort of problem I raise a ProcessorError. This is an exception class provided by AutoPkg that handles the error and places the details into the report plist. You just pass it a string and it takes care of the rest.

The function upload does almost all the work. I start off by calling curl via subprocess to upload the package file. This uses an unsupported, unofficial endpoint, and after a lot of testing I've found curl much more reliable than any Python method I could find. It would be nice if Jamf gave us a way to do this via the API, but don't hold your breath; it's been an open feature request on Jamf Nation since before Noah.

This only takes care of the file; it doesn't save the package details, such as category, so we need a separate API call for those. There can be a long (in programming terms) delay between the file being uploaded and the package record being available for updating; you've probably seen this in the web GUI. Because of this the code will try up to six times, with a 15 second delay between attempts.
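
The retry loop is a generic pattern worth isolating. Here is a sketch of my own, simplified from the processor, assuming any callable that returns True on success:

```python
from time import sleep


def retry(action, attempts=6, delay=15):
    """Call `action` until it returns True, sleeping `delay` seconds
    between tries; raise RuntimeError after `attempts` failures."""
    for count in range(1, attempts + 1):
        if action():
            return count
        if count < attempts:
            sleep(delay)
    raise RuntimeError("gave up after %d attempts" % attempts)


# a stub that succeeds on its third call, with no real sleeping
calls = {"n": 0}


def flaky():
    calls["n"] += 1
    return calls["n"] >= 3


print(retry(flaky, attempts=6, delay=0))  # → 3
```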

The last thing we have to do in upload is point the test policy at our package.

Finally, we have main, which does a sanity check, calls upload, and handles the AutoPkg report details. Oh, and the little stub that allows calling the processor outside AutoPkg for testing purposes.

Let The World Know

Before we can call package uploading complete PatchBot needs to tell somebody what it has done. For this it runs a script, Teams.py, that uses a webhook to send a message into a channel in Teams.

AutoPkg provides a nice report as an XML plist, so plistlib gives us a good dictionary to parse, with two main sections: one for successful builds and uploads and the other for failures. Most of the script is JSON templates for the messages. The only real complication is handling a totally empty run.
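
Here is a minimal sketch of that parsing, using a made-up report with the same shape AutoPkg writes (the keys match the ones Teams.py reads; the values are invented):

```python
import plistlib

# a made-up report with the shape AutoPkg produces
report = {
    "failures": [],
    "summary_results": {
        "jpc_importer_summary_result": {
            "data_rows": [
                {"policy_id": "42", "pkg_path": "/tmp/Firefox-78.0.2.pkg"}
            ]
        }
    },
}
# round-trip through plistlib, as if reading autopkg.plist from disk
pl = plistlib.loads(plistlib.dumps(report))
rows = pl["summary_results"]["jpc_importer_summary_result"]["data_rows"]
print(len(rows), rows[0]["policy_id"])  # → 1 42
```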

#!/usr/bin/env python3
# Teams.py v1.0b
# Tony Williams 25/07/2019
#
"""See docstring for Teams class"""

import json
import plistlib
import os.path as path
import logging
import logging.handlers
import sys

import requests

__all__ = ["Teams"]

# logging requirements
LOGFILE = "/usr/local/var/log/Teams.log"
LOGLEVEL = logging.INFO


class Teams:
    """When given the location of an output plist from AutoPkg, parses it
    and sends the details on packages uploaded to Jamf Pro to Teams
    """

    description = __doc__

    def __init__(self):
        # extremely dumb command line processing
        try:
            self.plist = sys.argv[1]
        except IndexError:
            self.plist = "autopkg.plist"
        # fake URL of Teams webhook, insert your own
        self.url = "https://outlook.office.com/webhook/"
        # token
        self.url += "76ea46bf-3dda-41f0-831d-b0dc655e4f97@43f93f8a-55a8-4263-bd84"
        self.url += "1d07f0672950542/-c3ef-4ee9-fd41-fafbe4177f30"
        # URL for a button to open package test policy in Jamf Pro
        self.pol_base = "https://example.jamfcloud.com/policies.html?id="
        # set up logging
        handler = logging.handlers.TimedRotatingFileHandler(
            LOGFILE, when="D", interval=1, backupCount=7
        )
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(levelname)s %(message)s")
        )
        self.logger = logging.getLogger("Teams")
        self.logger.addHandler(handler)
        self.logger.setLevel(LOGLEVEL)
        # JSON for the message to Teams
        # "sections" will be replaced by our work
        self.template = """
        {
            "@context": "https://schema.org/extensions",
            "@type": "MessageCard",
            "themeColor": "0072C6",
            "title": "Autopkg",
            "text": "Packages uploaded",
            "sections": [
            ]
        }
        """
        # JSON for a section of a message
        # we will have a section for each package uploaded
        # in this AutoPkg run
        self.section = """
        {
            "startGroup": "true", "title": "**AppName**", "text": "version",
            "potentialAction": [
                {
                    "@type": "OpenUri",
                    "name": "Policy",
                    "targets": [
                        {
                            "os": "default",
                            "uri": "https://docs.microsoft.com/outlook/actionable-messages"
                        }
                    ]
                }
            ]
        }
        """
        # JSON template for the error message card
        self.err_template = """
        {
            "@context": "https://schema.org/extensions",
            "@type": "MessageCard",
            "themeColor": "0072C6",
            "title": "Autopkg",
            "text": "Package errors",
            "sections": [
            ]
        }
        """
        # JSON template for a single error on the error card
        self.err_section = """
        {
            "text": "A long message",
            "startGroup": "true",
            "title": "**Firefox.pkg**"
        }
        """
        # JSON template for the empty run message card
        self.none_template = """
        {
            "@context": "https://schema.org/extensions",
            "@type": "MessageCard",
            "themeColor": "0072C6",
            "title": "Autopkg",
            "text": "**Empty Run**"
        }
        """

    def Teams(self):
        """Do the packages uploaded!"""
        self.logger.info("Starting Run")
        sections = []
        empty = False
        jsr = "jpc_importer_summary_result"
        try:
            fp = open(self.plist, "rb")
            pl = plistlib.load(fp)
        except IOError:
            self.logger.error("Failed to load %s", self.plist)
            sys.exit()
        item = 0
        if jsr not in pl["summary_results"]:
            self.logger.debug("No JPCImporter results")
            empty = True
        else:
            for p in pl["summary_results"][jsr]["data_rows"]:
                sections.append(json.loads(self.section))
                # get the package name without the '.pkg' at the end
                pkg_name = path.basename(p["pkg_path"])[:-4]
                pol_id = p["policy_id"]
                self.logger.debug("Policy: %s Name: %s", pol_id, pkg_name)
                (app, version) = pkg_name.split("-", 1)
                pol_uri = self.pol_base + pol_id
                sections[item]["title"] = "**%s**" % app
                sections[item]["text"] = version
                sections[item]["potentialAction"][0]["targets"][0]["uri"] = pol_uri
                item = item + 1
            j = json.loads(self.template)
            j["sections"] = sections
            d = json.dumps(j)
            requests.post(self.url, data=d)
        # do the error messages
        fails = pl["failures"]
        if len(fails) == 0:  # no failures
            if empty:  # no failures and no summary so send empty run message
                requests.post(self.url, self.none_template)
            sys.exit()
        sections = []
        item = 0
        for f in fails:
            sections.append(json.loads(self.err_section))
            sections[item]["title"] = "**%s**" % f["recipe"]
            sections[item]["text"] = f["message"].replace("\n", " ")
            item = item + 1
        j = json.loads(self.err_template)
        j["sections"] = sections
        d = json.dumps(j)
        requests.post(self.url, d)


if __name__ == "__main__":
    teams = Teams()
    teams.Teams()
Teams.py

Plumbing

So we can do this on a regular basis we need some nuts and bolts to tie it all together.

LaunchAgent

Back when my programming career was born we used cron to schedule tasks, but cron has been 'deprecated' on macOS for many years now, replaced by LaunchAgents and LaunchDaemons. So we need a LaunchAgent definition.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.github.PatchBot.autopkg</string>
    <key>Program</key>
    <array>
        <string>/Users/autopkg/Documents/PatchBotTools/autopkg.sh</string>
    </array>
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>6</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
</dict>
</plist>
autopkg.plist

It gets to the right place like this:

mkdir -p /Users/"$(whoami)"/Library/LaunchAgents/
cp ./autopkg.plist /Users/"$(whoami)"/Library/LaunchAgents/autopkg.plist
/bin/launchctl load /Users/"$(whoami)"/Library/LaunchAgents/autopkg.plist

Shell Script

You can see it runs a shell script so let’s see what that looks like:

# run the package build
/usr/local/bin/autopkg run --recipe-list=/Users/"$(whoami)"/Documents/autopkg_bits/packages.txt \
  --report-plist=/Users/"$(whoami)"/Documents/autopkg.plist \
  -k FAIL_RECIPES_WITHOUT_TRUST_INFO=yes
# send the messages to MS Teams
/Users/"$(whoami)"/Documents/autopkg_bits/Teams.py \
  /Users/"$(whoami)"/Documents/autopkg.plist

Notice we use AutoPkg's ability to read the recipes we want to run from a list, so neither the LaunchAgent nor the script ever needs to change, just the recipe list we feed AutoPkg. I really appreciate how well built AutoPkg is.
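
For reference, packages.txt is just one recipe name per line; something like this (the titles here are examples, not my actual list):

```
Firefox.pkg.recipe
GoogleChrome.pkg.recipe
Abstract.pkg.recipe
```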

Next

Next post I will explain the next step, moving our package into testing and the second custom processor.
