PatchBot Update

So the inevitable happened after I published my blog posts about PatchBot. I found some small bugs, now fixed.

But also inevitable was somebody telling me there was a better way to do something.

It turns out that since JPCImporter only needs the pkg_path variable, it can be used as a post-processor when calling the package build recipe. That means you don’t need to alter the .pkg recipe override at all, which is a whole bunch of recipes we can just forget about. Thanks to Graham Pugh for the tip.

The first AutoPkg call needs to be changed:

# run the package build
/usr/local/bin/autopkg run --recipe-list=/Users/"$(whoami)"/Documents/PatchBotTools/packages.txt \
    --post com.honestpuck.PatchBot/JPCImporter \
    --report-plist=/Users/"$(whoami)"/Documents/JPCImporter.plist \
    -k FAIL_RECIPES_WITHOUT_TRUST_INFO=yes

There were also some bug fixes to Move.py and ProdTeams.py.

PatchBot #4

Here is my fourth post about PatchBot.

In the first post I gave a short summary of how the system works and introduced JPCImporter, the first AutoPkg custom processor.

In the second post I introduced patch management and the second custom processor.

In the third part I showed the third custom processor and the code to run it at the right time.

In the first three blog posts I explained (in great detail) how my system, PatchBot, works.

Today I am going to cover how to take the pieces and put them together into a complete system.


PatchBot #3

Welcome to my third post about PatchBot.

In the first post I gave a short summary of how the system works and introduced JPCImporter, the first AutoPkg custom processor.

In the second post I introduced patch management and the second custom processor.

In this post we will look at the Python script that decides when to move a package into production, and the custom processor that does all the work.

Move.py

Move.py is fairly simple. Strip off the first 50 lines of housekeeping and you’re left with a single loop that does all the work. It iterates over every patch policy on the server and, for each enabled test patch policy, looks for a date more than six days old in its Self Service description.

It uses that to build a command that gets run in a subprocess call. I keep looking at this and thinking it would be nice to call the right function in AutoPkg directly and pass it the list, but that’s probably even more fragile than the current approach.
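The check described above can be sketched in a few lines. This is only a sketch with hypothetical field and function names, not the actual Move.py code; the real script reads the policies from the server.

```python
from datetime import datetime, timedelta

def ready_for_production(policy, today=None):
    """True when an enabled test patch policy's Self Service
    description carries a date more than six days old.
    (Hypothetical names; the real Move.py differs.)"""
    today = today or datetime.now()
    if not (policy.get("enabled") and policy.get("is_test")):
        return False
    try:
        stamped = datetime.strptime(policy["self_service_description"], "%Y-%m-%d")
    except (KeyError, ValueError):
        return False
    return today - stamped > timedelta(days=6)
```

Any policy that passes a test like this gets fed into the AutoPkg run that does the actual move.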

You will also see that after AutoPkg runs it calls a final script to send more messages to Teams. This could have been done in Move.py, but I liked having the code separate during development, and it makes it easier for you to reuse my code.

Notice that Move.py doesn’t do the actual move; for that we rely on our final custom processor, Production.py.


PatchBot – Zero Touch Patch Management #2

Last post I detailed the first steps taken by PatchBot, building and uploading a new version of an application package.

In this post I will explain the next step, updating the testing patch policy.

The first thing I should explain is why we don’t do this when we build and upload the package. It comes down to the reliability of our patch definition feed. If the patch definition feed were updated at exactly the same time a new version became available, we could have done it all in JPCImporter. Unfortunately the patch definitions are only updated every 12 hours (I think), and that’s enough of a window for Murphy. Kinobi keep decreasing the window, but no matter how narrow it gets you know Murphy will have his way, so: defensive design and coding.
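The defensive idea boils down to a guard like this (a sketch with hypothetical names, not the real processor code): before touching the test patch policy, confirm the definition feed actually knows about the new version.

```python
def definition_has_version(definition, new_version):
    """Guard: only proceed once the patch definition feed lists
    new_version, so we never point a patch policy at a version
    the feed doesn't know about yet. (Hypothetical data shape.)"""
    versions = [p["version"] for p in definition.get("patches", [])]
    return new_version in versions
```

The point is that a failed check costs nothing: the run does nothing and the next scheduled run tries again once the feed has caught up.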


PatchBot – Zero-Touch Packaging and Patch Management

A little over a year ago I set out to build a system that would deliver application patches to my users without me doing a thing.

I have leveraged AutoPkg, the JAMF patch management system, and its API to build a total solution where almost all of my applications are automatically patched across my fleet without me touching a thing.

I call it PatchBot. This will be a series of four blog posts explaining the system and how to get it working. All the code and tools are published on GitHub.

When a new version of an application is available, a package is built and a patch is sent to a test group. After a set delay the package is moved into production, where everyone with the app installed gets a patch to update it, and our application install policy is updated.

Two LaunchAgents automatically run AutoPkg with some custom processors and scripts to perform all the work.

While it does take some setting up for each application, the process requires manual intervention only to stop an application patch going into production when a problem is found in a package, or to speed it up when you need to deploy a security patch quickly.

Patch levels across the fleet have improved dramatically.


Doing Some Cleaning

Tidying up my toys. (Image from Wikimedia)

My distribution point was gaining bloat from a huge number of packages that were superseded or deprecated, so I needed to do a clean-out. I needed a tool that would check which packages are in use and list the rest.

I had a look at Spruce, but it has one fatal flaw: it doesn’t look at patch policies. Given that my automation around patch management means I use patch policies a great deal, that’s a problem. If my system were working perfectly, every package used by a patch policy would also be used in an ordinary policy, so it wouldn’t matter; but since it doesn’t always work perfectly, I’m not prepared to risk removing a required package.


AutoPkg Repo List Fiddling Again

After my last post Graham Pugh mentioned that the AutoPkg repository list is stored in the AutoPkg preference file as RECIPE_REPOS with the search order in RECIPE_SEARCH_DIRS.

He suggested doing a while loop on the defaults read output, but I thought it was just fiddly enough a task in the shell that I might resort to a few lines of Python. So here it is: a Python script to dump out your repository list in search order. Tiny, but it does the job.

(Thanks to Graham for taking the time to comment on the previous post, it was just what I needed to get me to spend the few minutes doing this.)

#!/usr/bin/env python3

# repos.py
# print the list of AutoPkg repos in search order
# NOTE: totally lacking in any error checking or handling

import plistlib
from os import path

plist = path.expanduser('~/Library/Preferences/com.github.autopkg.plist')
with open(plist, 'rb') as fp:
    prefs = plistlib.load(fp)
search = prefs['RECIPE_SEARCH_DIRS']
repos = prefs['RECIPE_REPOS']

# start at 3 to skip the built-in ones
for i in range(3, len(search)):
    print(repos[search[i]]['URL'])

AutoPkg Repo List Fiddling

Here’s a little one for you. I needed to keep the recipe repositories in sync across two machines.

AutoPkg will happily give you a repo-list. Here’s part of mine:

autopkg repo-list
/Users/Anthony.WILLIAMS/Library/AutoPkg/RecipeRepos/com.github.autopkg.48kRAM-recipes (https://github.com/autopkg/48kRAM-recipes)
/Users/Anthony.WILLIAMS/Library/AutoPkg/RecipeRepos/com.github.autopkg.HobbitHardcase-recipes (https://github.com/autopkg/HobbitHardcase-recipes)
/Users/Anthony.WILLIAMS/Library/AutoPkg/RecipeRepos/com.github.autopkg.MichalMMac-recipes (https://github.com/autopkg/MichalMMac-recipes)
/Users/Anthony.WILLIAMS/Library/AutoPkg/RecipeRepos/com.github.autopkg.adobe-ccp-recipes (https://github.com/autopkg/adobe-ccp-recipes)
/Users/Anthony.WILLIAMS/Library/AutoPkg/RecipeRepos/com.github.autopkg.arubdesu-recipes (https://github.com/autopkg/arubdesu-recipes)
/Users/Anthony.WILLIAMS/Library/AutoPkg/RecipeRepos/com.github.autopkg.aysiu-recipes (https://github.com/autopkg/aysiu-recipes)

Unfortunately that isn’t in a form you can feed to AutoPkg’s repo-add command. We need something like sed to put it right. Here we go:

autopkg repo-list | sed "s#[^(]*(\([^)]*\)).*#\1#"
https://github.com/autopkg/48kRAM-recipes
https://github.com/autopkg/HobbitHardcase-recipes
https://github.com/autopkg/MichalMMac-recipes
https://github.com/autopkg/adobe-ccp-recipes
https://github.com/autopkg/arubdesu-recipes
https://github.com/autopkg/aysiu-recipes

Now we add them on the other computer. Pipe the above into repos.txt and then:

while read -r line ; do
    autopkg repo-add "$line"
done < repos.txt

Now if AutoPkg had an option to list the repositories in search order rather than alphabetical…

Easy Secure Passwords


More and more we are being told to make our passwords secure. I work at a bank, where I am required to have a password over a certain length and to change it regularly. That makes coming up with a secure, easy to remember password a task.

Well, I wouldn’t be a decent hacker if I didn’t come up with a way of solving that problem. Xkcd tells us that four common random words would qualify as secure and easy to remember. That means we just have to generate them.

Did you know your Mac has a list of over 200,000 English words and names? /usr/share/dict/words is the word list from Webster’s Second International Dictionary. Published in 1934, its copyright has lapsed, so it became part of FreeBSD and then macOS.

So we just need to randomly pick four words from the list. This shouldn’t be hard. cat /usr/share/dict/words | sort -R | tail -4 would do the job. Give it a try.

OK, I see a problem here. Some of those words are far from common. I ran it twice and among my eight words were ‘impositional’, ‘histographical’, and ‘Cagayan’. I even tried changing it to tail -10 to see if that gave me four “common” words in the list but that failed most of the time.

Most of those uncommon words are long. What if we removed the long words before picking ten? cat /usr/share/dict/words | grep -v '^.\{7,\}' | sort -R | tail -10 will do the job, dropping every word of seven or more letters. That gives me a better result, but it still sucks for usability. We really do need a list of common words.

I easily found one online. A search for “common english words list” in DuckDuckGo (my favourite search engine) quickly turned up a list of the 3000 most common words in English. I copied and pasted the list into a file at /usr/local/share/dict/common, and could then use sort -R /usr/local/share/dict/common | tail -4 to get my four words.

Now I just added the alias alias passphrase='sort -R /usr/local/share/dict/common | tail -4' to the bottom of my .zshrc file, and generating a safe, secure password is trivial.

If we are to believe The Diceware Passphrase FAQ, four words isn’t really enough: “four words only provide 51.6 bits, about the same as an 8 character password made up of random ASCII characters. Both are breakable in less than a day with two dozen graphics processors”. So feel free to change that ‘4’ to a larger number according to your level of paranoia.

Note that the Diceware calculations assume the cracker knows your passphrase is a number of words; if they don’t know that, it becomes much harder. If they know you are using the 3000 word list, each word in your phrase is worth about 11.5 bits of entropy. If, on the other hand, they only know it is random lower-case letters, that’s around 4.7 bits per letter; averaging six letters per word, that’s 28.2 bits per word. Diceware recommend over 100 bits of entropy, so size your phrase accordingly for the attacker you’re worried about.
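The entropy figures above are easy to check for yourself; this is just a sanity check of the arithmetic, not part of the passphrase tooling:

```python
import math

# Entropy per word when the attacker knows the 3000-word list
per_word = math.log2(3000)          # about 11.55 bits
# Entropy per lower-case letter when they only know it's random letters
per_letter = math.log2(26)          # about 4.70 bits
# A six-letter word treated as random letters
per_word_letters = 6 * per_letter   # about 28.2 bits

print(round(per_word, 2), round(per_letter, 2), round(per_word_letters, 1))
```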

Diceware also recommend inserting a space between each word. So my current passphrase is ‘rain bone conflict stone mind’. Not really, but that’s what I just generated.

So you can easily make your passwords secure.