Pipeline Mock-up
I have been toying with setting up a CI/CD pipeline for PowerShell module development. The basic components and steps are as follows:
Components:
- Git Source Control
- Orchestration (Jenkins, TeamCity, etc.)
- Binary repository (Artifactory, PowerShell Gallery, etc.)
Steps:
1. Code is checked into a git-based source control. The `master` branch is “Production Ready”, `release/stage` is “Ready for QA”, and all other branches are considered in development.
2. Orchestration monitors the repo for pull requests on `release/stage`. PRs trigger the automated tests bundled with the code in a predetermined folder path (see the sketch after this list).
3. Successful tests are reported … Somehow.
4. When a PR is approved/merged, Orchestration re-runs the tests (for validation) and merges the code into the `master` branch.
5. When the `master` branch is updated, Orchestration builds the module (process TBD) and pushes it to the binary repository.
6. (Optional) When code is run, check the binary repository for updates before first run, updating the local copy to the latest version as necessary.
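In the PowerShell world, those bundled tests would most likely be Pester tests. Here is a minimal sketch of the invocation an orchestration job could run; Pester 5 and a `Tests` folder alongside the module code are both assumptions on my part:

```powershell
# Run all Pester tests bundled under .\Tests (folder name is an
# assumption for this sketch). -CI turns on CI-friendly behavior,
# including a non-zero exit code when tests fail, which is what
# lets the CI job mark the build as failed.
Invoke-Pester -Path "$PSScriptRoot\Tests" -CI
```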
For git, I’m using Gogs - it’s simple to set up and is the platform for my personal/private repo. I do not have a running Jenkins instance that is internet accessible, so I’m debating splitting the DigitalOcean droplet and letting it do double duty: Gogs and Jenkins. Alternatively, I can spin up a Vagrant box with Jenkins installed and pre-configured to monitor Gogs. To accomplish steps 2-5, I’ll need three Jenkins jobs:
Stage | Description | Trigger | Action
---|---|---|---
1 | Monitor for PRs and test | PR request | Run tests and report
2 | Monitor `release/stage` for updates and merge with prod | Branch update | Run tests and merge
3 | Monitor `master` for updates and publish | Branch update | Run tests, package, and push to binary repo
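For Stage 3, the publish step could be as small as a single PowerShellGet call once the module layout is settled. A minimal sketch, assuming the binary repository has already been registered as a PSRepository; the repository name, module path, and API-key variable are all placeholders:

```powershell
# Stage 3 publish step: after tests pass, push the built module to the
# binary repository. Names below are placeholders for this sketch,
# not fixed names from the pipeline.
$publishParams = @{
    Path        = '.\MyModule'        # built module folder (placeholder)
    Repository  = 'InternalRepo'      # registered PSRepository (placeholder)
    NuGetApiKey = $env:NUGET_API_KEY  # injected by the CI job (placeholder)
}
Publish-Module @publishParams
```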
To handle the optional sixth step, one option would be something like PSDepend (a tool I’ve been meaning to take a closer look at). I am lucky enough to have distinct Dev, QA, and Prod systems from which my PowerShell code can be run. Moreover, each of the higher-level orchestration engines that calls PowerShell is somewhat self-aware: it knows which environment its workflows are running in. Since I own the binary repositories, I can configure one repository per environment:
- Dev Orchestration == Dev Binary Repository
- QA Orchestration == QA Binary Repository
- Prod Orchestration == Prod Binary Repository
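With the repositories in place, selecting the right one at run time can be a simple lookup. A minimal sketch, assuming the orchestration engine exposes its environment name through an environment variable (`$env:ORCH_ENV` and the repository names are hypothetical):

```powershell
# Map the current orchestration environment to its binary repository.
# Variable and repository names are placeholders for this sketch.
$repository = switch ($env:ORCH_ENV) {
    'Dev'   { 'DevRepo' }
    'QA'    { 'QARepo' }
    'Prod'  { 'ProdRepo' }
    default { throw "Unknown environment: '$($env:ORCH_ENV)'" }
}
```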
Using this information, I can query the corresponding binary repo for a set of required modules and compare that to my local set/versions. If the local copy is outdated, I can pull down the latest version. Once the local version has been updated, I can load that specific version of the module and proceed with my normal PowerShell code execution.
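In outline, that check looks something like the sketch below, using standard PowerShellGet cmdlets. The module list and the `$repository` value (carried over from the mapping above) are assumptions on my part; the gist that follows is the authoritative version:

```powershell
# Compare the locally installed version of each required module against
# the newest version in the environment's binary repository, updating
# and loading the verified version as needed.
$requiredModules = @('MyModule')   # placeholder list for this sketch

foreach ($name in $requiredModules) {
    $remote = Find-Module -Name $name -Repository $repository
    $local  = Get-Module -ListAvailable -Name $name |
              Sort-Object Version -Descending |
              Select-Object -First 1

    # Install/update only when missing locally or behind the repo.
    if (-not $local -or $local.Version -lt [version]$remote.Version) {
        Install-Module -Name $name -Repository $repository -Force -Scope CurrentUser
    }

    # Load exactly the version we just verified/installed.
    Import-Module -Name $name -RequiredVersion $remote.Version -Force
}
```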
The full code for checking local versions against a named remote repo can be found in the gist below. A future post will continue this process and describe how I can automatically register environment-specific repositories if they do not already exist.