This set of scripts collects CI artifacts from a local directory or from S3 and assembles them, in a staging directory, into the package structure defined by a packaging class. For the NugetPackage class, the NuGet tool is then run (from within Docker) on the staging directory to create a proper NuGet package with all of its metadata; the StaticPackage class instead creates a tarball.
The finalized NuGet package may be uploaded manually to NuGet.org.
- Requires Python 3
- Requires Docker
- (if --s3) Requires private S3 access keys for the librdkafka-ci-packages bucket.
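For example, when collecting artifacts from S3 with the --s3 option mentioned above, the invocation might look like the following. This is a sketch: the credential environment variables are the standard AWS ones, and it is an assumption that the script picks up credentials this way.

$ export AWS_ACCESS_KEY_ID=...       # assumption: standard AWS credential env vars
$ export AWS_SECRET_ACCESS_KEY=...
$ ./release.py --s3 v0.11.0-RC3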
- Trigger CI builds by creating and pushing a new release (candidate) tag
  in the librdkafka repo. Make sure the tag is created on the correct branch.

    $ git tag v0.11.0-RC3
    $ git push origin v0.11.0-RC3
- Wait for the CI builds to finish; monitor them here:

  - New builds
  - Previous builds

  Or, if using SemaphoreCI, simply have the packaging job depend on the
  prior build jobs in the same pipeline.
- On a Linux host, run the release.py script to assemble the NuGet package:

    $ cd packaging/nuget
    $ ./release.py v0.11.0-RC3
- If all artifacts were available, the NuGet package will be built and
  placed in the current directory as librdkafka.redist.&lt;version&gt;.nupkg.
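  Since a .nupkg file is just a zip archive, a quick sanity check is to
  list its contents (the version in the file name is a placeholder):

    $ unzip -l librdkafka.redist.&lt;version&gt;.nupkg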
- Test the package manually; one way to do so is sketched below.
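  For example, assuming the .NET CLI is installed, you can create a
  throwaway project and install the package from the local directory
  (the path and version string are placeholders):

    $ mkdir /tmp/nuget-test && cd /tmp/nuget-test
    $ dotnet new console
    $ dotnet add package librdkafka.redist --source /path/to/packaging/nuget --version &lt;version&gt;
    $ dotnet build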
- Upload the package to NuGet.org, either manually through the web
  interface or from the command line as sketched below.
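  One command-line option is dotnet nuget push; the API key and version
  below are placeholders, and this is a sketch rather than the project's
  prescribed upload method:

    $ dotnet nuget push librdkafka.redist.&lt;version&gt;.nupkg \
          --source https://api.nuget.org/v3/index.json \
          --api-key &lt;your-nuget-api-key&gt;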
- If you trust this process, you can have release.py upload the package
  to NuGet automatically after building it:

    $ ./release.py --retries 100 --upload your-nuget-api.key v0.11.0-RC3
To create a bundle (tarball) of self-contained static librdkafka library builds, use the StaticPackage class:

$ ./release.py --class StaticPackage v1.1.0
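The resulting bundle is a plain gzipped tarball, so its contents can be inspected with tar; the exact output file name depends on the script, and the name below is a placeholder:

$ tar tzf &lt;static-bundle&gt;.tgz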
To clean up old non-release/non-RC builds from the S3 bucket, first do a dry run to see which objects would be removed:

$ AWS_PROFILE=.. ./cleanup-s3.py --age 360

Verify that the listed objects should really be deleted, then delete them:

$ AWS_PROFILE=.. ./cleanup-s3.py --age 360 --delete