
Move from Travis-CI (and Appveyor?) to GitHub Actions #5261

Closed
mossheim opened this issue Nov 23, 2020 · 48 comments · Fixed by #5385

@mossheim
Contributor

Motivation

see #5252 for prior discussion; a brief summary is that we need to move off travis-ci to a new CI provider by the end of the year, and github actions seems to be the best option at the moment.

to reiterate, with GHA we would get unlimited and free builds on all 3 platforms we use with multiple concurrent jobs.

Description of Proposed Feature

we will move our entire build matrix from Travis over to Github Actions, and possibly our Appveyor builds too. the reasons we would switch over from appveyor are:

  • only one integration to maintain
  • more concurrent jobs on GHA
  • potentially faster jobs on GHA

we need some research to figure out how much effort it would take to migrate from Appveyor. this is also lower priority since we don't have the same end-of-year deadline. it would just be nice if it worked out.

Plan for Implementation

@dyfer, @joshpar, @claremacrae have so far offered to help with this. i'm also of course interested in helping. i'd really like to have multiple people involved so the knowledge can be shared, and hopefully some documentation for everyone else will come out of it too!

@claremacrae
Contributor

Thanks Brian - great write-up.

After a little bit of thought about possible ways to get started on this, I have been wondering about having some kind of shared online meeting, with screen sharing, where the four of us take a high-level look at what GHA offers and how it works - by way of an introduction for those who haven't seen it, and a sharing of knowledge between those who have.

I think that this will start to generate a list of the kinds of structural decisions that would be useful to think about early on - and that might help drive and inform the early actions.

It would potentially also share info about what stuff comes for free, quick ways to get started, and useful resources for when you get stuck.

I don't know who amongst us has most experience with GHA, but if it helps, I would be happy to talk through what I've learned, and what I know that I don't know, about it so far - for others to fill in the gaps.

Equally, I would be very happy if someone else started the demos and then I joined in with gap-filling...

This is just a starting suggestion to get the discussion going. I am not at all wedded to the idea - others are very welcome!

@claremacrae
Contributor

claremacrae commented Nov 23, 2020

The kinds of topics to consider include:

  • What are Actions and Workflows?
  • Browsing the logs of GHA builds
  • And what extras you get by downloading the logs
  • How to find out what software is installed on GHA images
  • Getting started - how to create your first workflow
  • Re-using other people's actions
  • Re-using your own actions
  • Useful online resources (and Clare's handy saved search)
  • Granularity - what level to divide up YML files into?
  • Build badges - how they might change

@dyfer
Member

dyfer commented Nov 23, 2020

Sharing what I've learned so far - correct me if I'm wrong please:

  • What are Actions and Workflows?

AFAIU a Workflow is the top-level logical unit of GHA, i.e. there may be multiple jobs in one Workflow. I imagine we could structure our CI in a few different ways, e.g.

  • one workflow with all CI jobs on all platforms
  • two workflows:
    • a slimmed down CI with one job for each platform and linting, run for every push
    • full matrix with all platforms, run either on every push or maybe only on merges into develop/3.x?

These are just propositions for further discussion of course.

Actions are "standalone commands that are combined into steps to create jobs" (from the intro) - it seems that Actions can come from GH or from community repos
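
To make that hierarchy concrete, a minimal Workflow file might look like this (the names, triggers, and commands here are placeholders for illustration, not a proposal for our actual config):

```yaml
# .github/workflows/ci.yml - hypothetical minimal example
name: CI                     # the Workflow
on: [push, pull_request]     # events that trigger it

jobs:
  build:                     # one job; a Workflow may contain many
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v2   # a reusable Action
      - name: configure and build   # a plain shell step
        run: cmake -B build . && cmake --build build
```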

  • How to find out what software is installed on GHA images

Here are links to available configurations. A few of my notes:

  • vcpkg available everywhere :)
  • Ubuntu 16.04, 18.04 and 20.04
  • Windows Server 2016 and 2019 with MSVC 2017/2019, respectively
    • no Qt preinstalled, but a quick search resulted in finding this Action to download and install Qt, as well as cache it
  • the oldest macOS is 10.15 (11.x also available) with Xcode 10 (SDK 10.14), as well as Xcode 11.x and even 12 with newer SDKs
    • I think we could still make the legacy build happen, but I need to check; maybe getting an older Qt would be easier using that action above as opposed to homebrew
  • Granularity - what level to divide up YML files in to?

I think this is a very good question, also would circle back to how we would structure our workflows
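
For reference, wiring in a Qt-install Action like the one linked above might look roughly like this (the action name and version shown are assumptions to be verified, not tested config):

```yaml
# hypothetical: community action that downloads, installs, and caches Qt
- name: Install Qt
  uses: jurplel/install-qt-action@v2
  with:
    version: 5.15.2   # placeholder version
```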

@dyfer
Member

dyfer commented Nov 24, 2020

All right, here's my first pass on this: https://github.com/dyfer/supercollider/actions/runs/380527549

This is a single linux job, no ccache, no linting, no build matrix or configurability through environment variables (like we have currently with Travis). But it builds SC successfully and uploads the artifact (I know this is not useful on linux RN but wanted to test)

There seem to be some more GHA-idiomatic ways of performing certain actions, particularly:

  • installing Qt
    • useful definitely on Windows, maybe on Mac? (to allow for easy choice of version)
  • linting (not sure if this is useful for us?)
  • installing packages on Windows
    • I mean, vcpkg is listed first as a package manager for each platform :) maybe we can consider it (again) for fftw and libsndfile? not sure what the status of these packages is, I would probably need to test it locally first

Also the linux runner seems to be faster than Travis I think (at least when I ran it) - the full build took ~8 minutes (without ccache)

EDIT: ...and I haven't tried implementing unit tests

@claremacrae
Contributor

All right, here's my first pass on this: https://github.com/dyfer/supercollider/actions/runs/380527549

Wow, that YML file is a thing of beauty and simplicity, @dyfer !

@dyfer
Member

dyfer commented Nov 24, 2020

thanks, @claremacrae !

@dyfer
Member

dyfer commented Nov 25, 2020

macOS job is also up now: https://github.com/dyfer/supercollider/actions/runs/382257003
I think it's weird because of Xcode 11; I need to switch it to the Ninja generator.

@dyfer
Member

dyfer commented Nov 25, 2020

I switched to Ninja, and SC now starts properly: https://github.com/dyfer/supercollider/actions/runs/383901328

However, the downloads are large, ~250MB (vs <100MB for 3.11). After unpacking, these builds are >500MB ( vs <250MB for 3.11). The main difference is QtWebEngineCore.framework: in this build it's 460MB, in 3.11 it's 150MB (unpacked)

I know that 3.11 was built with an older Xcode. Is this the reason for the size increase? Or is something else going on?

@joshpar
Member

joshpar commented Nov 25, 2020 via email

@mossheim
Contributor Author

mossheim commented Nov 25, 2020

whatever action you are using to zip and upload the artifact doesn't preserve symlinks, so many files and directories are being copied into the archive multiple times.

edit - to elaborate, macOS bundles/frameworks have a specific internal structure that makes use of symlinks, and if you resolved all of those symlinks, you'd end up with at least 2x the size. the webengine app is even worse here because all of its internal frameworks are just symlinks to frameworks from the outer bundle.

@dyfer
Member

dyfer commented Nov 25, 2020

thanks for finding this @brianlheim, I'll keep an eye out for symlink support solutions.
thanks @joshpar, I think we wouldn't be able to offer fat binaries until Qt provides them (which I don't believe they do, yet). I guess Qt will have to provide them at some point though.
BTW I wonder how that works for other libraries we use from homebrew: libsndfile, fftw, portaudio. Would homebrew be able to provide fat binaries? Or would they need to be manually recompiled when cross-compiling SC?

@mossheim
Contributor Author

Homebrew/brew#7857 for the homebrew discussion.

https://bugreports.qt.io/browse/QTBUG-85279 looks like the top level Qt bug tracker ticket, from the info there it looks like only 5.15 and 6.0 will support apple silicon. not 100% sure though.

@joshpar
Member

joshpar commented Nov 25, 2020 via email

@dyfer
Member

dyfer commented Nov 25, 2020

Homebrew/brew#7857 for the homebrew discussion.

Thanks for the link.
ARM support is one thing; "fat"/universal binaries are another, on top of that. One thread suggests that universal binaries won't be supported.

I guess offering a separate download for arm64 vs x86_64 is not the end of the world, at least for a time.

It sounds like you found it - but I was thinking more that possibly x86 and x86_64 may have been getting included (32 and 64-bit).

Ah, got it. Sorry about misunderstanding, I was thinking about arm64. Yeah, we haven't been building x86 on macOS for a while.

@mossheim mossheim pinned this issue Nov 25, 2020
@dyfer
Member

dyfer commented Nov 26, 2020

Zip symlink issue is now solved, I'm just creating the zip "manually" before the artifact upload step - https://github.com/dyfer/supercollider/actions/runs/384592997

Next:

  • linting
  • build matrix (at least some of the jobs)
  • windows build
  • unit tests

EDIT: Re: earlier consideration about the file structure - I'm leaning towards putting everything in one yaml file, at least for now.

@dyfer
Member

dyfer commented Nov 29, 2020

Update:

  • linting is now a separate step, all builds run once linting succeeds
  • I started implementing build matrix
    • compiler selection doesn't work yet, I need to figure that out
    • syslibs and libscsynth options not working yet

https://github.com/dyfer/supercollider/actions/runs/389305122
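
A sketch of what compiler selection in the matrix could look like once working (the specific entries and the CC/CXX plumbing are assumptions, not the final config):

```yaml
# hypothetical sketch of compiler selection via the build matrix
jobs:
  build:
    strategy:
      matrix:
        include:
          - { os: ubuntu-18.04, cc: gcc-9,    cxx: g++-9 }
          - { os: ubuntu-18.04, cc: clang-10, cxx: clang++-10 }
    runs-on: ${{ matrix.os }}
    env:
      CC: ${{ matrix.cc }}    # cmake reads CC/CXX when configuring
      CXX: ${{ matrix.cxx }}
    steps:
      - uses: actions/checkout@v2
      - run: cmake -B build . && cmake --build build
```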

@claremacrae
Contributor

This is great progress, @dyfer

Has there been any discussion about making the CI mostly call scripts?

One of the things I've admired about SuperCollider's CI is that, as far as I can see, it calls out to scripts on disk, such as:

https://github.com/supercollider/supercollider/tree/develop/.travis

I had wondered whether some such scripts might be more generally useful, e.g. to a developer on a particular platform... And so might be moved out of a .travis-specific directory, so they are reusable and more discoverable?

Whether they are moved or duplicated, having the scripts in .sh or similar might also allow some components of the CI builds to be tested on dev machines/VMs before committing...

@dyfer
Member

dyfer commented Nov 29, 2020

I'm happy to have a discussion about using the scripts, as they have been used for Travis.

For me personally, it was rather confusing to find out what was happening in which script, and where environment variables were set. I won't insist having everything in one file is better, but that works better for me personally ATM.
I also think it's tricky to keep things both universal (reusable scripts) and idiomatic for a given CI system. E.g. currently some steps run conditionally using YAML syntax, not shell logic...
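
An example of the YAML-level conditionals mentioned above - logic that lives in GHA's syntax rather than in a portable shell script (the package names are illustrative):

```yaml
# hypothetical: the `if:` key is evaluated by GHA itself, not by the shell
- name: install linux dependencies
  if: runner.os == 'Linux'
  run: sudo apt-get install -y libjack-jackd2-dev

- name: install macOS dependencies
  if: runner.os == 'macOS'
  run: brew install portaudio
```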

@dyfer
Member

dyfer commented Nov 29, 2020

BTW I now have most of the linux build matrix implemented. https://github.com/dyfer/supercollider/runs/1469485607
For now I've stuck with the compiler versions provided by the GH runners. What's tested now is gcc 7, 8, 9, 10 and clang 6.0, 8, 9, 10. The runners are missing gcc 6, gcc <5, clang <6 and clang 7.

gcc 5 is available, but only in the ubuntu 16.04 image, which comes with an older Qt (5.5). I've set up a different action to install an arbitrary Qt version, but that has recently broken, so for now I disabled the build with gcc 5.
There are still some debugging leftovers in the YAML file.

TODO: ccache, macOS build matrix, possibly Windows builds, testing (!)

Also: clang-format-8 is available in the runner image. Could we use that instead of downloading our own? Does it need to be an exact version?

@mossheim
Contributor Author

mossheim commented Nov 29, 2020

clang-format-8 is available in the runner image. Could we use that instead of downloading our own?

yes, absolutely

What's tested now is gcc 7, 8, 9, 10 and clang 6.0, 8, 9, 10. The runners are missing gcc 6, gcc <5, clang <6 and clang 7.

this is fine for now, since we are moving to c++17 soon we will want to drop a few of the older compilers anyway. i think we'll still want to have gcc 6 and clang 7. clang 11 is also available now i believe.

TODO: ccache

this should be as easy as installing the package and making sure it's in path; the build system will pick it up automatically. but then you also need to preserve the cache across builds. speaking of which, can we cache homebrew or apt on these machines and is that something recommended by GHA docs? i remember on travis they said not to do it.
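
A sketch of how the cache-preservation part might look with the first-party cache action (the key scheme and paths are guesses to be refined, not tested config):

```yaml
# hypothetical: persist ccache's directory between workflow runs
- name: cache ccache
  uses: actions/cache@v2
  with:
    path: ~/.ccache
    key: ccache-${{ runner.os }}-${{ github.sha }}
    restore-keys: ccache-${{ runner.os }}-   # fall back to the newest cache
- name: install ccache (linux)
  run: sudo apt-get install -y ccache
```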


since travis is apparently decommissioning more of their linux machines, we are probably now going to start seeing increasingly bad build delays (last week i think we saw some builds take almost a full day) judging from this traviscistatus.com graphic. so perhaps you could make a PR this week and get us switched over minimally, and then we can continue adding features as needed?

[screenshot: traviscistatus.com build-backlog graph]

@josiah-wolf-oberholtzer

@claremacrae Fail-fast is on by default with GHA, so you'll have to turn it off explicitly (see: https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstrategyfail-fast).
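
Per the linked docs, turning it off is a one-liner in the strategy block (the matrix entries shown are just an example):

```yaml
strategy:
  fail-fast: false   # keep the other matrix jobs running after one failure
  matrix:
    os: [ubuntu-20.04, macos-10.15]
```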

@dyfer
Member

dyfer commented Nov 29, 2020

@claremacrae Fail-fast is on by default with GHA, so you'll have to turn it off explicitly (see: https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstrategyfail-fast).

Just did in my GHA integration branch.

@claremacrae
Contributor

I believe that "set -o errexit" is one of several options that spot mistakes in shell scripts much earlier... and avoid time spent debugging them.

Having said that, the ones I tend to use are:

# Force execution to halt if there are any errors in this script:
set -e
set -o pipefail
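
A tiny demonstration of why pipefail matters: by default a pipeline's exit status is that of its last command, so an earlier failure is masked (standard bash behavior):

```shell
#!/usr/bin/env bash
# A pipeline's status is, by default, the status of its LAST command,
# so a failure earlier in the pipe is silently swallowed.
false | true
echo "default pipeline status: $?"    # prints 0: the failure is hidden

# With pipefail, the pipeline fails if ANY component fails.
set -o pipefail
false | true
echo "pipefail pipeline status: $?"   # prints 1: the failure propagates
```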

@claremacrae
Contributor

@claremacrae Fail-fast is on by default with GHA, so you'll have to turn it off explicitly (see: https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstrategyfail-fast).

Yes thanks - sorry I was unclear in my wording...

@josiah-wolf-oberholtzer

Also thanks for making this transition so quickly! I just converted from Travis to GHA for my SC Python client library and I'm generally quite happy. GHA doesn't do as good a job with live log output, but otherwise we're good.

And ℹ️ I can report that you can boot both Jack and scsynth inside GHA runners.

@dyfer
Member

dyfer commented Nov 29, 2020

And ℹ️ I can report that you can boot both Jack and scsynth inside GHA runners.

That's great! Thanks for reporting.

@dyfer
Member

dyfer commented Nov 30, 2020

Update:

  • ccache is enabled and works for the linux and macOS builds (macOS initially had issues finding ccache, I think; setting env variables from a shell script needs to be investigated...)
  • linting works (class library lint failure is ignored)
  • "fail fast" is disabled - after successful linting all build jobs will finish, even if one of them fails
  • a quick search indicated that a cache can be set up for homebrew by manually adding certain directories to the cache; I haven't tried this
  • I don't know about caching apt packages
  • macOS build matrix implementation is incomplete (not all switches work yet)

https://github.com/dyfer/supercollider/actions/runs/391176708
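
Regarding setting env variables from a shell script: GHA discards plain `export`s when a step's shell exits, so a variable meant for later steps has to be appended to $GITHUB_ENV; a sketch (step and variable names are illustrative):

```yaml
# hypothetical: variables written to $GITHUB_ENV survive into later steps,
# unlike a plain `export`, which dies with the step's shell
- name: set up ccache environment
  run: |
    echo "CCACHE_DIR=$HOME/.ccache" >> $GITHUB_ENV
    echo "USE_CCACHE=1" >> $GITHUB_ENV

- name: build
  run: echo "ccache dir is $CCACHE_DIR"   # visible here
```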

@mossheim
Contributor Author

And ℹ️ I can report that you can boot both Jack and scsynth inside GHA runners.

this is awesome news @josiah-wolf-oberholtzer !

@claremacrae
Contributor

Note to self, to make it easy to see @dyfer's branch for this work:
https://github.com/dyfer/supercollider/tree/github-actions-test-01

@claremacrae
Contributor

@dyfer Hi Marcin, I just wanted to say a huge thank you for all your work on this ticket - not least because of how much I have learned from what you have done.

This evening I have been able to move over to GHA almost all the Travis builds from a project I work on - and also make a bunch of other simplifications to that project's GHA config!

🙏

@dyfer
Member

dyfer commented Dec 1, 2020

update/recap:

  • build matrix for both linux and macOS is (mostly) implemented (1)
  • ccache is used and cached between builds
  • linting works
  • artifact names use either the tag or the commit sha (if a tag is not present)

not implemented/todo:

  • run tests
  • create windows builds
  • upload/publish nightly builds (?) (2)
  • fix macOS legacy builds (3)
  • deployment (4)

notes:
(1) linux is missing a few compilers, could be added later; macOS is missing the legacy build (see (3) below)
(2) GHA and the artifacts are available only to registered GH users; it's unclear whether there's a way to obtain a public link to "latest develop build" (maybe through a 3rd party app/script that uses GH API?); or we might need to upload to S3, like before
(3) since I don't think there's a way to get a binary download of an older Qt in homebrew for catalina, I was investigating what seemed like the most popular way to get Qt on GHA runners: install-qt-action. This currently does not work; it uses the python script aqt, which in turn requires pycryptodome, and that one has an issue installing on the GHA runner. I hope this will be resolved upstream and then we can re-enable the legacy build
(4) deployment can wait IMO - macOS builds need to be manually downloaded, signed and uploaded anyway...
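
The tag-or-commit-sha artifact naming from the recap could be implemented along these lines (the name format and variable are illustrative, not the actual config):

```yaml
# hypothetical: name the artifact after the tag, falling back to a short sha
- name: determine artifact name
  run: |
    if [[ "$GITHUB_REF" == refs/tags/* ]]; then
      echo "ARTIFACT_NAME=SC-${GITHUB_REF#refs/tags/}" >> $GITHUB_ENV
    else
      echo "ARTIFACT_NAME=SC-${GITHUB_SHA::7}" >> $GITHUB_ENV
    fi
```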

tl;dr I'd like to step away from this for a few days. I think this is in a reasonable shape to provide basic substitution for Travis, with some functionality missing as indicated above. What do you think @brianlheim @claremacrae ?

@mossheim
Contributor Author

mossheim commented Dec 1, 2020

@dyfer yes, this is amazing, thank you so much!!!

i'm happy to help with the deployments to S3 and GitHub Releases, it would be nice to not have any continuity breaks in S3 during this switch over. everything else can be fixed later.

no worries if you want to leave it for a few days. can you make a PR now though? then i can review and/or push to the branch. the travis backlog is getting worse by the day so i really want to make this switch ASAP.

@dyfer
Member

dyfer commented Dec 1, 2020

@brianlheim great, I'll make a PR.
I was thinking that we could even leave the travis macOS builds (just main + legacy) up; these don't seem to be backlogged so much (I think?). But yeah, I'll make a PR and we can take it from there.
Also, there's a number of little things I discovered setting this up. I was thinking about writing a personal blog post about it, but maybe this could go directly on the SC wiki? What do you think?

@mossheim
Contributor Author

mossheim commented Feb 7, 2021

note to self - put my secrets in https://github.com/supercollider/supercollider/settings/secrets/actions

@mossheim mossheim removed their assignment Feb 10, 2021
@mossheim
Contributor Author

mossheim commented Feb 12, 2021

note - when removing travis CI support, also remove the travis_test_run_proto.json file from the repo

@mossheim
Contributor Author

sorry @dyfer I just realized I never responded to this:

Also, there's a number of little things I discovered setting this up. I was thinking about writing a personal blog post about it, but maybe this could go directly on the SC wiki? What do you think?

if you still have any interest or if you already wrote it I'm sure it could be useful on our wiki too :)

@dyfer
Member

dyfer commented Feb 12, 2021

if you still have any interest or if you already wrote it I'm sure it could be useful on our wiki too :)

How about this? https://github.com/supercollider/supercollider/wiki/GitHub-Actions-migration-notes
:-)

@dyfer
Member

dyfer commented Feb 23, 2021

Writing this down so I remember - the last step before closing this would be to update our wiki, changing references to Travis and AppVeyor as needed. We should probably also update the main readme (the "SC is built/tested with ..." parts), as this has changed.

EDIT: here's the wiki page that needs to be updated

This was referenced Mar 1, 2021
@dyfer
Member

dyfer commented Mar 4, 2021

New Wiki page has been added: https://github.com/supercollider/supercollider/wiki/Continuous-Integration---GitHub-Actions

@mossheim
Contributor Author

mossheim commented Mar 4, 2021

i also consider this fixed with #5385, thanks @dyfer !!!

we still need to move over sc3-plugins, i'll make a ticket there now.

@mossheim mossheim unpinned this issue Mar 4, 2021