
ci: pytest artifact upload #19466


Closed
wants to merge 1 commit into from

Conversation

andrzejnovak
Contributor

@andrzejnovak andrzejnovak commented Feb 6, 2021

Collects result_images from failed tests and uploads them as artifacts on GHA. Filtering was necessary because the full set is ~50 MB, which is not practical to download and inspect anyway.

https://github.com/matplotlib/matplotlib/actions/runs/543223044
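For reference, the upload step could look roughly like this (a minimal sketch; the action version, artifact name, and directory paths are illustrative assumptions, not the exact workflow from this PR):

```yaml
# Illustrative GHA step, not the exact one from this PR:
# upload the pre-filtered failed result images only when the test job fails.
- name: Upload failed test images
  uses: actions/upload-artifact@v2
  if: failure()
  with:
    name: result-images
    path: failed-result-images/
```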

@andrzejnovak andrzejnovak force-pushed the pytestart branch 6 times, most recently from d49e887 to ef37dc3 Compare February 6, 2021 14:49
@henryiii

henryiii commented Feb 8, 2021

@andrzejnovak implemented this in mplhep and it is really useful to be able to see what failed!

@QuLogic
Member

QuLogic commented Feb 9, 2021

I like the idea, but instead of writing it in bash hidden in the workflow, I would modify tools/visualize_tests.py. You could add an --only-failed option to only include the failed images, and a --output option that would copy the relevant files into a subdirectory that could be used as the artifact.
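A hypothetical sketch of what that `--only-failed`/`--output` subsetting could look like (the function name, the `*-failed-diff.png` marker convention, and the directory layout are assumptions for illustration, not matplotlib's actual implementation):

```python
"""Sketch: copy only the images belonging to failed comparisons
(identified here by ``*-failed-diff.png`` marker files) from a
result_images-style tree into a separate artifact directory."""
import shutil
from pathlib import Path


def copy_failed(results_dir, output_dir):
    """Copy failed-test images from results_dir to output_dir; return copies."""
    results = Path(results_dir)
    out = Path(output_dir)
    copied = []
    for diff in results.rglob("*-failed-diff.png"):
        # The diff marker implies sibling actual/expected images share its stem.
        stem = diff.name[: -len("-failed-diff.png")]
        for related in diff.parent.glob(stem + "*"):
            dest = out / related.relative_to(results)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy(related, dest)
            copied.append(dest)
    return copied
```

The resulting directory could then be handed directly to an `upload-artifact` step, so the artifact contains only the handful of images a reviewer actually needs.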

@andrzejnovak
Contributor Author

I see how that would work, but that script seems to be centred on something else. It would actually be very convenient if GH allowed uploading the HTML file directly as an artifact, but since it seems to always produce zips, having the plots in a web page would just add extra steps.

I find it a bit confusing, since pytest-mpl has a very convenient --mpl-results-path, but I wasn't able to pass it here.

@jklymak
Member

jklymak commented May 10, 2021

@QuLogic did you want those changes definitively, or was that just an idea? Gathering the failed images seems helpful to me.

@jklymak jklymak requested a review from QuLogic May 15, 2021 15:25
@QuLogic
Member

QuLogic commented May 20, 2021

We already have tools to determine which files are relevant in tools/triage_tests.py and tools/visualize_tests.py. I'd rather not have a third copy of this subsetting that may fall out of sync if things change, because no-one's going to notice that the CI version is wrong until a test result is broken, which may be long after said change. Even the scripts in tools break periodically because no-one runs them as often as tests.

Filtering was necessary because the full set is ~50 MB, which is not practical to download and inspect anyway.

So I disagree that filtering is necessary, because once downloaded, we can run the above tools on them to whittle them down to the important stuff. I'd rather we just upload the entire directory.

@andrzejnovak
Contributor Author

andrzejnovak commented May 21, 2021

@QuLogic I think that's a somewhat different use case. I might want to run the tools you linked if I ran tests locally, and I suppose one could want to cross-check remotes by downloading the outputs from GHA and visualizing them as one would local tests.

On the other hand, publishing only the failed images lets one quickly (because it's only the affected files) download the failed result and inspect what's wrong by eye without requiring any extra setup.

I suppose you'll notice something broke next time there's a PR with a failed img comparison test (which seems to happen often enough :) ).

@jklymak
Member

jklymak commented May 21, 2021

I guess I'm somewhat ambivalent about this. If I know a test fails, I usually need to debug it locally anyway, so getting the failed diff remotely is rarely useful. Occasionally there is something I can't repro locally, though....

@jklymak
Member

jklymak commented Jun 10, 2021

@andrzejnovak Thanks a lot for this. If the tests fail we now upload all the artifacts to GitHub actions artifacts, so this PR has been superseded. Sorry we didn't catch that when it happened! Thanks...

@andrzejnovak
Contributor Author

@jklymak no worries, glad the functionality made it in one way or another :)


4 participants