@fungicide:matrix.org | > <@sean-k-mooney:matrix.org> corvus: I tried the tutorial in a clean Ubuntu 22.04 cloud image where it was the only thing I had done. So perhaps, but that implies the issue is with Ubuntu 22.04 | 13:03 |
---|---|---|
looks like we're testing it on 22.04: https://zuul.opendev.org/t/zuul/build/3f82214752b24914aeaa608f58190fdf | ||
@sean-k-mooney:matrix.org | ya, not really sure what to say. When I hit the Gerrit endpoint I still got the first login screen | 14:36 |
-@gerrit:opendev.org- Zuul merged on behalf of Simon Westphahl: [zuul/zuul] 860684: End node request span when result event is sent https://review.opendev.org/c/zuul/zuul/+/860684 | 14:38 | |
@sean-k-mooney:matrix.org | fungi: my only guess is that https://opendev.org/zuul/zuul/src/branch/master/playbooks/tutorial/roles/setup-tutorial/tasks/main.yaml#L48-L62 bypasses the start page by polling it | 14:45 |
@sean-k-mooney:matrix.org | it's basically what https://opendev.org/zuul/zuul/src/branch/master/doc/source/examples/playbooks/setup.yaml#L29-L37 is doing, however that does not wait for Gerrit to be up | 14:48 |
@sean-k-mooney:matrix.org | I guess it does here https://opendev.org/zuul/zuul/src/branch/master/doc/source/examples/playbooks/setup.yaml#L9 | 14:49 |
@sean-k-mooney:matrix.org | in any case, in CI we retry it, whereas in docker-compose we try just once | 14:49 |
@sean-k-mooney:matrix.org | I think that is why I'm seeing a delta | 14:50 |
@sean-k-mooney:matrix.org | so the fix might be to add | 14:51 |
@sean-k-mooney:matrix.org | until: result.status == 200 and not result.redirected | 14:51 |
delay: 1 | ||
retries: 120 | ||
@sean-k-mooney:matrix.org | to the docker compose version | 14:51 |
@sean-k-mooney:matrix.org | although, no, that won't work since it's before we have created the admin user | 14:52 |
@sean-k-mooney:matrix.org | actually, maybe it will; we are not creating the admin user, we are just updating the SSH keys | 14:53 |
@sean-k-mooney:matrix.org | so it should already exist in demo mode | 14:53 |
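A minimal sketch of the fix sean-k-mooney is describing, assuming an Ansible `uri` task in the docker-compose setup playbook; the task name, URL, and credentials below are illustrative assumptions, and only the `until`/`delay`/`retries` lines come from the discussion above:

```yaml
# Hedged sketch, not the actual playbook task: keep polling Gerrit until it
# answers 200 without redirecting to the first-login page, then carry on
# with updating the admin user's SSH keys.
- name: Wait for Gerrit to be up
  uri:
    url: http://localhost:8080/a/accounts/self/sshkeys
    user: admin
    password: secret
    force_basic_auth: true
  register: result
  until: result.status == 200 and not result.redirected
  delay: 1
  retries: 120
```

This mirrors what the CI setup-tutorial role already does; the open question in the chat is only whether the docker-compose variant can reuse the same loop given that it runs before the tutorial's admin setup.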
@clarkb:matrix.org | corvus: tristanC My efforts on https://review.opendev.org/c/zuul/zuul/+/860753 to try and use swc instead of Babel are stalled out because I think patternfly requires react 16? Is that a good assumption, or is there a way to use react 17/18 with patternfly somehow? I don't think this is super urgent, but we do spend a fair bit of time running Babel compiles, and swc (and also esbuild) is supposedly much quicker | 15:16 |
@jim:acmegating.com | Clark: I haven't looked into react/patternfly version compatibility, so I don't know off the top of my head. | 15:21 |
@jim:acmegating.com | Clark: this doesn't address your question, but just as an fyi, react-app-rewired also exists to solve the same problem as craco, and may be more lightweight for certain things. | 15:25 |
@clarkb:matrix.org | I'll have to take a look at that. All of this is quite new to me :) | 15:26 |
-@gerrit:opendev.org- James E. Blair https://matrix.to/#/@jim:acmegating.com proposed: | 15:50 | |
- [zuul/zuul] 859466: Linger on auth_callback page until login is complete https://review.opendev.org/c/zuul/zuul/+/859466 | ||
- [zuul/zuul] 859481: Web: always set redux auth and wait for it https://review.opendev.org/c/zuul/zuul/+/859481 | ||
- [zuul/zuul] 860032: Show login button any time auth is available https://review.opendev.org/c/zuul/zuul/+/860032 | ||
- [zuul/zuul] 860033: Add access-rules configuration and documentation https://review.opendev.org/c/zuul/zuul/+/860033 | ||
- [zuul/zuul] 860034: Set Access-Control-Allow-Origin headers in check_auth tool https://review.opendev.org/c/zuul/zuul/+/860034 | ||
- [zuul/zuul] 860035: Support authz for read-only web access https://review.opendev.org/c/zuul/zuul/+/860035 | ||
-@gerrit:opendev.org- James E. Blair https://matrix.to/#/@jim:acmegating.com proposed: [zuul/zuul] 860605: Include skip reason in build error_detail https://review.opendev.org/c/zuul/zuul/+/860605 | 15:55 | |
@clarkb:matrix.org | https://review.opendev.org/c/zuul/nodepool/+/860591/ is a quick review that should fix a nodepool unittest race (thank you corvus for debugging that and fixing it) | 15:57 |
-@gerrit:opendev.org- Clark Boylan proposed: [zuul/zuul] 860858: Build zuul web with esbuild https://review.opendev.org/c/zuul/zuul/+/860858 | 17:02 | |
@clarkb:matrix.org | Trying a different approach with react-app-rewired and esbuild instead, as there seem to be existing tools built around that which work with older react | 17:03 |
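Roughly what that approach looks like, sketched under assumptions rather than taken from change 860858: react-app-rewired reads a `config-overrides.js` next to `package.json`, and esbuild-loader 2.x (which still supports webpack 4) provides a minifier plugin that can replace CRA's Terser step. The plugin name and options follow esbuild-loader's 2.x API; verify against the version actually installed.

```js
// config-overrides.js -- picked up by react-app-rewired in place of the stock
// react-scripts webpack config. Sketch only.
const { ESBuildMinifyPlugin } = require('esbuild-loader');

module.exports = function override(config) {
  // Swap CRA's Terser-based minification for esbuild's, which is where a
  // large share of the production build time goes.
  config.optimization.minimizer = [
    new ESBuildMinifyPlugin({ target: 'es2015', css: true }),
  ];
  return config;
};
```

For the override to take effect, the package.json build script also has to invoke `react-app-rewired build` instead of `react-scripts build`.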
-@gerrit:opendev.org- James E. Blair https://matrix.to/#/@jim:acmegating.com proposed: [zuul/zuul] 860606: Include some skipped jobs in the code-review report https://review.opendev.org/c/zuul/zuul/+/860606 | 17:03 | |
@clarkb:matrix.org | One thing I notice is that we don't really have a good way to clean out old builds? | 17:04 |
@clarkb:matrix.org | we git ignore them, so git clean by default doesn't clear them. I guess we just need to use -x | 17:05 |
@clarkb:matrix.org | I think that cuts the build time to just under a minute from about 6 minutes. I'm definitely not the person to weigh in on how safe that is | 17:10 |
@clarkb:matrix.org | also I think the esbuild stuff wants webpack 4.40.0 or newer, but we're on webpack 4.3.3 for some reason (we don't list it explicitly, so it must be a transitive dep?) | 17:12 |
@jim:acmegating.com | Clark: I think CRA is responsible for webpack | 17:14 |
@clarkb:matrix.org | corvus: hrm, react-scripts actually pulls in webpack 4.42.0, which does satisfy webpack@^4.40.0; I wonder why it complains. Anyway, I think this is actually fine due to that | 17:31 |
@clarkb:matrix.org | Oh I think it is because react-scripts is a dependency but not a dev dependency | 17:32 |
@clarkb:matrix.org | and so things are a bit split. I think react-scripts should actually be a dev dependency? | 17:33 |
@clarkb:matrix.org | hrm no that doesn't make the warning go away | 17:36 |
@clarkb:matrix.org | interesting | 17:36 |
@clarkb:matrix.org | I guess package.json understands "peer dependencies", and since we don't list webpack at the same level it is unmet? If I add `webpack@^4.40.0` it makes the error go away, but then the package lock lists both the older and newer webpack versions. Any idea how that gets resolved? | 17:49 |
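To make the warning concrete: the esbuild tooling declares webpack as a peer dependency, roughly as below (only the `^4.40.0` range is quoted from the chat; the surrounding manifest is illustrative), and per Clark's speculation it is reported unmet because nothing at the top level of our own package.json provides webpack, only react-scripts does, transitively.

```json
{
  "peerDependencies": {
    "webpack": "^4.40.0"
  }
}
```

Adding `webpack@^4.40.0` directly, as described above, silences the warning, but the package lock then lists both the older and newer webpack versions, which is the duplication being asked about here.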
@clarkb:matrix.org | Old Babel build on OVH BHS1: https://zuul.opendev.org/t/zuul/build/bc838c782ae744b4b4c266a64e2d4568/log/job-output.txt#840-842 New esbuild on OVH BHS1: https://zuul.opendev.org/t/zuul/build/d2630a16c15c43c281fb7b3345786a57/log/job-output.txt#833-835 370 seconds vs 67. You have to scroll down a bit to see where it reports the second count, but the timestamps for the job also illustrate this | 18:39 |
@clarkb:matrix.org | I guess we should double check the site preview build works as expected | 18:40 |
@clarkb:matrix.org | The site preview does seem to work. It does also seem slow. I'm not sure if that is due to my change or just a problem of viewing through the object storage hosting. I'll try to cross check against another build | 18:42 |
@clarkb:matrix.org | ya, they both seem slow, so it's hard to blame esbuild (though still possible) | 18:44 |
@jim:acmegating.com | Clark: output size seems to be the same (checking the tarball) | 19:16 |
@jim:acmegating.com | new has fewer files | 19:17 |
-@gerrit:opendev.org- James E. Blair https://matrix.to/#/@jim:acmegating.com proposed: | 20:24 | |
- [zuul/zuul] 860877: Add "draft" github pipeline requirement https://review.opendev.org/c/zuul/zuul/+/860877 | ||
- [zuul/zuul] 860878: Expand github pipeline reject docs https://review.opendev.org/c/zuul/zuul/+/860878 | ||
@fungicide:matrix.org | 5x speedup?!? that's nothing to sneeze at! | 20:42 |
@clarkb:matrix.org | ya, it's definitely not the biggest cost in those jobs, but it does seem to be something we can have a valuable impact on | 20:47 |
@clarkb:matrix.org | I think what I've found when digging into jobs that are slow is that it isn't typical for a single task to be consuming the vast majority of the runtime without a valid reason. Instead we tend to have a bunch of tasks that are all a bit slow, and if we improve them in aggregate we see good gains | 20:48 |