opendevreview | Lukas Kranz proposed zuul/zuul-jobs master: Make prepare-workspace-git fail faster. https://review.opendev.org/c/zuul/zuul-jobs/+/910582 | 06:35 |
opendevreview | Merged openstack/project-config master: Remove old infra team puppet testing https://review.opendev.org/c/openstack/project-config/+/912309 | 10:08 |
*** | Adri2000_ is now known as Adri2000 | 10:25 |
opendevreview | Dr. Jens Harbott proposed zuul/zuul-jobs master: DNM: testing ensure-docker role https://review.opendev.org/c/zuul/zuul-jobs/+/913897 | 11:24 |
frickler | infra-root: checking autoholds, I notice that there are (again?) some held nodes without a matching autohold in zuul (associated with my name). seems like some cleanup was missed when the zuul data was reset? | 11:27 |
yoctozepto | thanks for looking into ensure-docker, frickler! nebulous has been hit badly by this | 11:34 |
frickler | yoctozepto: see https://review.opendev.org/c/zuul/zuul-jobs/+/909029 , no idea how to fix though | 11:34 |
yoctozepto | frickler: any quick summary how that one is related? | 11:35 |
yoctozepto | (sorry, I am just getting around to this issue!) | 11:35 |
frickler | yoctozepto: that sets the API version cap to 1.22, which is no longer supported in 26.0.0 | 11:36 |
yoctozepto | frickler: ack, so the new version is the culprit for this recent breakage | 11:37 |
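For context on the failure mode above: Docker 26.0.0 no longer supports API 1.22, so the DOCKER_MIN_API_VERSION=1.22 override (the skopeo workaround) now breaks the daemon instead of being ignored. Below is a minimal sketch, not the role's actual logic, for checking that locally; it assumes `docker version --format` exposes the Server.Version and Server.MinAPIVersion template fields, which recent releases do.

```python
#!/usr/bin/env python3
"""Rough sketch: is the DOCKER_MIN_API_VERSION=1.22 skopeo override still
safe on the installed Docker? The template field names are assumptions."""
import subprocess

SKOPEO_API = (1, 22)  # the API cap the workaround pinned

def docker_fmt(field: str) -> str:
    return subprocess.run(
        ["docker", "version", "--format", "{{.Server.%s}}" % field],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def vtuple(v: str) -> tuple:
    return tuple(int(p) for p in v.split("."))

if __name__ == "__main__":
    server = vtuple(docker_fmt("Version"))        # e.g. (26, 0, 0)
    min_api = vtuple(docker_fmt("MinAPIVersion"))  # e.g. (1, 24)
    if SKOPEO_API < min_api:
        # Per the discussion above, pinning an API version below the
        # daemon's supported floor now breaks it rather than being ignored.
        print(f"docker {server}: pinning API {SKOPEO_API} would break the daemon")
    else:
        print(f"docker {server}: API {SKOPEO_API} override still accepted")
```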
opendevreview | Dr. Jens Harbott proposed zuul/zuul-jobs master: Revert "Override DOCKER_MIN_API_VERSION for skopeo when installing docker" https://review.opendev.org/c/zuul/zuul-jobs/+/913808 | 11:41 |
yoctozepto | frickler: to avoid breaking poor skopeo (if still applicable) - could we default to installing docker<26.0? that would keep yesterday's jobs happy | 11:44 |
yoctozepto | then we only need to announce on the mailing list that we have this workaround | 11:44 |
yoctozepto | and then have some time to think this situation through | 11:44 |
yoctozepto | WDYT? | 11:45 |
yoctozepto | also cc fungi and clarkb because maybe you will be around soon | 11:45 |
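To make the docker<26.0 idea concrete, one approach is to ask apt which docker-ce candidates exist and take the newest one below 26.0. This is only a sketch: the package name, the `apt-cache madison` output format, and the pin variable are assumptions, and (as noted a few lines down) ensure-docker currently has no pin knob to feed the result into.

```python
#!/usr/bin/env python3
"""Sketch: pick the newest docker-ce candidate below 26.0 from apt,
assuming docker-ce packages from the upstream repos and the usual
`apt-cache madison` layout."""
import re
import subprocess

def candidates(pkg: str = "docker-ce") -> list[str]:
    out = subprocess.run(
        ["apt-cache", "madison", pkg],
        capture_output=True, text=True, check=True,
    ).stdout
    # madison lines look roughly like:
    #   docker-ce | 5:25.0.3-1~ubuntu.22.04~jammy | https://... amd64 Packages
    return [line.split("|")[1].strip() for line in out.splitlines() if "|" in line]

def upstream_major(version: str) -> int:
    # strip the epoch ("5:") and take the leading major component
    ver = version.split(":", 1)[-1]
    return int(re.match(r"(\d+)", ver).group(1))

if __name__ == "__main__":
    pinnable = [v for v in candidates() if upstream_major(v) < 26]
    if pinnable:
        # madison typically lists newest first; this value would feed a
        # hypothetical pin variable, which the role does not yet have
        print("pin candidate:", pinnable[0])
    else:
        print("no docker-ce < 26.0 available from the configured repos")
```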
frickler | yoctozepto: feel free to propose a patch. I'm trying to see what the revert will show, but it ended up in merge conflict | 11:45 |
yoctozepto | yeah, I see | 11:45 |
yoctozepto | I'm getting my dev env up and will produce something | 11:46 |
opendevreview | Dr. Jens Harbott proposed zuul/zuul-jobs master: Revert "Override DOCKER_MIN_API_VERSION for skopeo when installing docker" https://review.opendev.org/c/zuul/zuul-jobs/+/913808 | 11:48 |
yoctozepto | eh, the role does not support setting any versions/pins - this would be hackier / less minimal than I hoped for! | 11:50 |
yoctozepto | commented on your revert | 11:52 |
yoctozepto | frickler | 11:52 |
yoctozepto | let me know if you agree - then we change this patch to get it to do the right thing (TM) | 11:54 |
frickler | yoctozepto: I actually have no idea about all this, feel free to do anything you consider appropriate | 11:56 |
yoctozepto | https://zuul.opendev.org/t/zuul/build/a9bf948e8c824fb18076d6f212d3e0ad | 11:57 |
yoctozepto | it seems we need to keep that hotfix then | 11:57 |
yoctozepto | I am amending | 11:57 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Revert "Override DOCKER_MIN_API_VERSION for skopeo when installing docker" https://review.opendev.org/c/zuul/zuul-jobs/+/913808 | 11:58 |
yoctozepto | done | 11:58 |
frickler | just noticed that this is also blocking project-config changes :-( | 11:58 |
yoctozepto | oh my | 11:59 |
yoctozepto | little hell down here today it seems | 11:59 |
yoctozepto | hmm, it seems k8s-crio jobs are affected by something else as well | 12:04 |
yoctozepto | but at least mk8s is passing now | 12:04 |
yoctozepto | any idea if there is a job that exercises skopeo to the extent that would tell us how much we break? | 12:04 |
fungi | frickler: roughly how long ago are the timestamps on those forgotten held nodes? wondering if they're older than the last time we knowingly/intentionally had to reset zk state | 12:05 |
frickler | fungi: zuul says 2 months. and yes, current autohold count is at 12, so got reset recentish | 12:07 |
yoctozepto | we also have zuul-jobs-test-registry-buildset-registry failing | 12:10 |
yoctozepto | but this is because it is a config-project job and it does not pick up this patch | 12:10 |
yoctozepto | we need to get it out of this part of CI as it does not test the right thing! | 12:10 |
yoctozepto | I am trying to find out if I can quickly fix crio jobs | 12:11 |
fungi | yoctozepto: which config project? one in nebulous? | 12:12 |
frickler | zuul-jobs | 12:12 |
yoctozepto | yeah | 12:14 |
yoctozepto | opendev-buildset-registry is the parent | 12:15 |
yoctozepto | so it does not test the new state, but the broken old one | 12:15 |
frickler | but there must be some other reason why this is still failing https://zuul.opendev.org/t/zuul/build/13baac456db247e2b708e477fc1df427 | 12:15 |
frickler | let me hold another job for debugging | 12:15 |
yoctozepto | so the crio is definitely | 12:17 |
yoctozepto | Failed to update apt cache: W:The repository 'http://apt.kubernetes.io kubernetes-xenial Release' does not have a Release file., W:Data from such a repository can't be authenticated and is therefore potentially dangerous to use., W:See apt-secure(8) manpage for repository creation and user configuration details., E:Failed to fetch | 12:17 |
yoctozepto | https://packages.cloud.google.com/apt/dists/kubernetes-xenial/main/binary-amd64/Packages 404 Not Found [IP: 2607:f8b0:400b:807::200e 443], E:Some index files failed to download. They have been ignored, or old ones used instead. | 12:17 |
yoctozepto | definitely broken* | 12:17 |
yoctozepto | because of the apt repo | 12:17 |
yoctozepto | regardless of this, we can disable them for now as I don't know a good fix | 12:18 |
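Separately from the docker issue, the crio failure above is the retired legacy Kubernetes apt repo. A tiny sketch for probing whether a suite still publishes a Release/InRelease file, using the URLs from the error output pasted above:

```python
#!/usr/bin/env python3
"""Sketch: check whether an apt suite still publishes Release/InRelease
before trusting it in a job."""
import urllib.error
import urllib.request

SUITES = [
    "https://packages.cloud.google.com/apt/dists/kubernetes-xenial/Release",
    "https://packages.cloud.google.com/apt/dists/kubernetes-xenial/InRelease",
]

def alive(url: str) -> bool:
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        return False  # a 404 here is exactly the crio job failure above

if __name__ == "__main__":
    for url in SUITES:
        print(("OK     " if alive(url) else "MISSING"), url)
```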
yoctozepto | frickler | 12:18 |
yoctozepto | it fails because it does not pick up the fix | 12:18 |
yoctozepto | you can see the tasks it actually ran | 12:18 |
yoctozepto | it still uses the old one with overrides | 12:19 |
yoctozepto | and it is because this is parented on opendev-buildset-registry | 12:19 |
yoctozepto | it never tested the right changes... | 12:19 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Revert "Override DOCKER_MIN_API_VERSION for skopeo when installing docker" https://review.opendev.org/c/zuul/zuul-jobs/+/913808 | 12:28 |
fungi | got it, so in opendev-base-jobs | 12:29 |
yoctozepto | the only question that remains is how much we break anything-skopeo | 12:29 |
yoctozepto | yeah, I meant zuul's kind of project (config vs untrusted), rather than the name of the repo itself | 12:29 |
yoctozepto | any ideas about this skopeo? | 12:30 |
yoctozepto | otherwise I say we merge this as-is | 12:30 |
fungi | got it, and the opendev-buildset-registry job's pre-run playbook does an include of the (presently broken) ensure-docker role | 12:31 |
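One way to double-check the config-project diagnosis is to ask Zuul where a job's variants are defined. This sketch assumes the REST endpoint the web dashboard's job page consumes (/api/tenant/<tenant>/job/<name>) and its source_context fields; treat both as assumptions.

```python
#!/usr/bin/env python3
"""Sketch: list where Zuul job variants are defined, to confirm that the
buildset-registry jobs run playbooks from a config-project (and therefore
never see speculative zuul-jobs changes)."""
import json
import urllib.request

ZUUL = "https://zuul.opendev.org"
TENANT = "zuul"

def job_sources(job: str):
    url = f"{ZUUL}/api/tenant/{TENANT}/job/{job}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        variants = json.load(resp)
    for variant in variants:
        ctx = variant.get("source_context") or {}
        # A variant defined in a config-project runs its playbooks from the
        # merged branch tip, so speculative ensure-docker fixes are not applied.
        yield ctx.get("project"), ctx.get("branch"), ctx.get("path")

if __name__ == "__main__":
    for job in ("opendev-buildset-registry",
                "zuul-jobs-test-registry-buildset-registry"):
        print(job)
        for project, branch, path in job_sources(job):
            print(f"  defined in {project} [{branch}] at {path}")
```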
fungi | i need to switch computers to check, but has anyone raised this in the zuul matrix channel yet? | 12:33 |
yoctozepto | fungi: frickler did | 12:33 |
yoctozepto | and we are being quite verbose in the commit flow now :D | 12:33 |
fungi | cool, thanks | 12:33 |
yoctozepto | https://zuul.opendev.org/t/zuul/build/1b55c2b43be740cab4a38bd65776440c | 12:52 |
yoctozepto | meh, I forgot about this little thing there | 12:52 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Revert "Override DOCKER_MIN_API_VERSION for skopeo when installing docker" https://review.opendev.org/c/zuul/zuul-jobs/+/913808 | 12:55 |
yoctozepto | now the linters shall be happy | 12:55 |
Clark[m] | Are you telling me that docker 26 explodes if you set the min API version rather than simply ignoring the env var? | 13:26 |
fungi | that's what appears to be the case, yes | 13:26 |
Clark[m] | I have a very strong reaction to software designed this way that I will keep to myself | 13:26 |
fungi | i'm sure everyone else is already saying worse on twitter | 13:27 |
Clark[m] | Also they are about two months early on the release | 13:28 |
Clark[m] | And I bet skopeo has done nada to make things better on their end | 13:29 |
Clark[m] | If anyone knows the skopeo folks, backports of API version negotiation would be helpful, given the golang build requirements | 13:29 |
fungi | supposedly skopeo as of last month didn't need that workaround? based on the inline comments anyway | 13:29 |
fungi | but yes, very very recent at best | 13:29 |
Clark[m] | If you build the latest version, which you cannot do with the golang that comes on jammy | 13:30 |
fungi | aha, i missed that nuance | 13:30 |
Clark[m] | Which happens to still be the current Ubuntu lts | 13:30 |
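To illustrate the jammy/golang point: a quick comparison of the local Go toolchain against an assumed skopeo minimum. The 1.21 figure is a placeholder, not taken from skopeo's go.mod.

```python
#!/usr/bin/env python3
"""Sketch: is the distro Go toolchain new enough to build a recent skopeo?
The minimum below is a placeholder assumption."""
import re
import subprocess

ASSUMED_SKOPEO_MIN_GO = (1, 21)  # placeholder; check skopeo's go.mod

def local_go() -> tuple:
    out = subprocess.run(["go", "version"],
                         capture_output=True, text=True, check=True).stdout
    # e.g. "go version go1.18.1 linux/amd64" (jammy's default golang is 1.18)
    m = re.search(r"go(\d+)\.(\d+)", out)
    return (int(m.group(1)), int(m.group(2)))

if __name__ == "__main__":
    have = local_go()
    if have < ASSUMED_SKOPEO_MIN_GO:
        print(f"go {have} is too old to build current skopeo (needs ~{ASSUMED_SKOPEO_MIN_GO})")
    else:
        print(f"go {have} should be recent enough")
```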
fungi | and i suppose finding/installing prebuilt skopeo binaries is either a bad idea or not possible | 13:31 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Revert "Override DOCKER_MIN_API_VERSION for skopeo when installing docker" https://review.opendev.org/c/zuul/zuul-jobs/+/913808 | 13:31 |
Clark[m] | They don't publish binaries to GitHub releases. Way back we used the kubic OBS repos, but those are pretty dead now aiui | 13:33 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 13:34 |
Clark[m] | Note those registry jobs will be broken because skopeo is presumably broken too. | 13:35 |
Clark[m] | I think a recheck on the reenable change should expose that if we land the first change | 13:35 |
yoctozepto | let's maybe try to keep the discussion on matrix | 13:37 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 14:15 |
opendevreview | Ron Stone proposed openstack/project-config master: Update StarlingX docs promote job for R9 release https://review.opendev.org/c/openstack/project-config/+/913907 | 14:31 |
opendevreview | Merged zuul/zuul-jobs master: Revert "Override DOCKER_MIN_API_VERSION for skopeo when installing docker" https://review.opendev.org/c/zuul/zuul-jobs/+/913808 | 14:45 |
opendevreview | Merged openstack/project-config master: Update StarlingX docs promote job for R9 release https://review.opendev.org/c/openstack/project-config/+/913907 | 14:54 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 15:07 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 15:07 |
clarkb | fungi: any idea why https://zuul.opendev.org/t/zuul/build/9a3affc1df884603a0e477a05e90c915/console#1/0/19/debian-bookworm failed? I thought we configured things to trust those repos? | 15:27 |
clarkb | hrm maybe whatever that step is did not run in that job? | 15:27 |
clarkb | https://zuul.opendev.org/t/zuul/build/9a3affc1df884603a0e477a05e90c915/console#0/4/22/debian-bookworm apt update worked earlier in the job in this task | 15:28 |
fungi | looking | 15:29 |
clarkb | it must be because we add that new repository without any override? | 15:29 |
clarkb | oh that mirror is ubuntu only | 15:29 |
clarkb | I think we just rip it out and then use packages from upstream | 15:29 |
fungi | hah, indeed, hence the 404 there | 15:30 |
fungi | the error message is misleading since it tries to cover too many possible reasons | 15:31 |
clarkb | ya | 15:34 |
clarkb | I think I know how to fix this I just have to sort out the right variables to set | 15:35 |
opendevreview | James E. Blair proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 15:41 |
opendevreview | James E. Blair proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 15:41 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 15:48 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 15:48 |
clarkb | oh shoot | 15:48 |
fungi | when changes collide | 15:48 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 15:56 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 15:56 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 16:18 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 16:18 |
Clark[m] | this is becoming quite the adventure | 16:18 |
clarkb | er that was meant for the zuul matrix room, meh | 16:18 |
frickler | oh, btw, I did the linaro cert update successfully earlier today, but then forgot to make a note here because the other issues popped up | 16:33 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 16:35 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 16:35 |
clarkb | frickler: thank you! | 16:35 |
clarkb | frickler: did you find any issues with the local docs? If not I can port them as is to our proper docs | 16:35 |
fungi | thanks frickler! | 16:39 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 17:01 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 17:02 |
frickler | clarkb: the docs worked perfectly, I could copy them 1:1. I've been wondering whether the kolla invocation could be optimized to only update the haproxy, but no reason not to document it as is for now | 17:11 |
clarkb | frickler: sure, feel free to push up a change, particularly if you've got a better kolla incantation | 17:11 |
frickler | clarkb: the problem is that we are using kolla-ansible in a completely different, wrapped way downstream, so I would need to do some testing first | 17:13 |
clarkb | ah | 17:26 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 17:30 |
opendevreview | Clark Boylan proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 17:30 |
clarkb | I know there has been a ton of excitement today, but I'm wondering if there is any interest in a gitea 1.21.8 upgrade and/or upgrading refstack's mariadb? | 17:33 |
clarkb | I can probably be awake and fight this cold off enough if we want to do one or the other (possibly both?) | 17:34 |
fungi | i'm probably going to be mia between 18:30-20:30 because a restaurant some ways up the island that christine really likes is reopening for the season and i promised i'd take her to an early dinner there. otherwise i'm free to help | 17:39 |
clarkb | fungi: do you have a preference for which one we should do? I think gitea is probably safer, so that's my preference. https://review.opendev.org/c/opendev/system-config/+/913686 is the change; if you can review it, I can approve it around 20:00 UTC? | 17:40 |
yoctozepto | get well soon clarkb! | 17:40 |
yoctozepto | fungi: enjoy the dinner! | 17:41 |
fungi | clarkb: gitea sounds great to me | 17:41 |
fungi | maybe both in one day is a bit much, but we could do the other tomorrow if you're up for it | 17:42 |
clarkb | ++ | 17:42 |
fungi | i'm +2 on https://review.opendev.org/913686 , i guess that change is all we need to trigger the upgrade? | 17:43 |
clarkb | yup should be | 17:43 |
clarkb | ok I'll try to remember to approve it around 20:00 UTC | 17:43 |
fungi | thanks! | 17:44 |
fungi | heading out a couple minutes early, but will be back for gitea upgrade fun | 18:12 |
clarkb | infra-root: rax notified us of a cinder volume having trouble. It appears this volume was a boot-from-volume system disk for a node in rax dfw that we can probably clean up. Shouldn't have any impact on any running services | 18:48 |
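If someone wants to script the cleanup check, a hedged openstacksdk sketch for mapping the flagged volume back to its server follows; the cloud name and volume ID are placeholders, not the values from the provider ticket.

```python
#!/usr/bin/env python3
"""Sketch: map a Cinder volume to the server booting from it, so the node
can be cleaned up. Requires openstacksdk and a matching clouds.yaml entry;
the cloud name and volume ID below are placeholders."""
import openstack

CLOUD = "example-rax-dfw"             # placeholder clouds.yaml entry
VOLUME_ID = "REPLACE-WITH-VOLUME-ID"  # placeholder, from the provider ticket

if __name__ == "__main__":
    conn = openstack.connect(cloud=CLOUD)
    volume = conn.block_storage.get_volume(VOLUME_ID)
    print(f"volume {volume.id}: status={volume.status}")
    for attachment in volume.attachments or []:
        server = conn.compute.get_server(attachment["server_id"])
        print(f"  attached to {server.name} ({server.id}), status={server.status}")
```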
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 19:16 |
clarkb | as promised I have approved https://review.opendev.org/c/opendev/system-config/+/913686 at ~20:00 UTC | 19:57 |
fungi | i'm on my way back, should be at the keyboard in ~30 | 19:59 |
clarkb | see you soon | 20:02 |
fungi | back just in time, it seems | 20:39 |
clarkb | plenty of time actually. Looks like another 45 minutes before merging | 20:42 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 21:07 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Reenable buildset-registry jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913902 | 21:09 |
opendevreview | Radosław Piliszek proposed zuul/zuul-jobs master: Reenable crio jobs https://review.opendev.org/c/zuul/zuul-jobs/+/913905 | 21:09 |
clarkb | the gitea update should merge any moment now. Then it should go straight to deployment as there are no hourly jobs currently running | 21:34 |
fungi | awesome | 21:36 |
opendevreview | Merged opendev/system-config master: Update gitea to 1.21.8 https://review.opendev.org/c/opendev/system-config/+/913686 | 21:40 |
fungi | there we are | 21:41 |
clarkb | promotion is done and deployment is starting. Should be a few minutes and then we'll be able to check gitea09 | 21:41 |
fungi | seems 09 is down | 21:43 |
clarkb | ya it's waiting for services to shut down now | 21:43 |
clarkb | it is back | 21:44 |
clarkb | https://gitea09.opendev.org:3081/opendev/system-config loads for me | 21:44 |
fungi | same. i'm cloning nova from it now | 21:44 |
fungi | because that's the best way we have for making it wish it could go back offline again ;) | 21:45 |
clarkb | ha I cloned system-config and that worked for me | 21:45 |
fungi | i'm not so considerate | 21:45 |
clarkb | gitea10 is done now too | 21:47 |
fungi | gitea09 obliged my nova clone request nevertheless. seems the upgrade is in good shape | 21:49 |
clarkb | we're done through 12 now | 21:49 |
clarkb | and now all are done | 21:52 |
fungi | and all deploy jobs succeeded | 21:53 |
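For the record, a small post-upgrade spot check one could run against the backends. It assumes the unauthenticated Gitea /api/v1/version endpoint and that the backend set is gitea09 through gitea14 on port 3081, inferred from the hostnames mentioned above; adjust to the real inventory.

```python
#!/usr/bin/env python3
"""Sketch: ask each Gitea backend which version it reports after the
1.21.8 upgrade. Backend list and the unauthenticated version endpoint
are assumptions."""
import json
import urllib.request

EXPECTED = "1.21.8"
BACKENDS = [f"gitea{n:02d}.opendev.org:3081" for n in range(9, 15)]

if __name__ == "__main__":
    for host in BACKENDS:
        url = f"https://{host}/api/v1/version"
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                version = json.load(resp).get("version", "unknown")
        except OSError as exc:  # e.g. connection refused while a backend restarts
            print(f"{host}: unreachable ({exc})")
            continue
        status = "ok" if version == EXPECTED else "MISMATCH"
        print(f"{host}: {version} ({status})")
```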
clarkb | this appears to have gone well. | 22:00 |
fungi | yes! now to hope tomorrow's goes as smoothly | 22:00 |