*** benj_71 is now known as benj_7 | 02:41 | |
eandersson | Mentioned this in opendev as well, but I think something broke with the github mirror after the github issues last week. | 06:08 |
opendevreview | Dr. Jens Harbott proposed openstack/project-config master: Update github ssh rsa hostkey for uploads https://review.opendev.org/c/openstack/project-config/+/878616 | 07:06 |
*** jpena|off is now known as jpena | 07:42 | |
opendevreview | Merged openstack/project-config master: Update github ssh rsa hostkey for uploads https://review.opendev.org/c/openstack/project-config/+/878616 | 08:40 |
ykarel_ | hi a patch has been stuck for 96 hours https://zuul.opendev.org/t/openstack/status#877944 | 13:11 |
ykarel_ | all others look good, was there some issue around that time? | 13:12 |
ykarel_ | @infra-root ^ | 13:12 |
fungi | yeah, looks like it's waiting on a node (or nodes) for a neutron-tempest-plugin-ovn build | 13:12 |
fungi | that would have been since approximately this time thursday... | 13:13 |
ykarel_ | hmm there are multiple runs for that job since that time, just this one stuck | 13:14 |
fungi | so that was ~2023-03-23T12:28 | 13:14 |
fungi | according to https://meetings.opendev.org/irclogs/%23opendev/%23opendev.2023-03-23.log.html we had something going on with gerrit getting overwhelmed but it should have been resolved for hours by the time that change was enqueued | 13:16 |
fungi | if something was going on at that time, it doesn't seem we noticed it | 13:17 |
fungi | i'll have a look in logs | 13:17 |
ykarel_ | ohkk | 13:20 |
*** ykarel_ is now known as ykarel | 13:21 | |
frickler | there was a gerrit restart, too, maybe related | 13:26 |
fungi | hours earlier, but yes | 13:26 |
fungi | 2023-03-23 12:28:31,965 DEBUG zuul.Pipeline.openstack.check: [e: 5ad62b77625a411aa848b88e8bc76f94] Adding node request <NodeRequest 300-0020813181 ['nested-virt-ubuntu-focal']> for job <FrozenJob neutron-tempest-plugin-ovn> to item <QueueItem eceb7ec1d0e541618279a4abb0c47bb2 for <Change 0x7fe7228c54d0 openstack/neutron-tempest-plugin 877944,1> in check> | 13:28 |
fungi | so that was the node request | 13:28 |
fungi | now to see whatever became of nr 300-0020813181 | 13:28 |
fungi | zuul schedulers have been repeatedly logging this about it: | 13:29 |
fungi | 2023-03-27 13:27:18,855 DEBUG zuul.nodepool: [e: 5ad62b77625a411aa848b88e8bc76f94] Unable to revise locked node request <NodeRequest 300-0020813181 ['nested-virt-ubuntu-focal']> | 13:29 |
fungi | looks like a new patchset got uploaded moments ago though, so it's been dequeued now | 13:32 |
fungi | i guess i started tracking it down too late | 13:32 |
fungi | i'll still see if i can tell which launcher(s) took it | 13:33 |
fungi | 2023-03-23 12:28:36,524 DEBUG nodepool.driver.NodeRequestHandler[nl04.opendev.org-PoolWorker.ovh-bhs1-main-23b110982d5948f792c00c925fd701f8]: [e: 5ad62b77625a411aa848b88e8bc76f94] [node_request: 300-0020813181] Accepting node request | 13:38 |
fungi | and eventually this happened... "Launch failed for node" "Declining node request because nodes failed" | 13:39 |
fungi | so declined at 12:38:53, now to see if i can tell which launcher took it next | 13:40 |
fungi | oho, nl03 declined it at 12:38:54 and then logged this strange error... | 13:47 |
fungi | 2023-03-23 12:38:54,715 ERROR nodepool.driver.NodeRequestHandler[nl03.opendev.org-PoolWorker.osuosl-regionone-main-1791ed94295c4c6994d62df80ff6e617]: [e: 5ad62b77625a411aa848b88e8bc76f94] [node_request: 300-0020813181] Unable to modify missing request | 13:48 |
fungi | nodepool.exceptions.ZKLockException: Request <NodeRequest [...] > does not hold a lock | 13:49 |
fungi | so maybe an unlock race for the znode? | 13:49 |
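[editor's note: the unlock race fungi suspects can be shown in miniature. This is a hypothetical, self-contained sketch of a check-then-act race on a lock, not actual Zuul/nodepool or ZooKeeper code; `FakeRequest`, `demo`, and the thread choreography are invented for illustration. The idea: a handler checks that it holds the request's lock, the lock is released out from under it before it acts, and the subsequent modify fails the same way the "does not hold a lock" error did.]

```python
# Toy sketch (hypothetical, NOT nodepool code) of a check-then-act
# race: thread A verifies it holds a lock, thread B releases it in
# between, and A's modify step then fails.
import threading

class ZKLockException(Exception):
    """Stand-in for nodepool.exceptions.ZKLockException."""

class FakeRequest:
    def __init__(self):
        self.lock_holder = None
        self._mutex = threading.Lock()

    def acquire(self, who):
        with self._mutex:
            self.lock_holder = who

    def release(self):
        with self._mutex:
            self.lock_holder = None

    def modify(self, who, checked, released):
        # First check: do we hold the lock? (passes)
        if self.lock_holder != who:
            raise ZKLockException("Request does not hold a lock")
        checked.set()      # signal the other thread the check passed
        released.wait()    # ...during this gap the lock is released
        # Act: re-check before modifying -- the lock is gone now
        if self.lock_holder != who:
            raise ZKLockException("Request does not hold a lock")

def demo():
    req = FakeRequest()
    req.acquire("launcher-a")
    checked, released = threading.Event(), threading.Event()
    result = {}

    def modifier():
        try:
            req.modify("launcher-a", checked, released)
            result["error"] = None
        except ZKLockException as e:
            result["error"] = str(e)

    t = threading.Thread(target=modifier)
    t.start()
    checked.wait()   # wait until the modifier's check has passed
    req.release()    # release the lock out from under it
    released.set()
    t.join()
    return result["error"]
```

The events force the losing interleaving deterministically; in the real system the same window would open nondeterministically between a launcher's lock check and its znode update.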
*** jpena is now known as jpena|off | 16:33 | |
Generated by irclog2html.py 2.17.3 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!