opendevreview | Merged openstack/releases master: [Yoga] Release puppet-nova https://review.opendev.org/c/openstack/releases/+/889563 | 11:31 |
elodilles | it seems we have a release job failure, which looks to be real: https://lists.openstack.org/pipermail/release-job-failures/2023-August/001672.html | 12:15 |
elodilles | i mean, maybe a job re-enqueue "solves" the issue | 12:15 |
elodilles | (ssl.SSLEOFError) | 12:16 |
opendevreview | Merged openstack/releases master: Release neutron-lib 3.8.0 https://review.opendev.org/c/openstack/releases/+/890038 | 12:29 |
opendevreview | Merged openstack/releases master: openstacksdk 1.4.0 https://review.opendev.org/c/openstack/releases/+/889085 | 12:33 |
opendevreview | Merged openstack/releases master: Release rally-openstack 2.3.0 https://review.opendev.org/c/openstack/releases/+/890212 | 12:33 |
opendevreview | Merged openstack/releases master: Add xstatic-angular-fileupload first release via the releases repo https://review.opendev.org/c/openstack/releases/+/889385 | 12:38 |
opendevreview | Merged openstack/releases master: [Yoga] Release puppet-heat https://review.opendev.org/c/openstack/releases/+/889554 | 12:48 |
opendevreview | Merged openstack/releases master: [Yoga] Release puppet-ironic https://review.opendev.org/c/openstack/releases/+/889556 | 13:21 |
opendevreview | Merged openstack/releases master: [Yoga] Release puppet-swift https://review.opendev.org/c/openstack/releases/+/889574 | 13:21 |
opendevreview | Merged openstack/releases master: [Yoga] Release puppet-gnocchi https://review.opendev.org/c/openstack/releases/+/889553 | 13:21 |
opendevreview | Merged openstack/releases master: [Yoga] Release puppet-ceilometer https://review.opendev.org/c/openstack/releases/+/889547 | 13:38 |
elodilles | cool, as far as I can see the release job failure was some temporary issue, since the other release-openstack-puppet jobs have passed: https://zuul.opendev.org/t/openstack/builds?job_name=release-openstack-puppet&skip=0 | 13:48 |
elodilles | fungi: can you please re-enqueue the failing job? https://lists.openstack.org/pipermail/release-job-failures/2023-August/001672.html | 13:49 |
fungi | elodilles: i'm swamped at the moment, but can take a look soon hopefully. please bug me again if i don't | 13:50 |
fungi | (seems like half a dozen unrelated things have all decided to break at the same moment) | 13:50 |
elodilles | fungi: of course, no problem, sorry, it's not urgent, feel free to do it when you have time. (I'll also look at the job results and ping you again later if needed) | 13:56 |
opendevreview | Merged openstack/releases master: [Yoga] Release glance https://review.opendev.org/c/openstack/releases/+/889451 | 14:16 |
fungi | elodilles: looking at https://zuul.opendev.org/t/openstack/build/42b9fee45c4443fabcdec35831655fed/console#4/0/3/localhost it seems the puppetforge upload failed with repeated ssl errors on connecting. any idea if there was an outage there? | 15:57 |
fungi | jcapitao[m]: ^ any idea? i can't find a puppetforge services or api status page where they might log outages | 16:01 |
fungi | luckily we don't seem to have uploaded it anywhere else after that failed, so it's probably safe to retry | 16:02 |
fungi | i'm not currently getting any socket negotiation failures from that site, so if it was down it looks like it may be working again now | 16:04 |
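For context, the "socket negotiation" check mentioned above amounts to attempting a TLS handshake against the Forge API. Below is a minimal sketch of such a check; the hostname is an assumption (the job's actual endpoint may differ), and an `ssl.SSLEOFError` like the one in the failed build means the remote side dropped the connection without a proper TLS shutdown.

```python
# Minimal TLS handshake check -- a sketch, not the release job's tooling.
# The hostname is an assumption; the job's real endpoint may differ.
import socket
import ssl

host = "forgeapi.puppet.com"  # assumed Puppet Forge API host
context = ssl.create_default_context()

try:
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            print("handshake ok:", tls.version())
except ssl.SSLEOFError as exc:
    # "EOF occurred in violation of protocol": peer closed without close_notify
    print("abrupt close during handshake:", exc)
except (ssl.SSLError, OSError) as exc:
    print("connection failed:", exc)
```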
fungi | zuul-client enqueue-ref --tenant=openstack --pipeline=release --project=openstack/puppet-nova --ref=refs/tags/20.4.1 --newrev=d7598a1ef2e7b9c1fd037360f5bbe8378e9d3bab | 16:06 |
fungi | i've run that ^ just now | 16:06 |
jcapitao[m] | I checked the SSL certificate to see whether it was issued and loaded during the puppet-nova upload, but no, it was issued a couple of months ago | 16:07 |
jcapitao[m] | let's see | 16:08 |
fungi | the error in the traceback looks like it was probably an error at a lower level, not something as straightforward as an expired cert | 16:08 |
fungi | "EOF occurred in violation of protocol" | 16:09 |
jcapitao[m] | right | 16:09 |
jcapitao[m] | quite explicit message | 16:10 |
fungi | and the job retried it several times, but got the same error each time, so it was consistent at least at that point | 16:10 |
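The retry behaviour described here would look roughly like the loop below. This is purely illustrative, not the actual release-openstack-puppet job code; `upload_release` and the retry parameters are hypothetical. It just shows how a persistent server-side abort surfaces as the same `ssl.SSLEOFError` on every attempt.

```python
# Illustrative retry loop -- NOT the actual release-openstack-puppet job code.
# upload_release() is a hypothetical stand-in for the Forge upload step.
import ssl
import time

def upload_with_retries(upload_release, attempts=3, delay=5):
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return upload_release()
        except ssl.SSLEOFError as exc:
            # The server dropped the connection mid-stream; a persistent
            # problem (e.g. a rejected payload) fails identically every time.
            last_error = exc
            print(f"attempt {attempt} failed: {exc}")
            time.sleep(delay)
    raise last_error
```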
fungi | release-openstack-puppet is running again for it... | 16:12 |
fungi | jcapitao[m]: it got the exact same error connecting to the puppetforge api again | 16:14 |
jcapitao[m] | damn | 16:14 |
jcapitao[m] | yeah I watched the logs | 16:15 |
fungi | maybe puppetforge dropped their v2 api? | 16:15 |
fungi | it specifically complains about "url: /v2/releases" | 16:16 |
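For reference, the failing step is, at its core, a POST of the module tarball to the Forge releases endpoint. The sketch below is only an assumption about the shape of that request (using `requests`; the host, auth scheme, and field names are guesses), not the job's real upload code, but it shows where a mid-upload disconnect leaves the client with no server response to inspect.

```python
# Rough sketch of a Forge release upload -- an assumption about the request
# shape, not the release job's actual code. Host, auth, and field names are guesses.
import requests

FORGE_API = "https://forgeapi.puppet.com"  # assumed host
TOKEN = "<forge-api-token>"                # hypothetical credential

def upload_module(tarball_path):
    with open(tarball_path, "rb") as tarball:
        resp = requests.post(
            f"{FORGE_API}/v2/releases",        # the path named in the job's error
            headers={"Authorization": f"Bearer {TOKEN}"},
            files={"file": tarball},           # multipart upload of the .tar.gz
            timeout=120,
        )
    resp.raise_for_status()  # a dropped connection raises before reaching this
    return resp.json()
```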
fungi | jcapitao[m]: what are the odds the error is because the puppetforge api is unceremoniously disconnecting the client partway into the upload or before responding with an error message about the upload? looking at https://zuul.opendev.org/t/openstack/builds?job_name=release-openstack-puppet uploads succeeded for 5 other modules after the initial failure | 16:17 |
fungi | so maybe there's something about the state of the puppet-nova module (invalid metadata for example)? | 16:18 |
fungi | what changes were there between 20.4.0 and 20.4.1? | 16:19 |
jcapitao[m] | yeah I think it's something particular to the puppet-nova tarball | 16:19 |
jcapitao[m] | we had an issue with puppet-nova in the past IIRC | 16:19 |
jcapitao[m] | maybe it's a heavy one | 16:19 |
jcapitao[m] | dealing with multipart upload? | 16:20 |
fungi | oh, sure could just be an intermittent problem related to the size of the tarball i suppose | 16:20 |
jcapitao[m] | "size": 15851807, | 16:21 |
jcapitao[m] | which is quite big | 16:21 |
jcapitao[m] | compared to the other puppet modules | 16:22 |
jcapitao[m] | for example puppet-ceilometer is 2405743 B | 16:22 |
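The size gap is easy to confirm locally with a throwaway comparison like the one below; the file names are illustrative, not the exact artifacts the job built.

```python
# Throwaway size comparison for built module tarballs; file names are
# illustrative, not the exact artifacts produced by the release job.
import os

tarballs = ["puppet-nova.tar.gz", "puppet-ceilometer.tar.gz"]

for path in tarballs:
    size = os.path.getsize(path)
    print(f"{path}: {size} bytes ({size / 1024 / 1024:.1f} MiB)")
```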
fungi | as an aside, i see that the links puppetforge supplies use /v3/releases, so we're using an older api version (but it's the same one we uploaded the others with) | 16:23 |
fungi | unfortunately, if the api is rejecting the upload for some reason, we're not seeing the response because of the stream eof | 16:25 |
fungi | though your speculation about a chunked encoding issue seems reasonable | 16:25 |
fungi | i can try uploading yet again, though i don't have reason to believe the result will be any different | 16:26 |
jcapitao[m] | yeah I think it will be the same but we can give it a try .. | 16:28 |
fungi | running again | 16:29 |
jcapitao[m] | ack | 16:30 |
jcapitao[m] | I have to leave, but I can check tomorrow if it's still not uploading | 16:30 |
fungi | fwiw, it did fail yet again | 16:39 |
fungi | and yeah, i see where it may have failed similarly on june 15 but then succeeded when we reran it two days later | 16:40 |
fungi | however that's too long ago to still have the logs, so i can't say for sure it was the same error message we got | 16:41 |
fungi | nope, different error: https://meetings.opendev.org/irclogs/%23openstack-release/%23openstack-release.2023-06-15.log.html#t2023-06-15T20:50:46 | 16:42 |
elodilles | fungi: thanks for trying, i saw that other release-openstack-puppet jobs succeeded, that's why i thought it would succeed as well :S | 17:03 |