Monday, 2026-01-05

03:48 *** JayF_ is now known as JayF
03:49 *** timburke_ is now known as timburke
15:21 <noonedeadpunk> hey folks! a weird thing has happened - despite https://review.opendev.org/c/opendev/system-config/+/971629 being merged and promoted, the meeting bot (and channel logging) is not working for #openstack-freezer.
15:21 <noonedeadpunk> Have I missed something in the patch?
15:21 <fungi> looking into it
15:24 <fungi> noonedeadpunk: oh, adding channels requires a restart of the bot, which would interrupt/end any in-progress meetings in any channel, so it doesn't get done automatically at deploy time and we try to remember (badly, it seems!) to manually restart it at a convenient moment after merging config changes
15:24 <fungi> i'll check the meeting schedule real quick
15:24 <noonedeadpunk> barbican has a meeting now
15:24 <noonedeadpunk> at least
15:25 <fungi> yep, in progress in their channel, confirmed
15:26 <fungi> and nova supposedly right after
15:27 <noonedeadpunk> ++
15:27 <fungi> and secure default policies after that
15:27 <fungi> but after 18:00 the schedule is clear today, so i can restart it a bit after then
15:28 <noonedeadpunk> great, thanks!
15:30 <fungi> i've set myself a reminder so i hopefully don't forget again
15:30 <noonedeadpunk> I can ping you again if anything comes up :)
16:05 <zigo> Hi there! Happy new year!
16:05 <zigo> On this PR: https://review.opendev.org/c/openstack/python-openstackclient/+/962841 I got a POST_FAILURE in osc-functional-devstack multiple times. I wonder if I should do yet another recheck, or if there's something that can be fixed, or what ...
16:30 <frickler> zigo: the POST_FAILURE is misleading, the real issue is that the devstack deployment is failing. seems to affect all patches for that stable branch for like 9 months now ... https://zuul.opendev.org/t/openstack/builds?job_name=osc-functional-devstack&project=openstack%2Fpython-openstackclient&branch=stable%2F2025.1&skip=0&limit=10
16:31 <frickler> might be a regression in OSC or devstack or neutron, would need further debugging
16:33 <frickler> https://zuul.opendev.org/t/openstack/build/2a69d737a41e4158963d72b9d95600c8/log/job-output.txt#17402 shows the error. adding better handling for this type of issue to devstack-based jobs might also be a nice task
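
(For context: the dashboard link above is served by Zuul's REST API, which can also be queried directly when triaging failures like this. A minimal sketch, assuming the public zuul.opendev.org endpoint and using only the Python standard library; build-record field names can vary between Zuul versions, hence the defensive `.get()` calls.)

```python
# Query the Zuul builds API behind frickler's dashboard link for the
# recent results of one job on one branch.
import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "job_name": "osc-functional-devstack",
    "project": "openstack/python-openstackclient",
    "branch": "stable/2025.1",
    "limit": 10,
})
url = f"https://zuul.opendev.org/api/tenant/openstack/builds?{params}"
with urllib.request.urlopen(url, timeout=30) as resp:
    builds = json.load(resp)

for build in builds:
    # Each entry carries the build result and a link to the stored logs.
    print(build.get("result"), build.get("end_time"), build.get("log_url"))
```
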
16:50 <mhu> Hi there, we're hitting some TLS issues with git on opendev.org/zuul/zuul-jobs seemingly at random, here is an error example:... (full message at <https://matrix.org/oftc/media/v1/media/download/AdT6nPgEQNUUtatC3nZg2LUFtMLUZlwpFzwZwSlL54GHl0964nd0x7p8MzkwcrbvyI_egWG8Sb7zDPsn6KrHlwRCeb1yqQTQAG1hdHJpeC5vcmcvc2RlQ2hYb0Znb3lmTFhXT0Zmc2NvTVlJ>)
16:53 <mhu> Unfortunately it's hard to reproduce by hand but it does happen somewhat often in our CI
16:54 <mhu> Could this be related to changes made to counter the AI bots?
17:24 <fungi> mhu: odds are the "tls issue" is that the server never responded after the client connected to the socket, which would be less due to our ai bot mitigations and more that we're overrun by bot crawlers taking up all the available worker slots in apache
18:11 <fungi> "error:0A000458:SSL routines::tlsv1 unrecognized name" seems to be the arcane signal from openssl that your client tried to do sni and the server claimed not to recognize the requested hostname
18:12 <fungi> which is *probably* a sign that the load balancer sent your request to a backend that was overloaded, my guess is it's a pathological error condition and the server failed to look up the hostname at all or something
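
(For context: the handshake behavior fungi describes can be checked with Python's standard ssl module, which puts the requested name into the TLS SNI extension exactly as a git client would. A minimal sketch; the target hostname is only an example, and a healthy backend should complete the handshake rather than raise the "unrecognized name" error.)

```python
# Probe a TLS endpoint with an explicit SNI value and report the outcome.
import socket
import ssl

def probe(host: str, sni: str) -> None:
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, 443), timeout=10) as sock:
            # server_hostname is what ends up in the TLS SNI extension.
            with ctx.wrap_socket(sock, server_hostname=sni) as tls:
                print(f"{sni}: handshake ok, {tls.version()}")
    except ssl.SSLError as exc:
        # An overloaded or misrouted backend can surface here as
        # "tlsv1 unrecognized name" even for a perfectly valid hostname.
        print(f"{sni}: {exc}")

probe("opendev.org", "opendev.org")
```
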
18:15 <fungi> noonedeadpunk: the rbac pop-up team meeting was skipped; that concludes the published meeting agenda for the day, so i'm restarting meetbot shortly to pick up the #openstack-freezer channel addition
18:19 <fungi> #status log Restarted the meetbot container to pick up new configuration changes
18:19 <opendevstatus> fungi: finished logging
18:20 <fungi> looks like opendevmeet newly joined #openstack-freezer, so public logs for it should start appearing soon
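
(For context: a hypothetical sketch of the manual restart step fungi performed, assuming the bot runs as a docker compose service; the compose file path and service name below are illustrative placeholders, not taken from system-config.)

```python
# Hypothetical sketch of restarting the meetbot container at a quiet time.
import subprocess

subprocess.run(
    ["docker", "compose",
     "-f", "/path/to/meetbot/docker-compose.yaml",  # placeholder path
     "restart", "meetbot"],                         # placeholder service name
    check=True,
)
```
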
19:57 <noonedeadpunk> yup, I see the bots, thanks!
20:15 <Clark[m]> mhu, fungi: and just to be clear, none of the mitigations we've employed are expected to affect git clients that properly identify themselves, which C git and Zuul's Python git do, last I checked. We'd love it if more of the crawlers were git-aware; it would decrease load tremendously
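
(For context: one way to see the client identification Clark[m] refers to is git's own curl tracing, which dumps the HTTP exchange, including the User-Agent header, e.g. "git/2.x", that lets servers tell real git clients apart from page scrapers. A minimal sketch using git's standard GIT_TRACE_CURL environment variable.)

```python
# Run a real git client against opendev.org with curl tracing enabled;
# the trace (printed to stderr) shows the User-Agent header git sends.
import os
import subprocess

env = dict(os.environ, GIT_TRACE_CURL="1")
subprocess.run(
    ["git", "ls-remote", "https://opendev.org/zuul/zuul-jobs"],
    env=env,
    check=True,
)
```
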
21:12 <opendevreview> Damian Fajfer proposed zuul/zuul-jobs master: fix(upload-pypi): Ensure twine is installed before using it  https://review.opendev.org/c/zuul/zuul-jobs/+/972244
23:13 <opendevreview> Damian Fajfer proposed zuul/zuul-jobs master: fix: Rewrite upload-pypi test-playbook  https://review.opendev.org/c/zuul/zuul-jobs/+/972252
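
(For context: the first commit message above describes guarding against a missing twine executable before trying to run it. The actual zuul-jobs role is Ansible, so the Python sketch below only illustrates the check-then-install pattern the commit describes, not the contents of the patch.)

```python
# Ensure twine is available before invoking it, rather than assuming the
# job image already provides it.
import glob
import shutil
import subprocess
import sys

if shutil.which("twine") is None:
    # Install twine on demand so the later invocation cannot fail with
    # a missing-executable error.
    subprocess.run(
        [sys.executable, "-m", "pip", "install", "--user", "twine"],
        check=True,
    )

# Sanity-check the built artifacts before any upload step would run.
subprocess.run(
    [sys.executable, "-m", "twine", "check", *glob.glob("dist/*")],
    check=True,
)
```
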
