| *** JayF_ is now known as JayF | 03:48 | |
| *** timburke_ is now known as timburke | 03:49 | |
| noonedeadpunk | hey folks! a weird thing has happened - despite https://review.opendev.org/c/opendev/system-config/+/971629 being merged and promoted, the meeting bot (and channel logging) is not working for #openstack-freezer. | 15:21 |
| noonedeadpunk | Have I missed something in the patch? | 15:21 |
| fungi | looking into it | 15:21 |
| fungi | noonedeadpunk: oh, adding channels requires a restart of the bot, which would interrupt/end any in-progress meetings in any channel, so it doesn't get done automatically at deploy time and we try to remember (badly, it seems!) to manually restart it at a convenient time after merging config changes | 15:24 |
| fungi | i'll check the meeting schedule real quick | 15:24 |
| noonedeadpunk | barbican has a meeting now | 15:24 |
| noonedeadpunk | at least | 15:24 |
| fungi | yep, in progress in their channel, confirmed | 15:25 |
| fungi | and nova supposedly right after | 15:26 |
| noonedeadpunk | ++ | 15:27 |
| fungi | and secure default policies after that | 15:27 |
| fungi | but after 18:00 the schedule is clear today, so i can restart it a bit after then | 15:27 |
| noonedeadpunk | great, thanks! | 15:28 |
| fungi | i've set myself a reminder so i hopefully don't forget again | 15:30 |
| noonedeadpunk | I can ping you again if anything :) | 15:30 |
| zigo | Hi there! Happy new year! | 16:05 |
| zigo | On this PR: https://review.opendev.org/c/openstack/python-openstackclient/+/962841 I've gotten a POST_FAILURE in osc-functional-devstack multiple times; I wonder if I should do yet another recheck, or if there's something that can be fixed, or what ... | 16:05 |
| frickler | zigo: the POST_FAILURE is misleading; the real issue is that the devstack deployment is failing. it seems to affect all patches for that stable branch for like 9 months now ... https://zuul.opendev.org/t/openstack/builds?job_name=osc-functional-devstack&project=openstack%2Fpython-openstackclient&branch=stable%2F2025.1&skip=0&limit=10 | 16:30 |
| frickler | might be a regression in OSC or devstack or neutron, would need further debugging | 16:31 |
| frickler | https://zuul.opendev.org/t/openstack/build/2a69d737a41e4158963d72b9d95600c8/log/job-output.txt#17402 shows the error. adding better handling for this type of issue to devstack based jobs might also be a nice task | 16:33 |
| mhu | Hi there, we're hitting some TLS issues with git on opendev.org/zuul/zuul-jobs seemingly at random, here is an error example:... (full message at <https://matrix.org/oftc/media/v1/media/download/AdT6nPgEQNUUtatC3nZg2LUFtMLUZlwpFzwZwSlL54GHl0964nd0x7p8MzkwcrbvyI_egWG8Sb7zDPsn6KrHlwRCeb1yqQTQAG1hdHJpeC5vcmcvc2RlQ2hYb0Znb3lmTFhXT0Zmc2NvTVlJ>) | 16:50 |
| mhu | Unfortunately it's hard to reproduce by hand but it does happen somewhat often in our CI | 16:53 |
| mhu | Could this be related to changes made to counter the AI bots? | 16:54 |
| fungi | mhu: odds are the "tls issue" is that the server never responded after the client connected to the socket, which would be less due to our ai bot mitigations and more that we're overrun by bot crawlers taking up all the available worker slots in apache | 17:24 |
| fungi | "error:0A000458:SSL routines::tlsv1 unrecognized name" seems to be the arcane signal from openssl that your client tried to do sni and the server claimed not to recognize the requested hostname | 18:11 |
| fungi | which is *probably* a sign that the load balancer sent your request to a backend that was overloaded, my guess is it's a pathological error condition and the server failed to look up the hostname at all or something | 18:12 |
| fungi | noonedeadpunk: the rbac pop-up team meeting was skipped, that concludes the published meeting agenda for the day so i'm restarting meetbot shortly to pick up the #openstack-freezer channel addition | 18:15 |
| fungi | #status log Restarted the meetbot container to pick up new configuration changes | 18:19 |
| opendevstatus | fungi: finished logging | 18:19 |
| fungi | looks like opendevmeet newly joined #openstack-freezer, so public logs for it should start appearing soon | 18:20 |
| noonedeadpunk | yup, see bots, thanks! | 19:57 |
| Clark[m] | mhu fungi and just to be clear, none of the mitigations we've employed are expected to affect git clients that properly identify themselves, which C git and Zuul's Python git do, last I checked. We'd love it if more of the crawlers were git-aware; it would decrease load tremendously | 20:15 |
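Clark's point about clients identifying themselves can be illustrated with git itself: over HTTP, git sends a User-Agent like `git/2.x.y`, and a custom client or well-behaved crawler can set its own identifiable agent string via git's `http.userAgent` setting. A small sketch, assuming a POSIX shell and git installed; the agent string and URL are hypothetical:

```shell
# Use a throwaway HOME so we don't touch real global git config.
export HOME="$(mktemp -d)"

# Hypothetical example: identify a custom client/crawler to servers.
git config --global http.userAgent "example-crawler/1.0 (+https://example.org/bot-info)"

# Prints the configured agent string that git will send over HTTP.
git config --global http.userAgent
```

Operators can then recognize (and rate-limit or exempt) such clients by their agent string instead of treating them as anonymous crawler traffic.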
| opendevreview | Damian Fajfer proposed zuul/zuul-jobs master: fix(upload-pypi): Ensure twine is installed before using it https://review.opendev.org/c/zuul/zuul-jobs/+/972244 | 21:12 |
| opendevreview | Damian Fajfer proposed zuul/zuul-jobs master: fix: Rewrite upload-pypi test-playbook https://review.opendev.org/c/zuul/zuul-jobs/+/972252 | 23:13 |
Generated by irclog2html.py 4.0.0 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!