Friday, 2026-02-13

04:28 *** ykarel__ is now known as ykarel
08:45 <opendevreview> Joan Gilabert proposed openstack/releases master: Release Watcher stable/2025.2 for several bug fixes  https://review.opendev.org/c/openstack/releases/+/976720
13:23 <frickler> release-team: we keep on getting job timeouts on https://review.opendev.org/c/openstack/releases/+/976622 , I'm assuming this is due to general opendev.org slowness, not sure what we can do about it? ask stephenfin to split this into pieces? or override somehow?
13:32 <elodilles> i guess the easiest workaround is to split the patch into multiple release patches (probably no need for a separate patch for every single xstatic-* deliverable, though)
13:33 <elodilles> the question is whether this is a new slowness that would impact our coordinated release on April 1st
13:39 <elodilles> checking a random job that took 17 mins instead of the usual 2-4 mins, i see these steps took extra long: https://paste.opendev.org/show/bkmbqLra5QPX3DbZuC1k/
13:43 <elodilles> so yeah, it seems that sometimes there are ~2 min lags at some points during the job run
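[Editor's note: the kind of check elodilles describes above, spotting multi-minute gaps between consecutive job log lines, can be sketched in a few lines of Python. The timestamp format here is an assumption for illustration, not necessarily what the actual Zuul job logs use.]

```python
from datetime import datetime

def find_lags(lines, threshold_s=120):
    """Return (gap_seconds, line) pairs where consecutive log
    timestamps are more than `threshold_s` seconds apart."""
    lags = []
    prev = None
    for line in lines:
        # Assumes each line starts with a timestamp such as
        # "2026-02-13 13:39:02 | TASK [...]" (hypothetical format).
        ts = datetime.strptime(line[:19], "%Y-%m-%d %H:%M:%S")
        if prev is not None:
            gap = (ts - prev).total_seconds()
            if gap > threshold_s:
                lags.append((gap, line))
        prev = ts
    return lags
```

Running this over a 17-minute job log would point directly at the steps where the ~2 minute stalls occurred.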
14:00 <ttx> o/
14:00 <ttx> #startmeeting releaseteam
14:00 <opendevmeet> Meeting started Fri Feb 13 14:00:26 2026 UTC and is due to finish in 60 minutes.  The chair is ttx. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00 <opendevmeet> The meeting name has been set to 'releaseteam'
14:00 <ttx> Ping list: release-team elod
14:01 <ttx> Our agenda is at https://etherpad.opendev.org/p/gazpacho-relmgt-tracking#L270
14:01 <elodilles> o/
14:01 <ttx> Hi elodilles !
14:01 <ttx> #topic Review R-7 week task completion
14:01 <ttx> - Notify the Infrastructure team to generate an artifact signing key (but not replace the current one yet), and begin the attestation process (ttx)
14:02 <ttx> fungi: consider yourself notified!
14:03 <ttx> - Check with the Technical Committee to make sure Python runtimes have been determined for the next development cycle (elod)
14:03 <elodilles> (elod) i've pinged TC about this via their IRC channel
14:03 <elodilles> in fact,
14:03 <elodilles> the patch is up for reviews:
14:03 <elodilles> https://review.opendev.org/c/openstack/governance/+/976691
14:04 <elodilles> though usually it takes some time and discussions until it gets settled
14:04 <fungi> thanks ttx
14:04 <frickler> \o
14:05 <fungi> planning to work on it today actually
14:05 <ttx> - Propose DNM changes on repositories where no patches merged recently to check that tests are still passing with the current set of dependencies (libraries, client libraries) (elod)
14:05 <elodilles> (elod) due to the recent gate breakage (the setuptools 82.0.0 release) there are probably many broken gates. i've only restored a couple of health check patches to see the state there: https://review.opendev.org/q/topic:release-health-check-gazpacho+is:open
14:05 <elodilles> and actually it doesn't look that bad
14:06 <ttx> good timing :)
14:06 <fungi> it could have been waaaay worse, but we were extremely well-prepared for that actually
14:06 <ttx> OK let's keep track of those
14:06 <elodilles> we might need to re-iterate once all the known fixes have landed
14:06 <ttx> right
14:06 <ttx> - Send weekly email (ttx)
14:07 <ttx> Will do shortly after the meeting
14:07 <ttx> #topic Assign R-6 week tasks
14:08 <elodilles> since you'll be away, i'm taking both meeting chair and other tasks :)
14:08 <opendevreview> Stephen Finucane proposed openstack/reno master: scanner: Decode sha before printing  https://review.opendev.org/c/openstack/reno/+/976737
14:08 <ttx> Thanks to my great skills at vacation timing I won't be able to take tasks in the next two weeks
14:08 <opendevreview> Stephen Finucane proposed openstack/reno master: Handle missing versions  https://review.opendev.org/c/openstack/reno/+/976738
14:08 <ttx> sorry about that :)
14:08 <elodilles> no problem :]
14:08 <elodilles> vacations are essential ;)
14:09 <ttx> #topic Review weekly countdown email
14:09 <ttx> #link https://etherpad.opendev.org/p/relmgmt-weekly-emails
14:11 <elodilles> LGTM
14:11 <ttx> alright, will send shortly
14:11 <frickler> I edited the last line to make it consistent
14:11 <ttx> #topic Open Discussion
14:12 <ttx> Nothing from me except the aforementioned impending vacation
14:12 <elodilles> i think we need to mention here a couple of things
14:12 <fungi> hope you get rested and recharged
14:12 <ttx> thanks frickler
14:12 <ttx> I might never come back
14:13 <elodilles> so there are the problems around the new setuptools release
14:13 <elodilles> for now (for us) the master branch issues are the most problematic ones
14:13 <elodilles> though a lot of them got fixed in the meantime
14:14 <frickler> reqs bumps are still blocked though
14:14 <elodilles> one that still lingers is regarding horizon, which pulls in xstatic-* deliverables, and those are broken as they're missing the pkg_resources module :/
14:14 <frickler> also cinder with os_win
14:15 <elodilles> and that ^^^
14:15 <elodilles> so the xstatic-* deliverables have been fixed (thanks to stephenfin !), but they need to be released
14:16 <elodilles> and the release patches get timeouts on the list-changes job :/
14:16 <fungi> some of them only just got imported into gerrit yesterday
14:16 <frickler> iiuc the first set of 28 or so repos is only the "old" ones
14:17 <elodilles> as i wrote before the meeting:
14:17 <frickler> we could maybe also just bump the timeout for this job from 1h to like 3h and see if that is enough?
14:17 <elodilles> the question is whether this is a new slowness that would impact our coordinated release on April 1st
14:17 <ttx> why is it taking so long? infinite number of changes?
14:17 <elodilles> checking a random job that took 17 mins instead of the usual 2-4 mins, i see these steps took extra long: https://paste.opendev.org/show/bkmbqLra5QPX3DbZuC1k/
14:18 <elodilles> so yeah, it seems that sometimes there are ~2 min lags at some points during the job run
14:18 <elodilles> so for now, probably increasing the job timeout would do the trick
14:19 <frickler> it's pretty likely IMO that the reason for this is opendev.org getting AI-DoSed
14:19 <elodilles> but i think if this issue is still present around release day, then we might not be able to keep our schedule there
14:20 <fungi> actually, it looks like https isn't making it to vexxhost's sjc1 region from some parts of the internet
14:20 <fungi> so that's probably ipv6 timing out and then getting automatically retried over ipv4
14:21 <frickler> oh, so that would depend on which cloud the job is running on
14:21 <fungi> tcp in general, in fact. ssh times out too
14:21 <stephenfin> I was going to say, this sounds familiar
14:21 <fungi> well, no, vexxhost sjc1 is where our gitea servers are
14:22 * stephenfin also can't access opendev.org again today
14:22 <fungi> icmp6 is also not making it through
14:24 <ttx> ok so we can hope this issue will be transient and fixed by release time
14:24 <fungi> seems to be from everywhere on the internet i'm testing this time
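[Editor's note: the fallback behaviour fungi describes, a client trying IPv6 first and then retrying over IPv4 when the v6 route is black-holed, looks roughly like the generic sketch below. This is illustrative client code, not OpenDev's actual tooling.]

```python
import socket

def connect_with_fallback(host, port, timeout=5.0):
    """Try each resolved address in order, preferring IPv6, and fall
    back to the next address on timeout or refusal."""
    last_err = None
    infos = socket.getaddrinfo(host, port, type=socket.SOCK_STREAM)
    # Prefer IPv6 results first to mimic the behaviour described above.
    infos.sort(key=lambda i: 0 if i[0] == socket.AF_INET6 else 1)
    for family, type_, proto, _name, addr in infos:
        try:
            with socket.socket(family, type_, proto) as s:
                s.settimeout(timeout)
                s.connect(addr)
                return addr  # connected: report which address worked
        except OSError as err:
            last_err = err  # timed out / unreachable: try next address
    raise last_err or OSError("no addresses resolved")
```

With a broken v6 route, each IPv6 attempt burns the full timeout before the v4 retry succeeds, which is exactly the kind of per-connection stall that adds up to minutes across a job run.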
14:24 <ttx> elodilles: anything else you wanted to flag?
14:25 <elodilles> one more thing maybe
14:26 <elodilles> i'd appreciate some reviews on these: https://review.opendev.org/q/topic:2026.1-runtimes o:)
14:26 <elodilles> and that's all from me (at least that was all that came to mind)
14:27 <elodilles> oh, and maybe another note
14:27 <elodilles> the requirements gate is also broken (still, if i'm not mistaken)
14:27 <elodilles> and some of the cross-<project>-jobs as well
14:27 <ttx> ...ok
14:28 <elodilles> so we can prepare for longer delays with upper-constraints bumps if those don't get sorted out soon
14:29 <ttx> Ok, noted!
14:29 <elodilles> and that's really all that i had in mind
14:29 <ttx> Anything else, anyone?
14:31 <ttx> alright then
14:31 <ttx> #endmeeting
14:31 <opendevmeet> Meeting ended Fri Feb 13 14:31:05 2026 UTC.  Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
14:31 <opendevmeet> Minutes:        https://meetings.opendev.org/meetings/releaseteam/2026/releaseteam.2026-02-13-14.00.html
14:31 <opendevmeet> Minutes (text): https://meetings.opendev.org/meetings/releaseteam/2026/releaseteam.2026-02-13-14.00.txt
14:31 <opendevmeet> Log:            https://meetings.opendev.org/meetings/releaseteam/2026/releaseteam.2026-02-13-14.00.log.html
14:31 <ttx> Thanks everyone!
14:31 <elodilles> thanks too o/
14:43 <opendevreview> Elod Illes proposed openstack/releases master: [CI] Temporary increased timeout for list-changes  https://review.opendev.org/c/openstack/releases/+/976741
14:44 <elodilles> frickler: ^^^ i don't like this, but if we don't have a better solution, then so be it... :/
14:45 <elodilles> (as i saw the job timed out after the 11th deliverable, so we definitely need the 3 hrs timeout, as 2 hrs won't be enough for the 27 deliverables... :S)
14:51 <frickler> elodilles: well, the other option would be to split the release patch into three parts with at most 10 deliverables each. or maybe 4/8 to be a bit more on the safer side?
14:53 <frickler> but let's try the timeout bump first, I'd say
14:53 <elodilles> +1
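[Editor's note: the splitting frickler suggests is just chunking the deliverable list into fixed-size batches, one release patch per batch. A trivial sketch (hypothetical helper, not part of the release tooling):]

```python
def batch(deliverables, size):
    """Split a list of deliverables into release-patch-sized chunks."""
    return [deliverables[i:i + size]
            for i in range(0, len(deliverables), size)]

# 27 deliverables in batches of 10 yields patches of 10, 10 and 7,
# each of which should fit well within the 1h list-changes timeout.
```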
15:42 <opendevreview> Merged openstack/releases master: [CI] Temporary increased timeout for list-changes  https://review.opendev.org/c/openstack/releases/+/976741
15:57 <fungi> the vexxhost ipv6 routing should be back to normal now
16:04 <opendevreview> Takashi Kajinami proposed openstack/releases master: Release tooz 8.0.0  https://review.opendev.org/c/openstack/releases/+/976761
16:13 <opendevreview> Stephen Finucane proposed openstack/releases master: Revert "[CI] Temporary increased timeout for list-changes"  https://review.opendev.org/c/openstack/releases/+/976763
16:14 <stephenfin> fungi: gtk. I've proposed a revert of elodilles' change on the assumption that we probably want to know when issues like this happen again
16:31 <gmaan> elodilles: frickler: ttx can you please check this: https://review.opendev.org/c/openstack/releases/+/975702
16:36 <frickler> release-team: https://review.opendev.org/c/openstack/releases/+/976622 is also green now
17:20 <opendevreview> Merged openstack/releases master: Release oslo.service 4.5.0  https://review.opendev.org/c/openstack/releases/+/975702
17:29 <opendevreview> Merged openstack/releases master: Bulk release xstatic packages  https://review.opendev.org/c/openstack/releases/+/976622
17:51 <elodilles> stephenfin gmaan frickler : done o:)
17:56 <opendevreview> Carlos Eduardo proposed openstack/releases master: [manila] Release 2025.2 stable branch  https://review.opendev.org/c/openstack/releases/+/976787
18:00 <opendevreview> Stephen Finucane proposed openstack/releases master: Bulk release newly onboarded xstatic packages  https://review.opendev.org/c/openstack/releases/+/976623
18:01 <opendevreview> Carlos Eduardo proposed openstack/releases master: [manila] New 2025.1 Epoxy release  https://review.opendev.org/c/openstack/releases/+/976799
18:07 <opendevreview> Carlos Eduardo proposed openstack/releases master: [manila] New 2024.2 Dalmatian release  https://review.opendev.org/c/openstack/releases/+/976801
18:09 <opendevreview> Carlos Eduardo proposed openstack/releases master: [manila] New 2025.1 Epoxy release  https://review.opendev.org/c/openstack/releases/+/976799
18:47 <opendevreview> Stephen Finucane proposed openstack/releases master: Normalize xstatic package configurations  https://review.opendev.org/c/openstack/releases/+/976625
18:47 <opendevreview> Stephen Finucane proposed openstack/releases master: Bulk release xstatic packages (pt 2.)  https://review.opendev.org/c/openstack/releases/+/976669
19:23 <opendevreview> Merged openstack/releases master: Release tooz 8.0.0  https://review.opendev.org/c/openstack/releases/+/976761
19:48 <gmaan> elodilles: frickler thanks
20:09 <frickler> great, now we have 28 xstatic reqs bumps that all need to be squashed together and even then won't pass CI :(
20:53 <clarkb> is that a force-merge scenario? Just thinking they largely impact horizon, which is already broken. So any failures caused by force merging probably won't make things worse?
23:42 *** haleyb is now known as haleyb|out

Generated by irclog2html.py 4.0.0 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!