17:00:09 <gouthamr> #startmeeting tc
17:00:09 <opendevmeet> Meeting started Tue Aug 12 17:00:09 2025 UTC and is due to finish in 60 minutes.  The chair is gouthamr. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:00:09 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:00:09 <opendevmeet> The meeting name has been set to 'tc'
17:00:24 <gouthamr> Welcome to the weekly meeting of the OpenStack Technical Committee. A reminder that this meeting is held under the OpenInfra Code of Conduct available at https://openinfra.dev/legal/code-of-conduct.
17:00:27 <gouthamr> Today's meeting agenda can be found at https://wiki.openstack.org/wiki/Meetings/TechnicalCommittee
17:00:32 <gouthamr> #topic Roll Call
17:00:34 <spotz[m]> o/
17:00:49 <gtema> o/
17:01:27 <noonedeadpunk> o/
17:02:02 <gouthamr> noted absence: b a u z a s, c a r d o e
17:02:08 <frickler> \o
17:02:54 <gouthamr> courtesy-ping: gmaan, mnasiadka
17:03:01 <gmaan> o/
17:03:07 <mnasiadka> o/
17:03:43 <gouthamr> hello everyone o/ thank you for joining.. lets get started
17:03:49 <gouthamr> #topic Last Week's AIs
17:04:13 <gouthamr> we had lower than usual attendance last week, some ongoing activity that we need to check on
17:04:43 <gouthamr> i took a few AIs that are at various degrees of progress
17:05:13 <gouthamr> 1) connect with stephenfin, and figure out closure for the proposed goal: https://governance.openstack.org/tc/goals/proposed/migrate-from-wsgi-scripts-to-module-paths.html
17:05:39 <gouthamr> i don't have an update here.. will get to it today
17:06:12 <gouthamr> 2) Runtime update for 2026.1
17:06:12 <gouthamr> i am drafting this now, will post after this meeting
17:06:14 <gmaan> seeing the gerrit topic, there are a few changes yet to merge
17:06:19 <gmaan> #link https://review.opendev.org/q/topic:%22remove-wsgi_scripts%22+status:open
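The Gerrit search linked above uses Gerrit's standard query syntax; a minimal sketch of building such a URL programmatically, assuming only the public `/q/<query>` URL scheme of review.opendev.org (the helper name is ours, not a Gerrit API):

```python
from urllib.parse import quote


def gerrit_query_url(base: str, query: str) -> str:
    """Return a Gerrit web-UI search URL for a raw query string."""
    # Gerrit's UI accepts the percent-encoded search query after /q/
    return f"{base}/q/{quote(query, safe='')}"


url = gerrit_query_url(
    "https://review.opendev.org",
    'topic:"remove-wsgi_scripts" status:open',
)
print(url)
```

The same query string works in the Gerrit REST `changes/?q=` endpoint as well.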
17:06:26 <gouthamr> ty gmaan
17:07:38 <gouthamr> 3) Retirement of monasca
17:07:53 <gouthamr> i started working on these changes this week:
17:07:54 <gouthamr> #link https://etherpad.opendev.org/p/monasca-retirement
17:08:30 <noonedeadpunk> there were comments on the first patch
17:08:31 <gouthamr> thank you for the review on the project-config change.. i'll update it
17:08:32 <gmaan> thanks for working on those. I left comment to fix the project-config change and repo content removal
17:08:38 <noonedeadpunk> so some work needs to be done there
17:09:07 <gouthamr> ack, pebkac on the noop jobs :D
17:09:15 <noonedeadpunk> frankly I didn't check it deeper
17:09:31 <gouthamr> i did leave a note in the commit message directed at you, probably noonedeadpunk
17:09:50 <gouthamr> i see two related repos under OS Ansible: openstack/openstack-ansible-os_monasca, openstack/openstack-ansible-os_monasca-agent
17:10:01 <noonedeadpunk> ah
17:10:02 <gouthamr> i think we've deferred retiring these in the past
17:10:32 <noonedeadpunk> indeed, this should be pretty much just a revert of previous un-retirement
17:10:37 <gouthamr> i don't know if we should do it with the same change, or in an immediate follow up so we don't forget, or we need to wait
17:10:49 <noonedeadpunk> as despite folks asking to revive the roles, nobody actually picked up their maintenance
17:10:54 <noonedeadpunk> and I clean forgot about them
17:11:09 <gmaan> we did it in a separate proposal to governance as well as other places
17:11:09 <noonedeadpunk> I will check what needs to be done there, thanks!
17:11:25 <gouthamr> ty, separate changes do make sense
17:11:30 <noonedeadpunk> yeah, I'd say in a separate one
17:11:38 <gouthamr> we can track them to closure with the etherpad no problem
17:11:43 <noonedeadpunk> ++
17:11:51 <gouthamr> it's not like we'll mess up anything, there are only 17 repositories associated with the project team
17:12:10 * gouthamr expected a sensible chuckle
17:12:17 <noonedeadpunk> heh., yeah
17:12:31 <noonedeadpunk> what are the odds
17:12:31 <spotz[m]> hehe, and fungi can always recover something
17:13:08 <gouthamr> haha, alright.. next AI
17:13:15 <gouthamr> was around refstack-server
17:13:15 <fungi> yeah, worst case i can rollback edits or restore from a database backup
17:13:35 <gouthamr> #link https://lists.openstack.org/archives/list/openstack-discuss@lists.openstack.org/thread/WNI4PE2TZ3G52C3U5FT2YNVRUAJB3CMO/
17:13:59 <fungi> i deleted the server itself just this morning, after saving a filesystem snapshot image
17:14:03 <gouthamr> gmaan responded to this thread indicating that folks that needed this functionality can maintain a list of tests and use tempest in a similar fashion
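gmaan's suggestion above, keeping your own list of tests and running them with Tempest, could look roughly like this; the test names are illustrative placeholders (not a real interop list), and the `--include-list` flag is assumed from recent Tempest releases:

```python
# Sketch: maintain a plain-text list of Tempest test IDs and run only
# that subset, instead of relying on a refstack-curated guideline.
from pathlib import Path

# Hypothetical test IDs -- substitute the subset your program agrees on.
tests = [
    "tempest.api.compute.servers.test_create_server",
    "tempest.api.identity.v3.test_tokens",
]
Path("interop-tests.txt").write_text("\n".join(tests) + "\n")

# With Tempest installed and a cloud configured, the run would then be:
#   tempest run --include-list interop-tests.txt
print(f"wrote {len(tests)} test IDs")
```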
17:14:14 <gouthamr> ah ack fungi
17:14:48 <noonedeadpunk> it's kinda sad we never figured out a reasonable replacement for refstack
17:15:04 <noonedeadpunk> as idea of having providers "certified" is overall not bad
17:15:32 <noonedeadpunk> promotes both the project and highlights provider following "best practices" (in a way)
17:15:41 <noonedeadpunk> (at least proper interoperability)
17:16:01 <gouthamr> totally, but, the interop program over the years grew too heavy to maintain.. i do support having a "certification" process that's lighter weight.. it'll take a lot of time/effort to think through one
17:16:20 <clarkb> it's also just less necessary as the total number of configuration options has reduced over time
17:16:33 <gouthamr> it'll need inputs from qa-core, but hopefully not eat into their limited bandwidth
17:16:37 <gtema> Don't want to repeat myself, but in the implemented form those tests prove 0 interoperability
17:16:40 <noonedeadpunk> I can recall gtema was having some ideas years back on some summit as well... but it never flew
17:16:52 <gmaan> well, it is not related to qa at all
17:16:58 <clarkb> (there are still differences but nothing like back when refstack was envisioned)
17:17:08 <gmaan> certification is a completely different thing
17:17:08 <noonedeadpunk> that's why I said about reasonable replacement, and not current form :)
17:17:12 <gtema> Right, not at all, gmann
17:18:18 <noonedeadpunk> but anyway having a project logo on providers website is a beneficial thing for both parties
17:18:22 <fungi> yeah, in the past interop focus was more on the problem of providers "differentiating" their service offering by making incompatible downstream changes to services/apis and replacing openstack services with other things
17:19:10 <fungi> they still have an option to use the trademark logos, the process is merely an administrative/contractual one now which doesn't involve mandatory testing
17:19:17 <noonedeadpunk> while it can be still a thing... now it's more about variety of configurations which may make things non-interoperable
17:19:20 <gtema> We can chat years on this topic without any conclusions. I suggest not to continue it here. Maybe again during summit
17:19:27 <noonedeadpunk> ++
17:19:33 <gmaan> ++
17:19:40 <gouthamr> if there's a proposal, we could have a discussion at the PTG
17:20:13 <gmaan> a proposal, and I'd say more of 'requirements and scope'
17:20:46 <gmaan> interop group was dissolved because there was no clear interest or requirement
17:20:51 <gtema> We may involve SCS into such discussion on the certification and interoperability side
17:21:03 <gmaan> if we want to discuss any solution, I'd say we collect the interest in and requirements for it
17:21:13 <gouthamr> ack, gtema is this something you're motivated to drive?
17:21:33 <noonedeadpunk> if they would be open to an open discussion rather than just trying to mandate their own thing...
17:21:54 <gtema> Not really, after 8 years I figured out no csp is really interested and committed
17:22:07 <gouthamr> ah
17:23:08 <gmaan> yeah, commitment too: 'requirement, interest, and commitment'
17:24:39 <gouthamr> okay, we'll put this out there to the community, and suggest that there's some desire to explore a replacement to certify OpenStack in a common, transparent way.. we lack, however, a person/team to make it happen and it's outside the purview of the maintainers (QA or otherwise)
17:25:49 <gouthamr> i suspect we had references to interop/refstack on https://openinfra.org/legal/trademark-policy - and these aren't there anymore.. so such an effort may be tangential to OIF's "Commercial Use Trademark License" process
17:26:14 <gouthamr> or you may find that the foundation folks are interested to collaborate
17:26:26 <gouthamr> anything else to add?
17:27:03 <fungi> those are the trademarks that previously required interop testing
17:27:20 <fungi> the board approved a change in policy at the end of 2023
17:27:39 <spotz[m]> Yeah that was the retirement of interop
17:27:53 <gouthamr> ack..
17:28:12 * gouthamr wants to get "OpenStack Expertise" added to his LinkedIn profile
17:28:27 <gmaan> yeah, I think nothing is needed from the TC side here unless anyone needs some tooling, more tests, etc.
17:28:38 <gouthamr> yes, or opinions
17:29:29 <gouthamr> i think we have a loose vision for what would benefit the maintenance of OpenStack itself, or for operators that interface with us..
17:30:02 <gouthamr> alright, that's all the AIs i am seeing from the past week, was anyone else working on anything to note?
17:31:28 <noonedeadpunk> I keep working on dashboard for all project
17:31:35 <noonedeadpunk> *for tc projects
17:31:46 <noonedeadpunk> I'm not that happy about the result so far
17:31:47 <gouthamr> ah, yes! :)
17:32:08 <noonedeadpunk> but the intermittent result is like this http://bit.ly/4lq4DQG
17:32:25 <noonedeadpunk> *intermediate
17:32:49 <gouthamr> nice
17:33:27 <noonedeadpunk> but input on what tabs we want to see on the dashboard is super welcome
17:33:43 <gmaan> noonedeadpunk: maybe exclude the 'election' repo as that is maintained by a separate group, because during nominations it can fill up the dashboard with changes TC members do not need to vote on
17:33:45 <noonedeadpunk> as we don't really have specs/dashboard work
17:33:52 <noonedeadpunk> ++
17:33:59 <noonedeadpunk> yeah, was thinking about that
17:34:03 <gmaan> cool
17:34:12 <noonedeadpunk> another thing, is that list of projects is defined separately there
17:34:35 <noonedeadpunk> as it seems gerrit does not have awareness today about "ownership" of repos to filter on
17:35:13 <noonedeadpunk> for osa I did use parentprojects, but I don't think it's applicable to parent all these repos to governance
17:35:19 <noonedeadpunk> as it's just not true
17:36:39 <gouthamr> yeah, i don't think there's an elegant way to tie the repos together
17:37:15 <fungi> we could add an empty project like the one we use for the openstack/meta-config acl and inherit it, if that's really desired
17:37:33 <noonedeadpunk> we have openstack/openstack ? :D
17:37:51 <noonedeadpunk> but probably it's not a good idea, as these repos are kinda not much related
17:38:00 <fungi> that's something else, used to aggregate subrepos as a superrepo
17:38:21 <fungi> i suppose doing something like that is also an option in theory
17:38:27 <noonedeadpunk> except for governance-website/governance-sigs/governance
17:38:35 <gouthamr> sorry to side track, but lets get through the agenda and discuss this in Open Discussion perhaps?
17:38:40 <noonedeadpunk> ++
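One way to sketch such a dashboard without Gerrit-side repo "ownership" is a custom dashboard URL built from a hand-maintained repo list; this assumes Gerrit's standard `/dashboard/?title=...&<Section>=<query>` URL scheme, and the repo list and section name here are illustrative:

```python
# Sketch: compose a Gerrit custom-dashboard URL from a list of repos,
# since there is no project attribute that groups them for us.
from urllib.parse import urlencode

# Illustrative repo list -- the real one would be maintained by hand.
repos = [
    "openstack/governance",
    "openstack/governance-sigs",
    "openstack/governance-website",
]
project_filter = "(" + " OR ".join(f"project:{r}" for r in repos) + ")"

params = {
    "title": "TC repos",
    # Each extra key/value pair becomes a section on the dashboard.
    "Open changes": f"status:open {project_filter}",
}
url = "https://review.opendev.org/dashboard/?" + urlencode(params)
print(url)
```

Adding more sections (e.g. "Needs final +2") is just more key/value pairs in `params`.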
17:38:44 <gouthamr> #topic A check on gate health
17:38:45 <fungi> yeah, or after the meeting
17:38:57 <gouthamr> any gate concerns/updates to note this week?
17:39:12 <gouthamr> #link https://www.debian.org/releases/trixie/
17:39:27 <gouthamr> ^ think fungi and frickler were chatting about this here a few days ago
17:39:47 <fungi> yeah i linked the announcement in here on saturday
17:40:23 <fungi> we brought the openmetal provider back online in zuul a few minutes ago since their data center relocation maintenance ended yesterday, but i guess keep an eye out for issues there
17:41:27 <gouthamr> ack, noted
17:42:27 <gouthamr> will the infra team be working on trixie mirrors?
17:43:13 <gouthamr> or is that something that each zuul/cloud provider needs to setup and configure?
17:43:25 <fungi> i think clarkb had mentioned freeing up space first by dropping xenial and some other similarly old content
17:44:11 <clarkb> yes I think we should clear out the existing content that we don't need then add in the new stuff
17:44:28 <fungi> also we still need to figure out what to do longer term about the upstream disappearance of bullseye-backports which we've been mirroring
17:45:33 <gouthamr> ack
17:45:47 <fungi> at the moment the plan is, i think, to make local changes to our base job to stop enabling backports by default on debian nodes (which is the default in zuul/zuul-jobs at the moment)
17:46:08 <clarkb> then we can delete the content for backports that have been deleted upstream
17:46:10 <fungi> otherwise if we delete the mirror of that, a lot of bullseye-based jobs are going to start hard failing
17:47:42 <gouthamr> we might not notice unless we're monitoring Caracal jobs
17:48:24 <fungi> not even sure you'd be using bullseye for that, bookworm was current for the caracal cycle wasn't it?
17:48:25 <gouthamr> thanks for the call out
17:49:06 <gouthamr> you're right
17:49:09 <mnasiadka> Kolla also only builds Bookworm images (we used Bullseye in 2023.2 and earlier)
17:49:12 <noonedeadpunk> we had bullseye jobs on caracal just in case
17:49:22 <gouthamr> for upgrade testing..
17:49:23 <gmaan> we do test bullseye also in caracel for upgrade things
17:49:26 <gmaan> yeah
17:49:39 <gmaan> bullseye  and  bookworm both in caracal
17:49:44 <noonedeadpunk> as SLURP should work from 2023.1->2024.1
17:49:48 <noonedeadpunk> ++
17:50:52 <gouthamr> alright, we do have one more topic to get through - i see content for open discussion
17:51:06 <gouthamr> lets switch to that, since we can update the tracker offline
17:51:11 <gouthamr> #topic Open Discussion
17:51:20 <gouthamr> Discuss runtimes for 2026.1 development cycle
17:51:41 <gouthamr> ^ frickler, i was working on this and the answer to both your questions is yes
17:51:45 <gouthamr> Add Debian Trixie and python3.13?
17:51:45 <gouthamr> Drop python3.10 and Ubuntu 22.04?
17:51:56 <gmaan> we already dropped  Ubuntu 22.04 in this cycle right?
17:52:10 <gmaan> we only test  Ubuntu 24.04 in current cycle
17:52:10 <noonedeadpunk> I thought it's the case as well
17:52:24 <frickler> why do we have py3.10 in the mix, then?
17:52:31 <gouthamr> yes, no need to keep it for upgrades either ... we expect folks to upgrade to Ubuntu 24.04 before upgrading to Gazpacho
17:52:34 <noonedeadpunk> I'm not sure if we ever added CentOS 10 Stream as "complementary" though?
17:52:50 <noonedeadpunk> ++
17:52:56 <gmaan> for python3.10, I would not be very aggressive to drop it. Seeing that it is EOLing in Oct 2026, we can still support it as the min version
17:53:11 <gouthamr> i am adding CS10/Rocky10 although i'm not sure the state of the providers and their testing in Zuul
17:53:31 <noonedeadpunk> C10S working kinda. not sure about capacity
17:53:44 <noonedeadpunk> Rocky10 - there're images and it's possible to test
17:53:48 <gmaan> yeah, this is devstack change and job running fine
17:53:52 <clarkb> we have both now. They are limited to about 50% of our quota (that number varies depending on which clouds are available, see earlier note about turning off openmetal for a couple weeks)
17:53:52 <gmaan> #link https://review.opendev.org/c/openstack/devstack/+/937251
17:53:54 <spotz[m]> If you need anything for CS10 let me know, I can possibly pull in folks to help
17:53:57 <noonedeadpunk> some things are missing there in general though
17:54:13 <noonedeadpunk> as there's no compatibility for many things between C10S and Rocky 10
17:54:21 <gouthamr> oh
17:54:31 <mnasiadka> noonedeadpunk: let me guess, you also have COPR repos with rebuilds of packages that are not yet in EPEL? ;-)
17:55:04 <spotz[m]> I also have EPEL contacts:)
17:55:06 <noonedeadpunk> well, we have only 1 package coming from copr as the maintainers forgot about it in EPEL
17:55:12 <fungi> the main concern i'd have with not dropping python 3.10 is if we're stuck maintaining platforms that provide it until stable/gazpacho reaches end of maintenance, or even into unmaintained state
17:55:17 <noonedeadpunk> but then no systemd-networkd or ceph
17:55:44 <noonedeadpunk> (and then yeah - copr)
17:55:55 <mnasiadka> In Kolla I think we have two - mod_auth_mellon for SAML and glusterfs-fuse for Manila
17:56:03 <gmaan> gouthamr: we do have c10 and rocky 10 since this cycle, anything to change  there?
17:56:06 <gmaan> #link https://governance.openstack.org/tc/reference/runtimes/2025.2.html#advance-unstable-testing
17:56:16 <gouthamr> nope..
17:56:17 <noonedeadpunk> oh, yes, whole gluster is copr indeed
17:56:22 <gmaan> though we did not find help to set up rocky 10
17:56:23 <noonedeadpunk> but it's same for C10S
17:56:27 <gouthamr> we have called it out as "Advance/Unstable"
17:57:01 <gouthamr> was thinking to leave it at that, but these links and discussions are making me think about promoting them back to the tested linux distributions section
17:57:08 <mnasiadka> spotz[m]: https://bugzilla.redhat.com/show_bug.cgi?id=2326534 - would be happy if anybody can have a look, there's even a volunteer - but no traction at all
17:57:22 <gmaan> I think the key change for next runtime can be if we want to bump python max version to python 3.13 or keep it py3.12
17:57:53 <gmaan> which depends on the eventlet things
17:59:04 <noonedeadpunk> I don't think we can drop 3.12
17:59:09 <noonedeadpunk> until we drop 24.04
17:59:20 <mnasiadka> Debian OpenStack is going to release Flamingo DEBs only for Trixie, and given Bookworm EOL (June 2026?) - we sort of should.
17:59:25 <gouthamr> gmaan is talking about the max version
17:59:27 <mnasiadka> I think gmaan was mentioning the max version
17:59:30 <gmaan> yeah, min version we can keep py3.10 itself
17:59:39 <gouthamr> we'd bump it imo, because there's been bugfixes in eventlet concerning 3.13
17:59:54 <gouthamr> and there's none that's a blocker atm, please correct me if you know any
18:00:07 <gmaan> yeah, current max version is python 3.12, changing max version does not mean we need to drop py3.12
18:00:13 <noonedeadpunk> yeah. we need 3.13 for trixie
18:00:13 <gouthamr> functional testing will probably mostly use Ubuntu 24.04, and hence python3.12
18:00:37 <gouthamr> but, trixie will be tested by project teams increasingly given the eventlet work
18:00:38 <gmaan> do we have testing passing on py3.13?
18:01:10 <noonedeadpunk> and while debian is gonna release packages against 3.13 it's not a given they will work as intended if our code is buggy and was never tested against 3.13
18:01:11 * gouthamr timecheck
18:01:42 <noonedeadpunk> (even as unit tests)
18:01:58 <gouthamr> lets continue this discussion on the gerrit change, i'll have it up in a little bit
18:02:02 <frickler> https://review.opendev.org/c/openstack/devstack/+/954653 passes
18:02:07 <gouthamr> anything else to note in the minutes today?
18:02:11 <gmaan> this is unit test job results
18:02:14 <gmaan> #link https://zuul.opendev.org/t/openstack/builds?job_name=openstack-tox-py313&skip=0
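The build listing linked above is served by Zuul's REST API; a sketch of consuming it, using a made-up JSON sample in place of a live network call (field names assumed from the public `builds` endpoint):

```python
# Sketch: summarize recent openstack-tox-py313 results from Zuul's API.
import json
from urllib.parse import urlencode

# The endpoint the web UI above queries under the hood.
api = "https://zuul.opendev.org/api/tenant/openstack/builds?" + urlencode(
    {"job_name": "openstack-tox-py313", "limit": 50}
)

# Made-up sample response, standing in for urllib.request.urlopen(api).
sample = json.loads("""
[
  {"job_name": "openstack-tox-py313", "project": "openstack/nova", "result": "SUCCESS"},
  {"job_name": "openstack-tox-py313", "project": "openstack/cinder", "result": "FAILURE"}
]
""")
passed = sum(1 for b in sample if b["result"] == "SUCCESS")
print(f"{passed}/{len(sample)} sample builds passed")
```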
18:02:32 <gmaan> one thing from me
18:02:45 <gouthamr> go for it, gmaan
18:02:52 <gmaan> I will not be running for the next TC term
18:03:08 <gmaan> just that update ^^
18:03:09 <gouthamr> :(
18:03:26 <noonedeadpunk> ;(
18:03:39 <gmaan> I will be around if any input needed but not as a official TC members
18:03:58 <gouthamr> it's been a long run with you on the TC so this would be a huge change
18:04:36 <gouthamr> thank you for all the hard work, you can bet we'll still tag you with things
18:04:39 <noonedeadpunk> and the value you brought can't be overstated
18:04:55 <gmaan> thanks, sure I will be happy to help where I can
18:05:37 <spotz[m]> End of an era
18:06:13 <fungi> speaking of which, we'll need at least 4 tc candidates to fill the open seats in the next election, and at least 5 if there's going to be a poll
18:06:39 <gmaan> ++, hope to see more candidates.
18:06:57 <gouthamr> alright, we're well over our time slot.. lets wrap up this meeting
18:07:01 <gouthamr> thank you all for joining
18:07:03 <gouthamr> #endmeeting