*** elodilles_pto is now known as elodilles | 08:03 |
tkajinam | I'm always confused by the policy about EM/EOL, but who should decide whether an unmaintained branch should be retired in a project without an unmaintained liaison assigned? Can the PTL make the decision? | 08:26 |
frickler | tkajinam: yes, in my understanding of the policy, the final decision is always with the PTL | 09:44 |
opendevreview | Merged openstack/project-team-guide master: Update instructions for adding new requirements https://review.opendev.org/c/openstack/project-team-guide/+/935270 | 09:56 |
cardoe | I need to unfortunately miss TC meeting. It's right at the same time I have to take my son to a doctor's appointment. | 13:56 |
gouthamr | cardoe ack; no problem! | 15:11 |
slaweq | gouthamr hi, I have some errands to do today and I may be late on the TC meeting. I will do my best to be there but please don't wait for me to start | 15:47 |
gouthamr | ack slaweq thanks for letting me know | 15:47 |
opendevreview | Dmitriy Rabotyagov proposed openstack/governance master: Retire Murano/Senlin/Sahara OpenStack-Ansible roles https://review.opendev.org/c/openstack/governance/+/935677 | 16:35 |
gouthamr | tc-members: a gentle reminder that we're meeting here in ~59 minutes | 17:01 |
opendevreview | Dmitriy Rabotyagov proposed openstack/openstack-manuals master: Retire Senlin/Sahara/Murano roles for OSA https://review.opendev.org/c/openstack/openstack-manuals/+/935684 | 17:02 |
opendevreview | Dmitriy Rabotyagov proposed openstack/governance master: Add ansible-role-httpd repo to OSA-owned projects https://review.opendev.org/c/openstack/governance/+/935694 | 17:57 |
gouthamr | #startmeeting tc | 18:01 |
opendevmeet | Meeting started Tue Nov 19 18:01:09 2024 UTC and is due to finish in 60 minutes. The chair is gouthamr. Information about MeetBot at http://wiki.debian.org/MeetBot. | 18:01 |
opendevmeet | Useful Commands: #action #agreed #help #info #idea #link #topic #startvote. | 18:01 |
opendevmeet | The meeting name has been set to 'tc' | 18:01 |
gouthamr | Welcome to the weekly meeting of the OpenStack Technical Committee. A reminder that this meeting is held under the OpenInfra Code of Conduct available at https://openinfra.dev/legal/code-of-conduct. | 18:01 |
gouthamr | Today's meeting agenda can be found at https://wiki.openstack.org/wiki/Meetings/TechnicalCommittee | 18:01 |
gouthamr | #topic Roll Call | 18:01 |
noonedeadpunk | o/ | 18:01 |
gmann | o/ | 18:01 |
gtema | o/ | 18:01 |
bauzas | \o | 18:02 |
gouthamr | noted absence: c a r d o e, s l a w e q | 18:02 |
frickler | o/ | 18:02 |
gouthamr | courtesy-ping: spotz [m] | 18:03 |
gouthamr | not in the channel, likely somewhere in the conference circuit :) | 18:03 |
gouthamr | lets get started.. | 18:04 |
frickler | may be an issue with the matrix bridge hiding ppl | 18:04 |
gouthamr | ack | 18:04 |
gouthamr | #topic Last Week's AIs | 18:04 |
gouthamr | we took an AI to see if we can revive the erstwhile Third-party CI SIG through the help of project teams that have a requirement for third party CI | 18:06 |
spotz[m] | o/ | 18:06 |
gouthamr | we had a pretty long discussion following our meeting, and jbernard brought it up at the cinder weekly meeting.. i don't suppose we have updates already.. we did suggest that project teams must evaluate the requirement if we're unable to support contributors putting together their CI | 18:07 |
gouthamr | re-evaluate* | 18:07 |
gmann | I think this is settled now? I saw it was discussed in the cinder meeting and some operators having 3rd party CI mentioned they would help/share docs etc with the cinder team | 18:07 |
jbernard | i can give a quick update from cinder if there is interest | 18:07 |
gouthamr | please do, jbernard | 18:08 |
jbernard | sure, | 18:08 |
gouthamr | #link https://meetings.opendev.org/meetings/cinder/2024/cinder.2024-11-13-14.01.log.html#l-23 (cinder meeting discussion of third party CI) | 18:08 |
jbernard | i reached out to Inori directly to understand the cinder-specific issues, i think progress has been made on that front, my plan is to incorporate that feedback into the doc we create | 18:08 |
jbernard | as a team, we agreed that we need to do better in documenting our ci requirements and setup procedure | 18:09 |
jbernard | a few volunteered to help with a doc | 18:09 |
jbernard | im in the process of collecting all that we have currently, with the plan of consolidating it into a single working document for our ci, both for newcomers and existing vendors to reference | 18:10 |
gouthamr | ++ that's great | 18:10 |
clarkb | would also be good to share with other projects that require/use third party ci so they can see if it can be adapted to their needs too | 18:10 |
gmann | jbernard: ++, thanks. really appreciate that. | 18:11 |
clarkb | I don't think cinder necessarily has to solve any manila or ironic or nova problems, but at least make them aware that the documentation is there and can be reconsumed | 18:11 |
jbernard | clarkb: certainly, once we have something that's usable, ill send a mail to the list | 18:11 |
clarkb | ++ thanks | 18:11 |
gouthamr | thanks jbernard | 18:11 |
gmann | clarkb: ++ | 18:11 |
gmann | maybe we can add something in the p-t-g; even if it is a cinder example it will be a good ref | 18:11 |
gouthamr | that's all the AIs we took last week.. was anyone tracking anything else? | 18:11 |
jbernard | ideally we could have a working poc project to use as a starting point, but im not yet sure how feasible or how much time that will take | 18:11 |
jbernard | but if it looks like it could be in reach, we'll try to go for it | 18:12 |
jbernard | that's all on ci that I have | 18:12 |
gouthamr | yeah, if the docs are frustrating to follow, someone will step in and automate it :) | 18:12 |
gouthamr | moving on.. | 18:13 |
gouthamr | #topic Status of migration to Ubuntu Noble (gmann) | 18:13 |
gouthamr | gmann: floor is your | 18:13 |
gouthamr | gmann: floor is yours* | 18:13 |
gmann | I sent the status mail yesterday night | 18:14 |
gmann | #link https://lists.openstack.org/archives/list/openstack-discuss@lists.openstack.org/thread/JOMDY26TCW7OX3NXRGOYQCIDXNNJ4E25/ | 18:14 |
gmann | or you can see the current status in the etherpad which will be more live | 18:14 |
gmann | #link https://etherpad.opendev.org/p/migrate-to-noble#L37 | 18:14 |
bauzas | thanks gmann for the hard work | 18:14 |
gmann | 14 projects are all green on Noble (for many, fixes/changes are up to merge once base devstack/tox jobs are ready to migrate) | 18:15 |
gmann | other projects are failing, but there are cases which might be non-Noble-related issues | 18:15 |
gtema | for me it is honestly a pain - openstackdocstheme is broken due to pbr and fixing that one is also not trivial | 18:15 |
gmann | I am going through those and trying to fix/filter out them for noble specific issue | 18:16 |
gouthamr | i'd throw in one more: manila.. we identified a couple of issues and i've pinned jammy to debug them | 18:16 |
clarkb | gtema: fwiw I've seen that pbr change but pbr works fine with python3.12 in zuul. It still isn't clear to me why that change is necessary | 18:16 |
gtema | I also wonder about some statuses in the etherpad, since e.g. the first change to keystone today failed on the docs job, where now we need to drop sphinxcontrib-*diag | 18:16 |
gmann | gtema: yes. doc job is not yet ready due to openstackdocstheme | 18:16 |
bauzas | gtema: because of ceph, heh ? :) | 18:16 |
gmann | #link https://bugs.launchpad.net/pbr/+bug/2088360 | 18:16 |
bauzas | whoops | 18:16 |
bauzas | s/gtema/gouthamr | 18:16 |
clarkb | but also I think that change is close? its the logging changes that are holding it up now | 18:16 |
gtema | clarkb - locally it passes | 18:17 |
gmann | gtema: clarkb: can you add the link? | 18:17 |
gtema | with py36 I mean (no test for py27) | 18:17 |
gtema | #link https://review.opendev.org/c/openstack/pbr/+/924216 | 18:17 |
gouthamr | bauzas: actually, i'm scratching my head about ceph jobs.. i've to test locally to reproduce issues you and rdhasman reported.. but, gmann's changes to move ceph jobs to noble pass in the CI.. | 18:17 |
gtema | it is referred in the launchpad issue | 18:17 |
bauzas | gouthamr: :cross_fingers_emoji_ish: | 18:18 |
gmann | gtema: ++ | 18:18 |
gouthamr | bauzas: here's a no-op against devstack-plugin-ceph: https://review.opendev.org/c/openstack/devstack-plugin-ceph/+/935398 ,.. the failure is in the docs job | 18:18 |
clarkb | gtema: if I had to guess setuptools in other platforms doesn't configure logging at that "low" (info) level or similar | 18:18 |
clarkb | and going through the old distutils mechanism did? | 18:19 |
gtema | and literally - without doing anything, docs jobs in keystone are broken today (due to blockdiag) and it is also a pain to replace them - there are a lot of those | 18:19 |
gmann | gtema: I logged it for sdk, as openstackdocstheme uses StoryBoard and it is very hard to log and track in two places | 18:19 |
clarkb | but again it isn't clear to me why this is needed | 18:19 |
gmann | but thanks for checking and adding detail there | 18:19 |
clarkb | pbr works under python3.12 if you install setuptools | 18:19 |
gtema | clarkb - openstackdocstheme does import from pbr which is not working | 18:19 |
clarkb | got it this is runtime use of pbr not install time | 18:20 |
clarkb | that is helpful | 18:20 |
gtema | clarkb: soo https://bugs.launchpad.net/pbr/+bug/2088360/comments/3 | 18:20 |
fungi | is that with setuptools already installed? | 18:20 |
clarkb | fungi: based on that traceback I'm thinking no | 18:20 |
gtema | some bits in pbr still import distutils | 18:20 |
frickler | can we maybe discuss the technical details after the meeting? this is getting a bit out of scope for me | 18:20 |
clarkb | frickler: sure | 18:20 |
gmann | yeah, we can discuss after meeting | 18:20 |
fungi | yeah, newer python dropped distutils, if you need distutils on newer python you can get it by installing setuptools, which now provides it | 18:21 |
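(A minimal illustrative sketch, not from the meeting itself: fungi's point above is that Python 3.12 dropped distutils from the stdlib, so a runtime import of distutils — as some pbr code paths still do — only works when setuptools is installed, since setuptools now ships its own copy.)

```python
# Minimal sketch of the Python 3.12 / distutils situation described above.
# Illustrative only; the exact pbr code path is tracked in the linked bug.
import sys

try:
    import distutils  # some pbr runtime paths still do the equivalent of this
    print("distutils importable on Python %s (provided by setuptools): %s"
          % (sys.version.split()[0], distutils.__file__))
except ModuleNotFoundError:
    # On Python >= 3.12 with no setuptools in the environment, distutils is gone.
    print("no distutils on Python %s; installing setuptools restores it"
          % sys.version.split()[0])
```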
gmann | so my main purpose in adding this topic here is that the deadline for migrating the base jobs is Nov 29 and there are still many projects that are not green yet. I would like to know if we want to extend the deadline? | 18:21 |
gtema | gmann - looking at the amount of issues I do not believe we will make it | 18:21 |
gouthamr | awesome, thanks for sharing the status of the goal here and for all the hard work around this, gmann .. | 18:21 |
gouthamr | where does that deadline come from? | 18:22 |
gmann | personally I do not want to extend it, and if any project stays in a broken state then we have the option to pin the nodeset to jammy and they can continue fixing it | 18:22 |
gtema | especially since some changes have been open for months and nobody reviewed them | 18:22 |
frickler | IMO nothing's going to happen in December either, so I don't think that would help much | 18:22 |
gouthamr | ^ +1 | 18:22 |
clarkb | frickler: the inverse may be true though, right? it's a low impact time to make changes like that and let them more slowly get fixed | 18:22 |
gmann | well, we will be migrating base jobs early in the cycle, and moving the deadline makes the situation worse | 18:22 |
clarkb | vs trying to do it closer to the release when everyone is cramming in features last second | 18:23 |
noonedeadpunk | I agree that it's better to do it at the beginning | 18:23 |
frickler | yes, so keep the deadline, switch, put more pressure on projects to fix things | 18:23 |
* gouthamr isn't clear how we picked the deadline.. | 18:23 |
gmann | gtema: true, but at some point we have to break things to fix them, and if any project is not actively checking their failures then they can pin the nodeset | 18:23 |
gtema | hmm, that is so untrue - pbr is broken and it affects everybody | 18:24 |
frickler | gouthamr: iiuc it is already delayed from "early in the cycle" | 18:24 |
gtema | and the change there is open since July | 18:24 |
gmann | and if some complex issue is there and a project needs time, we also have a workaround instead of holding the complete CI | 18:24 |
noonedeadpunk | wait, so you're saying that pbr is broken with py3.12? | 18:24 |
gtema | yes, pbr is not working with 312 as of now | 18:25 |
bauzas | why exactly ? | 18:25 |
fungi | installing pbr without setuptools is broken | 18:25 |
gtema | it has imports of distutils which are dropped | 18:25 |
bauzas | I got no issues with noble | 18:25 |
bauzas | ah that | 18:25 |
gmann | distutils is not there | 18:25 |
fungi | distutils moved from the python stdlib into setuptools | 18:25 |
noonedeadpunk | but wasn't 3.12 in pti for 2024.2? | 18:25 |
clarkb | this is fud | 18:25 |
gmann | noonedeadpunk: it was | 18:25 |
clarkb | pbr works fine it requires setuptools you must install setuptools | 18:25 |
gmann | where there is no setuptools it is an issue | 18:26 |
gtema | noonedeadpunk - it makes so much difference whether you do tox -e py312 or you use py312 as a base for all jobs | 18:26 |
clarkb | it could be improved but it is not required to work with python3.12 | 18:26 |
noonedeadpunk | Yeah, I think it's true what clarkb is saying | 18:26 |
noonedeadpunk | gtema: well, I've been running 24.04 for almost half a year, and never saw tox failing due to pbr so far | 18:27 |
noonedeadpunk | maybe it's not always used.... | 18:27 |
gtema | right, so now pls take e.g. the keystone project and try to build docs with py312 | 18:27 |
gmann | let's review it on gerrit but I do not think that should be a blocker for migration as things work with setuptools | 18:27 |
noonedeadpunk | sec | 18:27 |
gouthamr | okay; couple of you have expressed that this is ETOOMUCHDETAIL for this meeting; lets discuss implementation concerns right after we end.. | 18:27 |
gmann | we still have 10 days and we can see how many get fixed. I am concentrating only on this migration and have asked project teams to do so also | 18:27 |
gouthamr | gmann: it looks like there's no need to delay this | 18:28 |
gmann | so I think we are good to stay on the Nov 29 deadline and not extend it, right? | 18:28 |
gmann | gouthamr: yeah, holiday time is also good for fixing things | 18:28 |
frickler | switching docs builds to noble/3.12 doesn't need to be coupled with that | 18:28 |
gmann | frickler: exactly | 18:28 |
gtema | -1 for blindly sticking | 18:28 |
frickler | so just switch unit tests + devstack | 18:28 |
gmann | we do not need to hold everything for a small portion of projects | 18:28 |
bauzas | how big is that portion ? | 18:29 |
gmann | gtema: the doc job will not be migrated as part of this | 18:29 |
bauzas | I tend to agree with you but I want to be sure that's a minority | 18:29 |
gmann | that is already separated out | 18:29 |
gouthamr | docs jobs, failing project jobs noted in the etherpad.. | 18:29 |
gmann | #link https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/935459/1 | 18:29 |
gtema | I hate this mix - it is too dirty and not helpful | 18:30 |
gmann | well, holding the integration testing for this is not a good idea | 18:30 |
gouthamr | gtema: i think we'd want to move incrementally.. | 18:30 |
gmann | and delaying the migration makes it worse, pushing it into release time | 18:30 |
noonedeadpunk | gtema: it failed but with completely different issue: https://paste.openstack.org/show/bKzQbfNEiaAoG7QQbrz4/ | 18:31 |
gmann | we have faced that in the past when I personally wanted to have EVERYTHING GREEN before any job migration | 18:31 |
gouthamr | gtema: they're really not testing the same things, so we're batching things that are alike already.. no need to club all OpenStack CI Ubuntu jobs under one bucket, imo | 18:31 |
gtema | noonedeadpunk - that is what I mean, we just say we take a different py version and all of a sudden there are plenty of issues not related to whether the project itself supports this version or not | 18:32 |
noonedeadpunk | I don't think it's the py version, I would put it on the recent u-c update | 18:32 |
gmann | it passed here | 18:32 |
gmann | #link https://review.opendev.org/c/openstack/keystone/+/932934 | 18:32 |
gtema | I am frustrated - for me now nearly everything is broken, and for a week already I have just been fixing (working around) things everywhere in and out of my reach | 18:33 |
gtema | I can't work on any features, only repairing what was working before | 18:33 |
noonedeadpunk | I have exact same issue with python 3.11 fwiw | 18:33 |
fungi | gmann: 932934 ran openstack-tox-docs on jammy, fwiw | 18:34 |
gmann | fungi: yeah just checked https://zuul.opendev.org/t/openstack/build/a3d99e308ad74ee0ad7919f7b07490fd/log/job-output.txt#49 | 18:34 |
gmann | it was after I separated out the doc job migration | 18:34 |
noonedeadpunk | could it be that? https://opendev.org/openstack/requirements/commit/376cf882ece239abee0e2c2ffb83f3bbefc1819c | 18:34 |
noonedeadpunk | well, plenty of u-c were bumped today | 18:35 |
gmann | anyway, I am keeping the deadline the same and we will see how things go in those 10 days. The doc job migration can be done later once we are ready | 18:36 |
noonedeadpunk | let's try recheck your patch and see if it still works :) | 18:36 |
gtema | noonedeadpunk - everybody knows that those blockdiag extensions are not really expected to work and the majority of projects replaced them with graphviz, so it is not a question of what broke it. We need to replace them. I am just tired of only fixing things left and right | 18:36 |
noonedeadpunk | I know, I know... | 18:36 |
noonedeadpunk | I'm not arguing that plenty of things are broken and bumping u-c results in failures like that quite often... | 18:37 |
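(As a hypothetical illustration of the replacement gtema describes above: the conf.py fragment below is a sketch assuming a typical OpenStack docs setup, not taken from keystone or any specific review; it shows the sphinxcontrib *diag extensions being dropped in favour of the graphviz extension that ships with Sphinx.)

```python
# Hypothetical Sphinx conf.py fragment sketching the blockdiag -> graphviz swap.
extensions = [
    "openstackdocstheme",
    # "sphinxcontrib.blockdiag",  # removed: the *diag extensions are unmaintained
    # "sphinxcontrib.seqdiag",    # and no longer build on current toolchains
    "sphinx.ext.graphviz",        # each *diag figure gets redrawn as a .dot graph
]
```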
gouthamr | okay, lets wrap up this discussion, and move to the next topic | 18:37 |
gouthamr | #topic Watcher's launchpad tracker recovery | 18:37 |
gouthamr | #link https://answers.launchpad.net/launchpad/+question/819336 | 18:37 |
gmann | gtema: feel free to add bugs you are facing in the etherpad; that will be helpful to know where we are | 18:38 |
bauzas | no answers unfortunately from my side :( | 18:38 |
gouthamr | ^ so launchpad admins provided the projects (watcher, watcher-dashboard and python-watcherclient) to openstack-admins | 18:38 |
gmann | gouthamr: this is still not openstack-admin | 18:38 |
gouthamr | can you folks kick out the watcher-drivers here? | 18:38 |
gmann | #link https://launchpad.net/~watcher-drivers | 18:38 |
bauzas | I invited him thru LinkedIn but no response so far and I can't InMail | 18:38 |
gmann | or you are expecting to create a new driver team? | 18:38 |
gouthamr | yes | 18:38 |
gmann | I was expecting to have ownership of watcher-drivers changed | 18:39 |
gouthamr | i'm asking openstack-admins to take over now, and allow sean-k-mooney and folks to create a different team | 18:39 |
gouthamr | gmann: me too | 18:39 |
spotz[m] | Yeah I don't see any of the admins there even individually | 18:39 |
gouthamr | but, is this sufficient? | 18:39 |
gmann | hmm, this is a workaround we can do, but did they deny changing ownership of the drivers team? | 18:39 |
gouthamr | bauzas: david's been reading my linkedin messages, but, like i mentioned earlier.. i don't think he can do anything, his launchpad account is associated with his former employer.. he may have no way to log in | 18:40 |
gmann | gouthamr: maybe we can try that too, if you can ask them to make this change as well? | 18:40 |
gouthamr | gmann: because we want consistency? | 18:40 |
fungi | openstack-admins group members don't allow/prevent creating new groups in lp, just make sure to add openstack-admins as the owner when creating | 18:40 |
bauzas | gouthamr: ack | 18:40 |
gmann | gouthamr: yeah, otherwise we have to create another drivers team with a different name | 18:41 |
sean-k-mooney | i think normally there are 3 teams | 18:41 |
gouthamr | gmann: yeah, my problem was the delay we've had to get LP folks to act on this | 18:41 |
gmann | gouthamr: I mean that is doable but let's see if they can hand over the existing drivers team | 18:41 |
sean-k-mooney | a driver team, an open bug team and a coresec team | 18:41 |
gouthamr | i must send Billy Olsen a thank you message :D | 18:41 |
frickler | seems they added openstack admins as maintainers of the 3 projects, so we should be able to go on from that | 18:41 |
gouthamr | sean-k-mooney: if that's what the team wants | 18:41 |
gmann | as you asked "We'd like help to reassign the "watcher-drivers" team to "openstack-admins".. Or a way to change the ownership of the three projects to "openstack-admins"." | 18:41 |
gouthamr | i know, but they ignored that :D | 18:42 |
fungi | projects can get by without a bug supervisor group if they allow open bug assigning and targeting | 18:42 |
gmann | and they did the 2nd portion. I think they can do the 1st also? | 18:42 |
sean-k-mooney | gouthamr: well we don't necessarily need 3 but i would like to have at least 2. but we can figure out the details later once openstack-admins is an owner of the current one | 18:42 |
gmann | gouthamr: can you post it explicitly, and if they do not do it within this week then we can go with another drivers team | 18:42 |
gouthamr | gmann: okay can do.. | 18:43 |
gmann | gouthamr: thanks | 18:43 |
gmann | as we are waiting for sean-k-mooney's proposal for core team things this week as well, we can also wait a week for LP | 18:43 |
sean-k-mooney | yep | 18:43 |
gouthamr | sean-k-mooney: ack; do let openstack-admins know what you'd like once that's done.. | 18:43 |
gouthamr | this topic is not entirely about watcher | 18:44 |
sean-k-mooney | we are hoping to restart the watcher irc meetings so we can add it as an agenda item and report back | 18:44 |
gmann | I can handle that from the admin side once it is added to the drivers team, or we create a new one. will work with sean-k-mooney on that | 18:44 |
gmann | sean-k-mooney: ++ | 18:44 |
gouthamr | if there are any other launchpad trackers that are out of "openstack-admins" purview, we'd need to get them fixed.. please alert us if you know of any | 18:44 |
gouthamr | lets move on.. | 18:45 |
gouthamr | #topic A check on gate health | 18:45 |
frickler | flaky I'd say | 18:45 |
gouthamr | any gate health concerns, not pertaining to the noble effort? | 18:45 |
frickler | lots of gate failures I saw during the release and reqs rush today | 18:46 |
clarkb | docker hub may or may not be enforcing more restrictive rate limits | 18:46 |
gmann | I have not checked much this week but the tempest jobs are not so bad and things are merging in Tempest/nova | 18:46 |
frickler | nothing standing out in particular, the usual volume stuff | 18:46 |
clarkb | that probably doesn't affect most openstack projects, though the deployment projects are affected | 18:46 |
gouthamr | there were 10 requirements bumps merged in the past few hours :O | 18:46 |
gtema | partially: https://review.opendev.org/c/openstack/releases/+/934416 (os-api-ref) release is necessary since some api-ref jobs are broken without people knowing that | 18:46 |
gmann | gouthamr: :) good timing :P | 18:47 |
gtema | nova api-ref is broken and the fix has been struggling to land for 2 weeks | 18:47 |
frickler | gouthamr: and more pending | 18:47 |
gouthamr | \o/ | 18:47 |
bauzas | gtema: yup, indeed | 18:47 |
bauzas | I can say that nova api-ref changes are broken now | 18:48 |
frickler | I'll ping the other release team members about that patch, missed it due to the e-1 rush | 18:49 |
gouthamr | thanks gtema frickler | 18:50 |
gouthamr | so gate flakiness is to be expected as we start rolling out these u-c bumps | 18:50 |
gouthamr | we'll check back next week and hopefully there isn't a need for any reverts... | 18:50 |
gouthamr | anything else about $topic | 18:51 |
frickler | IMO it is unrelated, but we'll see, ack | 18:51 |
bauzas | yet again a question of how we can ensure we don't trample projects every time we bump u-c | 18:51 |
bauzas | :) | 18:51 |
fungi | running a lot more test jobs is the only solution i'm aware of | 18:51 |
gouthamr | we have "cross" jobs.. but these are unit test jobs and maybe the coverage is insufficient - this is something project teams can assess and help with | 18:51 |
frickler | bauzas: have stable (non-flaky) integration jobs that we can afford to run on the reqs repo | 18:52 |
frickler | and make them faster than 2h if possible | 18:52 |
gmann | we have many projects testing on reqs but not all | 18:52 |
bauzas | yeah, I'd appreciate if we could run tempest scenarios on a u-c change | 18:52 |
gmann | we do right? | 18:52 |
gmann | tempest-full-py3 is there | 18:53 |
frickler | we run tempest-full-py3 and one sdk integration job | 18:53 |
bauzas | I dunno, b/c we recently hit hard problems to solve in a short timeframe | 18:53 |
bauzas | hmmm | 18:53 |
frickler | feel free to add more jobs, but also be prepared to have them removed again if they turn out to be failing too often | 18:53 |
bauzas | I then need to recollect *why* we missed that last time | 18:53 |
* gouthamr recalls the issue being with openstacksdk changes | 18:54 |
bauzas | my brain is fried tonight so I miss context | 18:54 |
frickler | and also make sure to react to issues coming up. we have ~ 10 bumps blocked because projects need to fix stuff | 18:54 |
gouthamr | sry, openstackclient i think | 18:54 |
gmann | yeah, if jobs are stable those can be added, and those should not take more time than tempest-full-py3 | 18:54 |
frickler | and I'm currently the only one that actually looks at those failures and pings projects, help could be really well spent there | 18:54 |
bauzas | gouthamr: correct, it was a major OSC change that broke our world, so we needed to revert to old OSC version | 18:55 |
gmann | bauzas: ah, i remember now. | 18:55 |
gmann | we talked about adding a grenade job there | 18:55 |
bauzas | so I wonder why we didn't capture that in CI | 18:55 |
frickler | bauzas: because nova doesn't run an sdk/osc-tips job? | 18:55 |
gouthamr | ~~ time check; we've 5 mins ~~ | 18:55 |
gtema | bauzas - because, like at the beginning of the meeting: a project may run well with an updated dep, but that doesn't mean that nobody else is using it in a totally different way | 18:56 |
bauzas | anyway, I agree with the timecheck, we shouldn't solve that problem now | 18:56 |
bauzas | I'll just pay attention | 18:56 |
gmann | I was adding more cross-service jobs on sdks but that is not green yet | 18:57 |
gouthamr | +1 it's in the TC tracker | 18:57 |
gouthamr | but, lets move the "TC Tracker" topic to next week; i do want to find owners for some PTG takeaways too.. we've three minutes to head into | 18:57 |
gouthamr | #topic Open Discussion | 18:57 |
gouthamr | a few new changes have shown up here: | 18:58 |
gmann | frickler: bauzas this one | 18:58 |
gmann | #link https://review.opendev.org/c/openstack/python-openstackclient/+/931858 | 18:58 |
gouthamr | #link https://review.opendev.org/q/project:openstack/governance+status:open (Open Governance changes) | 18:58 |
gouthamr | can i have some eyes on https://review.opendev.org/c/openstack/governance/+/931254 | 18:58 |
gouthamr | this moves a lot of the eventlet removal content out of the goal doc and into a wiki because (a) it is background information (b) alternatives will continue to be fleshed out, and this wasn't meant to be prescriptive of specific directions | 19:00 |
bauzas | I wanted to review the eventlet goal but had no time yet :( | 19:00 |
gouthamr | ack; please do take a look soon .. | 19:00 |
gouthamr | that said, we've reached the hour.. | 19:00 |
gouthamr | thank you all for attending | 19:00 |
gouthamr | we can continue discussing things here after i endmeeting... | 19:00 |
gouthamr | but see you here next week if you have to run.. | 19:01 |
gouthamr | #endmeeting | 19:01 |
opendevmeet | Meeting ended Tue Nov 19 19:01:10 2024 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4) | 19:01 |
opendevmeet | Minutes: https://meetings.opendev.org/meetings/tc/2024/tc.2024-11-19-18.01.html | 19:01 |
opendevmeet | Minutes (text): https://meetings.opendev.org/meetings/tc/2024/tc.2024-11-19-18.01.txt | 19:01 |
opendevmeet | Log: https://meetings.opendev.org/meetings/tc/2024/tc.2024-11-19-18.01.log.html | 19:01 |