15:00:48 <mnaser> #startmeeting tc
15:00:49 <openstack> Meeting started Thu Mar 11 15:00:48 2021 UTC and is due to finish in 60 minutes.  The chair is mnaser. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:50 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:52 <openstack> The meeting name has been set to 'tc'
15:00:56 <jungleboyj> o/
15:00:56 <mnaser> #topic roll call
15:00:57 <mnaser> o/
15:00:59 <ricolin> o/
15:00:59 <gmann> o/
15:01:03 <belmoreira> o/
15:01:17 <redrobot> o/
15:02:12 <yoctozepto> \o/
15:02:51 <mnaser> #topic Follow up on past action items
15:03:05 <mnaser> #link http://eavesdrop.openstack.org/meetings/tc/2021/tc.2021-03-04-15.03.html
15:03:12 <mnaser> we don't have anything listed, so we can skip that for today
15:03:18 <mnaser> #topic Audit SIG list and chairs (diablo_rojo)
15:03:26 <mnaser> cc ricolin on this one too
15:03:45 <yoctozepto> ping diablo_rojo_phon
15:04:53 <ricolin> we only got one patch in review now https://review.opendev.org/c/openstack/governance-sigs/+/778304
15:05:27 <mnaser> right, i think gmann brings up a good point about archiving things
15:05:36 <ricolin> yes
15:06:29 <ricolin> I thought we have ways to retire a SIG
15:06:31 <gmann> should we add sigs also to the TC-liaison list so that we periodically check their status/health?
15:06:46 <ricolin> gmann, +1
15:07:06 <jungleboyj> gmann:  ++
15:07:08 <fungi> we had them in there originally
15:07:08 <gmann> ricolin: yeah we have one for moving to the 'completed' state but not for 'unfinished' or so
15:07:50 <fungi> the first cycle or two that we did liaisons, we had an optional section for sigs and board-appointed committees/working groups
15:07:51 <gmann> ricolin: i mean if any SIG is retired directly from the 'forming' state
15:08:06 <gmann> fungi: i see.
15:08:35 <gmann> while redoing the liaison assignments for the Xena cycle we can add SIGs also to our automatic assignment script
15:08:39 <fungi> at the time people felt just keeping up with all the project teams was more work than we were able to get done in the cycle, but we also had much loftier goals for health measurement back then
15:08:54 <gmann> yeah
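For illustration only, a minimal sketch of what adding SIGs to an automatic liaison-assignment script could look like, assuming a simple random assignment; the member, team, and SIG names below are placeholders, not the real rosters or the actual governance tooling:

```python
#!/usr/bin/env python3
"""Sketch: randomly assign TC liaisons to project teams and SIGs.
All names below are placeholders, not the real rosters or tooling."""

import random

TC_MEMBERS = ["tc-member-a", "tc-member-b", "tc-member-c",
              "tc-member-d", "tc-member-e"]
PROJECT_TEAMS = ["nova", "neutron", "cinder"]        # placeholder list
SIGS = ["first-contact-sig", "security-sig"]         # placeholder list


def assign_liaisons(groups, members, per_group=2):
    """Map each group to a couple of randomly chosen TC members."""
    return {group: random.sample(members, per_group) for group in groups}


if __name__ == "__main__":
    # Treat SIGs the same way as project teams in the assignment pass.
    for group, liaisons in assign_liaisons(PROJECT_TEAMS + SIGS,
                                           TC_MEMBERS).items():
        print(f"{group}: {', '.join(liaisons)}")
```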
15:09:05 <ricolin> I guess this is currently all we have for retire process https://governance.openstack.org/sigs/reference/sig-guideline.html#retiring-a-sig
15:09:07 <mnaser> it's pretty hard to keep up with all the teams with all the things we have to deal with, that's what i found anyways
15:09:45 <gmann> ricolin: yeah, maybe we can add 'forming' -> 'retire' there also
15:10:10 <belmoreira> +1
15:10:26 <mnaser> maybe add a reason
15:10:26 <gmann> mnaser: true, at least if any SIG is not active we know who from the TC can follow up quickly
15:10:38 <mnaser> and say 'folded into XYZ'
15:10:45 <jungleboyj> mnaser:  ++
15:10:45 <ricolin> we should also ask to retire or migrate the SIG repos in the doc, I assume
15:11:18 <gmann> mnaser: +1 for reason, nice idea
15:11:25 <ricolin> a reason will definitely be something good for reactivating it later
15:11:53 <ricolin> I will update the doc to reflect these suggestions
15:12:00 <gmann> thanks.
15:12:17 <ricolin> Will also update the container SIG patch too
15:12:28 <ricolin> diablo_rojo_phon, ^^^
15:12:43 <mnaser> cool
15:12:45 <gmann> and should I add SIGs to the TC liaison list if all are ok?
15:12:48 <openstackgerrit> Merged openstack/election master: Close Xena Elections  https://review.opendev.org/c/openstack/election/+/779845
15:12:55 <mnaser> i guess we could
15:13:07 <gmann> ok,
15:13:47 <ricolin> I think we should
15:13:53 <ricolin> but on the other hand
15:13:57 <ricolin> popup team
15:14:14 <ricolin> what about popup teams?
15:14:16 <gmann> popup team has TC liaison/volunteer already.
15:14:29 <ricolin> oh, then we're all good :)
15:14:42 <gmann> #link https://governance.openstack.org/tc/reference/popup-teams.html
15:14:47 <gmann> 'TC Liaison'
15:15:29 <ricolin> the TC liaison for image encryption should be updated
15:16:14 <ricolin> or are we fine to have a non-TC member as the TC liaison for a popup team?
15:16:31 <mnaser> i think it's ok for it to just be a liaison and not necessarily a tc member
15:16:41 <mnaser> but maybe that's another discussion topic
15:16:42 <mnaser> :p
15:17:02 <gmann> yeah
15:17:10 <jungleboyj> :-)  I mean, he is an honorary member.  :-)
15:17:13 <mnaser> ricolin: wanna add that to next weeks agenda?
15:17:15 <ricolin> I think there's no need for more discussion as I believe in fungi for it :)
15:17:26 <jungleboyj> Poor guy will never be able to get away.
15:17:30 <ricolin> mnaser, I think we're good on this
15:17:44 <gmann> :) we will not let him go away
15:17:51 <jungleboyj> :-)
15:17:52 <fungi> heh, yeah i was a tc member when i originally served as the sponsor for that pop-up
15:17:57 <ricolin> yep!
15:18:04 <fungi> i still attend their weekly meetings
15:18:14 <gmann> +1
15:18:34 <mnaser> #topic Gate performance and heavy job configs (dansmith).
15:18:42 <jungleboyj> https://media1.giphy.com/media/KczBU4M2IEdClprXaq/giphy.gif?cid=ecf05e4772p0dj0zuysiay11z145hvnovyiqbthd0thwb6nx&rid=giphy.gif
15:18:45 <fungi> i want to say we originally decided that tc liaisons for pop-up teams didn't need to be tc members, i just happened to be in that case
15:18:45 <dansmith> oof, sorry
15:18:47 <mnaser> i think this one has been a rotating topic without that much progress
15:18:56 <mnaser> its been a busy week for all of us i think
15:19:13 <dansmith> yeah, so,
15:19:15 <fungi> i saw we finally caught up with our node request backlog around 02:00 utc today
15:19:21 <dansmith> the gate has been crazy busy
15:19:25 <gmann> yeah
15:19:29 <dansmith> I've seen a lot of cinder fail,
15:19:29 <jungleboyj> mnaser:  It has at least gotten some visibility in Cinder and we are working on cleaning up failures that are slowing the checks.
15:19:41 <dansmith> and the tempest queue has been somewhat problematic
15:19:58 <dansmith> we're definitely doing a lot of work, which is great
15:20:04 <gmann> yeah yesterday we finally got many of them merged in tempest but there was an issue there
15:20:19 <dansmith> given the last couple weeks have been atypical (for normal, not for this part of the cycle), it's hard to tell how good we are or aren't
15:20:29 <dansmith> but some things have taken millions of rechecks to get landed
15:20:29 <gmann> and obviously it started happening during release time
15:20:31 <fungi> today is not so bad, i guess because we're at/past the freeze deadline now?
15:20:48 <dansmith> fungi: the major rush was yesterday for sure
15:20:51 <fungi> node backlog reached nominal levels around 13:00 utc
15:21:25 <fungi> there's a little bump at the moment, but there were brief periods in the past two hours where we weren't even using all of our quota
15:21:40 <dansmith> mnaser: personally I think this is a good thing for us to keep eyes on.. doesn't have to be every week, but I think keeping it on the radar has yielded good stuff, IMHO
15:21:52 <mnaser> yeah i think lets keep it on the radar
15:21:53 <mnaser> i agree
15:21:58 <gmann> agree
15:22:04 <jungleboyj> ++
15:22:24 <fungi> also the additional quota from inap has really helped in the past few weeks
15:22:35 <dansmith> fungi: yeah, really seems like it
15:22:37 <fungi> things would have been much worse without it
15:22:52 <dansmith> yesterday it was almost eight hours to get jobs running for a while,
15:22:57 <dansmith> but with a huuuge queue
15:23:09 <dansmith> so it felt like things were doing pretty well considering all the fail
15:23:58 <fungi> there's been some push on the ml to solve some cinder-related failures by switching the iscsi signalling, is it?
15:24:14 <fungi> something which was causing a lot of job failures anyway
15:24:23 <jungleboyj> Yes.
15:24:39 <jungleboyj> I am not sure where that landed after discussion yesterday though.
15:24:53 <dansmith> switching to or from iscsi,
15:25:02 <dansmith> or switching something about how we use it?
15:25:08 <jungleboyj> Switching how we use it.
15:25:11 <fungi> lio vs tgt i think?
15:25:16 <jungleboyj> From tgt to lio
15:25:25 <dansmith> ah
15:25:43 <fungi> anyway, would be good not to lose sight of it with the change volume dropping as the cycle goes through its post-freeze state change
15:26:11 <mnaser> this sounds all good so we'll keep watching over things :)
15:26:22 <mnaser> i think we can move on to the next item
15:26:26 <dansmith> yup
15:26:31 <mnaser> #topic Consensus on lower constraints testing (gmann)
15:26:54 <gmann> it seems there's no objection to the proposed plan on the ML #link http://lists.openstack.org/pipermail/openstack-discuss/2021-February/020556.html
15:27:11 <gmann> which is basically 1. Only keep direct deps in lower-constraints.txt 2. Remove the lower constraints testing from all stable branches.
15:27:47 <gmann> and it will be easy to maintain
15:28:08 <gmann> like for nova 77 deps can be removed from l-c #link https://review.opendev.org/c/openstack/nova/+/772780
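As a rough sketch of the "direct deps only" idea, the following assumes requirements.txt lists the direct dependencies and lower-constraints.txt uses plain pip-style lines; anything in lower-constraints.txt that is not in requirements.txt is treated as an indirect dependency that could be dropped (test-requirements.txt is ignored for brevity). This is illustrative, not the tooling actually used for the nova change:

```python
#!/usr/bin/env python3
"""Sketch: list lower-constraints.txt entries that do not appear in
requirements.txt (i.e. likely indirect dependencies). Illustrative only."""

import re


def package_names(path):
    """Return the set of package names declared in a pip-style file."""
    names = set()
    with open(path) as handle:
        for line in handle:
            line = line.split("#")[0].strip()   # drop comments/blank lines
            if not line:
                continue
            # Name is everything before the first version/extras/marker char.
            name = re.split(r"[<>=!~;\[\s]", line, maxsplit=1)[0]
            if name:
                names.add(name.lower())
    return names


if __name__ == "__main__":
    direct = package_names("requirements.txt")       # direct dependencies
    lower = package_names("lower-constraints.txt")   # currently pinned lows
    for candidate in sorted(lower - direct):
        print(candidate)  # candidates to drop from lower-constraints.txt
```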
15:28:11 <fungi> #2 includes removing it from new stable branches when they get created
15:28:20 <gmann> +1
15:28:53 <fungi> stable branches need stable jobs, and those won't be stable over time
15:28:59 <gmann> as a next step, i feel we should document it somewhere: in the project guide, the PTI, or a resolution?
15:29:07 <yoctozepto> I agree, it makes most sense to have some stub for it and fix when needed
15:29:52 <gmann> I feel PTI is better place?
15:30:05 <ricolin> gmann, do you think a goal is too strong/enforcing for this?
15:30:07 <gmann> or resolution and then update PTI
15:30:39 <gmann> ricolin: it is not too strong, i think; it's just removing the indirect ones, which would not cause much work
15:30:40 <fungi> we haven't previously required the use of lower-constraints jobs, so it seems weird to have a policy requiring something about a non-required job
15:31:28 <yoctozepto> indeed
15:31:39 <fungi> i think so far the pti only lists necessary policy, so this would be a shift to also including guidance i guess
15:32:03 <yoctozepto> I am with fungi on this
15:32:07 <yoctozepto> better not
15:32:25 <jungleboyj> Agreed.
15:32:29 * mnaser personally defers to the others on this
15:32:42 <gmann> true, for projects' clarity we can at least add it somewhere, since projects asked the TC to have some guidelines on this
15:32:48 <fungi> i agree with the guidance, just seems like maybe not something that needs to be enshrined in openstack's governance
15:33:21 <yoctozepto> agreed
15:33:43 <fungi> does the qa team maintain content in the project teams guide?
15:33:57 <gmann> i do not think so
15:34:06 <fungi> i wonder if a section in there on testing recommendations (not policy) would fit
15:34:07 <gmann> i think the pti is the place where we all look for testing guidelines
15:34:33 <fungi> well, we certainly look there for policies that the tc has officially voted on
15:35:04 <yoctozepto> pti does not mention l-c at all
15:35:12 <gmann> yeah that's what projects were looking for: the TC to decide on l-c testing
15:35:17 <yoctozepto> in fact, the only place is pt guide
15:35:25 <gmann> yoctozepto: yes, that was the confusion i think when this was brought up on the ML
15:35:26 <fungi> just remember the pti is part of openstack's governing documents (it's in the governance repository along with things like tc resolutions)
15:35:37 <yoctozepto> yes
15:35:46 <gmann> and it was hard to maintain, and the pti does not talk about it, so should we remove it?
15:35:53 <gmann> remove the testing job?
15:36:28 * yoctozepto with his masakari ptl and kolla cores hats on admits to removing all l-c jobs
15:36:38 <gmann> I feel having all testing policy in a single place will be clearer
15:36:47 <yoctozepto> not a single tear was shed
15:36:55 <fungi> policy yes, but is this policy when it's about something not required?
15:37:02 <gmann> and 'do not test l-c on stable, and test direct deps on master' is a policy for testing
15:37:13 <yoctozepto> feels too brute
15:37:26 <gmann> at least like 'only requirement is to test direct deps on master'
15:37:44 <yoctozepto> so we are then adding one now, aren't we?
15:38:06 <gmann> i would say adding the one we were already testing without any clarity
15:38:51 <gmann> if we end up removing the l-c testing then I would agree
15:38:57 <yoctozepto> would make sense to query projects; perhaps some do not test l-c at all
15:39:09 <yoctozepto> fwiw, masakari had broken jobs which ran noop with l-c so :-)
15:39:25 <yoctozepto> just saying :D
15:39:38 <gmann> yeah because there was no clarity on whether to test or not
15:39:44 <yoctozepto> indeed
15:40:13 <yoctozepto> so do we want to test l-c? we know the shortcomings of the newly proposed approach
15:40:17 <gmann> and after checking 'who needs these' and 'whether it is worth testing or not' we ended up with: yes, we can at least test direct deps in a consistent way
15:40:21 <yoctozepto> it makes sense obviously
15:41:19 <yoctozepto> "accidental version bump, you shall not pass!"
15:41:56 <spotz> Hehe
15:42:00 <yoctozepto> I would vote on making this a policy then
15:42:19 <gmann> at the PTG, many projects will be discussing these (like nova will), so I think we should be ready with TC guidelines by then.
15:42:34 <fungi> so far, openstack has not mandated lower bounds testing, but many projects used lower-constraints jobs as an ad hoc standard. recent changes in pip made it apparent they could not be easily maintained on stable branches. some projects were cool with removing their l-c jobs entirely (they're not required after all), others wanted to keep the jobs but were looking for a compromise and so we've suggested
15:42:35 <fungi> that compromise is to just take them out of stable branches. none of that is policy
15:43:10 <yoctozepto> yes, none *is* at the moment
15:43:11 <mnaser> so maybe this is something we can leave up to the projects to decide but list the different options?
15:43:35 <fungi> it's all up to individual teams if they want to run l-c jobs at all, and they can also *try* to run them in stable branches if they like tilting at windmills, but it's inadvisable
15:43:41 <gmann> mnaser: projects wanted the TC to decide
15:44:04 <gmann> that was how the original discussion started, when neutron asked on the ML
15:44:19 <fungi> nova wants the tc to tell them whether and how to run lower-constraints jobs?
15:44:20 <gmann> after oslo started the thread on dropping those.
15:44:21 <mnaser> so if projects want the tc to decide, then it sounds like policy
15:45:08 <gmann> #link http://lists.openstack.org/pipermail/openstack-discuss/2021-January/019660.html
15:45:37 <gmann> from here it was started to have some common guidelines
15:46:00 <fungi> a tc policy of "you can do this if you want" isn't a policy, so if some projects want the tc to make a policy about lower-constraints jobs then it sounds like they're asking the tc to require these jobs when they were not previously required. that's a lot different from mere guidelines
15:46:28 <mnaser> ok, so a guideline sounds like a list of approaches to take
15:46:36 <gmann> well it can be "l-c testing can be done with direct deps only on master, and is not mandatory for stable"
15:47:08 <fungi> is nova asking the tc to decide how all projects will do lower bounds testing, or is nova asking the tc to provide them with some suggestions? the first is policy, the second is not
15:47:47 <gmann> it's not nova, it's from other projects too; neutron, for example, was looking for some common strategy on this.
15:47:55 <gmann> where some projects were dropping it and some not
15:48:17 <fungi> and i anticipate at least some projects to object to being required to add lower bounds testing they don't feel they have the capacity to stay on top of
15:48:49 <gmann> we have a job testing it and it runs on all projects/stable too, so why not put what we expect into the pti.
15:49:15 <fungi> and you'll need to decide how to determine what kinds of deliverables are required to have/add lower bounds testing, vs how to identify deliverables where it doesn't make sense
15:49:27 <gmann> that is true of many other testing also; not all projects test everything defined in the pti
15:49:39 <yoctozepto> let's recap what we know
15:49:50 <yoctozepto> 1) l-c testing was largely broken
15:49:55 <yoctozepto> 2) we survived
15:49:58 <yoctozepto> so?
15:50:09 <yoctozepto> no need for a policy if not required :D
15:50:26 <mnaser> i think we should revisit this next week
15:50:26 <mnaser> i'd like some time to chat over the next topic.
15:50:26 <fungi> it's not like upper-constraints which is centrally maintained, lower bounds are different for every project and not always trivial to identify, i'm unconvinced that it makes sense to start forcing it on project teams who don't see value in it
15:50:35 <mnaser> or we keep discussing this
15:50:42 <mnaser> and move the rest of topics next week
15:50:44 <mnaser> but yeah
15:50:54 <gmann> ok for next week as next topic is more important
15:50:58 <yoctozepto> perhaps it's good to vocalize on ptg
15:51:14 <jungleboyj> yoctozepto:  ++
15:51:49 <mnaser> #topic PTL assignment for Xena cycle leaderless projects (gmann)
15:51:54 <mnaser> #link https://etherpad.opendev.org/p/xena-leaderless
15:52:28 <gmann> We have 4 projects left as leaderless and 4 projects with late candidacies
15:52:38 <gmann> better than last cycle i think
15:53:00 <yoctozepto> (let's keep retiring and it will get better and better, yes)
15:53:02 <jungleboyj> That is better.
15:53:07 <ricolin> It is
15:53:09 <gmann> out of the first 4, Mistral might go with DPL as discussed previously
15:53:21 <yoctozepto> very well
15:53:25 <redrobot> I volunteer as tribute for Barbican.
15:53:42 <yoctozepto> you are too kind
15:53:44 <spotz> :)
15:53:46 <jungleboyj> :-)
15:53:47 <gmann> nice
15:53:48 <ricolin> :)
15:53:54 * redrobot was not paying attention to PTL nomination deadline.
15:54:29 <fungi> redrobot: in your defense, we didn't provide as many warnings that it was coming up as we have in past cycles
15:55:19 <gmann> I can reach out to Mistral team for DPL model
15:55:19 <yoctozepto> gmann: should we then move mistral to dpl in the whiteboard?
15:55:20 <fungi> we actually ended up with a lot fewer "leaderless" results than in past cycles
15:55:23 <yoctozepto> gmann: ack
15:55:40 <gmann> yoctozepto: let's check with them on required liaison list or so
15:55:43 <redrobot> fungi 😅
15:55:52 <yoctozepto> gmann: yeah, I figured from your subsequent message
15:56:06 <gmann> basically we need to decide on Keystone and Zaqar
15:56:18 <jungleboyj> Wow.  Keystone ...
15:56:19 <gmann> Zaqar seems not active in last cycle
15:56:42 <spotz> Yeah my feelings too jungleboyj
15:56:43 <gmann> maybe we can also get release team input on whether they are doing a wallaby release or not
15:57:36 <fungi> knikolla was suggesting dpl for keystone
15:57:47 <yoctozepto> zaqar is not deployable by kolla nor charms
15:57:59 <yoctozepto> I think tripleo and osa do deploy it though
15:58:09 <jungleboyj> Ok.  I assume there is still enough activity there to spread out the responsibility?
15:58:37 <jungleboyj> Someone go find Brant Knudson
15:58:39 <yoctozepto> I agree with gmann that Zaqar is likely 5 - bye-bye for now
15:58:50 <fungi> for keystone? i don't get the impression keystone is dead, at least, they're on top of vulnerability reports from my vmt perspective
15:59:03 <gmann> yeah, I will ping release team on Zaqar release status
15:59:33 <gmann> agree on keystone, it is an active project, just no PTL
15:59:33 <jungleboyj> fungi:  Yeah.  If knikolla is recommending dpl, that seems fine.
15:59:38 <yoctozepto> https://review.opendev.org/q/project:openstack/releases+zaqar
15:59:57 <gmann> yoctozepto: thanks
16:00:24 <yoctozepto> nothing in wallaby whatsoever
16:01:46 <gmann> yeah
16:02:51 <belmoreira> another important data point is to understand if the project is actually used
16:03:50 <gmann> yeah, good point
16:04:10 <gmann> maybe we can check the latest user survey data
16:04:29 <jungleboyj> Makes sense.
16:05:01 <yoctozepto> yeah, on that note I already wrote regarding deployment tools
16:06:13 <gmann> we are out of time anyways. we can keep discussing it on etherpad or after meeting
16:06:14 <yoctozepto> there was not enough steam to even add it in kolla and charms :/
16:06:15 <ricolin> Projects like Heat use Zaqar to implement signaling; it would be great to send them some notification if we're gonna remove Zaqar
16:06:33 <yoctozepto> ricolin: that's interesting
16:07:00 <knikolla> o/ sorry i'm late
16:07:07 <ricolin> yoctozepto, not a hard dependency, it's just provided as one of the signal backends
16:07:21 <ricolin> I mean from heat side
16:07:27 <ricolin> knikolla, o/
16:07:48 <yoctozepto> ricolin: yeah, I've done a quick read
16:07:58 <yoctozepto> they should be happy to maintain less code :-)
16:08:10 <yoctozepto> hi knikolla
16:08:30 <yoctozepto> oh, we are past time indeed
16:08:49 <spotz> Not too badly
16:09:11 <yoctozepto> if knikolla could say a word about keystone governance model
16:09:24 <yoctozepto> we would have a (almost) complete set of information
16:09:32 <jungleboyj> \o/
16:10:05 <knikolla> None of the cores has reached out to me showing interest in taking over as PTL
16:10:21 <knikolla> And pretty much everyone has cycled through the role (or is ptl of some other project)
16:10:38 <yoctozepto> duh
16:11:23 <spotz> Maybe ping them? They might be too shy to step up?
16:11:31 <gmann> ohk, how about the DPL model? anyone interested in that, or is it something you have discussed in a keystone meeting or so?
16:12:04 <fungi> speaking from experience, it takes a lot of convincing for a former ptl to come out of retirement
16:12:27 <spotz> I was thinking more the cores
16:12:33 <knikolla> I don’t think it’s a question of shyness
16:12:38 <yoctozepto> gmann: hberaud said mistral is not releasing either
16:12:53 <fungi> spotz: he was saying basically all the keystone cores are also former keystone ptls
16:13:05 <yoctozepto> ++
16:13:08 <gmann> yoctozepto: yeah but as per Renat (former PTL) he is ok to help on that
16:13:10 <spotz> Ahhh
16:13:15 <yoctozepto> gmann: ahh, ack!
16:13:40 <bnemec> I get the impression that everyone is being pulled in other directions and doesn't feel like they have the time to commit to being PTL.
16:13:46 * bnemec sympathizes
16:13:56 <jungleboyj> bnemec:  ++
16:14:14 <yoctozepto> I could not agree more
16:14:36 <knikolla> I’ll ping the cores privately before Tuesday’s meeting, for a final attempt
16:14:46 <knikolla> Otherwise, I guess DPL will be it.
16:15:00 <gmann> +1
16:15:04 <gmann> thanks knikolla
16:15:08 <yoctozepto> ++
16:15:52 <gmann> maybe we should end the meeting
16:15:53 <yoctozepto> all right, I think we have gathered all we could
16:15:56 <gmann> mnaser: ?
16:16:01 <yoctozepto> my thoughts exactly
16:16:04 <mnaser> sorry, yes
16:16:05 <mnaser> #endmeeting