15:01:14 #startmeeting tc
15:01:15 Meeting started Thu Feb 4 15:01:14 2021 UTC and is due to finish in 60 minutes. The chair is mnaser. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:01:16 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:01:18 The meeting name has been set to 'tc'
15:01:24 #topic rollcall
15:01:27 o/
15:01:29 o/
15:01:34 o/
15:01:48 o/
15:01:55 o/
15:02:47 #topic Follow up on past action items
15:03:00 #link http://eavesdrop.openstack.org/meetings/tc/2021/tc.2021-01-28-15.00.html
15:03:24 diablo_rojo complete retirement of karbor
15:03:42 * mnaser looks up review
15:03:53 #link https://review.opendev.org/c/openstack/governance/+/767056
15:04:01 #link https://review.opendev.org/c/openstack/project-config/+/767057
15:04:03 that seems to have landed
15:04:34 any other tc members want to vote on the governance repo change so we can land that change?
15:04:36 yeah, the few remaining usage patches are the only ones left #link https://review.opendev.org/q/topic:%2522retire-karbor%2522+status:open
15:04:44 especially system-config and openstack-map
15:05:21 osf/openstack-map i think ttx can help with that
15:05:49 and for opendev/system-config maybe we can ask fungi or clarkb for help with that one
15:05:53 recheck them to get zuul green
15:06:11 ah, we have a rebase issue https://review.opendev.org/c/osf/openstack-map/+/767236/2/deployment_tools.yaml
15:06:14 tc-members: please review https://review.opendev.org/c/openstack/governance/+/767056 so we can land this
15:06:16 i can fix that quickly
15:06:28 i'll take a look at the opendev patches for that now
15:06:31 we can remove that action item for the upcoming meetings, i think it's just an open review topic for now
15:06:37 thank you fungi !
15:06:41 +1
15:07:21 next up was tc-members review and leave comments on the etherpad for SIG updates <= i think we can leave that for the discussion item later
15:07:37 mnaser continue to sync with tripleo team to reduce usage < also can be left for the discussion point later
15:07:58 qa team to land parallelizing changes for devstack < we can probably leave this also for the gate performance discussion point if y'all are okay with it?
15:08:22 ++
15:08:38 ok cool
15:08:40 #topic Audit SIG list and chairs (diablo_rojo)
15:09:10 tc-members: https://etherpad.opendev.org/p/2021-SIG-Updates -- please review this to help diablo_rojo and ricolin go over auditing those
15:09:27 as well as the attached reviews
15:09:39 #link http://lists.openstack.org/pipermail/openstack-discuss/2021-January/019994.html
15:10:38 ok, time to get into the heavier subjects :)
15:10:45 #topic Gate performance and heavy job configs (dansmith)
15:10:55 yo
15:11:23 on my side, I reached out to the TripleO team and they most definitely are working on improving/reducing the footprint -- it was brought up in their meeting a few days ago
15:11:31 so, I've been working (with gracious help from fungi and clarkb) on trying to figure out what metrics we can generate and use to try to quantify the relative weights of jobs in each project
15:11:37 and the results are pretty interesting
15:12:03 the two worst offenders, by far, are tripleo and neutron, and I too have had good responses from both teams when I reached out
15:12:16 #link http://eavesdrop.openstack.org/meetings/tripleo/2021/tripleo.2021-02-02-14.00.log.html#l-37
15:12:21 I've got a hacky tool to do some measurement and I'm planning an email to the list after this meeting to share some of the important numbers
15:12:43 I hope that some awareness will yield some initial good-faith trimming of jobs
15:12:49 +1
15:13:06 I think the TC could also use this data and try to determine some target "weights" for jobs to try to keep things reasonable,
15:13:28 because again this week, lag times have been pretty bad.. not horrific, but definitely more than a day to get anything useful done
15:13:36 in particular, i like that your tool can show the aggregate node utilization for a single build, which makes it more striking in side-by-side comparisons between different jobs what the relative costs for them are
15:13:41 also, this could be a good thing to monitor in the Tact SIG
15:14:06 something like weekly or monthly usage reports on the ML
15:14:15 gmann: yep, discussions so far have been in the context of the tact sig, in our irc channel
15:14:25 +1
15:14:31 yeah, I'd like to revisit this periodically and look at trends and identify things that make jobs suddenly worse
15:14:48 but we also really need to get people to work on fixing the spurious failures, of which we have a lot these days
15:14:57 as well as turning an eye to improving the efficiency of what we're testing
15:15:10 on that note too, the tripleo team brought up a really good point, which was -- having a target set for how much utilization they should have
15:15:26 "you're using too much" -- "how much should we be using" is a very fair question to ask
15:15:32 yeah, i get the impression a lot of these jobs test mostly the exact same things, rather than only testing what other jobs aren't covering
15:15:36 yeah, and this script gives us a way to weigh a patch, which means we can then say "should be X instead of currently Y"
15:15:58 fungi: yes, we really need to spend time making sure that's not too overlappy
15:16:18 I think tripleo is probably better than most at only testing a smaller subset, especially after they make their proposed changes
15:16:19 and also, combining jobs that test with the same config can be helpful; there are a lot of jobs separated out per set of tests rather than per config
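(As a rough illustration of the kind of per-job weighting discussed above -- not the actual tool dansmith describes -- the sketch below sums recent build durations per job for one project using the Zuul builds API; the endpoint, field names, example project, and the one-node-per-build simplification are all assumptions.)

import collections

import requests

# Assumed OpenDev Zuul API endpoint; adjust the tenant/URL as needed.
API = 'https://zuul.opendev.org/api/tenant/openstack/builds'

def job_hours(project, limit=500):
    """Sum recent build durations (in hours) per job for one project."""
    builds = requests.get(API, params={'project': project, 'limit': limit}).json()
    totals = collections.Counter()
    for build in builds:
        # 'duration' is in seconds and may be null for unfinished builds.
        if build.get('duration'):
            totals[build['job_name']] += build['duration'] / 3600.0
    return totals

if __name__ == '__main__':
    for job, hours in sorted(job_hours('openstack/nova').items(), key=lambda kv: -kv[1]):
        print(f'{hours:8.1f}h  {job}')
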
15:16:28 partially because one run is so heavy that they're not inclined to add more of those
15:16:32 but nova has a ton of overlap,
15:16:40 and I've already proposed some reduction there and more can be done
15:17:27 so anyway, that's my status for the moment
15:17:35 i do think it does still bring back a really good point of "how much usage should we be at?"
15:17:43 projects have historically never had any 'quotas'
15:17:44 on the tripleo question: can we set a cap on each project's utilization? it should be based on the nature of each project's testing, not equal for every project
15:17:55 yeah, that one
15:18:00 gmann: we can't, currently
15:18:16 there is fair queuing across projects,
15:18:36 but projects like tripleo have many trees, and it's definitely not a quota
15:18:52 I have to step away for something important right now, so we can move on
15:19:37 yeah, but we could set the current working level (after optimization, once we agree the current level is fine) as a quota, so that it does not grow in the future, or at least so we can see if it does
15:20:00 it looks like we do have some actionable things out of this, which is reaching out to the ML and bringing this up, and it seems within the tact sig there is work and effort
15:20:09 in order to try and watch what is happening
15:20:19 well, a target quota is good, but doesn't enforce anything, and enforcing quotas is not an option at the moment
15:20:28 yup yup
15:20:33 look for my email to the list soon
15:20:42 yes, i agree, just a target that we can all agree on :)
15:20:54 It would seem having a benchmark so we see when a project is suddenly consuming more is at least a good starting point.
15:21:06 Raise a flag before things get bad.
15:21:12 yeah
15:22:27 #action dansmith reach out to ML regarding node usage
15:22:50 #action tact-sig review/trend project usage to raise flags before things "get bad"
15:22:55 #topic Dropping lower-constraints testing from all projects (gmann)
15:23:06 ++
15:23:44 #link http://lists.openstack.org/pipermail/openstack-discuss/2021-January/019672.html
15:23:52 i know there were a few alternatives proposed which were being implemented in nova
15:24:14 I am testing with direct deps only in nova and placement
15:24:18 #link https://review.opendev.org/q/topic:%22l-c-direct-deps-only%22+(status:open%20OR%20status:merged)
15:24:35 it is in progress as it is not yet green
15:24:46 once I finish that I can post the results on the ML
15:25:10 should we keep this a topic to continue to discuss in the TC meeting?
15:25:13 just be prepared to rip those out when stable branches are created. if you're not constraining your indirect dependencies then they will change over time and become incompatible
15:25:37 for stable, I think we should remove it
15:26:07 mnaser: we need to get consensus in the TC, but maybe after we finish the discussion on the ML
15:26:19 ok so i think it is still productive to keep it open
15:26:24 maybe we can drop it for now and once we are ready to discuss in the TC we can add it back
15:26:46 i'm okay with that too
15:26:46 yeah, either way is ok,
15:26:53 so it sounds like the direction is to keep the job on master only, and for direct dependencies only?
15:27:19 rosmaita: I am testing whether that works fine or needs more work; will post the result on the ML
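(As a rough sketch of the direct-deps-only idea being tested -- not the actual nova/placement changes -- the snippet below trims a lower-constraints.txt down to just the packages named in requirements.txt, dropping pins for indirect dependencies; the file names and approach are assumptions.)

from packaging.requirements import Requirement
from packaging.utils import canonicalize_name

def requirement_names(path):
    """Collect canonical package names from a pip requirements-style file."""
    names = set()
    with open(path) as handle:
        for line in handle:
            line = line.split('#', 1)[0].strip()
            if line:
                names.add(canonicalize_name(Requirement(line).name))
    return names

direct = requirement_names('requirements.txt')
with open('lower-constraints.txt') as src, open('lower-constraints-direct.txt', 'w') as dst:
    for line in src:
        entry = line.split('#', 1)[0].strip()
        # Keep blank/comment lines, and pins only for direct dependencies.
        if not entry or canonicalize_name(Requirement(entry).name) in direct:
            dst.write(line)
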
15:27:31 ok, great
15:27:41 let me add a test for cinder too
15:27:49 i have no objection to that idea
15:28:34 yeah, that surely can reduce the maintenance effort
15:29:16 ok cool, gmann: your call on keeping it on the discussion list
15:29:24 sorry, agenda rather
15:29:32 ok
15:30:16 I can drop or keep it based on the ML discussion; if there is nothing to discuss then drop it
15:30:23 before next Wed
15:30:29 ok, that sounds like a good idea
15:30:33 * diablo_rojo sneaks in late
15:30:47 #action gmann continue to follow-up on using direct dependencies for l-c jobs
15:30:58 o/ diablo_rojo
15:31:00 #topic Mistral Maintenance (gmann)
15:31:07 #link http://lists.openstack.org/pipermail/openstack-discuss/2021-February/020137.html
15:31:14 * jungleboyj marks diablo_rojo tardy
15:31:20 didn't.. or does.. tripleo use mistral?
15:31:36 hmm, I do not know that.
15:31:43 not anymore, I don't think
15:31:44 could be wrong
15:31:47 i thought they did at one point
15:31:56 but yeah, i vaguely recall they stopped a while back
15:32:15 i think they dropped mistral in favour of using ansible
15:32:17 i.. think
15:32:24 https://specs.openstack.org/openstack/tripleo-specs/specs/ussuri/mistral-to-ansible.html
15:33:02 which was implemented in ussuri-3
15:33:05 #link https://blueprints.launchpad.net/tripleo/+spec/tripleo-mistral-to-ansible
15:33:09 so that was a big consumer that dropped off
15:33:14 #link https://review.opendev.org/q/project:openstack/mistral
15:33:47 ~100 changes or so in the past year
15:34:07 the other project using it is Tacker, where it is an optional integration afaik and Tacker works without mistral too. As discussed with one of the Tacker members, they will discuss it and see what they can do, either help with mistral or deprecate the support.
15:34:40 i think it will become more tricky when we have to figure out a ptl for the project
15:34:56 but maybe a distributed leadership model could work because it is a very low volume project right now
15:35:09 yeah, Renat mentioned he will not be able to run as PTL in the next election
15:35:42 mnaser: but do we have any maintainers there other than Renat?
15:36:01 maybe there are.
15:36:02 i see some patches by `ali.abdelal@nokia.com`
15:36:06 https://review.opendev.org/c/openstack/mistral/+/755481
15:36:14 and i see Eyal making reviews too
15:36:18 yeah
15:36:49 #link https://review.opendev.org/admin/groups/19588f2c593d37749e32bc5400365f654c683d19,members
15:36:50 Distributed leadership would be helpful, but let's see
15:37:30 do we know the user adoption of Mistral?
15:37:35 another thing: i was thinking of adding it to the newsletter to convey that it needs maintainers...
15:37:56 belmoreira: good point.
15:37:57 belmoreira: that's a good question, maybe user survey data can help with this
15:38:03 might be able to figure out some percentage of user survey respondents who claimed to be using it?
15:38:07 gmann: maybe we can reach out to mistral-core before that
15:38:08 jinx
15:38:21 one user I know of is via Tacker at NEC/NTT Japan, where we are checking the option to go the non-mistral way
15:38:22 mnaser *I'm trying to find out*
15:38:43 mnaser: +1,
15:38:54 * mnaser looks at aprice
15:39:11 oh wait, i'm blind, there is a 'projects used in production deployments'
15:39:22 under 'deployments' https://www.openstack.org/analytics
15:39:46 it's amazing how she answers your questions before you even know you have them
15:39:51 so this shows .. i think .. 8% production, 4% testing, 13% interested
15:40:09 yeah
15:40:24 this is 2019 data, it would be great to get the latest
15:40:35 maybe aprice can help
15:40:43 all it shows to me is that we need to drop a few projects so that graph won't be unreadably crowded ;)
15:40:51 o/
15:41:01 fungi: ++
15:41:09 fungi :)
15:41:12 in another meeting simultaneously - is there a tl;dr on the current user survey request?
15:41:25 aprice: 2020 data :)
15:42:02 fungi: or divide it into separate categories based on usage, like high vs low or so?
15:42:05 ah yes, i didn't realize that wasn't on o.o/analytics yet. i can work with jimmy to get that live soon
15:42:11 probably in the next 1-2 days
15:42:17 cool, awesome!
15:42:19 but if there's a specific data point y'all need sooner, lmk
15:42:30 aprice: % of usage of mistral in prod environments
15:42:50 mnaser: ok - i can pull that in like an hour or so and drop it back in this channel
15:42:56 aprice: yay, thank you!
15:42:59 np!
15:43:09 so we'll have that info; would anyone like to volunteer to reach out to the current maintainers?
15:43:46 for 'you need help'?
15:44:09 yes, to try and see if any of them would like to run for ptl next release
15:44:15 ok
15:44:18 so we can try to get ahead of the troubled waters ahead :-)
15:44:18 i can do that
15:44:24 thanks gmann
15:44:37 #action gmann reach out to mistral maintainers about PTL for next cycle
15:44:38 gmann, let me help you with that too
15:44:44 #undo
15:44:45 Removing item from minutes: #action gmann reach out to mistral maintainers about PTL for next cycle
15:44:49 #action gmann/ricolin reach out to mistral maintainers about PTL for next cycle
15:44:50 :>
15:44:51 sure
15:45:28 so with the combination of anyone stepping up + usage in the wild as suggested by belmoreira, we should be able to make a decision on the project's direction
15:45:55 yeah
15:46:25 cool, thanks for bringing it up, gmann
15:46:26 #topic Open Reviews
15:46:30 #link https://review.opendev.org/q/project:openstack/governance+is:open
15:47:18 i'm going to go over a few that need to be merged now
15:50:12 https://review.opendev.org/c/openstack/governance/+/770616
15:50:16 this really could use more votes
15:50:56 https://review.opendev.org/c/openstack/governance/+/773383 can also help the osa team land things
15:52:07 done
15:52:21 Done.
15:52:31 I want to read the cool-down one again. Will do that.
15:54:34 cool, dansmith had some concerns too, so a review on https://review.opendev.org/c/openstack/governance/+/770616 would be appreciated when possible
15:54:58 Done. I think the wording is positive enough while communicating our reasons.
15:55:01 yeah, I don't think my concerns were addressed, but they weren't super important
15:55:11 Don't quit, but work on what you want to do.
15:56:03 dansmith: i'm happy to revise if you feel strongly about the concerns :)
15:56:19 I know, I'll try to hit it again after this
15:56:32 cool, no problem
15:57:05 and also some of those applications for projects that have historically been very interop-y -- https://review.opendev.org/c/openstack/governance/+/773684 (cinder) and https://review.opendev.org/c/openstack/governance/+/773090 (neutron)
15:57:11 those can't land yet, but early reviews are helpful
15:58:10 gmann: this might be at a standstill https://review.opendev.org/c/openstack/governance/+/771785 -- i think we'll have to ping the monasca team to properly deprecate it
15:58:35 Merged openstack/governance-sigs master: Add Dirk Mueller as Packaging SIG co-chair https://review.opendev.org/c/openstack/governance-sigs/+/772881
15:58:37 yeah, I will check with them and see if they need help
15:58:56 cool, awesome, we were able to clear out most of our queue
15:59:07 \o/
15:59:32 :)
15:59:42 thanks for coming everyone!
15:59:43 Merged openstack/governance master: Add rbd-iscsi-client to cinder project https://review.opendev.org/c/openstack/governance/+/772597
15:59:52 very productive
15:59:54 #endmeeting