15:00:26 #startmeeting manila
15:00:26 Meeting started Thu Dec 1 15:00:26 2022 UTC and is due to finish in 60 minutes. The chair is carloss. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:26 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:26 The meeting name has been set to 'manila'
15:00:45 o/
15:01:01 \o/
15:01:08 welcome back ashrodri :D
15:01:21 O/
15:01:37 courtesy ping: vkmc vhari gouthamr
15:01:42 hi
15:03:01 o/
15:03:09 \o/
15:04:07 hi
15:04:47 hello everyone :D
15:05:11 glad to see you here today
15:05:28 we have plenty on the agenda today, so let's get started
15:05:37 #link https://wiki.openstack.org/wiki/Manila/Meetings#Next_meeting
15:05:49 hi
15:05:54 o/ carthaca
15:06:07 o/
15:06:15 o/
15:06:35 #topic Announcements
15:06:46 Schedule and Deadlines:
15:06:53 #link https://releases.openstack.org/antelope/schedule.html
15:07:20 we're on our way to m-2
15:07:38 o/
15:07:57 o/
15:08:08 you will notice that manila trackers aren't there yet
15:08:09 but
15:08:21 I proposed a patch for that purpose weeks ago:
15:08:21 \o
15:08:23 #link https://review.opendev.org/c/openstack/releases/+/864313
15:08:56 I'd say one way to make some noise is to have some zorillas voting there
15:09:57 the spec freeze is supposed to happen in 12 days, but since we didn't merge that into the official schedule and there are still some pending reviews, I'd say we can be more flexible with it
15:10:10 and also for another reason: this is a month when lots of people take holidays and PTO
15:11:56 other than that, I'd say the dates stay the same
15:12:06 we could try to buy one more week in the official schedule for the specs
15:13:08 and the second announcement is about the RBAC hackathon :D
15:13:19 the one we needed to postpone before
15:13:24 but now we have new dates!
15:13:38 S-RBAC test Hack-a-thon (vhari/lkuchlan)
15:14:07 looks like vhari is having irc troubles
15:14:11 yep :D
15:14:14 I can cover that ;)
15:14:22 ++
15:14:27 gouthamr, ack
15:14:35 gouthamr, intermittent network issues today :)
15:15:02 the new dates are next week (Dec 7th to Dec 9th)
15:16:16 so we'd have a call to kick things off on Wednesday, use this meeting's slot as a mid-term checkpoint, and maybe have another call on Friday to close the event
15:16:37 Liron will send the details to the mailing list at the beginning of next week
15:17:01 hackathons with this group are usually pretty fun, and you get a chance to learn by doing and by interacting with all the Zorillas
15:17:45 they are also good for anyone starting to get a better grasp of manila's codebase - so highly recommended to sharpen your skills and have some fun
15:19:06 so watch for the details on the openstack-discuss mailing list, and let's have some fun next week!
15:19:07 and as usual, there are excellent code examples this time too
15:19:32 a lot of small RBAC patches have shown up, mostly props to lkuchlan++
15:20:20 gouthamr: great point :D
15:20:24 lkuchlan++
15:20:28 https://tree.taiga.io/project/silvacarloss-manila-tempest-plugin-rbac/kanban - this is the kanban board we intend to use. If you can't access it, please ping me and I can grant you access!
15:21:30 that's all I had for $topic
15:21:37 is there something you'd like to share with us today?
15:23:39 taking silence as no...
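As an aside for hackathon newcomers, here is a minimal, hypothetical sketch of the shape these S-RBAC tempest tests take. It is not an actual manila-tempest-plugin test; the class name, client attribute, and UUID are placeholders. The pattern is to request credentials for the personas under test and assert that a low-privilege persona (here, project_reader) is denied a mutating operation.

```python
# Hypothetical sketch of the S-RBAC test pattern; class name, client
# attribute, and UUID are placeholders, not manila-tempest-plugin code.
from tempest.lib import decorators
from tempest.lib import exceptions as lib_exc

from manila_tempest_tests.tests.api import base


class ShareRbacReaderTest(base.BaseSharesTest):  # hypothetical class
    # tempest provisions one set of credentials per persona listed here
    credentials = ['project_admin', 'project_reader']

    @decorators.idempotent_id('9f1c0fc0-1e4b-4d1a-9c1e-0a8f8b2d7c11')
    def test_reader_cannot_create_share(self):
        # a project_reader is read-only: share creation must be forbidden
        self.assertRaises(
            lib_exc.Forbidden,
            self.shares_v2_client.create_share,
            share_protocol='NFS',
            size=1)
```

The merged patches on the kanban board above follow this persona-per-credential pattern, so reviewing one of lkuchlan's changes is the quickest way to pick up the house style.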
15:23:46 let's continue with our agenda
15:23:46 #topic CI Status
15:24:18 in previous meetings you might have seen me talking about the community-wide goal to migrate CI images to ubuntu jammy
15:24:39 https://governance.openstack.org/tc/goals/selected/migrate-ci-jobs-to-ubuntu-jammy.html
15:24:53 and there was a target date to switch the images: m-1
15:25:26 I had proposed a patch to see if our jobs were going to be okay:
15:25:29 #link https://review.opendev.org/c/openstack/manila/+/861635
15:25:38 as you can see, they were not
15:25:38 :D
15:25:56 but we didn't have time to check the failures and fix them in time
15:26:17 and when we got to m-1, the image officially changed, making our jobs fail!
15:26:39 it was impacting CI, so we decided to make the images use the previous ubuntu version for now
15:26:42 that's a band-aid
15:27:19 #link https://review.opendev.org/c/openstack/manila-tempest-plugin/+/865185
15:28:04 our jobs are using ubuntu 20 now, but we *need* to make the changes to migrate them to 22
15:28:13 the only voting job that is failing is LVM
15:28:27 I have opened bugs against all of the failing drivers, see:
15:28:30 #link https://bugs.launchpad.net/manila/+bugs?field.tag=migrate-to-jammy
15:28:46 * carloss added the migrate-to-jammy tag to all of them so it's easy to filter and check on status
15:30:09 LVM, as I said, is more critical, and it's an issue we already saw in the FIPS lvm job in the past: it fails to install quagga
15:30:29 the idea is that it should be replaced by FRR, as I state in the bug
15:30:46 thanks for pushing on this effort carloss! With the number of jobs we run, it's a huge pain moving these along as the infra changes
15:32:00 no problem gouthamr :D
15:32:50 the thing is: we have 4 jobs to fix... the bugs I reported contain the current issues the jobs are facing, but we don't know whether something else will fail beyond the issues we are catching
15:33:07 not trying to discourage anyone, but as one issue is fixed, another one might show up
15:33:50 and why am I saying I'm not trying to discourage anyone? well, I believe this is a good thing to share with the community
15:33:56 the ownership of these bugs, I mean
15:34:17 the help there would be much appreciated, and the bulk of the work would not fall on only one of us
15:35:03 so the idea is not to have one person working on all of the bugs, but to share the effort among the maintainers... it's also a good opportunity, and you would get help as you need it
15:35:45 so, is there someone here who would like to volunteer to take one of these bugs?
15:36:41 +1; I can take a look at the ceph jobs
15:36:50 and for some triaging now, I believe the priorities for the bugs should differ a bit: for LVM, as it is a voting job, the priority should be High, and for the other two it could be Medium
15:37:03 gouthamr++ thanks!
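For anyone who wants to track these bugs outside the web UI, here is a small sketch using launchpadlib to list the bugs carrying the migrate-to-jammy tag mentioned above; the consumer name is arbitrary, and this assumes launchpadlib is installed (pip install launchpadlib).

```python
# List manila bugs tagged 'migrate-to-jammy' through the Launchpad API.
from launchpadlib.launchpad import Launchpad

# anonymous login is sufficient for read-only queries
lp = Launchpad.login_anonymously(
    'manila-jammy-report', 'production', version='devel')
manila = lp.projects['manila']

# searchTasks filters bug tasks server-side by tag
for task in manila.searchTasks(tags=['migrate-to-jammy']):
    print(task.status, '-', task.bug.title)
```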
15:37:11 i think you got a start already, but the ceph nfs job has the same issue with quagga
15:37:58 so I believe I'm halfway through with the glusterfs job - I have a tentative fix, and yet it fails :|
15:38:16 oh, yes gouthamr - the switch to frr would possibly address the LVM job as well
15:40:38 okay, I can own the glusterfs bugs
15:40:45 both native and nfs
15:40:56 if you'd like to contribute, please let us know
15:41:31 jumping to the next topic:
15:41:35 #topic Review Focus
15:42:21 #link https://etherpad.opendev.org/p/manila-antelope-review-focus
15:42:49 what I'd say our eyes should be focused on atm is kpdev's spec proposal:
15:43:03 #link https://review.opendev.org/c/openstack/manila-specs/+/330306
15:43:11 I had a few interactions there, and I see haixin had some too
15:43:12 haixin++
15:43:34 (felipe_rodriguess++)
15:43:42 as he also had some reviews there
15:45:12 is there something else you'd like to shed some light on?
15:45:53 or better, to get some eyes on
15:46:09 thanks carloss
15:46:24 expecting more reviews on this spec
15:46:42 gouthamr: could you please take a look when you have some time? :)
15:48:09 * carloss jumping to the next topic :p
15:48:14 #topic Bug Triage (vhari)
15:48:20 #link https://etherpad.openstack.org/p/manila-bug-triage-pad-new (Bug Triage etherpad)
15:48:28 ty carloss ..
15:48:40 vhari: it's been a long time since we've had time to do some bug triaging :D
15:48:52 looking for initial triage of some old bugs in the queue
15:48:54 #link https://bugs.launchpad.net/manila/+bug/1996793
15:49:25 carloss, ack .. will get a jump on a few today
15:50:58 interesting one vhari
15:51:16 interesting - I think we memoize the mon IPs; they want a way to invalidate this cache without restarting the service
15:52:06 decent ask; we wanted to avoid the round trips to storage for such infrequently changing data
15:53:11 I wonder if we can use sighup as a way to do this
15:53:46 carloss vhari this is a good getting-started bug with the ceph driver imo
15:54:28 gouthamr, agreed
15:54:32 ++
15:54:45 I'd like to take this one :)
15:54:56 oh wait, I'm not thinking about old exports with that thought
15:55:12 this is probably where we can trigger ensure shares
15:55:21 if we detect the change
15:57:09 agreed
15:57:19 in terms of timeline, i'd say m-3, as we are too close to m-1
15:57:27 and medium prio
15:57:38 though if shares become inaccessible, this might be high
15:59:33 * carloss checks time
15:59:48 we have one minute left
16:00:08 sorry for leaving so little time for bug triaging vhari
16:00:11 carloss, quick wrap for bugs :)
16:00:32 thank you for joining today's meeting, zorillas!
16:00:36 see you on #openstack-manila
16:00:41 #endmeeting
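A closing note on the cache-invalidation idea from the bug triage above: the sketch below is hypothetical (not the manila CephFS driver's actual code) and only shows the general shape that was discussed - memoize the mon IPs, invalidate on SIGHUP, and fire a callback (for example, one that triggers ensure_shares) only when the addresses actually changed.

```python
# Hypothetical sketch of the mon-IP cache invalidation idea; names and
# structure are illustrative, not taken from the manila CephFS driver.
import signal


class MonIPCache:
    """Memoizes the Ceph monitor IPs to avoid per-request round trips."""

    def __init__(self, fetch_mon_ips):
        self._fetch = fetch_mon_ips  # expensive call to the backend
        self._mon_ips = None

    @property
    def mon_ips(self):
        if self._mon_ips is None:
            self._mon_ips = self._fetch()
        return self._mon_ips

    def invalidate(self):
        """Drop the cached IPs; the next access refetches them."""
        self._mon_ips = None


def install_sighup_refresh(cache, on_change):
    """Refresh the cache on SIGHUP, and call on_change (e.g. a hook that
    triggers ensure_shares so old exports get updated) only if the
    monitor addresses really changed."""
    def _handler(signum, frame):
        old = cache.mon_ips
        cache.invalidate()
        if cache.mon_ips != old:
            on_change()
    signal.signal(signal.SIGHUP, _handler)
```

In manila itself this would more likely hang off oslo.service's existing SIGHUP handling rather than a raw signal handler, which is part of what makes this a good entry-point bug for the ceph driver.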