15:00:26 <carloss> #startmeeting manila
15:00:26 <opendevmeet> Meeting started Thu Jul 28 15:00:26 2022 UTC and is due to finish in 60 minutes.  The chair is carloss. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:26 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:26 <opendevmeet> The meeting name has been set to 'manila'
15:00:42 <vkmc> o/
15:00:42 <haixin> o/
15:00:42 <carloss> courtesy ping: vkmc dviroel felipe_rodrigues ecsantos vhari gouthamr
15:00:52 <caiquemello[m]> o/
15:00:56 <MatheusAndrade[m]> o/
15:01:05 <vhari> o/
15:01:07 <ecsantos[m]> o/
15:01:23 <gouthamr> \o/
15:01:50 <nahimsouza[m]> \o
15:01:56 <luizsantos[m]> o/
15:02:43 <carloss> o/ hello everyone
15:02:46 <HelenaMylena[m]> o/
15:02:48 <felipe_rodrigues> o/
15:02:50 <dviroel> o/
15:03:13 <carloss> good quorum, we have a bunch of topics today, so let's get started
15:03:25 <carloss> today's meeting agenda:
15:03:31 <carloss> #link https://wiki.openstack.org/wiki/Manila/Meetings#Next_meeting (Meeting agenda)
15:04:00 <carloss> first topic is
15:04:09 <carloss> #topic Announcements
15:04:28 <carloss> #link https://releases.openstack.org/zed/schedule.html (Zed release schedule)
15:04:48 <carloss> according to our schedule, the new driver deadline is today
15:05:06 <carloss> we have one candidate
15:05:13 <carloss> #link https://review.opendev.org/c/openstack/manila/+/825429 (Add Macrosan Manila driver)
15:05:34 <carloss> I added some comments to this change in the last week and pinged the author yesterday
15:05:57 <carloss> the author mentioned that the only thing left to submit a new patch set is working on unit tests
15:06:38 <carloss> I think we may need an exception - and it would be good to have more eyes on this change too
15:06:50 <gouthamr> do we?
15:07:01 <gouthamr> i mean, there's still time for feature freeze
15:08:00 <carloss> yes, can wait until feature freeze
15:08:03 <gouthamr> however, sooner the better of course since they've been working on the driver for a couple of releases now
15:08:37 <gouthamr> ++ on adding more reviewers, thanks carloss
15:11:17 <carloss> thanks gouthamr
15:11:49 <carloss> and Feature Proposal Freeze is next week
15:12:07 <carloss> which means that all new manila features must be proposed and substantially completed, with unit, functional and integration tests by the end of the week
15:13:23 <carloss> I'm unsure if someone will need some extra time for that, but we can chat about it when the time comes
15:14:15 <carloss> that's all I had for $topic
15:14:28 <carloss> do you have an announcement you would like to share with us today?
15:15:16 <gouthamr> https://lists.openstack.org/pipermail/openstack-discuss/2022-July/029757.html seems like good news
15:15:28 * gouthamr is hoping we really did have stable CI for a week :)
15:16:20 <carloss> indeed gouthamr - and the script was fixed, you were correct when mentioning that the numbers seemed a bit off when I first shared this
15:17:42 <carloss> but yes, it's a good indication of stable CI
15:17:52 <carloss> on to the next topic...
15:18:03 <carloss> #topic In person PTG attendance
15:18:07 <gouthamr> ack, it got us looking and correcting ourselves :)
15:18:14 <carloss> ++ :)
15:18:40 <carloss> #link https://openinfra.dev/ptg/ (OpenInfra PTG)
15:19:01 <carloss> after some virtual-only PTGs, we will finally have an in-person PTG!
15:19:31 <carloss> it will run from Oct 17th to Oct 20th, in Columbus, Ohio
15:19:54 <carloss> and I wanted to ask here if you have plans of attending in person or virtually
15:20:08 <carloss> I have already created our etherpad
15:20:18 <carloss> #link https://etherpad.opendev.org/p/columbus-ptg-manila-planning (Columbus PTG manila planning etherpad)
15:20:44 <carloss> there's an attendees list in the etherpad
15:21:19 <carloss> it will not only occur in person; it will also happen virtually, so people not on-site can connect as well
15:21:42 <carloss> please add your name if you will be participating in the PTG, and note next to it whether you intend to participate remotely or in person
15:24:24 <sfernand> nahimsouza[m] and I will attend in person
15:24:34 <carloss> so we can have an idea of how many people to expect there and how many people will attend virtually
15:24:38 <carloss> great sfernand :D
15:25:12 <ashrodri> i will also be attending in person :D
15:25:36 <carloss> awesome! I do intend to as well
15:25:42 <carloss> it will be nice to have a Zorilla reunion
15:25:55 <carloss> we could think of a team dinner at some day :)
15:26:30 <sfernand> that would be great :)
15:26:31 <gouthamr> ++
15:26:57 <carloss> also, please add your topics to the etherpad so we can plan accordingly
15:26:58 <sfernand> looking forward to meeting some of you guys in person
15:27:15 <carloss> sfernand++
15:27:36 <gouthamr> agreed, i really hope the world stays somewhat sane and people can travel
15:28:16 <carloss> indeed
15:28:34 <carloss> cool, on to the next topic :)
15:28:51 <carloss> #topic Holding CEPH upgrades
15:29:12 <carloss> the Rook community has asked users not to upgrade to 17.2.2
15:29:40 <carloss> this is because they found a few issues that lead the ceph mgr to crash
15:29:46 <carloss> #link https://tracker.ceph.com/issues/56700 (Ceph mgr crashes)
15:30:23 <carloss> also, they found a FIPS issue that will be addressed soon
15:30:27 <carloss> #link https://tracker.ceph.com/issues/56727 (FIPS issue)
15:30:59 <carloss> so the recommendation is that if you plan to upgrade to 17.2.2, hold that thought for a bit
15:31:18 <carloss> a new release fixing those issues (17.2.3) should be out soon
15:31:36 <carloss> gouthamr - is there something else you would like to add to this? :)
15:32:45 <gouthamr> nope you covered it well
15:33:15 <carloss> thanks!
15:33:32 <gouthamr> ah just one point: this applies to all ceph clusters, not just ones deployed with rook
15:34:55 <gouthamr> our CI currently tests with ceph pacific (16.x); but vkmc will be moving testing to quincy soon on the main branch
15:36:45 <carloss> ah, thanks for the additional information
15:37:50 <carloss> moving on...
15:37:59 <carloss> #topic Review Focus
15:38:06 <carloss> #link https://etherpad.opendev.org/p/manila-zorilla-review-focus (Review Focus Etherpad)
15:39:13 <carloss> some changes to highlight were already mentioned (i.e. the new driver addition)
15:39:45 <carloss> and something we have been highlighting for a couple of weeks: our OSC patches
15:40:12 <carloss> the list is now smaller (thanks to everyone's effort)
15:40:16 <carloss> #link https://review.opendev.org/q/topic:bp/openstack-client-support+status:open
15:40:19 * gouthamr is looking at https://review.opendev.org/c/openstack/python-manilaclient/+/816401
15:40:29 <gouthamr> will workflow it soon
15:40:36 <carloss> good stuff, thanks gouthamr :D
15:42:05 <carloss> the other commands are close to getting merged too
15:42:13 <carloss> the only ones remaining will be the share server migration ones
15:42:25 <carloss> whose implementation francie is currently working on
15:43:15 <carloss> thank you for the effort to get these changes in!
15:44:02 <carloss> we still have a couple of functional tests to merge, and the changes are moving as well
15:44:06 <carloss> #link https://review.opendev.org/q/topic:osc-functional-tests+status:open
15:44:45 <carloss> some are in merge conflict because other changes have been merged and we are all touching one common file
15:45:14 <carloss> is there a change you would like to have reviewers attention on?
15:47:14 <carloss> if not, we can stick to looking at the changes in the focus etherpad. if you want to add other changes there, please do so :)
15:48:52 <carloss> we still have some more minutes for triaging bugs, so...
15:48:56 <carloss> #topic Bug Triage (vhari)
15:49:11 <vhari> carloss, ack ty  so let's dive in :)
15:49:12 <vhari> #link https://bugs.launchpad.net/manila/+bug/1982808
15:49:47 <vhari> looking for minor triage ..
15:49:58 <vhari> per last comment, status moved to in progress
15:50:45 <carloss> sfernand: thanks for the bug report
15:51:24 <carloss> do you already have a fix in place for this issue? or someone currently working on it?
15:51:40 <gouthamr> so, the driver was optimizing by avoiding "snapmirror release" for the source-->dest
15:51:52 <gouthamr> that optimization isn't necessary?
15:52:20 <gouthamr> optimizing because in case the relationship is reversed in the future, it'll be faster to sync the data back
15:53:02 <sfernand> we still avoid release before doing a resync and recreating the relationships
15:53:32 <sfernand> but after that the ontap engineers' recommendation is to release the old destinations
15:53:56 <gouthamr> i see
15:53:58 <sfernand> from the original source
15:54:16 <gouthamr> okay makes sense
15:54:18 <sfernand> This ensures locks are properly released
15:54:31 <gouthamr> can you link the patch to the bug? the bot's missed it
15:55:14 <sfernand> And ontap can clean up snapshots later.
15:55:14 <sfernand> Sure! I will check if I did something wrong in the patch as well
15:55:23 <gouthamr> ack, makes sense - the optimization gained by preserving prior relationships was just in theory i think
15:55:25 <gouthamr> can this be a low, zed-3?
15:55:40 <sfernand> Don’t know why it is not in the comments :(
15:55:59 <gouthamr> the bot ignores things after the first patch
15:56:33 <gouthamr> so if the first patch didn't have the bug reference (or had it, with incorrect syntax), it'll mess up the updates
15:56:33 <sfernand> Yes, in a fanout configuration all replicas will have at least one snapshot in common
15:57:43 <sfernand> and that is all that ontap requires to avoid creating a new baseline
15:58:23 <sfernand> I will update the bug with the link to the patch, sorry about that
15:58:44 <gouthamr> no worries, thanks sfernand
15:58:56 * carloss checks time
15:59:20 <vhari> sfernand++ ty I updated the bug with comments
15:59:33 <vhari> carloss, back to you :)
15:59:34 <carloss> ++ on low and z-3
15:59:47 <carloss> thank you vhari gouthamr sfernand :)
16:00:09 <carloss> let's continue the chatter in #openstack-manila
16:00:16 <carloss> thank you for joining today's meeting!
16:00:16 <sfernand> :)
16:00:20 <gouthamr> thanks!
16:00:22 <carloss> #endmeeting