15:00:25 <gouthamr> #startmeeting manila
15:00:26 <openstack> Meeting started Thu Apr 23 15:00:25 2020 UTC and is due to finish in 60 minutes.  The chair is gouthamr. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:27 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:29 <openstack> The meeting name has been set to 'manila'
15:00:36 <dviroel> o/
15:00:39 <lseki> o/
15:00:47 <gouthamr> courtesy ping: xyang toabctl ganso vkmc amito carloss tbarron
15:00:54 <danielarthurt> o/
15:00:57 <carthaca> Hi
15:01:01 <andrebeltrami> hi
15:01:04 <vkmc> o/
15:01:05 <carloss> hi :)
15:01:05 <tbarron_> hi
15:01:34 <gouthamr> hello everyone o/
15:01:47 <gouthamr> The agenda for this meeting is here: https://wiki.openstack.org/wiki/Manila/Meetings#Next_meeting
15:01:59 <gouthamr> Let's get started
15:02:04 <gouthamr> #topic Announcements
15:03:10 <gouthamr> First off, I hope you are all still keeping good health! Many of you have been working hard to help us get to this milestone in the release, despite everything that's happening in the physical world around us
15:03:24 <gouthamr> So I'll begin with thanks, and some release news:
15:03:43 <gouthamr> We shipped manila-ui in the past week
15:03:44 <gouthamr> #link http://lists.openstack.org/pipermail/release-announce/2020-April/008879.html (Manila UI 3.0.0)
15:03:51 <dviroel> \o/
15:04:05 <gouthamr> it's a major version release because we dropped support for python2 and some django dependencies
15:04:05 <vkmc> yaay
15:04:45 <gouthamr> that release leaves us with two more Ussuri deliverables to ship very soon
15:04:49 <gouthamr> Ussuri RC1 for manila is due today
15:04:55 <gouthamr> #link https://releases.openstack.org/ussuri/schedule.html (Ussuri Release Schedule)
15:05:00 <vhari> o/
15:05:25 <gouthamr> following this, we'll be releasing manila-tempest-plugin
15:05:32 <gouthamr> we'll discuss that in a bit
15:05:49 <gouthamr> I wanted to call your attention to a couple of things
15:06:19 <gouthamr> thanks to vhari for her research, we've now set an automatic expiry on LP bugs
15:06:25 <gouthamr> #link https://bugs.launchpad.net/manila/+expirable-bugs (Expirable bugs)
15:06:57 <gouthamr> The criteria are spelled out here:
15:07:10 <gouthamr> #link https://help.launchpad.net/Bugs/Expiry (Bug expiry criteria/help)
15:07:57 <gouthamr> tl;dr: a bot will clean up bugs that have been marked Incomplete and have had no updates for 60 days
15:08:32 <gouthamr> there's a catch though: the bug also has to have no assignee or milestone; this helps us keep track of issues where the reporter has stepped away
15:09:12 <gouthamr> or things that have been languishing for a few releases and are clearly not aligned with project priorities
15:10:15 <tbarron_> nice
15:10:25 <gouthamr> if you have any questions regarding this, please let me know - if you're a bug subscriber you should see mail regarding these expired bugs
15:11:17 <gouthamr> there are at least a handful of folks I know who are watching, including the reporters, so there's less worry about accidentally closing something important
15:11:37 <gouthamr> speaking of making things easy, i finally got around to adding a bug template :|
15:11:40 <gouthamr> #link https://bugs.launchpad.net/manila/+filebug
15:12:37 <gouthamr> it's not mandatory to follow the template all the time, but I've covered the questions we usually ask reporters as follow-up - so we can reduce some back-and-forth
15:12:46 <carloss> gouthamr ++
15:12:58 <vhari> ++
15:12:59 <tbarron_> gouthamr: that's really great
15:13:06 <gouthamr> awesome
15:13:10 <dviroel> awesome
15:13:22 <gouthamr> speaking of improvements, we merged new docs!
15:13:34 <gouthamr> #link https://docs.openstack.org/manila/latest/contributor/contributing.html (So You Want to Contribute…)
15:13:53 <gouthamr> #link https://docs.openstack.org/manila/latest/contributor/project-team-lead.html (Manila Project Team Lead Guide)
15:14:33 <gouthamr> ^ both interesting reads, and aimed at new and existing contributors - please take a look and suggest improvements!
15:14:51 * tbarron_ suspects that under gouthamr's leadership manila has become a model openstack project
15:15:00 <gouthamr> Alright, that's all i had in terms of announcements today, anyone else got anything?
15:15:07 * vkmc suspects the same
15:15:09 <vkmc> this is great
15:16:02 * gouthamr day 499.. dear diary, they still don't realize the long con :)
15:16:26 <dviroel> lol
15:16:39 <gouthamr> alright, let's move on..
15:16:44 <gouthamr> #topic Victoria PTG Planning
15:16:57 <gouthamr> you're aware of this planning etherpad:
15:16:58 <gouthamr> #link https://etherpad.opendev.org/p/vancouver-ptg-manila-planning (Victoria PTG Planning Etherpad)
15:17:55 <gouthamr> thank you for adding topics there; it's natural that some of us haven't paid attention to it, given the distraction of the current release
15:18:35 <gouthamr> but I urge you to add even abstract to-dos to the etherpad as early as possible, because the foundation wants us to sign up for slots
15:18:42 <gouthamr> #link https://ethercalc.openstack.org/126u8ek25noy (PTG Slot Signup sheet)
15:19:15 <gouthamr> i'm open to ideas regarding the scheduling
15:19:58 <gouthamr> as usual, we'll try to avoid co-scheduling anything with other projects that you may be contributing to, or during cross project discussions that affect us
15:20:44 <gouthamr> I feel like we might prefer one- or two-hour slots, like the old design summits
15:21:23 <gouthamr> to dive deep into individual topics, and perhaps spend some time at the end on a bunch of smaller miscellaneous topics
15:22:18 <gouthamr> we have some time to decide; as you can see, not a lot of slots are taken, since other project teams are still deciding as well
15:23:31 <gouthamr> so please add your topics, and we'll poll some time slots
15:23:44 <gouthamr> anything else regarding $topic?
15:24:12 <gouthamr> #topic Reviews needing attention
15:24:34 <gouthamr> we're tracking a few open reviews left for RC1 and beyond here:
15:24:35 <gouthamr> #link https://etherpad.openstack.org/p/manila-ussuri-review-focus
15:25:10 <gouthamr> thanks a ton for getting things merged yesterday and today, we have very few left to go
15:25:17 <gouthamr> #link https://launchpad.net/bugs/1858328
15:25:17 <openstack> Launchpad bug 1858328 in Manila "Manila share does not get into "shrinking_possible_data_loss_error" status when shrinking a share" [Low,Fix released] - Assigned to Daniel Tapia (danielarthurt)
15:25:24 <gouthamr> https://review.opendev.org/#/c/713867/ (Follow up change - Update share-manager behavior for shrink share operation - depends on tempest change: https://review.opendev.org/#/c/715758/)
15:25:47 <gouthamr> seems like danielarthurt refreshed the tempest side change
15:26:21 <gouthamr> but the main change hasn't had many eyes:
15:26:27 <gouthamr> #link https://review.opendev.org/#/c/713867/
15:27:02 <gouthamr> vkmc carloss can you please take a look ^
15:27:09 <carloss> sure
15:27:37 <gouthamr> this can be backported at a later time too, so i wouldn't worry about it missing rc1
15:27:38 <danielarthurt> @gouthamr Just updated with your suggestions :D
15:27:43 <gouthamr> thanks danielarthurt
15:28:01 <gouthamr> next,
15:28:09 <gouthamr> #link https://launchpad.net/bugs/1860061
15:28:09 <openstack> Launchpad bug 1860061 in Manila "Totalcount returned by pagination query is wrong and filters should before pagination query " [Medium,In progress] - Assigned to MaAoyu (maaoyu)
15:28:30 <gouthamr> zuul doesn't like the latest patch https://review.opendev.org/#/c/703025/
15:28:58 <gouthamr> and it looks like your suggestions were not taken, carloss ?
15:29:19 <carloss> yep...
15:29:24 <carloss> only the commit message ones
15:29:33 <dviroel> =/
15:29:49 <gouthamr> oh, so is the change functionally correct?
15:30:24 <carloss> nope... there is a bug in the implementation
15:30:46 <gouthamr> ouch, please let MaAoyu know
15:31:04 <gouthamr> wait, i've already retargeted this one
15:31:19 <gouthamr> so no rush :) it's still a worthy backport when we get it in
15:31:48 <carloss> alright... Either way I'll comment on the change again
15:32:00 <gouthamr> realized this is the same for LP 1858328
15:32:00 <openstack> Launchpad bug 1858328 in Manila "Manila share does not get into "shrinking_possible_data_loss_error" status when shrinking a share" [Low,Fix released] https://launchpad.net/bugs/1858328 - Assigned to Daniel Tapia (danielarthurt)
15:32:01 <gouthamr> thanks carloss
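[Editor's note: for reference, a minimal sketch of the ordering that the LP 1860061 bug title calls for, applying filters before pagination and deriving the total count from the filtered query. This is illustrative only; the model and helper below are assumptions, not the patch under review at change 703025.]

```python
import sqlalchemy as sa
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Share(Base):
    """Minimal stand-in model for illustration; not manila's real model."""
    __tablename__ = 'shares'
    id = sa.Column(sa.String(36), primary_key=True)
    status = sa.Column(sa.String(255))


def list_shares(session, filters, limit=None, offset=None):
    """Return (page_of_shares, total_count) with filters applied first."""
    query = session.query(Share)
    # Apply every filter before counting or paginating, so total_count
    # and the returned page describe the same result set.
    for attr, value in filters.items():
        query = query.filter(getattr(Share, attr) == value)
    total_count = query.count()
    if offset:
        query = query.offset(offset)
    if limit:
        query = query.limit(limit)
    return query.all(), total_count
```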
15:32:18 <gouthamr> let's use this tracker for the remaining RC items:
15:32:21 <gouthamr> #link https://launchpad.net/manila/+milestone/ussuri-rc1
15:32:34 <gouthamr> only one bugfix left:
15:32:36 <gouthamr> https://bugs.launchpad.net/manila/+bug/1873963
15:32:36 <openstack> Launchpad bug 1873963 in Manila "NetApp driver triggers 'peer-accept' when working with intra cluster replicas" [Medium,In progress] - Assigned to Douglas Viroel (dviroel)
15:32:47 <gouthamr> easy one
15:32:58 <gouthamr> just W+1'ed the fix: https://review.opendev.org/#/c/722124/
15:33:11 <dviroel> gouthamr: thanks!
15:33:43 <gouthamr> cool, that's the final change we're tracking for RC1 ^
15:33:55 <gouthamr> if there's anything else anyone is aware of, speak up now
15:34:28 <gouthamr> barring any exceptions, we will propose an RC1 right after https://review.opendev.org/#/c/722124/ merges
15:34:31 <carthaca> oh no, I was hoping to get the fix for LP 1860061 soon :(
15:34:31 <openstack> Launchpad bug 1860061 in Manila "Totalcount returned by pagination query is wrong and filters should before pagination query " [Medium,In progress] https://launchpad.net/bugs/1860061 - Assigned to MaAoyu (maaoyu)
15:34:59 <carthaca> but this may only affect us; it is a blocker for upgrading to Train
15:35:30 <gouthamr> carthaca: it might happen soon, i dunno where MaAoyu works from, there's been some latency between changes and reviews
15:36:30 <gouthamr> carthaca: we can still backport it to ussuri and train when it merges and request releases
15:37:52 <gouthamr> we have one pending manila-tempest-plugin change as well; it's not required to merge right away, but it must merge soon:
15:37:56 <gouthamr> #link https://review.opendev.org/#/c/709702/ (bp/create-share-from-snapshot-in-another-pool-or-backend)
15:38:12 <gouthamr> dviroel andrebeltrami: what's the status on this one?
15:38:40 <andrebeltrami> I am testing the new patch set, and I intend to upload a new ps in the next few hours!
15:39:26 <gouthamr> andrebeltrami: ack ty for the update, no rush - i didn't expect to cut the manila-tempest-plugin release this week
15:39:51 <gouthamr> phew that's all the stuff we're tracking
15:40:59 <gouthamr> #ACTION gouthamr wait for us to decide on https://review.opendev.org/#/c/709702/ before ack-ing a ussuri release for manila-tempest-plugin
15:41:24 <gouthamr> #ACTION gouthamr will request manila ussuri RC1 after https://review.opendev.org/#/c/722124/ merges
15:41:47 <gouthamr> if there's nothing else, let's move on to bug triage!
15:41:50 <gouthamr> #topic Bugs (vhari)
15:42:00 <vhari> gouthamr thanks, have a couple on the list for today
15:42:08 <vhari> https://etherpad.opendev.org/p/manila-bug-triage-pad-new
15:42:15 <vhari> #link https://bugs.launchpad.net/manila/+bug/1873963
15:42:16 <openstack> Launchpad bug 1873963 in Manila "NetApp driver triggers 'peer-accept' when working with intra cluster replicas" [Medium,In progress] - Assigned to Douglas Viroel (dviroel)
15:42:29 <vhari> this one is in progress, with a fix proposed
15:42:42 <vhari> floating it out in case there's anything to discuss
15:42:42 <gouthamr> ^ easy, this is the fix we're tracking for rc1
15:42:50 <vhari> gouthamr, :0
15:42:51 <vkmc> and WF+1
15:43:05 <vhari> cool
15:43:26 <vhari> next up
15:43:29 <vhari> #link https://bugs.launchpad.net/manila/+bug/1871768
15:43:29 <openstack> Launchpad bug 1871768 in Manila "zfsonlinux driver's provisioned_capacity_gb is None in the scheduler" [Undecided,New]
15:43:43 <vhari> looking for minor triage at this time
15:44:11 <gouthamr> ack, ty, i hate this issue - it's one of those things that happens sporadically at the gate
15:44:38 <gouthamr> since we merged a fix for https://launchpad.net/bugs/1869712
15:44:38 <openstack> Launchpad bug 1869712 in Manila "Increased schedule time for non thin_provisioned backends" [Medium,Fix released] - Assigned to Jose Castro Leon (jose-castro-leon)
15:45:10 <tbarron_> The fix is good but it is exposing an intermittent underlying problem.
15:45:50 <gouthamr> yes, this one definitely existed before the bugfix for LP 1869712
15:45:50 <openstack> Launchpad bug 1869712 in Manila "Increased schedule time for non thin_provisioned backends" [Medium,Fix released] https://launchpad.net/bugs/1869712 - Assigned to Jose Castro Leon (jose-castro-leon)
15:45:51 <tbarron_> Do we see it only with the zfsonlinux back end?
15:46:01 <gouthamr> tbarron_: on the gate, yes
15:46:33 <gouthamr> but, in theory this could be for any backend that reports thin_provisioning as True, [True], or [True, False]
15:47:05 <gouthamr> one fix is to ensure the backend calculates and reports a provisioned_capacity_gb
15:47:16 <tbarron_> right but
15:47:23 <gouthamr> but what's going wrong in the scheduler is a bigger question - a bad thread switch?
15:47:32 <tbarron_> we should do the right thing if it doesn't
15:49:07 <gouthamr> yes, we can profile the threads and in-memory objects to find out what's happening - for starters by debug logging provisioned_capacity_gb or adding it to the pool_stats explicitly
15:49:08 <tbarron_> I suppose we could just put a defensive check for None before the arithmetic calculation, use 0 instead
15:49:24 <gouthamr> i suspect the latter would be helpful for the scheduler-stats api consumers
15:49:47 <gouthamr> tbarron_: yes, that would be an option too
15:49:53 <tbarron_> debug logging is probably a good start
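[Editor's note: a minimal sketch of the defensive handling and debug logging suggested above. This is illustrative only; the function name and pool-stats keys are assumptions, not the actual manila scheduler code.]

```python
# Illustrative sketch only -- not the manila scheduler implementation.
# It shows the idea discussed above: treat a missing
# provisioned_capacity_gb as 0 and log it, instead of letting the
# over-subscription arithmetic fail on None.
import logging

LOG = logging.getLogger(__name__)


def effective_provisioned_capacity(pool_stats):
    """Return provisioned_capacity_gb, defaulting to 0 when unreported."""
    provisioned = pool_stats.get('provisioned_capacity_gb')
    if provisioned is None:
        # Thin-provisioning backends (e.g. zfsonlinux at the gate) have
        # been seen to report None intermittently; fall back to 0 so the
        # scheduling arithmetic keeps working, and leave a trace in the
        # debug logs for later profiling.
        LOG.debug("Pool %s reported provisioned_capacity_gb=None; "
                  "using 0 for scheduling arithmetic.",
                  pool_stats.get('pool_name'))
        return 0
    return provisioned
```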
15:50:23 <gouthamr> vhari: let's target this as Medium, and to victoria-1
15:50:28 <vhari> ack
15:50:52 <gouthamr> i'll take a look with tbarron_ soon
15:51:02 <tbarron_> gouthamr: deal
15:51:13 <vhari> sounds good
15:51:36 <vhari> gouthamr, that's a wrap for bugs
15:51:39 <vhari> ty
15:51:58 <gouthamr> vhari: awesome, thank you for grooming the backlog!
15:52:06 <gouthamr> #topic Open Discussion
15:52:06 <vhari> :)
15:52:46 <gouthamr> there's a recent bug that we've had feedback on
15:52:49 <gouthamr> https://launchpad.net/bugs/1859474
15:52:49 <openstack> Launchpad bug 1859474 in Manila "Low datetime precision leads to inconsistent order of shares" [Medium,In progress] - Assigned to Dmitry Galkin (galkindmitrii)
15:53:18 <gouthamr> the reporter has graciously proposed a fix:
15:53:26 <gouthamr> #link https://review.opendev.org/#/c/721807/ (Fix inconsistent ordering caused by low datetime precision)
15:53:41 <tbarron_> gouthamr: looks like you fed in the full list of tables that need the migration?
15:53:49 <gouthamr> its a good one for us to take a look at next, and provide timely feedback
15:54:13 <gouthamr> tbarron_: yep, and i came up with "manila_nodes" somehow, and confused galkindmitrii
15:54:50 <tbarron_> unit tests are failing on migration
15:54:57 <tbarron_> maybe not the fault of the patch
15:55:34 <tbarron_> ok to recheck?
15:55:58 <gouthamr> tbarron_: yes, i think so - it should be our old friend, the timeout
15:56:05 <tbarron_> this is an intermittent error, I think, but if it is more frequent with the patch then it will have to be dealt with
15:56:31 <gouthamr> ack
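[Editor's note: for context, a hedged sketch of what a precision-widening migration of this kind could look like. The table list, column names, and revision ids below are placeholders, not the contents of the change under review at 721807.]

```python
# Hedged illustration only -- not the code under review. It sketches an
# alembic-style migration that widens timestamp columns to microsecond
# precision on MySQL, so rows created within the same second still sort
# deterministically. Table names and revision ids are placeholders.
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql

# Placeholder revision identifiers for illustration.
revision = 'example_datetime_precision'
down_revision = 'example_previous_revision'

EXAMPLE_TABLES = ('shares', 'share_instances')  # illustrative subset


def upgrade():
    for table in EXAMPLE_TABLES:
        for column in ('created_at', 'updated_at', 'deleted_at'):
            op.alter_column(table, column,
                            existing_type=sa.DateTime(),
                            type_=mysql.DATETIME(fsp=6))


def downgrade():
    for table in EXAMPLE_TABLES:
        for column in ('created_at', 'updated_at', 'deleted_at'):
            op.alter_column(table, column,
                            existing_type=mysql.DATETIME(fsp=6),
                            type_=sa.DateTime())
```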
15:56:59 <gouthamr> alright, we're almost at the hour, does anyone have anything else to discuss?
15:57:06 <gouthamr> in all of 3 minutes :)
15:57:50 <gouthamr> going thrice...
15:57:53 <gouthamr> going twice...
15:58:24 <gouthamr> going one and three quarter times...
15:58:28 <tbarron_> like the end of a basketball game
15:58:39 <tbarron_> he runs the clock down
15:58:39 <gouthamr> :D
15:58:47 <tbarron_> foul
15:59:05 <gouthamr> hahaha, we have #openstack-manila for extra innings time
15:59:14 <vhari> :D
15:59:24 <gouthamr> thanks all!
15:59:25 <tbarron_> cya!
15:59:34 <gouthamr> #endmeeting