15:02:43 <tbarron> #startmeeting manila
15:02:44 <openstack> Meeting started Thu Sep 19 15:02:43 2019 UTC and is due to finish in 60 minutes.  The chair is tbarron. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:02:45 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:02:48 <openstack> The meeting name has been set to 'manila'
15:02:55 <carloss> hi :)
15:02:58 <gouthamr> o/
15:03:01 <vkmc> \o
15:03:04 <jgrosso> hello :)
15:03:29 <tbarron> #topic Agenda
15:03:39 <dviroel> o/
15:03:56 <lseki> o/
15:04:08 <ganso> o/
15:04:12 <tbarron> #link https://wiki.openstack.org/w/index.php?title=Manila/Meetings
15:04:30 <tbarron> oh, sec
15:04:44 <tbarron> courtesy ping: gouthamr xyang toabctl bswartz ganso erlon tpsilva vkmc amito jgrosso dviroel lseki carloss
15:04:54 * tbarron waits a couple
15:04:55 <amito> hey :)
15:05:14 <bswartz> .o/
15:05:26 * bswartz rushes in out of breath
15:05:38 <tbarron> bswartz: me too :)
15:05:47 <tbarron> ok, we've got 10 people or so,
15:05:51 <tbarron> Hi all!
15:06:11 <tbarron> if you update the agenda pls. ping me so I reload :)
15:06:17 <tbarron> #topic Announcements
15:06:38 <tbarron> #link https://releases.openstack.org/train/schedule.html
15:06:58 <tbarron> Next week is our RC1 target; the focus is on testing so we don't
15:07:08 <tbarron> have to do multiple release candidates
15:07:34 <tbarron> I'll remind everyone that we are in String and Requirements freeze
15:07:42 <tbarron> and also feature freeze of course
15:08:26 <tbarron> So let's identify any must-have bugs and put reviews for them on the review focus etherpad
15:08:33 <tbarron> (its own topic later)
15:08:56 <tbarron> Discussion or comments on RC1 target and our work?
15:09:07 <tbarron> OK,
15:09:24 <tbarron> tomorrow I need to submit our Forum topic submissions
15:09:36 <tbarron> using this etherpad as a source:
15:09:50 <tbarron> #link https://etherpad.openstack.org/p/manila-shanghai-forum-brainstorming
15:10:12 <tbarron> No one has updated it except me and noggin143
15:10:23 <tbarron> Thanks noggin143 :)
15:10:31 <tbarron> So this is your last chance.
15:10:43 <gouthamr> or else :)
15:11:10 <tbarron> I'll remind everyone that the Forum sessions are useful for helping shape future directions even if you can't be there.
15:11:19 <tbarron> gouthamr: damn straight!
15:11:46 <tbarron> Any other announcements?
15:11:51 <gouthamr> "U and V goals for Manila" is a good catch-all topic to discuss plans with operators; Manila CSI is great for awareness and feedback
15:12:16 <gouthamr> can't think of anything better to speak to deployers/operators/users about
15:12:57 <tbarron> #topic tempest 3rd party CI
15:14:43 <tbarron> #link https://wiki.openstack.org/w/index.php?title=Manila/TrainCycle#Python3_Testing
15:15:02 <tbarron> This is a standing item - anybody have anything new to report?
15:15:37 <bswartz> Py3 specifically?
15:15:38 <tbarron> amito: you're a manager now :) can you rally some resources to get infinidat CI working with python3?
15:15:50 <bswartz> Or better testing in general?
15:16:01 <tbarron> bswartz: well that is our concrete goal for this topic
15:16:03 <amito> tbarron: is it still not working with py3? I asked the guys working on it and they said they did the conversion...
15:16:09 <amito> tbarron: I'll check again
15:16:26 <tbarron> amito: maybe it is, please update (or have them update) the wiki if it is
15:16:54 <amito> tbarron: no problem, I'll make sure it happens
15:17:03 <tbarron> amito++ ty!
15:17:27 <tbarron> Anything else on this one?  If not, we'll return to it next week.
15:17:55 <tbarron> #topic Cross Project Goals
15:18:03 <tbarron> PDF docs
15:18:21 <tbarron> #link  http://lists.openstack.org/pipermail/openstack-discuss/2019-August/008570.html
15:18:35 <tbarron> we got manila, manila-ui, and python-manilaclient done
15:18:54 <tbarron> still need to get manila-tempest-tests and manila-specs done, at least
15:20:13 <tbarron> They shouldn't be too hard if you look at the other patches with topic: build-pdf-docs
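For context on how those build-pdf-docs patches typically look: the bulk of the change is a pdf-docs tox environment plus LaTeX settings in each repo's doc/source/conf.py. A minimal sketch of the conf.py part follows; the output filename and title are illustrative guesses for manila-tempest-tests, not taken from an actual patch.

    # doc/source/conf.py (sketch): settings for Sphinx's LaTeX builder, so that
    # "sphinx-build -b latex doc/source doc/build/pdf" followed by "make" in
    # doc/build/pdf produces a single PDF for the repo.
    latex_documents = [
        ('index',                         # master document
         'doc-manila-tempest-tests.tex',  # output .tex name (illustrative)
         'Manila Tempest Tests Documentation',
         'OpenStack Foundation',
         'manual'),                       # LaTeX theme
    ]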
15:21:02 <tbarron> The other Cross Project Goal for Train is testing with pure IPv6 devstack as
15:21:06 <tbarron> discussed here:
15:21:35 <tbarron> #link https://storyboard.openstack.org/#!/story/2005477
15:21:58 <tbarron> gmann has put up a review that is making good progress:
15:22:16 <tbarron> https://review.opendev.org/#/c/682716/
15:22:39 <gouthamr> ^ needs a fix on the tempest side
15:23:29 <tbarron> tempest needs to install oslo with py3?
15:23:46 <tbarron> devstack?
15:23:50 <gouthamr> yeah, i dunno how it works elsewhere
15:24:47 <gouthamr> the script bails out early and doesn't do the intended verification, but the job's configured properly and the service endpoints are all IPv6
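To make "the intended verification" concrete: the job is supposed to confirm that the deployment is IPv6-only, i.e. that every service endpoint uses an IPv6 address. A rough, self-contained sketch of that kind of check is below; the endpoint URLs are made up for illustration, and the real script would pull them from the Keystone catalog or devstack configuration rather than a hard-coded list.

    # Sketch: assert that each service endpoint's host is an IPv6 literal.
    import ipaddress
    from urllib.parse import urlsplit

    endpoints = [
        "http://[2001:db8::10]/identity",  # illustrative Keystone endpoint
        "http://[2001:db8::10]:8786/v2",   # illustrative Manila endpoint
    ]

    for url in endpoints:
        host = urlsplit(url).hostname      # strips the [] around IPv6 hosts
        addr = ipaddress.ip_address(host)
        if addr.version != 6:
            raise SystemExit(f"{url} is not an IPv6 endpoint")
    print("all service endpoints are IPv6")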
15:25:06 <tbarron> well probably gman0 will get back to us on that ...
15:25:41 <tbarron> It's moving along, thanks for your review.
15:25:52 <gouthamr> tbarron: gmann
15:26:00 <tbarron> gmannnn
15:26:07 <tbarron> sorry
15:26:32 <tbarron> #topic Review Focus
15:26:37 <tbarron> Etherpad
15:26:52 <tbarron> is here: https://etherpad.openstack.org/p/manila-train-review-focus
15:27:15 <tbarron> I've started to update it with reviews for bug fixes or test changes that we want to land prior to RC1
15:27:44 <tbarron> Which -- again -- is next week.
15:27:55 <dviroel> ack
15:28:31 <tbarron> I make no claim it's complete, so add stuff as appropriate.
15:29:06 <tbarron> #link https://review.opendev.org/#/c/676475/
15:29:15 <tbarron> ^^ share networks subnets tests
15:30:04 <tbarron> has a -1 from zuul but is being rechecked
15:30:14 <dviroel> tbarron: yes, the dummy driver job has reported that 2 tests are failing; we are working on reproducing them and fixing them
15:30:39 <dviroel> tbarron: should not take too long to upload a new PS
15:30:54 <tbarron> dviroel: good, thanks, feel free to ping me when that's ready
15:31:07 <dviroel> tbarron: ok, thanks
15:31:08 <gouthamr> dviroel: container driver job's failed the same tests
15:31:28 <dviroel> gouthamr: I will take a look at this too, ty
15:31:35 <tbarron> https://review.opendev.org/#/c/677573/ looks like it's ready?
15:31:59 <tbarron> that one does replication tests with DHSS=True
15:32:32 <dviroel> tbarron: yes
15:32:46 <tbarron> dviroel: k, thanks.
15:33:01 <tbarron> and then the dummy driver tests just got workflowed I see
15:33:09 <dviroel> =)
15:33:24 <tbarron> #link https://review.opendev.org/#/c/677576/
15:33:40 <gouthamr> yeah that'll need a recheck if dviroel uploads a new PS to the subnets change
15:34:01 <dviroel> gouthamr: ack
15:34:15 <gouthamr> but we can w+1 the replication change too, they'll merge together if we +1 the subnets change
15:34:22 <tbarron> Do we have any more tests/bugs pertaining to the share networks and replication work?
15:35:40 <dviroel> Hmm, will wait for gouthamr to review the latest PS on subnets; it may or may not need new tests
15:35:42 <tbarron> No
15:35:47 <tbarron> kk
15:36:03 <tbarron> dviroel: thanks, just update the etherpad and ping if we have more
15:36:11 <dviroel> tbarron: ok, ty
15:36:33 <tbarron> I *think* we caught up on tests for the share-type update work so I don't have anything there on the review focus etherpad.
15:37:18 <tbarron> I do have a UI fix there
15:37:37 <tbarron> which isn't strictly for RC1 since we already released manila-ui
15:37:47 <dviroel> tbarron: we also have the pagination fix that carloss is working on.
15:37:59 <tbarron> but it looks like a good bug-fix and we can cut another release
15:38:19 <tbarron> but it depends on a horizon fix that hasn't merged yet, so we'll see
15:38:24 <tbarron> dviroel: ah yes,
15:38:25 <carloss> dviroel: I'll probably have a PS soon with the changes
15:39:00 <dviroel> carloss: ok, thanks
15:39:18 <carloss> resolved most comments but needed to stop to investigate the py37 issue
15:39:28 <tbarron> carloss: ok, you know what I'm gonna say, put it in the etherpad and ping when it's ready for review again if you want it before rc-1
15:39:40 <carloss> good, ty tbarron
15:39:48 <tbarron> it's not a release blocker, so better to get it in first
15:40:00 <tbarron> otherwise we wait for U and consider backport
15:40:16 <tbarron> Any other rc-1 considerations?
15:40:54 <tbarron> #topic Bugs
15:41:02 <tbarron> jgrosso: What do you have for us?
15:41:08 <jgrosso> Hey all
15:41:17 <jgrosso> i have some cleanup stuff :)
15:41:26 <jgrosso> https://bugs.launchpad.net/manila/+bug/1607150
15:41:27 <openstack> Launchpad bug 1607150 in Manila "Tempest test for dr/readable replication fails because share has two active replicas" [Medium,New] - Assigned to NidhiMittalHada (nidhimittal19)
15:42:19 <jgrosso> gouthamr did you log this while at netapp?
15:42:33 <jgrosso> and then it transferred to another person :)
15:42:35 <tbarron> We should probably unassign this one?  I don't think Nidhi is working on Manila these days.
15:42:36 * gouthamr looks
15:43:03 <jgrosso> :)
15:43:19 <gouthamr> hmmm, have we seen this one occur lately?
15:43:27 <bswartz> No nidhi in this channel or manila channel
15:43:34 <gouthamr> i haven't, but i know we didn't address the problem either
15:43:36 <bswartz> Has she been around lately?
15:43:49 <tbarron> bswartz: I don't think so
15:44:07 <jgrosso> should I lower the priority and unassign?
15:44:35 <tbarron> jgrosso: let's ask the people who support backends that do replication
15:44:48 <jgrosso> tbarron ack
15:44:49 <gouthamr> jgrosso: yes, and perhaps dviroel/carloss can take a look
15:44:58 <jgrosso> ok
15:45:22 * gouthamr makes meme about gifting all replication bugs to dviroel/carloss
15:45:33 <tbarron> it's annoying if this causes CI to fail, but it may not be likely in the field?
15:45:42 <carloss> I haven't seen it happening but we can take a look
15:45:46 <gouthamr> tbarron: yeah, tests are crazy
15:45:49 <carloss> gouthamr: haha
15:46:15 <jgrosso> just a general question on the following bug
15:46:16 <jgrosso> https://bugs.launchpad.net/manila/+bug/1807969
15:46:17 <openstack> Launchpad bug 1807969 in Manila "[maniila image elements] custom image job fails to test the new image" [Medium,Fix committed] - Assigned to Tom Barron (tpb)
15:46:29 <jgrosso> is this fixed released?
15:46:33 <tbarron> yes
15:46:43 <jgrosso> thanks tbarron
15:47:15 <jgrosso> one other question
15:47:27 <jgrosso> I noticed some doc bugs are logged as medium and some as low
15:48:09 <jgrosso> do we have a way to determine what should be medium or low for docs?
15:48:41 <tbarron> jgrosso: we probably have many considerations, not some simple rule though.
15:48:55 <tbarron> Just as for non-doc bugs :)
15:49:10 <jgrosso> tbarron thanks
15:49:12 <gouthamr> good q, we should probably think about this one
15:49:32 <tbarron> And as for non-doc bugs, it may be that too many are marked too high.
15:49:34 <jgrosso> I just noticed there are a lot of doc bugs
15:50:10 <jgrosso> tbarron ack
15:50:18 <jgrosso> last clean up bug
15:50:19 <jgrosso> https://bugs.launchpad.net/manila/+bug/1816430
15:50:20 <openstack> Launchpad bug 1816430 in Manila "intermittent generic back end extend/shrink share failures" [Medium,Triaged]
15:50:25 <tbarron> We could have High doc bugs if we are telling people something that's just plain wrong and we know they are being led down very dangerous paths.
15:50:42 <jgrosso> hmm tbarron great point
15:51:14 <tbarron> hmm, I reported this one and said it was Medium :)
15:51:41 <jgrosso> :) sorry forgot to ask for milestone :)
15:51:59 <tbarron> but I haven't seen it recently.
15:52:11 <jgrosso> I know they are intermittent, so it's not easy to put a milestone on it
15:52:20 <jgrosso> ok, I can leave it without a milestone :)
15:52:32 <tbarron> Well it's more that no one is paid to work on the generic driver :)
15:52:52 <jgrosso> :) makes sense
15:53:02 <tbarron> so unless we can reproduce it in a straightforward way ...
15:53:30 <jgrosso> thanks tbarron for grabbing the new bugs that came in the last week and responding :)
15:53:33 <jgrosso> tbarron ack
15:53:55 <jgrosso> that is all I have for bugs today !
15:54:27 <tbarron> jgrosso: I'll lower the priority on that to Low and add a comment asking others to update the bug if they hit it
15:54:43 <tbarron> #topic Open Discussion
15:54:46 <jgrosso> tbarron thanks!
15:55:08 <tbarron> bswartz: here's where we could discuss 3rd party CI more generally or other stuff :)
15:55:37 <tbarron> I didn't mean to cut you off on that earlier topic ...
15:55:54 <bswartz> No I was just curious if we had other shortcomings related to 3rd party CI
15:55:59 <bswartz> Such as test coverage issues
15:56:10 <bswartz> Because that would be something I'd care about
15:57:37 <tbarron> Well, I'm not confident that 3rd party CIs run and pass regularly; quality varies.
15:58:09 <tbarron> We keep saying we'll make a report card but in my time as PTL I haven't made that happen :(
15:58:17 <bswartz> Yeah no doubt quality varies across vendors. I'm hoping that NetApp is among the best
15:59:30 <dviroel> bswartz: ++
15:59:36 <carloss> bswartz: ++
15:59:54 <tbarron> Not until the replication tests are running regularly and passing :)
16:00:04 <tbarron> Just messing with you.
16:00:06 <dviroel> =|
16:00:15 <tbarron> ok, see you on #openstack-manila
16:00:19 <tbarron> Thanks everyone!
16:00:23 <tbarron> #endmeeting