15:02:43 #startmeeting manila
15:02:44 Meeting started Thu Sep 19 15:02:43 2019 UTC and is due to finish in 60 minutes. The chair is tbarron. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:02:45 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:02:48 The meeting name has been set to 'manila'
15:02:55 hi :)
15:02:58 o/
15:03:01 \o
15:03:04 hello :)
15:03:29 #topic Agenda
15:03:39 o/
15:03:56 o/
15:04:08 o/
15:04:12 #link https://wiki.openstack.org/w/index.php?title=Manila/Meetings
15:04:30 oh, sec
15:04:44 courtesy ping: gouthamr xyang toabctl bswartz ganso erlon tpsilva vkmc amito jgrosso dviroel lseki carloss
15:04:54 * tbarron waits a couple
15:04:55 hey :)
15:05:14 .o/
15:05:26 * bswartz rushes in out of breath
15:05:38 bswartz: me too :)
15:05:47 ok, we've got 10 people or so,
15:05:51 Hi all!
15:06:11 if you update the agenda pls. ping me so I reload :)
15:06:17 #topic Announcements
15:06:38 #link https://releases.openstack.org/train/schedule.html
15:06:58 Next week is our RC1 target, focus is on testing so we don't
15:07:08 have to do multiple release candidates
15:07:34 I'll remind everyone that we are in String and Requirements freeze
15:07:42 and also feature freeze of course
15:08:26 So let's identify any must-have bugs and put reviews for them on the review focus etherpad
15:08:33 (its own topic later)
15:08:56 Discussion or comments on RC1 target and our work?
15:09:07 OK,
15:09:24 tomorrow I need to submit our Forum topic submissions
15:09:36 using this etherpad as a source:
15:09:50 #link https://etherpad.openstack.org/p/manila-shanghai-forum-brainstorming
15:10:12 No one has updated it except me and noggin143
15:10:23 Thanks noggin143 :)
15:10:31 So this is your last chance.
15:10:43 or else :)
15:11:10 I'll note that the Forum sessions are useful for helping shape future directions even if you can't be there.
15:11:19 gouthamr: damn straight!
15:11:46 Any other announcements?
15:11:51 "U and V goals for Manila" is a good catchall topic to discuss plans with operators; Manila CSI is great for awareness and feedback
15:12:16 can't think of anything better to speak to deployers/operators/users about
15:12:57 #topic tempest 3rd party CI
15:13:43 #link https://wiki.openstack.org/w/index.php?title=Manila/TrainCycle&action=edit&section=12
15:14:18 err
15:14:43 #link https://wiki.openstack.org/w/index.php?title=Manila/TrainCycle#Python3_Testing
15:15:02 This is a standing item - anybody have anything new to report?
15:15:37 Py3 specifically?
15:15:38 amito: you're a manager now :) can you rally some resources to get infinidat CI working with python3?
15:15:50 Or better testing in general?
15:16:01 bswartz: well that is our concrete goal for this topic
15:16:03 tbarron: is it still not working with py3? I asked the guys working on it and they said they did the conversion...
15:16:09 tbarron: I'll check again
15:16:26 amito: maybe it is, please update (or have them update) the wiki if it is
15:16:54 tbarron: no problem, I'll make sure it happens
15:17:03 amito++ ty!
15:17:27 Anything else on this one? If not, we'll return to it next week.
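For third-party CI operators following along: the usual way a devstack-based CI job is switched to Python 3 is devstack's USE_PYTHON3 toggle in local.conf. A minimal sketch, with backend-specific settings omitted and the pinned Python version shown only as an example:

    [[local|localrc]]
    # Run OpenStack services under Python 3 (devstack's standard toggle).
    USE_PYTHON3=True
    # Optional: pin the interpreter version for the job (example value).
    PYTHON3_VERSION=3.7
    # Pull in the manila devstack plugin as usual.
    enable_plugin manila https://opendev.org/openstack/manila

With this in place, updating the TrainCycle wiki's Python3 Testing table (linked above) is how a CI signals its status to the team.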
15:17:55 #topic Cross Project Goals
15:18:03 PDF docs
15:18:21 #link http://lists.openstack.org/pipermail/openstack-discuss/2019-August/008570.html
15:18:35 we got manila, manila-ui, and python-manilaclient done
15:18:54 still need to get manila-tempest-tests and manila-specs done, at least
15:20:13 They shouldn't be too hard if you look at the other patches with topic: build-pdf-docs
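For reference, the merged build-pdf-docs patches generally add a tox environment along these lines; treat it as a sketch, since the exact doc dependencies differ per repo:

    [testenv:pdf-docs]
    basepython = python3
    # Reuse the regular docs environment's dependencies.
    deps = {[testenv:docs]deps}
    whitelist_externals =
      make
    commands =
      # Build LaTeX sources, then render them to PDF.
      sphinx-build -W -b latex doc/source doc/build/pdf
      make -C doc/build/pdf

Running `tox -e pdf-docs` then leaves the built PDF under doc/build/pdf.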
15:21:02 The other Cross Project Goal for Train is testing with pure IPv6 devstack as
15:21:06 discussed here:
15:21:35 #link https://storyboard.openstack.org/#!/story/2005477
15:21:58 gmann has put up a review that is making good progress:
15:22:16 https://review.opendev.org/#/c/682716/
15:22:39 ^ needs a fix on the tempest side
15:23:29 tempest needs to install oslo with py3?
15:23:46 devstack?
15:23:50 yeah, i dunno how it works elsewhere
15:24:47 the script bails early and doesn't do the intended verification - but the job's configured properly, and the service endpoints are all IPv6
15:25:06 well probably gmann will get back to us on that ...
15:25:41 It's moving along, thanks for your review.
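For context, the IPv6-only jobs from this goal build on devstack's devstack-ipv6 base job, which (roughly, as a sketch of its localrc settings) amounts to:

    [[local|localrc]]
    # Deploy all service endpoints on IPv6 only.
    SERVICE_IP_VERSION=6
    # Left empty so devstack selects the host's IPv6 address itself.
    SERVICE_HOST=""

The verification script mentioned above then checks that the registered endpoints are in fact IPv6; that check is the part awaiting a fix on the tempest side.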
15:26:32 #topic Review Focus
15:26:37 Etherpad
15:26:52 is here: https://etherpad.openstack.org/p/manila-train-review-focus
15:27:15 I've started to update it with reviews for bug fixes or test changes that we want to land prior to RC1
15:27:44 Which -- again -- is next week.
15:27:55 ack
15:28:31 I make no claim it's complete, so add stuff as appropriate.
15:29:06 #link https://review.opendev.org/#/c/676475/
15:29:15 ^^ share network subnets tests
15:30:04 has a -1 from zuul but is being rechecked
15:30:14 tbarron: yes, dummy has reported that 2 tests are failing, we are working on reproducing them and doing the fix
15:30:39 tbarron: should not take too long to upload a new PS
15:30:54 dviroel: good, thanks, feel free to ping me when that's ready
15:31:07 tbarron: ok, thanks
15:31:08 dviroel: container driver job's failed the same tests
15:31:28 gouthamr: I will take a look at this too, ty
15:31:35 https://review.opendev.org/#/c/677573/ looks like it's ready ?
15:31:59 that one does replication tests with DHSS=True
15:32:32 tbarron: yes
15:32:46 dviroel: k, thanks.
15:33:01 and then the dummy driver tests just got workflowed I see
15:33:09 =)
15:33:24 #link https://review.opendev.org/#/c/677576/
15:33:40 yeah that'll need a recheck if dviroel uploads a new PS to the subnets change
15:34:01 gouthamr: ack
15:34:15 but we can w+1 the replication change too, they'll merge together if we +1 the subnets change
15:34:22 Do we have any more tests/bugs pertaining to the share networks and replication work?
15:35:40 Hmm, will wait for gouthamr to review the latest PS on subnets, it may or may not need new tests
15:35:42 No
15:35:47 kk
15:36:03 dviroel: thanks, just update the etherpad and ping if we have more
15:36:11 tbarron: ok, ty
15:36:33 I *think* we caught up on tests for the share-type update work so I don't have anything there on the review focus etherpad.
15:37:18 I do have a UI fix there
15:37:37 which isn't strictly for RC1 since we already released manila-ui
15:37:47 tbarron: we also have the pagination fix that carloss is working on.
15:37:59 but it looks like a good bug-fix and we can cut another release
15:38:19 but it depends on a horizon fix that hasn't merged yet, so we'll see
15:38:24 dviroel: ah yes,
15:38:25 dviroel: I'll probably have a PS soon with the changes
15:39:00 carloss: ok, thanks
15:39:18 resolved most comments but needed to stop to investigate the py37 issue
15:39:28 carloss: ok, you know what I'm gonna say, put it in the etherpad and ping when it's ready for review again if you want it before rc-1
15:39:40 good, ty tbarron
15:39:48 it's not a release blocker, so better to get it in first
15:40:00 otherwise we wait for U and consider a backport
15:40:16 Any other rc-1 considerations?
15:40:54 #topic Bugs
15:41:02 jgrosso: What do you have for us?
15:41:08 Hey all
15:41:17 i have some cleanup stuff :)
15:41:26 https://bugs.launchpad.net/manila/+bug/1607150
15:41:27 Launchpad bug 1607150 in Manila "Tempest test for dr/readable replication fails because share has two active replicas" [Medium,New] - Assigned to NidhiMittalHada (nidhimittal19)
15:42:19 gouthamr did you log this while at netapp?
15:42:33 and then it transferred to another person :)
15:42:35 We should probably unassign this one? I don't think Nidhi is working on Manila these days.
15:42:36 * gouthamr looks
15:43:19 hmmm, have we seen this one occur lately?
15:43:27 No nidhi in this channel or the manila channel
15:43:34 i haven't, but i know we didn't address the problem either
15:43:36 Has she been around lately?
15:43:49 bswartz: I don't think so
15:44:07 should I lower the priority and unassign?
15:44:35 jgrosso: let's ask the people who support backends that do replication
15:44:48 tbarron ack
15:44:49 jgrosso: yes, and perhaps dviroel/carloss can take a look
15:44:58 ok
15:45:22 * gouthamr makes meme about gifting all replication bugs to dviroel/carloss
15:45:33 it's annoying if this causes CI to fail, but may not be likely in the field ?
15:45:42 I haven't seen it happening but we can take a look
15:45:46 tbarron: yeah, tests are crazy
15:45:49 gouthamr: haha
15:46:15 just a general question on the following bug
15:46:16 https://bugs.launchpad.net/manila/+bug/1807969
15:46:17 Launchpad bug 1807969 in Manila "[manila image elements] custom image job fails to test the new image" [Medium,Fix committed] - Assigned to Tom Barron (tpb)
15:46:29 is this fix released?
15:46:33 yes
15:46:43 thanks tbarron
15:47:15 one other question
15:47:27 I noticed some doc bugs are logged as medium and some as low
15:48:09 do we have a way to determine what is medium or low for docs?
15:48:41 jgrosso: we probably have many considerations, not some simple rule though.
15:48:55 Just as for non-doc bugs :)
15:49:10 tbarron thanks
15:49:12 good q, we should probably think about this one
15:49:32 And as for non-doc bugs, it may be that too many are marked too high.
15:49:34 I just noticed there are a lot of doc bugs
15:50:10 tbarron ack
15:50:18 last cleanup bug
15:50:19 https://bugs.launchpad.net/manila/+bug/1816430
15:50:20 Launchpad bug 1816430 in Manila "intermittent generic back end extend/shrink share failures" [Medium,Triaged]
15:50:25 We could have High doc bugs if we are telling people something just wrong and we know they are being led down very dangerous paths.
15:50:42 hmm tbarron great point
15:51:14 hmm, I reported this one and said it was Medium :)
15:51:41 :) sorry, forgot to ask for a milestone :)
15:51:59 but I haven't seen it recently.
15:52:11 I know they are intermittent so not easy to put a milestone
15:52:20 ok I can leave it without a milestone :)
15:52:32 Well it's more that no one is paid to work on the generic driver :)
15:52:52 :) makes sense
15:53:02 so unless we can reproduce it in a straightforward way ...
15:53:30 thanks tbarron for grabbing the new bugs that came in the last week and responding :)
15:53:33 tbarron ack
15:53:55 that is all I have for bugs today !
15:54:27 jgrosso: I'll bump the priority on that down to low and add a comment asking others to chime in on this bug if they hit it
15:54:43 #topic Open Discussion
15:54:46 tbarron thanks!
15:55:08 bswartz: here's where we could discuss 3rd party CI more generally or other stuff :)
15:55:37 I didn't mean to cut you off on that earlier topic ...
15:55:54 No I was just curious if we had other shortcomings related to 3rd party CI
15:55:59 Such as test coverage issues
15:56:10 Because that would be something I'd care about
15:57:37 Well I'm not confident that 3rd party CIs run and pass regularly; quality varies.
15:58:09 We keep saying we'll make a report card but in my time as PTL I haven't made that happen :(
15:58:17 Yeah no doubt quality varies across vendors. I'm hoping that NetApp is among the best
15:59:30 bswartz: ++
15:59:36 bswartz: ++
15:59:54 Not until the replication tests are running regularly and passing :)
15:59:57 :)
16:00:04 Just messing with you.
16:00:06 =|
16:00:15 ok, see you on #openstack-manila
16:00:19 Thanks everyone!
16:00:23 #endmeeting