15:00:32 #startmeeting manila
15:00:33 Meeting started Thu Jan 25 15:00:32 2018 UTC and is due to finish in 60 minutes. The chair is bswartz. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:34 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:37 The meeting name has been set to 'manila'
15:00:42 o/ hello
15:00:45 hello all
15:00:48 \o
15:00:55 hello o/
15:00:57 hello
15:01:06 Hi
15:01:06 hi
15:01:23 hi
15:02:37 hey guys, sorry I'm dropping a -2 on some patches
15:03:25 Okay!
15:03:26 thud
15:03:34 #topic announcements
15:03:47 Today is the last day for features to merge
15:03:52 @!
15:03:53 I need to push tags today
15:04:12 jungleboyj: your bot is awol
15:04:22 * jungleboyj is sad
15:04:53 I don't think we have any other announcements
15:05:11 Feature freeze is usually the biggest deadline
15:05:24 #agenda https://wiki.openstack.org/wiki/Manila/Meetings
15:05:31 #topic Feature Freeze / Queens milestone-3
15:05:46 So we've been trying to get the last feature patches merged this week
15:06:01 There's been good progress, but this morning I still see some unmerged patches, so let's go through them
15:06:47 The filtering for the share type API won't make it, so I've already blocked those 2 patches
15:07:29 We have this one waiting for a workflow: https://review.openstack.org/#/c/527134/
15:08:02 * tbarron is timing out connecting to gerrit atm
15:08:09 was testing that one, need xyang/tbarron to weigh in
15:08:16 or zhongjun
15:08:35 gouthamr: found any issues worth discussing?
15:08:47 nope
15:09:30 https://review.openstack.org/#/c/528667/
15:09:52 ^ This one looks like it's suffering infra problems
15:09:53 ouch, rechecking
15:10:19 https://review.openstack.org/#/c/535701/
15:10:29 ^ This is a 5-line change
15:10:36 No rechecking
15:10:44 Both of its dependencies are merged
15:10:48 Wait for the next infra broadcast
15:10:52 efried: ?
15:11:13 bswartz: 535701 is ok, I was just waiting for the dependencies
15:11:14 NOTICE: We're currently experiencing issues with the logs.openstack.org server which will result in POST_FAILURE for jobs, please stand by and don't needlessly recheck jobs while we troubleshoot the problem.
15:11:15 efried: is the recheck machinery broken?
15:11:47 * bswartz smh
15:11:52 https://review.openstack.org/#/c/535701/ is failing a bunch of third party CIs
15:12:32 thirdy party CIs are failing for another reason
15:12:36 third
15:12:37 ganso: can you speak to that?
15:12:44 including NetApp
15:13:13 bswartz: erlon and I are diagnosing a CI failure due to a library update
15:13:14 looks like a broken package, at least for NTAP: python-pcre
15:13:19 python-pcre package issue
15:13:25 bswartz: erlon says all CIs are failing
15:13:25 http://52.42.67.99/67/537867/1/check/dsvm-veritas-access-manila-driver/95ec234/logs/devstack-early.txt.gz#_2018-01-25_12_22_34_363
15:13:36 ganso: all 3rd party
15:13:39 * bswartz headdesk
15:13:41 ganso: not infra
15:14:12 infra jobs are working, ubuntu and centos, but ubuntu 3rd party has the issue
15:14:17 tbarron, yep, infra is not affected, https://review.openstack.org/#/c/534854/
15:14:28 but 90% of the third party
15:14:31 Okay, this is a perfect illustration of why it's better to get stuff merged on Monday or Tuesday, not Thursday
15:14:35 not sure why
15:14:44 Library problems seem to happen every release
15:14:58 Is it something we can get sorted out this morning?
15:15:24 Or will we have to accept that third party CI is broken until after FF?
15:15:26 https://review.openstack.org/#/c/537867/ is a DNM job that illustrates it
15:15:46 bswartz, not sure, couldn't get help from #infra as they are focusing on the logserver problem
15:15:59 infra jobs are all passing, third party are all failing
15:16:18 tbarron: not really passing, they have a POST_FAILURE issue now
15:16:22 #link https://pypi.python.org/pypi/python-pcre/0.6
15:16:23 this?
15:16:31 infra has mirrors, they'll be affected late :)
15:16:41 ganso, they are going further at least :p
15:17:22 ganso: they passed in 537867 two hours ago, iow they don't have the package issue
15:17:42 gouthamr: ah, maybe just delayed
15:18:04 I don't understand what's changed
15:18:15 That package hasn't been modified since 2015 according to pypi
15:19:11 bswartz, I'm not getting the problem either; the bindep has been there for a long time: https://github.com/openstack/requirements/search?utf8=%E2%9C%93&q=pcre&type=
15:19:47 so, devstack or something else should know that it should be installed before trying to install the python lib
15:20:12 I'm confused
15:20:16 did we always have to compile it?
15:20:50 tbarron -- if you install from pip, I think the native parts get compiled as part of the install
15:20:51 looks like the dev pkg might be missing, the compile fails for lack of pcre.h
15:21:12 tbarron, not sure, maybe that's what is different now: now it needs to be compiled
15:21:14 If you install from apt/yum you get a distro-compiled binary
15:21:23 bswartz: ack, understood
15:22:00 It's possible the pcre-dev package moved some files and the python package is now broken
15:22:15 So there may be a python-pcre 0.7.1 very soon
15:22:19 bswartz, that will download the pcre lib only, the pip package still needs to compile
15:22:58 I'm okay with ignoring the 3rd party CIs until that issue is sorted
15:23:23 It sounds like the upstream gate might be about to get less friendly for the same reasons, so we should hurry up and merge what we can
15:24:16 Okay, are there any other patches I missed?
15:24:24 For Q-3?
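The diagnosis above (pip compiling python-pcre's C extension and failing for lack of pcre.h) suggests a pre-stack check along these lines. This is a hedged sketch, not anything run in the meeting: the header path and package names assume a Debian/Ubuntu CI node as in the failing third-party jobs.

```shell
# Sketch: check whether the PCRE development header that python-pcre's
# C extension needs at build time is present before stacking.
pcre_header=/usr/include/pcre.h
if [ -f "$pcre_header" ]; then
    status=present
else
    status=missing
    # Workaround discussed in the meeting (pcre-devel on CentOS/RHEL):
    echo "run: sudo apt-get install -y libpcre3-dev"
fi
echo "pcre.h: $status"
```

If the header is missing, installing the distro dev package before `pip install python-pcre` lets the compile find pcre.h.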
15:24:45 If not, I'm going to watch for the last few things to get through the gate and push tags as soon as they make it
15:25:18 looking at the devstack-early log in a successful third party CI run from a couple of weeks ago, I see no mention of python-pcre
15:27:17 I know the netapp guys will dig in to try to fix the netapp CI at least
15:27:27 Please share any fixes you find on the channel
15:27:45 And lmk if you want my help on that CI issue
15:28:05 Okay, I think that's it for feature freeze
15:28:18 a quick workaround we found is to install libpcre3-dev on the nodes before stacking
15:28:31 tpsilva: ty
15:29:01 #topic Let's Go Over New Bugs
15:29:40 Before we get into specific bugs, let me remind y'all that the focus between feature freeze and RC1 should be on finding and fixing bugs
15:30:00 There are some bugfix patches already waiting for reviews
15:30:23 I will be targeting bugs at the RC1 milestone in LP
15:30:57 Arguably, finding bugs is as important as fixing the known bugs, so I encourage people to test out some of the new features and try to break them
15:31:52 The RC1 target date for us is 2 weeks from now
15:32:26 As always, we'll cut the RC1 tag when the number of bugs drops to zero, whether that's early or late
15:32:53 But the closer we get to the target date, the more aggressively we'll untarget bugs -- our release manager doesn't like late RCs
15:33:00 :-)
15:33:08 dustins: do you have specific bugs for today?
15:33:16 I do indeed
15:33:19 #link https://bugs.launchpad.net/manila/+bug/1550258
15:33:20 Launchpad bug 1550258 in Manila "Manila UI - Manage rule for CIFS share updates status from "error" to "active" for invalid users" [Low,New]
15:33:27 #link https://etherpad.openstack.org/p/manila-bug-triage-pad
15:34:04 Wow, this is an old one
15:34:41 Why did I target this to manila-ui?
15:34:55 it's marked invalid on manila-ui
15:34:59 It must have been a click error
15:35:19 In any case, is it HP 3PAR specific?
15:35:33 there was a ui screenshot in the bug report but then it was realized that it's not a ui bug
15:35:42 It looks like it manifested first in the UI but is caused by a Manila "core" thing
15:36:02 But yeah, I think this may be specific to the 3PAR driver?
15:36:30 would it make sense to ask if the issue is still reproducible?
15:36:42 I'm not convinced there's any problem outside the 3par driver
15:36:48 or close it, saying re-open if it's still an issue
15:37:29 we don't have any active HP 3par folks here
15:37:31 The maintainer is Ravichandran Nudurumati
15:37:56 No known IRC nick for him
15:38:45 It's up to HPE whether they want to fix this bug or not -- I don't see what we can do without access to their hardware
15:39:17 Yeah, I was kinda hoping that we had an HPE representative to provide some feedback
15:39:48 next?
15:39:59 what was the disposition?
15:40:06 assign or close?
15:40:23 we need to clear the backlog, not just talk about the bugs.
15:40:29 I think we leave it unassigned, low importance
15:40:34 We can mark it traiged
15:40:35 -1
15:40:43 triaged even
15:41:04 ok, I'm just looking for ways to make some progress
15:41:31 I don't see how we can close it -- it's likely still an issue that users of the HPE driver should be aware of
15:41:34 * tbarron adds tags
15:42:07 Hopefully if someone uses the driver and finds the bug, they can go bother HPE about getting it fixed
15:42:29 tags: drivers, hpe-3par
15:42:34 +1
15:42:58 we do tags in cinder
15:43:51 #link https://bugs.launchpad.net/manila/+bug/1700501
15:43:52 Launchpad bug 1700501 in OpenStack Compute (nova) "Insecure rootwrap usage" [Undecided,Incomplete]
15:43:59 This is the next one for us
15:44:35 note sdague: "this is too vague to be actionable"
15:44:58 there's a proposed cross-project initiative for privsep
15:45:00 Wow
15:45:37 we should talk at PTG about whether we should be migrating manila to privsep
15:45:41 Yeah, rootwrap was always a pretty hacky approach to fixing security issues
15:45:58 tbarron: Can you add that to the ptg etherpad if it's not there already?
15:46:06 bswartz: yes
15:46:08 I don't know the details of privsep
15:46:36 we should talk about what it is and whether / when we want to do it
15:46:37 It's important to note that the security issues addressed by rootwrap are for the most part theoretical issues
15:47:14 yeah, I'm not saying that we shouldn't fix specific rootwrap issues (like overly broad commands specified, etc.), just
15:47:19 that this bug is open-ended
15:47:29 Rootwrap exists to reduce the chances of an exploit in the API services turning into a more general exploit of the machine the services run on
15:48:23 Our first line of defense is the API services themselves, and while python is bad at many things, it's pretty good at preventing over-the-network exploits
15:50:20 I'm reading about privsep -- is anyone already familiar with it?
15:50:45 kinda sorta, need to refresh
15:51:07 jungleboyj: cinder has some experience attempting privsep migration, not all positive, right?
15:51:40 From what I see, it's unclear that privsep helps at all
15:51:52 * tbarron will try to see where other projects are on this
15:52:10 tbarron: It was complicated but we got there.
15:53:00 tbarron: Some of it was the learning curve for people on how to use it.
15:53:27 jungleboyj: Are there good docs on how to transition to privsep?
15:54:04 dustins: Not sure. I think hemna did a lot of the work there. Could check with him.
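The "overly broad commands" concern above refers to rootwrap's filter model: a root-owned config of command filters decides which commands the unprivileged service may escalate. A minimal sketch of that matching logic in plain Python, with hypothetical filters for illustration (not Manila's actual rootwrap config, and much simpler than oslo.rootwrap's real filter classes):

```python
import re

# Hypothetical whitelist in the spirit of rootwrap's command filters.
# Each pattern permits one command shape to run as root; anything that
# does not match an entry is rejected.
FILTERS = [
    re.compile(r"^mount(\s+\S+)+$"),     # e.g. "mount -t nfs host:/share /mnt"
    re.compile(r"^exportfs(\s+\S+)*$"),  # NFS export management
]

def is_allowed(command: str) -> bool:
    """Return True if the escalation request matches a whitelisted filter."""
    return any(f.match(command) for f in FILTERS)
```

Under this model a locked-down config rejects the exploit discussed in the bug: `is_allowed("mount -t nfs 10.0.0.5:/share /mnt")` passes, while `is_allowed("chmod 777 /etc/shadow")` does not, because no filter permits chmod at all.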
15:54:20 one thing privsep does is encourage use of libraries rather than shell commands to achieve the needed privileged operations
15:54:29 I'd like someone to explain how privsep is any better than a properly locked-down rootwrap config
15:54:35 Not a bad idea, if it's something that other projects are doing we should leverage their experience and expertise
15:54:53 tbarron: I can see that working for some things
15:55:10 In particular, stuff like the "chmod 777 /etc/shadow" exploit listed in the bug
15:55:21 right
15:55:29 But so much of what manila does is invoking binaries that will never have python-library equivalents
15:56:17 Then again, much of that is driver specific...
15:56:42 I've got an AI to see where other projects are on quota replacement before PTG, let me see what I can find out about privsep as well.
15:56:43 Perhaps we could modularize the rootwrap and only enable parts of it related to the specific drivers in use
15:56:48 Did it help?
15:56:52 etc.
15:56:56 okay
15:57:00 We're low on time
15:57:05 dustins: anything else?
15:57:27 mention the default share type one quickly
15:57:28 #link https://bugs.launchpad.net/manila/+bug/1743472
15:57:29 Launchpad bug 1743472 in Manila "Create a default share type for tempest tests" [Undecided,New] - Assigned to Victoria Martinez de la Cruz (vkmc)
15:57:39 o/
15:57:47 vkmc: Take it away!
15:57:47 about to submit a fix for that, I'm running test locally
15:57:55 s/test/tempest tests/g
15:57:57 We discussed this one already, didn't we?
15:58:12 bswartz: just noting that there's some action on that front ...
15:58:17 Or am I remembering a conversation in another venue?
15:58:23 plus, we need to add the dependency for https://review.openstack.org/#/c/532713/...
it's almost a duplicate
15:58:31 bswartz: A related one, https://bugs.launchpad.net/manila/+bug/1743472
15:58:32 Launchpad bug 1743472 in Manila "Create a default share type for tempest tests" [Undecided,New] - Assigned to Victoria Martinez de la Cruz (vkmc)
15:58:36 Okay
15:58:48 bswartz, yeah, about two weeks ago
15:59:09 Okay let's make sure to get those patches reviewed before RC1
15:59:19 Thanks vkmc
15:59:22 np
15:59:37 Feel free to ping me for a review when it's ready
15:59:45 That's all we have time for today
15:59:57 * bswartz prays the last few patches merge soon
16:00:04 Thanks all
16:00:04 will do, thx
16:00:13 #endmeeting