15:00:01 <tbarron> #startmeeting manila
15:00:01 <openstack> Meeting started Thu Mar 28 15:00:01 2019 UTC and is due to finish in 60 minutes.  The chair is tbarron. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:02 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:05 <openstack> The meeting name has been set to 'manila'
15:00:14 <bswartz> .o/
15:00:17 <tbarron> courtesy ping gouthamr xyang toabctl bswartz ganso erlon tpsilva vkmc amito jgrosso
15:00:19 <gouthamr> o/
15:00:21 <ganso> hello
15:00:22 <jgrosso> hi
15:00:36 <amito> hey o/ will not be entirely present in this meeting, unfortunately
15:00:37 <vkmc> o/
15:00:44 <carlos_silva> hi
15:01:15 <tbarron> amito: it's hard to be entirely present w/o years of meditation anyways
15:01:18 <tbarron> hi all!
15:01:29 <bswartz> tbarron: +1
15:01:39 <tbarron> Agenda: https://wiki.openstack.org/wiki/Manila/Meetings
15:01:45 <lseki> hi
15:01:46 <amito> tbarron: :P
15:02:00 <tbarron> #topic Announcements
15:02:11 <tbarron> We're in the week for RC2.
15:02:31 <tbarron> We don't have any bugs that I'm aware of that would lead us to need another RC.
15:02:56 <tbarron> nothing has been backported to stable/stein.
15:03:09 <tbarron> I don't see anything sitting on master that needs to be.
15:03:20 <tbarron> Any opinions otherwise?
15:03:43 <vhariria> hi
15:03:54 <tbarron> Not hearing any, but if anything comes up we should cut rc2 today :)
15:04:02 <tbarron> vhariria: hi
15:04:26 <tbarron> The final release candidate will be Monday 1 April.
15:04:30 <tbarron> No fooling.
15:04:46 <tbarron> Then Stein itself will be released 10 April.
15:04:55 <tbarron> Next announcement.
15:05:05 <xyang> hi
15:05:09 <tbarron> Please welcome vhariria to manila!
15:05:16 <bswartz> April fools day release, nice
15:05:26 <jgrosso> vhariria: Welcome !!
15:05:32 <xyang> welcome!
15:05:32 <gouthamr> hey vhariria, welcome :)
15:05:36 <tbarron> Vida has joined Jason to help us control manila bugs downstream and up.
15:06:07 <ganso> if they say they have a release problem at that day, I am not sure if I would believe them
15:06:08 <vhariria> thanks everyone, good to be part of the team  :)
15:06:11 <tbarron> So you will be seeing/hearing more of her.
15:06:37 <tbarron> She is among other things an ansible expert :)
15:06:54 <tbarron> xyang: belated hi
15:07:03 <xyang> :)
15:07:03 <vhariria> tbarron: o/
15:07:11 <amito> vhariria: welcome
15:07:16 <tbarron> OK, any other announcements?
15:07:17 <bswartz> welcome
15:07:41 <tbarron> #topic PTG
15:07:54 <tbarron> reminder: our planning etherpad
15:08:10 <tbarron> #link gouthamr xyang toabctl bswartz ganso erlon tpsilva vkmc amito jgrosso
15:08:16 <tbarron> well that was funny
15:08:17 <tbarron> sec
15:08:20 <amito> :D
15:08:36 <bswartz> select buffer error?
15:08:50 <gouthamr> he just likes picking on us
15:08:52 <bswartz> Clipboard managers can do that
15:08:59 <tbarron> #link https://etherpad.openstack.org/p/manila-denver-train-ptg-planning
15:09:08 <tbarron> just making sure everyone is awake
15:09:29 <bswartz> Be careful not to anger the anti-spam bots
15:09:42 <tbarron> thanks to those who indicated they plan to attend
15:09:57 <tbarron> and reminder to others to do the same, in the etherpad
15:10:07 <tbarron> We're getting lots of topics to work on.
15:10:22 <tbarron> and atm we have 1.5 days of official time.
15:10:44 <tbarron> #link gouthamr xyang toabctl bswartz ganso erlon tpsilva vkmc amito jgrosso
15:10:49 <tbarron> woah, sorry
15:10:53 <tbarron> slow learner
15:11:05 <xyang> what dates are the PTG?
15:11:08 <bswartz> You really want the bot to kick you
15:11:13 <bswartz> #link https://www.openstack.org/ptg/
15:11:15 <tbarron> #link https://www.openstack.org/ptg/#tab_schedule
15:11:16 <bswartz> May 2-4
15:11:27 <ganso> lol
15:11:36 <tbarron> we are scheduled for the first day and a half but
15:11:43 <tbarron> according to
15:12:01 <tbarron> #link http://lists.openstack.org/pipermail/openstack-discuss/2019-March/003955.html
15:12:20 <tbarron> we will be able to dynamically request more time and space if we need to
15:12:42 <tbarron> I think we should try to be efficient with the 1.5 official days for us and
15:12:49 <bswartz> Wait a minute
15:12:59 <bswartz> May 2 is a Thursday
15:13:11 <tbarron> then evaluate whether we need more official space or whether we want to
15:13:17 <tbarron> use time in other sessions
15:13:50 <bswartz> Does the PTG extend into saturday?
15:13:55 <tbarron> bswartz: right, PTG is Thursday, Friday, Saturday
15:14:05 <bswartz> Okay that's odd
15:14:06 <tbarron> bswartz: yes
15:14:25 <tbarron> but manila is not officially scheduled on Saturday
15:14:33 <bswartz> Right
15:14:40 <tbarron> that calendar is somewhat tentative atm and could change
15:14:56 <tbarron> but I've seen no push towards moving us later
15:15:09 <tbarron> I'm not flying back till Sunday though
15:15:35 <tbarron> Anything else on PTG?
15:16:02 <tbarron> Who plans to be at summit itself? besides gouthamr, vkmc, and myself?
15:16:22 <xyang> I'll be there
15:16:31 <tbarron> xyang: awesome
15:17:01 <tbarron> Anything else on PTG or summit?
15:17:23 <tbarron> will any of you be at kubecon in barcelona?
15:17:43 <tbarron> just mapping out the face to face opportunities ...
15:17:49 <tbarron> ok
15:18:07 <tbarron> #topic python3
15:18:33 <tbarron> just a heads up that we have a review that proposes running all dsvm functional jobs under py3
15:18:36 <tbarron> on master
15:18:50 <tbarron> since we need to keep py2 compatibility through train
15:18:57 <bswartz> #link https://review.openstack.org/#/c/646037/
15:19:05 <tbarron> it adds a dummy back end py2 job as well
15:19:07 <tbarron> bswartz: ty
15:19:40 <tbarron> The good news is that the jobs are passing at least as well as they were with python 2
15:20:21 <xyang> tbarron: I'll be at Kubecon in barcelona
15:20:22 <tbarron> Maybe a bit more reliably since it also only runs the set of services that are needed to run and test the back end in question.
15:20:45 <tbarron> xyang: great, there's quite a bit of stuff I want to talk with you about!
15:20:52 <xyang> awesome!
15:21:11 <tbarron> Also it runs with the tls-proxy on every job.
15:21:14 <gouthamr> hey wait tbarron
15:21:33 <tbarron> If anyone has reservations about it, including gouthamr :) speak up
15:21:37 <gouthamr> i actually don't think that py3 change works :(
15:21:38 <gouthamr> http://logs.openstack.org/37/646037/3/check/manila-tempest-dsvm-postgres-zfsonlinux/6d55070/logs/screen-m-shr.txt.gz?level=ERROR
15:21:51 <tbarron> gouthamr: raining on the parade again :)
15:22:25 <tbarron> gouthamr: quick, -1 before I merge it!
15:22:36 <gouthamr> :) fun..
15:22:37 <tbarron> seriously, good catch
15:22:48 <tbarron> are you seeing that on other back ends than zfs?
15:23:49 <gouthamr> tbarron: nope, just started with that driver... will look at the others and confirm
15:24:18 <tbarron> well there's a bit more work to do I guess, but at least everyone knows the plan.
15:24:34 <gouthamr> yeah, same with generic
15:25:06 <gouthamr> and the py3 dummy driver job
15:25:08 <bswartz> Boy, if they ever release a python 4 we'll be in big trouble
15:25:15 <tbarron> worst case we have to revert those back to py2
15:25:37 <tbarron> gouthamr: why do the generic and dummy jobs report success then?
15:25:43 <tbarron> some of the generic and the dummy
15:26:08 <gouthamr> because we've supported py2.7 for a while now? :)
15:26:52 <tbarron> gouthamr: i mean, why are they reporting success while hitting this issue running under py3?
15:26:54 <gouthamr> could be a devstack/devstack-gate issue like the ones we fixed a while ago
15:27:30 <tbarron> ok, more work to do :0
15:27:32 <tbarron> :)
15:27:32 <bswartz> It would be nice if there were dsvms that were py3-only, so any attempt to use py2 would result in an error
15:27:57 <gouthamr> ++, they could begin by uninstalling py2
15:28:27 <tbarron> good luck getting devstack patches merged :)
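[editor's note: the py2 compatibility constraint discussed above usually comes down to normalizing bytes vs. text in one place. A minimal, illustrative sketch of that pattern — not manila's actual code:]

```python
import sys

# True when running under python 2, where `str` is a byte string.
PY2 = sys.version_info[0] == 2

def ensure_text(value, encoding='utf-8'):
    # On py2, literals and many API returns are bytes; on py3 they are
    # text. Decoding bytes here keeps the rest of the code version-agnostic.
    if isinstance(value, bytes):
        return value.decode(encoding)
    return value
```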
15:29:03 <tbarron> #topic bugs
15:29:06 <bswartz> We could just symlink /usr/bin/python2 to a shell script that creates a fork bomb
15:29:28 <tbarron> bswartz: please put it in review :)
15:29:41 <jgrosso> Hey all
15:30:11 <jgrosso> I just wanted to thank all of you for answering many of my questions
15:30:27 <jgrosso> we are getting closer to managing all the bugs
15:30:35 <jgrosso> I don't have a lot of new ones
15:30:43 <jgrosso> https://bugs.launchpad.net/manila/+bug/1822039
15:30:44 <openstack> Launchpad bug 1822039 in Manila "Tempest Tests in manila" [Undecided,New]
15:30:52 <jgrosso> this came in today
15:31:34 <ganso> another bug that is actually a feature?
15:31:58 <bswartz> I notice he's using py2
15:33:13 <bswartz> Anyone familiar with oslo config?
15:34:05 <gouthamr> slightly, interesting ... we do run those scenario tests on rocky, which OP seems to be using
15:34:31 <jgrosso> OP?
15:35:10 <gouthamr> jgrosso: bug reporter in this case :)
15:35:22 <bswartz> OP = original poster
15:35:32 <jgrosso> haha
15:35:35 <jgrosso> thanks
15:36:00 <jgrosso> any takers ? :)
15:36:18 <tbarron> I just ran 'tempest run -l' from /opt/stack/tempest on master and don't have this issue
15:36:39 <gouthamr> just a bunch of questions and links that we can share on the bug..
15:36:42 <tbarron> also 'tempest list-plugins'
15:36:58 <tbarron> Probably we want to point to:
15:37:13 <tbarron> #link https://docs.openstack.org/manila/latest/contributor/tempest_tests.html
15:37:27 <jgrosso> ok I will update the bug
15:37:29 <tbarron> and indicate that when we follow it step by step we don't hit this issue
15:37:43 <jgrosso> Awesome thanks tbarron
15:38:02 <jgrosso> https://bugs.launchpad.net/manila/+bug/1804208
15:38:03 <openstack> Launchpad bug 1804208 in Manila "scheduler falsely reports share service down" [High,New]
15:38:43 <jgrosso> Its high new and 4 months old
15:38:50 <tbarron> OP is carthaca
15:39:08 <tbarron> who is on family leave iirc
15:39:22 <carthaca> I'm still here ;)
15:39:30 <tbarron> carthaca: :)
15:39:47 <tbarron> you proposed an idea for a fix in that bug ^^^ 1804208
15:40:10 <tbarron> were you planning to code that up or are you otherwise occupied?
15:40:52 <tbarron> I think this is a pretty important bug.
15:41:00 <carthaca> I got a workaround - though I don't have enough pressure to fix it ^^
15:41:12 <carthaca> I increased service_down_time
15:41:26 <tbarron> B/c manila is going to be expected to scale to many more than 5 manila-share services
15:41:40 <tbarron> for edge architectures
15:41:59 <tbarron> jgrosso: let's note this bug under the Edge PTG topic
15:42:09 <jgrosso> sounds good
15:42:13 <tbarron> carthaca: fair enuf
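[editor's note: the false "service down" report in bug 1804208 stems from a heartbeat-based liveness check, which is why raising service_down_time works around it. A sketch of that style of check — names and the default value are illustrative, not manila's actual code:]

```python
from datetime import datetime, timedelta

def service_is_up(last_heartbeat, service_down_time=60):
    """A service counts as 'up' only if its most recent heartbeat is
    newer than service_down_time seconds. Increasing service_down_time
    (carthaca's workaround) tolerates delayed heartbeats at scale."""
    return datetime.utcnow() - last_heartbeat <= timedelta(seconds=service_down_time)
```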
15:42:33 <jgrosso> https://bugs.launchpad.net/manila/+bug/1746725
15:42:34 <openstack> Launchpad bug 1746725 in Manila "LVM driver is unable to remove addresses in different IP versions belonging to the same interface properly" [Undecided,New]
15:43:43 <jgrosso> Rodrigo responded but I am wondering if this will ever be merged to fix it
15:43:50 <jgrosso> https://review.openstack.org/#/c/476239/
15:44:19 <ganso> I think it needs to be re-tested
15:44:46 <ganso> the problem could have been fixed on the nfs package
15:45:23 <ganso> if not, it is something desirable to be fixed
15:45:30 <ganso> on our side
15:46:26 <jgrosso> ok so shall I ask Rodrigo to re-test?
15:46:55 <gouthamr> the steps to reproduce on that bug need to be a scenario test...
15:46:56 <tbarron> jgrosso: ask ganso whether you should ask rodrigo :)
15:47:17 <jgrosso> I was gently asking that in a round about way
15:47:21 <jgrosso> :)
15:47:31 <jgrosso> oh lol
15:47:36 <jgrosso> sorry had no idea
15:48:31 <jgrosso> well that is all I had for today
15:48:34 <jgrosso> :)
15:48:54 <jgrosso> ganso you want that bug ?
15:49:32 <tbarron> we're running the lvm job on ubuntu bionic now and may not see the centos bug there ...
15:49:39 <ganso> atm I don't have cycles to re-test it
15:50:00 <jgrosso> ok
15:50:19 <gouthamr> maybe worth extending http://specs.openstack.org/openstack/manila-specs/specs/release_independent/scenario-tests.html#share-access-with-multiple-guests to remove the first guest's rule and try to unmount and remount as the second guest and expect it to work
15:51:13 <jgrosso> I will update the defect to say retest when possible
15:51:16 <gouthamr> jgrosso: i can probably confirm this bug... although if bswartz's https://review.openstack.org/#/c/476239/ needs to be revived, it'd be quite some effort
15:51:56 <tbarron> first step is just to see if we still have the bug and confirm or mark it invalid
15:52:09 <jgrosso> if you can confirm that would be great
15:52:25 <gouthamr> ack
15:52:31 <jgrosso> thanks gouthamr
15:52:36 <tbarron> then backlog with no one assigned unless someone is ready to work on a fix
15:52:56 <tbarron> no point in assigning if people are too busy to actually work on the bug
15:53:01 <jgrosso> agreed
15:53:24 <jgrosso> that is all I had for today
15:53:30 <tbarron> jgrosso: thanks
15:53:50 <tbarron> jgrosso: and thanks for all the cleanup of old bugs that you have been doing!
15:54:00 <tbarron> #topic open discussion
15:54:11 <jgrosso> you're welcome
15:54:55 <gouthamr> yes, thank you jgrosso
15:55:14 <gouthamr> i'm cleaning up the API reference thanks to your diligent combing of all the old bugs :)
15:55:39 <tbarron> gouthamr: yes, I've seen a lot of patches posted on api ref
15:55:44 <tbarron> thanks!
15:55:47 <gouthamr> tbarron: more coming :)
15:55:56 <tbarron> that's awesome
15:56:12 <tbarron> it's a pleasure working with people who care and who are smart
15:56:13 <jgrosso> :)
15:56:24 <tbarron> Anything else today?
15:56:45 <tbarron> OK, thanks everyone!  See you in #openstack-manila
15:56:49 <tbarron> #endmeeting