15:01:12 <gouthamr> #startmeeting manila
15:01:12 <opendevmeet> Meeting started Thu Jul 15 15:01:12 2021 UTC and is due to finish in 60 minutes.  The chair is gouthamr. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:01:12 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:01:12 <opendevmeet> The meeting name has been set to 'manila'
15:01:28 <carloss> o/
15:01:31 <ashrodri> o/
15:01:41 <vhari> hi
15:01:55 <tbarron> hi
15:02:09 <dviroel> o/
15:02:38 <gouthamr> courtesy ping: ganso vkmc felipe_rodrigues ecsantos vhari
15:02:42 <felipe_rodrigues> o/
15:02:47 <gouthamr> o/ hello folks
15:02:48 <vkmc> o/
15:02:50 <vkmc> hi everyone
15:02:57 <archanaserver> o/
15:02:58 <caiquemello> o/
15:03:17 <gouthamr> here's the agenda for today's meeting: https://wiki.openstack.org/wiki/Manila/Meetings#Next_meeting
15:03:25 <gouthamr> let's begin with
15:03:30 <gouthamr> #topic Announcements
15:03:45 <gouthamr> it's milestone-2 today
15:03:49 <gouthamr> #link https://releases.openstack.org/xena/schedule.html
15:04:05 <gouthamr> #link https://launchpad.net/manila/+milestone/xena-2
15:04:46 <gouthamr> ^ if your name's against any of these bugs, and you're not working on them, please comment on the bug and move the target forward
15:05:07 <gouthamr> our new driver submission deadline is in two weeks
15:05:21 <gouthamr> and Feature Proposal Freeze is on Aug 6
15:05:41 <gouthamr> so we're slowly inching towards the tail end of the xena development cycle
15:07:08 <gouthamr> i've signed us up for space and time at the upcoming virtual project team gathering event
15:07:21 <gouthamr> #link https://ethercalc.openstack.org/8tum5yl1bx43 (Tentative Schedule for the Yoga PTG)
15:07:55 <gouthamr> it's a tentative schedule, and we have the flexibility to move things around
15:08:44 <gouthamr> lots of empty spaces on the ethercalc :) but this is FCFS (first come, first served)
15:09:46 <gouthamr> as has been the case in the past, we may move some sessions around to other team rooms
15:11:23 <gouthamr> i'll send an email on the ML with some more details, and a planning etherpad to collect ideas/topics
15:11:49 <gouthamr> a number of us are participating in the grace hopper celebration summer open source day today
15:12:24 <gouthamr> it's a day-long event, so you could see some code review action around EOD
15:12:46 <gouthamr> that's all the announcements i had, anyone else got any?
15:14:24 <gouthamr> #topic Manila UI stable/queens to EOL?
15:15:04 <gouthamr> so the horizon team's planning to EOL their stable/queens branches
15:15:09 <gouthamr> #link https://review.opendev.org/c/openstack/releases/+/799543 (Horizon moving stable/queens to eol)
15:15:37 <gouthamr> and they want to let us determine if manila-ui's stable/queens branch has to be EOL'ed at the same time
15:16:22 <gouthamr> noticed this via code review, and thought i could stop some wasted effort..
15:16:24 <gouthamr> #link https://review.opendev.org/c/openstack/manila-ui/+/800544 (Patch to move horizon jobs to manila-ui, but is this necessary?)
15:16:51 <gouthamr> there's really little benefit if horizon's moving to eol while manila-ui continues to accept patches
15:17:29 <gouthamr> queens is a 3.5 year old release
15:18:08 <gouthamr> and i am motivated to pursue discussions to eol the stable/queens branches on the manila repos themselves
15:18:51 <gouthamr> but, the immediate question is how we want to proceed with manila-ui
15:19:14 <gouthamr> does anyone have any concerns with us asking the horizon team to eol the stable/queens branch of manila-ui as well?
15:20:37 <gouthamr> ^ manila-ui is jointly maintained by the horizon team and us
15:20:55 <gouthamr> and the last time we had a patch on the queens branch here was exactly one year ago:
15:21:18 <gouthamr> #link https://review.opendev.org/q/project:openstack/manila-ui+branch:stable/queens (backports to stable/queens on manila-ui)
15:21:45 <vkmc> ++
15:21:53 <vkmc> I think this should be a joint effort
15:22:07 <vkmc> if they are eol, it doesn't make sense we keep maintaining that branch
15:22:12 <vkmc> in manila-ui
15:22:24 <carloss> I don't have any concerns :)
15:22:29 <vkmc> perhaps one of us could follow this up with them
15:22:35 <vkmc> gouthamr, do you want me to chase that?
15:23:41 <gouthamr> vkmc: yep! vishalmanchanda was pursuing this, so we can let him know
15:24:07 <vkmc> gouthamr, cool!
15:24:28 <gouthamr> vkmc: if we don't want to do this in https://review.opendev.org/c/openstack/releases/+/799543 ; you can upload a similar patch to tag just manila-ui
15:24:35 <gouthamr> and add me to review :)
15:24:56 <vkmc> gouthamr, perfect
15:25:02 <gouthamr> thanks vkmc++
15:25:17 <gouthamr> alright anything else regarding $topic?
15:25:23 <haixin> what does eol mean?
15:25:42 <vkmc> end of life
15:25:55 <vkmc> :D
15:25:57 <gouthamr> +1, the branch will be retired
15:26:02 <haixin> i know..
15:26:24 <gouthamr> the phases are described here:
15:26:33 <gouthamr> #link https://docs.openstack.org/project-team-guide/stable-branches.html (Stable Branches and Policies)
15:27:32 <gouthamr> so at eol, we 1) create a git tag, 2) abandon all pending changes to the respective branch, 3) pursue the deletion of the branch
15:27:56 <gouthamr> step 1) is done via the openstack/releases repository, like https://review.opendev.org/c/openstack/releases/+/799543
15:28:44 <gouthamr> step 2) for stable branches will need someone from the manila-stable-core, or stable-maint teams, or patch owners to manually abandon the pending gerrit changes
15:29:14 <gouthamr> step 3) will require asking the friendly folks at #opendev-infra to clean up the branch
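For reference, step 1) typically amounts to a small patch in the openstack/releases repository adding a queens-eol tag to the deliverable file. A rough sketch of what that patch could look like, assuming the usual deliverable layout (the hash is a placeholder, not a real commit):

    # deliverables/queens/manila-ui.yaml (sketch; hash is a placeholder for
    # the final commit on stable/queens)
    releases:
      - version: queens-eol
        projects:
          - repo: openstack/manila-ui
            hash: 0000000000000000000000000000000000000000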
15:30:05 <gouthamr> so the real impact here is if anyone was pulling code from that branch
15:30:23 <gouthamr> since the branch will disappear from the sources (opendev.org, github.com), they'll need to stop relying on it
15:31:13 <gouthamr> generally, distributions like RDO watch changes to older branches, and possibly have work to clean up stuff on their end when we retire older stable branches
15:31:43 <gouthamr> it's a lot of sausage making being described here :P
15:33:09 <gouthamr> on a related note, i'd like to propose a similar retirement of the stable/queens branch on manila and python-manilaclient, and signal the intent to retire stable/rocky and stable/stein as well
15:33:29 <tbarron> +1
15:34:34 <gouthamr> we discussed this at the PTG and we seem to be in agreement, i'll collect some thoughts on the mailing list
15:35:17 <gouthamr> any other comments/concerns about this?
15:35:54 <ashrodri> ++
15:36:09 <gouthamr> good stuff, let's move to regular programming
15:36:25 <gouthamr> #topic Reviews needing attention
15:36:25 <gouthamr> #link https://etherpad.opendev.org/p/manila-xena-review-focus
15:37:01 <gouthamr> we've some good reviews on the last spec to merge: https://review.opendev.org/c/openstack/manila-specs/+/775198
15:37:23 <gouthamr> carloss vkmc dviroel haixin, et al ^ can you please look as well
15:37:46 <vkmc> gouthamr, sure thing
15:38:06 <dviroel> yep
15:38:31 <gouthamr> i reiterate, i treat spec reviews a bit differently than code - i usually wait for as many reviews as possible there, so we have a chance to work through the design collectively
15:39:09 <haixin> ok
15:39:14 <gouthamr> if you're new to this community, this is your cue :D you'll love specs - they lay out problems and solutions in a way that'll probably pique your interest as opposed to reading code
15:39:15 <carloss> sure
15:39:20 <gouthamr> thanks folks
15:39:39 <gouthamr> simondodsley: we're chatting here, this is the weekly irc sync
15:40:08 <gouthamr> so regarding the pure driver review, we've been unable to get the CI to pass of late
15:40:13 <gouthamr> #link https://review.opendev.org/c/openstack/manila/+/789384
15:40:34 <gouthamr> #link http://lists.openstack.org/pipermail/openstack-discuss/2021-July/023631.html
15:41:10 <gouthamr> thanks for running tests and posting on the mailing list, carloss
15:41:21 <carloss> my pleasure gouthamr
15:41:58 <simondodsley> note that before the devstack changes to OVN we were getting the CI to pass
15:42:24 <gouthamr> so if you're not aware, something we haven't been able to nail down is broken in devstack, and it prevents vms within the devstack from accessing the world outside the devstack node
15:42:26 <simondodsley> so the scenarios do work with this driver
15:42:54 <gouthamr> so if you have bright ideas, do share them on the review
15:43:10 <simondodsley> what I'd like, whilst this issue is being resolved, is to get an exception for the vendor CI pass and get the driver merged
15:44:12 <dviroel> is it mandatory to have scenario tests passing as a criterion to merge?
15:44:52 <gouthamr> it's mandatory to run the scenario tests, yes - although this failure of infrastructure is bothersome
15:44:53 * dviroel doesn't mean that we shouldn't pursue the issue, of course
15:45:36 <gouthamr> simondodsley: you've of course tested the data paths, you've a vested interest in making sure this works
15:46:02 <gouthamr> ^ any concerns with allowing this exception, while we pursue what broke with devstack?
15:46:31 <simondodsley> yes - all the data paths have been tested, and I will continue to work on a resolution, as the issue will otherwise stop the CI from passing any other reviews
15:46:59 <gouthamr> tbarron dviroel carloss: you've signed up to review, what do you think?
15:48:23 <carloss> I'm okay with the exception - simondodsley has been working on the driver for some time, and was always responsive in the reviews I saw
15:49:24 <dviroel> me too
15:50:22 <gouthamr> awesome thank you, simondodsley - we'll do the exception
15:50:34 <gouthamr> thanks for all your hard work resolving this issue
15:50:48 <tbarron> ok by me
15:51:05 <gouthamr> it's now a common concern for most third party CIs, so we'll chase this down with the neutron/OVN folks
15:51:08 <simondodsley> thanks team. appreciate it.
15:51:23 <dviroel> simondodsley++
15:51:28 <carloss> simondodsley++
15:51:44 <gouthamr> great, any other reviews on https://etherpad.opendev.org/p/manila-xena-review-focus that need to be discussed here?
15:52:22 <gouthamr> haixin tbarron: it is possible that https://review.opendev.org/c/openstack/manila/+/789702 exposes an underlying issue
15:52:34 <tbarron> yes
15:52:44 <tbarron> I think exposes rather than causes
15:53:01 <gouthamr> i haven't looked at the loopingcall code (although at first glance the code paths look very similar) closely yet
15:53:07 <tbarron> but am open to counter-argument
15:54:13 <tbarron> I wonder if those tasks only take so long in a resource-constrained devstack environment
15:54:15 <tbarron> dunno
15:54:19 <gouthamr> it's possible
15:54:35 <gouthamr> there was a thread on the ML where this occurred in prod
15:54:45 <gouthamr> #link http://lists.openstack.org/pipermail/openstack-discuss/2019-September/009534.html ([oslo] run outlasted interval)
15:54:58 <gouthamr> i didn't see a resolution however
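For context, a minimal sketch of the oslo.service looping-call pattern under discussion; the task name and timings here are illustrative, not taken from the manila change:

    import time

    from oslo_service import loopingcall

    def periodic_task():
        # if this body outlasts the interval, oslo.service logs a
        # "run outlasted interval" warning instead of waiting idle
        time.sleep(7)  # simulated slow work, longer than the 5s interval

    timer = loopingcall.FixedIntervalLoopingCall(periodic_task)
    timer.start(interval=5).wait()  # loops until timer.stop() is called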
15:55:12 <tbarron> good find
15:55:52 <gouthamr> i don't see this in the tempest jobs in cinder where a similar approach is followed
15:56:37 <gouthamr> another thing to check is if this occurs on all the CI jobs
15:56:44 <tbarron> gouthamr: do you know if we see this consistently?
15:56:47 <tbarron> jinx
15:57:11 <gouthamr> i didn't look beyond one of the dummy driver jobs that i linked you folks to
15:57:27 * gouthamr looks at the clock
15:57:52 <tbarron> dummy driver job probably is running lots of concurrent stuff with no backend slowing stuff down
15:58:03 <tbarron> maybe more contention but let's check other jobs
15:58:29 <gouthamr> +1
15:58:49 <gouthamr> haixin: would like your thoughts as well; but lets chat directly on the change
15:59:25 <gouthamr> vhari: sorry we're going to skip bug triage today
15:59:39 <vhari> gouthamr, np .. no new bugs this week :)
15:59:45 <gouthamr> oh that's good news :)
16:00:07 <gouthamr> alright, everyone - no time for Open Discussion either.
16:00:20 <gouthamr> if you'd like to chat about something, please hop on over to #openstack-manila
16:00:37 <gouthamr> have fun at GHC/OSD if you're attending, and see you here next week!
16:00:42 <gouthamr> thanks!
16:00:51 <gouthamr> #endmeeting