14:00:02 <whoami-rajat> #startmeeting cinder
14:00:02 <opendevmeet> Meeting started Wed Nov  9 14:00:02 2022 UTC and is due to finish in 60 minutes.  The chair is whoami-rajat. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00:02 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00:02 <opendevmeet> The meeting name has been set to 'cinder'
14:00:04 <whoami-rajat> #topic roll call
14:00:26 <felipe_rodrigues> o/
14:00:36 <nahimsouza[m]> o/
14:00:36 <rosmaita> o/
14:00:38 <ganso> o/
14:00:43 <caiquemello[m]> o/
14:00:46 <enriquetaso> hi
14:01:33 <whoami-rajat> for the US and EMEA regions, the meeting may now be 1 hour earlier in local time since we stick to UTC
14:01:39 <whoami-rajat> #link https://etherpad.opendev.org/p/cinder-antelope-meetings
14:02:24 <whoami-rajat> If the new time is too early for anyone who would like to join, let me know
14:03:18 <e0ne> hi
14:03:31 <mubeen> hi
14:03:53 <tosky> hi
14:04:04 <whoami-rajat> good turnout we have, let's get started
14:04:10 <whoami-rajat> #topic announcements
14:04:23 <whoami-rajat> first, cinderlib Zed release deadline is 16 Dec 2022
14:04:46 <whoami-rajat> i was going to add it before the meeting but i think rosmaita added it earlier
14:04:53 <rosmaita> :)
14:05:00 <whoami-rajat> #link https://lists.openstack.org/pipermail/openstack-discuss/2022-November/031095.html
14:05:09 <whoami-rajat> so we've got the cinderlib release coming up
14:05:35 <whoami-rajat> rosmaita, would you like to give a current overview and elaborate on the gate situation?
14:05:43 <rosmaita> sure
14:05:58 <rosmaita> the short version is that cinderlib is completely broken in zed
14:06:24 <whoami-rajat> :(
14:06:30 * jungleboyj sneaks in late
14:06:30 <enriquetaso> oh no
14:06:41 <rosmaita> i'm pretty sure the problem is the db code changes in zed, the moving of the session onto the context, that's what has done it
14:06:54 <rosmaita> so, shouldn't be too bad to fix, but someone needs to do it
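(Aside for context: a minimal sketch of the Zed DB change rosmaita is describing. Cinder's DB layer moved to oslo.db's enginefacade pattern, where the decorator exposes the transaction as context.session instead of an explicitly created or passed session, which is the kind of thing older cinderlib code trips over. The Volume model below is a stand-in, not the real cinder model.)

```python
# Sketch only: the "session moved onto the context" pattern in oslo.db.
import sqlalchemy as sa
from sqlalchemy.orm import declarative_base
from oslo_db.sqlalchemy import enginefacade

Base = declarative_base()


class Volume(Base):  # stand-in for cinder.db.sqlalchemy.models.Volume
    __tablename__ = 'volumes'
    id = sa.Column(sa.String(36), primary_key=True)


# Old style (roughly pre-Zed): helpers created or accepted a session directly.
# def volume_get(context, volume_id, session=None):
#     session = session or get_session()
#     return session.query(Volume).filter_by(id=volume_id).first()

# New style (Zed): the decorator starts the transaction and injects it as
# context.session, so callers still expecting a bare session break.
@enginefacade.reader
def volume_get(context, volume_id):
    return context.session.query(Volume).filter_by(id=volume_id).first()
```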
14:07:28 <rosmaita> and then we need to get the gate fixed (current gate is still testing yoga, i think)
14:07:48 <rosmaita> i'm pretty sure there haven't been many zed cinderlib patches, it's been kind of quiet
14:07:58 <enriquetaso> do we have a bug for this to track it somewhere, or do we not need one?
14:07:59 <whoami-rajat> i think you had a patch that changed it but it never merged because of the gate failures
14:08:12 <rosmaita> so once the db stuff is fixed, i am hopeful that it will be smooth sailing
14:08:27 <rosmaita> enriquetaso: not sure, but you are right, we should be tracking it
14:08:36 <rosmaita> whoami-rajat: yes, i will dig up a link to the review
14:08:41 <whoami-rajat> +1 for tracking it
14:09:07 <rosmaita> https://review.opendev.org/c/openstack/cinderlib/+/848846
14:09:26 <whoami-rajat> #link https://review.opendev.org/c/openstack/cinderlib/+/848846
14:09:58 <rosmaita> i should get that patch out of merge conflict and re-run it to get a fresh set of logs
14:10:09 <enriquetaso> OK, i will open a bug
14:10:42 <whoami-rajat> title: "cinderlib gate is horribly broken"
14:10:48 <enriquetaso> lol
14:11:29 <rosmaita> i think it's just the unit tests having DB issues
14:11:29 <whoami-rajat> so i think we're looking good plan-wise but some actual progress would be good as well
14:11:34 <whoami-rajat> 16 december is not far ...
14:11:48 <whoami-rajat> rosmaita, let me know if i can help anywhere
14:12:20 <rosmaita> yeah, let's coordinate with geguileo when he gets back next week
14:12:34 <whoami-rajat> +1
14:12:49 <whoami-rajat> #action: enriquetaso to open a bug for cinderlib gate situation
14:12:59 <whoami-rajat> #action: rosmaita, geguileo and whoami-rajat to coordinate on fixing it
14:13:04 <tosky> and maybe see if it's possible to prevent future issues
14:13:49 <whoami-rajat> cinderlib depends a lot on cinder so unless there is some major change in cinder, cinderlib should be fine
14:14:11 <whoami-rajat> but we should consider cinderlib while reviewing major changes in cinder
14:15:18 <rosmaita> yeah, we run cinderlib functional tests on cinder changes, not sure why we didn't see any breakage
14:15:31 <rosmaita> i guess because the functional jobs were actually passing
14:15:44 <whoami-rajat> is it n-v in cinder gate?
14:15:47 * whoami-rajat checks
14:16:12 <rosmaita> i think cinderlib only needs to use the DB for certain drivers who (quite improperly) access the cinder db
14:16:30 <rosmaita> so the drivers used in our gate aren't going to hit that problem
14:17:43 <whoami-rajat> I'm unable to find a job running cinderlib tests, rosmaita do you know which one runs it?
14:18:11 <rosmaita> maybe in cinder-tempest-plugin, there are some jobs that run cinder and then also run cinderlib functional
14:18:15 <rosmaita> (i will look)
14:18:31 <whoami-rajat> ok found in cinder-tempest-plugin-lvm-lio-barbican
14:18:45 <whoami-rajat> https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_8ad/860997/3/check/cinder-tempest-plugin-lvm-lio-barbican/8ad5892/testr_results.html
14:18:55 <whoami-rajat> we run 17 tests in total here
14:19:01 <rosmaita> cinder-plugin-ceph-tempest also
14:19:23 <whoami-rajat> hmm, I'm unable to find it there https://b5fbbfefaf709464375e-0f35cc5e18a5457e4954b3967fd49fc8.ssl.cf2.rackcdn.com/860997/3/check/cinder-plugin-ceph-tempest/f01de21/testr_results.html
14:20:22 <rosmaita> yeah, that's weird, because the job is defined to use it: https://opendev.org/openstack/cinder/src/branch/master/.zuul.yaml#L153
14:20:35 <whoami-rajat> yep was checking that
14:20:39 <whoami-rajat> maybe something changed
14:21:49 <whoami-rajat> anyway, we can take a look later
14:23:08 <rosmaita> actually, that job did run the tests, looks like there was an error later when uploading logs: https://zuul.opendev.org/t/openstack/build/f01de213d6794d7b89e4521da94c6e3b/log/job-output.txt#26211
14:23:39 <whoami-rajat> ah ok, looks good then
14:23:45 <rosmaita> (at least i think that's the same job, those urls are crazy)
14:24:08 <tosky> I think it's a different job
14:24:22 <tosky> oh, no, it's that one
14:24:50 <tosky> so both the lvm-lio-barbican job (defined inside cinder-tempest-plugin) and the ceph one (defined in cinder.git) should run those tests
14:25:49 <rosmaita> ok, looks like that's working
14:26:12 <whoami-rajat> actually the tests are defined to run in cinder-tempest-plugin-lvm-barbican-base
14:26:19 <whoami-rajat> so any job inheriting from that should be running it
14:26:39 <whoami-rajat> so there are a few, but mostly we're concerned with lvm-lio-barbican and ceph-tempest
14:26:43 <whoami-rajat> https://github.com/openstack/cinder-tempest-plugin/blob/master/.zuul.yaml#L125
14:26:44 <tosky> that was the plan, yes
14:26:59 <whoami-rajat> ack
14:27:28 <rosmaita> looks like we could use some kind of canary tests that hit the db
14:27:39 <rosmaita> we can see if geguileo has any ideas
14:27:56 <whoami-rajat> let's discuss this again next week when geguileo is back
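(Aside: to illustrate the kind of DB-hitting canary rosmaita suggests, here is a rough sketch of a cinderlib check that is forced through the cinder DB layer by using cinderlib's DB persistence plugin rather than the default in-memory one. The persistence_config keys and the LVM driver options are taken from the cinderlib docs as assumptions to verify, not from the existing gate jobs.)

```python
# Rough canary sketch: exercise cinderlib against cinder's DB code by
# switching persistence from the in-memory default to the DB plugin.
import cinderlib

cinderlib.setup(persistence_config={
    'storage': 'db',
    'connection': 'sqlite:///cinderlib-canary.sqlite',
})

lvm = cinderlib.Backend(
    volume_driver='cinder.volume.drivers.lvm.LVMVolumeDriver',
    volume_group='cinder-volumes',
    target_protocol='iscsi',
    target_helper='lioadm',
    volume_backend_name='lvm',
)

# Creating and deleting a volume drives the persistence path, which is where
# the Zed session-to-context changes would surface.
vol = lvm.create_volume(size=1)
assert vol.status == 'available'
vol.delete()
```

Something along these lines run against cinder master would flag DB API drift early, though whether the existing functional jobs can simply be pointed at the DB persistence plugin is a question for geguileo.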
14:28:52 <whoami-rajat> ok moving on
14:28:59 <whoami-rajat> next announcement, Milestone 1 next week
14:29:08 <whoami-rajat> we had a few targets for M-1
14:29:17 * rosmaita hides
14:29:23 <whoami-rajat> :D
14:29:30 <whoami-rajat> I've reviewed one change in TusharTgite's reset state series
14:29:36 <whoami-rajat> he is working on those changes
14:29:54 <whoami-rajat> apart from that, the keystone team said they might be implementing the service role by M-1
14:30:01 <whoami-rajat> though I'm not sure about the current status
14:30:41 <whoami-rajat> that's all i can remember for now
14:31:08 <whoami-rajat> if anyone has a patch that is a priority for M-1, add it to the etherpad under this topic
14:31:47 <rosmaita> shoot, service role spec is still un-approved
14:32:03 <whoami-rajat> that's bad ...
14:32:25 <rosmaita> i'll bring it up at the TC meeting today, get some visibility
14:32:59 <whoami-rajat> rosmaita++
14:33:45 <whoami-rajat> moving on then
14:33:50 <whoami-rajat> next, Midcycle - 1 planning
14:34:00 <whoami-rajat> so we've got the midcycle coming up on 30th November
14:34:30 <whoami-rajat> which seems far away, but the deadlines come up very quickly
14:34:42 <whoami-rajat> so I've created an etherpad to add topics for it
14:34:48 <whoami-rajat> #link https://etherpad.opendev.org/p/cinder-antelope-midcycles
14:35:26 <whoami-rajat> so I'm encouraging everyone to add your topics to the etherpad
14:36:06 <whoami-rajat> I've already added 2 so you don't have to be the first one
14:37:14 <whoami-rajat> let's move to the last announcement
14:37:20 <whoami-rajat> Bug Deputy and Stable release manager
14:37:23 <whoami-rajat> first Bug deputy
14:37:38 <whoami-rajat> enriquetaso has been doing a great job over the past few cycles
14:38:05 <jungleboyj> ++
14:38:38 <whoami-rajat> although I would like her to continue the work, I also would like to ask whether she is willing to continue it or has other plans
14:39:08 <rosmaita> ++ for enriquetaso from me too
14:39:43 <whoami-rajat> it doesn't have to be a prompt response, we can discuss it
14:40:08 <enriquetaso> sure
14:40:17 <enriquetaso> lol
14:40:27 <whoami-rajat> great, maybe next week you can update with your response
14:40:35 <enriquetaso> If anyone else would like to take the position, that's also fine
14:40:55 <enriquetaso> i don't mind continuing
14:41:01 <rosmaita> let's be clear, we are concerned about over-working you, not about the job you are doing, which is fantastic
14:41:15 <whoami-rajat> exactly what rosmaita said!
14:41:22 <enriquetaso> \o/
14:41:32 <jungleboyj> :-)
14:41:42 <jungleboyj> Giving you the option out if you need a break.
14:42:40 <whoami-rajat> rosmaita and jungleboyj explain it better than i do, but that's what i wanted to convey
14:43:03 <jungleboyj> whoami-rajat: :-)
14:43:06 <enriquetaso> thanks
14:43:29 <whoami-rajat> so let's discuss this again next week
14:43:42 <whoami-rajat> coming to Stable release manager
14:44:04 <whoami-rajat> Jon doesn't seem to be around today but he became stable release maintainer last cycle
14:44:13 <whoami-rajat> and has done an excellent job
14:44:29 <whoami-rajat> he was able to carry out stable releases for all active branches
14:44:55 <whoami-rajat> and also recently cut the final wallaby release before moving it to EM
14:45:24 <whoami-rajat> since he's not around, we can discuss this next week as well
14:45:47 <whoami-rajat> but he's doing a good job and it would be good if he continues
14:46:04 <whoami-rajat> anyway, moving on to the topics now
14:46:07 <whoami-rajat> #topic Request reviews for new Pure Storage replication feature
14:46:10 <whoami-rajat> simondodsley, that's you
14:46:33 <whoami-rajat> #link https://review.opendev.org/c/openstack/cinder/+/862365
14:46:50 <simondodsley> Yep - new replication feature. Passes all Pure CI and Zuul. Just need some core eyes on it
14:48:38 <whoami-rajat> since this is a driver feature, the deadline is M-3 so i would keep it a little lower on my priority list (we've got a lot for M-1 and M-2)
14:48:52 <whoami-rajat> but don't want to discourage anyone from reviewing it ^
14:48:56 <simondodsley> OK - move on then
14:48:58 <whoami-rajat> please take a look
14:49:45 <whoami-rajat> if you are a driver vendor, we would appreciate your reviews on driver changes like these ^
14:50:09 <whoami-rajat> next topic
14:50:11 <whoami-rajat> #topic Request for re-review on new patchset
14:50:17 <whoami-rajat> ganso, that's you
14:50:27 <ganso> o/
14:50:36 <ganso> so just a request for re-review on that patch
14:50:38 <ganso> I addressed the comments
14:50:42 <ganso> whoami-rajat: thanks for the review btw!
14:50:48 <whoami-rajat> ack, will take a look
14:50:50 <whoami-rajat> np
14:50:54 <ganso> #link https://review.opendev.org/c/openstack/cinder/+/812685
14:51:11 <ganso> if other core reviewers could chime in, would be awesome! thanks in advance!
14:51:13 <rosmaita> i will take a look too
14:51:21 <rosmaita> (for realz this time)
14:51:29 <whoami-rajat> it's been sitting there for a long time and is important to fix for the glance multi-store case
14:51:47 <whoami-rajat> great
14:51:52 <whoami-rajat> last topic then
14:51:56 <whoami-rajat> #topic using cinderclient with older Block Storage API versions
14:51:59 <whoami-rajat> rosmaita, that's you
14:52:36 <rosmaita> yeah, walt found a bug earlier this week when using zed cinderclient with wallaby
14:53:02 <rosmaita> i think i figured out what's going on, but we don't really test that scenario at all
14:53:40 <rosmaita> anyway, the commit message gives my theory of what's happening, and i added a unit test for it
14:53:57 <whoami-rajat> #link https://review.opendev.org/c/openstack/python-cinderclient/+/864027
14:54:02 <rosmaita> thanks!
14:54:28 <rosmaita> so, please review, and there may be some other cases where we will hit this issue
14:55:00 <rosmaita> though it may not be worth worrying much about if we move to the openstackclient for CLI
14:55:38 <rosmaita> that's all, and thanks to walt for testing the patch in his environment
14:55:51 <whoami-rajat> yeah, that reminds me i need to work on that ^
14:56:15 <whoami-rajat> thanks hemna- and rosmaita for fixing this
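(Aside: for anyone hitting the mixed-version scenario above before the fix is released, the usual workaround is to pin the Block Storage API microversion to one the older cloud supports instead of relying on the Zed client's negotiation. A hedged sketch follows; the endpoint, credentials, and the 3.64 value, believed to be the Wallaby maximum, are placeholders to verify against the server's version document.)

```python
# Hedged sketch: pin an explicit microversion when a newer python-cinderclient
# talks to an older Block Storage API. All auth values are placeholders.
from keystoneauth1 import identity
from keystoneauth1 import session as ks_session
from cinderclient import client as cinder_client

auth = identity.Password(auth_url='https://keystone.example:5000/v3',
                         username='demo', password='secret',
                         project_name='demo',
                         user_domain_id='default',
                         project_domain_id='default')
sess = ks_session.Session(auth=auth)

# 3.64 is assumed to be the Wallaby maximum; check the server's version
# document rather than trusting client-side negotiation.
cinder = cinder_client.Client('3.64', session=sess)
print(cinder.volumes.list())
```

The CLI equivalent is passing --os-volume-api-version 3.64 (or exporting OS_VOLUME_API_VERSION) so the zed client stays within what the wallaby API advertises.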
14:57:00 <whoami-rajat> we're done with topics so let's move to open discussion for 4 minutes
14:57:04 <whoami-rajat> #topic open discussion
14:59:23 <whoami-rajat> guess nothing else to discuss
14:59:32 <whoami-rajat> thanks everyone for joining
14:59:36 <whoami-rajat> #endmeeting