14:00:01 <whoami-rajat> #startmeeting cinder
14:00:01 <opendevmeet> Meeting started Wed Jan  4 14:00:01 2023 UTC and is due to finish in 60 minutes.  The chair is whoami-rajat. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00:01 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00:01 <opendevmeet> The meeting name has been set to 'cinder'
14:00:03 <whoami-rajat> #topic roll call
14:00:09 <enriquetaso> hi
14:00:09 <simondodsley> o/
14:00:30 <eharney> hi
14:00:32 <harshailani> hi
14:00:33 <Mounika> hi
14:00:34 <mubeen> hi
14:01:13 <whoami-rajat> #link https://etherpad.opendev.org/p/cinder-antelope-meetings
14:01:43 <whoami-rajat> Happy new year everyone!
14:01:44 <jungleboyj> o/
14:01:51 <jungleboyj> Happy New Year!
14:02:21 <rosmaita> o/
14:02:46 <harshailani> Happy New Year :)
14:03:01 <Mounika> Happy New Year!
14:03:01 <whoami-rajat> a good number of people are around after the break
14:03:18 <whoami-rajat> let's get started
14:03:24 <whoami-rajat> #topic announcements
14:03:37 <whoami-rajat> first, Status of Specs (deadline 23rd December, 2022)
14:03:44 <whoami-rajat> so we're past the spec deadline
14:04:01 <whoami-rajat> there were 3 specs proposed out of which 1 merged
14:04:10 <whoami-rajat> which is Extend in-use FS volumes
14:04:25 <whoami-rajat> the next one is, Encrypted Backup support
14:04:43 <whoami-rajat> but that currently has 2 -1s and I think Gorka is still not back from break
14:05:16 <whoami-rajat> last one is New backup field for allowing backups during live migration
14:05:26 <whoami-rajat> which was recently proposed, i think around year end
14:05:56 <simondodsley> Well that is well past spec freeze so it should wait
14:06:14 <whoami-rajat> yes correct
14:06:27 <whoami-rajat> we could've considered them but they're far from merging right now
14:06:27 <jungleboyj> ++
14:06:49 <whoami-rajat> and extending it would only conflict with our next deadlines
14:06:56 <whoami-rajat> still I'm open to suggestions if the team thinks otherwise
14:06:57 <eharney> iirc we still need to assess whether fernet is a good solution for encrypted backups
14:07:17 <whoami-rajat> that's a good point
14:07:39 <eharney> there were some good concerns raised that haven't really been analyzed yet
14:07:56 <happystacker> hey guys! Happy new year to all the community
14:08:15 <happystacker> and especially my beloved core reviewers ;-)
14:09:36 <whoami-rajat> so it makes sense to push them to next cycle, I will do the procedural -2 later today
14:09:43 <whoami-rajat> thanks for the discussion
14:09:44 <whoami-rajat> happystacker, happy new year!
14:10:10 <whoami-rajat> let's move to the next announcement
14:10:13 <whoami-rajat> Driver Merge deadline 20th Jan, 2023
14:10:42 <whoami-rajat> the deadline would've been this week but I shifted it based on past cycle experience so we have time to review them
14:10:59 <whoami-rajat> ok, let's discuss the drivers quickly
14:11:01 <whoami-rajat> 1) HPE XP
14:11:07 <whoami-rajat> #link https://review.opendev.org/c/openstack/cinder/+/815582
14:11:20 <whoami-rajat> I did a CI check yesterday and it hasn't responded yet
14:11:40 <whoami-rajat> I used the same comment they did in os-brick
14:11:48 <whoami-rajat> so not sure what's wrong
14:12:05 <simondodsley> is abdi: here?
14:12:17 <whoami-rajat> regarding the code part, it has inherited everything from the hitachi driver, even supports the same features as the parent driver
14:12:50 <simondodsley> it's the same device, just rebadged
14:13:16 <whoami-rajat> ah, that makes sense now
14:14:23 <whoami-rajat> ok, I've left a comment for the author to check the CI
14:15:16 <simondodsley> rosmaita do you have your CI checklist somewhere?
14:15:19 <whoami-rajat> in the meantime, feel free to review the driver, the tests make up a good chunk of the LOC so that could be reviewed
14:15:29 <rosmaita> slightly off-topic, but whoami-rajat it would be a good idea to send something to the ML reminding vendors that we are requiring CI on os-brick changes for antelope
14:15:43 <rosmaita> #link https://lists.openstack.org/pipermail/openstack-discuss/2022-August/030014.html
14:15:49 <rosmaita> follow-up to that ^^
14:16:14 <rosmaita> simondodsley: yes, i should put that up in an etherpad or something
14:16:24 <happystacker> btw, I have a few code changes I'd expect to be merged into Antelope, can someone take a look at them if I give you the list?
14:16:27 <rosmaita> "that" == CI checklist
14:16:43 <whoami-rajat> rosmaita, good idea, i will reply to that thread
14:17:05 <rosmaita> sounds good
14:17:13 <whoami-rajat> happystacker, we can discuss that towards the end in the open discussion
14:17:28 <happystacker> sounds good, preparing the code change IDs
14:17:44 <simondodsley> happystacker, put them in the etherpad
14:17:55 <happystacker> will do
14:18:05 <whoami-rajat> happystacker, even better, I've added a review request section at the end, you can put the list there
14:18:11 <whoami-rajat> all ^
14:18:22 <happystacker> ok Rajat, working on it
14:18:39 <whoami-rajat> great
14:18:45 <whoami-rajat> so moving on to the next driver
14:18:58 <whoami-rajat> 2) Fungible NVMe TCP
14:19:03 <whoami-rajat> #link https://review.opendev.org/c/openstack/cinder/+/849143
14:19:24 <whoami-rajat> I've reviewed it today, they seem to be doing something with a default volume type that they've defined
14:19:39 <whoami-rajat> I'm not sure what the idea is there, but they will probably clarify
14:20:04 <whoami-rajat> also I don't know if we support NVMe with TCP or not
14:20:17 <whoami-rajat> from os-brick perspective
14:20:29 <simondodsley> yes - TCP and RoCE are supported. FC is not
14:20:54 <whoami-rajat> oh that's good
14:21:06 <happystacker> what's the etherpad link again?
14:21:15 <whoami-rajat> happystacker, https://etherpad.opendev.org/p/cinder-antelope-meetings
14:21:15 <simondodsley> FC is delayed as Red Hat don't want Gorka working on it as they don't see their customers wanting it
14:21:48 <simondodsley> which is a shame for other vendors that do, but don't have the os-brick skillset...
14:22:24 <whoami-rajat> ok, i think it also depends on the vendor which protocol they use with nvme? do we have drivers (existing/upcoming) that would want nvme with FC?
14:22:41 <simondodsley> Pure want it
14:23:12 <rosmaita> simondodsley: just out of curiosity, is that for customers who already have FC, or completely new customers?
14:23:38 <simondodsley> usually for existing customers. New ones tend to be greenfield and will go TCP or RoCE
14:23:57 <rosmaita> thanks, that was my intuition, but i wanted a reality check
14:24:37 <whoami-rajat> good to know we have a use case for NVMe FC support
14:25:09 <whoami-rajat> ok, that's all for this driver as well
14:25:14 <felipe_rodrigues> Hi guys
14:25:42 <felipe_rodrigues> NetApp wants to deliver a NVMe/TCP driver for the next release
14:25:58 <whoami-rajat> these are the 2 drivers we have for this cycle, reviewers please take a look
14:25:59 <felipe_rodrigues> We are working downstream on it.. probably, sending the patch upstream end of this week
14:26:17 <felipe_rodrigues> is it still possible?
14:26:33 <whoami-rajat> felipe_rodrigues, if it's for next release, sounds good to me
14:26:55 <whoami-rajat> for this cycle, it needs to have a working CI and all the driver guidelines satisfied
14:28:02 <felipe_rodrigues> They are not available yet because the work is private. The patch is a medium one, since it is just about connection handling (initialize and terminate connection).
14:28:54 <felipe_rodrigues> I mean, the CI/patch would be available by the end of this week. Is it possible to have it reviewed and merged for Antelope, or is it too late?
14:29:18 <whoami-rajat> the deadline is 20th Jan so we've enough time
14:29:43 <felipe_rodrigues> I see..
14:29:59 <felipe_rodrigues> Let's see if it's possible, thank you so much!
14:30:14 <enriquetaso> remember that we are requiring CI on os-brick changes for antelope
14:30:50 <whoami-rajat> the idea is to have the driver and CI in a working state and respond quickly to review comments, to have a better chance of making it into the cycle
14:31:00 <whoami-rajat> enriquetaso++
14:31:29 <whoami-rajat> ok, the last announcement I have is for midcycle 2
14:31:34 <whoami-rajat> Midcycle-2 Planning
14:31:40 <whoami-rajat> #link https://etherpad.opendev.org/p/cinder-antelope-midcycles#L33
14:31:57 <whoami-rajat> we hadn't finalized the date in the beginning since there were conflicts
14:32:09 <whoami-rajat> currently I'm proposing 18th Jan, which is the week after next
14:32:19 <whoami-rajat> do we have any known conflicts for that date?
14:32:56 <whoami-rajat> it is a 2 hour video meet
14:32:57 <whoami-rajat> it will be on wednesday and will overlap with 1 hour of the cinder upstream meeting
14:33:20 <simondodsley> i'm not available then, but that shouldn't stop you
14:34:36 <rosmaita> no conflicts from me, and at the risk of insulting someone, doesn't look like there's a national holiday in any of the major locations on that day
14:34:58 <harsh> yea no conflict with me as well
14:35:21 <enriquetaso> no conflicts from me either (but i'm also available any other day)
14:35:40 <whoami-rajat> cool, let's fix this date for now and discuss this again next week
14:35:48 <whoami-rajat> in the meantime, please add topics
14:36:26 <whoami-rajat> that's all i had for announcements
14:36:31 <whoami-rajat> anyone has anything else?
14:37:50 <whoami-rajat> looks like not, let's move to topics then
14:37:55 <whoami-rajat> #topic tox jobs failing in the stable branches
14:37:58 <whoami-rajat> rosmaita, that's you
14:38:35 <rosmaita> yeah, we are having tox 4 failures in all stable branches
14:38:37 <whoami-rajat> #link https://review.opendev.org/q/topic:tox-4-postponed
14:38:51 <rosmaita> those are the patches ^^ that should fix it
14:39:11 <rosmaita> at least for now, they don't use tox 4
14:39:32 <happystacker> ok so we'll continue to use tox 3 for now?
14:39:41 <rosmaita> yes, in the stable branches
14:39:48 <happystacker> ok
14:39:53 <rosmaita> you can't use tox4 with python 3.6
14:39:54 <whoami-rajat> rosmaita, one question i had, i only see os-brick and cinder patches, don't we require it in cinderclient or other cinder projects?
14:40:35 <rosmaita> yeah, i was waiting until the cinderclient patch to master was working
14:40:40 <happystacker> Do we have an estimate of when the switch to tox4 will happen?
14:40:41 <rosmaita> which doesn't make sense, now that i think about it
14:40:51 <rosmaita> happystacker: december 23, 2022
14:41:06 <whoami-rajat> rosmaita, ah we still have cinderclient change open?
14:41:08 <rosmaita> i can put up patches for the cinderclient stable branches too
14:41:13 <whoami-rajat> i mean the master one
14:41:29 <happystacker> december 23 has passed
14:41:31 <rosmaita> yeah, i think i figured out what was happening, and put up a new patch
14:41:50 <whoami-rajat> ok, i remember, it was failing gate
14:41:57 <rosmaita> take a look at this real quick: https://zuul.openstack.org/stream/6997323310894f48855e4b9139b26f10?logfile=console.log
14:41:57 <harsh> yes
14:42:12 <rosmaita> the functional-py38 are passing now, but it's going to timeout
14:42:20 <harsh> gate was failing but that was due to openstacksdk failures
14:42:22 <whoami-rajat> happystacker, the tox4 migration has happened and that's why we're seeing the gate breaking in stable branches; for master, rosmaita already fixed it
14:42:38 <rosmaita> harsh: this is a different issue (i think)
14:42:43 <harsh> oh ok
14:42:45 <whoami-rajat> harsh, that's a different issue which is fixed now
14:42:51 <whoami-rajat> but related to tox4
14:42:55 <harsh> yea
14:43:02 <happystacker> mmh, ok
14:43:02 <rosmaita> maybe fungi is around?
14:43:19 <fungi> i am
14:43:24 <rosmaita> the func-py38 has been sitting for >20 min
14:43:43 <rosmaita> fungi: happy new year! can you take a look at https://zuul.openstack.org/stream/6997323310894f48855e4b9139b26f10?logfile=console.log
14:44:14 <rosmaita> looks like the job is stuck?  hopefully it's not something on the cinder side?
14:45:12 <rosmaita> this is the patch being checked: https://review.opendev.org/868317
14:45:16 <fungi> what does a run of that job normally look like after that point? is it maybe collecting files or compressing something?
14:45:45 <rosmaita> fungi: should look just like the func-py39 job, i think
14:45:52 <rosmaita> https://zuul.opendev.org/t/openstack/build/9ba34208232b4db49c48ceac72716c98
14:45:57 <fungi> subunit file analysis?
14:46:32 <whoami-rajat> it proceeded
14:46:51 <rosmaita> fungi: maybe, guess i should wait until it actually reports back to zuul
14:47:29 <rosmaita> anyway, that's all from me ... i'll put up the cinderclient stable branch patches later today; in the mean time, we need to merge the cinder/os-brick stable patches
14:47:48 <fungi> yeah, would be interesting to see what it was doing during that quiet period, but often it will be something like the executed commands generated waaaaay more logs than expected or tons more subunit attachments or something
14:48:13 <fungi> also whether subsequent builds of the same job pause in the same place
14:48:15 <rosmaita> i'll put something on the midcycle agenda about my reasons for not backporting the master branch changes, and we can discuss
14:48:50 <whoami-rajat> sounds good, always up for topics
14:49:03 <rosmaita> fungi: thanks, i'll put up a patch that removes the tempest test so we can get quicker response
14:49:50 <whoami-rajat> I've reviewed all the changes to pin tox<4; it is a straightforward 2-line change which the TC also suggested, so it should be easy to review and will fix our gate
14:49:53 <whoami-rajat> other cores ^
14:50:13 <whoami-rajat> posting the link again to patches
14:50:15 <whoami-rajat> #link https://review.opendev.org/q/topic:tox-4-postponed
14:50:47 <whoami-rajat> rosmaita, anything else on this topic?
14:50:58 <rosmaita> nothing from me
14:51:17 <whoami-rajat> great, thanks for bringing this up
14:51:24 <whoami-rajat> next topic
14:51:32 <whoami-rajat> #topic Unit test will fail with python 3.11
14:51:34 <whoami-rajat> enriquetaso, that's you
14:51:38 <enriquetaso> hello
14:51:40 <whoami-rajat> #link https://lists.openstack.org/pipermail/openstack-discuss/2023-January/031655.html
14:51:44 <enriquetaso> Quoting the official bug: "An unfortunately common pattern over large codebases of Python tests is for spec'd Mock instances to be provided with Mock objects as their specs. This gives the false sense that a spec constraint is being applied when, in fact, nothing will be disallowed."
14:51:50 <enriquetaso> #link https://github.com/python/cpython/issues/87644
14:52:08 <enriquetaso> Just mentioning it because this would affect our future python 3.11 job CI.
14:52:12 <whoami-rajat> do we have a debian job somewhere so we can reproduce this in gate?
14:52:12 <eharney> i worked on this some before the winter break, have at least one patch posted
14:52:22 <eharney> just run tox -e py311 to repro
14:52:23 <enriquetaso> We need to update at least 250 tests from different drivers.
14:52:38 <simondodsley> ouch
14:52:40 <enriquetaso> i've reproduced this with docker (python3.11 debian image)
14:52:46 <enriquetaso> Thomas Goirand discovered this and opened a bug report to track the work:
14:52:47 <whoami-rajat> ah so it fails even in other distros
14:52:51 <eharney> it has nothing to do with Debian...
14:52:52 <enriquetaso> #link https://bugs.launchpad.net/cinder/+bug/2000436
14:53:22 <enriquetaso> it's related to python 3.11 and later
14:53:23 <eharney> this is a change in Python
14:53:48 <eharney> anyway, fixing it is not particularly hard, but it will involve shuffling around a lot of mocks in unit tests, so it's laborious
14:54:10 <whoami-rajat> ok, i got confused from the bug report
14:54:13 <enriquetaso> yes.. i think we don't have plans to have a 3.11 job yet
14:54:29 <eharney> the plans are: we definitely need it to work at some point, so we should fix it :)
14:54:34 <whoami-rajat> for antelope, the runtime is 3.8 and 3.10 but we need to be ready for next cycle runtimes
14:55:21 <eharney> the new restriction is, basically, you can't mock a mock now, so make the mock once in the unit tests
14:55:46 <eharney> https://review.opendev.org/c/openstack/cinder/+/867824 is the first fix i submitted for this
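For reference, a minimal sketch of the mock-as-spec pattern that Python 3.11 rejects and the usual fix, using only unittest.mock; FakeBackendClient is a hypothetical stand-in for illustration, not code from the cinder tree:

    from unittest import mock


    class FakeBackendClient:
        """Hypothetical stand-in for a driver's backend client."""

        def create_volume(self, name, size):
            pass


    # Broken pattern: using an existing Mock as the spec for another mock.
    # Python 3.11 raises unittest.mock.InvalidSpecError for this; older
    # versions silently accepted it, so the "spec" constrained nothing.
    already_mocked = mock.Mock()
    try:
        mock.create_autospec(already_mocked)
    except Exception as exc:  # InvalidSpecError on Python 3.11+
        print('rejected:', exc)

    # Fix: create the mock once, spec'd against the real class, and reuse
    # it instead of re-mocking a mock.
    client = mock.create_autospec(FakeBackendClient, instance=True)
    client.create_volume('vol1', 1)
    client.create_volume.assert_called_once_with('vol1', 1)

The same idea applies to spec= arguments in the affected driver unit tests: point the spec at the real class or attribute rather than at a Mock created earlier in the test.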
14:56:14 <enriquetaso> cool, i think we can use the bug number to track all the fixes or use a tag if needed
14:56:43 <rosmaita> we should probably add a non-voting py3.11 job
14:56:48 <eharney> the bug number is less important than a patch that turns on 3.11 and depends-on: patches
14:56:55 <eharney> right
14:56:56 <whoami-rajat> rosmaita++
14:57:00 <enriquetaso> ++
14:57:04 <rosmaita> i thought that was going to happen as part of the antelope template
14:57:05 <enriquetaso> okay!
14:57:22 <rosmaita> but there were other issues that came up ... i can ask at the TC meeting later today
14:57:40 <rosmaita> in any case, we can do it ourselves in cinder, i think
14:58:00 <whoami-rajat> maybe they plan to keep 3.10 for another cycle and add the n-v template next cycle, but not sure
14:58:23 <eharney> also, fwiw, i just ran "tox -e py38" this morning and am seeing 3.11 failures in there
14:58:25 <whoami-rajat> rosmaita, yep, even a DNM should be good to track the failing tests
14:58:33 <eharney> maybe some tox4 weirdness?
14:58:52 <rosmaita> i hope not
14:59:04 <rosmaita> but probably so
14:59:40 <eharney> it appears to be running the wrong version of python in that env :/
15:00:26 <enriquetaso> topics are not available in gerrit anymore?
15:00:41 <eharney> i think they are
15:00:46 <whoami-rajat> we're out of time, let's continue this next week; in the meantime, hopefully we will get some fixes in
15:00:54 <whoami-rajat> also want to mention the review request section
15:01:03 <whoami-rajat> there are a bunch of review requests, so please take a look at them
15:01:10 <whoami-rajat> #link https://etherpad.opendev.org/p/cinder-antelope-meetings#L121
15:01:19 <whoami-rajat> thanks everyone!
15:01:22 <whoami-rajat> #endmeeting