14:00:07 <whoami-rajat> #startmeeting cinder
14:00:07 <opendevmeet> Meeting started Wed May 4 14:00:07 2022 UTC and is due to finish in 60 minutes. The chair is whoami-rajat. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00:07 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00:07 <opendevmeet> The meeting name has been set to 'cinder'
14:00:15 <whoami-rajat> #topic roll call
14:00:23 <rosmaita> o/
14:00:23 <hemna> yough
14:00:28 <fabiooliveira> o/
14:00:33 <geguileo> hi! o/
14:00:43 <caiquemello[m]> o/
14:00:51 <eharney> hi
14:01:18 <jungleboyj> o/
14:01:21 <tosky> o/
14:01:32 <enriquetaso> hi
14:02:06 <simondodsley> hi
14:02:46 <whoami-rajat> great turnout
14:02:49 <whoami-rajat> let's get started
14:02:55 <whoami-rajat> #topic announcements
14:03:02 <whoami-rajat> openstack-tox-docs failure
14:03:12 <whoami-rajat> the openstack-tox-docs job started breaking last week because of a change in oslo.policy
14:03:18 <whoami-rajat> #link https://review.opendev.org/c/openstack/oslo.policy/+/830514
14:03:23 <whoami-rajat> It was fixed by
14:03:28 <whoami-rajat> #link https://review.opendev.org/c/openstack/oslo.policy/+/839711
14:03:42 <whoami-rajat> and released with this patch
14:03:46 <whoami-rajat> #link https://review.opendev.org/c/openstack/releases/+/839775
14:04:23 <whoami-rajat> the reason cinder didn't see this failure will be discussed later (I've added a topic for it)
14:04:43 <rosmaita> because cinder is the BEST!
14:05:15 <whoami-rajat> :D there might be changes needed (if we agree to what I'm seeing in other projects) ...
14:05:19 <whoami-rajat> but let's discuss that later
14:05:26 <whoami-rajat> anyone has anything else for announcements?
14:06:10 <whoami-rajat> looks like not, so we can get started with topics
14:06:19 <whoami-rajat> #topic Ceph min client version set to Mimic: enable it by default
14:06:22 <whoami-rajat> enriquetaso, that's you
14:06:35 <enriquetaso> Hello, argonauts,
14:06:48 <enriquetaso> After the problems we have testing RBD upstream, I think it's best to enable the Mimic client by default.
14:06:55 * enriquetaso lost the link
14:07:04 <enriquetaso> #link https://review.opendev.org/c/openstack/devstack-plugin-ceph/+/782624/
14:07:20 <enriquetaso> I know it is a safety mechanism to disable it by default, but the goal is to test Ceph in all our upstream CIs.
14:07:26 <enriquetaso> For backward compatibility, the patch includes a check line #493
14:08:44 <whoami-rajat> i think the initial time this change was done, it broke glance but glance is now fixed (with a workaround) so maybe this is safe now?
14:09:19 <tosky> do we know if it can affect manila?
14:09:49 <enriquetaso> I don't think our ci jobs affect manila
14:10:33 <tosky> enriquetaso: if we enable it by default, every user of devstack-plugin-ceph will be affected
14:10:57 <eharney> it could affect manila, i'm not sure if it does in any meaningful way or not
14:11:10 <tosky> I mean, they should at least know
14:11:18 <tosky> for the record, we also need to ask Francesco and vkmc to also add the same feature in the new codepath which deploys with cephadm in any case, even more if it's true by default
14:11:21 <vkmc> tosky, seems it's set to false in the patch, so it's not enabled by default
14:11:44 <vkmc> can be toggled in ci as needed I guess
14:11:49 <tosky> vkmc: the proposal is to turn it on by default
14:11:55 <vkmc> ah
14:11:55 <tosky> that's what's being discussed
14:12:19 <whoami-rajat> IIUC nova, glance, manila, cinder are the consumers of devstack-plugin-ceph so those projects need to be consulted/considered
14:12:48 <eharney> nova and glance are tested in the CI jobs on this patch, right?
14:13:19 <enriquetaso> yes, it should be
14:13:30 <eharney> as is manila
14:15:08 <enriquetaso> I think it would be of great benefit for us to activate it to start testing it properly.
14:15:28 <eharney> yes, this is the configuration we want to target, so we should have devstack configure it by default
14:17:13 <whoami-rajat> have we tested that it breaks any job with the config option defaulted as True?
14:17:33 <eharney> it looks like that should be the next step
14:17:58 <whoami-rajat> ok
14:18:08 <eharney> well, i pushed for it to not be a True/False setting, but, conceptually anyway
14:19:25 <enriquetaso> if nobody is against it, I can update the patch and we could continue the discussion on the patch
14:19:57 <whoami-rajat> sounds reasonable
14:20:18 <enriquetaso> cool
14:20:22 <enriquetaso> thanks
14:20:26 <whoami-rajat> you can update the patch and check for any potential failures, and from there we can continue this discussion
14:20:38 <enriquetaso> makes sense
14:20:54 <whoami-rajat> great
14:20:57 <whoami-rajat> ok, so let's move on to our next topic
14:21:02 <whoami-rajat> #topic stable-cores: backport Zuul job playbook fix
14:21:06 <whoami-rajat> rosmaita, that's you
14:21:18 <rosmaita> hi
14:21:33 <rosmaita> tosky fixed a problem in some of our zuul jobs
14:21:44 <rosmaita> we need to backport to all stable branches where appropriate
14:21:45 <jungleboyj> \o/
14:21:57 <rosmaita> because we are sometimes reporting success for failed jobs
14:22:01 <rosmaita> and nobody wants that
14:22:05 <whoami-rajat> #link http://tiny.cc/cinder-legacyfacade
14:22:17 <rosmaita> no, not that link
14:22:17 <tosky> just one job, the ceph one on the cinder.git repository
14:22:21 <whoami-rajat> sorry wrong link
14:22:22 <whoami-rajat> #link https://review.opendev.org/q/topic:fix-reporting
14:22:29 <rosmaita> that's it
14:22:47 <rosmaita> if you look at the patch to master, i left a comment there to show you that tosky's fix works
14:22:52 <rosmaita> (in case you had any doubt)
14:22:56 <enriquetaso> tosky++
14:23:13 <rosmaita> but it would be good to get these backported as soon as possible before we let something slip past the goalie
14:23:22 <rosmaita> that's all
14:24:02 <whoami-rajat> ok, this looks important to get in, not good to have wrong reporting status introducing more issues than we already have
14:24:20 <whoami-rajat> so stable cores can take a look
14:24:37 <jungleboyj> rosmaita: I can get these merged.
14:24:45 <rosmaita> works for me!
14:24:51 <whoami-rajat> jungleboyj++
14:25:04 <whoami-rajat> and tosky++
14:25:25 <whoami-rajat> ok, let's move on to the next topic then
14:25:31 <whoami-rajat> #topic remove legacy oslo.db enginefacade patches need more reviews!
14:25:36 <whoami-rajat> rosmaita, that's you again!
14:26:00 <rosmaita> not much to say other than REVIEW! REVIEW! REVIEW!
14:26:15 <rosmaita> and you can use this groovy dashboard: http://tiny.cc/cinder-legacyfacade
14:26:36 <whoami-rajat> I'm reviewing them and honestly the formatting is killing me, especially on the long patches
14:26:55 <rosmaita> it's semi-black, so it's the wave of the future
14:27:21 <whoami-rajat> I've left some comments on the last patch, i can move on with the review if those are addressed
14:27:22 <eharney> the pattern of bonus commas everywhere is quite painful and just feels wrong
14:27:23 <rosmaita> and the stacked parameter lists will help when we mypy those files
14:27:34 <whoami-rajat> but it would be great if others participate and get it done faster!
14:27:50 <whoami-rajat> rosmaita, excluding the extra commas ...
14:28:18 <rosmaita> i think in parameter lists the commas are wrong, but elsewhere i don't care that much
14:28:34 <whoami-rajat> i think there's also a patch to refactor the whole cinder codebase to follow black's formatting but I saw quite a lot of negative reviews there (one is from me as well)
14:28:34 <rosmaita> but i don't think they are wrong enough to revise the patches and have to review all this ... stuff
14:28:50 <rosmaita> again
14:29:21 <whoami-rajat> yeah, rebasing that chain is not good for CI
14:30:10 <whoami-rajat> ok, so coming back to the main point, please review the patches proposed by Stephen
14:30:15 <rosmaita> anyway, please review so we can get the legacy enginefacade gone before we absolutely have to
14:30:17 <eharney> does that mean we also won't think the extra commas in method calls aren't wrong enough to patch them away later?
14:30:51 <rosmaita> i don't think so, you can fix it when you mypy those files :D
14:31:01 <eharney> can do
14:31:23 <rosmaita> works for me
14:31:47 <whoami-rajat> great, so we can get rid of the unnecessary formatting as well, more incentive to review mypy
14:32:16 <whoami-rajat> moving on to the next topic
14:32:22 <whoami-rajat> #topic Yoga "Known Issues" followup patches need reviews
14:32:23 <geguileo> eharney: I would normally -1 a patch for those...
14:32:34 <eharney> geguileo: me too...
14:32:55 <whoami-rajat> rosmaita, that's you again :D
14:33:48 <rosmaita> #link https://review.opendev.org/c/openstack/cinder/+/839451
14:34:14 <rosmaita> just want to mention that it addresses a known issue in the Yoga release, so let's get it merged soon and backport it
14:34:49 <rosmaita> geguileo is also working on the nvme-of known issues from Yoga
14:35:09 <rosmaita> geguileo: do you have a link to your stuff that needs reviews?
14:35:51 <whoami-rajat> great, good to get these issues resolved earlier so we need to do fewer backports
14:36:03 <geguileo> rosmaita: I'm changing the patches for the nvmeof stuff
14:36:22 <rosmaita> ok, we can have a weekly "known issues" checkin
14:36:23 <geguileo> rosmaita: I went back and I'm redoing the big ones :'-(
14:36:28 <rosmaita> not trying to rush you
14:36:38 <rosmaita> and now i don't feel bad about not reviewing them yet!
14:37:09 <rosmaita> jbernard is working on the cgroup v1->v2 issue 14:37:39 <whoami-rajat> sounds like a good idea if we've a lot of patches fixing known issues, i think fixing nvmeof issues is a long series itself 14:37:44 <rosmaita> and i think enriquetaso is working on a ceph related issue 14:37:47 <rosmaita> #link https://docs.openstack.org/releasenotes/cinder/yoga.html#known-issues 14:38:27 <rosmaita> ok, that's it from me 14:38:44 <rosmaita> for real this time, i don't think i have any more agenda items 14:39:43 <whoami-rajat> they were all pointing to important things that we should actively review, thanks rosmaita for keeping check on them 14:39:58 <hemna> the db reviews are quite a lot 14:40:05 <rosmaita> yeah, tell me about it 14:41:12 <whoami-rajat> guess there are too many items to review now, so core reviewers please take a look ^^ 14:41:29 <hemna> https://review.opendev.org/c/openstack/cinder/+/831247 this one too please :) 14:42:03 <whoami-rajat> yep, we decided at the PTG that this one is important to get fixed as well ^ 14:42:17 <whoami-rajat> so moving on to the last topic 14:42:20 <hemna> at some point I'd like to discuss the idea of volume transaction tracking 14:42:32 <hemna> oh sorry, I thought we were done. 14:42:36 <hemna> ignore more 14:43:00 <whoami-rajat> we can continue in open discussion and we've midcycle coming up on 1st june 14:43:07 <whoami-rajat> #topic Move install_command constraints to deps 14:43:19 <whoami-rajat> that's me 14:43:31 <whoami-rajat> so Cinder was not affected with the openstack-tox-docs failure because we take packages from upper constraints and not released versions 14:43:56 <rosmaita> not sure what that means? 14:44:03 <whoami-rajat> which is good but we use install_command for that instead of the deps section of a particular job 14:44:40 <whoami-rajat> rosmaita, so when oslo.policy 3.12.0 (or whatever the failing version was), it wasn't merged in upper-constraints file in requirements project 14:45:06 <rosmaita> right, so it's not a problem, cinder is doing the right thing 14:45:10 <whoami-rajat> so projects installing from released versions saw this failing job but projects using upper-constraints file were unaffected 14:45:13 <rosmaita> geguileo fixed this way back 14:45:13 <whoami-rajat> yes 14:45:19 <whoami-rajat> but there is more to it 14:45:46 <whoami-rajat> we use the install_command which is generic for all jobs 14:45:53 <rosmaita> it's not installing from released versions, it's installing from the development version 14:45:58 <whoami-rajat> instead of the deps section which is specific to every job 14:46:19 <rosmaita> i think geguileo explains this on his commit message pretty clearly: https://opendev.org/openstack/cinder/commit/5ad7fe92690b29a05c6284aa52e01913b5b7cf76 14:47:17 <geguileo> in my opinion we should always honor the upper constraints 14:47:23 <geguileo> that's the whole point of having them 14:47:27 <rosmaita> exactly 14:47:44 <tosky> I don't want to reopen old discussions, but iirc I remember a long global series of patches to remove the usage of install_command 14:48:09 <geguileo> I'm ok if there's another way to do it 14:48:35 <geguileo> but installing cinder with packages that don't honor upper constraints is without a doubt BAAAAAAAAAAAAAAAD 14:48:56 <geguileo> and I'll -2 any patch that tries to do it 14:48:56 <jungleboyj> ++ 14:48:59 <whoami-rajat> i think i should have worded the topic better as it created a lot of confusion here 14:49:28 <whoami-rajat> we are using upper constraints, which is correct, 
the point I'm trying to convey is to do it otherwise, i.e. not from install_commands but from deps 14:49:47 <rosmaita> yes, except you can't do it by deps 14:49:56 <whoami-rajat> some of the reasons listed here on this patch seemed valid like passing multiple contraint files only takes the first one 14:50:04 <whoami-rajat> #link https://review.opendev.org/c/openstack/heat/+/818802 14:50:16 <rosmaita> i'm not sure that's correct 14:50:28 <rosmaita> geguileo looked into this closely (while i watched) 14:50:40 <rosmaita> with usedevelop=True you get a two-phase install 14:50:56 <rosmaita> the way we have it defined, pip uses the same install command both times 14:51:07 <rosmaita> and both times it has the upper-constraints 14:51:38 <tosky> but shouldn't other projects show the same issue? 14:51:51 <rosmaita> they do, that's why they broke 14:51:57 <rosmaita> or, they are ignoring it 14:52:22 <rosmaita> it only showed up in a really weird case when we couldn't figure out why cinderlib was using an unconstrained library 14:53:00 <geguileo> yup, that drove me crazy 14:53:02 <rosmaita> a lot of times nothing will break ... but when it does, it's a real PITA to track down 14:53:11 <geguileo> +1 to that 14:53:20 <rosmaita> yeah, we wasted a lot of time on that, and i don't want to do it again 14:53:56 <rosmaita> and if they don't have usedevelop=True, then it's not a problem 14:54:42 <whoami-rajat> ok, i don't have any strong views on that, whichever way seems reasonable to the team works 14:55:09 <geguileo> just to understand the issue, was it being suggested to do the same thing as that patch? 14:55:37 <whoami-rajat> geguileo, you mean the heat patch? 14:55:38 <tosky> let me find a link 14:55:43 <geguileo> yes 14:55:56 <geguileo> is that was we are discussing? 14:56:11 <rosmaita> that heat patch does not look correct to me 14:56:12 <whoami-rajat> yeah, I've seen other projects move away from install_commands and the reasons on the heat one sounded most reasonable to me so took that example 14:56:29 <tosky> I don't remember all the discussion, but there are some hints here: https://review.opendev.org/c/openstack/nova/+/684775/ 14:57:37 <rosmaita> yeah, that patch is not correct, you will get the problem geguileo tracked down 14:58:09 <geguileo> if nova is not hitting the issue, good for them... 14:58:32 <geguileo> since cinder will get bitten by this if we change it, we have to keep it 14:58:49 <rosmaita> yes 14:58:50 <tosky> ok, but if there were other issues which lead to that recommendation, can't we at leat raise the question globally and have a general guideline? 14:59:00 <geguileo> maybe we should add a comment to our tox.ini, since this may not be clear to other people reading the file 14:59:34 <whoami-rajat> tosky, yeah, this seems to be missing an openstack wide guideline 14:59:34 <rosmaita> yeah, i guess you have to do a "blame" to see the reasoning 14:59:46 <whoami-rajat> geguileo, that would be good since it wasn't clear to me at least ... 15:00:30 <whoami-rajat> ok, we are out of time 15:00:46 <whoami-rajat> thanks everyone for joining! 15:00:47 <whoami-rajat> #endmeeting
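
For reference on the final topic: the constrained-install pattern being defended is, roughly, a tox.ini install_command that applies OpenStack's upper constraints to every pip invocation, including the second install phase triggered by usedevelop = True. The snippet below is a minimal sketch of that pattern, not a verbatim copy of Cinder's tox.ini (see the commit by geguileo linked above for the real thing); the TOX_CONSTRAINTS_FILE variable and constraints URL shown are the conventional OpenStack ones.

    [testenv]
    usedevelop = True
    # Every pip run, both the deps install and the editable install of the
    # project itself, goes through this command, so all resolved packages
    # stay pinned to the upper-constraints file.
    install_command = python -m pip install -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} {opts} {packages}

As discussed above, moving the -c option into each testenv's deps list (the approach in the linked heat patch) constrains only the packages listed in deps; with usedevelop = True, the second (editable) install of the project runs without constraints, which is the unconstrained-library problem geguileo tracked down in cinderlib.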