14:00:37 #startmeeting glance
14:00:37 Meeting started Thu Jul 1 14:00:37 2021 UTC and is due to finish in 60 minutes. The chair is abhishekk. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00:37 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00:37 The meeting name has been set to 'glance'
14:00:39 o/
14:00:42 #topic roll call
14:00:46 #link https://etherpad.openstack.org/p/glance-team-meeting-agenda
14:00:46 o~
14:00:52 o/
14:01:12 o/ (though I am not really paying attention)
14:01:17 let's wait a couple of minutes for others
14:01:26 wait, we're supposed to pay attention?
14:01:30 this job keeps getting harder
14:01:52 o/
14:02:31 cool
14:02:35 let's start
14:02:46 #topic Updates
14:02:54 vPTG is announced - October 18-22, 2021!
14:03:05 #link https://openinfra-ptg.eventbrite.com
14:03:15 Registration is open as well
14:03:29 #topic release/periodic jobs update
14:03:47 M2 is 2 weeks from now
14:03:53 Same goes for spec freeze
14:04:38 The policy refactoring spec is waiting for reviews, kindly have a look (it already has 2 +2s and 1 -1)
14:04:59 We need to get it merged in 2 weeks
14:05:29 Periodic jobs are all green, so we haven't broken anything yet :D
14:05:39 #topic M2 Target progress check
14:05:46 Unified quota spec
14:05:56 Good progress and most of the patches are merged
14:06:05 we have docs and a release note up as well
14:06:18 I got distracted from the cache API work, should have a patch up today
14:06:26 ack
14:06:58 For the cache API, the glanceclient spec needs reviews so that it can be merged
14:07:15 I think we should move client support for the cache API to M3
14:07:23 kk
14:07:34 and only target server-side changes in this milestone
14:07:50 moving ahead
14:07:59 #topic Policy refactoring
14:08:05 Master spec: https://review.opendev.org/c/openstack/glance-specs/+/796753
14:08:32 There are some concerns raised by jokke_ and answers have been provided as well
14:08:39 jokke_, kindly have a look to get it moving
14:08:46 Tests refactoring lite spec: https://review.opendev.org/c/openstack/glance-specs/+/797593/1
14:09:25 The patch for this change is already in the gate, I will move this dependency on the policy refactoring spec so that we can get this merged
14:09:35 Authorization layer and its use?
14:09:48 I am seeing that most of the concerns are about the authorization layer
14:10:21 AFAIK we have similar kinds of checks in the db layer as well
14:10:45 I am going to write a lite spec which will highlight how we are going to move out of the authorization layer
14:10:48 are you thinking about the hard-coded "is admin or owner" checks on db update?
14:11:16 we already have it on delete
14:11:34 so we can have a similar one on update as well, no?
14:11:40 dansmith: yeah, update and destroy are covered in the sqlalchemy api
14:12:13 yeah, this: https://github.com/openstack/glance/blob/master/glance/db/sqlalchemy/api.py#L135
14:12:19 yeah
14:12:48 that's obviously going to have to change in order to do secure rbac
14:13:53 Let's keep secure rbac out as of now
14:14:54 So AFAIK we have these mutation checks in the db for metadefs also
14:15:49 Let's discuss this on the lite spec, I will try to have it up for review by tomorrow EOD
14:16:34 anything else?
14:17:09 I take that as a no, moving ahead
14:17:11 #topic Bi-weekly Bug discussion (4th Meet)
14:17:49 While analyzing the metadef code for policy-refactor-related changes I found that a couple of APIs do not have full client support
14:17:55 glanceclient has no support to add/specify a description while creating a metadef object using the md-object-create command.
14:18:24 #link https://docs.openstack.org/api-ref/image/v2/metadefs-index.html?expanded=create-object-detail,create-property-detail,create-namespace-detail#create-object
14:18:40 and the second one is more critical
14:18:57 glanceclient has no support to add 'type' while creating a md-property for a namespace using the md-property-create command
14:19:09 is there a bug open for these 2 issues?
14:19:24 So 'type' is a required property for creating metadef properties
14:19:31 croelandt, no, need to report it
14:19:51 I found these issues around midnight
14:20:02 #link http://paste.openstack.org/show/807083/
14:20:34 So both of these issues need to be reported and fixed on priority
14:21:10 Also I am now wondering about the usage of the metadef APIs
14:21:38 I think that just tells a quite obvious story about how widely those commands are used ;)
14:21:48 exactly
14:22:05 our devstack uses the glance-manage command to load the metadefs from a json file
14:22:31 I am not sure how others are using it (or if they are really using it)
14:23:14 There might be more issues around metadefs, I will have a closer look at the code
14:23:28 That's it from me today
14:23:34 moving to Open discussion
14:23:39 #topic Open discussion
14:24:34 Nothing from me
14:24:48 abhishekk: I think those two you mentioned above are great bugs to include in your list of low-hanging fruit and not necessarily something we need to spend our time on urgently
14:24:52 croelandt, I have mentioned these issues in our bug tracker as well
14:25:09 Hi jokke_, any update on the swift bugs I raised?
14:25:24 abhishekk: thanks
14:25:41 jokke_: yeah, they could go into the "for interns" pile :D
14:26:10 rajiv_mucheli_: I have the revert of that change that broke the trusts waiting in review. Thanks for figuring that out
14:26:17 jokke_, I also thought about that, let's see
14:26:31 rajiv_mucheli_: we can move on with the rest once we get over that bump
14:27:22 oh ok, we tried a few code changes to fix bulk deletes but had no luck, would increasing the delete_concurrency in swift help?
14:28:28 rajiv_mucheli_: I'm not sure. Either that or we need to start throttling the bulk deletes on the Glance side to give swift room to get them done.
14:29:55 rajiv_mucheli_: I think the delete_concurrency is a good place to start, as in general swift is very good with a lot of parallel operations and performs better that way than serialized
14:30:58 But I'm not sure how far that needs to be stretched if we're talking about a 400GiB NFV image or something, so we might need to throttle it anyway from our side
14:32:04 anything else, guys? Otherwise we can use the remaining time for reviews
14:32:54 okay, I will try this
14:33:44 cool, let's wrap this up
14:33:49 thank you all!
14:33:55 have a nice weekend ahead
14:34:09 #endmeeting
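
Note on the "is admin or owner" discussion under the Policy refactoring topic: the check dansmith links to in glance/db/sqlalchemy/api.py is a hard-coded db-layer mutation check. The following Python sketch only illustrates the kind of logic being discussed; it is not the actual Glance source, and the function name and attributes are assumptions.

    # Illustrative sketch only (not the actual Glance code) of a
    # hard-coded admin-or-owner mutation check in the db layer.
    def is_image_mutable(context, image):
        """Return True if the request context may modify this image."""
        if context.is_admin:           # admins may mutate anything
            return True
        if context.owner is None:      # anonymous requests may never mutate
            return False
        return image["owner"] == context.owner  # otherwise only the owner

As noted in the discussion, update and destroy paths guarded by checks of this shape are what would have to change to support secure RBAC, and similar mutation checks exist in the db layer for metadefs as well.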
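
Note on the two glanceclient gaps raised under the bug topic: the metadefs REST API itself (per the api-ref linked at 14:18:24) accepts a description when creating an object and requires a type when creating a property; only the client commands lack a way to pass them. The sketch below shows the API being exercised directly with the requests library; the endpoint URL, token, namespace, and field values are placeholders, while the request paths and field names follow the api-ref.

    # Rough illustration of calling the metadefs API directly for the two
    # fields the client commands cannot set; URL, token, and names below
    # are hypothetical placeholders.
    import requests

    GLANCE_URL = "http://glance.example.com:9292"    # placeholder endpoint
    HEADERS = {"X-Auth-Token": "<keystone-token>",   # placeholder token
               "Content-Type": "application/json"}
    NS = "MyNamespace"                               # placeholder namespace

    # md-object-create cannot pass a description, but the API accepts one.
    requests.post(
        f"{GLANCE_URL}/v2/metadefs/namespaces/{NS}/objects",
        headers=HEADERS,
        json={"name": "MyObject",
              "description": "set directly via the API"})

    # md-property-create cannot pass 'type', although the API requires it.
    requests.post(
        f"{GLANCE_URL}/v2/metadefs/namespaces/{NS}/properties",
        headers=HEADERS,
        json={"name": "my_property",
              "title": "My property",
              "type": "string"})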