13:59:26 #startmeeting glance
13:59:26 Meeting started Thu May 11 13:59:26 2023 UTC and is due to finish in 60 minutes. The chair is pdeore. Information about MeetBot at http://wiki.debian.org/MeetBot.
13:59:26 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
13:59:26 The meeting name has been set to 'glance'
13:59:27 #topic roll call
13:59:27 #link https://etherpad.openstack.org/p/glance-team-meeting-agenda
13:59:31 o/
14:00:06 croelandt, abhishekk I was thinking to wait for today; if there are no objections then I will set the +W
14:01:06 shall we start? I think everyone is here :)
14:01:07 o/
14:01:19 o/
14:01:40 ok, let's start
14:01:42 #topic release/periodic jobs updates
14:01:52 o/
14:01:58 We are in M1 release week, and i think we are good to tag M1
14:02:11 waiting for the config refresh patch to merge, then I will submit the release patch today
14:02:18 o/
14:02:22 for the glance_store release we need to wait for the os-brick release
14:02:50 rosmaita, could you please keep watch on it?
14:02:53 rosmaita, can we update the glance_store hash after the os-brick release
14:03:25 you can update the hash now, but the release should be done after the os-brick release
14:03:46 ok, so what i would like to do is this
14:03:55 the os-bricks have been released
14:04:00 o/
14:04:11 but the upper constraints have not been updated
14:04:19 due to a pep8 problem
14:04:41 anyway, the glance_store change has merged, but i think we should update the min version of os-brick in setup.cfg
14:04:51 before doing the new glance_store releases
14:05:03 i have patches up for that
14:05:32 gimme a sec to find the link
14:05:50 https://review.opendev.org/q/topic:bug/2004555+project:openstack/glance_store
14:05:50 ack
14:06:22 another problem is that, as far as i can tell, the stable/2023.1 upper constraints are not being updated by the bot
14:06:51 but i can push a patch for that by hand
14:07:06 after the pep8 stuff is fixed in requirements
14:07:27 the master patch is already merged, so we can update the hash, right?
14:07:59 we haven't released os-brick from master yet
14:08:44 yeah, https://review.opendev.org/c/openstack/releases/+/882580
14:08:52 we will have to wait for this
14:09:32 or not, if we don't update the 'extras' in setup.cfg
14:10:15 pdeore: thanks for pointing out that patch, didn't realize elod had updated it
14:10:30 so one option would be to just release it without the dep and then follow up later with a bump, right?
14:10:45 so that people *can* get the fix if they put the versions together
14:10:52 which right now they can't easily
14:11:36 there is that
14:13:43 ok, so we can release the store after the os-brick release, without the dep?
14:14:32 well, if we don't change the req, you can release glance_store now
14:14:34 that would be my preference, but on the gradient of concern, glance's exposure to this is probably the lowest of the three projects, so it's certainly less concerning than nova
14:15:09 ok
14:15:52 pdeore: the situation is that the change in glance_store adds some parameters to an os-brick call that haven't changed, so using an old os-brick won't break anything with the new glance_store code
14:16:10 (not sure i said that clearly)
14:16:24 yeah i got your point
14:16:43 anyway, new glance_store + old os-brick should be fine, it just doesn't give you full protection for the CVE
14:17:24 rosmaita, is it possible for you to keep watch on this? I will try to stay around as late as possible though
14:17:27 but it's the "accidental part" anyway, so pretty low probability
14:17:47 i can do it, my time zone is good for this
14:17:56 rosmaita, gr8 Thanks !
14:18:11 ok, let's move to the next topic
14:18:19 Periodic jobs all green except an oslo-tips failure, maybe again a version conflict issue
14:18:28 what we can do is, if the upper constraint hasn't changed by say 1800 UTC, we can release without the req change
14:18:42 rosmaita, ok
14:18:44 (sorry, i know you wanted to move on)
14:18:48 one more question
14:19:09 actually, not ...
14:19:10 np, I thought we were done :)
14:19:32 we will release all the glance_store versions at 1800 UTC
14:19:41 rosmaita, ack
14:19:47 i can ping dansmith to verify the hashes on the release patch
14:19:53 sure
14:20:05 ok, that's all from me, thank you
14:20:05 cool
14:20:13 rosmaita, Thanks
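[Editor's note: the 'extras' being discussed above is the optional-dependency section of glance_store's setup.cfg, where os-brick is pulled in only for the cinder store driver. A minimal sketch of the kind of minimum-version bump under discussion; the version floors below are illustrative, not the ones in rosmaita's actual patches:]

```ini
# glance_store setup.cfg (sketch) - os-brick is an optional dependency,
# installed only with the "cinder" extra, e.g. `pip install glance_store[cinder]`.
[extras]
cinder =
    python-cinderclient>=4.1.0
    os-brick>=6.2.2        # illustrative floor: raising it forces the CVE-fixed release
    oslo.rootwrap>=5.8.0
    oslo.privsep>=1.23.0
```

[Releasing without the bump, as agreed above, means pip can still co-install an older os-brick; that combination works, it just lacks the full CVE mitigation.]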
14:20:20 shall we move ahead ?
14:21:49 I think it's safe :)
14:21:58 ok, let me continue with the next periodic job updates :)
14:22:09 so as I said, 2 oslo-tips jobs are still failing, with a version conflict issue again, I think
14:23:03 but i still wonder how the jobs which were failing till last week started passing without merging the changes submitted by croelandt ?
14:23:50 or am I missing some context on this?
14:23:56 hm
14:24:05 it's magic
14:24:09 yeah
14:24:10 :D
14:24:11 everything is magic
14:24:51 then I think we should just wait for this magic to happen every time :P
14:25:21 croelandt, you have abandoned the py38 to py310 patch, right?
14:25:39 hm not really
14:25:43 so we only need the one where the nodeset is changed
14:25:47 we wanted to wait & see the results on the tips jobs
14:25:54 ohh ok
14:26:02 but we never test py310 with tips then?
14:26:05 * croelandt is confused
14:26:45 i think we did
14:27:01 ohh no, not tested
14:27:30 damn, it feels like this should be tested somehow
14:27:51 "this" == ?
14:28:10 ok, let's test that and check if these recently failing ones are passing or not
14:28:19 https://review.opendev.org/c/openstack/glance/+/882185
14:28:25 this == py310 I think
14:28:29 rosmaita: the tips
14:29:46 move to next ?
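[Editor's note: the "nodeset" change above refers to Zuul job configuration. A hedged sketch of what a tips job pinned to a py310 node could look like in .zuul.yaml; the job name, parent, and project list are illustrative, not Glance's actual definitions:]

```yaml
# .zuul.yaml (sketch): an oslo-tips job variant moved to a py310 nodeset.
- job:
    name: glance-tox-py310-oslo-tips    # hypothetical name
    parent: openstack-tox-py310
    description: Unit tests under Python 3.10 with oslo libraries from master.
    nodeset: ubuntu-jammy               # the "nodeset change": Jammy ships Python 3.10
    required-projects:                  # Zuul checks these out from master and
      - openstack/oslo.config           # installs them in place of the releases
      - openstack/oslo.db
      - openstack/oslo.messaging
```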
14:30:23 moving ahead :)
14:30:24 #topic Important reviews
14:30:31 #link https://review.opendev.org/c/openstack/glance/+/881940 - Add new add loc api
14:30:31 #link https://review.opendev.org/c/openstack/glance/+/882498 - Add get loc api
14:30:42 I have submitted the new location API changes, kindly please have a look
14:30:51 Would we like these merged for M1, or was it just the spec?
14:31:01 pdeore: I really want to review that, I have just been slammed lately
14:31:26 croelandt, it would be nice if we can get those in M1 :)
14:31:38 * croelandt stretches fingers
14:32:00 #link https://review.opendev.org/c/openstack/glance-specs/+/880030 - Reproposed New Location API spec
14:32:07 on the spec we have 2 +2s
14:32:22 I will go ahead and set +W if there are no objections to the design by tomorrow.
14:33:34 I've started working on the SDK part for these new location APIs, will submit the patch tomorrow
14:33:46 any questions?
14:34:31 * croelandt has none
14:34:39 cool, moving to next
14:34:41 #topic Specs
14:34:49 we have 2 more specs for review, so kindly please have a look
14:34:55 #link https://review.opendev.org/c/openstack/glance-specs/+/880627 - Reproposed Image Encryption Spec
14:34:55 #link https://review.opendev.org/c/openstack/glance-specs/+/881951 - Spec-lite Add new Location Strategy
14:35:12 ack
14:35:42 moving to next
14:35:44 #topic Can we delete multiple images from the same store
14:35:48 mrjoshi_ ^
14:36:22 #link https://review.opendev.org/c/openstack/python-openstackclient/+/882086/3/openstackclient/image/v2/image.py#1854
14:36:31 this is regarding ``glance stores-delete``, which deletes an image from a store
14:36:38 I recently submitted a patch adding a ``glance stores-delete`` equivalent to the openstack client.
14:37:22 There's a comment from stephen regarding it: currently we can only delete a single image from the store; what if users want to delete multiple images from the same store?
14:37:42 they make multiple requests
14:38:13 Can't we have it in a single request?
14:38:31 we don't have it for image-delete, do we?
14:38:42 we do
14:38:52 we can do openstack image delete IMG1 IMG2 IMG3... ?
14:38:59 yes
14:39:11 we can certainly do glance image-delete 1 2 3
14:39:16 for the sake of consistency, it would make sense to have this behaviour everywhere
14:39:30 as a user it infuriates me when commands are not consistent
14:39:59 that's not in the API, though, is it? (batch delete, i mean)
14:40:07 The proposal is to delete different images from the same store at once (we can decide this in the next PTG)
14:40:14 no, that is not in the API
14:40:22 but this could be in the CLI, right?
14:40:26 yes
14:40:58 it's easy to do it in the CLI
14:41:01 so it wouldn't require any glance-side changes, not sure what the issue is here
14:41:15 let me tell you
14:41:26 we are adding support to delete an image from a store in the SDK
14:41:57 the maintainer suggested that rather than adding a new command like image store delete, we should pass a --store option to the existing delete command
14:42:32 and then he asked us whether we want to add support to delete multiple images from the same store
14:42:55 So we can have image delete --store <store> image1 image2 image3 and so on
14:44:04 so, that's an SDK design decision, that is, whether they want the SDK to diverge from the actual API
14:44:20 seems fine to me to have the SDK do 3 calls to the API to delete 3 images
14:44:35 rather than have the user type 3 commands to do the same 3 calls
14:46:04 +1
14:47:11 we have only a few mins left
14:47:52 so if everyone agrees, can we implement this in the patch or should we have a separate patch for it?
14:48:47 mostly an SDK decision, to be honest
14:49:08 you can tell them that we can work on it later
14:49:35 ok
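[Editor's note: Glance's API deletes one (store, image) pair per request, via DELETE /v2/stores/{store_id}/{image_id}, so the multi-image behaviour agreed above is purely client-side looping. A minimal openstacksdk sketch of that loop; the helper function, cloud name, store name, and image IDs are all illustrative, not the code under review:]

```python
# Sketch of the client-side loop discussed above: one Glance API call per
# image, all against the same store. There is no batch endpoint.
import openstack


def delete_images_from_store(conn, store_id, image_ids):
    """Delete each image from one store, one request per image."""
    for image_id in image_ids:
        # conn.image is a Glance-scoped adapter; .delete() issues a raw
        # DELETE against the image service endpoint.
        response = conn.image.delete(f"/stores/{store_id}/{image_id}")
        response.raise_for_status()  # Glance returns 204 on success


conn = openstack.connect(cloud="devstack-admin")  # cloud name is illustrative
delete_images_from_store(conn, "reliable", ["IMG1", "IMG2", "IMG3"])
```

[This mirrors what `openstack image delete IMG1 IMG2 IMG3` already does for full deletes: N client-side calls, no server-side batching.]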
14:50:14 shall we move to open discussions now?
14:51:06 yes
14:51:10 #topic Open Discussions
14:51:36 I will be on PTO from 18th-25th May
14:52:29 I don't have anything for Open Discussion
14:52:32 Would be nice to fix this before M1 as well :) https://review.opendev.org/c/openstack/python-glanceclient/+/880696
14:52:32 so I won't be there for the next 2 weekly meetings, any volunteer to chair the meeting for the next 2 weeks?
14:52:48 18th is a public holiday in France
14:52:59 and I might take some PTO after that, gotta check my balance :p
14:53:11 we can call off the 18th
14:53:32 and for the 25th, if anything urgent comes up then I will chair it
14:53:49 ok, I will send the mail accordingly
14:54:03 abhishekk, Thanks ! :)
14:54:14 That's it from me
14:54:29 does anyone have anything else to discuss?
14:55:42 ok, assuming nothing is left... let's conclude for today !
14:56:03 Thanks everyone for joining !!
14:56:17 #endmeeting