14:00:11 #startmeeting glance
14:00:12 Meeting started Thu Dec 12 14:00:11 2019 UTC and is due to finish in 60 minutes. The chair is abhishekk. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00:13 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00:15 #topic roll call
14:00:16 The meeting name has been set to 'glance'
14:00:24 #link https://etherpad.openstack.org/p/glance-team-meeting-agenda
14:00:27 o/
14:00:30 o/
14:01:01 let's wait 2-3 minutes for others to join
14:01:33 ok
14:01:47 * tosky lurking
14:02:00 tosky, o/
14:03:03 let's start
14:03:06 #topic Updates
14:03:31 o/
14:03:35 This is the M1 release week, and as decided last week we are skipping M1 for glance
14:03:40 jokke_, o/
14:03:54 \o/
14:04:09 (we skipped for cinder, too)
14:04:38 We haven't had anything concrete merged in glance since the T release; all the work targeted for M1 is now expected to be done before M2
14:04:49 rosmaita, :D
14:05:08 I have the priorities patch uploaded, kindly review it
14:05:10 i don't think it would be illegal to release a milestone-1.5
14:05:19 #link https://review.opendev.org/#/c/696017/
14:05:31 haha
14:06:18 i really like "Permanent solution for Subunit parser error"
14:06:23 We need reviews on the specs, because I cannot approve them unless they are reviewed by all the cores
14:07:00 rosmaita, yeah, I am working on the possibilities, started with removing the registry test cases
14:07:26 Moving ahead
14:07:40 #topic release/periodic jobs update
14:07:55 Sean has proposed a glance-store release patch for M1
14:08:15 I voted -1 as we didn't merge anything in glance store either
14:08:23 ++
14:08:43 we have a bunch of stuff under review though
14:08:45 I added a comment that if we want to release it then we should merge the sheepdog driver removal patch first
14:08:52 jokke_, yes
14:08:55 abhishekk: ++
14:09:29 So, IMO we should skip this release and revisit it around M2
14:09:46 yep, good plan
14:09:57 cool
14:10:00 smcginnis: dyk the current thinking about removing py2 support from libraries?
14:10:19 oops, just realized that he isn't here
14:10:26 thought i saw him come in
14:10:36 rosmaita, yeah, he has headed out
14:10:57 Periodic jobs have been all green for the last 3 weeks
14:11:09 \o/
14:11:12 that is good news
14:11:15 those are all py3 now, right?
14:11:24 rosmaita, yes
14:11:28 cool
14:11:48 next topic
14:11:58 #topic Multi-store import plugins
14:12:16 We have two specs up for the new import-related work
14:12:27 #link https://review.opendev.org/#/c/667132/
14:12:47 Just wanted to throw this out there. The specs and corresponding WIPs are up
14:13:06 So this is something we can potentially move quite quickly off our plates
14:13:14 jokke_, ++
14:13:20 #link https://review.opendev.org/669201
14:13:39 This link is for "Import image in multiple stores"
14:13:50 #link https://review.opendev.org/694724
14:13:51 ok, i will try to get through the specs at least today
14:14:05 This link is for "Copying existing images in multiple stores"
14:14:19 rosmaita, cool, it will really help to get going
14:14:55 Related code is also up for reference and is in good shape
14:15:26 moving ahead
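For readers following along, a rough sketch of what the import call from those two specs might look like from a client's point of view. This is based on the proposals under review, not merged code, so the "stores" body key may still change; the endpoint URL, token, image ID, and store names are placeholder assumptions.

```python
# Hypothetical client call for the proposed "import image in multiple
# stores" flow (spec: https://review.opendev.org/669201). The "stores"
# list in the body is the proposed addition; everything else follows
# the existing interoperable image import API.
import requests

GLANCE = "http://glance.example.com:9292"  # placeholder endpoint
TOKEN = "gAAAA..."                         # placeholder Keystone token
IMAGE_ID = "11111111-2222-3333-4444-555555555555"  # placeholder image

resp = requests.post(
    f"{GLANCE}/v2/images/{IMAGE_ID}/import",
    headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
    json={
        "method": {"name": "glance-direct"},    # data already staged
        "stores": ["fast-rbd", "cheap-swift"],  # proposed multi-store key
    },
)
resp.raise_for_status()  # expect 202; the import runs as an async task
```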
14:15:29 #topic Delete from single store
14:15:36 jokke_, stage is yours :D
14:16:06 Another one of mine for attention. So I wrote a quick spec, as the approach I took would introduce a new API endpoint
14:16:25 I have skeleton code up for it as well
14:16:58 I went through the spec and it is written based on our discussion during the PTG
14:17:07 so thank you very much :D
14:17:09 #link https://review.opendev.org/#/c/698018/
14:17:26 The spec is failing tests because we do not have the index merged yet
14:17:38 yes
14:17:40 #link https://review.opendev.org/#/c/698049/
14:17:50 that's for the code part
14:18:23 cool
14:18:38 I pulled the code together quickly to kind of POC it based on the proposed spec. If we get to agreement on the approach I'll start writing the rest of it
14:19:13 rosmaita, put your glance reviewer's hat on as early as possible :P
14:19:14 there is still all the testing (surprise, it's me), docs and some of the verifications missing
14:19:28 but that should be quick to finish after we're in agreement on the approach
14:19:45 jokke_, I like the idea, especially that we are not going down each of the onion layers for it
14:20:30 but TL;DR: basically introducing a "v2/stores/{store_id}/{image_id}" endpoint that accepts only the DELETE method
14:21:14 just to make sure people don't accidentally kill the whole image, because we would have just appended the store_id to the current call
14:21:23 my only concern is if the user has the delete right but doesn't have the right to remove the location, then it will now allow deleting the location (I guess this is happening at the moment as well)
14:21:27 but that's the other way to approach this
14:22:35 keeping in mind that i am way behind on my specs reading ...
14:22:36 If I recall correctly, the policy is checked on the image update, but that's something I'd (or someone else) need to test and verify
14:22:52 what is the api call to upload an image to a particular store?
14:23:06 it's PUT
14:23:06 the image delete calls the location deletes in the background, so it should be affected as well, but I'm not 100% sure
14:23:33 the location delete call has a policy check
14:23:35 what does the path look like?
14:23:48 rosmaita: in the upload it's the same call, you just add the store as a header
14:23:56 correct
14:24:12 in the case of the new import workflow, it's POST
14:24:13 as the body is occupied by the image data
14:24:29 in the case of the traditional create call it will be PUT
14:24:39 for both calls we are passing the store as a header
14:24:41 for the multi-store imports it's gonna be part of the body of the import call
14:24:43 that works for an existing image? i would do multiple PUT /v2/images/uuid/file with different headers?
14:24:58 rosmaita: nope
14:25:06 ok, that's what i thought
14:25:16 that's why you need the copy-to-multistore spec
14:25:21 rosmaita: you upload the image once, ever. We don't allow uploading to active images
14:25:39 rosmaita, yes
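To make the "store as a header" mechanics above concrete, a minimal sketch of uploading image data to one particular store. The X-Image-Meta-Store header comes from glance's existing multi-store support; the endpoint, token, image ID and store name are placeholder assumptions, and as noted above this works only once per image (never on an active image).

```python
# Hypothetical upload of image data to a specific store via the
# traditional PUT call, passing the target store as a header
# (X-Image-Meta-Store, from glance's multi-store support).
import requests

GLANCE = "http://glance.example.com:9292"  # placeholder endpoint
TOKEN = "gAAAA..."                         # placeholder Keystone token
IMAGE_ID = "11111111-2222-3333-4444-555555555555"  # a queued image

with open("cirros.raw", "rb") as image_data:
    resp = requests.put(
        f"{GLANCE}/v2/images/{IMAGE_ID}/file",
        headers={
            "X-Auth-Token": TOKEN,
            "Content-Type": "application/octet-stream",
            "X-Image-Meta-Store": "fast-rbd",  # target store for the bits
        },
        data=image_data,
    )
resp.raise_for_status()  # expect 204 No Content on success
```

Populating additional stores after the image goes active is exactly what the copy-to-multistore spec above is for.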
14:26:12 so the alternative was DELETE /v2/images/uuid/store_id ?
14:27:18 rosmaita: yes, so the import to multiple stores allows populating multiple stores upon creation via import, and the copy job is so far the only thing we allow for active images, with glance handling the data transfers. That should minimize the risk of anyone sneaking in data that has been modified
14:27:21 rosmaita, that is the alternative, but our router mapping is horrible, and as delete image has the same mapping it would always map to delete image instead of our new method
14:27:58 jokke_: makes sense
14:28:02 abhishekk: sorry to hear that
14:28:10 rosmaita: yeah, that's the baseline: either what you wrote or /v2/images/uuid/store/store_id
14:28:52 ok, thanks
14:29:02 abhishekk: rosmaita: correct, so that approach would need us to do the logic in our image deletion code rather than have its own isolated code path
14:29:22 your proposal is slightly asymmetric, but probably ok
14:29:30 jokke_, right
14:29:48 kind of makes sense to think of image data as a resource owned by a store
14:30:01 oh, there is a 3rd option as well, but I think I didn't even write it into the spec
14:30:24 we just tell everyone to use the vulnerable locations api and do it via image-update
14:30:39 :D
14:31:01 PATCH seemed like such a cool idea back in 2011
14:31:08 ikr
14:31:39 I'm happy we got over that thought fairly quickly :P
14:31:46 :D
14:31:48 +1
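Summing up the approach that was just agreed as the baseline: a sketch of the proposed delete-from-single-store call. The spec (https://review.opendev.org/#/c/698018/) is still under review, so the exact path and status codes may change; endpoint, token and IDs below are placeholders.

```python
# Hypothetical call to the proposed DELETE-only endpoint
# v2/stores/{store_id}/{image_id}: remove one store's copy of the
# image data without touching the image record or the other stores
# (unlike DELETE /v2/images/{image_id}, which kills the whole image).
import requests

GLANCE = "http://glance.example.com:9292"  # placeholder endpoint
TOKEN = "gAAAA..."                         # placeholder Keystone token
STORE_ID = "cheap-swift"                   # placeholder store id
IMAGE_ID = "11111111-2222-3333-4444-555555555555"

resp = requests.delete(
    f"{GLANCE}/v2/stores/{STORE_ID}/{IMAGE_ID}",
    headers={"X-Auth-Token": TOKEN},
)
resp.raise_for_status()  # the image stays active in its other stores
```

Keeping this on its own path, rather than appending store_id to the image delete call, is what avoids both the router-mapping clash and the accidental whole-image deletion discussed above.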
14:32:06 let's move to the next topic
14:32:20 ++
14:32:36 all cores, kindly review all the specs as early as possible
14:32:55 #topic Native SSL removal
14:33:02 jokke_, again you
14:33:42 So as agreed at Train, we wanted to get rid of py27 support to whack the SSL code and classify ourselves as py3 compatible
14:34:48 py27 finally seems to be beaten to death, and I proposed 3 patches for this
14:34:53 #link https://review.opendev.org/#/c/697969/
14:35:54 that is the key one; it depends on the patch that just removes the py27 classifiers without taking any side on what we're going to do with py27 testing, and on top of that is the py3 classifier addition doing the same
14:36:12 Should we get this in and release glance M1 so that people can test it for regressions?
14:36:37 That is one option. The 3-patch chain should be good to go
14:37:16 +1 from me on this approach
14:38:34 sounds ok to me
14:39:11 I will put up a release patch on Monday if all these patches merge before the weekend
14:39:50 I could have smashed them into one review, but I didn't want to clean up the rebases if we happened to first merge those existing testing patches that also change the classifiers
14:40:17 the rebase would be clean this way
14:40:23 agree
14:41:15 so as jokke_ is the owner, rosmaita, you and me need to review the patches :D
14:41:25 or smcginnis
14:41:32 i guess this is higher priority than the specs?
14:41:45 rosmaita, yes
14:41:46 If we want to get M1 tagged
14:42:10 ok, i will hit these first and do as many specs as i can this afternoon
14:42:30 ty rosmaita \o
14:42:30 i think the py3 M1 is a good idea
14:42:39 but I agree, the earlier we tag this the easier it is to find out if I broke something in the process
14:42:48 :D
14:43:33 Moving ahead
14:43:40 #topic Open Discussion
14:43:42 I can check, but I think glance_store has the py3 classifiers already
14:44:02 if not, it might be worth getting a py3 M1 out of that as well
14:44:38 yes we do, so no worries there, it's just the service itself
14:44:38 it has
14:45:05 abhishekk: I was looking at the registry test removal patch you posted
14:45:18 ok
14:45:48 it raised a couple of questions, and I was trying to figure out why that task test fails right up to the minute before the meeting
14:46:06 I will have my comments in the review right after this meeting
14:46:11 jokke_, have you figured that out?
14:46:14 cool
14:46:22 nope ... I haven't figured it out yet
14:46:38 I need to run those tests and try to figure out where it actually fails
14:46:49 hopefully it throws a stack trace or something
14:47:12 it tries to connect to the sql database on the image.save call and doesn't find the connection
14:47:35 and then it goes to revert, tries to delete the image, and fails there as well :D
14:47:53 ok, then I have a good idea where that happens
14:48:03 great
14:48:20 so next week is Christmas and I will be on leave on the 26th and 27th
14:48:39 we probably need to mock the task_repo for it
14:49:07 is any one of you available for the meeting, or should we skip next week's meeting?
14:49:17 jokke_, ack, I will try that
14:49:27 next week is dec 19 ?
14:49:28 the gateway providing the repos does not take the simple database stuff into consideration
14:49:42 what really baffles me is how on earth it works now
14:49:54 me neither
14:49:56 abhishekk: rosmaita: indeed. You're a week ahead still
14:50:10 abhishekk: Xmas is in 2 weeks :P
14:50:13 sorry :P
14:50:56 I thought we'd decide it now :D
14:50:58 so next week, yes, but the week after, no
14:51:37 jokke_, rosmaita, I guess we should skip the Christmas week meeting, which is on the 26th
14:51:38 So I will pretty much honour the company shutdown. I will be offline at least the whole Christmas week (Telegram might reach me depending on my reception) and will return early Jan
14:51:54 yeah, I definitely won't be around on the 26th
14:51:55 I will be there in New Year's week to host the meeting
14:52:05 yes, i will be offline from 24 dec until 2 jan
14:52:38 cool, I will send a mail to the ML for awareness
14:52:59 I'll be flying next Tue so I have a bit early evening Tuesday, but will work from Finland the rest of the weeks still
14:53:18 safe travels
14:53:18 rest of the next week even
14:53:20 I have sad news. Funding was pulled and my contract was canceled, so today is probably my last day supporting this project until I can find a new job
14:53:34 davee_: crap :(
14:53:43 davee_, sorry to hear this
14:53:47 :(
14:53:49 Sorry to hear. What a "great" Christmas present
14:53:54 not as sad as I was
14:54:22 davee_: that is a real bummer, i wish you much luck
14:55:02 +1
14:55:17 anything else?
14:55:18 I have a few leads, so hopefully I will bounce back quickly for the new year
14:55:30 davee_, great
14:55:38 good to hear
14:55:47 Just wanted to say I have enjoyed working with you all
14:55:57 the feeling is mutual
14:55:59 likewise :D
14:56:04 thanks for the heads up, likewise
14:56:29 wrapping up, guys
14:56:37 Merry Christmas in advance
14:56:46 crap
14:56:54 we are meeting next week :D
14:56:58 yes :D
14:57:09 abhishekk: get some sleep every now and then :P
14:57:15 Thank you all
14:57:15 Thanks all!
14:57:30 bye
14:57:35 jokke_, yes, that's why I was dreaming of Christmas leave :D
14:57:56 #endmeeting
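A footnote on the registry test failure discussed above: a minimal sketch of the "mock the task_repo" idea, assuming the test goes through glance's gateway to obtain its repos. The patch target (glance.gateway.Gateway.get_task_repo) reflects one reading of the glance tree and should be verified against the actual test code; the test body is illustrative.

```python
# Minimal sketch of mocking the task_repo via the gateway so the task
# code never opens a real SQL connection during a unit test; the
# failing image.save() suggests the image repo may need the same
# treatment. Patch target and test shape are illustrative assumptions,
# not the actual glance test code.
from unittest import mock


@mock.patch("glance.gateway.Gateway.get_task_repo")
def test_task_runs_without_real_db(mock_get_task_repo):
    task_repo = mock.MagicMock()
    mock_get_task_repo.return_value = task_repo

    # ... invoke the task flow under test here; task_repo.save() now
    # lands on the mock instead of hitting the database ...

    task_repo.save.assert_called()
```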