14:00:30 #startmeeting glance
14:00:31 Meeting started Thu May 23 14:00:30 2019 UTC and is due to finish in 60 minutes. The chair is jokke_. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00:32 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00:35 #topic roll-call
14:00:35 The meeting name has been set to 'glance'
14:00:38 o/
14:00:42 o/
14:01:00 o/
14:01:01 o/
14:01:39 yeii quick quorum. I guess everyone has recovered well
14:01:52 yep
14:02:05 #topic updates
14:02:08 brb ... coffee emergency
14:02:48 o/
14:02:56 #link https://etherpad.openstack.org/p/Glance-Train-PTG-planning L199 onwards
14:03:32 while I figure out how I can get the priorities patch formatted decently
14:04:31 abhishekk: smcginnis can I get your acks on https://review.opendev.org/#/c/659531/
14:04:43 Looking now
14:04:44 jokke_, looking
14:04:56 if it looks fine let's get it merged so we get the housekeeping done and dusted
14:05:49 that is all from me for now
14:06:13 #topic release updates
14:06:38 We are two weeks away from the T1 milestone release
14:06:57 Need reviews on specs, spec-lites and patches
14:07:12 Almost 80% of the work is done
14:07:39 I want to move cinder and nova support of multiple stores to T2 as the specs are not approved yet
14:07:53 Kindly let me know your opinion on the same
14:08:27 For cinder, the code is ready and tested but unit tests are remaining; for nova I am working on the code
14:08:28 They will happen when they will happen ... that work really is tracked in those projects
14:08:31 Seems reasonable to me.
14:09:09 Obviously the sooner they're ready to land, the more chances we have for them to actually merge
14:09:13 So if all of you agree, I will move them to T2 in the etherpad
14:10:04 ok by me
14:10:05 Regarding periodic jobs, no failure since PTG (or from one week before PTG)
14:10:14 \o/
14:10:35 rosmaita, kindly help me in getting cinder specs merged :D
14:10:38 \\o \o/ o// o/7
14:11:00 abhishekk: i need to update your multistore spec
14:11:19 rosmaita, ack
14:11:20 taking into account that glance-cinder-backend thing you mentioned
14:11:32 i have an idea, need to take another look at the code
14:11:44 cool, let me know
14:11:53 will try to get that done today
14:12:19 thank you
14:12:26 That's it from me
14:12:38 kk
14:12:58 #topic changes in constraints
14:13:23 yeah, basically what that email says
14:13:38 they're changing how u-c is handled slightly
14:13:50 move out of install_command and also change the url
14:13:56 details are in the email
14:14:50 yeah, seems like nothing that needs too much attention. Let's review them swiftly when we see the patches coming in
14:14:57 also, i learned recently that a lot of stuff in test-requirements.txt is not handled in upper constraints
14:15:11 so that's why we get weird breakage in the stable branches occasionally
14:15:36 i may put up a patch to pin the test-requirements to a known working version
14:15:46 because we are expected to manage those ourselves
14:16:03 i will have a more coherent proposal next week
14:16:09 that's all from me
14:16:13 ohh ... I did not realize that the test-requirements are not under global sync anymore
14:16:19 sounds good
14:16:39 no, it's on purpose, too -- there's a blacklist of stuff that is intentionally not included in u-c
14:16:54 #topic sheepdog store
14:17:20 Looks like this is unmaintained and not currently working?
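(For reference on the "changes in constraints" topic above: moving u-c out of install_command boils down to a tox.ini edit along the lines sketched below. This is only an illustration based on the discussion; the exact constraints URL and environment variable are whatever the requirements team's email and patches specify.)

    # old style: constraints applied via install_command
    [testenv]
    install_command = pip install -c{env:UPPER_CONSTRAINTS_FILE:https://git.openstack.org/cgit/openstack/requirements/plain/upper-constraints.txt} {opts} {packages}
    deps = -r{toxinidir}/test-requirements.txt

    # new style: plain install_command, constraints pulled in through deps
    # using the published per-branch constraints URL
    [testenv]
    install_command = pip install {opts} {packages}
    deps =
      -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
      -r{toxinidir}/requirements.txt
      -r{toxinidir}/test-requirements.txt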
14:17:22 there's a link on the commit message of sean's patch
14:17:29 to the sheepdog ML
14:17:46 yes, not maintained, and no plans to maintain
14:17:51 Yeah, upstream looks like they've shut down.
14:17:54 cinder is removing the sheepdog driver
14:18:06 so we should also do so for glance_store
14:18:09 Current code does not work on bionic.
14:18:34 ok, well that is quite simple then, kill the gate and let's hit the deprecation button
14:18:56 i wonder if we can speed up deprecation on an unsupported store?
14:19:18 It really already is in an unusable state.
14:19:26 Would have been good to have some heads up though.
14:20:00 since it's glance_store and not glance, maybe we can remove in Train?
14:20:41 yes
14:20:46 Well they are keeping the repo alive for those who have it in production, I dunno if anyone is running it in prod for glance, but I'd rather not just remove it. It not working in Ubuntu is really not a reason to just remove without deprecation
14:21:19 At a bare minimum, I would say we would need a strongly worded release note to make sure it's very visible if we did.
14:21:26 well, glance_store is not under assert:follows-standard-deprecation
14:22:03 i would be good with announcing deprecation next week on the ML with intention to remove in Train, and see if anyone screams
14:22:07 Well there is no weight to it. Deprecate, strongly worded release note saying that if anyone is using it they need to maintain their own fork, and removal in U
14:22:22 right
14:23:00 so if we decide to remove then we need to remove it before the 1.0.0 release
14:23:06 I'll write a mail today to announce the deprecation and give a heads up to anyone possibly using it
14:23:17 ok
14:23:25 ack
14:24:30 who knows, may prompt someone to adopt the sheepdog project
14:25:00 yeap
14:25:01 but it sounded like from that thread that ceph is being used instead
14:25:05 I doubt so, though
14:25:47 ok, that's all from me about sheepdog
14:27:17 k
14:27:31 #topic glance-replicator
14:27:51 i think we missed this one in the v1 removal
14:28:04 replicator uses v1 exclusively
14:28:04 yes
14:28:09 we did
14:28:24 also, it doesn't use glanceclient, i think it was written at the same time the client was being developed
14:28:41 Still needed/wanted, right?
14:28:48 that was my question
14:28:50 so do we want to move it to v2?
14:28:58 I'd say not
14:29:08 it's now 2 cycles, right?
14:29:18 Until that mail, I was not sure anyone was using replicator
14:29:33 jokke_, yes
14:29:54 I think the best thing to do is remove it and design it from the ground up if it's still wanted
14:30:26 I honestly hope/think that with multi-store and taskflow + the improvements we have in the pipeline, it's likely even less needed
14:30:50 ++
14:30:57 i agree
14:31:37 So treat it as implicitly deprecated and just remove it?
14:31:57 So you want me to respond to that mail as well and be mean, telling that it's not going to be refactored?
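(On the sheepdog deprecation above: the "strongly worded release note" could be a reno deprecation note roughly like the one below. This is a sketch only; the final wording and the removal target, Train vs U, are what the ML announcement is meant to settle.)

    ---
    deprecations:
      - |
        The Sheepdog store driver of glance_store is deprecated. The Sheepdog
        project is no longer maintained upstream and the driver does not work
        on current distributions (e.g. Ubuntu Bionic). Anyone still using this
        driver in production should plan to migrate to another store or
        maintain their own fork, as the driver is expected to be removed in a
        future release.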
14:32:25 well, the problem is that there's a big lag between what operators are using and our releases
14:32:50 i wonder whether it's still being used in production envs and they haven't caught on yet that it hasn't worked since rocky
14:32:54 smcginnis: as it has had no chance of working for 2 cycles and no-one even noticed it by now, I'd treat it as part of v1 and just a cleanup that has taken long
14:33:21 rosmaita: well, the reality is that we won't be backporting the refactoring anyways
14:33:45 yeah, but as a standalone script, that's not a big deal
14:33:53 so it either gets redone in T+ or it just stops existing as part of v1
14:34:15 I'm more than happy to accept a spec if someone wants to redo it
14:34:17 well, i guess the thing is to announce its removal on the ML and see if there's any negative response
14:34:27 but I don't want to plan it into our workload
14:34:43 without someone explicitly stating they need it and are willing to do the work
14:35:11 works for me
14:35:50 Any other opinions?
14:36:07 works for me as well
14:36:09 is there anyone here now who feels a strong need for it and wants to do the work?
14:36:41 not me
14:37:27 not me
14:37:32 #action jokke to write mails about sheepdog deprecation & removal and replicator removal (as part of V1 API)
14:38:10 #topic open discussion
14:38:16 Anything else?
14:38:35 Reviews please
14:38:45 yes!
14:39:16 I will try to focus tomorrow on reviewing everything we have pending for M1
14:39:24 and specs
14:40:02 cool, thank you
14:40:24 store work is almost done, unless I get some review comments
14:40:33 nice
14:40:42 only one patch is pending, to remove deprecated options
14:42:16 ok, if that's all we can get some time back
14:42:19 calling once
14:42:37 nothing from me
14:42:47 we have a new attendee today, davee_ welcome
14:43:19 twice
14:43:35 welcome davee_
14:43:59 hang on
14:44:08 as long as we have a few minutes
14:44:13 sure
14:44:19 question about the "permission-to-delete" spec
14:44:44 eric put a comment that maybe we want to do this for more than just the cinder_encryption_key_id property
14:44:57 what do people think?
14:45:29 #link https://review.opendev.org/#/c/656895/
14:46:00 greetings, had an emergency caffeine requirement
14:46:07 looking at the spec, the name of the new property is general
14:46:26 (though i am pretty sure jokke_ hates it)
14:46:27 delete_encryption_key_on_image_deletion
14:46:51 so we could do cinder-only for now, and amend the spec if the other team gets their stuff together in Train
14:47:04 or extend the functionality in U if requested
14:47:33 how does that sound?
14:47:40 sounds good to me
14:47:53 Well the discussion/agreement on the forum/PTG was that we will do it only for this cinder special case as it's already implemented, released and not part of the work done cross-project
14:47:56 +1
14:48:34 jokke_: ok, just wanted to make sure it was easily extensible later
14:48:35 We can start bikeshedding for a few cycles and postpone the implementation or we can get this done as planned ;)
14:48:46 tbh, i forgot what the metadata was called
14:49:11 :D, I am tending towards the 'or' option
14:49:25 It was literally like "os_delete_cinder_exncryption_key"
14:49:38 -x
14:49:43 yeah
14:50:04 It is very very specific for this one cinder use case
14:51:16 ok, that's what i wanted to know
14:52:09 ok
14:52:36 you think there is going to be a fight about that?
14:52:54 about ... ?
14:53:11 about it being cinder feature specific?
14:53:37 probably not, no one reads glance-specs!
14:53:43 :D
14:53:48 tru tru
14:53:59 ok, anything else?
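(To make the permission-to-delete discussion above concrete: per the spec under review at https://review.opendev.org/#/c/656895/, the new boolean image property is named delete_encryption_key_on_image_deletion and sits alongside the existing cinder_encryption_key_id property. Purely as an illustration of the shape of that metadata, and assuming the spec lands as written, an image would carry something like the following; in practice cinder sets these properties itself when an encrypted volume is uploaded as an image.)

    # illustrative only; property names per the spec, values are placeholders
    openstack image set \
      --property cinder_encryption_key_id=<barbican-secret-uuid> \
      --property delete_encryption_key_on_image_deletion=True \
      <image-id>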
14:54:07 maybe a noob nerd
14:54:35 so I forgot, davee_ wants to start contributing to glance
14:54:51 davee_: if you read/participate in the spec reviews you don't need to read the published specs anymore :D
14:54:52 already signed the CLA
14:55:07 Nice! Welcome indeed
14:55:10 \o/
14:55:12 been monitoring the relevant IRC channels for glance/cinder
14:55:27 Welcome!
14:55:36 thanks, and glad to be here
14:55:41 davee_: my only advice is don't leave +1s without comments on reviews!
14:55:41 davee_, hopefully we will find something for you :D
14:56:36 davee_: don't be afraid to ask questions
14:56:44 we all know that the curve is pretty steep
14:56:53 and we all have been there
14:56:58 now I just need to figure out where the documentation and related supporting materials are located
14:57:26 like all of the related etherpads
14:59:56 ok, time
14:59:56 thank you all
15:00:03 Thanks all!
15:00:06 bye!
15:00:07 o/
15:00:11 #endmeeting