16:00:08 #startmeeting oslo
16:00:09 Meeting started Fri Aug 15 16:00:08 2014 UTC and is due to finish in 60 minutes. The chair is dhellmann. Information about MeetBot at http://wiki.debian.org/MeetBot.
16:00:10 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
16:00:12 The meeting name has been set to 'oslo'
16:00:24 who's around for the oslo meeting?
16:00:28 o/
16:00:31 o/
16:00:36 hi
16:01:13 o/
16:01:19 o/
16:01:24 dimsum_, bnemec, markmc, flaper87|afk, jd__ : ping?
16:01:32 o/
16:01:48 o.
16:01:55 o/
16:02:04 ok, hi, everyone!
16:02:06 pong
16:02:23 #topic Review action items from previous meeting
16:02:31 #action dhellmann approach the other integrated projects who have not started work on oslo.i18n
16:02:31 carrying that over again, although we did make some good progress
16:03:01 nova, glance, and keystone are done, cinder is in progress
16:03:12 #info bknudson file a bug about timeutils mocking issue
16:03:26 dhellmann: did that.
16:03:47 I looked and didn't see it, do you have the number handy? I only had a minute for a quick search...
16:04:02 #link https://bugs.launchpad.net/oslo/+bug/1354521
16:04:03 Launchpad bug 1354521 in oslo "timeutils.utcnow use in oslo-incubator and oslo.utils" [High,Triaged]
16:04:09 thanks, bknudson
16:04:23 #info everyone review the repo being imported as oslo.concurrency
16:04:23 the repo looks ready to be imported, so we’re just waiting for infra to have time to do that, right YorikSar?
16:04:55 Yes. I'll be pursuing this on #openstack-infra.
16:04:55 bknudson: similar issue in nova, fyi
16:05:09 YorikSar: ok, thanks
16:05:18 dimsum_: ah, is this the same thing we were just discussing?
16:05:23 dhellmann: yep
16:06:00 dimsum_: I like the new nick.
16:06:15 thx! :) casual friday
16:06:26 ok, bknudson, the solution dimsum_ and I just agreed on was to port the incubated modules to use oslo.utils, and sync those changes over along with anything else needed to use oslo.utils
16:06:59 dhellmann: ok, that's what I was thinking would be the solution.
16:07:41 bknudson: the main concern I had was that would require projects that needed patches to any of those incubator modules to adopt oslo.utils
16:07:59 I suppose we can always create a branch later, if we need to
16:08:01 y, too bad for them.
16:08:10 :)
16:08:12 :)
16:08:16 all they have to do is import it
16:08:25 I mean put it in their requirements
16:08:42 this also brought up the idea for us to change the process in kilo, and create a branch early in the cycle then delete modules from master
16:09:35 bknudson, well, if they don't make the full change they'll have this same problem but in reverse -- their app code will use the incubated utils modules and some incubated code will use the library
16:10:03 under the new plan, backports to the incubator would go into the branch, not into master
16:10:12 I'll have to discuss that with ttx, but think about it
16:10:27 anything else on that topic?
16:11:06 moving on, then
16:11:07 #topic Red flags for/from liaisons
16:11:19 liaisons, how were things this week?
16:11:27 for keystone...
16:11:39 we tried to switch to the new oslo.config config file generator
16:11:53 because apparently the script doesn't support macos
16:12:06 we ran into a few things and opened bugs and proposed fixes
16:12:13 #link https://review.openstack.org/#/c/113940/
16:12:20 1) oslo-incubator config options aren't listed
16:12:28 2) the config options are in alphabetical order
16:12:29 (that's the link I just posted)
16:12:40 hmm, does that matter?
16:12:43 then there was another one in rtc
16:13:12 dhellmann: it's a usability issue... the order they were in using oslo-incubator groups related options
16:13:35 3) https://review.openstack.org/#/c/114031/
16:13:35 bknudson: ok, we do have explicit groups for that, but I can see where the ordering would be useful to be able to control
16:13:48 Set sample_default for rpc_zmq_host
16:14:15 #link https://bugs.launchpad.net/oslo/+bug/1356591
16:14:16 Launchpad bug 1356591 in oslo "oslo-config-generator alphabetizes options" [Medium,Triaged]
16:15:06 I've put those on my priority list for later today
16:15:17 and we'll mention them again later when we talk about weekly priorities
16:15:41 is there anything else we should talk about?
16:16:43 ok, let's keep going then
16:16:46 #topic Need someone to run the meeting for the next 2 weeks (dhellmann)
16:16:57 I’m traveling on the next 2 Fridays and need someone to lead those meetings. Volunteers?
16:17:22 I'll be around on other days, it just so happens that my travel falls on friday twice in a row
16:17:40 dhellmann: i can run the first one
16:17:56 #info dimsum to chair the meeting next week
16:18:00 thanks, dimsum_ :-)
16:18:06 I should be able to do the other.
16:18:18 #info beekneemech to chair the meeting in 2 weeks
16:18:20 thanks, beekneemech
16:18:27 #topic Adoption status
16:18:32 #link https://etherpad.openstack.org/p/juno-oslo-adoption-status
16:18:40 do we have any updates to mention here?
16:18:53 I think next cycle I'm going to use a wiki page for this so we can subscribe to edits :-)
16:19:13 dhellmann: oslo.vmware is ongoing (Nova)
16:19:15 maybe even bugs
16:19:29 dhellmann: tuskar is going to adopt oslo.db ...
16:19:37 dimsum_: ok, good, could you add a section to the etherpad with that? I assume there will be cinder work, too?
16:19:49 pblaho: good!
16:19:55 dhellmann: y, glance and cinder
16:20:02 dhellmann: will do
16:20:05 dimsum_: thanks
16:20:14 oslo.i18n merged for cinder
16:20:28 jecarey: \o/
16:20:55 this is usually the happy part of the meeting :-)
16:21:09 is it possible to merge any oslo thing for cinder?
16:21:15 I've got a patch to update oslo.policy stuck for months
16:21:22 PITA
16:21:36 jd__: hrm, link?
16:21:48 is our cinder liaison here today?
16:22:01 https://review.openstack.org/#/c/78614/
16:22:23 jd__: I'll talk to Duncan
16:22:33 I think oslo-incubator in cinder has not been updated for a long time
16:23:28 yeah, if they don't want to take the updates they may have to wait for libraries for fixes and features, at this point
16:23:46 are we blocked with adoption anywhere else?
16:24:41 ok, I've pinged the cinder folks, so let's see what happens
16:24:42 #topic oslo.serialization graduation status
16:24:48 #link https://blueprints.launchpad.net/oslo/+spec/graduate-oslo-serialization
16:25:23 I've lost track of who is actually running this one. beekneemech the bp is assigned to you, but I think there were others helping?
16:25:38 * dimsum_ is helping - 3 reviews in flight - https://review.openstack.org/#/q/status:open+project:openstack/oslo.serialization,n,z
16:25:52 Yep ^
16:26:11 ok, good
16:26:16 I think I might have had an infra change pending too. Let me check.
16:26:17 * dhellmann adds to his review list
16:26:37 #link https://review.openstack.org/112994
16:26:39 beekneemech: there was a governance repo review from you
16:26:50 Yeah, governance merged. projects.txt merged.
16:26:55 cool
16:27:04 Just the devstack-gate change left to go I think.
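[Editor's note on the keystone config-generator items above: the new oslo-config-generator discovers options through `oslo.config.opts` entry points rather than by scanning source files, which is why incubator-only options were missing from the generated sample, and `sample_default` is the knob used to keep host-specific defaults like rpc_zmq_host out of the sample file. A minimal sketch under those assumptions; the module layout, help text, and setup.cfg project name are illustrative, not taken from the patches discussed here.]

```python
# setup.cfg of a hypothetical project wires the hook up like this:
#
#   [entry_points]
#   oslo.config.opts =
#       myproject = myproject.opts:list_opts

import socket

from oslo_config import cfg  # at the time this was the 'oslo.config' namespace package

# sample_default keeps the generated sample config reproducible: the runtime
# default is the build host's name, but the sample file shows a placeholder.
zmq_opts = [
    cfg.StrOpt('rpc_zmq_host',
               default=socket.gethostname(),
               sample_default='localhost',
               help='Name of this node (hostname or IP address).'),
]


def list_opts():
    # oslo-config-generator calls this and renders each (group, options)
    # pair into the sample configuration file; None means the [DEFAULT] section.
    return [(None, zmq_opts)]
```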
16:27:10 ok, the governance ones are mostly procedural, ttx approves those after I +1 as long as there's no objection
16:27:27 beekneemech: ok, good
16:27:46 #topic oslo.concurrency graduation status
16:27:59 amrith, YorikSar, I have a paste-bomb with some of your requests coming...
16:28:07 #link https://blueprints.launchpad.net/oslo/+spec/graduate-oslo-concurrency
16:28:07 amrith and YorikSar have a few changes to be merged in the incubator related to the concurrency code
16:28:07 #link https://review.openstack.org/#/c/109417/
16:28:07 #link https://review.openstack.org/#/c/109469/
16:28:08 #link https://review.openstack.org/#/c/110933/
16:28:35 #link https://wiki.openstack.org/wiki/Oslo#Graduation
16:28:41 our general policy is to allow backports in the incubator, but since the concurrency repo is being imported we should wait until that is done, then merge the changes in the library, then in the incubator
16:28:44 thoughts?
16:29:14 From what we've clarified on #openstack-oslo, we can land all those changes to the concurrency repo once it's created and then backport them to incubator.
16:29:17 dhellmann: Yes.
16:29:27 I'd rather wait. It's a nightmare trying to sync both directions once the repo gets created.
16:29:31 exactly
16:29:32 * beekneemech says from experience
16:29:46 ok, so as long as everyone is ok with that, I think it's settled
16:29:56 Although amrith wanted to get them into incubator as fast as we can, but we'll definitely have synchronization issues otherwise.
16:30:10 right, I'm more worried about having things diverge
16:30:12 beekneemech: Agree.
16:30:18 let's see if infra can set up the repo for us today
16:30:34 I'll ping them one more time.
16:30:55 We also have a hot discussion going on around lockutils
16:31:01 YorikSar: fungi is usually pretty responsive, but I don't know if the team is busy with their own work today
16:31:07 #link http://lists.openstack.org/pipermail/openstack-dev/2014-August/043090.html
16:31:14 YorikSar: thanks for taking it to the ML
16:31:17 * dhellmann is behind on the ML
16:31:26 That's a fun one. ;-)
16:31:29 I've created an etherpad and most of the discussion is there.
16:31:38 is this the posix lock/process lifetime issue?
16:31:47 dhellmann: Yes. It was :)
16:31:47 Among other things
16:31:50 #link https://etherpad.openstack.org/p/lockutils-issues
16:32:08 Now that's about maybe removing POSIX locks and using only file locks entirely.
16:32:15 YorikSar, beekneemech, jd__ : would tooz help with this?
16:32:19 (amongst other options)
16:32:34 dhellmann: I don't think so. tooz is using this backend for its IPC locks.
16:32:43 :-/
16:32:49 dhellmann: help no, but it'll be related at some point
16:32:55 That's another question I have. I thought that tooz is for distributed locking and lockutils is for local locking.
16:33:34 lockutils unfortunately also got used for distributed locking because it was the only thing available at the time.
16:33:43 ok, I need to absorb this, there's a lot of info here
16:33:52 IPC is distributed across processes
16:33:55 is there a consensus emerging about the approach?
16:33:56 * jd__ naiiiiled it
16:34:05 :-)
16:34:12 jd__: nothing but net
16:34:33 dhellmann: Well... I'm not sure yet.
16:34:49 I wouldn't call it anything like consensus, no. :-(
16:35:29 I think we need more people to weigh in.
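[Editor's note on the lockutils discussion above: the debate is between process-internal semaphores, POSIX/SysV IPC locks, and external file locks that need a configured lock_path. A rough sketch of the two lockutils modes being compared, using the oslo_concurrency import path the library eventually shipped with; at the time of this meeting the module still lived in the incubator as openstack.common.lockutils, and the lock_path value here is purely illustrative.]

```python
from oslo_concurrency import lockutils


# Process-internal locking: a plain semaphore keyed by name, no files involved.
@lockutils.synchronized('state-cache')
def update_cache():
    pass


# Inter-process locking: external=True serializes callers across processes by
# locking a file on disk, which is why a correctly configured lock_path is the
# sticking point in the thread above.
@lockutils.synchronized('state-file', external=True,
                        lock_path='/var/lock/myservice')
def update_state_file():
    pass
```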
16:36:11 ok, I'll catch up on the discussion and see if I can add anything to it, but I think there are probably other people we need to have involved so let's approach them directly and ask them to get involved in the ML thread
16:36:28 Although I can see reverting to file locking and finally merging lockutils into pylockfile entirely as a good enough option...
16:37:00 dhellmann: YorikSar: i'll take a look--get me urls in #-infra at your convenience
16:37:07 fungi: thanks!
16:37:42 fungi: done
16:37:49 YorikSar: that approach seems ok, but we should look at the history of why we moved to posix locks before jumping back and reintroducing an old problem
16:38:02 jd__: is probably the one to answer that.
16:38:16 I _think_ the only reason was to remove the need for lock_path being set.
16:38:17 dhellmann: We'll need a lot of input from operators here.
16:38:36 it was an optimization
16:38:38 dhellmann: If lock_path is set correctly by everyone anyway, then we can safely move to it.
16:38:42 YorikSar: should we start asking for that now, or wait until we have a proposal for a change?
16:38:44 I think going back to file and moving to sysv/tooz later is ok
16:38:53 Which we had actually discussed addressing in a different way.
16:39:24 dhellmann: I think we should wait for our discussion to settle a bit first.
16:39:46 jd__, beekneemech : could you work on adding the details of that change to https://etherpad.openstack.org/p/lockutils-issues
16:39:53 YorikSar: ok
16:39:53 we should just have multiple impls :)
16:40:17 k
16:40:19 We could, but we need a sane default anyway for the existing consumers.
16:40:19 dimsum_: if there's some sort of fundamental issue with one, we wouldn't want anyone to actually use it, right?
16:40:26 jd__: thanks
16:40:28 dhellmann: right
16:40:33 jd__: We shouldn't first go back to file locks just to go to SysV locks later. We should pick one direction and follow it.
16:41:07 YorikSar: perhaps, but if we consider this a critical problem a short-term fix followed by a different longer-term fix might be appropriate
16:41:17 * jd__ needs to read it first
16:41:38 we're not going to resolve it today, I think, so let's keep going
16:41:43 k
16:41:44 #topic bug fix for mask_password
16:41:46 dhellmann: There are rather clear implementation plans for every direction in the etherpad...
16:41:55 dhellmann: Sure
16:42:13 #undo
16:42:14 Removing item from minutes:
16:42:17 YorikSar: ok, that may change my opinion then
16:42:25 #topic bug fix for mask_password
16:42:25 amrith has a patch to mask_password, which is part of strutils in the incubator but is going to move to oslo.log later
16:42:25 #link https://review.openstack.org/#/c/113407/
16:42:25 this one I think we can merge now, since it changes something that isn’t in a library yet
16:42:51 dimsum_, amrith, and I discussed this a bit on irc right before the meeting today
16:43:23 I've also proposed a change to update the oslo.log spec:
16:43:24 #link https://review.openstack.org/114579
16:44:08 net is...mask_password will live in oslo.utils and oslo.log will depend on oslo.utils
16:44:16 oh, actually, I didn't update my notes for the meeting after we talked, so that resolution I pasted above is wrong -- we're going to put mask_password in oslo.utils.strutils so that no libraries have to depend on oslo.log
16:44:23 yep
16:45:20 Works for me.
16:45:59 good
16:46:06 #topic oslo.utils release review
16:46:06 So what should be the way for amrith?
16:46:14 #undo
16:46:15 Removing item from minutes:
16:46:17 Should he propose a change to oslo.utils first?
16:46:34 * YorikSar is sorry for being slow
16:46:55 dimsum_: did we say merge amrith's patch first, then move, or do them separately?
16:47:19 dhellmann: we did not talk about that
16:47:25 we can easily land a patch to put the function in oslo.utils, so I'm not sure it matters
16:47:38 +1
16:48:02 So fix the function first, add it to oslo.utils later, right?
16:48:08 let's move the function as-is to oslo.utils, since it is under graduation, and then backport the change to incubator
16:48:30 that's in keeping with our current policies, and I don't think it will delay amrith much because we'll land our move patch quickly
16:48:32 +1 to dhellmann's suggestion
16:48:43 dhellmann: Ok, got it.
16:48:44 dimsum_, can you start the move patch today?
16:48:44 Agreed.
16:48:56 if not we can find another volunteer
16:49:04 +1 i will
16:49:07 dimsum_: thanks
16:49:20 #action dimsum_ propose patch adding mask_password to oslo.utils.strutils
16:49:24 Great
16:49:33 anything else for oslo.utils?
16:49:49 Not from me :)
16:49:56 :-)
16:49:58 #topic oslo.db release review
16:50:05 rpodolyaka, do you have any updates for us?
16:50:14 or need anything?
16:50:24 still waiting for a few patches to be merged into consuming projects
16:50:56 and you are welcome to review the patches in oslo.db :)
16:51:11 rpodolyaka: ok, are those patches in other projects blocked at all in a way that I can help with?
16:51:11 there are a few of them which need a second +2
16:51:21 rpodolyaka: any day now I'm going to keep my promise of doing more reviews :-)
16:51:38 rpodolyaka: if you email me a list, I can bring it up at the cross-project meeting next week
16:51:40 dhellmann: probably just a lack of reviewers in glance or cinder
16:51:53 I know they had a midcycle meetup this week (cinder)
16:52:09 and we updated the patch the day before yesterday
16:52:10 rpodolyaka: ok, maybe bring it up in #openstack-cinder to see if anyone is listening there
16:52:19 dhellmann: sure!
16:53:09 so I hope we'll cut a new release next week
16:53:34 jd__: looks like cinder is going to take another look at your patch (see #openstack-cinder)
16:53:36 this one will contain mysql-connector fixes
16:54:08 and yeah, please vote in the thread in which I nominated zzzeek for the oslo.db core reviewers team :)
16:54:19 rpodolyaka: yes, I think we should mention to the projects that this is blocking a release, but that we're not going to block forever and it may break their app -- the changes should be pretty small, so I think we just need to raise awareness
16:54:29 rpodolyaka: oh, yeah, good :-)
16:54:35 dhellmann: agreed
16:54:46 is there anything else on oslo.db?
16:55:03 that's all I have for now
16:55:21 viktors, i159 are both on PTO and zzzeek seems to be missing now
16:55:37 zzzeek sent me email that he had a conflict today
16:55:52 just a few minutes left, so let's talk about...
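[Editor's note on the mask_password topic settled above: the function scrubs credential-looking values from a string before it is logged, which is why the team wants it in oslo.utils.strutils so libraries do not have to depend on oslo.log. A small usage sketch, shown with the oslo_utils import path it graduated to; the sample payload and the exact output formatting are illustrative and may differ between versions.]

```python
from oslo_utils import strutils

# Scrub password-like values from a message before logging it.
payload = "req: {'auth': {'password': 'super-secret', 'user': 'admin'}}"
print(strutils.mask_password(payload))
# Expected shape: req: {'auth': {'password': '***', 'user': 'admin'}}

# The replacement marker can be customized via the secret argument.
print(strutils.mask_password(payload, secret='xxxx'))
```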
16:55:52 #topic review priorities for this week
16:56:06 we had a few mentioned earlier in the meeting
16:56:07 the mask_password patch is a good candidate: https://review.openstack.org/#/c/113407/
16:56:07 adding list_opts to incubated modules: https://review.openstack.org/#/c/113940/
16:56:07 Set sample_default for rpc_zmq_host: https://review.openstack.org/#/c/114031/
16:56:08 oslo.serialization: https://review.openstack.org/#/q/status:open+project:openstack/oslo.serialization,n,z
16:56:13 as well as oslo.db
16:56:28 https://review.openstack.org/#/q/status:open+project:openstack/oslo.db,n,z
16:56:33 thanks, rpodolyaka
16:56:46 does anyone have anything they want us to focus on that's not mentioned there?
16:57:17 oh, you should all also check out the fancy new dashboard link on https://wiki.openstack.org/wiki/Oslo#Review_Links
16:57:29 * dhellmann is improving his gerrit query skills
16:57:51 nice!
16:58:36 we'll have to update the list of graduating libraries for that part of the page to stay current, but that won't change very often
16:59:02 2 minutes left for
16:59:05 #topic open discussion
16:59:09 We just got the oslo.concurrency repo creation merged in. Thanks, fungi!
16:59:15 thanks, fungi!
16:59:17 \o/
16:59:21 So we might need to change that list already ;)
16:59:22 woot
16:59:24 and good work, YorikSar
16:59:37 yw
16:59:37 I had a question about what to do with things like https://review.openstack.org/#/c/87375/
16:59:40 #action make sure oslo.concurrency is in the graduating libs query for the dashboard
17:00:11 oh, good question, beekneemech
17:00:12 It touches a bunch of files that are graduating, but backporting fixes from all of them is going to be awful.
17:00:16 lucky for me we're out of time :-)
17:00:20 :-)
17:00:32 seriously, let me look at the patch and see if I can come up with a suggestion
17:00:46 Sounds good
17:01:01 this may reinforce my idea of early branching and deleting from master
17:01:13 Hmm, true.
17:01:25 we're over time, thanks for coming everyone, good discussion and progress this week
17:01:26 keep it up!
17:01:40 #endmeeting
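[Editor's closing note on bug 1354521 from the action-item review at the top of the meeting: the mocking problem arises because a test has to patch utcnow in the module the code under test actually imports, and with both an incubator copy and oslo.utils in play, patching one copy leaves the other untouched. A hedged sketch of the distinction; the test name and frozen time are illustrative.]

```python
import datetime
from unittest import mock

from oslo_utils import timeutils

FROZEN = datetime.datetime(2014, 8, 15, 16, 0, 0)


def test_with_frozen_clock():
    # Patching oslo_utils.timeutils.utcnow only affects callers of the
    # library copy; code still importing an incubator copy of timeutils
    # keeps returning the real time, which is what the bug describes.
    with mock.patch.object(timeutils, 'utcnow', return_value=FROZEN):
        assert timeutils.utcnow() == FROZEN
```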