17:02:03 #startmeeting charms
17:02:03 Meeting started Mon Aug 7 17:02:03 2017 UTC and is due to finish in 60 minutes. The chair is tinwood. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:02:04 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:02:06 The meeting name has been set to 'charms'
17:02:22 hi everyone, welcome to the OpenStack charms fortnightly meeting.
17:02:28 okay, let's kick off.
17:02:41 #topic Review ACTION points from previous meeting
17:03:04 there's nothing in the agenda on this topic: https://etherpad.openstack.org/p/openstack-charms-weekly-meeting-20170807
17:03:25 Anyone have any outstanding actions they want to bring up?
17:03:55 okay, in that case ...
17:04:05 #topic State of Development for next Charm Release
17:04:24 So, it's getting close to the next release:
17:04:33 Aug 24 - OpenStack Charms Feature Freeze
17:04:33 Sept 1 - Upstream OpenStack Pike Release
17:04:33 Sept 7 - OpenStack Charms Release
17:04:59 beisner, as you're here, any thoughts on this? :)
17:05:11 * beisner likes
17:05:56 We need to make sure high/crit bugs have traction now in order to make freeze.
17:06:21 Is the gate running pike tests atm?
17:06:29 not yet
17:06:31 Isn't the freeze a feature-freeze; do we continue on any high priority bugs?
17:06:34 ack
17:06:36 It is not yet.
17:06:39 pike b3 is in -staging
17:06:43 yes tinwood we can still do crit bugfixes in freeze
17:06:56 we just have to understand the impact on the test/revalidation matrix each time something lands during freeze.
17:07:42 jamespage, another subtopic here is the gnocchi charm. Any updates for the meeting?
17:08:04 aiming to get all existing charm touches done before Wednesday
17:08:19 most things are ready - need to retro my ceilometer changes but I think they are nearly there
17:08:31 excellent!
17:08:35 only thing I've not managed to touch is the radosgw support
17:08:55 but that was kinda blocked on the service-discovery spec which remains un-implemented
17:09:13 so I'll suggest we defer that for this cycle
17:09:27 project-config and governance changes are up for gnocchi
17:09:27 un-implemented in charms or un-implemented upstream in radosgw?
17:09:32 so we'll leave it as an early-release charm?
17:09:37 gnuoy: un-implemented in the charms
17:09:45 ack
17:10:08 for reference - http://specs.openstack.org/openstack/charm-specs/specs/pike/approved/service-discovery.html
17:10:22 thanks jamespage
17:10:40 The other sub-topic is snap integration.
17:11:03 snap integration has been put on hold for this release cycle
17:11:06 Any updates there w.r.t. the charm release?
17:11:10 One more for state of development:
17:11:12 Dual-stack IPv4 and IPv6 support is making progress. Working out SSL bits this week.
17:11:40 I am reviewing a jamespage charmhelpers change today which will be related
17:11:56 thedac: ta
17:12:05 thanks for the update, thedac
17:12:12 haproxy bits are as well - backend/frontend mapping needs some work
17:12:12 I'm in the midst of dropping the cinder v1 API for pike. still testing b3 of pike with the charms so there may be other updates needed.
17:12:29 coreycb: that ceph fix should be through soon
17:13:05 jamespage: great. I'm backporting ceph now. seems like that may finish building by the end of August. :)
17:13:26 Also, from me, worker multiplier and ipv4 memcache are now in the reactive charms.
17:13:26 yeah it's slow - I have to limit parallel execution otherwise LP builders run out of RAM
17:13:35 tinwood: ooo good
17:13:40 options.workers right?
17:13:44 yup
17:14:10 and a more extensive options. (I'd have to look it up) for the apache worker config.
17:14:17 tinwood: thanks for picking that one up
17:15:00 And, hopefully landing this week, some refinements to designate in how much 'work' it does during an update-status hook.
17:15:24 Any more for this topic?
17:15:29 a write-up of that would be good for those who follow
17:15:36 so we don't repeat the same mistakes...
17:15:56 Blog post or doc page?
17:16:15 blog would be nice
17:16:22 okay :)
17:16:22 doc page with a blog post pointing to it would be good imo
17:16:49 #action tinwood write up how to 'quieten' down a reactive charm during update-status, et al.
17:16:59 that way we have some doc thing under rev control, and a blog post where ppl can get the extra fluff/context.
17:17:04 :-)
17:17:21 Is there any science on how widely deployed the designate charm is, ooi?
17:17:26 Okay, I'll follow that lead.
17:17:56 gnuoy, it's being tested by CPE at a customer site at the moment.
17:18:22 ack, people are using it, cool
17:18:36 gnuoy, indeed, and finding bugs :)
17:18:46 there are no bugs in that charm
17:18:50 :)
17:18:54 :)
17:18:57 That's right; I'm not fixing them.
17:19:00 :)
17:19:23 okay ... any more, as otherwise we're on to bugs ...
17:19:36 #topic High Priority Bugs
17:19:48 For review: https://tinyurl.com/osc-high-priority
17:20:01 and https://tinyurl.com/osc-critical-bugs
17:20:49 TWO new criticals on ceph/ceph-mon?
17:21:03 Plus, a very old hacluster bug?
17:21:19 tinwood: yeah I raised those today - needed for pike UCA
17:21:26 otherwise you don't get stats of any sort
17:21:34 tinwood: fairly trivial changes to the charms tbh
17:21:39 I'll prob get to those
17:21:53 Okay, thanks for the clarification! :)
17:21:54 hmm, that hacluster bug seems strangely familiar
17:22:15 I'll ping the guy who raised it
17:22:18 I disagree on that being critical so I've dropped it to high
17:22:49 +1
17:22:54 wolsen discussed tackling that bug a while back.
17:23:15 yeah
17:23:20 it's annoying
17:23:24 jamespage, oddly, is it still on the old hacluster charms project? https://bugs.launchpad.net/charm-hacluster/+bug/1478980
17:23:24 Launchpad bug 1478980 in OpenStack hacluster charm "If the principle updates a resource parameter of an already configured resource hacluster ignores it" [High,Triaged] - Assigned to Billy Olsen (billy-olsen)
17:23:33 nope
17:23:39 charm-hacluster is right
17:23:53 Oh, yeah, it says 'invalid'
17:24:01 anyway, good to switch focus to bugs as features complete, folks
17:24:13 Absolutely.
17:24:36 we're running short on time, so can we move on?
17:24:57 #topic OpenStack Events
17:25:11 PTG in September: any updates?
17:25:31 on my list for this week to get the etherpad circulated for ideas for the room
17:25:43 probably fewer design conversations, more sprinting on features early
17:25:47 Would you like an action?
17:25:51 I'd like to nail the service-discovery one
17:25:58 tinwood: nah I'll do it anyway
17:26:03 'kk
17:26:29 So, finally:
17:26:37 #topic Open Discussion
17:26:41 the floor is open:
17:26:57 (also good to see gnuoy) :)
17:27:14 thanks :)
17:27:46 Any comments for Open Discussion?
17:28:47 yeah
17:29:06 the barbican charm - it's not been promulgated into the general space, still in the openstack-charmers space - so it's beta?
17:29:42 wolsen, I'll take that ..
17:30:11 So the barbican charm is still marked 'experimental' as it doesn't really have a back end.
17:30:31 ah
17:30:33 Although it's looking like there might be a HashiCorp Vault one coming in Queens
17:30:37 ok
17:30:39 (for production)
17:30:49 agree - I think we decided to wait to promulgate until there was an actual hardware story to test with.
17:30:53 However, we definitely want to know if people do want to use it.
17:30:56 have to drop - thanks for hosting tinwood
17:31:03 np jamespage
17:31:07 indeed, thanks tinwood
17:31:08 I think that makes sense - I've seen a request to use encrypted cinder volumes today
17:31:15 which would require the barbican charm
17:31:30 It either needs an HSM or Vault (Queens)
17:32:23 wolsen, it's also relatively easy to work up a config backend subordinate for barbican -- just need the use case
17:32:39 tinwood: makes sense
17:33:09 I was just gathering the current state of it in order to understand the request as it relates to the use case presented by the user (encrypted volumes)
17:33:48 wolsen, okay; it'd probably need a little work on the HA side too - but most of that's already supported in charms.openstack.
17:34:04 yep
17:34:06 makes sense
17:34:36 Excellent; would love to get that charm out into the wild :)
17:35:03 I think we're slightly over time, but any more?
17:35:12 nope, all good here - thank you
17:35:26 In that case: thanks everybody for coming and for your contributions. And see you at the next one: details are at https://etherpad.openstack.org/p/openstack-charms-weekly-meeting
17:35:33 #endmeeting
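
A minimal sketch of the update-status 'quieten' approach discussed above (the designate refinements and the #action for tinwood), assuming a charms.reactive charm with charmhelpers available; the handler and flag names here are illustrative, not taken from the designate charm itself:

    # update-status fires every few minutes and re-dispatches all
    # matching reactive handlers, so expensive handlers should bail
    # out early when nothing has actually changed.
    from charmhelpers.core import hookenv
    from charms.reactive import when

    @when('config.rendered')  # illustrative flag name
    def reassess_services():
        if hookenv.hook_name() == 'update-status':
            # Skip template rendering and service restarts so the
            # hook stays cheap and services aren't churned needlessly.
            return
        # ... expensive render/restart work would go here ...

The design point is simply that handler guards, not handler bodies, should do the filtering: checking hookenv.hook_name() at the top keeps the frequent update-status dispatches close to free.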