19:12:16 #startmeeting TripleO
19:12:17 Meeting started Tue Jul 29 19:12:16 2014 UTC and is due to finish in 60 minutes. The chair is slagle. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:12:18 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
19:12:20 The meeting name has been set to 'tripleo'
19:12:23 thank you slagle
19:12:30 wooo slagle!
19:12:39 #topic agenda
19:12:48 bugs
19:12:52 reviews
19:12:56 Projects needing releases
19:12:59 CD Cloud status
19:13:02 CI
19:13:04 Tuskar
19:13:07 Specs
19:13:12 one-off items
19:13:15 open discussion
19:13:32 #action slagle make the agenda copy/pasteable from the wiki page
19:13:39 #topic bugs
19:13:50 let's talk about untriaged bugs everyone!
19:14:01 #link https://bugs.launchpad.net/tripleo/
19:14:01 #link https://bugs.launchpad.net/diskimage-builder/
19:14:01 #link https://bugs.launchpad.net/os-refresh-config
19:14:01 #link https://bugs.launchpad.net/os-apply-config
19:14:01 #link https://bugs.launchpad.net/os-collect-config
19:14:03 #link https://bugs.launchpad.net/os-cloud-config
19:14:06 #link https://bugs.launchpad.net/tuskar
19:14:08 #link https://bugs.launchpad.net/python-tuskarclient
19:15:05 there are 7 untriaged bugs in tripleo
19:15:19 we should have had a bug day last week :(
19:15:28 can folks step in and triage those?
19:15:33 i don't want to go through them all
19:15:49 * slagle looks at criticals
19:16:28 this one was failing CI this morning: https://bugs.launchpad.net/tripleo/+bug/1349913
19:16:31 Launchpad bug 1349913 in tripleo "Error: Failed to execute command via SSH" [Critical,Triaged]
19:16:38 * slagle assigns to derekh.
19:16:42 he's not here to defend himself
19:17:32 the ironic bug is unassigned
19:18:15 hah. I did the same.
19:18:37 lucas gomes +2'd the tripleo revert, so i suspect the ironic bug should be assigned to him
19:18:49 unfortunately i'm totally failing at guessing his launchpad id
19:18:51 so, moving on
19:19:39 all the other crits have assignees
19:20:03 given the midcycle last week, i wouldn't be surprised if not much progress was made given other commitments
19:20:11 so i don't think it's worth going through each one asking for status
19:20:26 does anyone have anything specific to bring up? or ask for help on any they own?
19:20:40 not just the mid-cycle last week
19:21:05 also oscon, and pycon-au starts on friday, and I think there's another con on in the northern hemisphere, and a heat mid-cycle around now
19:21:13 so I'd expect things to be slow for another week or two
19:21:23 yea, agreed
19:21:57 let me take a quick look at the other projects besides tripleo
19:22:08 unless someone already did...
19:22:20 dib ok
19:22:50 huh
19:23:25 os-*-config looks ok. i triaged one bug, there's a fix proposed
19:23:27 lifeless: hi
19:23:32 I thought the meeting was 12 hours out - sorry everyone, and thanks for running it slagle
19:23:38 just running through the bug list
19:23:48 don't let me interrupt, I'm not truly here
19:24:05 tuskar has 20 or so bugs
19:24:22 jdob: it would be cool if someone could run through those and close out the ones that are no longer relevant
19:24:32 will do; I suspect quite a few won't be relevant
19:24:39 yea, i see one about sqlite, etc
19:24:53 if we're not going to fix them, kill with fire
19:25:06 oh, fire will be brought
19:25:18 any other bug business?
19:25:35 #topic reviews
19:26:33 all review stats links are failing to load for me
19:26:43 anyone else having any luck?
19:27:09 stackalytics works for me
19:27:13 #link www.nemebean.com/reviewstats/
19:28:12 #link http://russellbryant.net/openstack-stats/tripleo-openreviews.html
19:28:16 ok, that's loading now
19:28:20 glitch in the matrix i guess
19:28:37 3rd quartile wait time: 13 days, 9 hours, 11 minutes
19:28:49 about the same, or perhaps a little worse
19:29:14 i think we took a vote on something 2 weeks ago
19:29:25 to review the longest waiting reviews or something?
19:29:39 i'm having a hard time remembering. last week wiped out all short term memory
19:29:46 not sure it was a vote or just a request
19:29:53 but that was the general idea
19:30:23 right
19:30:24 so I've been doing that using the dashboard
19:30:26 * derekh_ sneaks in late
19:30:39 speaking of the dashboard, for some reason tuskar had disappeared from it
19:30:44 and finding lots of reviews that are old, and have no negative reviews, but don't have a clear path forward
19:30:45 so if you had it bookmarked, please rebookmark
19:30:56 jdob: did you update the wiki?
19:31:06 yes, and the gerrit dashboard thingie too
19:31:16 tchaypo: i'd humbly suggest a friendly -1 in that case
19:31:29 with a suggestion to define the path forward :)
19:31:45 maybe a spec is needed, discussion on the list, etc.
19:32:08 or a workflow -1
19:32:16 so like
19:32:18 https://review.openstack.org/#/c/106909/
19:32:30 title says WIP, but it's not workflow-1 - worth adding a -1?
19:32:49 tchaypo: yes
19:32:59 if the commit message says WIP, i'd workflow -1 it
19:33:15 it's easy to forget to do that as the patch submitter
19:33:32 I'd love to workflow-1, but I don't have the power :p
19:33:41 also, some of the "longest waiting reviews" have had recent patchsets, i'm not sure these stats are accurate
19:33:51 83 days, 4 hours, 24 minutes https://review.openstack.org/86316 (Add elasticsearch element)
19:34:30 maybe i'm not interpreting things correctly, but i think the stats are misleading a bit
19:34:31 slagle: that's a different section, no? That's longest since oldest rev?
19:35:00 oh, "based on oldest revision"
19:35:12 jp_at_hp: yea
19:35:23 ok, any other review business?
19:35:28 I don't think that dashboard entirely aligns with what reviewstats looks at, but it seems like it should help
19:35:36 for those of us who can't W-1
19:35:44 should we raise it in channel and ask a core to jump on it?
19:36:11 not sure it's worth a disruption, but you could always ping the submitter
19:36:23 ask them if they forgot to update the commit message, or workflow-1
19:36:35 or comment in the review
19:37:21 #topic releases
19:37:44 any volunteers to release?
19:38:33 this week is kinda shitty for me, but sure, I'll do it
19:38:50 (can't think of any new movie quotes on "volunteering", so you get practical this week)
19:39:05 i think you and I are the only ones present who have released :)
19:39:12 and you'
19:39:20 if you don't get to it, ping me
19:39:21 re running the meeting... damn, I'm stuck :D
19:39:25 will do, thanks slagle
19:39:37 #action jdob release the world
19:39:38 i'm sure i will; it's a good break from things that make my head hurt
19:39:51 #topic CD Cloud Status
19:40:00 this one is going to time out in 5 seconds...
19:40:07 #topic CI
19:40:30 we had some blockers, they were reverted (thanks derekh_)
19:40:45 ironic one is still there...
19:40:54 slagle: we're still waiting on the ironic ssh revert, aren't we?
19:40:57 slagle: yup, as of this morning we're now carrying a patch for a horizon regression https://review.openstack.org/#/c/110250/
19:40:57 it was reverted in tripleo-ci
19:41:11 Then soon after ironic merged a commit that causes it to use a new command over ssh, so our tests are failing because ssh on the testenvs is locked down to only allow specific commands
19:41:13 so i'm good to kick off a bunch of rechecks?
19:41:19 #link https://review.openstack.org/#/c/110352/
19:41:37 not yet merged (and not yet passing ci)
19:41:38 and...it failed
19:41:52 we still have no solution for the ironic problem, test revert failed twice now, not sure if it's related or not
19:41:59 I see there is a recheck going again
19:42:08 derekh_: that was me
19:42:16 i can look into it if that fails
19:42:30 although i have some other commitments this evening, so i may not get to it right away
19:42:37 This isn't really a regression in ironic, more that our TEs just don't allow it
19:43:04 I started looking into what we need to do to allow the new commands over ssh but didn't get far before I had to go
19:43:44 #idea move definition of ssh commands needed into ironic test envs...
19:44:05 For now, if the revert passes I think merge it and I'll try and build a new TE tomorrow
19:44:20 jp_at_hp: how do you mean?
19:44:58 if it doesn't pass then maybe another related thing needs to be merged also, somebody in ironic may be able to help there
19:45:02 have ironic describe the commands they will use and pull that for the tripleo testenvs from the latest landed ironic codebase
19:45:21 kinda like rootwrap, but for ssh, and defined at usage-point not by us
19:46:13 there's a handful of ironic errors in the seed logs on that failed CI job
19:46:34 jp_at_hp: ahh ok, if we went down that road we would need to dynamically check them to ensure we were current (possible but maybe tricky to get right and secure)
19:46:35 just need to correlate that with the proposed revert in ironic to see if they're related or not
19:46:53 also there is a tmp eventlet patch somewhere that seems to help with the bug we have been seeing
19:47:02 ironic must have landed that patch with a CI failure from us I presume?
19:47:17 https://review.openstack.org/#/c/109543/
19:47:36 lifeless: it did land with a CI failure (but unrelated)
19:48:55 there was a successful tripleo CI job on the patch on 7/22
19:49:05 #link https://review.openstack.org/#/c/89884/
19:49:11 anyone mind if I +A 109543?
19:49:21 so maybe it's a combination of things, etc
19:49:28 requires more investigation
19:49:29 slagle: ya, I noticed that, I *think* it's because the interface in question wasn't being used on that date, until another commit merged in
19:49:36 derekh_: ah, ok
19:50:13 derekh_: so that's my point, there must be some failure somewhere ;)
19:50:22 derekh_: wouldn't that mean the commit using the interface would also have to revert?
19:50:51 jp_at_hp: yup, so maybe we need to test a double revert
19:51:22 or just the one triggering the use
19:51:26 that's the latest one
19:52:09 lifeless: yup, it would make sense but I couldn't find it
19:53:22 ok, let's pick up the ci topic in #tripleo post meeting
19:53:28 before the meeting times out
19:53:36 #topic tuskar
19:54:01 anything of note?
19:54:06 nothing in particular
19:54:10 ok :)
19:54:13 #topic specs
19:54:16 just the earlier comment about the dashboard
19:54:35 we had a fair amount of discussion about specs at the midcycle
19:54:45 i was going to summarize that in a note to the list
19:55:03 also, as requested i pulled together a draft of the spec approval checklist
19:55:11 slagle: can you also note the thread about voting on a weekly spec minimum?
19:55:12 #link https://wiki.openstack.org/wiki/TripleO/SpecReviews
19:55:43 jdob: you mean right here? or in the email?
19:55:49 in the email :)
19:55:52 jdob: ok
19:55:55 i wondered if that wasn't clear after I hit enter
19:56:10 lifeless: ^ the wiki page about specs approval i said i'd pull together
19:57:06 lifeless: also i'd like to ask for some clarification around approving specs in relation to a "juno priority"
19:57:22 lifeless: if you could see my reply on https://review.openstack.org/#/c/94876/
19:57:48 i think that's something we need to have a consensus on as a group...
19:58:02 but will bring it up in the email
19:58:20 #topic open discussion
19:58:53 i think the midcycle was productive :)
19:59:04 +1
19:59:23 slagle: I will reply there
19:59:30 thanks
19:59:35 see everyone in #tripleo
19:59:45 #endmeeting