15:00:44 #startmeeting stable
15:00:47 o/
15:00:48 Meeting started Tue Mar 22 15:00:44 2016 UTC and is due to finish in 60 minutes. The chair is mriedem. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:50 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:52 The meeting name has been set to 'stable'
15:00:59 #link agenda https://wiki.openstack.org/wiki/Meetings/StableTeam#Agenda
15:01:16 hi, anyone around?
15:01:16 hi
15:01:30 o/
15:01:50 will wait a couple minutes
15:01:58 ihrachys: tonyb: ping stable team meeting
15:02:05 jroll: ping
15:02:09 o/
15:02:17 I thought one o/ is enough :)
15:02:23 hi
15:03:05 ok, let's get started
15:03:17 #topic status
15:03:26 #link stable-tracker etherpad https://etherpad.openstack.org/p/stable-tracker
15:03:48 the liberty section in there needs some cleanup, e.g. the trove stuff is fixed
15:03:56 #action mriedem to clean up the stable-tracker etherpad
15:04:23 the most recent issue was the cryptography 1.3 release breaking glance unit tests
15:04:31 a 1.3.1 release is out which fixes the problem
15:04:36 a blacklist will work its way through g-r
15:05:00 there is one item for kilo, which is an old item, just wasn't on here
15:05:05 mriedem, I had a report of a nova-api failure even w/ 1.3.1, waiting for details
15:05:18 apevec: on liberty?
15:05:27 unit tests?
15:05:36 not sure if the test env was clean, not even sure which nova version it was
15:05:40 ok
15:05:46 so the kilo issue is ironic
15:05:50 it was w/ packstack using RDO trunk repos, will get back w/ more
15:05:51 hence me summoning jroll
15:05:57 yar.
15:06:03 here is an example https://review.openstack.org/#/c/271391/
15:06:12 the dsvm ipa jobs seem to be totally broken
15:06:17 and have been since the last kilo release
15:06:18 so we noticed this a while back
15:06:26 some folks tried to dig into it, didn't get very far
15:06:31 got distracted with other things it seems
15:06:39 yeah, i tried backporting this https://review.openstack.org/#/c/277921/
15:06:49 but one of the jobs was still failing even with that
15:06:58 and now ianw has a -1 on both backports
15:07:03 jroll, would anyone scream if ironic kilo were EOLed?
15:07:08 like right now
15:07:30 well i think there are a couple of options,
15:07:34 apevec: nobody has screamed about it being broken, so I would think not, but who knows
15:07:47 right now https://review.openstack.org/#/c/271391/ has 5 failing jobs, 2 are non-voting
15:07:47 I'm also happy to dig into this once we get our mitaka stuff out the door this week
15:07:58 when i had https://review.openstack.org/#/c/277921/ as a dependency, there was one failing job
15:08:23 so the non-voting jobs could probably be ignored for kilo
15:08:49 and the other remaining failure could be investigated, but someone would have to work those backports with ianw's -1 on them
15:09:07 although if he has a problem with the backports, then he probably has a problem with the devstack plugin change in ironic from which they originated
15:09:20 right
15:09:35 i would expect the ironic team to work that with ianw if that's the option to take
15:10:12 there would still be the matter of the single failing job - would need investigation. if it's just a busted job, then we could skip it for kilo i guess
15:10:32 gate-tempest-dsvm-ironic-pxe_ipa-ipxe was the job that was still failing btw
15:10:39 http://logs.openstack.org/91/271391/2/check/gate-tempest-dsvm-ironic-pxe_ipa-ipxe/2fe3646/
15:10:46 * jroll looking
15:11:08 another alternative is to eol ironic before 5/2
15:11:21 i'd prefer not doing that if possible
15:11:33 sure
15:12:07 the failure in that job though is a result of the dependent patch to restart n-cpu
15:12:09 http://logs.openstack.org/91/271391/2/check/gate-tempest-dsvm-ironic-pxe_ipa-ipxe/2fe3646/logs/devstacklog.txt.gz#_2016-02-09_20_12_13_294
15:12:12 huh, that's odd http://logs.openstack.org/91/271391/2/check/gate-tempest-dsvm-ironic-pxe_ipa-ipxe/2fe3646/logs/devstacklog.txt.gz#_2016-02-09_20_12_13_299
15:12:14 yeah
15:12:32 curious why is_service_enabled would return that
15:12:33 http://logs.openstack.org/91/271391/2/check/gate-tempest-dsvm-ironic-pxe_ipa-ipxe/2fe3646/logs/screen-n-cpu.txt.gz?level=TRACE#_2016-02-09_20_11_21_954
15:12:47 it's failing to restart b/c it can't connect to keystone,
15:12:52 which is what that restart fix is for
15:12:59 O_o
15:13:04 hm.
15:13:22 anyway, yeah I think we can probably pursue this
15:14:19 jroll: ok, so it'd be good to have some notes or a plan in the stable-tracker etherpad under this issue
15:14:25 it's now on my todo list.
15:14:27 so that by next week we can assess
15:14:38 we're getting mitaka out the door this week, won't be able to hit this until thursday or friday
15:14:40 #action jroll and company to work the kilo failures for ironic and come back next week with results
15:14:45 that's fine
15:14:49 same time next week?
15:14:59 2100 on monday i think
15:15:05 Monday March 28th at 2100 UTC in #openstack-meeting-4 (mriedem to chair)
15:15:22 okay
15:15:24 i'll add that dependency back to the ironic kilo change and see what progress is made
15:15:39 ok, thanks jroll
15:15:52 np, thanks for bringing it up
15:15:57 #topic action items from previous meeting
15:15:59 <3 surprise meetings
15:16:03 heh
15:16:11 1. mriedem to talk to Daviey, ttx and dhellmann about a stable/kilo release
15:16:21 The next kilo release is scheduled for 2016-05-02 (which is also the scheduled EOL date). Kilo is still a coordinated release and historically CVEs have not triggered those releases, so we'll wait until 2016-05-02. Note that we'll still be talking at the summit about extending stable/kilo another 6 months, but we might have to consider EOL for some projects that aren't passing Jenkins anymore, e.g. ironic
15:16:41 2. mriedem to start prodding other projects like keystone and cinder to apply for the stable:follows-policy tag
15:16:46 cinder and keystone were poked last week; so far no governance changes for the stable:follows-policy tag on those projects.
15:16:56 bknudson: ^ you might want to do that for keystone
15:17:06 I'll put it on the agenda for today
15:17:10 bknudson: thanks
15:17:26 i already poked cinder once, i'm not sure i feel like doing it again
15:17:45 mriedem: this kilo thing is why I haven't applied for that tag fwiw :P
15:18:06 jroll: ok, probably a good thing b/c i'd -1 it :)
15:18:09 :)
15:18:17 #topic release news
15:18:23 only one item
15:18:26 designate 1.0.2 release request for liberty: https://review.openstack.org/295527
15:18:31 i've already +1ed that
15:18:40 #topic tagging
15:18:54 i counted 4 open reviews to the governance repo for the stable:follows-policy tag
15:18:58 1. designate: https://review.openstack.org/#/c/289513/
15:19:08 i'm leaning toward +1 on that
15:19:11 o/ sorry late
15:19:20 now that designate has a periodic-stable job running and passing
15:19:31 and from what i've seen their stable/liberty release requests have had clean change logs
15:19:51 i've struggled a bit in reviewing requests for the tag from projects that i've had little exposure to,
15:20:06 so i've tended to trust them at the outset and then if they break policy the tag could be removed as a stick
15:20:14 +1
15:20:27 2. neutron + friends: https://review.openstack.org/289970
15:20:41 i need to get back to that one
15:20:48 the neutron stadium scares me for the stable branch tag
15:20:54 too many cats to herd
15:20:58 mriedem: for now it's just core repos
15:21:05 ihrachys: ok, that should help
15:21:11 i see armax is -1
15:21:11 mriedem: but there is a question from armax on what we do with the previous tag
15:21:18 probably valid to follow ihrachys's lead on those
15:21:31 yeah i'm deferring to ihrachys on that one mostly
15:21:43 ttx: i forget what we said about replacing the has-stable-branches tag
15:21:54 should we leave it in, or remove it?
15:21:59 mriedem: do we want to replace tags?
15:22:04 mriedem: waiting until follows-policy is mostly applied and then remove it completely
15:22:10 mriedem: or do we want to simply add the follows-policy tag?
15:22:11 ttx: ok
15:22:14 armax: yeah for now
15:22:22 but why?
15:22:25 armax: sounds like we'll do a mass cleanup later
15:22:30 mriedem: ah, ok
15:22:46 mriedem: so others will have the tag replaced too?
15:22:51 yeah
15:22:52 mriedem: if that's the case, then it's fine
15:22:59 armax: has-stable-branches only means "has branches named stable/*", and the git repo is more accurate to determine that
15:23:08 ttx: understood
15:23:18 ttx: that was the reason for my question/-1
15:23:19 also it has no value for downstream consumers
15:23:29 armax: so I guess you can revert the vote now :)
15:23:35 (since having a stable branch doesn't mean you follow any policy)
15:23:37 ihrachys: done
15:23:41 thanks!
15:23:53 3. oslo: https://review.openstack.org/292633
15:24:10 so i have some concerns with that one
15:24:14 but oslo is odd
15:24:33 what's odd about oslo?
15:24:40 most of the stable branch violations for oslo, in my mind, have not been bad backports, which is what the policy covers, but backward-incompatible changes on master which broke stable
15:24:53 which really falls more under lifeless' spec for backward compat
15:25:09 mriedem: yeah, because we consume master releases in stable
15:25:17 we *did* consume
15:25:17 but when oslo lib foo on master releases something which breaks stable branches, it sticks in my mind as breaking stable policy
15:25:22 ihrachys: yeah, did
15:25:35 so it's become less of an issue since upper-constraints in liberty is frozen
15:25:46 right. assuming projects adopted them :)
15:25:54 not sure whether too many projects did
15:26:03 ihrachys: probably more so in mitaka
15:26:09 at least for unit test jobs
15:26:17 yes. though from the neutron side, it's all constrained in liberty
15:26:37 also on that oslo one, there was a release request for kilo and oslo.messaging last week which i -1'ed b/c it would have bumped the minimum required version of a runtime dependency,
15:26:45 which is against stable policy really
15:27:00 so the release request was dropped, but that effectively kills stable/kilo for oslo.messaging
15:28:01 mriedem: then shouldn't we revert whatever caused the bump?
15:28:08 ihrachys: that's the other option
15:28:13 which i pointed out in the release request
15:28:28 anyway, i've -1'ed the stable:follows-policy request for oslo on the grounds of the oslo.messaging thing
15:28:45 since the +1 votes are rolling in
15:28:55 4. trove: https://review.openstack.org/#/c/295733/
15:29:09 the trove one is adding several tags, i've asked that they split that up
15:29:33 with the recent revert of the api change for the slave_of removal stuff, i'm still a bit sore about the trove one
15:30:16 but that's more about backward compat again than stable policy
15:30:20 so i might be too harsh here
15:30:39 anyway, if people can help review those i'd appreciate it
15:30:44 moving on
15:30:51 #topic stuck reviews
15:31:00 there were none on the agenda, did anyone have something for this?
15:31:15 #topic tooling
15:31:21 1. script to check for unreleased changes https://review.openstack.org/#/c/270273/ is merged; zuul cloner dumps a lot of info logging and it would be nice to disable that...
15:31:42 i need to start using that to see what other projects should consider a liberty release
15:31:49 2. TODO: script to check for backport-potential bugs which are fixed on trunk but for which backports are not yet proposed; markus_z might have something for this in nova.
15:32:01 ihrachys: ^ i forget but you might have something for that one too
15:32:25 3. TODO: reviewstats; there is a 'stable' group but it's only for stable-maint-core, there isn't an easy way (that I can see) to check for stable branch reviewers on a given project (would have to add it).
15:32:38 i've worked around ^
15:32:49 by hacking reviewstats and running it
15:32:53 for the right project and core team
15:33:26 e.g. designate was requesting we add someone to designate-stable-maint, i ran the stats and found that person hadn't reviewed anything in stable for designate in 90 days or something
15:33:28 so i nacked it
15:33:56 ^ not a fun conversation btw
15:34:04 since it was the PTL making the request
15:34:18 #topic open discussion
15:34:28 there was nothing on the agenda, does anyone have anything for this?
15:34:34 nope
15:34:37 no
15:34:46 mriedem: re the tool, I have a tool that removes -potential tags from bugs that are merged in the corresponding branch
15:34:53 ihrachys: ah ok
15:35:09 and I need to get back to it and move it to release-tools.
15:35:10 i guess one quick reminder is that we'll have a fishbowl session at the summit about stable branch EOL,
15:35:16 i haven't yet seen what time that is
15:35:16 you may add an action for me :)
15:35:49 #action ihrachys to add the tool that removes the -backport-potential tag from bugs merged in the corresponding branch to release-tools
15:36:11 ok, well i'll wrap this up, thanks everyone for joining!
15:36:20 even against some people's wills :)
15:36:23 #endmeeting
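
A minimal sketch of the tag-cleanup idea ihrachys describes under open discussion: remove the per-series "-backport-potential" tag from bugs whose fix has already merged on the corresponding stable branch. This is not the actual script headed for release-tools; the project name, series, and the comment-scanning heuristic used to detect the merge are assumptions for illustration only.

    #!/usr/bin/env python
    # Illustrative sketch only (not the release-tools script): drop the
    # "<series>-backport-potential" tag from bugs whose fix already merged
    # on the matching stable branch. PROJECT, SERIES and the merge-detection
    # heuristic below are assumptions.
    from launchpadlib.launchpad import Launchpad

    PROJECT = 'nova'      # example project, assumed
    SERIES = 'liberty'    # example series, assumed
    TAG = '%s-backport-potential' % SERIES
    BRANCH = 'stable/%s' % SERIES

    def merged_on_branch(bug, branch):
        # Heuristic: look for a Gerrit bot comment mentioning the branch.
        for message in bug.messages:
            content = message.content or ''
            if branch in content and 'Submitted' in content:
                return True
        return False

    def main():
        lp = Launchpad.login_with('stable-tag-cleanup', 'production')
        project = lp.projects[PROJECT]
        for task in project.searchTasks(tags=[TAG],
                                        status=['Fix Committed', 'Fix Released']):
            bug = task.bug
            if not merged_on_branch(bug, BRANCH):
                continue
            bug.tags = [t for t in bug.tags if t != TAG]
            bug.lp_save()
            print('Removed %s from bug %s' % (TAG, bug.id))

    if __name__ == '__main__':
        main()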
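
The "script to check for unreleased changes" item under the tooling topic amounts to comparing a stable branch head against the most recent tag reachable from it. The merged script (https://review.openstack.org/#/c/270273/) uses zuul-cloner and differs in detail; the snippet below is only a rough sketch that assumes an already-cloned repository, and the branch name is an example.

    #!/usr/bin/env python
    # Rough sketch of an "unreleased changes" check: list commits on a stable
    # branch since the last tag reachable from it. Assumes the repository is
    # already cloned locally; the real tooling uses zuul-cloner instead.
    import subprocess

    def unreleased_changes(repo_path, branch='origin/stable/liberty'):
        def git(*args):
            return subprocess.check_output(
                ('git', '-C', repo_path) + args).decode('utf-8').strip()
        last_tag = git('describe', '--abbrev=0', branch)
        log = git('log', '--oneline', '%s..%s' % (last_tag, branch))
        return last_tag, [line for line in log.splitlines() if line]

    if __name__ == '__main__':
        tag, commits = unreleased_changes('.')
        print('%d change(s) since %s' % (len(commits), tag))
        for commit in commits:
            print('  %s' % commit)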