08:00:09 #startmeeting ptl_sync 08:00:09 Meeting started Tue Sep 2 08:00:09 2014 UTC and is due to finish in 60 minutes. The chair is ttx. Information about MeetBot at http://wiki.debian.org/MeetBot. 08:00:10 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote. 08:00:13 The meeting name has been set to 'ptl_sync' 08:00:14 #topic Nova 08:00:24 #link https://launchpad.net/nova/+milestone/juno-3 08:00:32 so, yeah, blueprints, yuck 08:00:51 #info 15 implemented, 45 in review 08:01:03 so some improvements 08:01:09 That's only +7 in one week, and we have 0-2 days to go 08:01:16 mikal has been pinging people directly to review the top few 08:01:37 there are one or two almost there, the rest, well we probably should start to chop them I think 08:02:06 I guess anything without a +2 at this point is not going to make it, bar a miracle 08:02:30 Yes, we should allow the last 2 days for gate processing 08:02:39 (+7 in one week is probably the best we have managed all release, but yeah...) 08:03:15 So looking at the top ones... 08:03:21 https://blueprints.launchpad.net/nova/+spec/add-ironic-driver 08:03:21 OK 08:03:54 I am told it's closer than it looks 08:04:24 ok, so we keep it on the j3 list 08:04:33 I think we have to, at this point 08:04:48 https://blueprints.launchpad.net/nova/+spec/compute-manager-objects-juno 08:04:58 This one we have to consider "complete" at some point 08:05:02 yeah, this one is ongoing, yeah 08:05:11 I think that's probably today-ish 08:05:12 what's in-flight already approved for it ?
08:05:37 I think some stuff was mostly with a -1 from something on it 08:05:42 but loads has merged 08:05:59 https://review.openstack.org/#/q/topic:bp/compute-manager-objects-juno,n,z 08:06:21 I will try to catch dansmith when he comes online 08:06:32 (and nikola) 08:07:00 https://blueprints.launchpad.net/nova/+spec/vmware-spawn-refactor 08:07:05 yeah, ideally we would only pursue the ones that make it a bit more logically complete 08:07:09 I approved a chunk of those yesterday 08:07:29 ttx: yeah, good point, but not sure it's really that neat right now, will check 08:07:57 i.e. what REALLY needs to merge so that Juno looks like something vs. what could merge 08:08:37 Is it all contained in https://review.openstack.org/#/q/topic:bp/vmware-spawn-refactor,n,z ? 08:08:42 well, probably nothing in here, it was more high priority because it's blocking loads of the other vmware bits 08:09:11 I think the answer is kind of no, but it has the most important bits after the oslo patch, I think 08:09:33 I think I'll probably block all the patches not on that list 08:10:00 basically, starting tomorrow morning I think we should defer everything that is not already approved and in-flight 08:10:05 right 08:10:11 that's a good line to draw 08:10:13 that leaves today for the last +2s 08:10:23 sounds fair 08:10:38 well, agreed that still makes it very tight 08:10:38 and to draw a line in the sand for ongoing stuff like the objects work 08:11:11 So for this morning I would defer/-2 everything that is just far from making it 08:11:28 yeah, stuff with -1s etc 08:11:34 only keep the stuff that is 80% there at least, and which could possibly be approved today 08:12:13 right, the lows I have no idea on right now, but the mediums are mostly not done, from what I saw yesterday 08:12:33 johnthetubaguy: want us to go through them ?
08:12:37 (the mediums) 08:12:50 maybe pick some examples 08:13:23 actually, not sure it's worth it 08:13:32 some have -2s with "fix up this patch first", etc 08:13:34 hmm yeah 08:13:45 they can go 08:14:05 wondering if https://blueprints.launchpad.net/nova/+spec/per-aggregate-disk-allocation-ratio is not complete already 08:14:09 some just have "sorry, on holiday, will fix this next week" notices on them, but the boat has kinda sailed now 08:14:24 oops, yeah, I merged that, so if it's through the gate now, we should be good 08:14:42 yep, done, thanks 08:16:01 there are some API ones that most people seem to have avoided reviewing, but will see how they are looking this morning 08:16:33 yeah, no other obvious candidate 08:16:46 anyways, I think I know what we have to do I guess, just been putting off the mass -2 stuff 08:17:09 so yeah, we should probably communicate that anything that's not in-flight by tomorrow morning will be deferred 08:17:15 maybe I should just send that overall 08:17:23 oh, that would be cool 08:17:38 Like, it's Juno feature day 08:17:54 + stuff looking blocked at this point, will probably get deferred today, I guess 08:18:00 yep 08:18:13 awesome 08:18:26 at least it will feel consistently painful, if that makes sense 08:18:29 OK, I think we'll use the meeting today to go through the list again 08:18:37 so I won't ask for topics 08:18:41 :) 08:18:46 ah, that's a good plan 08:19:07 about summit stuff, when is that due to open up? 08:19:08 johnthetubaguy: anything else?
08:19:24 We still need to decide how we want to handle that part 08:19:30 just saying, because we are thinking of requiring specs for feature-like submissions 08:19:53 I feel like it's a bit unproductive to open a general CFP 08:20:04 I think teams working on etherpads can do a better job at it 08:20:27 yeah, I guess we need to make that feel as open, and give people feedback somehow 08:20:31 but first I need to check if the space allows for the setup I proposed 08:20:31 but maybe that's just a meeting 08:20:38 I should know about that today 08:20:42 ah, cool 08:21:07 anyways, I am sure you have that all in hand, just checking timescales really 08:21:26 well, the original plan was to open yesterday 08:21:40 but since we're talking about format changes 08:21:44 ... 08:21:53 OK 08:21:56 ok, will send that email now 08:22:03 talk to you later? 08:22:17 later is fine 08:22:56 yeah, I will trawl through the list, and see what I can do to tidy things up a bit 08:23:17 thanks for the help, as always :) 11:45:37 ttx: knock, knock ... back to our usual 1:1 slot? 11:46:01 yes 11:46:07 #topic Ceilometer 11:46:19 BTW hat-tip to gordc for keeping on top of things while I was on vacation 11:46:20 #link https://launchpad.net/ceilometer/+milestone/juno-3 11:46:27 yes, he did a very good job! 11:46:28 LP doesn't quite capture the full status 11:46:37 4 of 11 BPs merged, but another 3 are approved and wending their way thru' the gate as we speak 11:46:48 which ones? 11:46:48 which leaves 4 outstanding (1 high, 2 medium, 1 low) 11:47:01 which ones are gating?
11:47:14 So like I said in a recent email to -dev, ideally starting tomorrow morning we would only keep the ones that are in-flight 11:47:21 yes, which ones are gating 11:47:47 central-agent-partitioning, hash-based-alarm-partitioning, xenapi-support 11:47:55 because i expect it will take 1+ day to get approved stuff through, with the obvious retries 11:48:12 yeap 11:48:21 the high priority BP bigger-data-sql is a bit of a concern at this stage, as the review has stalled out on differing performance test results being seen 11:48:30 (different mysql versions etc) 11:48:40 I'll speak to gordc about that when he comes online (out for Labor Day yesterday) 11:48:59 but, fair warning, I *may* be coming cap-in-hand looking for an FFE on bigger-data-sql to allow time to get more clarity on the performance improvement 11:49:10 on the medium BPs ... 11:49:10 yeah, and that one isn't exactly the kind you would like to grant a Feature Freeze exception for 11:49:41 because it's potentially disruptive 11:49:48 yeah, I'll have a better view later on today on whether that might be required 11:49:50 and affects existing functionality 11:49:58 yeap, fair point 11:50:06 I mean, if it's just a couple extra days away, why not, but I suspect there is more 11:50:26 fair enough, I'll bear that in mind 11:50:33 on the mediums ... 11:50:41 grenade-resource-survivability is still waiting on review attention from jogo & sdague 11:50:45 (again, Labor Day yesterday not ideal) 11:50:58 paas-event-format-for-ceilometer is documentation-only, but review has stalled so I'm prepared to bump it off the juno slate if necessary 11:51:04 eglynn_: but I think we said that one can merge post-j3 11:51:11 (grenade one) 11:51:16 cool 11:51:20 maybe we can push it to rc1 already 11:51:27 yeah, that's fair 11:51:35 it's only extra tests, right ?
11:51:46 exactly, tests only 11:51:55 on paas-event-format-for-ceilometer, I've mailed Phil though, to give him fair warning and an opportunity to close it out 11:51:58 ok, moving to RC1 11:52:18 on the low priority BP still outstanding ... 11:52:26 ipmi-support has patches up, but not yet in a landable state 11:52:48 ... I'm not massively concerned about it missing tho, as it's just an alternative way of generating the same IPMI sensor data that Ironic already notifies us about 11:53:10 ok, shall it be deferred tomorrow if it doesn't get approved today? 11:53:29 yeah if not in flight by EoD, definitely a candidate for bumping 11:53:58 on the j3-targeted bugs, I've bumped the higher priority bugs to RC1 that didn't look like they were going to make it 11:54:30 cdent just did a trawl thru all the outstanding reviews for the lower-priority j3-targeted bugs 11:54:34 ok, we'll just bump to RC1 anything not fixcommitted by tag 11:54:46 cool, that's reasonable 11:56:06 OK, you seem in good shape 11:56:26 so what time on Thurs are you aiming to pull the trigger on the juno-3 tag? 11:56:29 6 left, 3 in flight, 2 which might get bumped, 1 that you could ask FFE for 11:56:45 When all the features in the pipe land 11:56:46 yep, that sums it up 11:56:53 cool enough 11:57:07 ideally sometime during the day :) 11:57:16 but highly dependent on gate load 11:57:30 BTW cdent tells me he just got a +2 from Sean on the grenade patch 11:57:42 shall we bump to rc1 the doc-only one ? 11:57:59 so we may be able to re-instate that to juno-3, as I think jogo is pretty much on board with the approach 11:58:20 yeap, let's do that re.
the paas-event-format 11:58:24 if it makes it we'll retroactively re-add it to j3 11:58:43 but in the meantime we should bump all the non-featury stuff 11:58:43 cool 11:58:58 cool paas-event-format-for-ceilometer bumped to rc1 11:59:19 #info In flight: central-agent-partitioning, hash-based-alarm-partitioning, xenapi-support 11:59:51 #info Shall get bumped if not approved today: ipmi-support 12:00:08 #info Might get FFE-d if not approved today: bigger-data-sql 12:00:22 We'll update that picture at the meeting tonight 12:00:40 eglynn_: thx! 12:01:13 SergeyLukjanov: around? 12:01:16 #info may be reinstated if approved today: grenade-resource-survivability 12:01:23 ttx, yup 12:01:27 #topic Sahara 12:01:41 #link https://launchpad.net/sahara/+milestone/juno-3 12:02:06 SergeyLukjanov: is that current ? Not a lot of progress since last week 12:02:18 hm, looks like it's a bit outdated 12:02:48 SergeyLukjanov: could you update it now? We can discuss it afterwards 12:03:09 we're moving two huge blueprints to K and adding one very small, already implemented 12:03:30 it's required for backward compat support between older versions 12:03:53 I'll update the lp page right now 12:05:15 SergeyLukjanov: let me know when you're done 12:05:24 ack 12:07:35 ttx, good news - we've implemented integration w/ ceilometer (sending notifications), all user-facing strings are now wrapped for translation using i18n and all sahara objects can now be created using heat resources (it's in the gate now) 12:09:14 ttx, am I right that it's ok to work on docs, tests and critical bug fixes after the j3? 12:09:37 yes 12:09:49 You can actually push back doc-only and test-only blueprints to RC1 already 12:09:55 that would add clarity 12:11:53 ttx, ok 12:13:29 SergeyLukjanov: are you done with the J3 status update? 12:14:00 ttx, mostly yes 12:15:18 SergeyLukjanov: which, if any of those, is currently gating?
12:15:50 there are two bps - https://blueprints.launchpad.net/sahara/+spec/edp-swift-trust-authentication that is a re-invented way of working with swift data sources; it fixes a critical security issue re storing credentials 12:16:18 that one is gating ? ^ 12:16:35 and https://blueprints.launchpad.net/sahara/+spec/cluster-persist-sahara-configuration that is needed for backward compat, both could be moved to rc1 due to their "bugfix" nature 12:16:44 oh, sorry, it's not about gating 12:16:49 oh ok 12:17:14 no changes in gate right now 12:17:19 SergeyLukjanov: we would probably have to grant an FFE for the first one, but i wouldn't consider it a bugfix 12:17:23 we had a small rebase bomb explode 12:17:31 since you need new feature code to plug the hole 12:17:56 and probably for the second one too 12:18:06 but those would still be exceptions 12:18:09 ttx, yup, I mean that I'd like to ask for the FFE for these two bps because of their importance and "bugfix" *nature* 12:18:31 #info edp-swift-trust-authentication & cluster-persist-sahara-configuration still in progress, will probably require FFEs 12:18:33 for the second one code is ready and not approved only because of labor day in US :) 12:19:04 and the spec isn't officially approved either (it was discussed on the last IRC meeting and we've agreed that we need it) 12:19:29 cluster-persist-sahara-configuration is not j3 12:19:34 should it be added there ? 12:19:51 ttx, yup, but spec isn't approved 12:20:02 add it with "Blocked" status 12:20:22 ttx, ok, thx 12:20:26 ttx, done 12:20:29 if we know we'll require it, I prefer it tracked/Blocked 12:21:14 Of the 4 "needs code review", should they all just get deferred if they are not approved today?
12:21:55 ttx, I think so 12:22:03 ok 12:22:39 #info cluster-secgroups, swift-url-proto-cleanup-deprecations, ceilometer-integration, anti-affinity-via-server-groups to be deferred if they don't get required approvals today 12:23:03 SergeyLukjanov: OK, I think that's all 12:23:09 ok, thx 12:23:23 SergeyLukjanov: thx! 12:23:27 dhellmann: around? 12:23:34 ttx: here 12:23:35 #topic Oslo 12:24:03 the new project group looks good; we probably should have done this a while back 12:24:04 dhellmann: so the only unintended side effect of the oslo projectgroup transition so far seems to be the BP link hack in gerrit 12:24:33 yeah, is that something we can fix in gerrit, or should we link to our blueprints differently? 12:24:54 it's a bit tricky 12:25:15 the BP name is unique in an LP project namespace, not unique in general 12:25:59 so when you say "blueprint foo-bar", gerrit resolves that to a search for blueprints named foo-bar in the openstack projectgroup namespace 12:26:24 Like https://blueprints.launchpad.net/openstack/?searchtext=ceilometer-integration 12:26:37 which was a suboptimal hack already 12:26:44 I would be ok if we had to say "blueprint oslo/foo-bar" and gerrit found that oslo prefix 12:27:19 we could fix that by making the link smarter 12:27:37 like if no project is specified, search in the same project as the basename of the repo 12:28:01 interesting 12:28:01 so if a review for openstack/nova contains "blueprint foo-bar" you would look for nova/foo-bar 12:28:06 right 12:28:46 that would still make it possible for us to link to blueprints across projects, which is something we do when we submit changes to infra and the requirements project to add a new lib 12:28:56 and if a review for openstack-infra/config contains blueprint oslo.rootwrap/moo then you would look for oslo.rootwrap/moo 12:29:03 right 12:29:15 that seems like a good solution to me 12:29:20 I fear that the linkification code is pretty basic though 12:29:32 might be a basic pattern subst IIRC
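A minimal sketch of the smarter link resolution being discussed: honor an explicit `project/spec` prefix, otherwise default to the project matching the repo basename, and keep the old openstack projectgroup search as the fallback. This only illustrates the proposed rules; gerrit's real commentlink mechanism is a plain pattern substitution, and the function name here is hypothetical.

```python
# Hypothetical resolver for "blueprint <reference>" in a review comment.
# Not gerrit's actual commentlink config; just the rules from the chat:
#   "oslo.rootwrap/moo" names the LP project explicitly
#   bare "foo-bar" defaults to the project matching the repo basename
#   with no repo context, fall back to the old projectgroup-wide search
def blueprint_url(reference, repo=None):
    if "/" in reference:
        project, spec = reference.split("/", 1)
    elif repo:
        project, spec = repo.rsplit("/", 1)[-1], reference
    else:
        return ("https://blueprints.launchpad.net/openstack/"
                "?searchtext=%s" % reference)
    return "https://blueprints.launchpad.net/%s/+spec/%s" % (project, spec)
```

So a bare `blueprint foo-bar` on an openstack/nova review would resolve under nova, while `blueprint oslo.rootwrap/moo` resolves the same way from any repo.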
12:29:52 so not sure you can feed it the repo name 12:30:00 probably, but we should be able to specify 2 patterns, I would think 12:30:15 if the bp name includes / do one thing, otherwise do what we're doing now 12:30:25 ah, true 12:30:28 oh, default to openstack namespace search 12:30:32 yeah 12:30:40 that would work, I guess 12:30:53 that way all of the old links work the same way 12:31:19 i should build an oslo library release script to handle LP duties automatically 12:31:41 that would be great to have 12:31:47 one question: in every oslo lib, would you track only released versions ? or alphas as milestones as well? 12:32:04 i.e. for oslo.rootwrap, would we have a1, a2 milestones ? 12:32:16 I think it would make sense for intra-cycle planning 12:32:25 But then I need to think twice about the meaning of fixReleased 12:32:52 We could track alphas, and then do consolidation like we do for the openstack integrated release 12:32:53 I created juno-3 and juno-rc1 milestones for all of them so far, but now that you mention it I think we agreed to just use "next" and rename them with a version number when we do the release 12:33:25 so.. finals only ? No way of knowing that a fix/feature is available in an alpha? 12:33:54 oh, no, I mean I would run something like "release_oslo_lib.sh oslo.rootwrap 1.0.0.0a3" and it would rename "next" to that version number 12:34:04 and fix up all of the committed bugs, etc. 12:34:25 ah I see 12:34:52 would you consolidate all alphas into a single final release page ? 12:35:10 we don't necessarily know in advance if the next release is an alpha or the final, since we might only cut one alpha during a cycle for example 12:35:15 i.e. when you do the final "release" you would move all alpha bugs and features to the final 12:35:24 oh, hmm 12:35:34 I don't know if we would want that or not 12:35:40 the way it's done in the integrated release is..
12:35:48 you have -rcX 12:35:58 then you add a 2014.2 milestone 12:36:12 move over all juno-X and juno-rcX bugs/features to it 12:36:17 and mark that released 12:36:25 ok, I guess that makes sense then 12:36:38 that way during dev you know when something is available 12:36:49 but after dev you shall forget the alphas 12:36:49 we still have release notes to see when a fix actually went in, but if consolidating at the end of the cycle is the pattern used for the apps that's good 12:37:07 yes, git history still tracks what landed when if you really need to know 12:37:35 OK, so I might need to do two scripts, one for the "next" and one for the "final" 12:37:56 yeah 12:38:00 is the final aligned on the last alpha ? i.e. the same commit ends up tagged twice ? 12:38:14 release-candidate style ? 12:38:18 I would expect that to be the case, yes 12:38:35 ok, so one alpha-release script and one "promote-alpha" one 12:38:42 we do have some libs with versions < 1.0 that aren't alphas per se, and I guess we would treat those the same way 12:39:05 we didn't use alpha tags for some technical reason having to do with the mirror syncing 12:39:06 yeah, it wouldn't be sensitive to the actual version number used anyway 12:39:11 yeah 12:39:29 OK, sounds like a plan 12:39:36 #link https://launchpad.net/oslo/+milestone/juno-3 12:39:51 now, if that script could send email to the -dev list, that would save me an extra step, too :-) 12:40:11 we can work on that :) 12:40:24 Now I suspect a few of those should now move to their specific projects 12:40:49 https://blueprints.launchpad.net/oslo-incubator/+spec/rootwrap-daemon-mode -> oslo.rootwrap 12:40:54 will do that asap 12:41:00 yeah, I asked the library owners to clean up the bugs already via the ML 12:41:14 I don't remember if I asked for the bps too 12:41:28 first time I see this cross-project-sharing-same-milestone view.
It's not bad 12:41:55 almost makes me want the libs to use the common milestones :) 12:41:56 hmm, the shared view is going to be harder if we use "next" instead of "juno-3" 12:42:01 heh 12:42:17 #link https://launchpad.net/oslo-incubator/+milestone/juno-3 12:42:24 Let's look at that one instead ^ 12:42:49 k 12:42:52 Semantic version support for pbr -> pbr probably 12:43:01 yes 12:43:19 My question is... what on this list is still oslo-incubator work that would be subject to FF 12:43:27 I put it in the incubator because of the specs2bp script, I think 12:43:35 the "graduate" stuff we already know is not affected 12:43:53 shall we move all the "graduate" ones to rc1 ? 12:44:08 yes, that's fair at this point 12:44:12 (if it makes it this week you can retroactively target it here) 12:44:17 ok, I'm on it 12:44:27 I think the concurrency and serialization ones are almost done, but we might as well move all of them 12:44:59 * dhellmann thinks about ff 12:45:26 * ttx creates "next" milestones for rootwrap and pbr so that we can move stuff there 12:45:52 all of this work is actually happening in a library outside of the incubator 12:47:31 actually calling the milestone next-juno 12:47:51 not juno-next like the numbered ones? 12:48:03 I guess it doesn't matter 12:48:11 no, there already is a next-juno for.. swift iirc 12:48:24 OK, reload https://launchpad.net/oslo-incubator/+milestone/juno-3 12:48:38 Is that all oslo-incubator work ? 12:48:47 the mysql one goes in oslo.db 12:48:55 Adoption of pylock file could move to RC1 too 12:49:00 the 2 owned by josh harlow go in taskflow 12:49:12 yes, we can move the adoption one to rc1 12:49:31 the policy configuration directories is an incubator change 12:49:36 not sure I can update the taskflow stuff 12:49:44 also weird series there 12:50:05 oh, I can 12:50:06 yeah, josh has been using semver numbering.
I need to get him to give me access there so I can update it 12:50:31 Looks like we get power from the projectgroup 12:50:42 ah, ok 12:51:57 OK: https://launchpad.net/oslo-incubator/+milestone/juno-3 12:52:11 and https://launchpad.net/oslo/+milestone/next-juno 12:52:38 oh, the mysql one 12:53:15 sec, brb 12:54:17 sorry, hungry cats don't understand "you need to wait a minute" 12:54:25 That leaves policy-configuration-directories 12:54:38 the policy stuff is in the incubator, so that's the right place for that one 12:55:00 should we move any of the closed items, or is that just going to end up being confusing? 12:55:04 Shall it get bumped to kilo if it fails to get reviewed/approved today ? 12:55:26 * dhellmann looks at that changeset 12:55:28 We should move the closed items yes 12:55:41 use-events-for-error-wrapping should be in oslo.db? 12:56:03 yes 12:56:07 ok, moving 12:56:41 Now, that's simpler 12:57:26 I moved the config bug 12:58:03 and the db bug 12:58:16 ok, good cleanup 12:58:31 one question this creates is... 12:58:53 shouldn't the libs also be feature frozen and get their final alphas soon ? 12:59:07 yes, probably so 12:59:26 anything that's already been released once this cycle, at least 12:59:41 Maybe we can address that next week 12:59:54 I have a couple of releases to do, I was going to put those off to monday to avoid issues this week 12:59:54 but you should issue a fair warning to lib people 13:00:00 yeah, I'll do that this morning 13:00:17 we can have feature-"final" alphas 13:00:25 we shall have* 13:00:39 even if we can still work on bugfixes in the coming week(s) 13:00:43 right 13:00:54 we should also probably target a date for final releases 13:01:02 ideally before RC1s 13:01:25 I was thinking Sept 18 13:01:27 so that we can have the requirements all set 13:01:36 Sept 18 would work 13:02:05 oslo.rootwrap is all set 13:02:16 (final feature alpha already out) 13:02:31 great!
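The two scripts discussed in this exchange could look roughly like the following: an alpha-release step that renames the "next" milestone to the version actually tagged, and a cycle-end step that consolidates the alpha milestones into the final release milestone, mirroring what the integrated release does. The dicts below are stand-ins for launchpadlib objects and the function names are hypothetical, so this is only a sketch of the bookkeeping, not a working release tool.

```python
# Stand-in data model: milestone name -> list of bug/blueprint titles
# (a real script would drive these changes through launchpadlib).

def release_next(milestones, version):
    # alpha-release step: "next" becomes the released version, e.g.
    # release_next(ms, "1.0.0.0a3") for the oslo.rootwrap example above
    milestones[version] = milestones.pop("next", [])
    return milestones

def consolidate(milestones, final, prefix="juno-"):
    # cycle-end step: fold all juno-X / juno-rcX items into the final
    # release milestone, so after dev the alphas can be forgotten
    for name in [m for m in milestones if m.startswith(prefix)]:
        milestones.setdefault(final, []).extend(milestones.pop(name))
    return milestones
```

Under this split, fixes stay visible per-alpha during the cycle, and the final release page ends up holding the consolidated list, as described for the 2014.2 example.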
13:02:44 dhellmann: ok, I think that's all I wanted to discuss 13:02:48 more later! 13:03:02 ok, tty this afternoon 13:15:20 ttx, still around? 13:16:22 dhellmann: yes 13:16:38 ttx: I forgot to ask about cutting stable/juno for the incubator early 13:16:55 when do you want to cut it ? 13:17:20 well, I had wanted to do it today but bnemec's comments on the ML thread made me reconsider -- did you see his reply? 13:18:03 yeah 13:18:16 I figured this could wait for a bit more discussion 13:18:23 ok 13:24:53 ttx: I'm having second thoughts about consolidating the alpha milestones at the end of a cycle. If someone wants to see what was released in a version, can't they just look at the page for the whole series? 13:28:02 dhellmann: It's tricky. there is no single page for bugs in series for example 13:28:30 I mean bugs fixed in series 13:28:55 Only milestone pages show the list of bugfixes and implemented features 14:02:24 jgriffith: i'm available if you are 14:02:57 ttx: i also wouldn't mind going early; need to be on the road soonish 14:04:15 dolphm: deal 14:04:19 #topic Keystone 14:04:36 #link https://launchpad.net/keystone/+milestone/juno-3 14:04:39 2 left 14:04:51 https://blueprints.launchpad.net/keystone/+spec/keystone-to-keystone-federation 14:05:10 that's going to be our next 48 hours ^ 14:05:24 overview of pending changes: https://gist.github.com/dolph/651c6a1748f69637abd0 14:06:01 we landed the biggest patch for k2k last night, which was blocking the rest... and the rest are much simpler 14:06:20 dolphm: ok. The trick is you need to allow time for gate processing 14:06:24 yeah :( 14:06:29 so ideally that would be all approved today :) 14:06:34 that is my goal! 14:06:40 and i think it's achievable 14:06:56 the last patch for v3 api validation is basically a nice-to-have add-on. the rest of that bp is done 14:07:15 i suspect we would ask for FFE on the k2k stuff if it misses 14:07:19 What about the other ?
14:07:27 for k2k, i would 14:07:51 for v3 api validation, i'd just mark it as Implemented and file a wishlist bug for the remaining patch and move it to Kilo 14:07:56 ok 14:08:19 #info keystone-to-keystone-federation still pending some reviews, would require FFE if it misses 14:08:43 #info api-validation still pending some reviews, would defer to kilo the remaining items if it misses 14:09:10 I'll just defer all bugs that don't make it to RC1, too 14:09:24 I don't see any milestone-critical stuff there 14:10:13 i also haven't triaged bugs in a week, so there might be some surprises in there. i'll review the new stuff as soon as we're gating the last change for k2k 14:10:38 OK, I think that is all. You can run! 14:10:44 thank you :) 14:10:48 ttyl 14:21:42 mestery: you can go early if you want 14:21:50 ttx: o/ 14:21:56 #topic Neutron 14:22:03 You caught me just after I got back from walking kids to school (first day here today) 14:22:12 first day today too 14:22:23 Hey, congrats on the new role ttx! Well deserved. :) 14:22:31 mestery: thanks! 14:22:41 #link https://launchpad.net/neutron/+milestone/juno-3 14:23:02 19 implemented, +9 since last week 14:23:16 Yes, we should land a few more today yet, some are close 14:23:38 So the trick is, we need to allow for gate time 14:23:40 And the 3 big ones we're tracking (L3 HA, ipset, and security group refactor) are fairly close as well, looking like ipset may need an FFE, but the other 2 look good 14:24:01 ack on gate time 14:24:11 so ideally we'd get them approved today and wait until they navigate the gate tomorrow, in time for tagging Thursday 14:24:21 Yes 14:24:33 A bit concerned with all those "High" still up 14:24:43 since it sounds like a recipe for a lot of exceptions 14:24:56 The majority of those are all going to the incubator 14:25:05 So I've left them in Juno-3 for now but we will move them once the incubator is up 14:25:07 ah, hm.
Group Policy and LBaaS stuff 14:25:22 how could we... get them off the list somehow 14:25:35 Create a new milestone for the incubator perhaps? 14:25:36 creating a launchpad project is a bit early 14:25:38 Would that make sense? 14:25:42 yeah, i was thinking that would work 14:25:57 I can create an "incubator" target 14:26:02 Cool 14:26:05 Once you do, I'll move stuff out of J3 14:26:07 and then we can move all the work that will escape FF there 14:26:13 ok, let's do that now 14:26:19 Awesome! 14:26:30 * mestery waits for it so he can migrate stuff in real time 14:26:55 https://launchpad.net/neutron/+milestone/incubator 14:27:18 OK, I'll move things after we're done talking so I can focus here ;) 14:27:19 thanks! 14:27:36 hmm, actually it's hard to discuss unless I see what's left after the cleanup 14:27:42 OK doing it now ;) 14:27:49 awesome :) 14:32:12 Sorry, connectivity issue there for a moment, but I am back now. 14:32:14 So, the list should be clean now of LBaaS and GBP BPs 14:32:47 The two testing BPs (https://blueprints.launchpad.net/neutron/+spec/retargetable-functional-testing and https://blueprints.launchpad.net/neutron/+spec/remove-unit-test-autodeletion) won't make Thursday and will need FFEs if we want them 14:32:57 The owners already pinged me on those, I'll move them out for now as well. 14:34:31 ttx: How does it look now? 14:34:33 are they testing-only ? 14:34:53 i.e. they only touch tests/ code ? 14:35:12 Yes, but the owner didn't want to use scarce infra resources this week, so an FFE would make a lot of sense 14:35:16 in which case they don't need FFE 14:35:23 Ah, ok, cool! Thanks! 14:35:24 It's fine to merge extra tests or doc 14:35:27 until RC1 14:35:36 so you can move them to RC1 already 14:35:40 OK, thanks!
14:35:43 you should move them to RC1 actually 14:35:51 that will make the list more limited 14:36:06 Yes 14:36:23 It looks better now, I can likely clear a few more ones out today after talking to owners 14:36:32 Is there anything all-approved and in-flight already ? Or are those all waiting on extra reviews ? 14:36:55 Most of these have multiple patches, some approved and some in flight, and some waiting on approval 14:37:00 So, all over the board I guess :( 14:37:55 what about neutron-dvr-fwaas and ml2-hierarchical-port-binding ? 14:38:04 The other two "High" you said were in good shape 14:38:25 dvr-fwaas is ready for review, we'll need an FFE for it to make FWaaS work with DVR 14:38:42 ml2-hierarchical-port-binding is also out, but I don't think that one qualifies for an FFE if it won't make it 14:39:00 ml2-hierarchical can likely wait for kilo if it won't make it, I'll talk to rkukura to verify that 14:39:12 Should ml2-hierarchical-port-binding be medium and security-group-rules-for-devices-rpc-call-refactor be high ? 14:39:24 Try to keep the ones you would ask FFE for as "High" 14:39:28 I think so, yes. 14:39:31 * mestery updates that now 14:39:54 Done 14:40:58 so.. ideally if it's not fully approved and in-flight tomorrow morning, we should defer it 14:41:07 Ack 14:41:07 at least for the medium/low ones 14:41:11 Agreed 14:41:42 OK, we'll talk again at the meeting and tomorrow 14:42:04 #info 33 blueprints still under review 14:42:06 Thanks! 14:42:17 #info Only the 4 Highs are likely to trigger FFEs 14:42:41 #info planning to defer what is not fully approved and in-flight tomorrow morning 14:42:48 mestery: thx! 14:42:56 ttx: Thank you too! Talk to you tomorrow.
14:42:58 david-lyle: ready when you are 14:43:02 ttx: ready 14:43:07 #topic Horizon 14:43:42 #info 20 implemented, +11 since last week 14:43:59 #info 18 blueprints left open 14:44:23 david-lyle: so the general idea would be to defer what is unlikely to be fully approved today 14:44:37 david-lyle: and tomorrow we defer anything that is not fully approved and in-flight in the gate 14:44:51 so that we have time for retries at the gate before the tag on Thursday 14:45:22 ttx: sounds good, we have a couple of potential FFEs that may not make Thurs 14:45:23 So that means working on final approvals today 14:45:28 sure 14:45:37 david-lyle: which ones for potential FFEs ? 14:45:42 the FFEs are mostly delayed due to other service dependencies 14:45:52 IPv6 support 14:45:52 that's a good reason for them 14:46:18 maybe we can mark them "High" priority to show that we'll likely FFE them 14:46:22 and there are a few around the new glance meta-data functionality 14:47:01 #info Likely FFEs for IPv6 support and a few around the new glance meta-data functionality 14:47:07 I'll bump them then 14:47:36 Are you OK in general for deferring stuff that won't be fully-approved and in gate hell tomorrow ? 14:48:06 yeah, I'll have a better idea after the Horizon meeting in an hour 14:48:11 ok 14:48:16 then I'll start deferring bps 14:48:36 OK, we'll be in touch again at the meeting, and tomorrow 14:48:38 I think 3-4 should still land without FFE 14:49:46 sounds good 14:49:54 david-lyle: ok, thx! 14:50:09 ttx: thanks 14:50:28 jgriffith: you can fit in 10 minutes now if you hurry up 15:42:16 stevebaker: ready when you are 16:04:01 SlickNik: ready when you are 16:04:29 ttx: here now 16:04:43 #topic Trove 16:04:49 #link https://launchpad.net/trove/+milestone/juno-3 16:04:59 Still 4 to go 16:05:09 Are any of those already approved and in-flight ? 16:05:29 2 of them have 1 +2, and are looking for another.
16:05:59 Another one is really close to done, and I think I'm going to have to defer one of them. 16:06:19 So, tomorrow we'll just defer anything that is not in-flight 16:06:25 that leaves today for the last push 16:07:07 ttx: sounds good — I've already communicated that to the rest of the team. 16:07:31 I'll make sure to keep the status updated as things are on the move. 16:07:57 Anything you'd request FFE for if it can't get the required approvals in time ? 16:08:04 We have one FFE that will probably come in after. 16:08:19 Move to oslo.messaging. 16:08:39 hmm, ok. we'll discuss that then :) 16:08:51 that's all I had. Will be talking to you again tomorrow 16:09:35 Okay sounds good. Will keep working on getting the BPs reviewed and closed out. 16:09:37 Thanks! 16:19:08 OK, will be back at 18:30 UTC 16:19:15 stevebaker: maybe we can talk then 17:52:39 ttx: ping 18:36:13 markwash__: pong? 18:36:24 pong, indeed 18:36:45 markwash__: ready for a glance status update? 18:36:53 ttx as ready as I'll ever be :-) 18:36:59 #topic Glance 18:37:08 #link https://launchpad.net/glance/+milestone/juno-3 18:37:12 Is that current? 18:38:13 ttx I think there was some movement this past weekend I'm not certain about 18:38:22 I know the metadata definitions catalog has moved along nicely 18:38:28 it is probably "Implemented" at this point 18:38:49 async processing and "restrict users" are accurate tmk 18:39:25 markwash__: https://review.openstack.org/#/c/111483/ may still be needed for metadata def catalog 18:40:07 ah yes 18:40:14 but the prereqs have merged 18:40:36 markwash__: today is like the last day to approve features, starting tomorrow we'll cut what's not in flight and it will require FFE if you want it in Juno 18:40:43 yeah 18:41:04 so it might be worth a few extra reviews to get them past the last hurdle 18:41:14 Anything you're likely to ask an FFE for ?
18:41:35 Actually, depending on the status, I will ask for an FFE for all of those save GPFS 18:42:06 OK, i'll bump restrict to Medium then 18:42:35 yeah that issue was in good shape except we hit some snags pulling in changes from oslo-incubator policy 18:43:15 had to cherry-pick oslo-incubator (effectively) to bring things to a workable state for juno 18:43:20 We'll do another status update tomorrow, and defer what won't land in time then 18:43:31 ttx: what time do you want? 18:43:36 and we'll discuss FFEs after J3 on Friday 18:43:44 markwash__: same as today? 18:43:49 should work 18:43:51 thanks 18:43:55 I'll try to be around 18:44:19 #info FFEs likely for everything but GPFS 18:44:37 markwash__: thanks for coming! 18:44:53 ttx: sorry I was late ... I got confused by the long weekend 18:46:06 that happens :) 19:18:28 stevebaker: around? 19:19:50 bah, I guess we'll sync during the release meeting then 19:20:00 #endmeeting