19:59:38 <xgerman> #startmeeting Octavia
19:59:38 <openstack> Meeting started Wed Dec  9 19:59:38 2015 UTC and is due to finish in 60 minutes.  The chair is xgerman. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:59:39 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
19:59:42 <openstack> The meeting name has been set to 'octavia'
19:59:45 <johnsom> o/
19:59:48 <sbalukoff> Howdy, folks!
19:59:49 <blallau> o/
19:59:50 <barclaac> o/
19:59:56 <sbalukoff> (Yay for not being sick this week.)
20:00:21 <dougwig> o/
20:00:26 <xgerman> #topic Coronation
20:00:35 <xgerman> #link http://civs.cs.cornell.edu/cgi-bin/results.pl?id=E_cf118bc0186899ff
20:00:38 <sbalukoff> Congrats, Michael!
20:00:51 <xgerman> Congrats!!
20:00:56 <johnsom> Oh brother, there is a tiara?
20:00:58 * xgerman bows to new King of Octavia
20:01:06 <barclaac> El Presidente
20:01:07 <johnsom> Thanks!
20:01:09 <xgerman> #chair blogan johnsom
20:01:10 <openstack> Current chairs: blogan johnsom xgerman
20:01:10 <sbalukoff> I look forward to being a thorn in your side (even moreso) going forward! :D
20:01:20 <johnsom> Hahaha
20:01:31 <xgerman> and I will let johnsom take over the meeting… my work is done :-)
20:01:40 <johnsom> What is it, every rose has its thorn?
20:01:41 <blallau> lol
20:02:02 <johnsom> Ok, so thrown into the deep end.
20:02:03 <sbalukoff> That's very sweet of you!
20:02:16 <dougwig> can we call you donald trump now?
20:02:17 <johnsom> #topic Mid-cycle in San Antonio
20:02:19 <bana_k> Hi
20:02:27 <johnsom> #link https://etherpad.openstack.org/p/lbaas-mitaka-midcycle
20:02:36 <johnsom> dougwig No
20:02:36 <sbalukoff> Hi bana_k
20:03:04 <eranra> hi
20:03:09 <johnsom> It looks like we are going to have a good showing for the mid-cycle.  Please update with your attendance and topics
20:03:09 <ajmiller_> Hi
20:03:16 <sbalukoff> Yay! eranra! Glad to see you here!
20:03:20 <bana_k> Hi sbalukoff
20:04:01 <johnsom> Any discussion/questions about the meet up?
20:04:03 <sbalukoff> Yep. Any recommendations on hotel? (I'mma be booking my travel stuff early next week.)
20:04:21 <ptoohill> o/
20:04:24 <sbalukoff> (I see the list on the etherpad...)
20:04:26 <xgerman> there is a hotel section but overall I advise against downtown hotels...
20:04:32 <sbalukoff> (I'm just asking, are people picking a specific one there?)
20:04:35 <xgerman> need some more airport choices...
20:04:38 <johnsom> There are a few on the etherpad.  The boardwalk hotels are nice I hear, but it is a drive.
20:05:03 <xgerman> boardwalk? riverwalk?
20:05:03 <johnsom> We stayed at the Hilton garden inn by the airport last time and it was fine.
20:05:07 <sbalukoff> Does it make sense to get a hotel near the Castle?
20:05:15 <sbalukoff> Or is that still a scary neighborhood?
20:05:22 <xgerman> are there any hotels?
20:05:30 <xgerman> blogan? rm_work?
20:05:33 <rm_work> o/ sorry late
20:05:49 <sbalukoff> rm_work: You shall be flogged forthwith.
20:05:58 <johnsom> We were just chatting about hotels near RAX
20:06:12 <johnsom> Any comments on those listed on the etherpad?
20:06:19 <ptoohill> Not a lot of 'great' choices near castle, but it's not a horrible choice if you find something reasonable that you'd prefer. I could look some up and link them
20:06:38 <sbalukoff> ptoohill: That would be really helpful, eh!
20:06:41 <ptoohill> Save on driving time/potential traffic
20:06:56 <rm_work> yeah the stuff right near us is more like... motels
20:06:57 <rm_work> lol
20:06:57 <xgerman> sbalukoff +1
20:06:59 <ptoohill> will do, I'm in two meetings ATM.
20:07:03 <johnsom> I hope that freeway is still not open, that was a nice little expressway
20:07:04 <ptoohill> yea :(
20:07:15 <rm_work> there is a fourpoints right next to the airport that i've stayed in before i think, it was fine
20:07:16 <eranra> I'd like to add active-active (N+1) discussions to the list of topics for the mid-cycle
20:07:45 <ptoohill> I'll add choices to etherpad, cause choices!
20:07:45 <johnsom> Excellent, active/active would be a great topic
20:07:46 <sbalukoff> eranra: Feel free to just edit the etherpad to do that, eh.
20:08:10 <dougwig> eranra: already on the list
20:08:20 <johnsom> #topic GSLB/Kosmos mid-cycle
20:08:21 <sbalukoff> (It's actually there, but put your name next to it if you can come to the mid-cycle.)
20:08:28 <johnsom> #link https://wiki.openstack.org/wiki/Octavia/Weekly_Meeting_Agenda#Agenda
20:08:28 <dougwig> eranra: oh, n+1. sure, tack it on the same line.
20:08:31 <bana_k> #link https://etherpad.openstack.org/p/lbaas-mitaka-midcycle
20:08:54 <rm_work> lol the drury inn right by the home depot
20:09:19 <johnsom> Kosmos is meeting the week after in Seattle.  Again, please update the etherpad if you are coming or have topics.
20:09:39 <johnsom> This will probably be a kick-off/let's get some code going meeting
20:09:43 <sbalukoff> Geez, now I'm actually tempted...
20:09:56 <sbalukoff> (I'm so sorry for y'all.)
20:10:01 <johnsom> sbalukoff it's a short walk....
20:10:08 <sbalukoff> Yep. 6 blocks.
20:10:09 <johnsom> assuming you are at the office.
20:10:21 <sbalukoff> Blue Box hasn't moved (yet).
20:10:32 <blogan> i'm here now!
20:10:47 <johnsom> #topic Progress reports
20:10:48 <sbalukoff> blogan: You shall be flogged shortly after we get done flogging rm_work.
20:11:18 <johnsom> Mine is Active/Standby merged.  Happy, happy.
20:11:24 <dougwig> is the octavia launchpad up-to-date with mitaka bugs/blueprints?
20:11:27 <sbalukoff> Ok! so! The update to the active-active spec has been written, I'm going to be working with other IBM folks to get my CR for that updated.
20:11:34 <sbalukoff> Hopefully y'all will like this one better.
20:11:47 <johnsom> I am working on updating the failover flow to support active/standby.  I have working code up, just need to finish testing and unit test updates.
20:11:52 <sbalukoff> (I want to make sure Dean on the Haifa team knows how to update that stuff in gerrit.)
20:12:06 <rm_work> Still working on the last few bits of getting the TLS workflow complete
20:12:16 <minwang2> we still need some reviews for the TLS backport patches: #link https://review.openstack.org/#/c/250068/ #link https://review.openstack.org/#/c/250070/ #link https://review.openstack.org/#/c/250080/
20:12:30 <rm_work> I think we just need the right people to look at them (not our team?)
20:12:36 <rm_work> since none of us can +2
20:12:42 <minwang2> rm_work you are correct
20:12:44 <sbalukoff> Also, my work on the shared pools for Octavia (requirement for L7) is humming along nicely. I expect to have a CR tomorrow or Friday, and good test coverage added to the same on Friday or early next week.
20:12:51 <johnsom> Yeah, we need to find the list of stable approvers and poke them.
20:12:52 <minwang2> but i don't know whom to contact
20:12:56 <rm_work> hmmm
20:13:10 <rm_work> I thought we already sent out a line to see why they weren't getting reviewed
20:13:21 <rm_work> mestery around?
20:13:29 <xgerman> mestery already +2d
20:13:32 <rm_work> or maybe armax knows
20:13:39 <rm_work> ah yeah k
20:13:58 <xgerman> yeah, the new PTL needs to poke in the Neutron channel
20:14:01 <armax> rm_work: fill me in?
20:14:05 <dougwig> my guess is that the stable team is just super busy.
20:14:18 <rm_work> armax: backport patches, not sure who else we need to look at them, have been sitting for a bit
20:14:26 <rm_work> armax: min's set of three review links above
20:14:30 <johnsom> armax we have some stable/liberty patches that need to get +A'd
20:14:34 * armax looks
20:15:07 <dougwig> armax: can we consider some lbaas-only members of the stable team, similar to subcores?
20:15:07 <johnsom> I know some gate issues have slowed this down
20:15:10 <blogan> alright caught up now
20:15:34 <xgerman> dougwig +!
20:15:38 <xgerman> +1
20:15:45 <armax> dougwig: you know my answer to this question, don’t you?
20:16:12 <dougwig> armax: yes, but i'm not sure waiting until the stadium stuff is sorted out is feasible.
20:16:12 <johnsom> Great, thanks armax, who should we nominate?  grin
20:16:32 <johnsom> Oh, not what he was thinking....
20:17:46 <johnsom> #action johnsom to track down and request reviews on stable/liberty TLS patches.
20:17:54 <armax> dougwig: I need to refresh the stable team anyway, bear with me
20:17:58 <armax> dougwig: I don’t have the rights
20:18:23 <johnsom> Any other progress reports?
20:18:30 <dougwig> armax: ty.
20:18:39 <armax> dougwig: no problemo
20:19:08 <johnsom> #topic Holiday meeting schedule
20:19:28 <rm_work> I'm basically off until January after this friday :P
20:19:29 <minwang2> i will be out from Dec13-January 1
20:19:34 <rm_work> lol
20:19:44 * rm_work high fives minwang2
20:19:48 <johnsom> Many of us are taking time off, so I was thinking have next weeks meeting and then pick up January 6th?
20:19:51 <minwang2> oh yeah~
20:19:56 <sbalukoff> I'm gone from the 17th through the end of the year.
20:20:00 <dougwig> johnsom: +1
20:20:09 <sbalukoff> johnsom: +1
20:20:15 <minwang2> +1
20:20:15 <blallau> +1
20:20:20 <xgerman> +1
20:20:22 <crc32> +0
20:20:27 <sbalukoff> And expect next week's meeting to be less attended.
20:20:33 <rm_work> I was thinking I might just work most of next week even though I'm ETO... so prolly will show up for that :P
20:20:37 <sbalukoff> So! If you want to say something important, this is the meeting in which to do so!
20:20:46 <johnsom> Yeah, I suspect it will be a short one.
20:20:55 <sbalukoff> rm_work: You are a glutton for punishment.
20:20:57 <crc32> rm_work: PTO
20:21:04 <rm_work> ah right, PTO pool
20:21:21 <rm_work> literally required to take all of it before dec 31 or I lose it >_>
20:21:38 <xgerman> well, I am off as well but might attend anyway...
20:21:44 <johnsom> #agreed Meeting will happen on Dec. 16th, then resume Jan. 6th
20:21:44 <sbalukoff> rm_work: Road trip time?
20:21:54 <rm_work> sbalukoff: flying to Bellingham tonight :P
20:21:59 <rm_work> in ... a couple hours actually
20:22:08 <rm_work> SAT->SEA->BLI
20:22:13 <sbalukoff> Haha! Then snowboarding time! the mountains have been getting a lot of snow.
20:22:21 <rm_work> yeah looking at Baker eagerly
20:22:28 <minwang2> nice,enjoy the trip rm_work
20:22:31 <sbalukoff> Have fun!
20:22:32 <rm_work> prolly this weekend
20:22:56 <dougwig> i'd like to say thank you to both johnsom and sbalukoff for being willing to be our cat herder.
20:23:07 <sbalukoff> Thanks dougwig!
20:23:15 <johnsom> #topic OpenStack Tokyo Summit presentation feedback
20:23:21 <johnsom> Thanks dougwig
20:23:24 <blogan> what's snow?
20:23:26 <sbalukoff> I would like to thank johnsom for being masochistic enough to want the job. :)
20:23:31 <crc32> did any USB drives make it back. Just wondering.
20:23:41 <blogan> indeed!
20:23:44 <rm_work> crc32: yeah i have like 10
20:23:46 <rm_work> you want em?
20:23:50 <crc32> yes.
20:23:51 <dougwig> crc32: i stole one.
20:23:56 <johnsom> Ok, so while updating our slides on the new OpenStack page (vs. sched) I asked about the feedback.
20:23:56 <crc32> how big were they again?
20:23:59 <sbalukoff> I totally stole one as well.
20:24:03 <rm_work> I'll bring them to the office in January
20:24:04 <crc32> thats cool 20 others are missing
20:24:19 <johnsom> crc32 I think I have one too.  If you want it let me know I will mail it.
20:24:21 <rm_work> yeah we told people to keep them for the most part, a surprising number gave them back
20:24:31 <crc32> bad sign?
20:24:31 <rm_work> they were cheap and you expensed them right?
20:24:42 <sbalukoff> johnsom: That's dangerious, asking the proletariat what they think...
20:24:43 <crc32> no. I don't like dealing with people.
20:24:44 <dougwig> i think i have a big box of a10 logo'ed ones.  only 4gb, though. so if you just need to cover yourself in thumb drives or something...
20:24:46 <rm_work> lol
20:24:58 <johnsom> Our LBaaS talk got one review.  Very positive: The SPEAKER(S) were knowledgeable, well organized, and effectively presented the content in an engaging manner.
20:24:59 <rm_work> crc32: you do it via the webapp :P
20:25:05 <rm_work> lol woo
20:25:08 <sbalukoff> Oh, that's great!
20:25:20 <blogan> johnsom: the lab?
20:25:29 <crc32> rm_work: Then they start asking questions, then I panic and say something dumb.
20:25:34 <johnsom> I'm suspecting that was from one of our teams...  grin
20:25:40 <johnsom> The lab, well....
20:25:41 <sbalukoff> HAHA!
20:25:47 <johnsom> Two responses:
20:26:02 <johnsom> The hardware requirements for the lab were borderline unrealistic (16+ GB of physical memory)
20:26:05 <johnsom> and
20:26:31 <rm_work> looool
20:26:36 <johnsom> worst was octavia, image on USB but guide of what to do only in the slides on the screen, spread across multiple slides. most people could not follow as everyone needed another step on another slide. totally chaotic
20:26:55 <rm_work> heh yeah really wish we had the slides uploaded beforehand
20:26:56 <rm_work> now I know
20:27:04 <rm_work> s/I/we/
20:27:07 <johnsom> So, the rating was a bit low for the lab.
20:27:15 <sbalukoff> Well, lots of room for improvement next time, I guess. :P
20:27:16 <xgerman> boo
20:27:17 <johnsom> Only one screen with the slides and no demo screen was hard.
20:27:17 <crc32> pfft
20:27:39 <dougwig> you had lots of people show up and leave with octavia bits in their pockets. that's positive.
20:27:47 <johnsom> Yeah, it was our first time with the lab.  Mostly the second person was telling them they should have fixed hardware
20:27:50 <crc32> yea the slides were hard to read. I have 2050 vision.
20:27:56 <crc32> 20/50
20:28:08 <sbalukoff> Well...  we all know the circumstances there: We pulled the lab together last minute because we were working on getting code done until the last possible minute. :)
20:28:11 <xgerman> well, we need print outs next time...
20:28:17 <johnsom> Plus we thought we only had 14 people signed up, but many more showed up
20:28:23 <rm_work> heh yeah
20:28:26 <sbalukoff> Yep.
20:28:35 <sbalukoff> That's...  a little annoying.
20:28:45 <crc32> next time we should turn people away at the door. :)
20:28:51 <xgerman> johnsom don’t blame the customer
20:28:53 <johnsom> Anyway, that is the feedback for us.  I wanted to share.
20:29:01 <sbalukoff> xgerman: Unless the customer is a moron. ;)
20:29:11 <sbalukoff> Thanks for sharing!
20:29:14 * johnsom Looks at xgerman
20:29:23 <sbalukoff> xgerman: I'm kidding (mostly)
20:29:23 <johnsom> That wasn't my point, but anyway.
20:29:27 <xgerman> well, I take responsibility for the lab. I resigned...
20:29:36 <johnsom> Hahaha
20:29:44 <blogan> seppuku?
20:29:54 <johnsom> I guess we took effective action for our stakeholders...  grin
20:29:55 <sbalukoff> All things considered, I think the lab went OK:  After all...  the examples there worked!
20:29:59 <xgerman> blogan don’t push it
20:30:01 <sbalukoff> That's actually pretty huge. XD
20:30:21 <johnsom> Yeah, there was one guy that had not used devstack and got Octavia running, so to me that is a win
20:30:32 <dougwig> johnsom: hahahaha, that's huge.
20:30:35 <sbalukoff> There was more than one.
20:30:49 <johnsom> He was staying for the devstack hands-on after ours....
20:30:59 <sbalukoff> Hah!
20:31:08 <johnsom> #topic Open discussion
20:31:26 <xgerman> When are we cutting our M1 release?
20:31:31 <johnsom> Other topics from folks?
20:31:38 <sbalukoff> After the mid-cycle, hopefully?
20:31:39 <johnsom> That is a good question.
20:31:52 <xgerman> after the midcycle is M2 BTW
20:32:01 <johnsom> Well, M1 closed last week I think.  So after the midcycle is M2 time
20:32:05 <xgerman> sbalukoff you say we should skip M1?
20:32:08 * sbalukoff sighs
20:32:26 <sbalukoff> Well, M1 would include what we've done for active-standby, right?
20:32:35 <xgerman> I hope so...
20:32:35 <johnsom> #link http://docs.openstack.org/releases/schedules/mitaka.html
20:32:39 <sbalukoff> (Which has some fixes in the works... but are we happy with that in there?)
20:33:02 <johnsom> The missing part is fail over flow IMHO
20:33:13 <johnsom> Which is WIP
20:33:21 <sbalukoff> So, we should just plan on cutting M1 anytime then.
20:33:30 <xgerman> or skip it
20:33:35 <sbalukoff> And get failover flow in M2 (and hopefully shared pools and L7. *grin*)
20:33:44 <sbalukoff> What are the consequences if we skip it?
20:33:49 <xgerman> dougwig?
20:34:08 <xgerman> I doubt there are any. I don’t think we are integrated enough...
20:34:20 <johnsom> None really, we are independent release cycles.
20:34:29 <sbalukoff> Ok! Well, if there are no consequences, then let's save ourselves the trouble.
20:34:53 <rm_work> sometimes these releases seem so close together...
20:35:04 * xgerman thinks all the time
20:35:07 <johnsom> Yes they do
20:35:20 <sbalukoff> yeah.
20:35:24 <sbalukoff> It's pretty crazy.
20:35:40 <johnsom> Ok, so anyone think we need to cut a release now?
20:35:41 <dougwig> sorry, sec.
20:35:44 <dougwig> was reviewing.
20:35:53 <xgerman> vote?
20:35:56 <rm_work> meh
20:36:05 <johnsom> dougwig loves votes
20:36:12 <dougwig> johnsom: we are not independent anymore.
20:36:21 <dougwig> are we? if so, that needs updating, since we're the ref.
20:36:22 <johnsom> When did that change????
20:36:33 <dougwig> oh wait, nvm.
20:36:38 <dougwig> ignore me.
20:36:43 <johnsom> We put it up for Liberty, but they down voted it
20:37:10 <dougwig> if octavia is really a separate appliance type thing, it can release on its own timeline, as long as it works and is backwards compatible.
20:37:28 <sbalukoff> Good enough for me! Let's skip the M1 then.
20:37:49 <johnsom> Yeah, it's really about making sure we stay in sync with neutron-lbaas changes enough.
20:38:00 <xgerman> +1
20:38:38 <dougwig> at this point, we shouldn't need to be in sync. we should be managing those interfaces to not break each other, right?
20:38:52 <johnsom> Ok, so I'm not hearing push back on skipping M1, so let's do a release later.
20:38:57 <johnsom> dougwig +1
20:39:16 <sbalukoff> dougwig: +1
20:39:52 <sbalukoff> AFAIK, shared pools + L7 is going to be the next significant change to the Neutron LBaaS <-> Octavia interface.
20:40:06 <sbalukoff> Since Neutron LBaaS isn't aware of any of the active-standby stuff (nor should it be)
20:40:25 <johnsom> Another topic, I hope to startup a bug section in these meetings.  We have a bunch of bugs that I would hope we can start working.
20:40:27 <sbalukoff> So, doing a release after that stuff is ready is a good idea.
20:40:40 <sbalukoff> johnsom: That's a very good idea!
20:40:48 <sbalukoff> After all, we're trying to become more stable, right?
20:41:12 <sbalukoff> And we have people interested in these projects looking to cut their teeth on something. ;)
20:41:21 <johnsom> Yep.  Most of the stuff is minor annoyances, but it would be good to have some visibility
20:41:31 <sbalukoff> Yeah.
20:41:35 <xgerman> yep
20:41:49 <johnsom> Ok, other topics?
20:42:08 <sbalukoff> bana_k: Did you want to mention your outstanding heat stuff?
20:42:44 <johnsom> Ah yes, we tagged ajmiller_ to take a look at that
20:42:47 <bana_k> yea the stuff is looking good, I think we need more review from octavia
20:42:51 <bana_k> #link https://review.openstack.org/#/q/status:open+project:openstack/heat+branch:master+topic:bp/lbaasv2-suport,n,z
20:43:34 <rm_work> bana_k: I am *mostly* through my review queue now, I think, so poke at me if you need a review from an octavia core
20:43:34 <xgerman> bana_k I also tapped the HP Heat team to have a look
20:43:35 <sbalukoff> Schweet!
20:43:36 <ajmiller_> aye
20:43:39 <johnsom> #action Please review the heat work for LBaaSv2
20:43:42 <rm_work> though I don't know much about HEAT
20:43:55 <bana_k> awesome thanks :)
20:45:05 <johnsom> Ok, we will have another meeting next week and then pick up on the 6th.
20:45:22 <johnsom> Going once...
20:45:29 <johnsom> Going twice...
20:45:33 <johnsom> #endmeeting