20:00:06 <johnsom> #startmeeting Octavia
20:00:07 <openstack> Meeting started Wed Jan 16 20:00:06 2019 UTC and is due to finish in 60 minutes.  The chair is johnsom. Information about MeetBot at http://wiki.debian.org/MeetBot.
20:00:08 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
20:00:08 <xgerman> o/
20:00:11 <openstack> The meeting name has been set to 'octavia'
20:00:18 <cgoncalves> hi
20:00:21 <johnsom> Hi folks
20:00:34 <johnsom> nudge rm_work
20:00:49 <johnsom> #topic Announcements
20:01:16 <xgerman> board elections going on
20:01:17 <johnsom> We are past the MS2 milestone for Stein
20:01:19 <xgerman> please vote
20:01:35 <johnsom> Yes, that. Check your e-mail for your ballot
20:02:10 <johnsom> Also, python-octaviaclient was cut as part of the release cycle process
20:02:52 <johnsom> I think that is all I have for announcements, anything else I missed?
20:03:20 <johnsom> #topic Gate stability
20:03:38 <johnsom> Ok, next up I wanted to give a quick update on some of the gate issues we have seen recently.
20:04:08 <johnsom> The Ubuntu Bionic gate is failing due to some path validation issues when installing rsyslog.
20:04:17 <johnsom> I have put a DIB patch up:
20:04:23 <johnsom> #link https://review.openstack.org/629744
20:04:29 <johnsom> That fixes that issue.
20:04:46 <johnsom> Probably need to bug people for the second +2 there
20:05:03 <rm_work> o/
20:05:11 <rm_work> thanks for the nudge :P
20:05:16 <johnsom> We have also seen some issues with the multi-node and intermittently with the scenario gates.
20:05:37 <johnsom> They show as "resource in use" errors during the test cleanup phases.
20:05:51 <johnsom> This is a neutron issue that is being tracked here:
20:05:57 <johnsom> #link https://bugs.launchpad.net/neutron/+bug/1810504
20:05:58 <openstack> Launchpad bug 1810504 in neutron "neutron-tempest-iptables_hybrid job failing with internal server error while listining ports" [High,In progress] - Assigned to Slawek Kaplonski (slaweq)
20:05:59 <johnsom> and
20:06:06 <johnsom> #link https://review.openstack.org/#/c/628492/
20:06:56 <johnsom> So, until that fix merges we are doing recheck roulette
20:07:27 <johnsom> Any questions/comments on those updates?
20:08:01 <johnsom> cgoncalves I think you were looking at some issues with the centos gate. Any updates to share?
20:08:51 * johnsom wonders if he said Hi and ran...
20:08:54 <cgoncalves> johnsom, not much progress yet. it's DIB doing something I couldn't pinpoint. I've reached out to Ian. he's helping
20:09:04 <cgoncalves> hi^2
20:09:06 <johnsom> Ok cool.
20:09:28 <johnsom> #topic Brief progress reports / bugs needing review
20:09:29 <cgoncalves> but he's also busy with DIB RHEL 8 support :)
20:09:40 <cgoncalves> https://review.openstack.org/#/c/623137/
20:10:06 <johnsom> I have been focused on wrapping up flavors support. The patch chain including the tempest tests is ~14 patches deep now.
20:10:34 <johnsom> I still have some polish to finish on the tests, etc. but it should be done this week.
20:10:45 <cgoncalves> I'll find a typo in the head patch and make you rebase all :P
20:11:07 <johnsom> It has happened a few times already
20:12:14 <johnsom> After I am done with flavors, I need to get back to doing some more reviews. I know the TLS patches are up. I have done some small reviews on some of that, but more is needed.
20:12:36 <johnsom> Any other updates today?
20:13:36 <johnsom> There are a couple of easy reviews too: doc update, updated coverage, etc.
20:13:43 <xgerman> logs
20:13:48 <johnsom> #link https://review.openstack.org/630792
20:13:56 <johnsom> #link https://review.openstack.org/629953
20:14:07 <cgoncalves> not from my side. I was traveling last week and will be OoO next 2 days at least
20:14:10 <johnsom> #link https://review.openstack.org/629955
20:15:04 <johnsom> Ok, on to the TLS barbican patch topic
20:15:13 <johnsom> #topic Discuss TLS / Barbican integration gate
20:15:20 <johnsom> #link https://review.openstack.org/628075
20:15:47 <johnsom> I took some time out of flavors to get a TLS using barbican gate put together.
20:16:00 <xgerman> cgoncalves: travel was so stressful that you need time off? What happened?
20:16:01 <johnsom> We have only been talking about needing this for a few years...  sigh.
20:16:34 <rm_work> T_T
20:16:48 <johnsom> Basically it follows the cookbook steps for setting up a terminated TLS LB using barbican and then passes some HTTPS traffic through it.
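(For reference, a rough sketch of the flow that gate exercises, written against openstacksdk and a devstack with Barbican; the cloud name, subnet ID, and PKCS12 file below are placeholders and this is not the actual tempest test code:)

    import base64

    import openstack
    import requests

    # 'devstack' is an assumed clouds.yaml entry
    conn = openstack.connect(cloud='devstack')

    # Store the pre-built PKCS12 bundle (server cert + key) in Barbican.
    with open('server.p12', 'rb') as f:
        payload = base64.b64encode(f.read()).decode('utf-8')
    secret = conn.key_manager.create_secret(
        name='tls-terminated-test',
        payload=payload,
        payload_content_type='application/octet-stream',
        payload_content_encoding='base64')

    # Create a load balancer and a TERMINATED_HTTPS listener that
    # references the Barbican secret for certificate/key material.
    lb = conn.load_balancer.create_load_balancer(
        name='tls-test-lb', vip_subnet_id='private-subnet-id')
    # wait for the LB to go ACTIVE before adding the listener (omitted here)
    conn.load_balancer.create_listener(
        name='tls-test-listener',
        load_balancer_id=lb.id,
        protocol='TERMINATED_HTTPS',
        protocol_port=443,
        default_tls_container_ref=secret.secret_ref)

    # With a pool and members behind the listener, HTTPS traffic should flow.
    resp = requests.get('https://%s/' % lb.vip_address, verify=False)
    print(resp.status_code)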
20:16:54 <cgoncalves> xgerman, lol, unrelated
20:17:18 <johnsom> Didn't get the depth of tan he wanted.....
20:18:05 <johnsom> Anyway, I set this up as a separate gate instead of running it with the main scenario tests. cgoncalves has some concerns about that, so I thought we should talk about it.
20:18:55 <cgoncalves> comment in https://review.openstack.org/#/c/628075/10/octavia_tempest_plugin/tests/barbican_scenario/v2/test_tls_barbican.py
20:18:57 <johnsom> My main motivation is that this test is highly dependent on barbican being functional and I felt we should isolate that from our main tests and make it easy to make non-voting, etc.
20:19:56 <cgoncalves> again, I'm not against that. I'm just making it clear that some code will not be tested against stable maintenance releases unless we also set up tls jobs with stable releases
20:19:58 <johnsom> It also means not every scenario test gate needs to install barbican in the devstack, which saves some RAM and runtime.
20:21:00 <xgerman> yeah, don’t think making us that dependent on BBQ is good - especially with things like OVN L4 which don’t need SSL
20:21:32 <johnsom> So, I'm interested in comments/discussion. We have a few options that are obvious to me: 1. Leave it as is. 2. Merge it into the main scenario jobs. 3. Leave as is but add stable branch jobs. 4. Leave as is and add periodic jobs for the stable branches.
20:23:35 <johnsom> No more discussion????
20:24:13 <cgoncalves> I shared my comment in gerrit and here. I am happy to revert my -1, not a problem
20:24:48 <johnsom> What did you think of the periodic idea?
20:25:25 <cgoncalves> I'm not a fan of periodic jobs for such things. building amps for publishing is good, but tests? :/
20:26:00 <johnsom> We run periodics for the stable unit tests, etc. now. I know everyone looks at those all the time...  grin
20:26:22 <cgoncalves> we what? :S
20:26:52 <cgoncalves> filtering octavia jobs doesn't even seem to be possible in http://zuul.openstack.org/builds
20:27:16 <cgoncalves> doesn't seem to accept regex, so we have to be sure of the full job name
20:27:29 <johnsom> #link http://zuul.openstack.org/builds?project=openstack%2Foctavia&pipeline=periodic-stable
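(The same filtered view is available from the Zuul REST API; a small sketch, assuming the dashboard query parameters map straight onto the /api/builds endpoint and that the response field names match the current Zuul version:)

    import requests

    # Fetch recent periodic-stable builds for the octavia project.
    resp = requests.get(
        'https://zuul.openstack.org/api/builds',
        params={'project': 'openstack/octavia',
                'pipeline': 'periodic-stable',
                'limit': 50})
    resp.raise_for_status()
    for build in resp.json():
        print(build['end_time'], build['branch'],
              build['job_name'], build['result'])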
20:27:59 * cgoncalves bookmarks
20:28:09 <johnsom> These are bit-rot jobs; basically they make sure the unit tests still pass
20:29:07 <johnsom> Some day I should fix this too:
20:29:09 <johnsom> #link http://grafana.openstack.org/d/dfOINcIiz/octavia-failure-rates?orgId=1
20:30:17 <cgoncalves> rm_work, xgerman: what's your stance with regard to the TLS job?
20:30:18 <johnsom> So what I am hearing is leave it as is? (well, enable it on the master octavia branch too)
20:30:47 <rm_work> i'm on the fence
20:31:07 <cgoncalves> I'll +2 it. no strong reason to block it
20:31:08 <xgerman> I think it should be an extra job since we have drivers (OVN) which don’t use TLS and would be tied up with BBQ which might be brittle
20:31:35 <rm_work> i just feel like we have so many jobs ... but maybe i should get over it
20:31:42 <xgerman> cgoncalves: you can leave the -1, rm_work and I can overwrite ;-)
20:31:56 <cgoncalves> I can imagine Reedip submitting a patch introducing provider_tls_enabled :)
20:32:18 <cgoncalves> xgerman, payback for the DB connection patch, eh? xD
20:32:46 <xgerman> lol
20:33:02 <johnsom> Ahead of you on that: CONF.loadbalancer_feature_enabled.terminated_tls_enabled
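(That flag is what lets non-TLS providers such as OVN opt out; a minimal sketch of the usual tempest skip_checks guard around it, with an illustrative class name rather than the plugin's real test classes:)

    from tempest import config
    from tempest import test

    CONF = config.CONF


    class TLSTerminationScenarioTest(test.BaseTestCase):
        """Illustrative test class guarding on the TLS feature flag."""

        @classmethod
        def skip_checks(cls):
            super(TLSTerminationScenarioTest, cls).skip_checks()
            # Skip all TERMINATED_HTTPS tests when the deployment's provider
            # does not support TLS termination.
            if not CONF.loadbalancer_feature_enabled.terminated_tls_enabled:
                raise cls.skipException(
                    '[loadbalancer_feature_enabled] terminated_tls_enabled '
                    'is False, skipping TERMINATED_HTTPS tests.')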
20:33:39 <cgoncalves> oh, right. great, thank you :)
20:34:43 <johnsom> Ok, if no one else has any more comments. I think we should merge this and consider a periodic for stable/queens and rocky.
20:35:40 <xgerman> +2
20:35:44 <johnsom> #topic Open Discussion
20:35:52 <johnsom> Ok, any other topics this week?
20:36:13 <xgerman> oh, did we talk about the deadline for Denver talks?
20:36:23 <xgerman> that should be coming up, too...
20:36:37 <johnsom> January 23rd
20:36:44 <xgerman> yeah, soon-ish
20:36:45 <johnsom> Yeah, next week
20:37:09 <johnsom> I am thinking about putting in a new features deep dive. Anyone else want in on that?
20:38:32 <xgerman> sweet — if I get to go, happy to help ;-)
20:39:17 <cgoncalves> good, that would make a good session
20:39:22 <xgerman> maybe do a forum thing where each of us talks about his favorite feature and you moderate
20:39:31 <xgerman> :-)
20:39:48 <xgerman> then we can have a ton of speakers, too :-)
20:40:05 <johnsom> Hahaha, I will consider it.
20:40:18 <johnsom> Ok, so just German with me on the session?
20:41:14 <cgoncalves> I thought some weeks ago of a troubleshooting Octavia talk. I get loads of presumably Octavia bugs that turn out to be user-side or underlying infra
20:41:31 <cgoncalves> if you need assistance, you could probably count me in
20:41:32 <xgerman> yeah, my guys send one your way :-)
20:41:41 <johnsom> Right. I can help with that too.
20:41:52 <johnsom> Ok, I will put you down as well.
20:42:12 <johnsom> #link https://www.openstack.org/summit/denver-2019/call-for-presentations
20:42:13 <cgoncalves> another one was OVN driver but still kind of early
20:42:28 <xgerman> yeah, we should have Reedip do that :-)
20:43:25 <johnsom> Ok. Any other topics for today?
20:43:44 <cgoncalves> agreed. I need to check with him asap
20:44:17 <xgerman> if he does it I want it Marvel themed… have him wear a cape
20:44:25 <xgerman> :-)
20:45:39 <johnsom> Ok, thanks for attending. Have a great week. Please work on reviews!
20:45:47 <johnsom> I will try to update the review list today as well
20:45:48 <cgoncalves> no VMs, active-active, ... so much to gain from ovn driver
20:46:25 <johnsom> #endmeeting