16:01:59 <rm_work> #startmeeting Octavia
16:02:00 <openstack> Meeting started Wed Jan  8 16:01:59 2020 UTC and is due to finish in 60 minutes.  The chair is rm_work. Information about MeetBot at http://wiki.debian.org/MeetBot.
16:02:01 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
16:02:03 <openstack> The meeting name has been set to 'octavia'
16:02:08 <haleyb> o/
16:02:27 <johnsom> o/
16:02:28 <rm_work> Welcome back folks
16:02:35 <johnsom> I did put a quick agenda together
16:02:48 <johnsom> #link https://wiki.openstack.org/wiki/Octavia/Weekly_Meeting_Agenda#Meeting_2020-01-08
16:03:20 <cgoncalves> hi
16:03:35 <gthiemonge> hi
16:03:39 <rm_work> Ah, you want to drive?
16:03:45 <rm_work> I'm on mobile
16:03:48 <johnsom> Sure
16:03:57 <johnsom> #topic Announcements
16:04:06 <johnsom> I just have two things here
16:04:14 <johnsom> Happy New Year!
16:04:25 <johnsom> and an FYI:
16:04:34 <johnsom> Python-octaviaclient 2.0.0 release proposed - First release to no longer support python 2.x
16:04:53 <johnsom> #link https://review.opendev.org/701449
16:05:13 <johnsom> I bumped the client to 2.x.x to reflect that it no longer supports python 2.x
16:05:22 <johnsom> Any other announcements today?
16:06:16 <johnsom> #topic Brief progress reports / bugs needing review
16:06:24 <johnsom> Ok, moving on.
16:06:40 <johnsom> I have been on holiday for a few weeks, so not a lot to report.
16:06:57 <rm_work> Same kinda
16:07:51 <cgoncalves> same here
16:08:17 <johnsom> I am again focused on fixing the issues with the failover flow. I have standalone failover in a pretty good place now, just need to wrap that up, then I will enhance it for active/standby.  Still a lot of work to do here around act/stdby, updating for the v2 flows, and test updates.
16:09:01 <johnsom> That and internal work will probably be my focus for the next week.
16:10:23 <johnsom> I did allocate some review time yesterday as well, trying to catch up on what happened over my time off
16:11:08 <johnsom> Any other updates this week?
16:11:25 <haleyb> I've been trying to continue the py2 deprecation work, and doing a little deprecation cleanup
16:11:42 <johnsom> Yes, thanks!
16:12:06 <haleyb> also noticed there should be some follow-on once the jobboard code merges to get some of the constants into octavia-lib perhaps
16:12:22 <cgoncalves> I'll be spending time this week with reviews and internal work
16:12:30 <johnsom> Yeah, I have three open patches for the constants, though they are very out of date now....
16:13:09 <haleyb> johnsom: ack, i'll try and find them before opening new ones
16:13:10 <johnsom> haleyb The chain starts here:
16:13:12 <johnsom> #link https://review.opendev.org/#/c/617015/
16:14:02 <johnsom> Those kind of got caught up in the Octavia velocity and got out of date quickly
16:14:04 <haleyb> 1 year, 1 month ago, yikes!
16:14:29 <johnsom> Sadly I have older open patches... lol
16:15:26 <johnsom> We may want to wait on those until the jobboard -> dict work merges as they will definitely conflict.
16:16:14 <haleyb> johnsom: yeah, there are a lot of updates in those patches, the jobboard work should definitely go first
16:17:37 <johnsom> Any other updates, or should we move on?
16:18:04 <johnsom> #topic Open Discussion
16:18:15 <johnsom> Any other topics today?
16:18:23 <haleyb> i have one...
16:18:32 <haleyb> http://grafana.openstack.org/d/p6DsYOBWz/octavia-failure-rate?orgId=1
16:19:09 <haleyb> the grafana dashboard merged last month actually, so we can look back a while in failure history
16:19:22 <haleyb> gate is at the top, check the bottom
16:20:00 <johnsom> lol, doesn't look like it saw a lot of runs over the break.
16:20:05 <haleyb> it's split into failures on the left, number of jobs on the right
16:20:11 <haleyb> or over the weekend(s)
16:20:35 <cgoncalves> thanks for working on the dashboard, haleyb!
16:20:55 <johnsom> Yeah, thanks for updating those
16:20:59 <cgoncalves> do you have the link to the dashboard patch handy?
16:21:04 <haleyb> and just an fyi that there can be failures in the check queue that are simply "bad" patches, but at least there's some data there
16:21:33 <haleyb> https://review.opendev.org/698994
16:21:35 <johnsom> Right
16:21:39 <cgoncalves> thanks
16:21:47 <johnsom> #link https://review.opendev.org/698994
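(A minimal sketch of what one panel in a grafyaml dashboard like that could look like; the graphite metric path and job name below are assumptions for illustration, not copied from the actual change.)

    dashboard:
      title: Octavia Failure Rate
      rows:
        - title: Gate queue
          panels:
            - title: Failure Rate (gate, master)
              type: graph
              span: 6
              targets:
                # Failures as a percentage of all finished runs, smoothed
                # over 24 hours. The real series name comes from zuul's
                # statsd hierarchy and will differ from this placeholder.
                - target: >
                    alias(movingAverage(asPercent(
                    stats.zuul.pipeline.gate.project.openstack_octavia.master.job.octavia-v2-dsvm-scenario.FAILURE,
                    sumSeries(stats.zuul.pipeline.gate.project.openstack_octavia.master.job.octavia-v2-dsvm-scenario.{SUCCESS,FAILURE})),
                    '24hours'), 'octavia-v2-dsvm-scenario')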
16:21:55 <cgoncalves> which jobs do we want to see reported in the dashboard?
16:22:00 <johnsom> Are there graphs for the periodic jobs as well?
16:22:09 <haleyb> at least there's info now for when you want to move something to voting
16:22:10 <johnsom> I think the old dashboard had that
16:22:36 <haleyb> i don't remember finding much info on periodic jobs in graphite
16:23:09 <haleyb> are there still any periodic jobs left after the zuul v3 migration?
16:23:12 <johnsom> It was on line 28 in the old dashboard config file in the patch you just posted
16:23:28 <johnsom> Oh yes
16:23:54 <johnsom> We have narrowed the image builds:
16:24:01 <johnsom> https://github.com/openstack/octavia/blob/master/zuul.d/projects.yaml#L70
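(For context, the periodic jobs are defined alongside check/gate in the project's pipeline stanza in zuul.d/projects.yaml; a rough sketch of the shape is below. The job names here are placeholders, the real list is in the linked file.)

    - project:
        # check/gate pipelines omitted for brevity
        periodic:
          jobs:
            - octavia-amphora-image-build    # placeholder job name
            - octavia-grenade                # placeholder job name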
16:24:16 <haleyb> i'm having either ffox or gerrit load issues so can't see the review
16:24:53 <johnsom> There are also a bunch of bit-rot jobs and oslo cross-project jobs
16:25:26 <haleyb> if i can find the info in graphite i'll add a periodic job panel
16:25:49 <cgoncalves> octavia has a lot of jobs. would it be of interest to still add missing jobs to the dashboard? like barbican, active-standby, spare pool, amphora v2, etc.
16:26:07 <johnsom> Ok, thanks. I was trying to get a zuul health page that had the list, but they have changed the UI again, so it's taking me a bit to find
16:26:11 <haleyb> if you haven't gone there, https://graphite01.opendev.org/ has a lot
16:27:04 <haleyb> cgoncalves: yes, i'd consider that dashboard a work in progress as we add/remove jobs
16:27:14 <haleyb> or just have ones i missed
16:27:43 <cgoncalves> haleyb, ok. simply trying to understand if too many jobs is a good/desired or bad/undesired thing
16:27:48 <johnsom> Here are some:
16:27:50 <johnsom> #link https://zuul.openstack.org/builds?project=openstack%2Foctavia&pipeline=periodic#
16:28:09 <johnsom> I don't see the unit test runs on the stable branches there though.
16:29:05 <johnsom> I wonder if those got dropped/moved in the parent zuul configs and we don't have those any longer.
16:29:25 <johnsom> We have been backporting a lot, so at least we have had coverage on them
16:30:30 * haleyb shrugs
16:31:15 <johnsom> On another topic, I would like to ask that we all review the Protocol validation patch:
16:31:17 <johnsom> #link https://review.opendev.org/#/c/594040
16:31:33 <johnsom> It's #2 on the priority list and we have been getting more questions about it recently.
16:31:41 <johnsom> This has also sat for way too long IMO.
16:32:28 <johnsom> I think it also needs an update and I'm not sure the original author is still working on it, so if someone wants to pick that up... please consider it
16:32:41 <cgoncalves> agreed. will review it this week
16:32:59 <cgoncalves> I can help with that, I think
16:33:56 <johnsom> Cool, thank you!
16:34:04 <cgoncalves> I have a question related to 3rd party providers
16:34:06 <johnsom> I will also carve some review time today
16:34:44 <cgoncalves> is octavia open to have 3rd party CI jobs reporting in Gerrit? if yes, voting or non-voting?
16:35:08 <johnsom> There is a process for third party jobs. Let me find a link:
16:35:11 <cgoncalves> IMO 3rd party CI jobs should always be non-voting
16:35:36 <johnsom> #link https://docs.openstack.org/infra/system-config/third_party.html
16:35:50 <johnsom> I have run third party CI for octavia in the past.
16:36:02 <johnsom> This process works very well
16:36:37 <cgoncalves> thanks
16:36:54 <cgoncalves> so, skimming that page it looks like they should always be non-voting. good.
16:37:09 <johnsom> As for voting, yes, since the third party driver code is out of the main project tree, and typically requires equipment/code/licenses that upstream does not have access to, they should be limited to non-voting.
16:37:39 <johnsom> I.e. There isn't much we can do to fix the third-party code when it breaks.
16:38:19 <cgoncalves> what about Zuul jobs by 3rd party providers available in Opendev Gerrit, how should they be set up?
16:38:52 <cgoncalves> because in that case I think they wouldn't be external CI, or whatever they are called
16:39:12 <cgoncalves> instead they would have to be added to octavia check queue
16:39:18 <johnsom> Yeah, that is a new situation for sure.
16:39:55 <johnsom> My vote would be to add them at the bottom of the list (clearly labeled third-party somehow) and leave them non-voting.
16:40:23 <johnsom> There is value to having them such that we can make sure we don't break them if they are following our guidelines for providers.
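(A rough sketch of how an in-tree third-party provider job could be added to the octavia check queue as non-voting in zuul.d/projects.yaml; the provider job name is hypothetical.)

    - project:
        check:
          jobs:
            - octavia-v2-dsvm-scenario
            # third-party provider jobs: clearly labeled and kept non-voting
            - example-provider-octavia-tempest:
                voting: false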
16:41:22 <cgoncalves> sounds good
16:41:37 <johnsom> Those are my opinions, what do others think?
16:44:18 <johnsom> Ok
16:44:21 <haleyb> +1, all neutron reviews list third-party jobs in a separate section
16:45:40 <haleyb> at least the cloudbase one is separate
16:47:23 <cgoncalves> thanks for the input!
16:49:16 <johnsom> Other topics for today?
16:52:26 <johnsom> Ok, thank you folks! Chat with you next week.
16:52:35 <johnsom> #endmeeting
16:52:50 <johnsom> hmmm, rm_work may need to end the meeting
16:53:26 <johnsom> I bet all of my topics didn't log either
16:53:37 <rm_work> Really... Eugh
16:53:53 <rm_work> #endmeeting