16:01:59 #startmeeting Octavia
16:02:00 Meeting started Wed Jan 8 16:01:59 2020 UTC and is due to finish in 60 minutes. The chair is rm_work. Information about MeetBot at http://wiki.debian.org/MeetBot.
16:02:01 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
16:02:03 The meeting name has been set to 'octavia'
16:02:08 o/
16:02:27 o/
16:02:28 Welcome back folks
16:02:35 I did put a quick agenda together
16:02:48 #link https://wiki.openstack.org/wiki/Octavia/Weekly_Meeting_Agenda#Meeting_2020-01-08
16:03:20 hi
16:03:35 hi
16:03:39 Ah, you want to drive?
16:03:45 I'm on mobile
16:03:48 Sure
16:03:57 #topic Announcements
16:04:06 I just have two things here
16:04:14 Happy New Year!
16:04:25 and an FYI:
16:04:34 Python-octaviaclient 2.0.0 release proposed - First release to no longer support python 2.x
16:04:53 #link https://review.opendev.org/701449
16:05:13 I bumped the client to 2.x.x to reflect that it no longer supports python 2.x
16:05:22 Any other announcements today?
16:06:16 #topic Brief progress reports / bugs needing review
16:06:24 Ok, moving on.
16:06:40 I have been on holiday for a few weeks, so not a lot to report.
16:06:57 Same kinda
16:07:51 idem
16:08:17 I am again focused on fixing the issues with the failover flow. I have standalone failover in a pretty good place now, just need to wrap that up, then I will enhance it for active/standby. Still a lot of work to do here around act/stdby, updating for the v2 flows, and test updates.
16:09:01 That and internal work will probably be my focus for the next week.
16:10:23 I did allocate some review time yesterday as well, trying to catch up on what happened over my time off
16:11:08 Any other updates this week?
16:11:25 I've been trying to continue the py2 deprecation work, and doing a little deprecation cleanup
16:11:42 Yes, thanks!
16:12:06 also noticed there should be some follow-on once the jobboard code merges to get some of the constants into octavia-lib perhaps
16:12:22 I'll be spending time this week with reviews and internal work
16:12:30 Yeah, I have three open patches for the constants, though they are very out of date now....
16:13:09 johnsom: ack, i'll try and find them before opening new ones
16:13:10 haleyb: The chain starts here:
16:13:12 #link https://review.opendev.org/#/c/617015/
16:14:02 Those kind of got caught up in the Octavia velocity and got out of date quickly
16:14:04 1 year, 1 month ago, yikes!
16:14:29 Sadly I have older open patches... lol
16:15:26 We may want to wait on those until the jobboard -> dict work merges as they will definitely conflict.
16:16:14 johnsom: yeah, there's a lot of updates in those patches, should definitely go first
16:17:37 Any other updates, or should we move on?
16:18:04 #topic Open Discussion
16:18:15 Any other topics today?
16:18:23 i have one...
16:18:32 http://grafana.openstack.org/d/p6DsYOBWz/octavia-failure-rate?orgId=1
16:19:09 the grafana dashboard merged last month actually, so we can look back a while in failure history
16:19:22 gate is at the top, check the bottom
16:20:00 lol, doesn't look like it saw a lot of runs over the break.
16:20:05 it's split into failures on the left, number of jobs on the right
16:20:11 or over the weekend(s)
16:20:35 thanks for working on the dashboard, haleyb!
16:20:55 Yeah, thanks for updating those
16:20:59 do you have the link to the dashboard patch handy?
16:21:04 and just an fyi that there can be failures in the check queue that are simply "bad" patches, but at least there's some data there
16:21:33 https://review.opendev.org/698994
16:21:35 Right
16:21:39 thanks
16:21:47 #link https://review.opendev.org/698994
16:21:55 which jobs do we want to see reported in the dashboard?
16:22:00 Are there graphs for the periodic jobs as well?
16:22:09 at least there's info now for when you want to move something to voting
16:22:10 I think the old dashboard had that
16:22:36 i don't remember finding much info on periodic jobs in graphite
16:23:09 are there still any post-zuulv3 migration?
16:23:12 It was on line 28 in the old dashboard config file in the patch you just posted
16:23:28 Oh yes
16:23:54 We have narrowed the image builds:
16:24:01 https://github.com/openstack/octavia/blob/master/zuul.d/projects.yaml#L70
16:24:16 i'm having either ffox or gerrit load issues so can't see the review
16:24:53 There are also a bunch of bit-rot jobs and oslo cross-project jobs
16:25:26 if i can find the info in graphite i'll add a periodic job panel
16:25:49 octavia has a lot of jobs. would it be of interest to still add missing jobs to the dashboard? like barbican, active-standby, spare pool, amphora v2, etc.
16:26:07 Ok, thanks. I was trying to get a zuul health page that had the list, but they have changed the UI again, so it's taking me a bit to find
16:26:11 if you haven't gone there, https://graphite01.opendev.org/ has a lot
16:27:04 cgoncalves: yes, i'd consider that dashboard a work in progress as we add/remove jobs
16:27:14 or just have ones i missed
16:27:43 haleyb, ok. simply trying to understand if too many jobs is a good/desired or bad/undesired thing
16:27:48 Here are some:
16:27:50 #link https://zuul.openstack.org/builds?project=openstack%2Foctavia&pipeline=periodic#
16:28:09 I don't see the unit test runs on the stable branches there though.
16:29:05 I wonder if those got dropped/moved in the parent zuul configs and we don't have those any longer.
16:29:25 We have been backporting a lot, so at least we have had coverage on them
16:30:30 * haleyb shrugs
16:31:15 On another topic, I would like to ask that we all review the Protocol validation patch:
16:31:17 #link https://review.opendev.org/#/c/594040
16:31:33 It's #2 on the priority list and we have been getting more questions about it recently.
16:31:41 This has also sat for way too long IMO.
16:32:28 I think it also needs an update and I'm not sure the original author is still working on it, so if someone wants to pick that up..... Please consider it
16:32:41 agreed. will review it this week
16:32:59 I can help with that, I think
16:33:56 Cool, thank you!
16:34:04 I have a question related to 3rd party providers
16:34:06 I will also carve some review time today
16:34:44 is octavia open to have 3rd party CI jobs reporting in Gerrit? if yes, voting or non-voting?
16:35:08 There is a process for third party jobs. Let me find a link:
16:35:11 IMO 3rd party CI jobs should always be non-voting
16:35:36 #link https://docs.openstack.org/infra/system-config/third_party.html
16:35:50 I have run third party CI for octavia in the past.
16:36:02 This process works very well
16:36:37 thanks
16:36:54 so, skimming that page it looks like they should always be non-voting. good.
16:37:09 As for voting, yes, since the third party driver code is out of the main project tree, and typically requires equipment/code/licenses that upstream does not have access to, they should be limited to non-voting.
16:37:39 I.e. There isn't much we can do to fix the third-party code when it breaks.
16:38:19 what about Zuul jobs by 3rd party providers available in Opendev Gerrit, how should they be set up?
16:38:52 because in that case I think they wouldn't be external CI, or whatever they are called
16:39:12 instead they would have to be added to octavia check queue
16:39:18 Yeah, that is a new situation for sure.
16:39:55 My vote would be to add them at the bottom of the list (clearly labeled third-party somehow) and leave them non-voting.
16:40:23 There is value to having them such that we can make sure we don't break them if they are following our guidelines for providers.
16:41:22 sounds good
16:41:37 Those are my opinions, what do others think?
16:44:18 Ok
16:44:21 +1, all neutron reviews list third-party jobs in a separate section
16:45:40 at least the cloudbase one is separate
16:47:23 thanks for the input!
16:49:16 Other topics for today?
16:52:26 Ok, thank you folks! Chat with you next week.
16:52:35 #endmeeting
16:52:50 hmmm, rm_work may need to end the meeting
16:53:26 I bet all of my topics didn't log either
16:53:37 Really... Eugh
16:53:53 #endmeeting