15:00:15 #startmeeting neutron_ci
15:00:17 hi
15:00:20 Meeting started Wed Sep 2 15:00:15 2020 UTC and is due to finish in 60 minutes. The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:22 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:24 The meeting name has been set to 'neutron_ci'
15:01:13 o/
15:01:37 Hi bcafarel!
15:02:04 hi bcafarel and lajoskatona :)
15:02:10 hey lajoskatona slaweq :)
15:02:19 setuptools 50 is capped now so we can say CI is good right?
15:02:19 I think we can start as ralonsoh is on pto today
15:02:20 Hi everybody :-)
15:02:39 bcafarel: yes, I think we can possible say that ;)
15:03:43 Grafana dashboard: http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:03:55 #topic Actions from previous meetings
15:04:01 ralonsoh to check timing out neutron-ovn-tempest-full-multinode-ovs-master jobs - bug https://bugs.launchpad.net/neutron/+bug/1886807
15:04:02 Launchpad bug 1886807 in neutron "neutron-ovn-tempest-full-multinode-ovs-master job is failing 100% times" [High,Confirmed] - Assigned to Maciej Jozefczyk (maciejjozefczyk)
15:04:12 I will assign it to ralonsoh for next week
15:04:18 #action ralonsoh to check timing out neutron-ovn-tempest-full-multinode-ovs-master jobs - bug https://bugs.launchpad.net/neutron/+bug/1886807
15:04:27 next one
15:04:29 slaweq to ask jlibosva and lucasgomes if they can check https://bugs.launchpad.net/neutron/+bug/1890445
15:04:29 Launchpad bug 1890445 in neutron "[ovn] Tempest test test_update_router_admin_state failing very often" [Critical,Confirmed]
15:04:34 I asked them today about it :)
15:04:43 so I hope jlibosva will check this issue this week
15:04:51 next one
15:04:53 slaweq to propose neutron-tempest-plugin switch to focal nodes
15:05:00 Patch https://review.opendev.org/#/c/748367
15:05:17 now I'm waiting for new results after all this setuptools issues are fixed
15:05:21 I hope it will be ok now
15:06:45 nice, I had not seen that one
15:06:46 ralonsoh to check issue with pep8 failures like https://zuul.opendev.org/t/openstack/build/6c8fbf9b97b44139bf1d70b9c85455bb
15:07:11 that is the last one and I will assign it to ralonsoh for next week as I saw such issue even today
15:07:17 #action ralonsoh to check issue with pep8 failures like https://zuul.opendev.org/t/openstack/build/6c8fbf9b97b44139bf1d70b9c85455bb
15:08:26 and that are all actions from last week
15:08:34 lets move to the next topic
15:08:35 #topic Switch to Ubuntu Focal
15:08:40 Etherpad: https://etherpad.opendev.org/p/neutron-victoria-switch_to_focal
15:08:56 except that neutron-tempest-plugin patch I didn't check anything else this week
15:08:59 did You maybe?
15:09:15 small fixes https://review.opendev.org/#/c/734304/ and https://review.opendev.org/#/c/748168/
15:09:31 (sorry with rebasing I lost all the +2 on first one)
15:09:49 it will get functional and lower-constraints out of the "to fix" list
15:10:35 +2'ed both already
15:11:02 I checked last week bagpipe but that's failing with some volume tests
15:11:13 https://64274739a94af6e87bea-e6e77a9441ae9cccde1b8ed58f97fc24.ssl.cf5.rackcdn.com/739672/3/check/networking-bagpipe-tempest/4a5e824/testr_results.html
15:12:16 lajoskatona: is it all the time the same issue?
15:13:17 for last few runs yes
15:14:06 can You ask cinder team to check that?
15:14:42 yes, I just rechecked and if it still exists I ask them with fresh logs
15:14:49 lajoskatona: ok, thx
15:15:18 ok, I think we can move on to the next topic
15:15:21 #topic standardize on zuul v3
15:15:26 Etherpad: https://etherpad.openstack.org/p/neutron-train-zuulv3-py27drop
15:15:56 I did only small progress on neutron-grenade-ovn job last week
15:16:10 I will try to continue it this week if I will have some time
15:16:34 of the 3 jobs in openstack-zuul-jobs, 2 have been removed, and one seems an artifact of my scripts (legacy-periodic-neutron-dynamic-routing-dsvm-tempest-with-ryu-master-scenario-ipv4) because it's not defined in master
15:17:21 tosky: great, thx for info
15:17:30 tosky: for last one a recent fix, I think I cleaned it same time as unused playbooks
15:17:33 so we still have neutron-grenade-ovn and networking-odl-grenade to finish
15:17:38 and then midonet jobs
15:17:47 the big open question about midonet :)
15:17:58 there is 2 guys who wants to maintain this project
15:18:06 there was some recent discussion on ML about it
15:18:38 and I already discussed with them that making those jobs working on Ubuntu 20.04 and migrating to zuulv3 is most important for now
15:20:03 I update my todo list to go back to odl grenade....
15:20:26 thx lajoskatona
15:21:19 with that I think we can move on
15:21:23 next topic is
15:21:25 #topic Stable branches
15:21:39 Ussuri dashboard: http://grafana.openstack.org/d/pM54U-Kiz/neutron-failure-rate-previous-stable-release?orgId=1
15:21:40 Train dashboard: http://grafana.openstack.org/d/dCFVU-Kik/neutron-failure-rate-older-stable-release?orgId=1
15:21:49 bcafarel: any issues with stable branches?
15:22:38 I have a few rechecks in backlog after setuptools issue, but from what I saw, all branches are back in working order :)
15:22:51 that's good news
15:23:39 so I think we can quickly move to the next topic
15:23:41 #topic Grafana
15:23:45 #link http://grafana.openstack.org/d/Hj5IHcSmz/neutron-failure-rate?orgId=1
15:24:23 in general after setuptools issue now things are getting back to normal
15:25:12 pep8 job is failing too often recently but IMO it's related to this issue which is assigned to ralonsoh already
15:25:39 and other issue is with periodic jobs
15:26:35 and TBH I didn't found any specific new issues in functional/fullstack nor scenario jobs
15:27:06 only issue which is worth to mention here is related to the periodic job openstack-tox-py36-with-ovsdbapp-master
15:27:10 I am not complaining that you did not find any new issues
15:27:22 2 tests are failing every day since 28.08
15:27:40 so we should definitely check and fix that before we will release new ovsdbapp version
15:27:50 and we should release new version this week :/
15:27:59 anyone wants to check that?
15:28:33 most likely https://review.opendev.org/#/c/745746/ is the culprit of that issue
15:28:50 I can try to look into it end of this week (or Monday probably)
15:29:16 bcafarel: monday may be too late as this week is final release of non-client libraries
15:29:30 so tomorrow we should do release of ovsdbapp for victoria
15:29:39 I will ask otherwiseguy to check that issue maybe
15:29:41 :/
15:29:54 yeah I was about to suggest to drag otherwiseguy in
15:31:51 I just pinged him about it
15:33:28 ok, that's all from me for today
15:33:39 do You have anything else regarding our ci?
15:34:10 nothing from me
15:35:08 lajoskatona: anything else from You?
15:35:23 nothing
15:35:34 ok, so lets finish meeting earlier today
15:35:37 thx for attending
15:35:45 o/
15:35:47 #endmeeting