15:01:15 #startmeeting neutron_ci
15:01:16 Meeting started Wed May 20 15:01:15 2020 UTC and is due to finish in 60 minutes. The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:01:17 hi
15:01:18 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:01:20 The meeting name has been set to 'neutron_ci'
15:01:36 hi
15:01:53 (sorry, too many meetings at once)
15:02:21 hi
15:02:28 o/
15:02:49 \o
15:03:18 ok, let's do that fast :)
15:03:29 Grafana dashboard: http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:03:30 Please open it now :)
15:04:42 #topic Actions from previous meetings
15:04:51 ralonsoh to continue checking ovn jobs timeouts
15:04:57 I'm on it
15:05:07 focusing on ovsdbapp and python-ovn
15:05:25 ok, thx ralonsoh
15:05:28 but no conclusions yet
15:05:41 so I will keep it for next week, just to track that
15:05:44 sure
15:05:47 #action ralonsoh to continue checking ovn jobs timeouts
15:05:49 thx a lot
15:05:55 next one
15:05:57 bcafarel to update stable branches grafana dashboards
15:07:05 #link https://review.opendev.org/#/c/729291/
15:07:26 as we had updated these recently, this time it was easier
15:08:05 thx bcafarel
15:08:33 and the last one from previous week
15:08:35 slaweq to switch functional uwsgi job to be voting
15:08:43 Patch proposed: https://review.opendev.org/729588
15:08:59 I will also update grafana when that is merged
15:09:13 and that's all from last week
15:09:22 ironic that this specific job failed on this patch :)
15:10:37 ouch
15:10:51 let's check it a couple of times before we merge it
15:11:11 https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_9b8/729588/1/check/neutron-functional-with-uwsgi/9b8e96a/testr_results.html looks unrelated
15:12:14 yes, doesn't look related to uwsgi
15:13:31 ok, let's move on
15:13:33 #topic Stadium projects
15:13:44 standardize on zuul v3
15:13:57 I started looking today at the neutron-ovn-grenade job
15:14:05 and I saw that it's not run in the neutron gate at all
15:14:27 so I proposed to add it as a non-voting job for now
15:14:29 https://review.opendev.org/#/c/729591/
15:14:42 and then I will also work on the migration to zuulv3
15:14:57 I just want to have some legacy job results to compare against during the migration
15:16:18 I don't think there is any other update about that
15:16:28 but if I am wrong, please tell me now :)
15:17:32 looks like the silent crowd agrees
15:17:42 yeah
15:17:45 so let's move on
15:17:55 #topic Stable branches
15:18:01 Train dashboard: http://grafana.openstack.org/d/pM54U-Kiz/neutron-failure-rate-previous-stable-release?orgId=1
15:18:03 Stein dashboard: http://grafana.openstack.org/d/dCFVU-Kik/neutron-failure-rate-older-stable-release?orgId=1
15:18:48 ^ that will change soon :)
15:19:14 :)
15:19:50 I did not see many failures last week, I think it was good on the stable front
15:20:04 yes, that's also my impression
15:20:13 more rechecks are on the master branch
15:21:09 anything else regarding stable branches for today?
15:22:49 not from me
15:22:50 ok, so let's move on
15:22:58 #topic Grafana
15:23:07 #link http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:24:07 ovn-related jobs are a bit high in the check queue - around 40%
15:24:29 http://grafana.openstack.org/d/Hj5IHcSmz/neutron-failure-rate?viewPanel=16&orgId=1
15:24:58 now they are going down a bit but are still at the top of this graph
15:27:17 but I don't have any specific failure to check right now
15:27:36 maybe it's just because there were quite a few WIP or DNM patches related to the ovn driver recently
15:27:55 let's check that in the next days and we will see what to do with it
15:28:07 +1
15:28:18 probably
15:28:25 other than that all looks ok'ish IMO
15:29:08 ok
15:29:21 anything else regarding grafana?
15:30:13 no
15:30:23 ok, let's move on
15:30:25 #topic fullstack/functional
15:30:50 I saw a few issues in functional tests this week, e.g.
15:30:53 https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_6d0/726168/2/check/neutron-functional/6d0b174/testr_results.html
15:31:15 but I only saw it once
15:31:22 Yeah I noticed that too.
15:33:37 I don't see anything obvious in the log from this test: https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_6d0/726168/2/check/neutron-functional/6d0b174/controller/logs/dsvm-functional-logs/neutron.tests.functional.agent.l3.test_ha_router.L3HATestFailover.test_ha_router_failover.txt
15:34:06 but I will check it more deeply this week
15:34:22 #action slaweq to check failure in test_ha_router_failover: https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_6d0/726168/2/check/neutron-functional/6d0b174/testr_results.html
15:34:55 regarding fullstack tests, I still see the firewall test failing, like https://93b88ea1fe64fba121c6-f42d955827477dc68a274454ee4340d5.ssl.cf2.rackcdn.com/726168/2/check/neutron-fullstack/cb8503e/testr_results.html
15:35:11 I think we need to reopen the bug related to this and mark this test as unstable again
15:36:14 what do You think?
15:36:25 ok
15:36:55 shouldn't we have more debug info there?
15:37:24 yes, I will try to add some additional logging in this test
15:37:26 I remember we added the routes, the devices, etc.
15:37:32 (maybe I'm wrong)
15:37:44 ralonsoh: not to fullstack tests AFAIR
15:37:48 but I will check that
15:37:50 oops
15:38:10 #action slaweq to add additional logging for fullstack's firewall tests
15:38:26 #action slaweq to reopen bug related to failing fullstack firewall tests
15:38:56 and that's all I have regarding functional and fullstack tests for today
15:39:16 let's move on
15:39:18 #topic Tempest/Scenario
15:39:31 here I found one new issue, "Address already allocated in subnet"
15:39:37 https://2f302d35d7c2b9201857-14c47b0c762b46266aadd7f2c624d382.ssl.cf5.rackcdn.com/665467/70/check/neutron-tempest-plugin-scenario-openvswitch/9514b9a/testr_results.html
15:40:04 maybe it's just an http request timeout on the client's side and after the retry it was already done in neutron
15:40:12 but IMO it's worth checking
15:40:17 any volunteer for that?
15:40:28 me
15:40:33 but on Friday
15:40:48 ralonsoh: great, thx a lot
15:41:12 #action ralonsoh to check Address already allocated in subnet issue in tempest job
15:41:24 ok, and the last thing for today
15:41:26 #topic Periodic
15:41:36 thx maciejjozefczyk for fixing the fedora ovn job, it's now fine
15:41:49 \o/
15:41:53 today I noticed that openstack-tox-py36-with-ovsdbapp-master is failing
15:42:05 Bug is reported: https://bugs.launchpad.net/ovsdbapp/+bug/1879717
15:42:05 Launchpad bug 1879717 in ovsdbapp "Neutron's unit tests are failing with ovsdbapp from master branch" [Undecided,New]
15:42:13 and otherwiseguy already proposed a patch for that
15:42:36 Yes, looks like it's related to the autoindex feature that was recently merged...
15:42:56 https://review.opendev.org/728306 ?
15:42:57 we caught it at the last minute, because there is a new ovsdbapp release proposed and if we had released it, the neutron gate would be doomed
15:43:36 bcafarel: https://review.opendev.org/729627
15:44:32 and basically that's all from me for today
15:44:38 anything else You want to discuss?
15:44:50 thanks (and nice to see the -with-xx-master jobs catching this kind of issue!)
15:45:16 bcafarel: yes, good that we are checking them at least once a week :)
15:45:24 that too :)
15:45:37 and nothing else from me
15:45:38 btw. I started a reorganisation of the zuul job definitions in neutron-tempest-plugin: https://review.opendev.org/#/c/729567/
15:45:59 there is some issue there now, so I have to check it, but please be ready to review it when it passes zuul :)
15:47:33 nice
15:47:33 if You don't have anything else, I will give You 13 minutes back
15:47:49 thx for attending and have a great rest of the week :)
15:47:51 o/
15:47:56 o/
15:47:56 #endmeeting
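
For reference on the agreed action to mark the flaky fullstack firewall test as unstable again: neutron carries an unstable_test decorator in neutron.tests.base that is normally used for this. The sketch below is only illustrative and assumes that helper; the class name, test name, and bug reference are placeholders, not the actual fullstack test or bug discussed in the meeting.

    # Minimal sketch, assuming neutron.tests.base.unstable_test is available;
    # the class, test method and bug id are placeholders for illustration.
    from neutron.tests import base


    class TestFirewallConnectivity(base.BaseTestCase):  # hypothetical class

        @base.unstable_test("bug <id of the reopened firewall bug>")
        def test_connectivity_with_security_groups(self):  # hypothetical test
            # If the wrapped test raises, unstable_test turns the failure into
            # a skip that records the linked bug, so the intermittent failure
            # stops breaking the gate while it is investigated.
            pass

The decorator is typically removed again once the linked bug is fixed, so the test goes back to gating normally.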