15:00:04 #startmeeting neutron_ci
15:00:05 Meeting started Wed Sep 30 15:00:04 2020 UTC and is due to finish in 60 minutes. The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:06 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:08 The meeting name has been set to 'neutron_ci'
15:00:54 hi
15:01:32 Hi
15:01:35 o/
15:03:22 ok, let's start
15:03:32 Grafana dashboard: http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:03:53 #topic Actions from previous meetings
15:04:13 ralonsoh to check timing out neutron-ovn-tempest-full-multinode-ovs-master jobs - bug https://bugs.launchpad.net/neutron/+bug/1886807
15:04:15 Launchpad bug 1886807 in neutron "neutron-ovn-tempest-full-multinode-ovs-master job is failing 100% times" [High,Confirmed] - Assigned to Maciej Jozefczyk (maciejjozefczyk)
15:04:28 please read the comment in https://bugs.launchpad.net/neutron/+bug/1886807/comments/3
15:04:50 long story short: I think the error is related to https://bugs.launchpad.net/nova/+bug/1863021
15:04:53 Launchpad bug 1863021 in OpenStack Object Storage (swift) "[SRU] eventlet monkey patch results in assert len(_active) == 1 AssertionError" [Undecided,In progress] - Assigned to Chris MacNaughton (chris.macnaughton)
15:05:12 and I pushed https://review.opendev.org/#/c/755256/, I think this should fix this error
15:05:28 but I don't think this is related to OVN or neutron
15:05:44 that's great news, thx ralonsoh :)
15:05:47 yw
15:05:57 ok
15:06:00 next one is
15:06:02 slaweq to report bug with neutron-dhcp-agent and Fedora 32
15:06:11 I didn't, but the bug is reported: https://bugs.launchpad.net/neutron/+bug/1896945
15:06:12 Launchpad bug 1896945 in neutron "dnsmasq >= 2.81 not responding to DHCP requests with current q-dhcp configs" [High,In progress] - Assigned to Dan Radez (dradez)
15:06:18 and radez already works on that
15:06:30 I think he found that it is a "fedora only" issue
15:06:51 yeah, this was the one I checked with ubuntu 20.10, and on that it worked
15:07:21 and that is also good news for us, at least from u/s PoV :)
15:07:21 o/ sry stepped away and missed the start
15:07:33 radez: hi
15:07:49 don't worry, I was just saying that You are working on this issue with dnsmasq and F32
15:08:11 saw that thx :)
15:08:30 thank You for taking care of this issue :)
15:08:43 and with that I think we can move on to the next topic
15:08:45 which is
15:08:49 #topic Switch to Ubuntu Focal
15:09:00 Etherpad: https://etherpad.opendev.org/p/neutron-victoria-switch_to_focal
15:09:04 I updated it today
15:09:16 we are basically done with most of the jobs
15:09:21 (finally)
15:09:46 good work everyone!
15:09:46 but we should check once again, e.g. the stadium projects, that we didn't miss anything
15:11:10 I hope that next week we will be able to remove that topic from this meeting :)
15:11:25 ok, let's move on
15:11:28 #topic Stadium projects
15:11:33 https://review.opendev.org/#/c/754068/ for sfc is still in the queue :)
15:11:59 bcafarel: sorry, I missed that somehow
15:12:07 I will look at it just after this meeting
15:12:51 bcafarel: (and others) if You have any other patch related to the migration to Focal which we missed, please add it to that etherpad
15:13:05 so it will be easier to find and review it (at least for me) :)
15:13:44 ok, I think we can move on to the next topic
15:13:59 so stadium projects
15:14:04 and migration to zuulv3
15:14:09 Etherpad: https://etherpad.openstack.org/p/neutron-train-zuulv3-py27drop
15:14:20 we still have just those last 2 jobs to convert:
15:14:26 networking-odl: https://review.opendev.org/#/c/725647/
15:14:28 neutron: https://review.opendev.org/#/c/729591/
15:15:22 for the neutron-ovn-grenade job it seems that it passed now :)
15:15:27 yes!
15:15:29 so please review this patch
15:15:55 nice!
15:16:05 I just checked the odl logs, but now it seems better
15:16:08 and lajoskatona, regarding the odl patch I see that the grenade job is still failing
15:16:16 is it ready to review or not yet?
15:17:02 hmmm, it failed originally as well, and the same is true for the tempest jobs sadly
15:17:32 I'll check the logs in detail, and "report" the status on irc and on the etherpad
15:18:15 so, if that is failing the same way as the old job, and the failure isn't related to the job's configuration, then I think we should be good to go with it now
15:18:22 and fix the job later
15:18:33 yeah
15:18:40 ok, so I will review it also
15:19:34 ok, anything else regarding stadium projects' CI?
15:20:52 if not, then let's move on
15:20:55 #topic Stable branches
15:21:33 first of all I think that we need to update our dashboards for the "previous" and "older" stable releases
15:21:36 that makes me think we should switch the stable dashboards to victoria soon
15:21:40 :)
15:21:40 :)
15:21:44 :D
15:21:51 every time I'm first :P
15:22:06 I should train on fast typing :)
15:22:26 actually You are usually typing faster than me :)
15:22:44 bcafarel: can You take care of that?
15:23:38 sure
15:23:41 thx a lot
15:24:02 #action bcafarel to update our grafana dashboards for stable branches
15:24:25 except that, do we have anything else regarding stable branches for today?
15:24:36 I think that ci for stable branches is pretty ok, isn't it?
15:26:01 yes, patches got in without too many rechecks
15:26:31 and as the list of open backports is short, I sent https://review.opendev.org/#/c/755203/ for stable releases
15:26:57 thx
15:28:07 ok, so I think we are good with stable branches now
15:28:13 let's move on to the next topic
15:28:15 #topic Grafana
15:28:25 #link http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:29:25 grafana is starting to look better now
15:29:37 ovn jobs and neutron-tempest-plugin jobs are finally going down
15:29:55 the functional job is failing pretty often
15:30:08 but I didn't really find any specific issues there so far
15:30:53 anything else You want to add here?
15:31:19 no thanks
15:32:04 ok, so let's move on
15:32:14 #topic Tempest/Scenario
15:32:30 recently I opened 3 new bugs:
15:32:37 https://bugs.launchpad.net/neutron/+bug/1897898
15:32:38 Launchpad bug 1897898 in neutron "Scenario test test_multiple_ports_portrange_remote is unstable" [Critical,Confirmed]
15:32:38 https://bugs.launchpad.net/neutron/+bug/1897326
15:32:39 Launchpad bug 1897326 in neutron "scenario test test_floating_ip_update is failing often on Ubuntu 20.04" [Critical,In progress] - Assigned to Slawek Kaplonski (slaweq)
15:32:43 https://bugs.launchpad.net/neutron/+bug/1896735
15:32:44 Launchpad bug 1896735 in neutron "Scenario tests from neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON failing due to ssh failure" [Critical,In progress] - Assigned to Slawek Kaplonski (slaweq)
15:33:03 the first 2 started happening after the migration to Focal
15:33:09 at least I didn't see them before
15:33:32 and for https://bugs.launchpad.net/neutron/+bug/1897326 I just sent patch https://review.opendev.org/755314 which should hopefully fix it
15:33:51 and https://bugs.launchpad.net/neutron/+bug/1896735 is happening pretty often also
15:33:52 Launchpad bug 1896735 in neutron "Scenario tests from neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON failing due to ssh failure" [Critical,In progress] - Assigned to Slawek Kaplonski (slaweq)
15:33:59 so I will probably mark this test as unstable for now too
15:34:02 wdyt?
15:34:19 sounds in line with the other 2, yes
15:34:35 last one you say?
15:35:29 why is https://bugs.launchpad.net/neutron/+bug/1896735 related to https://review.opendev.org/#/c/748367/?
15:35:30 Launchpad bug 1896735 in neutron "Scenario tests from neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON failing due to ssh failure" [Critical,In progress] - Assigned to Slawek Kaplonski (slaweq)
15:35:59 sorry, I probably mixed up some links
15:36:05 np
15:36:58 ahh, it is related because it marks this test as unstable due to that bug
15:37:46 right, thanks!
15:38:22 also, after our recent issues with merging patches to the neutron-tempest-plugin repo, I was checking some of the failures and I proposed some small patches
15:38:27 https://review.opendev.org/#/c/725526/
15:38:33 https://review.opendev.org/#/c/755112/
15:38:38 https://review.opendev.org/#/c/755122/
15:39:02 https://review.opendev.org/#/c/755112/ should help with some of the Authentication failures on stable branches
15:39:27 and the other 2 should at least add some extra logging of the console log and network config on the host in case of failures
15:39:43 if You have a few minutes, please check those patches
15:40:28 and that's all from my side regarding tempest/scenario jobs for today
15:40:33 do You have anything else?
15:40:45 no thanks
15:41:00 so cirros 0.5.1 can help? will take a look at the patch later
15:41:05 yes
15:41:30 because maciejjozefczyk did a patch to cirros to retry failed calls to metadata
15:41:34 and it is in 0.5.1
15:41:50 so in case a call for public-keys fails once, it will be retried
15:42:00 and the key should still end up configured in the guest vm
15:42:15 oh nice!
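[Editor's sketch of the retry behaviour discussed at 15:41:30-15:42:00. This is a hypothetical illustration, not the actual cirros init code; the function name `fetch_with_retry` and the retry count are invented for the example. The idea is the same: a transient metadata-service failure on the public-keys call is retried instead of leaving the guest vm without an ssh key.]

```shell
# Hypothetical sketch of retrying a flaky metadata call.
# Retries the given command up to 3 times, pausing briefly
# between attempts; returns non-zero only if all attempts fail.
fetch_with_retry() {
    local attempt
    for attempt in 1 2 3; do
        if "$@"; then
            return 0
        fi
        sleep 1
    done
    return 1
}

# In the guest this would wrap something like:
#   curl -sf http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key
```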
15:42:46 ok, so I have one more topic for today
15:42:53 #topic Periodic
15:43:06 first of all thx bcafarel for fixing the fedora job :)
15:43:48 but now it seems that openstack-tox-py36-with-ovsdbapp-master has been failing constantly for about a week
15:44:34 every time the same 2 tests are failing
15:44:45 anyone want to report a bug and check it?
15:45:04 I can
15:45:09 thx ralonsoh
15:45:34 #action ralonsoh to report a bug and check the failing openstack-tox-py36-with-ovsdbapp-master periodic job
15:45:50 ralonsoh: You can find logs in https://zuul.openstack.org/buildsets?project=openstack%2Fneutron&pipeline=periodic&branch=master
15:45:56 thanks
15:46:04 thank You
15:46:20 ok, and that was the last thing from me for today
15:46:29 do You have anything else You want to discuss?
15:46:43 not from me
15:47:53 so I think we can finish now :)
15:48:00 thx for attending
15:48:02 o/
15:48:05 #endmeeting