15:00:08 #startmeeting neutron_ci
15:00:08 Meeting started Tue Jun 22 15:00:08 2021 UTC and is due to finish in 60 minutes. The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:08 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:08 The meeting name has been set to 'neutron_ci'
15:00:15 hi (again)
15:00:20 hi
15:00:24 hi
15:00:35 o/ again
15:00:42 Grafana dashboard: http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:00:50 o/
15:01:11 #topic Actions from previous meetings
15:01:19 ralonsoh to update our ci jobs' definitions
15:01:39 #link https://review.opendev.org/c/openstack/neutron/+/797051
15:01:40 and
15:01:51 #link https://review.opendev.org/c/openstack/project-config/+/797454
15:02:34 thx ralonsoh, I will review those patches later tonight or tomorrow morning
15:02:43 thanks
15:03:23 added to the pile too
15:04:02 next one
15:04:05 obondarev to check fullstack timeouts in L3 agent tests
15:04:27 yes, so I checked the failure You mentioned at the last meeting: 'Lost connection to MySQL server during query' in neutron server logs
15:04:38 seems like a single random failure
15:04:40 ahh, ok, so it was oom-killer
15:04:48 but I checked other failures
15:04:52 yes, we have such problems from time to time
15:05:11 and it seems there is a pattern
15:05:23 not a pattern exactly, but a test that fails from time to time
15:05:24 https://bugs.launchpad.net/neutron/+bug/1933234
15:05:40 it's not related to "Port failed to become active"
15:05:59 it's about the MTU not being updated for a router interface
15:06:03 yes, I saw that one at least twice this week too
15:06:10 didn't have time to look closer into it
15:06:32 might still be related to privsep hanging, but not sure
15:07:25 if I have some time, I will try to have a look into that one
15:07:27 that's all the updates for now
15:07:35 thx obondarev
15:07:47 np
15:07:54 #action slaweq to check failing update mtu fullstack test: https://bugs.launchpad.net/neutron/+bug/1933234
15:08:18 ok, I think we can move on
15:08:20 #topic Stadium projects
15:08:26 any updates here?
15:08:32 or new ci issues?
15:09:02 on old branches we are fighting with things,
15:09:30 but thanks to elod and everybody, it seems things are progressing
15:10:01 that's good
15:10:13 if You need any help, please ping me
15:10:33 I also recently proposed a fix for a tap-as-a-service job: https://review.opendev.org/c/x/tap-as-a-service-tempest-plugin/+/797318
15:10:48 the exercise turned into a round of stable ci status checks, though that's good to have, yes
15:11:14 (still have to read elod's fix for that tox dependencies trick)
15:11:27 I'll check it; to tell the truth I let taas sleep and hoped the rename would happen soon, perhaps I'll ping infra about it
15:12:02 bcafarel: to tell the truth I can live without understanding all these dependency hell solutions
15:12:05 ahh, so we should hold that patch of mine until it is in the openstack/ namespace first, correct?
15:12:26 lajoskatona: this is also a reasonable way to keep your sanity :)
15:12:31 slaweq: not sure, the rename may only happen in the far future...
15:13:02 lajoskatona so feel free to check that patch and either approve it or hold it for now, I'm ok with both ways
15:13:06 slaweq: You can add an action point on me to check with infra
15:13:16 ok, I'll check it
15:13:25 #action lajoskatona to check with infra on the status of the tap-as-a-service move
15:13:25 bcafarel: LOL
15:13:44 thx lajoskatona
15:14:21 anything else regarding stadium, or can we move on?
15:14:29 we can move on, I think
15:14:34 thx
15:14:37 #topic Stable branches
15:14:43 any updates/issues here?
15:14:49 bcafarel? :)
15:15:29 ussuri had an issue with neutron-tempest-slow, fix already merged: https://review.opendev.org/c/openstack/neutron/+/797273
15:15:47 ++
15:15:50 thx for that fix
15:15:57 else overall, as of right now, all good :)
15:16:20 good, so we can move on
15:16:28 #topic Grafana
15:16:35 http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:17:11 the biggest problem recently was with the functional tests job
15:17:34 but the fix was merged last night and it seems to be better now
15:18:08 thx for the quick review of the fix btw. :)
15:19:01 anything else related to grafana for today?
15:19:05 or can we move on?
15:19:36 sounds like a +1 for moving on (and yes, it was nice to see that functional fix go in quickly)
15:20:00 good catch slaweq!
15:20:23 thx :)
15:20:29 ok, let's move on
15:20:42 #topic fullstack/functional
15:20:52 here I had only one issue for today
15:21:07 but obondarev already reported it and mentioned it here actually :)
15:21:16 :)
15:21:19 other than that I didn't find any new issues
15:21:46 btw. I proposed to add the neutron-functional job to the neutron-lib queues: https://review.opendev.org/c/openstack/neutron-lib/+/797281
15:21:58 but amotoki found some issue with that job's definition
15:22:11 I will need to have a closer look to make it run properly
15:22:22 will you propose the change to remove "tox_install_siblings"?
15:22:42 I think we can switch it to false.
15:22:44 ralonsoh I'm not sure what the consequences of that would be
15:22:49 s/false/true/
15:23:37 let's change that in neutron, some stadium projects are already doing it
15:23:38 the first thing we can do is set tox_install_siblings to true in the neutron-lib patch to land it.
15:23:49 then we can try to switch it in the neutron repo.
15:23:50 having it set to false may have been for jobs inheriting from the base one (but this is a very vague memory)
15:24:17 I'll check any other projects inheriting from neutron-functional
15:24:34 ralonsoh I will check it tomorrow :)
15:24:40 thx for the advice guys
15:24:40 perfect
15:25:13 fyi: n-d-r explicitly sets it to true in its zuul.yaml.
15:25:52 ok, so this seems like a good way to go :)
15:25:58 I will update my patch tomorrow morning
15:26:47 ok, let's move on
15:26:48 #topic Tempest/Scenario
15:26:57 here I also didn't find any new issues for today
15:27:09 I just have a quick heads-up
15:27:24 This playbook is executed if true: https://opendev.org/zuul/zuul-jobs/src/branch/master/roles/tox/tasks/siblings.yaml
15:27:28 I proposed a patch https://review.opendev.org/c/openstack/neutron-tempest-plugin/+/795929 to enable tls-proxy where it wasn't enabled
15:27:29 it looks like it works fine
15:28:20 cool, I think we can merge it now again
15:28:23 thx lajoskatona
15:29:01 ralonsoh why "again"?
15:29:33 that was disabled some years ago
15:29:51 that was due to an error in tempest, if I'm not wrong
15:30:26 ahh ok
15:30:37 but now it's enabled by default in tempest jobs IIRC
15:30:51 so also in most of our jobs which inherit from devstack-tempest
15:31:08 we just need to enable it in some jobs where it is explicitly disabled
15:31:15 and it looks like it's good now :)
15:32:30 ok, that's basically all I had for today
15:32:37 periodic jobs are fine this week
15:32:47 anything else You want to discuss today?
15:32:55 or if not, we can finish earlier today
15:33:19 nothing from me
15:33:56 nothing from me either
15:34:53 ok, so thx for attending the meeting today
15:35:04 o/
15:35:07 #endmeeting
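For reference, a minimal sketch of the tox_install_siblings change discussed under the fullstack/functional topic. The job name below is hypothetical; tox_install_siblings is the actual variable read by the zuul-jobs tox role (via the siblings.yaml playbook linked at 15:27:24), and when true the role installs the checked-out required-projects into the tox virtualenv in place of the released packages:

    - job:
        name: neutron-lib-functional        # hypothetical job name
        parent: neutron-functional
        required-projects:
          # checked out from source by zuul; with siblings enabled, the
          # tox role installs this checkout into the tox virtualenv
          - openstack/neutron
        vars:
          tox_install_siblings: true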
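And a similar sketch for the tls-proxy change from the Tempest/Scenario topic, again with a hypothetical job name; devstack_services is the standard variable that devstack-based zuul jobs use to toggle individual services, so re-enabling tls-proxy mostly means dropping the old explicit "tls-proxy: false" override:

    - job:
        name: neutron-tempest-plugin-scenario-example   # hypothetical job name
        parent: devstack-tempest
        vars:
          devstack_services:
            # tls-proxy is now on by default in devstack-tempest based jobs;
            # remove leftover overrides, or set it back to true explicitly
            tls-proxy: true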