15:00:31 #startmeeting neutron_ci
15:00:31 Meeting started Tue Dec 12 15:00:31 2023 UTC and is due to finish in 60 minutes. The chair is ykarel. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:31 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:31 The meeting name has been set to 'neutron_ci'
15:00:35 \o
15:00:41 ping bcafarel, lajoskatona, mlavalle, mtomaska, ralonsoh, ykarel, jlibosva, elvira
15:00:46 hi
15:00:47 Grafana dashboard: https://grafana.opendev.org/d/f913631585/neutron-failure-rate?orgId=1
15:00:47 Please open now :)
15:01:41 hi
15:02:44 o/
15:02:56 o/
15:03:03 * bcafarel in another meeting, watching from time to time
15:03:10 Let's start
15:03:15 #topic Actions from previous meetings
15:03:17 o/
15:03:25 ykarel to check and reopen bug for shy ports in functional tests
15:03:38 reopened https://bugs.launchpad.net/neutron/+bug/1961740
15:03:54 Rodolfo Alonso proposed openstack/ovn-bgp-agent master: [UT] Fix the ``TestLinuxNet`` wrong assert calls https://review.opendev.org/c/openstack/ovn-bgp-agent/+/903511
15:04:02 ykarel to check/report issue with test_unshelve_to_specific_host
15:04:14 reported https://bugs.launchpad.net/nova/+bug/2045785
15:04:26 mlavalle to check failure with test_port_binding_chassis_create_event
15:04:43 I did check it, using https://zuul.openstack.org/builds?job_name=neutron-functional-with-uwsgi&project=openstack%2Fneutron&branch=master&result=FAILURE&skip=0
15:05:02 I didn't find another occurrence since the beginning of the month
15:05:10 so it was probably a one-off
15:05:25 Thanks for checking mlavalle
15:05:35 slawek to report bug for fullstack failure
15:05:45 #link https://bugs.launchpad.net/neutron/+bug/2045757
15:06:00 it should also be fixed now
15:06:02 #link https://review.opendev.org/c/openstack/neutron/+/902762
15:06:18 slaweq, so it could even be backported?
15:06:34 it is, up to Zed
15:06:43 all backports merged already
15:06:47 https://review.opendev.org/q/I1c20c90b8abd760f3a53b24024f19ef2bd189b5a
15:06:48 ok good, I missed that
15:06:54 thx for taking care of it
15:07:01 yw
15:07:05 ralonsoh to send fix for ft test test_port_forwarding
15:07:26 oops, I think I have the patch but never sent it
15:07:30 I'll do it today
15:07:35 sorry
15:07:43 np, thx for looking into it
15:07:55 #topic Stable branches
15:08:09 bcafarel, all yours ^
15:09:11 I noticed a couple of patches merged this week in stable, and the periodic lines were good too
15:09:25 so it should be all good in stable
15:09:27 yes, I have a few reviews not caught up on
15:09:33 but from what I saw, all good!
15:09:42 thx bcafarel
15:09:52 and with ussuri going EOL, one less branch to worry about soon :)
15:09:52 #topic Stadium projects
15:09:59 +1
15:10:19 wrt Stadium, weekly periodics were green
15:10:25 so all good here too
15:10:34 #topic Rechecks
15:10:52 quite a good number of patches merged this week across branches
15:11:09 and no bare rechecks out of 36 rechecks in total
15:11:16 let's keep doing it
15:11:33 there were some infra issues too that led to such rechecks
15:12:18 #topic Unit tests
15:12:28 Just an info
15:12:34 Random POST_FAILURES against jobs, resolved with https://review.opendev.org/c/opendev/base-jobs/+/903351
15:13:11 #topic fullstack/functional
15:13:34 I have not noticed anything new in fullstack/functional, so nothing reported here
15:14:01 it was the general fullstack timeout and known functional failures, the one I reopened as linked above
15:14:05 that's good news :)
15:14:26 slaweq++ nice patches for fullstack
15:14:43 ++
15:15:05 #topic Tempest/Scenario
15:15:25 Ubuntu image download random issue should be resolved with https://review.opendev.org/q/I7163aea4d121cb27620e4f2a083a543abfc286bf
15:15:42 are you sure?
15:15:57 that was merged last night in master
15:16:08 and since then I have not seen the failure, have you seen some?
15:16:10 I think I commented in some patch like that already
15:16:24 the timeouts seem to be in the neutron-tempest-plugin's devstack plugin
15:16:53 https://github.com/openstack/neutron-tempest-plugin/blob/master/devstack/customize_image.sh
15:17:36 yes, the function here is a wrapper around the devstack one, and as the fix is already in devstack we should be good now
15:17:36 ahh, ok
15:17:43 now I see
15:17:48 sorry for the noise then :)
15:17:53 ++
15:18:26 There are some other random failures in tempest jobs for which bugs are already reported.
15:18:49 So if we start seeing more such failures in those tempest jobs we can consider skipping the impacted tests
15:20:05 I mean https://bugs.launchpad.net/neutron/+bug/2039940 and https://bugs.launchpad.net/nova/+bug/2045785
15:20:28 #topic Periodic
15:21:00 https://zuul.openstack.org/builds?job_name=devstack-tobiko-neutron&project=openstack%2Fneutron&branch=master&skip=0
15:21:12 Last run failed with virt-customize errors, need to check if it's a one-time failure or happens again
15:21:38 I will check the next periodic runs to see if it's some consistent new issue or a one-time thing
15:22:12 #action ykarel to look for next periodic runs of devstack-tobiko-neutron and file bug if consistent issue
15:22:19 Stable periodics
15:22:27 https://zuul.openstack.org/builds?job_name=neutron-functional-with-uwsgi-fips&project=openstack%2Fneutron&branch=stable%2Fyoga&skip=0
15:22:36 Already assigned bug to ralonsoh https://bugs.launchpad.net/neutron/+bug/2039650
15:23:03 ok, I had this before my parental leave
15:23:06 I'll retake it again
15:23:42 thx ralonsoh
15:24:09 I just noticed it as only yoga had consistent failures in periodic
15:24:21 #topic Grafana
15:24:27 https://grafana.opendev.org/d/f913631585/neutron-failure-rate
15:24:42 let's have a quick look here too
15:25:24 some spikes during the weekend
15:26:17 I wouldn't bother with weekends too much, there are usually some spikes as fewer patches are tested during that time
15:26:30 right
15:26:38 +1
15:26:44 rest all looks good
15:27:00 #topic On Demand
15:27:52 I'm good
15:28:02 me too
15:28:58 nothing from me
15:29:01 either
15:29:27 thx everyone then, let's close and give everyone 31 minutes back :)
15:29:31 #endmeeting