15:00:40 #startmeeting neutron_ci
15:00:40 Meeting started Mon Mar 10 15:00:40 2025 UTC and is due to finish in 60 minutes. The chair is ykarel. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:40 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:40 The meeting name has been set to 'neutron_ci'
15:00:46 Ping list: bcafarel, lajoskatona, slawek, mlavalle, mtomaska, ralonsoh, ykarel, jlibosva, elvira
15:00:47 \o
15:00:57 hello, video or IRC today?
15:01:05 I always forget
15:01:32 IRC
15:01:36 cool
15:02:03 ++
15:02:09 hi
15:02:29 Hi
15:02:38 Merged openstack/neutron stable/2024.2: [ovn][trivial] Add 'empty_string_filtering' extension to OVN https://review.opendev.org/c/openstack/neutron/+/943417
15:02:42 Merged openstack/neutron stable/2024.1: [ovn][trivial] Add 'empty_string_filtering' extension to OVN https://review.opendev.org/c/openstack/neutron/+/943418
15:02:46 Merged openstack/neutron stable/2023.2: [ovn][trivial] Add 'empty_string_filtering' extension to OVN https://review.opendev.org/c/openstack/neutron/+/943419
15:03:26 late o/
15:03:42 k let's start with topics, others can join in the meanwhile
15:03:47 #topic Actions from previous meetings
15:03:57 ralonsoh to check and open bug for privsep related failure in functional job
15:04:28 Sorry, I didn't open it yet
15:04:36 I was investigating it
15:05:05 ack thx for checking, will readd
15:05:10 #action ralonsoh to check and open bug for privsep related failure in functional job
15:05:21 ralonsoh to check and open bug for fullstack failures failing at wait_until_true from quite some time, original suspect was https://review.opendev.org/c/openstack/neutron/+/937843/
15:05:28 yes
15:05:30 #link https://bugs.launchpad.net/neutron/+bug/2101839
15:05:37 And I proposed https://review.opendev.org/c/openstack/neutron/+/943942
15:05:43 temporarily
15:06:55 thx ralonsoh
15:07:07 ohkk so eventlet reintroduced in the test
15:07:24 yes only for fullstack tempest
15:07:31 until we remove it from this FW
15:08:07 ok
15:08:16 I hope we will have time and company support in the coming cycles for this eventlet work otherwise Neutron will be like the gordian knot
15:08:44 ++
15:08:47 I finished the metadata last week and I'm now working on the L3 agent
15:08:56 but yes, more hands are welcome
15:09:13 moving to next
15:09:16 ykarel to check and open bug for ovs tempest failures
15:09:33 opened https://bugs.launchpad.net/neutron/+bug/2101840
15:09:37 #link https://bugs.launchpad.net/neutron/+bug/2101840
15:10:00 it's kind of critical now as the ovs jobs are failing quite frequently
15:10:11 seems related to the eventlet cleanups we have?
15:10:51 like https://review.opendev.org/c/openstack/neutron/+/942393
15:10:51 thanks for checking, I just read your comment in lp, I'll check it
15:10:56 but the dhcp agent moved out of eventlet some months ago
15:11:28 from logs i see dhcp agents have a high number of events to process, like 200+
15:11:28 right... it received some adaptations
15:11:40 but the complete removal was finished 2 weeks ago
15:13:07 I'll check if the eventlet removal is related or not, and check this bug
15:13:24 thx lajoskatona
15:13:39 #action lajoskatona to check https://bugs.launchpad.net/neutron/+bug/2101840
15:14:07 ykarel to open bug for ping failure post upgrade in grenade
15:14:17 #link https://bugs.launchpad.net/neutron/+bug/2101166
15:14:30 this is also seen in ovs jobs, dvr multinode
15:14:54 not that frequent though
15:15:24 ykarel to open bug for tempest failure in test_established_tcp_session_after_re_attachinging_sg
15:15:31 #link https://bugs.launchpad.net/neutron/+bug/2101165
15:15:51 this is also seen mostly in ovs jobs
15:16:02 and i recall seeing similar traces for a long time
15:16:10 didn't find an old bug so reported this one to track it
15:16:28 that's it for action items from the previous week
15:16:29 #topic Stable branches
15:16:45 bcafarel, any update ^
15:17:16 a few backports last week (and merging), nothing looking bad in the CI runs there
15:17:18 couple of patches merged in stable
15:17:35 yeap merges in both stable and unmaintained went fine
15:17:50 in periodic i saw some failures but none of them consistent this week
15:18:13 last was openstacksdk but that got resolved already
15:18:23 thx bcafarel for the updates
15:18:31 #topic Stadium projects
15:18:35 all green in periodic
15:18:39 lajoskatona, anything to add
15:18:53 nothing from my side for stadiums
15:19:17 k thx
15:19:26 #topic Rechecks
15:19:52 we still have a couple of rechecks to get patches merged due to known issues
15:20:04 bare recheck wise we had 4/39
15:20:15 out of those 3 were in the same patch, a test patch for pyroute2 master
15:20:27 so good, let's keep avoiding bare rechecks
15:20:28 yes, sorry for that
15:20:39 Peter is testing pyroute2 with a branch created only for neutron
15:20:39 np those were from Petr
15:21:02 ++ thx for working on that
15:21:12 Now let's check failures
15:21:22 test_ovn_nb_sync_repair_delete_ovn_nb_db in stable
15:21:29 https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_811/periodic/opendev.org/openstack/neutron/stable/2024.2/neutron-functional-with-uwsgi-fips/811c11f/testr_results.html
15:21:36 pushed backports https://review.opendev.org/q/Ibaeaa36dbbb9def10a163b4ee071cb432db5a383
15:21:52 all merged to stable
15:22:02 some random failures in functional
15:22:09 - https://27d33baddc3887bf0580-122b756d505eee79094990db0a134990.ssl.cf1.rackcdn.com/periodic/opendev.org/openstack/neutron/master/neutron-functional-with-oslo-master/77dd0c1/testr_results.html
15:22:10 - https://6047f309b8432f0272cd-3f475577843c6519bda0c8c12c28da61.ssl.cf2.rackcdn.com/periodic/opendev.org/openstack/neutron/master/neutron-functional-with-sqlalchemy-master/1d78049/testr_results.html
15:22:40 from the failures could see traces related to sqlite, but it's seen mainly in master and the logger driver is involved
15:22:40 ^^ I'll check them
15:22:51 proposed https://review.opendev.org/c/openstack/neutron/+/943937
15:23:11 we can check again once this merges
15:23:19 cool
15:23:35 +1
15:23:40 test_migration failed twice in the last periodic runs in 2024.1
15:23:47 - https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_d55/periodic/opendev.org/openstack/neutron/stable/2024.1/neutron-fullstack-with-uwsgi-fips/d55b927/testr_results.html
15:23:48 - https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_b18/periodic/opendev.org/openstack/neutron/stable/2024.1/neutron-fullstack-with-uwsgi-fips/b18873b/testr_results.html
15:24:37 have to confirm if the next run fails similarly or it was an intermittent failure
15:25:25 i can check that tomorrow and report an issue for this in any case, consistent or not
15:25:38 thanks
15:25:51 #action ykarel to check next runs for fullstack fips and report issue with fullstack
15:26:19 test_configurations_are_synced_towards_placement(NIC Switch agent)
15:26:27 https://70ea00e7f434a8c178c6-dd4a29c0fe6beace2356eed9dd1f7a86.ssl.cf2.rackcdn.com/928586/18/gate/neutron-fullstack/cd00a4b/testr_results.html
15:26:41 lajoskatona, maybe you recall ^
15:26:55 i recall you looking into this in the past
15:27:05 from the perspective of whether a timeout increase is needed or not
15:27:11 yes, occasionally this failed
15:27:24 I'll check this one also
15:28:00 thx lajoskatona
15:28:28 #action lajoskatona to check fullstack test failure test_configurations_are_synced_towards_placement(NIC Switch agent)
15:28:37 #topic Tempest/Scenario
15:28:50 ubuntu jammy job fails randomly https://zuul.openstack.org/builds?job_name=neutron-tempest-plugin-ovn-ubuntu-jammy&project=openstack/neutron
15:29:19 i noticed a couple of failures where tests fail during ssh for missing keys
15:29:35 i recall some recent discussion but couldn't find it
15:29:40 I talked about this to haleyb|out last week
15:29:55 I just proposed to mark it as non-voting (the easy path...)
15:29:57 ohkk do we already have a bug for this?
15:30:03 nope sorry
15:30:12 there was no pattern
15:30:22 ohkk let's start with a bug and see if we need non-voting or any fix for it
15:30:31 perfect
15:30:32 I'll open it
15:30:48 i recall one issue with ovn metadata in jammy, not sure if that's related but can check
15:31:56 https://bugzilla.redhat.com/show_bug.cgi?id=2172036 and https://bugs.launchpad.net/neutron/+bug/2007166
15:32:13 but let's open a new one and we can check if it's the same ^ or different
15:32:31 #action ralonsoh to open an issue for random issues in ovn ubuntu job
15:32:36 thx ralonsoh
15:32:44 #topic Periodic
15:32:57 - pyroute2 functional, already being tracked in https://bugs.launchpad.net/neutron/+bug/2100261
15:33:08 #topic Grafana
15:33:16 https://grafana.opendev.org/d/f913631585/neutron-failure-rate
15:33:23 let's have a quick look here
15:35:06 can see some spike in ovs jobs, which we already discussed
15:35:19 and same in fullstack and functional
15:35:39 and other check pipeline failures are likely patch specific
15:35:45 anything to add?
15:36:20 no thanks
15:37:06 k let's move
15:37:07 #topic On Demand
15:37:29 anything else you would like to raise?
15:37:42 nothing from me
15:41:18 k if nothing else, let's close early and give everyone 19 minutes back
15:41:25 #endmeeting