15:01:16 #startmeeting neutron_ci
15:01:16 Meeting started Mon Feb 24 15:01:16 2025 UTC and is due to finish in 60 minutes. The chair is ykarel. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:01:16 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:01:16 The meeting name has been set to 'neutron_ci'
15:01:22 Ping list: bcafarel, lajoskatona, slawek, mlavalle, mtomaska, ralonsoh, ykarel, jlibosva, elvira
15:01:31 hello
15:01:57 o/
15:02:08 o/
15:04:25 o/
15:04:42 Hello everyone, let's start with the topics from the previous week
15:04:48 #topic Actions from previous meetings
15:04:55 ralonsoh to move maintenance test to concurrency 1
15:04:55 o/
15:05:03 #link https://review.opendev.org/c/openstack/neutron/+/941941
15:05:07 already merged
15:05:09 thx ralonsoh
15:05:13 yw
15:05:15 ralonsoh to check fullstack issue triggered by patch 937843 or any related eventlet removal patches
15:05:32 I didn't see any relation between the merged patch and these errors
15:05:50 actually I was able to reproduce some of them (locally) without this patch
15:06:16 ohkk, are you able to consistently reproduce?
15:06:24 no, randomly only
15:06:27 we are still seeing these failures
15:06:29 okk
15:06:48 did you already report a bug for this?
15:06:54 no, I'm still checking it
15:06:57 but I'll do it today
15:07:03 k thx ralonsoh
15:07:18 #topic Stable branches
15:07:48 periodic-wise, seen one failure in 2024.1
15:07:59 a couple of patches merged last week in stable branches
15:08:00 o/
15:08:07 bcafarel, anything else to add ^
15:08:23 yes, overall a quiet week, not too many patches and they got in without trouble
15:08:36 though thanks for catching the periodic one, I had not checked that
15:09:00 k thx for the updates
15:09:14 #topic Stadium projects
15:09:23 Finally green after many weeks :)
15:09:30 lajoskatona, anything else to add
15:09:53 yes, thanks all for the attention and reviews and help
15:10:17 hope this will hold till the release :-)
15:10:23 ++
15:10:48 +1
15:11:12 #topic Rechecks
15:11:25 less activity in master this week, 3 merged with 0 rechecks
15:11:45 bare-recheck-wise it was a bit higher, 12/45
15:12:02 most of the bare rechecks were in a few patches
15:12:08 let's keep avoiding bare rechecks
15:12:27 Now let's check failures
15:12:34 #topic fullstack/functional
15:12:39 - functional wait_until_true failures as last week
15:12:48 - https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_bc6/periodic/opendev.org/openstack/neutron/master/neutron-functional/bc60662/testr_results.html
15:12:48 - https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_4e4/periodic/opendev.org/openstack/neutron/master/neutron-functional/4e4cdde/testr_results.html
15:12:48 - https://0398d1fc47f7068c1379-e807054499ce854694349e023b0ab289.ssl.cf1.rackcdn.com/periodic/opendev.org/openstack/neutron/master/neutron-fullstack-fips/a3491fa/testr_results.html
15:13:01 this is the same one we discussed above, and ralonsoh is already looking into it
15:13:04 I'll check these new ones
15:13:14 - test_securitygroup(ovs-openflow)
15:13:24 - https://227c79f6282be3c4ba5c-c685ac7d1fe5f5adbb4d0f30da5e685e.ssl.cf2.rackcdn.com/periodic/opendev.org/openstack/neutron/master/neutron-fullstack/ecd2d79/testr_results.html
15:13:25 - https://d7dc64ccb2654383b602-73af3e7fb0ca2f6ff45efb1cdea09ef7.ssl.cf2.rackcdn.com/941459/1/gate/neutron-fullstack/50b4a4c/testr_results.html
15:14:04 we are still not eventlet-free yet, right?
15:14:11 as I'm seeing the traces in the failures
15:14:29 yes, especially the tests, which are still importing eventlet
15:14:31 all of them
15:14:57 ohkk
15:15:16 you mean both functional and fullstack use eventlet now?
15:15:42 YES
15:15:44 sorry
15:15:45 yes
15:16:22 if we have working code without eventlet those tests will fail anyway, or shall we have a mixed state where the prod code is eventlet-less but, for example, functional is not?
15:16:45 this part was out of my horizon until now, sorry for the dumb question
15:16:49 the point is that the code we are pushing now should work with both
15:17:17 but we can start segregating the UTs/FT/fullstack tests
15:17:25 ralonsoh: ack, that's what I hoped based on my patch for dhcp for example
15:17:27 and create jobs with and without eventlet
15:17:51 lajoskatona, yes, exactly
15:18:00 #link https://review.opendev.org/c/openstack/neutron/+/942393
15:18:31 sorry, but if we are going to be eventlet-free, why do we need to keep testing both?
15:18:38 or will eventlet-free be next release?
15:19:03 because tests cover parts of the code that are still not migrated
15:19:22 for example: the L3 agent is still using eventlet pools
15:19:36 we also need to refactor some parts of the tests that are using eventlet methods
15:19:52 the amount of work left is still huge...
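To illustrate the kind of test refactoring discussed above, here is a minimal, hypothetical sketch of what an eventlet-free `wait_until_true`-style helper could look like, built only on the standard library (`time.monotonic`/`time.sleep` instead of eventlet primitives). All names here are illustrative; neutron's actual helper lives in `neutron.common.utils` and differs in detail.

```python
import time


class WaitTimeout(Exception):
    """Raised when the predicate never becomes true within the timeout."""


def wait_until_true(predicate, timeout=10, sleep=0.5):
    """Poll predicate until it returns True, without eventlet.

    Plain time.sleep() yields the whole thread, so this behaves the
    same whether or not the calling code is monkey-patched.
    """
    deadline = time.monotonic() + timeout
    while not predicate():
        if time.monotonic() >= deadline:
            raise WaitTimeout("predicate still false after %s seconds"
                              % timeout)
        time.sleep(sleep)


# Example: wait for a counter to reach a value.
state = {"count": 0}


def bump_and_check():
    state["count"] += 1
    return state["count"] >= 3


wait_until_true(bump_and_check, timeout=5, sleep=0.01)
print(state["count"])  # 3
```

A helper like this works under both runtimes, which is one way the same test body could run in jobs with and without eventlet during the migration.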
15:19:58 and we are close to the FF date https://releases.openstack.org/epoxy/schedule.html
15:19:59 ok, you meant to keep both versions until everything is migrated, and drop the extra job once the migration finishes
15:20:07 yes
15:20:11 thx
15:20:25 lajoskatona, yes but we don't need to remove eventlet completely in this release
15:20:51 ralonsoh: yes, that was what I wanted to say, thanks
15:21:02 ohk, coming back to the above failures, anyone willing to check them?
15:21:55 which ones? Those related to the fullstack security group tests?
15:22:02 yes
15:23:15 I think I was checking some random failures in those tests in the past, but I may be mixing it up with something else
15:23:24 and this week I don't think I will have time for it
15:24:54 I will also be out mostly this week
15:24:58 ack, let's start with opening a bug for it
15:25:01 I will open that
15:25:17 #action ykarel to open bug for fullstack security group test failures
15:26:19 test_metadata_not_provisioned_on_foreign_additional_chassis_change
15:26:25 I found https://bugs.launchpad.net/neutron/+bug/1871908 but this should be fixed
15:26:27 AssertionError: NoDatapathProvision not raised.
15:26:34 so it is probably some different issue now
15:26:35 - https://a7dbc1742e9a3f72e252-f52818fbddf6bbc6fa2f220fdc17218b.ssl.cf1.rackcdn.com/periodic/opendev.org/openstack/neutron/stable/2024.1/neutron-functional/fb76481/testr_results.html
15:26:41 or it wasn't really fixed properly
15:26:49 seen once, anyone recall seeing something similar in the past?
15:27:09 no sorry
15:27:26 slaweq, thx, when reporting a new one I will check that bug too
15:28:49 k, we can keep a watch on it for now as it was seen just once
15:29:01 test_trunk_creation_with_subports
15:29:10 failed as oslo_privsep.daemon.FailedToDropPrivileges: privsep helper command exited non-zero (96)
15:29:17 https://f0f27d8bde84b9d726b9-26f184bb59af339cfe698349cbda4177.ssl.cf2.rackcdn.com/periodic/opendev.org/openstack/neutron/master/neutron-functional/e356bb0/testr_results.html
15:30:08 this was also seen once
15:30:33 let me check the logs of this one
15:31:23 k thx
15:31:37 #action ralonsoh to check privsep related failure in functional job
15:31:50 #topic Periodic
15:32:02 https://zuul.openstack.org/builds?job_name=neutron-functional-with-pyroute2-master&project=openstack/neutron
15:32:12 some recent change in pyroute2 broke this job
15:33:11 I'll check with the latest code
15:33:35 there are many recent patches in pyroute2
15:33:38 ralonsoh, so you will also report the LP bug / pyroute2 issue for this?
15:33:43 for sure
15:33:47 yes, it seems some new branch got merged
15:34:03 or a PR with a series of commits, I didn't check that
15:34:13 I'
15:34:14 #action ralonsoh to check pyroute2 functional failures
15:34:18 I'll check patch by patch
15:34:34 since Feb 19, when we had the first error in periodic
15:35:42 k thx
15:35:56 https://zuul.openstack.org/builds?job_name=neutron-functional-with-pyroute2-master&project=openstack/neutron
15:36:04 last 2 runs failed, but with different test failures
15:36:14 yeah... I'll check that manually
15:36:31 thx for this too :)
15:37:06 sorry, wrong link
15:37:20 https://zuul.openstack.org/builds?job_name=neutron-functional-with-oslo-master&project=openstack/neutron
15:37:26 ahhh ok
15:37:46 Merged openstack/neutron-tempest-plugin master: Add scenario tests for the Vlan QinQ feature https://review.opendev.org/c/openstack/neutron-tempest-plugin/+/937778
15:37:58 let me send a testing patch
15:38:05 because these errors could be random
15:38:13 ++ yes, that should be quick and good
15:38:14 I'll confirm that with this patch
15:38:53 #action ralonsoh to confirm failure with oslo master functional job
15:38:56 - https://zuul.openstack.org/builds?job_name=openstacksdk-functional-devstack-networking&project=openstack%2Fneutron&branch=stable%2F2024.1&skip=0
15:39:06 last one, for stable/2024.1
15:39:36 2025-02-24 03:01:47.575562 | controller | openstacksdk 4.3.1.dev26 depends on typing-extensions>=4.12.0
15:39:36 2025-02-24 03:01:47.575584 | controller | The user requested (constraint) typing-extensions===4.9.0
15:39:51 that seems to be a clash in requirements
15:40:04 yes, looks like a job issue, master openstacksdk is not compatible with the stable/2024.1 upper-constraints
15:40:19 normally for such cases the job is fixed to use the stable version of openstacksdk
15:40:24 why are we using master?
15:40:28 yeah, that was my question
15:40:29 I recall fixing something similar in the past
15:40:36 I will send a patch for it
15:40:39 cool
15:40:53 the sdk team wants to use master for stable branches for as long as it can be used
15:41:07 ahhh ok
15:41:14 #action ykarel to send fix for openstacksdk job 2024.1
15:41:25 #topic Grafana
15:41:38 let's also check https://grafana.opendev.org/d/f913631585/neutron-failure-rate
15:42:37 (it is not working for me)
15:42:41 there are warnings on the page like "The panel requires angular (deprecated)", is it just on my side?
15:42:54 same
15:42:56 yeah, I was just going to ask the same about this warning
15:43:17 I still see the graphs, but there is a warning in each of them and also on top
15:43:34 ok, now it is refreshing for me
15:44:15 yes, I too just see the warning
15:44:31 I see high spikes in FTs and fullstack
15:44:47 it was a quiet week, so fewer bits in the gate queue and no failures
15:45:03 Merged openstack/ovsdbapp master: Added method and command to allow setting 'dns.options' column https://review.opendev.org/c/openstack/ovsdbapp/+/942367
15:45:18 yes, in FT and fullstack from what I saw it was patch-specific or the similar issues we discussed above
15:45:26 right
15:45:31 ralonsoh especially functional has a high failure rate, around 50-60%
15:47:50 yes
15:48:12 we should be good once we have the current known issues fixed for these
15:49:03 #topic On Demand
15:49:09 anything else you would like to raise?
15:50:01 nothing
15:50:10 nothing from me
15:50:23 nope
15:50:40 all good
15:51:39 k, in that case let's close early and give everyone 8 minutes back
15:51:43 thx everyone for joining
15:51:47 #endmeeting
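The openstacksdk requirements clash quoted from the job log above boils down to a version comparison: openstacksdk master requires typing-extensions>=4.12.0, while the stable/2024.1 upper-constraints file pins typing-extensions===4.9.0, so pip's resolver cannot satisfy both. A minimal, purely illustrative sketch (the naive parser and names below are assumptions, not how pip actually implements it):

```python
def parse(version):
    # Naive numeric parse; enough for plain X.Y.Z versions like these.
    return tuple(int(part) for part in version.split("."))


pinned = parse("4.9.0")      # from upper-constraints (===4.9.0)
minimum = parse("4.12.0")    # from openstacksdk master (>=4.12.0)

# Tuple comparison is numeric per component, so 4.9.0 < 4.12.0 even
# though "4.9.0" > "4.12.0" when compared as strings.
print(pinned < minimum)  # True -> the pinned version is below the
                         # required minimum, so the install fails
```

This is why the usual fix, as mentioned in the meeting, is to point the stable-branch job at a release of openstacksdk that still fits under that branch's constraints.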