15:00:49 #startmeeting neutron_ci
15:00:49 Meeting started Tue Feb 22 15:00:49 2022 UTC and is due to finish in 60 minutes. The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:49 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:49 The meeting name has been set to 'neutron_ci'
15:00:56 hi all
15:00:59 Grafana dashboard: https://grafana.opendev.org/d/f913631585/neutron-failure-rate?orgId=1
15:00:59 Please open now :)
15:01:04 o/
15:01:15 no video today, right?
15:01:41 mlavalle video is every other week
15:01:44 and it was last time
15:01:49 so today only irc :)
15:01:55 o/
15:01:59 o/
15:02:37 hi
15:02:48 o/
15:03:10 Grafana dashboard: https://grafana.opendev.org/d/f913631585/neutron-failure-rate?orgId=1
15:03:10 Please open now :)
15:03:22 Hi
15:03:42 ok, lets start
15:03:44 #topic Actions from previous meetings
15:03:54 mlavalle https://bugs.launchpad.net/neutron/+bug/1945283
15:04:09 hi
15:04:17 I've started looking at it. Haven't come up with a root cause yet
15:04:29 will continue working on it this week
15:04:35 ok, let me assign it to You for next week too
15:04:44 yeah, please
15:04:49 #action mlavalle will continue work on https://bugs.launchpad.net/neutron/+bug/1945283
15:04:51 thx
15:04:55 :-)
15:04:59 next one
15:05:01 slaweq to get data about rechecks in stable branches
15:05:15 I fetched that data
15:05:25 let me paste it here
15:06:09 https://paste.opendev.org/show/bWpvJRHMisJAvcenIY1Y/
15:06:44 in general there were weeks where numbers for e.g. wallaby or victoria were higher
15:06:53 we had less patches
15:06:59 but it's not very bad
15:07:05 right, quite good
15:07:09 and there were some recent breakages (like pyroute bump in wallaby)
15:07:20 ralonsoh but those are "average rechecks per week"
15:07:27 yes, I know
15:07:41 but overall less than 1.xx nice
15:08:01 but the higher the number of patches is, usually the higher the number of rechecks
15:08:05 generally, as with master could be better, but it's not the nightmare
15:09:10 and overall does not get too bad as we go to older branches
15:09:27 good news before the release :-)
15:09:27 yes
15:10:13 ok, so speaking about stable branches :)
15:10:15 #topic Stable branches
15:10:23 any new issues to discuss there?
15:11:37 none that I am aware of
15:11:55 that's great news :)
15:12:07 indeed I will never complain about "no news" :)
15:12:40 LoL
15:12:50 ok, so I think we can move on
15:12:54 next topic
15:12:57 #topic Stadium projects
15:13:04 no news :-)
15:13:14 ++
15:13:25 I checked this morning and the ones I checked are green
15:13:29 I like that :)
15:13:59 +1
15:13:59 thx lajoskatona
15:14:12 ok, so lets move on to the next topic
15:14:12 #topic Grafana
15:14:15 https://grafana.opendev.org/d/f913631585/neutron-failure-rate
15:15:31 I think ykarel did some update to our dashboard today
15:15:38 ykarel do You have link to Your patch?
15:15:52 slaweq, just sent https://review.opendev.org/c/openstack/project-config/+/830440
15:16:04 thx
15:16:20 also periodic jobs are not running since 15th on master branch
15:16:39 for that sent https://review.opendev.org/c/openstack/neutron/+/830371
15:17:02 nice
15:17:04 I just +2 this patch
15:17:09 thx for it
15:17:40 ack Thanks
15:18:07 from other things related to grafana, it seems that some data from the last days is missing
15:18:20 but I don't think we can do anything with it
15:18:26 I hope it will be better in next days
15:18:48 as for rechecks in master branch, here are the stats:
15:18:50 +---------+----------+... (full message at https://matrix.org/_matrix/media/r0/download/matrix.org/opdMcATWObwJnLwbGuBZmVPE)
15:19:09 we are going a bit higher last 2 or 3 weeks
15:19:20 now is 0??
15:19:29 but I know there were e.g. issue with some fullstack local_ip test which is now fixed by obondarev
15:19:52 ralonsoh it's this week, maybe there weren't many patches in gate this week :)
15:21:12 any other questions/comments regarding grafana?
15:22:18 ok, I guess that this silence means "no" :)
15:22:22 :)
15:22:23 #topic fullstack/functional
15:22:44 I looked at the patches with failed tests today and I found only 2 issues
15:22:47 one in functional
15:22:56 Network interface not found in namespace (again)
15:23:07 I opened bug https://bugs.launchpad.net/neutron/+bug/1961740
15:23:12 https://89b1b88fa362b409cfb1-2a70ac574f4ba34d12afc72df211f1b3.ssl.cf5.rackcdn.com/828687/1/gate/neutron-functional-with-uwsgi/a5b844b/testr_results.html
15:23:22 I have it on my todo list to check
15:23:35 but if there is anyone who would like to check it, feel free to take it
15:28:07 ralonsoh I wonder if it's not the same issue as we have seen in one d/s bug
15:28:15 which one?
15:28:19 I added https://review.opendev.org/c/openstack/ovsdbapp/+/825727 to hopefully understand it better
15:28:36 but in logs of the functional test there are no such logs so it's hard to say really
15:29:33 we just discussed it but there was no bug for it, am I right?
15:29:33 but looking at timestamps, it's probably something different
15:29:52 lajoskatona yes
15:33:13 I will check journal log from that test to see if there is maybe something interesting there
15:33:46 #action slaweq to check missing devices in namespace, see https://bugs.launchpad.net/neutron/+bug/1961740 for details
15:33:56 now fullstack
15:34:01 https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_99b/828022/3/gate/neutron-fullstack-with-uwsgi/99b5482/testr_results.html
15:34:06 this is issue with local_ip test
15:34:18 but IIUC it should be fixed with https://review.opendev.org/c/openstack/neutron/+/829659
15:34:22 thx obondarev for the fix
15:34:56 and it is merged so we are on the safe side :-)
15:35:07 we should be :)
15:36:07 hopefully :) but not 100%
15:36:25 anything else regarding functional/fullstack jobs for today?
15:36:32 or can we move on?
15:37:57 ok, so lets move on
15:38:02 #topic Periodic
15:38:09 generally periodic jobs look ok
15:38:19 thx ykarel for fix of the tripleo jobs there
15:38:39 from other things I have one patch to review https://review.opendev.org/c/openstack/neutron/+/827302
15:39:10 and I have one more question regarding periodic jobs
15:39:33 wdyt if we would add fips related jobs, especially functional and scenario to the periodic queue?
15:39:39 those jobs are so far in the experimental queue
15:39:50 I'm ok with this
15:40:08 but I thought to maybe have them in periodic to have more data about how stable they are
15:40:10 will you remove them from experimental queue?
15:40:16 and if we will see any issues with them
15:40:35 ralonsoh yes, I thought to simply move them from experimental to periodic now
15:40:42 perfect for me
15:40:45 +1
15:40:46 +1
15:40:48 +1
15:40:51 +1
15:40:58 thx, so I will do it
15:41:00 joining the +1 club :)
15:41:00 the plan is to later run it for every patch in check and gate queue?
15:41:08 #action slaweq to move fips jobs to periodic queue
15:41:39 lajoskatona that would be ideal at some point but given the number of jobs which we already have, I don't think it's a topic for now
15:41:51 maybe if that will become official community goal at some point
15:41:59 and then it may be required to have such job
15:42:40 ok, agree
15:42:51 that's basically all what I had for today
15:43:03 anything else related to our CI what You want to discuss today?
15:44:12 if not, I think I can give You 15 minutes back :)
15:44:21 thx for attending the meeting and have a great week
15:44:22 yay \o/
15:44:25 see You online
15:44:26 o/
15:44:26 \o/
15:44:26 bye
15:44:29 bye
15:44:31 o/
15:44:31 #endmeeting
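[Editor's note] The #action to move the fips jobs from the experimental to the periodic Zuul queue would normally be a small change to the project's Zuul pipeline definitions. The fragment below is only an illustrative sketch of what such a change can look like in a `zuul.d/project.yaml`; the job names are placeholders, not the actual neutron fips job names, and the real patch may live in a different file.

```
# Hypothetical fragment of zuul.d/project.yaml (job names are placeholders).
# Moving a job means adding it under the periodic pipeline and deleting the
# same entry from the experimental pipeline in the same patch.
- project:
    periodic:
      jobs:
        - neutron-functional-fips          # placeholder name
        - neutron-tempest-scenario-fips    # placeholder name
```

Periodic jobs run on a timer (typically daily) rather than on demand, which is what provides the "more data about how stable they are" mentioned in the discussion, without adding load to the check and gate queues.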
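[Editor's note] The "average rechecks per week" numbers discussed earlier in the meeting can be computed with a few lines of code. This is a minimal hypothetical sketch, not the actual script slaweq used: it assumes the per-patch recheck counts have already been extracted (e.g. from Gerrit comments) into `(week_label, recheck_count)` pairs, and simply averages per week.

```python
from collections import defaultdict

def average_rechecks_per_week(events):
    """events: iterable of (week_label, recheck_count) pairs,
    one pair per merged patch. Returns {week_label: average rechecks}."""
    totals = defaultdict(int)   # total rechecks seen per week
    patches = defaultdict(int)  # number of patches merged per week
    for week, rechecks in events:
        totals[week] += rechecks
        patches[week] += 1
    return {week: totals[week] / patches[week] for week in totals}

# Example with made-up numbers (the real data is in the paste linked above):
stats = average_rechecks_per_week([
    ("week-07", 0), ("week-07", 3), ("week-07", 1),
    ("week-08", 0), ("week-08", 1),
])
# week-07 averages (0+3+1)/3 rechecks per patch; week-08 averages 0.5
```

This also illustrates the caveat raised in the meeting: an average near 0 for the current week may just mean few patches went through the gate that week, so the patch count per week is worth reporting alongside the average.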