15:00:15 #startmeeting qa
15:00:15 Meeting started Tue Jun 11 15:00:15 2024 UTC and is due to finish in 60 minutes. The chair is kopecmartin. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:15 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:15 The meeting name has been set to 'qa'
15:00:20 #link https://wiki.openstack.org/wiki/Meetings/QATeamMeeting#Agenda_for_next_Office_hours
15:00:32 \o
15:00:36 o/
15:02:13 o/
15:03:24 let's get through the agenda really quickly, as it seems not much has happened
15:03:33 no announcements
15:03:42 #topic Gate Status Checks
15:03:43 #link https://review.opendev.org/q/label:Review-Priority%253D%252B2+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:03:43 #topic Sub Teams highlights
15:03:44 Changes with Review-Priority == +1
15:03:46 #link https://review.opendev.org/q/label:Review-Priority%253D%252B1+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:04:01 anything to bring up for review?
15:04:57 nothing from my side
15:05:49 #topic Periodic jobs Status Checks
15:05:49 Periodic stable full: https://zuul.openstack.org/builds?pipeline=periodic-stable&job_name=tempest-full-2023-1&job_name=tempest-full-2023-2&job_name=tempest-full-2024-1
15:05:49 Periodic stable slow: https://zuul.openstack.org/builds?job_name=tempest-slow-2024-1&job_name=tempest-slow-2023-2&job_name=tempest-slow-2023-1
15:05:51 Periodic extra tests: https://zuul.openstack.org/builds?job_name=tempest-full-2024-1-extra-tests&job_name=tempest-full-2023-2-extra-tests&job_name=tempest-full-2023-1-extra-tests
15:05:53 Periodic master: https://zuul.openstack.org/builds?project=openstack%2Ftempest&project=openstack%2Fdevstack&pipeline=periodic
15:07:41 all seems good here \o
15:07:43 \o
15:07:45 \o/
15:07:46 hmm, checking some of the timeouts, they seem to be on rax, similar to what sean-k-mooney noted yesterday
15:07:46 :D
15:08:40 i haven't gotten around to trying to compute the average devstack times for the different providers yet, but i was briefly trying to do that this morning
15:09:15 you mean the devstack-no-tls-proxy and tempest-full-centos-9-stream jobs? yeah ..
it doesn't happen often though
15:10:43 the tests passed, so it timed out on a step after that
15:10:55 sean-k-mooney: just looking at the opensearch output, the variance seemed to be very high for all providers, so maybe the average doesn't tell us much if just the outliers are worse
15:11:38 right, i wanted to try to compute the variance and/or exclude outliers more than 1 sigma out
15:11:48 with that said, i really don't know if i'll have time to do that or not
15:12:35 this is the slightly cleaned up initial data i was going to work from, but i realized i also need to look at the test time, not just devstack
15:12:37 https://termbin.com/2wso
15:13:47 yes, I was looking at similar data with message:"Sum of execute time for each test:", but you get two of those per run
15:14:28 we can probably just add them
15:14:57 i was originally trying to do this with awk and realized a small python script would be much easier
15:15:21 but i haven't started on that yet
15:16:28 time is always an issue, even for timings ;)
15:17:01 anyway, we can continue to discuss after the meeting if needed
15:17:41 :)
15:18:08 all right, moving on to periodic jobs
15:18:15 *distros
15:18:17 #topic Distros check
15:18:17 Centos 9: https://zuul.openstack.org/builds?job_name=tempest-full-centos-9-stream&job_name=devstack-platform-centos-9-stream&skip=0
15:18:17 Debian: https://zuul.openstack.org/builds?job_name=devstack-platform-debian-bullseye&job_name=devstack-platform-debian-bookworm&skip=0
15:18:19 Rocky: https://zuul.openstack.org/builds?job_name=devstack-platform-rocky-blue-onyx
15:18:21 openEuler: https://zuul.openstack.org/builds?job_name=devstack-platform-openEuler-22.03-ovn-source&job_name=devstack-platform-openEuler-22.03-ovs&skip=0
15:18:23 jammy: https://zuul.opendev.org/t/openstack/builds?job_name=devstack-platform-ubuntu-jammy-ovn-source&job_name=devstack-platform-ubuntu-jammy-ovs&skip=0
15:19:08 centos 9 times out on "TASK [upload-logs-swift : Upload logs to swift]", as do a few of the other jobs
as was discussed .. but the tests pass
15:19:44 openeuler fails though; well, the log also ends with uploading logs to swift ... but the installation fails
15:20:14 * kopecmartin scrolling to see some useful traceback
15:21:20 seems like it fails at "Started Devstack devstack@n-novnc-cell1.service"
15:21:32 nova-novncproxy[92997]: Can not find html/js files at /usr/share/novnc.
15:21:41 #link https://b8364cd60eb1cfb73569-2426a842487e6155337d0caa8d4dcb05.ssl.cf5.rackcdn.com/periodic-weekly/opendev.org/openstack/devstack/master/devstack-platform-openEuler-22.03-ovs/9d3eae6/job-output.txt
15:24:27 hmm, isn't 22.03 old? the latest version seems to be 24.03
15:25:03 is it even worth debugging? shouldn't we update the version first?
15:25:54 iirc it didn't work without some adaptation work
15:26:46 like building the image in nodepool and updating mirrors. we might not have space to mirror both versions in parallel, either
15:29:03 ok, i'll file an LP bug to track the errors and the question of whether we want to fix, remove, or bump the version
15:29:15 #topic Open Discussion
15:29:15 anything for the open discussion?
15:30:19 I have something regarding the bug #link https://bugs.launchpad.net/tempest/+bug/1583220
15:30:46 nice, do you have a patch?
15:31:05 or any findings?
15:31:36 I do not have a patch yet. I am still investigating but believe I have narrowed down where the issue lies. #link https://opendev.org/openstack/tempest/src/branch/master/tempest/test_discover/plugins.py
15:32:11 I am continuing the investigation but wanted to give an update since my last time in the meeting
15:32:52 that's good news, thank you for investing time into that
15:33:39 I think we need to narrow down the particular line where we can catch the exception :). But this sounds like progress!
15:33:44 feel free to also write a comment in the LP for the record and future reference
15:35:04 Will do. I believe we found the general location of where the error occurs and will be adding exceptions to test
15:36:10 +1
15:36:10 serami +1
15:36:30 if there isn't anything else, let's move to the last item on the agenda, bug numbers
15:36:36 that will be quick
15:36:42 #topic Bug Triage
15:36:42 #link https://etherpad.openstack.org/p/qa-bug-triage-dalmatian
15:37:01 it's been quiet on this front, nothing new
15:37:37 aand that's it; if there isn't anything else, let's close the office hour
15:37:48 just noting that there are a lot of zuul config errors due to the devstack-gate retirement. any help with cleaning those up would be nice
15:38:28 i could send a patch or two, is it tracked anywhere?
15:39:00 just via https://zuul.opendev.org/t/openstack/config-errors?severity=error&skip=0 so far
15:39:44 that will help!
15:40:22 seems like an easy upstream contribution, lpiwowar, we could ask our interns to help too
15:40:55 I would force-merge patches that don't get timely reviews
15:41:07 aand i have a procrastination activity for those long calls :D
15:41:10 great, thanks
15:41:23 kopecmartin, ok, sounds good, will take a look!
15:41:37 all right, thank you all, see you around
15:41:40 o/
15:41:46 so best to keep a common topic like https://review.opendev.org/q/topic:%22retire-devstack-gate%22
15:41:49 thx all
15:42:00 perfect, will do
15:42:09 #endmeeting
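
Editor's note (not part of the log): the per-provider timing analysis discussed above — averaging devstack run times per provider, computing the spread, and excluding outliers more than 1 sigma from the mean — could be sketched with a small Python script like the one below. The input format (one "provider seconds" pair per line) and the sample numbers are assumptions for illustration; the real layout of the termbin paste may differ.

```python
import statistics
from collections import defaultdict

# Hypothetical input: one "provider seconds" pair per line, e.g. as
# extracted from devstack job logs. Replace RAW with the real data.
RAW = """\
rax-dfw 2710
rax-dfw 2650
rax-dfw 5900
ovh-bhs1 1800
ovh-bhs1 1750
ovh-bhs1 1820
"""

def summarize(raw: str) -> dict:
    """Per-provider mean and stddev, plus a mean with >1-sigma outliers dropped."""
    samples = defaultdict(list)
    for line in raw.splitlines():
        provider, seconds = line.split()
        samples[provider].append(float(seconds))

    result = {}
    for provider, times in samples.items():
        mean = statistics.mean(times)
        sigma = statistics.pstdev(times)
        # keep only the samples within 1 sigma of the mean
        kept = [t for t in times if abs(t - mean) <= sigma]
        result[provider] = {
            "runs": len(times),
            "mean": mean,
            "stddev": sigma,
            "trimmed_mean": statistics.mean(kept) if kept else mean,
        }
    return result

if __name__ == "__main__":
    for provider, stats in summarize(RAW).items():
        print(f"{provider}: {stats}")
```

With the sample data, the one 5900-second rax-dfw run falls outside 1 sigma and is excluded from the trimmed mean, which is the effect being discussed: a plain average hides whether a provider is uniformly slow or just has bad outliers.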