14:00:05 #startmeeting qa
14:00:05 Meeting started Tue Jul 27 14:00:05 2021 UTC and is due to finish in 60 minutes. The chair is kopecmartin. Information about MeetBot at http://wiki.debian.org/MeetBot.
14:00:05 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
14:00:05 The meeting name has been set to 'qa'
14:00:17 #link https://wiki.openstack.org/wiki/Meetings/QATeamMeeting#Agenda_for_next_Office_hours
14:00:19 agenda ^^^
14:00:25 o/
14:00:41 o/
14:02:15 hi all
14:02:21 let's start
14:02:27 #topic Announcement and Action Item
14:02:35 I'll be on vacation the whole next week
14:02:45 o/
14:02:56 i won't have access to email, irc, anything (after a very long time :D )
14:03:13 gmann: can you moderate the meeting?
14:03:19 or should we cancel?
14:03:21 kopecmartin: sure
14:03:52 gmann: thanks!
14:04:06 that's all from my side on announcements
14:04:36 #topic Xena Priority Items progress
14:04:41 any updates?
14:05:30 #link https://etherpad.opendev.org/p/qa-xena-priority
14:05:36 priority items etherpad ^^
14:05:42 started tempest removal from u-c for testing tempest plugins with the master version
14:05:48 but need more testing for this
14:06:24 on system scope in devstack/tempest: a few config settings for the scope are merged for a few services
14:06:57 I am debugging nova to move it completely to system/project scope, first in devstack and then tempest.
14:07:20 in some n-v job, and all the rest of the existing jobs keep running the same way as they are currently
14:07:41 that is all from me
14:07:54 thank you gmann
14:09:27 #topic OpenStack Events Updates and Planning
14:09:38 #link https://etherpad.opendev.org/p/qa-yoga-ptg
14:09:52 we don't have any topics yet, although there's plenty of time to add some
14:10:00 and i'm sure we will add some over time
14:10:21 yeah, we should start filling it even though we have time
14:10:42 I will add the system scope thing, to continue the discussion or check status
14:10:49 the time slots for qa are picked, seems we're on track for now (only the topics to discuss are missing)
14:10:59 gmann: good, thanks
14:11:58 #topic Gate Status Checks
14:12:02 any issues?
14:12:24 i've noticed there was an issue with the docs job, however, that's fixed already
14:12:39 yeah
14:13:07 a few projects might be failing since cinder v2 was removed from python-cinderclient, but it is an easy fix for them to move to v3
14:13:29 #topic Periodic jobs Status Checks
14:13:36 #link https://zuul.openstack.org/builds?job_name=tempest-full-victoria-py3&job_name=tempest-full-ussuri-py3&job_name=tempest-full-train-py3&pipeline=periodic-stable
14:13:43 #link https://zuul.openstack.org/builds?job_name=tempest-all&job_name=tempest-full-oslo-master&pipeline=periodic
14:13:48 periodic jobs are green
14:14:12 gmann: oh, good
14:14:35 tempest/devstack is all good on that as it was taken care of in advance
14:15:34 there is one issue with run-tempest-26, although i haven't figured out why yet
14:15:47 but this job is failing since that role is used in stein
14:16:03 and i think it's because the environment is recreated: https://zuul.opendev.org/t/openstack/build/d133896d1e114c9cacfb5aee022dd3e2/log/job-output.txt#37048
14:16:05 #link https://zuul.opendev.org/t/openstack/build/d133896d1e114c9cacfb5aee022dd3e2/log/job-output.txt#37048
14:16:10 yeah, i was trying to debug that
14:16:17 which is the only difference
14:16:26 seems a constraints mismatch is forcing them to recreate the venv
14:16:33 I will check after the meeting and see
14:16:44 but is it happening in all stable branches with plugins?
14:17:06 I tested devstack/tempest stable jobs but i think i did not test stable with plugins
14:17:18 i've seen that only in watcher-tempest-plugin
14:17:23 k
14:17:40 however i didn't check others, so it may happen in some other plugin too
14:18:00 I will add a testing patch and see
14:18:09 I am sure it is a constraints thing
14:19:23 makes sense, i didn't find anything wrong in the role itself
14:19:25 #topic Sub Teams highlights
14:19:35 #link https://review.openstack.org/#/q/project:openstack/tempest+status:open
14:19:35 #link https://review.openstack.org/#/q/project:openstack/patrole+status:open
14:19:35 #link https://review.openstack.org/#/q/project:openstack/devstack+status:open
14:19:37 #link https://review.openstack.org/#/q/project:openstack/grenade+status:open
14:19:39 #link https://review.opendev.org/#/q/project:openstack/hacking+status:open
14:19:41 any updates from any subteam?
14:19:56 what will we do with patrole?
14:20:02 I started debugging the patrole failure. did a recheck for a fresh failure and will continue today
14:20:21 kopecmartin: let's see if I can make patrole green by this week
14:20:26 thank you for that!
14:20:37 there was some fedora on devstack stuff last week
14:20:40 or rather devstack on fedora
14:22:03 and what's the status? any help needed?
14:22:05 ah no, it was yesterday
14:22:06 https://review.opendev.org/c/openstack/devstack/+/795640
14:22:13 https://review.opendev.org/c/openstack/devstack/+/802223
14:22:19 * kopecmartin checking
14:22:34 should we drop the old jobs and definitions?
14:22:40 I think that's what we did the last time
14:23:02 yup
14:23:03 https://review.opendev.org/c/openstack/devstack/+/755449
14:23:40 so I guess let's remove again?
14:23:44 fedora only on master
14:23:54 we could document that somewhere I guess...
14:24:12 yoctozepto: these patches are moving the current jobs, right, not adding new ones?
14:24:17 yeah
14:24:21 as always
14:24:27 yeah
14:24:31 yeah, let's do that again .. if fedora 32 is eol, there is no sense in running it on older releases
14:24:41 'should we drop the old jobs and definitions?' I didn't get this
14:24:53 on stable
14:24:55 ?
14:24:58 gmann: we need to go to the stable branches and drop the job and nodeset
14:24:59 yes
14:25:02 ohk
14:25:20 I will do it then
14:25:24 thanks for confirming
14:25:31 we usually do not update the version in stable
14:25:37 yoctozepto: thanks
14:25:44 yeah, we could just remove fedora on branching ;-)
14:25:46 but if they start failing then yes, removal is the only option
14:25:52 +1
14:27:18 moving on, i don't see any patches with priority +1 nor +2
14:27:19 #link https://review.opendev.org/q/label:Review-Priority%253D%252B1+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade+OR+project:openstack/hacking)
14:27:25 #link https://review.opendev.org/q/label:Review-Priority%253D%252B2+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade+OR+project:openstack/hacking)
14:27:30 #topic Open Discussion
14:27:35 anything from anyone?
14:27:50 not from me
14:29:15 nothing from me
14:29:17 #topic Bug Triage
14:29:22 #link https://etherpad.opendev.org/p/qa-bug-triage-xena
14:29:27 numbers are recorded as always
14:29:39 there is a new bug on devstack
14:29:40 #link https://bugs.launchpad.net/devstack/+bug/1938151
14:29:56 which might be caused by a recently merged patch
14:30:03 #link https://review.opendev.org/c/openstack/devstack/+/788056
14:31:30 interesting
14:32:25 I think that is what brtknr mentioned in the channel too
14:32:47 gmann, kopecmartin, I don't have many updates, but just to say, patches are continuously failing on patrole
14:32:49 is it again the size quota?
14:33:03 soniya29|rover: yeah, i will look into that this week
14:33:08 dansmith: ^^
14:33:11 gmann: yep, it was mentioned
14:33:13 multiple rechecks are also not helping them
14:33:14 I have not checked yet
14:33:52 log link?
14:34:08 dansmith: https://bugs.launchpad.net/devstack/+bug/1938151
14:34:29 okay, will look
14:34:42 soniya29|rover: yeah, there is an issue somewhere, i was able to get a passing review if i skipped a few tests
14:35:12 thanks
14:36:21 that's all from my side regarding bug triage
14:36:39 anything else to discuss?
14:36:58 nothing from me
14:37:11 let's close the office hour then
14:37:18 thank you all for attending
14:37:27 #endmeeting