15:00:10 <kopecmartin> #startmeeting qa
15:00:10 <opendevmeet> Meeting started Tue May 30 15:00:10 2023 UTC and is due to finish in 60 minutes.  The chair is kopecmartin. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:10 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:10 <opendevmeet> The meeting name has been set to 'qa'
15:00:16 <kopecmartin> #link https://wiki.openstack.org/wiki/Meetings/QATeamMeeting#Agenda_for_next_Office_hours
15:00:18 <kopecmartin> agenda ^^
15:05:54 <frickler> o/ sorry got distracted
15:06:21 <kopecmartin> np, me too
15:06:41 <kopecmartin> let's start
15:07:46 <kopecmartin> #topic Announcement and Action Item (Optional)
15:08:04 <kopecmartin> there's gonna be ptg in person in 2 weeks during the summit in vancouver
15:09:21 <kopecmartin> #topic Bobcat Priority Items progress
15:09:33 <kopecmartin> #link https://etherpad.opendev.org/p/qa-bobcat-priority
15:10:15 <frickler> the cirros update was done I think?
15:10:30 <frickler> https://review.opendev.org/c/openstack/devstack/+/881437
15:10:58 <kopecmartin> correct
15:11:00 <kopecmartin> it was
15:11:13 <frickler> for the venv/bookworm patch I pushed an update earlier today and it has some newish failures, need to look into those
15:11:57 <frickler> #link https://review.opendev.org/c/openstack/devstack/+/558930
15:12:36 <kopecmartin> ack, thanks
15:12:45 <frickler> for cirros there are new versions pending to be released with updated kernels
15:13:00 <frickler> but switching to those should hopefully go much smoother
15:13:29 <frickler> #link https://github.com/cirros-dev/cirros/issues/102
15:14:00 <kopecmartin> \o/ at the end it wasn't that bad with the dhcp client change, it just took me forever to find time to propose the changes :/
15:15:09 <frickler> for the venv patch maybe someone with more rocky/centos experience could look into the failures on those distros
15:19:01 <frickler> seems there are no volunteers around, so we can go on ;)
15:19:06 <kopecmartin> i'll try, at least i'll try to ping someone who can :)
15:19:15 <frickler> yay
15:19:27 <kopecmartin> yeah, sorry, i had to step out to a quick call ..
15:19:29 <kopecmartin> let's move on
15:19:48 <kopecmartin> #topic OpenStack Events Updates and Planning
15:20:13 <kopecmartin> the etherpads for the upcoming ptg in person in vancouver were autogenerated
15:20:14 <kopecmartin> #link https://ptg.opendev.org/etherpads.html
15:20:26 <kopecmartin> here is ours
15:20:27 <kopecmartin> #link https://etherpad.opendev.org/p/vancouver-june2023-qa
15:20:46 <kopecmartin> if anyone has anything to discuss, feel free to add it there in advance
15:21:10 <kopecmartin> AI for me to promote this on ML and add a structure to the etherpad
15:21:34 <kopecmartin> #topic Gate Status Checks
15:21:40 <kopecmartin> #link https://review.opendev.org/q/label:Review-Priority%253D%252B2+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:21:56 <kopecmartin> 2 reviews there
15:21:57 <frickler> two config error fixes, please approve
15:23:17 * kopecmartin looking
15:23:40 <frickler> also CI for those branches seems pretty broken
15:24:09 <frickler> but I guess we cannot simply retire them while others still want to run jobs
15:24:24 <kopecmartin> uff, yeah, i see we need a lot of non votings
15:24:33 <kopecmartin> yeah
15:25:00 <kopecmartin> approved
15:25:47 <kopecmartin> #topic Bare rechecks
15:25:51 <kopecmartin> #link https://etherpad.opendev.org/p/recheck-weekly-summary
15:25:56 <kopecmartin> we're doing good
15:26:08 <kopecmartin> QA has bare recheck rate below 13% over the last 90 days
15:26:25 <kopecmartin> #topic Periodic jobs Status Checks
15:26:25 <kopecmartin> periodic stable full
15:26:25 <kopecmartin> #link https://zuul.openstack.org/builds?pipeline=periodic-stable&job_name=tempest-full-yoga&job_name=tempest-full-xena&job_name=tempest-full-zed&job_name=tempest-full-2023-1
15:26:27 <kopecmartin> periodic stable slow
15:26:29 <kopecmartin> #link https://zuul.openstack.org/builds?job_name=tempest-slow-2023-1&job_name=tempest-slow-zed&job_name=tempest-slow-yoga&job_name=tempest-slow-xena
15:26:31 <kopecmartin> periodic extra tests
15:26:33 <kopecmartin> #link https://zuul.openstack.org/builds?job_name=tempest-full-2023-1-extra-tests&job_name=tempest-full-zed-extra-tests&job_name=tempest-full-yoga-extra-tests&job_name=tempest-full-xena-extra-tests
15:26:35 <kopecmartin> periodic master
15:26:37 <kopecmartin> #link https://zuul.openstack.org/builds?project=openstack%2Ftempest&project=openstack%2Fdevstack&pipeline=periodic
15:27:24 <kopecmartin> the centos 9 stream fips job got fixed
15:27:42 <kopecmartin> by this
15:27:45 <kopecmartin> #link https://review.opendev.org/c/openstack/devstack/+/884277
15:28:06 <frickler> slow-2023-1 looks unstable. and some errors on master today
15:28:34 <kopecmartin> yeah, one post failure one timeout, let's monitor that one
15:29:12 <kopecmartin> tempest-full-test-account-no-admin-py3 and one other got broken by enabling enforce scope in glance and nova by default in devstack
15:29:14 <kopecmartin> #link https://bugs.launchpad.net/tempest/+bug/2020859
15:29:22 <kopecmartin> #link https://bugs.launchpad.net/tempest/+bug/2020860
15:30:18 <kopecmartin> the other jobs (devstack-no-tls-proxy and tempest-slow-parallel) failed due to a timeout in some requests in a few tests
15:30:24 <kopecmartin> :/
15:31:33 <kopecmartin> let's see if it's gonna repeat
15:31:50 <kopecmartin> #topic Distros check
15:31:51 <kopecmartin> cs-9
15:31:52 <kopecmartin> #link https://zuul.openstack.org/builds?job_name=tempest-full-centos-9-stream&job_name=devstack-platform-centos-9-stream&skip=0
15:31:54 <kopecmartin> fedora
15:32:01 <kopecmartin> #link https://zuul.openstack.org/builds?job_name=devstack-platform-fedora-latest&skip=0
15:32:01 <kopecmartin> debian
15:32:01 <kopecmartin> #link https://zuul.openstack.org/builds?job_name=devstack-platform-debian-bullseye&skip=0
15:32:02 <kopecmartin> focal
15:32:04 <kopecmartin> #link https://zuul.opendev.org/t/openstack/builds?job_name=devstack-platform-ubuntu-focal&skip=0
15:32:06 <kopecmartin> rocky
15:32:08 <kopecmartin> #link https://zuul.openstack.org/builds?job_name=devstack-platform-rocky-blue-onyx
15:32:10 <kopecmartin> openEuler
15:32:12 <kopecmartin> #link https://zuul.openstack.org/builds?job_name=devstack-platform-openEuler-22.03-ovn-source&job_name=devstack-platform-openEuler-22.03-ovs&skip=0
15:32:40 <kopecmartin> all looks quite good, i expected worse :D
15:33:39 <kopecmartin> #topic Sub Teams highlights
15:33:39 <kopecmartin> Changes with Review-Priority == +1
15:33:41 <lpiwowar> o/
15:33:46 <kopecmartin> #link https://review.opendev.org/q/label:Review-Priority%253D%252B1+status:open+(project:openstack/tempest+OR+project:openstack/patrole+OR+project:openstack/devstack+OR+project:openstack/grenade)
15:33:58 <kopecmartin> nothing there
15:34:00 <kopecmartin> #topic Open Discussion
15:34:04 <kopecmartin> anything for the open discussion?
15:34:28 <frickler> yes
15:34:47 <frickler> I spoke with elodilles briefly about the pending stable/newton cleanups
15:35:12 <gmann> for grenade also ?
15:35:23 <frickler> grenade and devstack
15:35:26 <gmann> cool
15:35:36 <frickler> not sure yet how to proceed
15:36:05 <frickler> those eols happened on the edge of transitioning from manual branch deletion to automated
15:36:35 <frickler> so either a gerrit admin can clean up manually or we can try to run the automation for those still
15:37:16 <gmann> if it need more work on automation scripts may be asking gerrit admin to remove manually will be easy
15:38:25 <frickler> I'd ask myself for that, but I'd still like to have the approval from the release team first
15:38:33 <kopecmartin> i'd be also for the option that requires less input from our side
15:38:48 <gmann> ++
15:39:55 <frickler> you can see the history quite nicely here https://review.opendev.org/admin/repos/openstack/devstack,tags
15:41:44 <frickler> anyway, waiting for feedback for now
15:41:56 <frickler> and that's it from me
15:42:09 <gmann> frickler: thanks for handling it.
15:42:27 <lpiwowar> I'd like to mention the failure in the test-account job that started to appear when we enabled the enforce scope in devstack for nova. I want to work on fixing it. Currently there seems to be an issue with getting users from the same project and distinguishing users with member and reader role - for pre-provisioned credentials. I do not know
15:42:27 <lpiwowar> whether someone has an idea of how I should proceed or what I should be aware of. I have a rough idea but someone might know more. -- https://review.opendev.org/c/openstack/tempest/+/884509
15:43:08 <gmann> yeah pre provisioned account if not yet ready for new RBAC, it is not small change
15:43:28 <gmann> I started looking into those and found the issues but will work this week to propose something
15:43:40 <frickler> so disable scope in that job for now?
15:43:42 <gmann> *is not yet ready
15:43:57 <gmann> frickler: yeah, that is one option.
15:44:20 <gmann> until we get pre provisioned account setup ready
15:44:38 <gmann> I can propose the disable today so that the job can continue running
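[Editor's note: the temporary disable gmann describes could look roughly like the sketch below in the job's devstack settings. The variable names (`NOVA_ENFORCE_SCOPE`, `GLANCE_ENFORCE_SCOPE`) are an assumption about devstack's enforce-scope toggles and are not confirmed in the log.]

```ini
# Hypothetical local.conf fragment — assumes devstack exposes
# per-service enforce-scope toggles; verify against the actual
# devstack change before relying on these names.
[[local|localrc]]
NOVA_ENFORCE_SCOPE=False
GLANCE_ENFORCE_SCOPE=False
```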
15:44:56 <lpiwowar> gmann: ack, sounds good to me:)
15:45:21 <lpiwowar> There seems to be a lot of work with the pre-provisioned credentials.
15:45:29 <gmann> and once this gets fixed we need to make these jobs voting otherwise they are broken many times
15:45:40 <gmann> yeah and no admin one also
15:45:45 <lpiwowar> gmann: +1
15:46:30 <lpiwowar> I cleaned up some object_storage tests that were causing failure in this job. But once I finished, this new error started appearing.
15:46:50 <kopecmartin> #link https://bugs.launchpad.net/tempest/+bug/1996624
15:46:56 <kopecmartin> #link https://review.opendev.org/c/openstack/tempest/+/881575
15:47:16 <kopecmartin> the new errors are unrelated to the fix and LP
15:47:33 <gmann> k, will check those today or tomorrow
15:47:34 <kopecmartin> lpiwowar fixed the bug, thank you, once the gate is stable/fixed, we can proceed
15:47:50 <lpiwowar> kopecmartin: ack
15:47:56 <kopecmartin> there is one more interesting bug
15:47:57 <gmann> ++ thanks lpiwowar
15:47:59 <kopecmartin> #link https://bugs.launchpad.net/tempest/+bug/2020659
15:48:18 <kopecmartin> which was caused, somehow, by this
15:48:20 <kopecmartin> #link https://review.opendev.org/c/openstack/tempest/+/881675
15:48:28 <kopecmartin> that patch uncovered something broken in the logic
15:48:40 <kopecmartin> there is a fix in progress
15:48:42 <kopecmartin> #link https://review.opendev.org/c/openstack/tempest/+/884584
15:49:00 <gmann> it did not fail in that change ?
15:49:08 <kopecmartin> no
15:49:17 <kopecmartin> it failed in other jobs in totally different project
15:49:31 <kopecmartin> in order to fail you had to set this
15:49:31 <kopecmartin> [network].floating_network_name = public
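[Editor's note: for context, the option kopecmartin quotes lives in tempest.conf; a minimal fragment reproducing the failing configuration might look like the sketch below. "public" is assumed here to be the external network name as devstack creates it by default.]

```ini
# tempest.conf fragment (illustrative) — setting this option sends the
# scenario helper down the floating-IP code path discussed in the bug.
[network]
floating_network_name = public
```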
15:49:41 <gmann> humm, some different config
15:49:43 <lpiwowar> I commented on this today. Is the verify_ssh() function needed altogether?
15:49:54 <kopecmartin> yep, and that led to a different execution path ...
15:49:58 <kopecmartin> it's described in the LP
15:50:15 <kopecmartin> yeah, there is a discussion in 884584
15:50:32 <kopecmartin> not sure how to proceed, seems like the whole logic (if else) is quite outdated
15:50:47 <lpiwowar> We wait for the server to be sshable. So create_server() should do the checking instead of the verify_ssh() ?
15:50:47 <kopecmartin> anyway, we can continue discussing there
15:50:49 <gmann> ack, let me check. we can make it more consistent with other tests SSH verification
15:51:03 <gmann> lpiwowar: yeah
15:52:06 <gmann> added it in list, will check this today
15:52:13 <kopecmartin> thanks gmann
15:52:16 <lpiwowar> gmann: ack:)
15:52:27 <kopecmartin> i also commented on your comment gmann here
15:52:29 <kopecmartin> #link https://review.opendev.org/c/openstack/tempest/+/879923
15:52:33 <kopecmartin> regarding the cleanup
15:52:39 <kopecmartin> and resource naming
15:53:07 <gmann> kopecmartin: ack
15:53:33 <kopecmartin> #topic Bug Triage
15:53:37 <kopecmartin> #link https://etherpad.openstack.org/p/qa-bug-triage-bobcat
15:53:41 <kopecmartin> all recorded there ^^
15:53:49 <kopecmartin> and as we've already covered the bugs
15:53:53 <kopecmartin> this is all from my side
15:54:18 <kopecmartin> if there isn't anything else ...
15:54:32 <kopecmartin> thank you everyone , see you online
15:54:33 <kopecmartin> #endmeeting