Tuesday, 2024-05-28

00:47 <opendevreview> Merged openstack/pbr master: zuul: Drop retired repos from required-projects  https://review.opendev.org/c/openstack/pbr/+/920594
08:59 <opendevreview> Monty Taylor proposed openstack/project-config master: Add inaugust/wandertracks repo  https://review.opendev.org/c/openstack/project-config/+/920616
09:35 <opendevreview> Monty Taylor proposed openstack/project-config master: Add inaugust/wandertracks repo  https://review.opendev.org/c/openstack/project-config/+/920616
09:36 <opendevreview> Monty Taylor proposed openstack/project-config master: Add inaugust/wandertracks repo  https://review.opendev.org/c/openstack/project-config/+/920616
13:25 *** haleyb|out is now known as haleyb
14:58 <opendevreview> Jeremy Stanley proposed openstack/pbr master: Add openstack-tox-py312 as non-voting job  https://review.opendev.org/c/openstack/pbr/+/920595
14:58 <opendevreview> Jeremy Stanley proposed openstack/pbr master: Use SetupTools' vendored distutils in tests  https://review.opendev.org/c/openstack/pbr/+/920676
14:58 <opendevreview> Jeremy Stanley proposed openstack/pbr master: Add SetupTools to our functional testing venvs  https://review.opendev.org/c/openstack/pbr/+/920677
14:58 <fungi> frickler: ^
14:59 <clarkb> any reason to not make that job voting?
15:01 <fungi> i'm in favor of adding it as voting if it's passing, yes
15:02 <fungi> maybe the risk is when we switch py312 jobs from pyenv to noble packages it will re-break?
15:02 <fungi> i could see holding off for that change
15:03 <clarkb> or just going straight to noble for these jobs
15:03 <fungi> also a good idea for early dogfooding
15:12 <clarkb> the coverage job looks flaky
15:14 <clarkb> error: [Errno 2] No such file or directory: 'build/bdist.linux-x86_64/wheel/pbr/tests/testpackage/doc/source/index.rst'
15:15 <clarkb> is this the global build problem again? I thought we "addressed" that by running tests serially. Maybe that didn't affect the coverage job?
15:16 <clarkb> ya coverage uses a separate stestr invocation that needs --serial too
15:16 <clarkb> I can push that up after my current meeting if no one beats me to it
15:16 <fungi> i don't think there's any hurry
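[Editor's note: the fix clarkb describes amounts to passing stestr's `--serial` flag in the coverage environment's own test invocation. A hedged sketch of what such a tox.ini stanza might look like; the env name and coverage options here are assumptions for illustration, not pbr's actual configuration:]

```ini
# Sketch only: pbr's real cover environment likely differs in detail.
[testenv:cover]
setenv =
    PYTHON=coverage run --source pbr --parallel-mode
commands =
    stestr run --serial {posargs}
    coverage combine
    coverage html -d cover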
15:29 <fungi> clarkb: do you think pbr should start listing setuptools as an install_requires, since we can't assume all environments will have it?
15:34 <clarkb> fungi: it's suggested to do that with pyproject.toml already. Listing it in setup.py is a bit chicken and egg isn't it?
15:35 <clarkb> basically I'm not sure using install_requires will help anything. But using pyproject.toml does address the issue and we already suggest this for people using that method
15:38 <fungi> clarkb: install_requires, not setup_requires. basically if projects are installing pbr at runtime (not as a build backend), then they'll need distutils which they can only get from setuptools now
15:39 <fungi> (as of python 3.12)
15:40 <fungi> we could i guess add a requirements.txt that only installs setuptools for python 3.12 and later via an environment marker
15:40 <fungi> rather than for all interpreter versions
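[Editor's note: the environment-marker approach fungi sketches would look roughly like the following requirements.txt line, using PEP 508 marker syntax; the version floor is a placeholder, not something decided in this discussion:]

```
setuptools; python_version >= "3.12"
```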
15:40 <clarkb> oh I see. I guess that helps dot i's and cross t's particularly since wheels exist
15:45 <fungi> clarkb: well, not just dot and cross. the version of venv which ships with python 3.12 doesn't install setuptools by default either, so runtime calls to pbr in a venv will just be broken
15:52 <clarkb> ya in my head if you got pbr for installation working that shouldn't be an issue, then I remembered wheels exist so maybe not
15:52 <clarkb> since a wheel of $package doesn't require pbr or setuptools to execute right? It's all been precomputed at wheel build time
15:53 <fungi> right
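[Editor's note: as a runtime illustration of the situation discussed above (a hedged sketch, not part of pbr): PEP 632 removed distutils from the standard library in Python 3.12, so `import distutils` only succeeds there if setuptools, or something else, provides a copy.]

```python
import importlib.util
import sys


def distutils_available() -> bool:
    """Return True if a distutils module can currently be imported.

    Before Python 3.12 the stdlib always provides distutils; from 3.12
    on (PEP 632) it must come from setuptools' vendored copy instead.
    """
    return importlib.util.find_spec("distutils") is not None


if __name__ == "__main__":
    print(sys.version_info[:2], distutils_available())
```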
15:59 <opendevreview> Clark Boylan proposed openstack/pbr master: Use SetupTools' vendored distutils in tests  https://review.opendev.org/c/openstack/pbr/+/920676
15:59 <opendevreview> Clark Boylan proposed openstack/pbr master: Add SetupTools to our functional testing venvs  https://review.opendev.org/c/openstack/pbr/+/920677
15:59 <opendevreview> Clark Boylan proposed openstack/pbr master: Add openstack-tox-py312 as non-voting job  https://review.opendev.org/c/openstack/pbr/+/920595
15:59 <opendevreview> Clark Boylan proposed openstack/pbr master: Also run coverage tests serially  https://review.opendev.org/c/openstack/pbr/+/920686
19:47 <haleyb> i wanted to create an autohold for a job i'm trying to debug, how do i create an account on zuul.openstack.org?
19:48 <clarkb> I think the credentials for that are still largely based on the idea being all of the accounts are zuul admins.
19:49 <clarkb> haleyb: if you tell us what you need an autohold for we can put one in place for you
19:49 <fungi> yes, only zuul administrators can hold nodes, but haleyb just let us know the project/job/change you want a hold for
19:49 <haleyb> clarkb: oh, thanks. i found a youtube video and it just said "login in the upper right corner" :-/
19:50 <clarkb> yes depending on how your zuul is set up or if you are a zuul admin then you can do it that way
19:51 <haleyb> https://review.opendev.org/c/openstack/neutron/+/920150 is the change in question - neutron cover job, i guess just a single job there, openstack-tox-cover
19:51 <clarkb> for our zuul the expectation is that only zuul admins have accounts (currently anyway) and only zuul admins can hold nodes
19:51 <clarkb> haleyb: thanks I'll get that done for you momentarily
19:51 <haleyb> clarkb: thanks!
19:53 <clarkb> haleyb: done
19:53 <fungi> haleyb: part of the reason it's not self-service is that the list of ssh keys allowed access is baked into our node images, so one of us still has to ssh into the held node once it exists and add access for your ssh key anyway
19:54 <haleyb> fungi: ack. i just did a recheck hopefully it fails
19:55 <fungi> cool, once the job finishes, just ping us and let us know what your public key is (or where to find it)
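[Editor's note: for reference, an admin-side hold like the one clarkb placed can be requested with zuul-client's autohold command. This is a hedged sketch; the tenant name, auth configuration, and reason text are assumptions for illustration, not taken from the log:]

```shell
# Assumes zuul-client is installed and configured with admin credentials.
zuul-client --zuul-url https://zuul.opendev.org autohold \
    --tenant openstack \
    --project opendev.org/openstack/neutron \
    --job openstack-tox-cover \
    --change 920150 \
    --reason "haleyb debugging openstack-tox-cover failures"
```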
20:01 <opendevreview> Merged openstack/pbr master: Also run coverage tests serially  https://review.opendev.org/c/openstack/pbr/+/920686
20:01 <opendevreview> Merged openstack/pbr master: Use SetupTools' vendored distutils in tests  https://review.opendev.org/c/openstack/pbr/+/920676
20:47 <haleyb> fungi: was the hold on just one of the jobs? the one i mentioned passed of course
20:49 <tonyb> haleyb: yeah it's only one job. we can add another if you'd like
20:50 <haleyb> tonyb: can you add/try neutron-cover-job-1 ? I created a bunch of clones and they don't always fail
20:51 <tonyb> sure.
21:02 <tonyb> haleyb: https://zuul.opendev.org/t/openstack/autohold/0000000038
21:03 <tonyb> The reason is clearly bogus sorry :/
21:03 <haleyb> tonyb: thanks, and the reason doesn't matter to me :)
21:04 <tonyb> haleyb: I figured, and it still points at me so as far as an audit trail goes it's adequate.
21:05 <tonyb> haleyb: If you send me a public key I can add it to the authorized_keys file when it fails
21:32 <tonyb> haleyb: cover-job-3 failed, the others are still running
21:33 <haleyb> tonyb: yup, it's typically like a 75% failure rate, we'll see if i picked the right one
21:34 <tonyb> haleyb: if we don't hit the failure I'll just hold all the jobs.  If we get more than one hit I'll delete any extras
21:35 <haleyb> tonyb: ack, thanks. i'm about eod so probably can't look today anyway
21:37 <tonyb> haleyb: fair enough.
21:40 <clarkb> is it always the same test case that fails?
21:40 <clarkb> neutron.tests.unit.services.ovn_l3.test_plugin.OVNL3ExtrarouteTests.test_floatingip_update_invalid_fixed_ip is the one I see in the current failed job
21:41 <clarkb> looking at the subunit file (I converted it from v2 to v1 for human readability) it appears that test case starts about 5 minutes before the end of the job, which is curious. I wonder if instrumentation for coverage is tripping it into a deadlock or similar
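[Editor's note: the v2-to-v1 conversion clarkb mentions can be done with the subunit-2to1 filter shipped with python-subunit; a minimal sketch, assuming that package is installed and with hypothetical file names:]

```shell
# Convert a binary subunit v2 stream to the line-oriented v1 format.
subunit-2to1 < testrepository.subunit > testrepository.subunit.v1
```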
21:52 <opendevreview> Merged openstack/project-config master: Retire devstack-gate  https://review.opendev.org/c/openstack/project-config/+/919630
22:07 <opendevreview> Merged openstack/pbr master: Add SetupTools to our functional testing venvs  https://review.opendev.org/c/openstack/pbr/+/920677
22:28 <tonyb> clarkb: the hold for haleyb triggered but I can't get into the node to add their key.  It should be `ssh -i ~/${my_opendev_admin_key} tonyb@${held_ip}` correct?
22:28 <clarkb> tonyb: I don't know that we've switched the ci nodes over to the new key definition (or that the new key definition successfully applied yet)
22:29 <clarkb> frickler was poking at that by cleaning up security groups to hopefully get the ansible cloud updates to run successfully which would update the keys in the clouds, then separately we have to update the nodepool definitions
22:29 * clarkb looks for logs
22:30 <tonyb> clarkb: Oooooooh! that'd make sense
22:32 <clarkb> looks like it is still failing on the "Found more a single matching security group rule which match the given parameters." error. Not sure where frickler ended up with those edits since I think this is the same cloud
22:32 <clarkb> TASK [cloud-launcher : Processing security_group_rule for opendevzuul-osuosl RegionOne] failed
22:32 <clarkb> tonyb: maybe we can follow up on that early tomorrow our time and see if we can get that resolved with frickler. In the meantime I can add the key to the node if you point me to the info
22:34 <tonyb> sent via /query

Generated by irclog2html.py 2.17.3 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!