*** marios is now known as marios|ruck | 04:57 | |
elodilles | hberaud ttx : hi, when you are around: there are 2 things that need some reviews | 05:14 |
elodilles | hberaud ttx : the 1st one is an easy one, another reno links patch: https://review.opendev.org/c/openstack/releases/+/835045 | 05:15 |
elodilles | hberaud ttx : the 2nd one is the difficult one: the yoga-final patch is failing due to 3 deliverables that are impacted by the latest setuptools releases | 05:17 |
elodilles | (the patch is: https://review.opendev.org/c/openstack/releases/+/835322 ) | 05:17 |
elodilles | hmmm, i see there was a new setuptools release yesterday (61.2.0), so i've rechecked the patch to see the impact of that | 05:20 |
elodilles | (anyway, some words about the issue: https://meetings.opendev.org/irclogs/%23openstack-release/%23openstack-release.2022-03-26.log.html ) | 05:21 |
hberaud | ack will do soon | 06:33 |
elodilles | thx \o/ | 07:18 |
-opendevstatus- NOTICE: zuul isn't executing check jobs at the moment, investigation is ongoing, please be patient | 07:19 | |
elodilles | so that's why the jobs haven't started ^^^ | 07:20 |
ttx | looking | 07:25 |
elodilles | thx! | 07:26 |
ttx | probably simpler to wait for an infra fix anyway | 07:26 |
ttx | Also I would consider doing late releases for those 3 -- they are pretty peripheral anyway | 07:28 |
ttx | aren't those ansible roles release-trailing? | 07:29 |
ttx | ah barbican stuff | 07:30 |
elodilles | ttx: releasing the ansible-role-thales-hsm, ansible-role-atos-hsm and sahara-image-elements deliverables would probably not be so risky - though 1st we need to get those workaround patches merged ( https://review.opendev.org/q/topic:setuptools-issue-3197 ) somehow | 07:50 |
elodilles | ttx: what about barbican? | 07:50 |
elodilles | oh, yes, those are barbican team's deliverables | 07:52 |
elodilles | (the ansible-role-*-hsm ones) | 07:52 |
elodilles | i've tried to test whether the new setuptools release (61.2.0) helps, but it seems it does not | 07:56 |
opendevreview | Merged openstack/releases master: Add remaining release note links for Yoga https://review.opendev.org/c/openstack/releases/+/835045 | 08:48 |
elodilles | so gate is working again ^^^ | 08:54 |
elodilles | the result is the same with latest (61.2.0) setuptools: https://zuul.opendev.org/t/openstack/build/3faec3c5dce94457916cadbe4b2da211 | 09:04 |
elodilles | i've added reviewers to the 3 WA patches | 09:05 |
elodilles | also pinged the teams on their channels | 09:12 |
elodilles | other option i guess (if we don't get answers from the teams) is to skip those from the yoga-final patch / release, as they are not even active projects it seems | 09:13 |
elodilles | hberaud ttx : what do you think? ^^^ | 09:13 |
hberaud | WFM | 09:14 |
ttx | ideally we would release them | 09:15 |
ttx | since they are part of the release and we have no real process for not releasing them | 09:16 |
ttx | Would there be a way to somehow pin to a working setuptools? | 09:16 |
elodilles | so we now either do another 'final-rc' for ansible-role-atos-hsm, ansible-role-thales-hsm and sahara-image-elements, or leave them out of yoga if we can't get the patches merged and released. | 09:16 |
elodilles | ttx: well, we can add this to tox.ini: | 09:17 |
elodilles | requires = setuptools<61.0.0 | 09:17 |
ttx | Like if that release happened on Wednesday morning and broke 12 projects what would we have done | 09:17 |
elodilles | that's true | 09:18 |
ttx | ideally we would be isolated from that, but I guess setuptools is special | 09:18 |
elodilles | yes, most of the things are constrained, but not setuptools, virtualenv, pip, etc. afaik | 09:19 |
ttx | OK so let's try to get those reviews some attention -- barring that I would consider asking the TC to allow infra to merge them to save their release | 09:20 |
ttx | We can give teams most of today to react, then tomorrow we switch to TC/infra to get them in | 09:21 |
ttx | I'm fine with late releases for those since they are pretty minor | 09:22 |
elodilles | ttx: sounds like a plan! | 09:22 |
ttx | but they should still make it to the final release :) | 09:22 |
ttx | gmann and fungi: see ^ | 09:22 |
elodilles | thanks, the plan looks good | 09:23 |
ttx | basically we need the changes in https://review.opendev.org/q/topic:setuptools-issue-3197 merged and a new release for the 3 affected deliverables before EOD Tuesday, which may require the TC authorizing Infra to force the patches in. | 09:24 |
elodilles | and we still have the 'setuptools constraint' in tox.ini as an option (that should work but maybe we need some testing for that) | 09:24 |
elodilles | as you suggested | 09:24 |
ttx | elodilles: which tox.ini would that be? | 09:25 |
ttx | openstack/release? | 09:25 |
elodilles | yes | 09:25 |
ttx | hmm, that is tempting | 09:25 |
elodilles | and after the release we should revert that | 09:25 |
elodilles | (i'll push now some DNM testing patch for that, just to make sure) | 09:26 |
ttx | OK let's discuss today with gmann and fungi which plan B they prefer (forcing changes or constraining setuptools for release day). Plan A remains to get those patches in today | 09:26 |
opendevreview | Elod Illes proposed openstack/releases master: DNM: testing to cap setuptools https://review.opendev.org/c/openstack/releases/+/835423 | 09:30 |
elodilles | this is the DNM patch, let's see if it works ^^^ | 09:30 |
opendevreview | Elod Illes proposed openstack/releases master: DNM: testing to cap setuptools https://review.opendev.org/c/openstack/releases/+/835423 | 10:10 |
elodilles | updated the DNM patch, this should work now (we need to constrain setuptools via virtualenv as it bundles setuptools) ^^^ | 10:14 |
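For illustration only (not the literal content of the DNM patch): a minimal sketch of such a tox.ini pin, assuming that capping virtualenv below 20.14.0 is enough to keep the bundled setuptools under 61.

    [tox]
    # Capping virtualenv keeps its bundled setuptools below 61, which avoids the
    # "Multiple top-level packages discovered in a flat-layout" failure.
    # (virtualenv 20.14.0 bumped the bundled setuptools from 60.10.0 to 61.1.0.)
    requires = virtualenv<20.14.0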
hberaud | it works \o/ | 10:28 |
hberaud | kudos | 10:29 |
elodilles | so yes, this is another option \o/ | 10:32 |
hberaud | This one seems like a good safety net for us that we can handle without asking an external team. We are autonomous with that | 11:24 |
hberaud | And we could drop it once the final yoga is out | 11:25 |
elodilles | hberaud: yepp | 11:38 |
fungi | elodilles: the new way is to call `python3 -m build` using https://pypi.org/project/build/ | 11:44 |
fungi | but i wasn't aware that direct calls to setup.py were more than simply deprecated at the moment (i don't see any removal called out in the setuptools changelog) | 11:46 |
fungi | possibly more relevant is the virtualenv 20.14.0 release on friday if these things are being run by tox, since that updated installed envs from setuptools 60.10.0 to 61.1.0 | 11:49 |
*** dviroel|pto is now known as dviroel | 11:52 | |
fungi | https://zuul.opendev.org/t/openstack/build/3faec3c5dce94457916cadbe4b2da211 is really making my browser work hard | 11:54 |
fungi | oh wow that's massive | 11:55 |
fungi | okay, after giving up and opening the log in a text editor, i see that the problem is the same one tripleo was dealing with on friday: "Multiple top-level packages discovered in a flat-layout" | 12:29 |
fungi | the warnings about calling setup.py are non-fatal, btw | 12:31 |
fungi | a number of potential workarounds are mentioned in https://github.com/pypa/setuptools/issues/3197 though the workaround tripleo went with was adding a py_modules=[] parameter in the setuptools.setup() of their setup.py files in the affected projects | 12:36 |
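As a rough illustration of the tripleo-style workaround mentioned above (not the literal content of any of the pending patches), a pbr-based setup.py would gain an explicit empty py_modules list to opt out of the new auto-discovery:

    import setuptools

    setuptools.setup(
        setup_requires=['pbr>=2.0.0'],
        pbr=True,
        # Opt out of setuptools >= 61 package auto-discovery, which otherwise
        # aborts with "Multiple top-level packages discovered in a flat-layout"
        # in repos carrying more than one top-level directory.
        py_modules=[],
    )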
fungi | thankfully this only impacts a very small number of our projects (those which add multiple packages in a single repo) | 12:37 |
elodilles | fungi: sorry, yes, only 3 projects are affected | 12:41 |
elodilles | fungi: and I've proposed similar workarounds (with some gate fix): https://review.opendev.org/q/topic:setuptools-issue-3197 | 12:41 |
elodilles | but still we need to merge these and release them | 12:42 |
elodilles | we have counted 3 possible ways forward so far (and 'python3 -m build' is maybe the 4th :)) | 12:43 |
*** dviroel is now known as dviroel|brb | 12:43 | |
elodilles | (Sean actually mentioned some similar fix Saturday, but i haven't tried that yet, and this one ^^^ seems closer to what the validator script does) | 12:45 |
elodilles | maybe now the easiest and least painful solution is to use the 'setuptools pinning via virtualenv in tox' patch temporarily, for the release, and then revert it (the patch would be something like this: https://review.opendev.org/c/openstack/releases/+/835423 ) | 12:47 |
elodilles | but I'll try soon the 'python3 -m build' as well | 12:47 |
fungi | well, using build is probably only going to silence the warnings about calling setup.py, it's not likely to address the multiple top-level packages problem | 13:02 |
fungi | we probably do want to change our release scripts to no longer call setup.py directly, but that's not urgent | 13:07 |
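Roughly, that release-script change amounts to swapping the deprecated direct setup.py call for the PEP 517 front-end; the commands below are an illustrative sketch and assume the 'build' package is installed in the environment:

    # old, deprecated direct invocation
    python setup.py sdist bdist_wheel

    # new, via https://pypi.org/project/build/
    python3 -m build --sdist --wheel .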
elodilles | if that is the new way, then it needs to be changed anyway, i agree | 13:07 |
elodilles | i'll propose a patch for that | 13:08 |
fungi | yeah, i've been using it on my personal projects which also rely on pbr, and it works fine there so should work for openstack as well | 13:08 |
fungi | this is how i build and test releases for my projects with tox: https://paste.opendev.org/show/bSzpKscw3K6mgjuPXGD4/ | 13:10 |
*** dviroel|brb is now known as dviroel | 13:27 | |
opendevreview | Elod Illes proposed openstack/releases master: Replace old sdist and wheel build command in validate https://review.opendev.org/c/openstack/releases/+/835450 | 13:34 |
opendevreview | Elod Illes proposed openstack/releases master: Pin setuptools for Yoga release https://review.opendev.org/c/openstack/releases/+/835423 | 13:44 |
elodilles | i've updated the setuptools pinning patch as well, so that we can rebase the release patch to this ^^^ if needed | 13:46 |
hberaud | 2'ed | 13:53 |
elodilles | another thing is: zigo mentioned @ openstack_sahara that he sees issues with multiple projects, coming from oslo.context replacing 'tenant' with 'project_id' | 13:57 |
elodilles | he mentioned: magnum, mistral, murano, sahara, trove, zaqar having that error | 13:58 |
zigo | Yeah, at least those ... | 13:58 |
zigo | It's kind of blocking my packaging then. | 13:58 |
zigo | And if all of these projects have (pending?) patches, it's kind of hard to track. | 13:59 |
elodilles | i guess this issue needs to be fixed somehow before we release Yoga :S | 13:59 |
elodilles | hberaud ttx ^^^ | 13:59 |
hberaud | projects listed above are mostly not really active so it could take a long time to see them fixed | 14:03 |
hberaud | another option for these deliverables could be to pin oslo.context below version 4.0.0 instead of moving tenant to project_id | 14:05 |
zigo | Magnum and Sahara got the fixes. | 14:05 |
zigo | https://review.opendev.org/c/openstack/magnum/+/834296 | 14:05 |
zigo | Others I haven't investigated yet. | 14:05 |
hberaud | cool | 14:06 |
zigo | hberaud: *NO*, that's *NOT* an option to hide the dust under the carpet. | 14:06 |
zigo | Not everyone uses venv ... | 14:06 |
zigo | The solution could be to fix oslo.context and release a 4.0.1 that doesn't yell with deprecation. | 14:06 |
hberaud | the tenant argument was deprecated a long time ago | 14:07 |
zigo | 4.1.1 I mean. | 14:07 |
zigo | Yeah, but it wasn't just failing UT. | 14:07 |
zigo | Mistral got the fix too. | 14:09 |
hberaud | What do you mean by fixing oslo.context? I'm not a fan of going back and forth with this deprecated argument either... | 14:09 |
zigo | I'll investigate the others and let you know the state of things ... :) | 14:09 |
opendevreview | Vishal Manchanda proposed openstack/releases master: Release horizon 22.1.0(Yoga) https://review.opendev.org/c/openstack/releases/+/835461 | 14:11 |
elodilles | the main problem is that some not actively maintained projects missed the deprecation warning, and now that oslo.context has really removed the 'tenant' context argument they are broken. so i also think this is not something that oslo.context "should" fix, but something that the other projects should | 14:30 |
elodilles | the question is whether we now release them 'as broken packages' and they need to fix this via a stable release OR we jeopardize the release by adding more and more changes, for which we don't even have the time | 14:32 |
hberaud | Agree with you. oslo.context is not broken... | 14:33 |
elodilles | i tend to vote for the 1st option. it also indicates that those projects need maintainers | 14:33 |
hberaud | I'd suggest to fix them via stable release | 14:33 |
hberaud | +1 | 14:33 |
elodilles | hberaud: ++ | 14:33 |
elodilles | ttx: do you agree? ^^^ | 14:34 |
hberaud | i see no reason to prevent oslo from moving forward if the other projects are at a standstill | 14:34 |
elodilles | the 'api-break' change (oslo.context 4.0.0) was released almost 2 months ago... it would have been better to release it sooner, but if there are not enough maintainers then releasing it even 3 months earlier probably would not have made any difference :/ | 14:42 |
hberaud | yeah | 14:45 |
elodilles | and I also understand zigo that it is a pain for packaging :/ | 14:49 |
elodilles | (as well) | 14:49 |
*** amoralej|off is now known as amoralej | 14:53 | |
ttx | sorry I can't look right now | 14:56 |
elodilles | sorry, no worries, hberaud and I agree, just wanted to ask for your opinions as well o:) | 15:04 |
zigo | hberaud: The reason is: you submitted it WAY too late in the release cycle. That's the kind of breaking change to do at the beginning of a cycle. | 15:17 |
zigo | Best course would be: revert the change in the Yoga branch in oslo.context, but leave it as-is in master. | 15:18 |
zigo | If you look at the code of these projects, most of them have the word "tenant" pretty much everywhere ... | 15:18 |
zigo | Just for the fun of it, I wrote this: https://review.opendev.org/c/openstack/trove/+/833186 (which of course breaks everything, as I just did sed / grep stuff without thinking...). | 15:19 |
zigo | Yes, it's a disaster, and projects should move forward, I very much agree with the reasoning. | 15:20 |
zigo | BUT, that's not the way to fix things. | 15:20 |
zigo | I mean, not at the end of a release cycle. | 15:20 |
*** marios|ruck is now known as marios|ruck|call | 15:23 | |
ttx | I think it's important that all projects use the same oslo.context release, so pinning project-by-project would not be good | 15:54 |
ttx | I'm not sure I understand how broken those packages are and how many are affected | 15:55 |
ttx | And yes that shows that 2 months is no longer sufficient lead time for projects to adapt to lib changes | 15:55 |
ttx | at least for sub-maintained projects | 15:56 |
ttx | we should fix the ones we can today for a last-minute release... and the others on a stable release asap | 15:58 |
ttx | but again, I'd need a description of how broken they actually are -- they pass testing so I assume they are not completely broken? | 15:59 |
elodilles | oslo.context was released between milestone-2 and milestone-3. the question is whether it would have made a difference for these projects if the release had been produced earlier. I think maybe a few fewer projects would be broken, but otherwise it could be the same. :/ | 16:00 |
ttx | elodilles: are they unusable, or just weirdly incoherent? | 16:01 |
ttx | sorry, going into meetings again | 16:01 |
zigo | ttx: That's the issue, they do not pass testing with the newer oslo.context. | 16:02 |
elodilles | testing is broken (simple example patch for fix of such issue: https://review.opendev.org/c/openstack/castellan/+/834669) | 16:03 |
*** marios|ruck|call is now known as marios|ruck | 16:03 | |
fungi | it's likely that they simply merged no new changes after the oslo.context bump in requirements broke them | 16:03 |
elodilles | but i'm not sure whether we have broken code as well in projects themselves beyond testing | 16:04 |
fungi | if they're fairly unmaintained, then probably nobody even noticed until downstream packaging work tripped over it | 16:04 |
elodilles | that is definitely the case | 16:05 |
* zigo heads back home | 16:05 | |
*** dviroel is now known as dviroel|lunch | 16:13 | |
*** marios|ruck is now known as marios|out | 16:15 | |
fungi | elodilles: on further reading, it looks like updating tox.ini to force setuptools>=61.1 may also solve the problem | 16:26 |
fungi | the pr claiming to have solved it for at least some projects is included in that version from saturday | 16:26 |
fungi | it's just not the default version installed by tox yet | 16:26 |
fungi | er, well, bundled in virtualenv i mean | 16:27 |
elodilles | yes it needs virtualenv to bundle setuptools>=61.1 | 16:27 |
elodilles | let me check if we have that already | 16:27 |
elodilles | still virtualenv 20.14.0 is the latest from Friday | 16:28 |
elodilles | * latest release | 16:28 |
hberaud | zigo: Well, I proposed my patch at the beginning of yoga (in october)... and the patch was merged 6 months later so I don't think I submitted it too late https://review.opendev.org/c/openstack/oslo.context/+/815938 | 16:28 |
elodilles | fungi: in case a new virtualenv is released before Wednesday (with bundled setuptools 61.2) then we are OK :) | 16:29 |
fungi | elodilles: alternatively we can set https://tox.wiki/en/latest/config.html#conf-requires to require setuptools>=61.1 as a workaround | 16:30 |
*** jbadiapa is now known as jbadiapa|off | 16:30 | |
clarkb | if we do that in a base enough tox job it should have good coverage | 16:31 |
fungi | so on the oslo.context 4.x conflict, do we have a complete list of which projects are still impacted? | 16:32 |
elodilles | fungi: that does not work, we need to do it via virtualenv (see former patchsets of my patch) | 16:32 |
fungi | elodilles: tox updating setuptools in its virtualenv doesn't solve it? or we're not calling tox? | 16:32 |
elodilles | fungi: when i pin setuptools there, tox still installs the latest virtualenv with its bundled setuptools (see PSets of this patch: https://review.opendev.org/c/openstack/releases/+/835423 ) | 16:34 |
elodilles | fungi: but pinning virtualenv works | 16:35 |
fungi | https://github.com/pypa/virtualenv/pull/2324 doesn't seem to be getting fast-tracked | 16:36 |
elodilles | fungi: about oslo.context we don't have the complete list, but zigo reported these as broken: magnum, mistral, murano, sahara, trove, zaqar. this might not be the complete list. | 16:37 |
fungi | so our options with oslo.context are 1. roll back the removal to a deprecation in an emergency point release on stable/yoga, 2. try to get fixes merged to all affected projects, 3. don't include the above projects in the yoga release, or 4. release yoga with those untestable and possibly broken | 16:40 |
clarkb | fungi: I think you can also set the constraint for the older version? | 16:40 |
clarkb | I guess that implies caps in the affected projects and is part of 2? | 16:41 |
fungi | right, that's probably a sub-option of #2 | 16:41 |
fungi | since just pinning oslo.context in global requirements won't really pin it in the projects themselves (but will result in them getting tested with older oslo.context and maybe also being a signal to package maintainers to not package oslo.context 4.x with yoga?) | 16:42 |
fungi | gets to be very mixed-signal though with oslo.context 4.x appearing in the yoga release itself | 16:43 |
clarkb | ya | 16:43 |
elodilles | fungi: good summary. when discussing it with hberaud we agreed that maybe option 4 is the one we should go for, and the failing projects can propose stable releases afterwards | 16:52 |
*** dviroel|lunch is now known as dviroel | 16:55 | |
elodilles | fungi: https://meetings.opendev.org/irclogs/%23openstack-release/%23openstack-release.2022-03-28.log.html#t2022-03-28T14:30:27 | 16:56 |
fungi | okay, so just to be clear, the consensus is that releasing broken services is preferable to rolling back oslo.context in the last hours before the release? | 16:58 |
elodilles | though maybe about your 4 options: option 1 is feasible, option 2 seems time consuming, option 3 is feasible too (if we find all the affected projects + we don't have core projects among them), option 4 is feasible, not nice, but indicates which projects are 'undermaintained' and can be fixed with stable releases | 16:59 |
elodilles | fungi: so far I think that is the consensus, at least me and hberaud said so. but i'm not super confident, and open to other options | 17:02 |
fungi | i do agree that holding back progress in maintained projects because of unmaintained projects is not great, but releasing services in a state where downstream consumers can't test them (and the broken tests may even indicate that the projects themselves are broken) is also not great. not including those projects in the release might help send a stronger signal as to the actual | 17:06 |
fungi | problem, in this case, but is something which might benefit from input from tc members as well | 17:06 |
elodilles | fungi: that's true, i can accept this | 17:08 |
elodilles | so you vote for option 3 | 17:08 |
clarkb | If it were me I think I would undeprecate that since clearly it wasn't well coordinated | 17:08 |
clarkb | come back and update all of the places that need updating and depends on those changes in oslo.context to make the switch | 17:09 |
ttx | So a new oslo.context would fix it but we'd lose another cycle | 17:09 |
elodilles | clarkb: so you vote option 1 | 17:09 |
clarkb | yes option 1 is my vote. We have the tools to test this stuff and make these removals safe. We didn't do that, so it seems like it's a good idea to reset and try again using the tools | 17:10 |
ttx | i don't think (3) is an option. We promised those projects would be part of the release, they have to be. Even if untestable (that would not be the first time) | 17:10 |
ttx | (2) would be best, but unlikely | 17:11 |
ttx | I would consider (1) but I'm not sure I understand all the implications | 17:11 |
fungi | 2 is particularly unlikely for the same reasons the fixes hadn't been done already, yeah | 17:11 |
ttx | (1) is probably the least disruptive | 17:12 |
fungi | 1 would need a revert in stable/yoga, a point release tagged, and stable/yoga requirements update merged | 17:12 |
ttx | I assume it's a pretty localized change | 17:12 |
fungi | but yes, as to how much of oslo.context would need unrolling in order to revert the deprecation i'm not sure | 17:13 |
ttx | If that's not acceptable (like we can't get it merged in oslo.context) then we'd go for (4) | 17:13 |
fungi | luckily it merged relatively late in the cycle, so there wasn't a lot of time to do other related things after it, i guess | 17:13 |
ttx | I feel like the TC should make a call between (1) and (4) since (2) did not happen in time | 17:13 |
ttx | yeah I'd like to hear how feasible (1) is | 17:14 |
ttx | Because the initial change took 3 months to merge | 17:15 |
fungi | so, if i'm reading the oslo.context stable/yoga history correctly, the only changes to land after the tenant removal were constraints and .gitreview updates for stable/yoga earlier this month, mypy typehinting added last month, and a setuptools workaround merged in january | 17:17 |
ttx | hberaud did propose it at the start of the cycle, that was just not processed downstream fast enough | 17:17 |
zigo | fix for trove: https://review.opendev.org/c/openstack/trove/+/834373 (currently building the package) | 17:17 |
fungi | typehinting might need to be tweaked if 815938 is reverted on stable/yoga, but probably not much (or may not be critical) | 17:18 |
zigo | Not enough, it still fails ... :( | 17:18 |
elodilles | the patches in oslo.context: https://paste.opendev.org/show/bzXX9zGo8Prb4N6SM50X/ | 17:18 |
ttx | how many packages are broken? | 17:18 |
elodilles | zigo: do you have a list perhaps? ^^^ | 17:18 |
zigo | elodilles: sahara has a patch and is fixed. | 17:18 |
ttx | If it's just a couple, I'd lean towards (4) and calling out for quick stable releases | 17:18 |
fungi | ttx: 815938 was proposed in october, had no revisions, and was eventually approved in mid-january | 17:19 |
elodilles | zigo: the problem is that we proposed the release patch for sahara, but the team did not react to that so it was abandoned :( | 17:19 |
ttx | right -- it's just symptomatic of our inability to process such a change in 6 months | 17:19 |
elodilles | zigo: and that is the better case, because at least there we had fix for the issue. :/ | 17:20 |
zigo | ttx: For the moment, I believe the list is: mistral, murano, trove. | 17:20 |
fungi | well, the sahara example also points out that at least some of the projects involved may just be unmaintained for all intents and purposes | 17:20 |
ttx | i'd be tempted to release those as-is | 17:20 |
zigo | I backported the patch for Sahara, and it's fine. | 17:20 |
elodilles | fungi: yes, it seems so | 17:20 |
ttx | fungi: do you think the TC could step in and force those patches today? If we have a new release tomorrow it can make it | 17:21 |
ttx | so... strongarm option 2 | 17:21 |
fungi | the tc should be able to approve the changes, eys | 17:21 |
fungi | er, yes | 17:21 |
clarkb | ya if fixes can be landed making 2 happen that seems reasonable | 17:22 |
fungi | i can grant tc-members approval rights in gerrit over the repos or i can just elevate my own privs and approve those if they agree | 17:22 |
ttx | elodilles: could you make a case on #openstack-tc for forcing patches in mistral, murano, trove to fix their release in time for the Yoga date? | 17:22 |
ttx | Those patches are pretty uncontroversial I suspect | 17:22 |
zigo | Agreed. | 17:22 |
zigo | It's just mostly s/tenant/project_id/ | 17:23 |
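A minimal illustration of what that looks like in code (attribute rename only, variable names made up for the example); it also works with older oslo.context, since project_id already existed alongside the deprecated alias:

    # before: breaks with oslo.context >= 4.0.0, where the 'tenant' alias was removed
    tenant_id = context.tenant

    # after: works with both old and new oslo.context releases
    tenant_id = context.project_id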
ttx | OK so my vote is on forcing those patches in master and stable/yoga for mistral, murano, trove | 17:23 |
elodilles | ttx so you mean on IRC? yes i can | 17:23 |
ttx | elodilles: yes | 17:23 |
elodilles | OK, jumping over there | 17:24 |
ttx | If we have them all merged today we can do re-releases tomorrow and get the final release patch in time for Wednesday | 17:24 |
ttx | zigo: thanks for bringing it up | 17:25 |
ttx | sahara needs a re-release right | 17:25 |
ttx | (in addition to mistral, murano, trove) | 17:25 |
zigo | The sahara one is already in the stable/yoga branch: https://review.opendev.org/q/I1bc81b3c13d2c08bc175d0d4f4365de7b4f71cf9 | 17:26 |
ttx | At this point rolling back 6 months of getting that oslo.context change in is more work / more impactful than forcing 3 patches in | 17:26 |
zigo | Though the RC1 doesn't contain it. | 17:26 |
ttx | ack so it needs a RC2 | 17:27 |
zigo | Mistral is broken with: | 17:28 |
zigo | AttributeError: 'Retrying' object has no attribute 'call' | 17:28 |
zigo | So probably not oslo.context ? I'm not sure ... | 17:28 |
fungi | zigo: have a link to the change? | 17:28 |
zigo | https://review.opendev.org/c/openstack/trove/+/834373 <--- This patch isn't enough, and I still get some errors. | 17:28 |
ttx | it feels like we should run all tests when requirements freeze to spot those earlier | 17:29 |
clarkb | having a single tempest job that installs all libs from git might be a good canary too | 17:30 |
zigo | Then I haven't found a patch (yet?) for Murano. | 17:30 |
zigo | Murano does 143 times: AttributeError: 'RequestContext' object has no attribute 'tenant' | 17:30 |
fungi | zigo: that trove change seems to be passing testing and is approved | 17:36 |
fungi | i guess you mean you're seeing errors trying to use trove, not in upstream testing | 17:36 |
elodilles | ttx: yes, it sounds like we need such testing :S | 17:38 |
fungi | zigo: i see https://bugs.debian.org/1005467 which has example tracebacks for mistral at least | 17:38 |
elodilles | zigo: what about zaqar? (magnum seems to be fixed as we were informed on tc channel) | 17:38 |
fungi | the mistral bug looks like it might have to do with testtools | 17:39 |
fungi | that was with testtools==2.5.0 according to the included freeze | 17:40 |
* zigo attemps to build zaqar | 17:42 | |
zigo | Oh, zaqar looks fine, it seems.... :) | 17:43 |
ttx | elodilles: PTG discussion! | 17:46 |
elodilles | ttx: for sure a good topic | 17:46 |
ttx | I'll have to drop, but it seems the TC will slowly get persuaded and hopefully fungi will fix as many of them as possible | 17:47 |
ttx | FWIW worst case scenario we do release them as broken. That happened in the past, and we fixed them in stable releases | 17:47 |
fungi | yeah, i'm happy to use gerrit admin access to approve changes if tc-members give the go-ahead | 17:48 |
*** amoralej is now known as amoralej|off | 17:51 | |
ttx | ok it seems to go in the right direction | 17:53 |
ttx | I'm dinnering | 17:53 |
elodilles | bon appetit! | 17:53 |
elodilles | i'll try to add the details here: https://etherpad.opendev.org/p/tenant-projectid-last-minute-fixes | 17:53 |
elodilles | so that we can follow the progress | 17:54 |
fungi | zigo: looks like that's the same testtools version we're using in stable/yoga upper-constraints.txt, so we should have hit the error upstream if that were the problem. i'll have to look closer at the code in mistral to see where that retries class is coming from | 17:59 |
fungi | elodilles: are you going to send a formal proposal to openstack-discuss for bypassing the usual core reviewer and ptl approvals for these changes and release candidates, or would you like me to do so on behalf of the release team? i'm sure it's getting well into dinner time in your locale | 18:01 |
elodilles | fungi: if you could do that that would be awesome :) | 18:03 |
elodilles | fungi: i'm also in middle of my dinner ;) | 18:03 |
fungi | okay, i'm happy to do so. i'll send something now. please get back to enjoying your evening! | 18:03 |
elodilles | meanwhile i'm also adding details to the etherpad -- https://etherpad.opendev.org/p/tenant-projectid-last-minute-fixes | 18:04 |
elodilles | (patch list to see where we are) | 18:04 |
elodilles | zigo: if you could also double-check the list then it would be awesome: ^^^ | 18:08 |
elodilles | (though i'm still adding the patches there) | 18:08 |
fungi | zigo: are there any upstream test failure examples for mistral you're aware of? otherwise i'm inclined to leave that off the formal proposal to the tc for now while we investigate further relevance there | 18:14 |
*** lajoskatona_ is now known as lajoskatona | 18:27 | |
fungi | proposal posted here: https://lists.openstack.org/pipermail/openstack-discuss/2022-March/027864.html | 18:32 |
fungi | for those not following along in #openstack-tc, an "emergency" meeting was convened in which the tc members in attendance agreed to that proposal | 18:33 |
elodilles | fungi: ack, thanks! | 18:40 |
elodilles | fungi zigo : as I see, mistral and magnum already have the oslo.context tenant->project_id fix in their releases (if the patches I've found are the ones we are looking for -- https://etherpad.opendev.org/p/tenant-projectid-last-minute-fixes ) | 18:42 |
elodilles | fungi zigo : so i guess those shouldn't be broken due to our oslo.context issue. maybe there is something else? | 18:43 |
zigo | fungi: Trove's patch is *not enough* and it continues to fail with 'tenant' failures. | 18:47 |
fungi | so the retry objects in those tracebacks are coming from tenacity's Retrying class | 18:47 |
fungi | which would make sense in light of https://bugs.debian.org/1005467 | 18:47 |
zigo | fungi: Yeah, right. | 18:47 |
zigo | It needs 8.0.1 compat, it's likely only working with 6.x | 18:48 |
fungi | we're testing openstack projects with tenacity===6.3.1 on stable/yoga | 18:48 |
fungi | yep | 18:48 |
fungi | so that explains why we're not seeing those errors | 18:48 |
fungi | so while i agree that openstack projects should be working on support for newer tenacity versions, that train has already sailed for yoga i think | 18:49 |
fungi | the trove fix does seem to have merged on the master branch 9 days ago and passed testing | 18:52 |
fungi | i guess we'll see if there are errors on the backport to stable/yoga (835492) | 18:53 |
fungi | all the voting jobs for that change have already passed too | 18:54 |
fungi | looks like coverage and functional-mysql have failed but they're non-voting. tempest tests are also all non-voting on trove at the moment, it looks like? | 18:55 |
fungi | looks like the only trove jobs which are voting on proposed changes are requirements-check, openstack-tox-pep8, openstack-tox-py36, openstack-tox-py39, and openstack-tox-docs | 18:56 |
elodilles | it's a typical sign of a not well maintained project :S | 19:00 |
fungi | looks like there's a lot of mapping from tenant to context.project_id references, so they seem to have tackled a lot of it | 19:09 |
fungi | though in trove/configuration/service.py i still see a tenant_id=context.tenant instead of tenant_id=context.project_id in at least one place | 19:10 |
fungi | kwargs['tenant_id'] = context.tenant | 19:10 |
fungi | there's another | 19:10 |
fungi | same file | 19:11 |
fungi | zigo: where you're seeing the errors, does it seem to be raising in trove/configuration/service.py or somewhere else? | 19:11 |
gmann | without test coverage/voting jobs it will be hard to trace them manually | 19:11 |
fungi | given the complete lack of voting integration test jobs for trove, i feel pretty confident that project is entirely broken at this point | 19:14 |
fungi | and probably not just from the oslo.context tenant removal | 19:14 |
fungi | so next steps are probably someone proposing fixes context.tenant->project_id for murano and zaqar? | 19:17 |
fungi | is anyone working on those yet or should i take a swing at it? | 19:17 |
fungi | 15 files need editing in murano | 19:20 |
elodilles | fungi: i tried to look into it, but i won't have time for it for today i fear :S | 19:26 |
fungi | i'm working on the murano master patch now, about to push it but i'm coming into this somewhat blind and have little idea what i'm actually doing, so we'll see what happens ;) | 19:27 |
elodilles | :S crossing fingers | 19:28 |
elodilles | and yes, the problem for me was the same, i wanted to understand what i was changing and it seemed a bit too much given that i should sleep already as I woke up early today o:) | 19:30 |
fungi | okay, an attempt at fixing murano's master branch is now linked in the etherpad. i'll put together something similar for zaqar while i await test results | 19:31 |
elodilles | fungi: thanks! \o/ | 19:31 |
fungi | zaqar has places where it's doing things like context.RequestContext(project_id... | 19:32 |
fungi | uh, that line dates from 2014-10-22 according to git blame | 19:34 |
fungi | zigo: do you have any examples of failures for zaqar? a naive git grep isn't turning up obvious places where oslo.context is being called into incorrectly | 19:35 |
zigo | As I wrote, it just built fine, so it probably should be removed from the list. | 19:35 |
fungi | oh, zaqar? okay, i'll strike it off | 19:39 |
elodilles | yes, unit test jobs seem to be passing for zaqar, though the gate has a failing tripleo-ci-centos8 job: https://review.opendev.org/c/openstack/zaqar/+/833321/1 | 19:39 |
zigo | http://shade.infomaniak.ch/trove_17.0.0~rc1-2_build.log | 19:39 |
zigo | http://shade.infomaniak.ch/trove_17.0.0~rc1-2_build.log.txt if you prefer to have it in the browser... | 19:44 |
zigo | This is *after* applying https://review.opendev.org/c/openstack/trove/+/834373 | 19:44 |
fungi | elodilles: looks like the tripleo-ci-centos-8-scenario002-standalone job may be generally broken for stable/yoga (barbican has set theirs to non-voting): https://zuul.opendev.org/t/openstack/builds?job_name=tripleo-ci-centos-8-scenario002-standalone&branch=stable%2Fyoga&skip=0 | 19:51 |
elodilles | oh, i see. actually there are failing centos8 jobs here-and-there so i guess that's some common issue | 19:54 |
elodilles | zigo: btw it seems there are 2 patches in trove that are merged on master | 19:57 |
elodilles | zigo: did you apply both? | 19:58 |
elodilles | fungi: fyi ^^^ added the second patch to the list. i wonder if those need to be squashed... if both are needed for the fix then they would need to be squashed, too :-o | 20:01 |
fungi | well, since trove isn't testing things, squashing is probably unnecessary | 20:03 |
fungi | they could be merged independently | 20:03 |
elodilles | yes, true, you wrote that already, sorry | 20:04 |
zigo | elodilles: Only one... :/ | 20:18 |
zigo | I probably missed the 2nd one then. | 20:18 |
zigo | Thanks, I'll try it tomorrow. | 20:19 |
elodilles | +1 | 20:29 |
fungi | elodilles: so we need a backport of 834373 to stable/yoga then? | 20:29 |
elodilles | let me cherry pick that | 20:29 |
elodilles | as i think so | 20:29 |
fungi | the other trove backport has +1 from zuul now, at least | 20:29 |
*** dviroel is now known as dviroel|out | 20:30 | |
fungi | note that we've got an unrelated problem with zuul, so it's not started testing on my murano change yet, but it's being actively investigated and we at least know how to get things moving again without disruption once we've collected some logs and poked at it for a bit | 20:32 |
elodilles | :S | 20:32 |
elodilles | fungi: ack | 20:32 |
elodilles | (added a 'nit' comment for one of the trove patches as oslo.context>=4.0.0 in requirements.txt is not necessary i think) | 20:37 |
fungi | agreed, older versions of oslo.context should work with project_id (that's the point of the deprecation) | 20:40 |
opendevreview | Tobias Urdin proposed openstack/releases master: Release tooz 2.11.0 https://review.opendev.org/c/openstack/releases/+/835517 | 20:59 |
elodilles | fungi: do you want me to remove the change from requirements.txt? do we have time for that? | 21:05 |
fungi | will it be a problem if it's in there? | 21:09 |
fungi | and yeah, you can if you like, i'm indifferent on it | 21:10 |
elodilles | well, most probably not a problem. OK i won't waste CI time on it then. and anyway, i have to leave now for some hours of sleep. see you tomorrow! thanks for working on this! | 21:20 |
elodilles | just let some note here in IRC if i need to do anything during (my) morning | 21:21 |
fungi | will do! | 21:21 |
elodilles | o/ | 21:21 |
fungi | as soon as zuul gets back on track, i'll make sure we've got good check results on the outstanding (non-release) changes so far, and then approve them | 21:48 |
fungi | my murano patch failed a bunch of jobs, need to look into the cause(s) | 23:14 |