tkajinam | hberaud, yes but you said 2024.2 but 2023.1 | 00:58 |
---|---|---|
tkajinam | hberaud, 2024.2, not 2023.2, I mean | 00:58 |
opendevreview | OpenStack Proposal Bot proposed openstack/oslo.log master: Imported Translations from Zanata https://review.opendev.org/c/openstack/oslo.log/+/912705 | 03:34 |
hberaud | sean-k-mooney: concerning the multiple readers support, we are aware of this https://github.com/eventlet/eventlet/issues/874 | 06:58 |
hberaud | sean-k-mooney: IMO I think it should be possible to adapt swift to not rely on this unsupported feature https://opendev.org/openstack/swift/src/branch/master/swift/common/utils/__init__.py#L6102-L6116 | 09:05 |
hberaud | Swift relies heavily on eventlet, so I think Swift will be transitioned last; that leaves time to figure out how to adapt Swift so it doesn't rely on multiple readers. | 09:08 |
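For context on the multiple-readers point above, here is a rough sketch (illustrative only, not code taken from swift or oslo.log) of the pattern at issue: more than one greenthread registered as a reader of the same file descriptor. The classic eventlet hubs allow this once `eventlet.debug.hub_prevent_multiple_readers(False)` is called, which is roughly what the linked swift code does; the asyncio hub rejects it (see eventlet issue #874).

```python
# Illustrative sketch (not code from swift or oslo.log) of the "multiple
# readers" pattern: two greenthreads waiting on the same file descriptor.
import os

import eventlet
from eventlet import debug, hubs

# The classic hubs refuse a second simultaneous reader unless this flag is
# flipped; under the asyncio hub this is where the RuntimeError about
# multiple readers comes from.
debug.hub_prevent_multiple_readers(False)

rfd, wfd = os.pipe()


def wait_readable(name):
    # Suspend this greenthread until rfd is readable; with two greenthreads
    # doing this, two readers are registered for the same fd.
    hubs.trampoline(rfd, read=True)
    print(name, "read:", os.read(rfd, 1))


g1 = eventlet.spawn(wait_readable, "reader-1")
g2 = eventlet.spawn(wait_readable, "reader-2")
eventlet.sleep(0)     # let both greenthreads register with the hub
os.write(wfd, b"xy")  # make the pipe readable
g1.wait()
g2.wait()
```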
opendevreview | Merged openstack/oslo.log stable/2024.1: Fix eventlet detection https://review.opendev.org/c/openstack/oslo.log/+/914262 | 09:09 |
opendevreview | Daniel Bengtsson proposed openstack/oslo.messaging stable/zed: Fix typo in quorum-related variables for RabbitMQ https://review.opendev.org/c/openstack/oslo.messaging/+/914435 | 10:02 |
opendevreview | Merged openstack/oslo.log stable/2023.2: Fix eventlet detection https://review.opendev.org/c/openstack/oslo.log/+/914266 | 10:49 |
hberaud | tkajinam++ damani++ thanks guys for reacting so quickly on the previous eventlet topic! | 10:52 |
damani | hberaud, thanks a lot for the code review and for getting it merged :) | 11:32 |
sean-k-mooney | hberaud: yeah, so I could retrigger the job and just turn off swift and see how far it gets | 11:43 |
sean-k-mooney | but I was expecting swift to be problematic | 11:43 |
opendevreview | Merged openstack/oslo.messaging master: kafka: Fix invalid hostaddr format for IPv6 address https://review.opendev.org/c/openstack/oslo.messaging/+/909517 | 12:02 |
opendevreview | Merged openstack/oslo.log stable/2023.1: Fix eventlet detection https://review.opendev.org/c/openstack/oslo.log/+/914267 | 12:18 |
hberaud | damani: FYI https://bugs.launchpad.net/octavia/+bug/2039346/ | 14:09 |
hberaud | (related to the previous backports) | 14:10 |
hberaud | See the discussion here: https://github.com/eventlet/eventlet/issues/432 , IMO octavia is suffering from several incomplete backports on the oslo side. | 14:18 |
opendevreview | Takashi Kajinami proposed openstack/oslo.messaging stable/2023.2: kafka: Fix invalid hostaddr format for IPv6 address https://review.opendev.org/c/openstack/oslo.messaging/+/914448 | 15:08 |
opendevreview | Takashi Kajinami proposed openstack/oslo.messaging stable/2024.1: kafka: Fix invalid hostaddr format for IPv6 address https://review.opendev.org/c/openstack/oslo.messaging/+/914449 | 15:08 |
opendevreview | Takashi Kajinami proposed openstack/oslo.messaging stable/2023.2: kafka: Fix invalid hostaddr format for IPv6 address https://review.opendev.org/c/openstack/oslo.messaging/+/914448 | 15:09 |
opendevreview | Takashi Kajinami proposed openstack/oslo.messaging stable/2023.1: kafka: Fix invalid hostaddr format for IPv6 address https://review.opendev.org/c/openstack/oslo.messaging/+/914450 | 15:10 |
crohmann | damani: tkajinam: I'd like to discuss "the" greenthreads issue with eventlet - https://bugs.launchpad.net/octavia/+bug/2039346. | 15:56 |
crohmann | hberaud and I discussed this in https://github.com/eventlet/eventlet/issues/432, and he suggested that some bugfix backports might be needed for Zed and Yoga. | 15:57 |
hberaud | crohmann: just replied to your latest comment https://github.com/eventlet/eventlet/issues/432#issuecomment-2023130882 | 15:58 |
crohmann | Ah, thx for your time and patience figuring this out and naming the right steps required. | 15:59 |
hberaud | np, you are welcome | 16:00 |
crohmann | I might need a moment to understand this fully. So you are saying https://opendev.org/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c17 should go into Zed and Yoga, but that then also needs the more recent bugfix on top? | 16:02 |
hberaud | crohmann: see my advice here => https://github.com/eventlet/eventlet/issues/432#issuecomment-2023145393 | 16:03 |
hberaud | the short response to your question is, yes! | 16:04 |
hberaud | The 3 patches I referred to in my last comment are all mandatory | 16:05 |
hberaud | They are the different pieces of the puzzle | 16:05 |
crohmann | Alright. Does it make sense for me to push the backport changes then? Or would you push them? Maybe also cluster them via a topic or Change-Id to make clear they belong together? | 16:06 |
crohmann | I am more than willing to do the work, but don't want to create more chaos and confusion in pushing backports. | 16:08 |
hberaud | I won't push these backports; I don't have enough bandwidth to manage them. So, if you want, feel free to cherry-pick them; otherwise, check with damani whether he can handle this story. | 16:09 |
hberaud | I already pinged damani earlier today about this story, but so far I've had no update from him, so I think you are safe proposing them. | 16:11 |
johnsom | Do we have a similar problem in oslo messaging? | 16:11 |
johnsom | https://opendev.org/openstack/oslo.messaging/src/branch/master/oslo_messaging/_utils.py#L25 | 16:11 |
johnsom | I see that global "try_import" in two places in oslo messaging | 16:12 |
hberaud | I don't think so, because we have this condition: `eventletutils.is_monkey_patched("thread")` | 16:12 |
hberaud | the problem in oslo.log was that the condition only relied on the import of eventlet | 16:13 |
johnsom | Ok, cool. I wasn't sure whether the simple fact of importing it led to issues as well or not. | 16:13 |
hberaud | so if eventlet was imported elsewhere, it would have been present in sys.modules, and so it would have been considered "enabled" by that poorly designed condition (in oslo.log) | 16:14 |
crohmann | hberaud: thanks again for your time. If I may ask, what's the best approach to bundle the three backports together? Use the same Change-Id? | 16:14 |
hberaud | the condition was similar to "if eventlet is in sys.modules, then we can consider that the env is monkey patched" | 16:15 |
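To make the distinction above concrete, here is a minimal sketch (illustrative only, not the actual oslo.log or oslo.messaging code; it assumes oslo.utils is installed) contrasting the fragile sys.modules heuristic with the `is_monkey_patched("thread")` check:

```python
# Minimal sketch contrasting the two checks discussed above.
import sys

from oslo_utils import eventletutils


def looks_like_eventlet():
    # The fragile heuristic: eventlet showing up in sys.modules only proves
    # that *something* imported it, not that this process is monkey patched.
    return "eventlet" in sys.modules


def thread_is_monkey_patched():
    # The safer check: ask oslo.utils whether the 'thread' primitives were
    # actually monkey patched, which is what the locking code cares about.
    return eventletutils.is_monkey_patched("thread")
```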
hberaud | simply cherry-picking each patch onto the same local branch and then submitting them with git review would do the job | 16:16 |
hberaud | 3 cherry-picks on a local development branch, followed by a git review | 16:17 |
hberaud | git review will manage all the Change-Id stuff for you | 16:17 |
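A rough illustration of the workflow described above; the branch name and commit SHAs below are hypothetical placeholders, not the actual reviews:

```shell
# Hypothetical example: backport three patches to stable/2023.1 as one series.
git checkout -b backport-eventlet-fixes origin/stable/2023.1
git cherry-pick -x <sha-of-patch-1>   # -x records the original commit hash
git cherry-pick -x <sha-of-patch-2>
git cherry-pick -x <sha-of-patch-3>
git review stable/2023.1              # git-review handles the Change-Ids and pushes the series
```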
crohmann | hberaud: johnsom: We've seen issues with RabbitMQ timeouts / connections lost (oslo.messaging), but I suppose the new pthread feature (https://review.opendev.org/q/topic:%22disable-green-threads%22) helps with that. | 16:19 |
crohmann | hberaud: I know how git review and cherry-picking works, but was just wondering if there was some special way to bundle multiple commits properly :-) | 16:20 |
hberaud | no, we don't have a specific process | 16:20 |
crohmann | Am I not seeing clearly or is there no stable/zed branch for oslo.log anymore? | 16:22 |
hberaud | ah sorry... I missed that point | 16:23 |
crohmann | https://opendev.org/openstack/oslo.log/branches | 16:23 |
hberaud | indeed, years ago we transitioned oslo.log to an independent deliverable, and then after 2 series we moved oslo.log (and some other oslo deliverables) back to the coordinated series | 16:24 |
hberaud | So we have no stable branches to target for those series | 16:25 |
crohmann | So nowhere to apply the patches to you mean? | 16:25 |
hberaud | only downstream packages can be patched | 16:25 |
hberaud | AFAIK yes | 16:25 |
JayF | hberaud: I know in Ironic, in the past we've suggested that deployers install a library outside of constraints to resolve some bugs when we've been in that situation | 16:25 |
crohmann | As in Ubuntu Cloud Archive you mean? | 16:25 |
JayF | hberaud: e.g. "If you install the newer version of sushy, it should resolve your bug" even if we can't test it or package it to install that way for other reasons | 16:26 |
JayF | I know that's not an ideal fix, but at least giving a workaround is better than no option whatsoever | 16:26 |
hberaud | JayF: ack, thanks for examples | 16:26 |
hberaud | crohmann: then maybe you could follow JayF's suggestions | 16:27 |
crohmann | For our own sake (our OpenStack cloud), we are also happy with all of these issues fixed from 2023.1 (Antelope) onwards. We will iterate through multiple updates since Zed will reach unmaintained state soon anyway. | 16:28 |
hberaud | That's why I moved the oslo libs back into the coordinated series; having them as independent deliverables was more annoying than anything else | 16:28 |
crohmann | After this discussion I now have a much better understanding of the issues. Well, we will then move quickly past Zed. I don't know if you still want to at least somehow get the word out to deployers about how to fix this on Zed? | 16:32 |
hberaud | it would be awesome to socialize this point with operators/deployers | 16:33 |
hberaud | though I've no idea how to spread the word in an official manner | 16:34 |
hberaud | using the ML won't really help, and the message would be lost in a couple of weeks | 16:35 |
hberaud | release notes are strongly tied to a specific version/series | 16:36 |
JayF | hberaud: in the past, we've been able to ask operator sigs to promote things on socials, but we usually use that for meetups and such | 16:36 |
JayF | hberaud: perhaps if you were able to write up something about it on a wiki page for oslo or in the docs somewhere, you could have it made into something for Socials ... maybe with a spin on it being "another good reason to upgrade"? | 16:37 |
hberaud | yeah, AFAICS that would be the most reasonable approach | 16:38 |
tkajinam | hberaud, I'm wondering if we really need https://opendev.org/openstack/oslo.log/commit/de615d9370681a2834cebe88acfa81b919da340c , since I suspect https://review.opendev.org/c/openstack/oslo.log/+/914190 would fix the original issue | 16:40 |
tkajinam | we can attempt to backport it, but I'm hesitant to rush it at this point while we haven't yet confirmed that the solution has no side effects... zed is moving to unmaintained soon (after 2024.1 GA), so if we break it now it's not likely to be fixed. | 16:41 |
tkajinam | unless someone steps up to maintain oslo's unmaintained branches | 16:41 |
tkajinam | (and I've now caught up with the discussion... | 16:44 |
hberaud | tkajinam: you are far more active than me on oslo, so if you think we should hold this topic for a bit, I trust you 100% | 16:47 |
hberaud | I think this is a wise decision | 16:48 |
tkajinam | :-) | 16:49 |
hberaud | however, that won't prevent us from updating the doc as Jay suggested | 16:49 |
hberaud | (IMO) | 16:49 |
tkajinam | I wonder how we can update the doc without a stable/zed branch | 16:49 |
tkajinam | probably the first step is to record these discussions in the bug as a placeholder ? | 16:49 |
hberaud | we could just put a comment on the latest version | 16:50 |
hberaud | tkajinam: indeed, that's a good starting point | 16:50 |
hberaud | even the best starting point | 16:50 |
tkajinam | probably but I find it a bit strange that we document a problem of an old version in the latest version. | 16:51 |
tkajinam | I'll give it another thought during my day | 16:51 |
hberaud | sure np | 16:51 |
hberaud | no rush | 16:51 |
hberaud | I tried to summarize the situation into the bugtracker https://bugs.launchpad.net/octavia/+bug/2039346/comments/15 | 17:06 |
JayF | hberaud: part of me wonders if an upgrade of the oslo.log package (and only that package) would fix the issue for them, too, but that is risky to suggest without testing :D | 18:06 |
hberaud | yep | 18:06 |
opendevreview | Merged openstack/oslo.messaging stable/2024.1: kafka: Fix invalid hostaddr format for IPv6 address https://review.opendev.org/c/openstack/oslo.messaging/+/914449 | 18:23 |
crohmann | sorry hberaud, I had to run earlier. I did read through the rest of the conversation; thank you for the summary in the ticket. | 18:48 |
sean-k-mooney | hberaud: just an FYI, nova also depends on multiple-reader support and hits `raise RuntimeError("Multiple readers are not yet supported by asyncio hub")` | 18:51 |
sean-k-mooney | because of the pipe_mutex stuff in oslo_log | 18:52 |
sean-k-mooney | https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_130/914108/5/check/tempest-full-py3/130576a/controller/logs/screen-n-api.txt | 18:52 |
sean-k-mooney | we can possibly work around that by setting heartbeat_in_pthread to false in the api | 18:53 |
sean-k-mooney | that is the only native thread we have in the wsgi applications | 18:54 |
sean-k-mooney | but we have others that we can't remove in other processes | 18:54 |
sean-k-mooney | ideally we should remove oslo_messaging_rabbit.heartbeat_in_pthread | 18:55 |
sean-k-mooney | it's caused a bunch of issues when it's set to true, so I would never recommend doing that personally | 18:56 |
sean-k-mooney | that wasn't why the pipe mutex stuff was added to oslo.log, however | 18:56 |
sean-k-mooney | hberaud: we will need to modify https://review.opendev.org/c/openstack/oslo.log/+/852443 or add support for multiple readers to proceed with the asyncio hub work | 18:58 |
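For reference, the heartbeat_in_pthread workaround sean-k-mooney mentions above would look roughly like this in the API service's configuration (a hedged sketch only; nova.conf is shown as an example and the default varies by release, so check your deployment before applying):

```ini
# Example only: keep the RabbitMQ heartbeat in a green thread for the
# eventlet/WSGI-based API process instead of a native pthread.
[oslo_messaging_rabbit]
heartbeat_in_pthread = false
```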
opendevreview | Merged openstack/oslo.messaging master: Fix incorrect desc of rabbit_stream_fanout option https://review.opendev.org/c/openstack/oslo.messaging/+/913870 | 20:08 |