15:00:13 #startmeeting Ironic
15:00:13 Meeting started Mon May 19 15:00:13 2025 UTC and is due to finish in 60 minutes. The chair is JayF. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00:13 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00:13 The meeting name has been set to 'ironic'
15:00:19 o/
15:00:20 o/
15:00:22 Welcome to the weekly Ironic meeting, I'm Jay, I'll be your host today :D
15:00:23 o/
15:00:24 o/
15:00:30 \o
15:00:34 o/
15:00:34 As always our meetings are operated under the OpenInfra Code of Conduct
15:00:41 #note Standing reminder to review patches tagged ironic-week-prio and to hashtag any patches ready for review with ironic-week-prio: https://tinyurl.com/ironic-weekly-prio-dash
15:01:00 #note It's R-19 week! 2025.2 Flamingo Release Schedule https://releases.openstack.org/flamingo/schedule.html
15:01:14 Honestly this is usually the point where I'd say we'll wait for quorum
15:01:19 but I think we're already loaded in :D
15:01:29 #topic Working Group Updates: Standalone Networking
15:01:31 (o/)
15:01:36 Any update on Standalone Networking work?
15:01:52 not from me. alegacy?
15:03:09 I'm going to assume there's nothing to update; we can revisit if folks come around later.
15:03:19 #topic Discussion Topics: Phasing out Python 3.9 in Tests
15:03:26 There are several related PRs linked on this topic
15:03:38 #link https://review.opendev.org/c/openstack/ironic-python-agent-builder/+/950152 use Python 3.12 in DIB CS9 image
15:03:51 #link https://review.opendev.org/c/openstack/ironic-python-agent-builder/+/950235 remove tinycore tests (does not support Python > 3.9)
15:04:03 #link https://review.opendev.org/c/openstack/bifrost/+/949861 pinning UCs in bifrost CS9 jobs
15:04:17 yeah they just represent what's currently happening
15:04:17 python 3.9 support in UCs is not going to last for long
15:04:29 It seems like, at least to me, there's general consensus we're likely better off removing our need for a radically tiny ramdisk than trying to chase tinyipa or a replacement forever
15:04:43 Now would be a great time to make noise if you don't agree :)
15:04:53 it is really a huge effort to use anything higher than Python 3.9 in tinycore at the moment
15:05:19 My research into using Gentoo could get us a smaller one, but not small enough to prevent our need to rework the tests
15:05:20 Is there any official position by the TCL team?
15:05:41 meaning that you need to compile it yourself or use a tool like pyenv, which is still a big effort due to the chroot spaghetti stuff
15:06:02 https://forum.tinycorelinux.net/index.php/board,31.0.html they appear to release approximately once a year
15:06:12 and the most recent update was 4/26/2025 with no reference to python
15:06:16 yep
15:06:25 I'll note I also don't know tinycore well enough to know if that is separate
15:06:33 fun fact: python 3.12 for tinycore exists only for ARM
15:06:43 we can run tests under arm :P
15:06:58 me shrugs
15:06:59 I prefer the approach of getting multinode working and trying to rework our tests to use realistic ramdisks
15:07:04 it moves us closer to what our customers use
15:07:08 sounds great to me
15:07:19 gets us into the business of making our software/CI better rather than chasing a distro which I think none of us love :)
15:07:40 I used to love it, now it's yet another Stockholm syndrome I have :D
15:08:00 It's (seemingly) one of those things that's really cool as a one-time project; really painful to maintain for years
15:08:14 I agree
15:08:44 So is there anything else to talk about around the discussion topic of python 3.9? I'll give a couple minutes
15:11:01 Aight, next topic
15:11:05 #topic Bug Deputy Updates
15:11:12 1 new bug: https://bugs.launchpad.net/ironic/+bug/2110916
15:11:15 o/
15:11:20 1 new RFE: https://bugs.launchpad.net/ironic/+bug/2110694
15:11:29 3 bugs closed, 3 triaged
15:11:39 Whoever was the bug deputy feel free to speak up with more detail :)
15:11:45 it was me
15:11:46 Also we need a bug deputy for next week
15:11:52 \o
15:12:13 so last Wednesday Michael Sherman (not sure about his IRC nick) created this RFE
15:12:44 The RFE seems sensible, I just wanna make sure it's unique :D
15:13:10 Yeah the real hard part of the RFE is around handling credentials and uploading the image to glance
15:13:15 based on the details, seems like we retired, so maybe he wants to work on it?
15:13:31 wondering how we should reach out to him about it
15:13:34 I am +1 to the feature existing, a little nervous about someone implementing it specless
15:13:47 agree
15:14:21 I can add the needs-spec to it
15:14:45 and mention we briefly talked at the weekly meeting
15:14:47 Yeah, especially since there is prior art already
15:15:15 o/
15:15:19 sorry, got super distracted this morning
15:15:26 np TheJulia o/
15:15:35 Sounds like needs-spec is the answer, and I think he was already down that path.
15:15:40 \o
15:15:50 nick should be shermanm[m]
15:16:01 Do we have someone who wants to bug deputy this week?
15:16:15 I can watch the bugs this week
15:16:17 I'll note for US-ians: many of us will have Monday off next week (Memorial Day). So take this into consideration when volunteering
15:16:21 frickler, ack
15:16:31 Skipping RFE review, we just kinda did that in the bug deputy section
15:16:42 #note Next bug deputy: Julia
15:16:45 #topic Open Discussion
15:16:59 re: next week's Monday being Memorial Day in the US, would we like to keep the meeting?
15:17:02 I will not be here.
15:17:35 I should not be here next Monday :)
15:17:44 I may take Friday off as well, who knows!
15:17:59 I'd be +1 to just cancelling next week
15:18:33 fine by me
15:18:33 I'll be here but I'm fine with cancelling
15:18:42 ++
15:19:02 Julia Kreger proposed openstack/ironic master: DNM: CI Science - Expand the multinode job https://review.opendev.org/c/openstack/ironic/+/950206
15:19:07 JayF, you want to send an email or should I?
15:19:08 #note Meeting next week cancelled due to US Memorial Day holiday. Next meeting June 2 (if my math is right :D)
15:19:08 cancel, ++
15:19:12 iurygregory: I'll do it
15:19:17 ack
15:19:18 Anything else for Open Discussion?
15:20:20 CI related
15:20:20 something odd with metal3 jobs
15:20:42 not sure if anyone else noticed but there are auth issues with sushy
15:21:04 like 2025-05-19 13:20:52.997 1 WARNING sushy.connector [None req-f7d87eb6-999d-4248-a780-1a044a68911a - - - - - -] Session authentication appears to have been lost at some point in time. Connectivity may have been lost during a prior session refresh. Attempting to re-authenticate.: sushy.exceptions.AccessError: HTTP POST https://192.168.111.1:8000/redfish/v1/SessionService/Sessions returned code 401. Base.1.0.GeneralError:
15:21:04 Authorization required Extended information: [{'@odata.type': '/redfish/v1/$metadata#Message.1.0.0.Message', 'MessageId': 'Base.1.0.GeneralError'}]
15:21:39 please check the other jobs as well
15:21:39 just want to rule out that it's Python 3.12-only
15:22:13 if anyone wants to have a look see the ironic logs here https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_3ed/openstack/3ed3085139af4072b86cd6c990706577/controller/before_pivoting/ironic.log
15:23:00 That is a super weird edge to fall into to get that error
15:23:16 :(
15:23:34 yeah
15:23:34 I don't get it
15:23:34 it was working until yesterday?
15:23:51 so we should check for requirements/constraints changes?
15:24:26 https://opendev.org/openstack/requirements/commits/branch/master looks like a lot of bumps for internal libraries two days ago
15:24:36 inc keystoneauth1/keystonemiddleware
15:24:54 might be a place to dig from, if you could lock to the u-c from three days ago and see if it repros
15:25:51 latest uc updated sushy to 5.6.0
15:25:52 well, if memory serves the way to get into that case is you think you've authenticated, but in the process you lost connectivity *or* the remote side never confirmed your session
15:26:17 so when you try to use what you believe is a valid session, you get an error and need to re-authenticate; the code *should* be re-authenticating
15:27:27 sushy-tools logs look very weird
15:27:35 https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_3ed/openstack/3ed3085139af4072b86cd6c990706577/controller/before_pivoting/sushy-tools.log
15:28:12 maybe time to add more debug logging to sushy-tools? I'm guessing that is all re-auth requests but why are we getting 401s being sent back
15:28:16 that seems like a bug
15:28:45 yeah
15:29:34 I have to drop in like 10 minutes, I can get another look tomorrow if no one gets to it before
15:29:46 May I suggest we continue this troubleshooting async/outside of the meeting then?
15:30:34 ++
15:30:50 yep
15:30:57 yeah, wrap the meeting :)
15:30:58 #endmeeting
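
Note on the re-authentication flow discussed at 15:25-15:26: the situation described is a client holding a Redfish session token it believes is valid, receiving a 401 when it uses it, and then needing to create a new session and retry. The sketch below illustrates that pattern only; it is not sushy's actual connector code, the class and method names are invented for illustration, and the only parts taken from the meeting are the SessionService endpoint and session-token usage that follow the standard Redfish convention visible in the log excerpt above.

    # Minimal sketch (assumed requests-based client) of "re-authenticate on 401".
    # Not sushy's real connector; for illustration of the flow discussed above.
    import requests

    class RedfishSessionClient:
        def __init__(self, base_url, username, password, verify=False):
            self.base_url = base_url.rstrip('/')
            self.username = username
            self.password = password
            self.verify = verify
            self.session_token = None

        def _create_session(self):
            # POST to the SessionService, the same call that returned 401 in the CI log.
            resp = requests.post(
                f"{self.base_url}/redfish/v1/SessionService/Sessions",
                json={"UserName": self.username, "Password": self.password},
                verify=self.verify,
            )
            resp.raise_for_status()
            # Redfish returns the session token in the X-Auth-Token response header.
            self.session_token = resp.headers["X-Auth-Token"]

        def get(self, path):
            if self.session_token is None:
                self._create_session()
            headers = {"X-Auth-Token": self.session_token}
            resp = requests.get(f"{self.base_url}{path}",
                                headers=headers, verify=self.verify)
            if resp.status_code == 401:
                # The session we thought was valid was lost (e.g. connectivity
                # dropped during a refresh, or the BMC never confirmed it), so
                # re-authenticate once and retry -- the behaviour the sushy
                # connector warning says it is attempting.
                self._create_session()
                headers = {"X-Auth-Token": self.session_token}
                resp = requests.get(f"{self.base_url}{path}",
                                    headers=headers, verify=self.verify)
            resp.raise_for_status()
            return resp.json()

In the failing job, the log excerpt shows the session-creation POST itself returning 401, so a retry of this kind could never recover; that is why adding more debug logging on the sushy-tools side was suggested before continuing the troubleshooting asynchronously.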