19:01:30 #startmeeting infra
19:01:30 Meeting started Tue Aug 9 19:01:30 2022 UTC and is due to finish in 60 minutes. The chair is clarkb. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:01:30 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
19:01:30 The meeting name has been set to 'infra'
19:01:47 #link https://lists.opendev.org/pipermail/service-discuss/2022-August/000351.html
19:01:59 ahoy
19:02:07 \o
19:02:10 #topic Announcements
19:02:31 First up we are currently in the middle of the opendev service coordinator nomination period. It will run through August 16, 2022
19:02:36 #link https://lists.opendev.org/pipermail/service-discuss/2022-July/000347.html for details.
19:02:44 If you are interested please go for it :)
19:03:05 The other thing to announce (which should've gone on the meeting agenda) is that the PTG is a virtual event now
19:04:15 That doesn't change a whole lot for us
19:04:32 But we should probably expect more people trying to use meetpad during that week (and ptgbot was going to be used either way)
19:04:48 More details on that change will be coming out as things get sorted out
19:06:27 #topic Updating Grafana Management Tooling
19:06:37 ianw: I see some but not all of the changes have merged?
19:06:41 #link https://review.opendev.org/q/topic:grafana-json
19:06:56 I think there were problems with newer grafana image updates (they release betas under the :latest tag?)
19:07:07 Any chance you and/or frickler can catch us up on this item?
19:07:14 yes, unclear if that is a bug or a feature
19:07:18 the release of betas
19:08:33 anyway, i dug into that and it got fixed
19:08:36 #link https://github.com/grafana/grafana/issues/53275
19:08:50 the last thing is cleaning up the jobs
19:09:33 And that appears to be two changes that just need reviews
19:09:39 #link https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/851951
19:09:46 #link https://review.opendev.org/c/openstack/project-config/+/851954
19:10:13 i just went and switched their topics to grafana-json, sorry, to keep them together
19:10:28 but yeah, a couple of eyes on that and i think this is done
19:10:30 https://grafana.opendev.org/d/f3089338b3/nodepool-dib-status?orgId=1 that does show different colors for build status now. Which is sort of what kicked this all off again
19:10:41 Thank you for getting this done
19:10:58 heh, yeah, i had it before that but trying to fix that was what pushed me to get it in :)
19:11:19 did we switch to running grafana:latest again? do we want that?
19:11:48 frickler: i think we probably do want that, otherwise we fall behind and it's just more painful
19:11:50 I'd actually prefer to avoid beta versions if possible
19:12:06 but it's a thin line, I admit
19:12:27 seems like they need not only a grafana:latest but also a grafana:greatest
19:12:39 i feel like it gives us generally one problem to sort out at a time, instead of updating every X months/years and having many problems all together
19:12:48 Looks like they don't have an 8 series tag or similar that we can just hang out on? But ya I think chances are we'll just end up super out of date if we aren't on latest
19:12:52 there was an issue opened about them releasing the beta ...
19:13:23 #link https://github.com/grafana/grafana/discussions/47177
19:14:35 cool sounds like there are some people asking for a stable tag which would probably also work for us
19:14:37 i guess ideal for us would be production on a stable, and we can run a -devel job
19:15:15 i mean, it might be said we have better CI than upstream, since we found and bisected it before any of its testing noticed :)
19:15:57 heh, but ya seems like that discussion is headed the right direction for our needs. Hopefully they enact the requested changes
19:15:59 (let they who have not released something buggy cast the first stone ... but we can certainly help :)
19:16:06 given their business model, making it easier to run up-to-date stable systems may not be aligned with their agenda anyway
19:16:36 Anything else on this subject?
19:16:47 not from me, thanks
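A minimal sketch of the "production on a stable tag, -devel job on latest" idea discussed above, assuming a plain Docker pull based deployment; the release tag shown is illustrative only, not what system-config actually pins:

    # Production: pin an explicit upstream release so beta builds published
    # under :latest are never picked up (the tag here is an example only).
    docker pull docker.io/grafana/grafana:9.0.0

    # A separate -devel/canary job could keep tracking :latest to catch
    # upstream breakage early, as happened with the beta release above.
    docker pull docker.io/grafana/grafana:latest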
19:16:56 #topic Bastion Host Updates
19:17:16 I haven't seen any movement on this (which is fine), but I've also been focused on mailman3 stuff so wanted to double check I didn't miss anything important
19:17:21 anything new to add to this?
19:17:59 yeah i've started looking at getting ansible into a venv, to isolate things better
19:18:21 #link https://review.opendev.org/q/topic:bridge-ansible-venv
19:18:34 so far it's cleanups
19:18:45 But they are ready for review?
19:19:32 those bits can be reviewed, they're independent
19:19:45 great. I'll put them on my list
19:20:00 it is tangential, but related
19:20:02 #link https://review.opendev.org/q/topic:gra1-bump-timeouts
19:20:20 (because these changes are all running jobs that sometimes fail)
19:20:42 oh I completely missed there was a parent change to review there
19:20:50 That's what I get for doing code review first thing in the morning
19:20:53 that became a bit of a yak shaving exercise because updating the borg job ran it, which made me see it was failing
19:22:27 i also now have borg 1.2 updates on my todo list. however the first item in the upstream upgrade check-list is "do you really want to run borg 1.2, the 1.1 branch is still getting fixes" ... so that didn't exactly scream "do this now" :)
19:22:51 I guess that is good of them to call out :)
19:23:13 i missed the parent change as well
19:23:42 i think we've seen that before, that old pip doesn't know to not try and install later packages?
19:24:04 yeah, it's the abi3 stuff
19:24:17 newer wheels can declare they support abi3 rather than specific interpreter versions
19:24:25 but older pip knows nothing about abi3
19:24:39 there is also the thing where old pip doesn't know to check the required python version metadata on pypi
19:24:52 yeah, if it's too old, there's that as well
19:24:56 yeah i think it's the metadata here
19:25:05 oof
19:25:39 though usually not checking requires_python metadata means you end up trying to install too-new packages which lack support for the interpreter version you have
19:25:45 ya then that fails
19:25:53 er the software itself fails to run
19:26:37 in any case I'll need to review the changes to page in the context here
19:26:41 it did make me think that ensure-pip in zuul-jobs exports the command to get a virtualenv
19:26:41 I'll try to do that soon
19:27:26 which we could make "bash -c 'python -m venv ; ./venv/bin/pip install --upgrade pip setuptools;" ... maybe
19:27:43 i don't know if we could template in the path
19:27:59 anyway, no need to discuss here, but something to keep an eye on i guess
19:28:25 it's probably actually only bionic that has this issue with old versions, so impact is limited
19:28:41 ok
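A rough sketch of the venv bootstrap being discussed, assuming a hypothetical path and package list rather than the actual bridge layout or whatever ensure-pip ends up exporting: create the virtualenv, then upgrade pip and setuptools inside it first so that abi3 wheel tags and requires-python metadata are understood even with the old distro pip on bionic.

    # Create an isolated environment for ansible (path is hypothetical).
    python3 -m venv /opt/ansible-venv

    # Upgrade the installer tooling before anything else so newer wheel
    # metadata (abi3 tags, requires-python) is honoured.
    /opt/ansible-venv/bin/pip install --upgrade pip setuptools

    # Then install ansible into the venv rather than onto the system python.
    /opt/ansible-venv/bin/pip install ansible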
19:28:57 #topic Upgrading Bionic Servers to Focal/Jammy
19:29:02 (just hitting us because our bridge is that ... so back to the topic :)
19:29:05 #link https://etherpad.opendev.org/p/opendev-bionic-server-upgrades Notes on the work that needs to be done.
19:29:26 The changes I had made to support mailman3 on jammy which would generally support jammy updates have landed
19:29:35 ianw: that borg work is also related ?
19:29:58 yeah, there's a follow-on that adds borg testing on a jammy node. no reason it wouldn't work, but good to cover it
19:30:11 got it and ++ to having test coverage there
19:30:32 I think if we can avoid updating to focal from bionic and jump to jammy we'll save ourselves future work so getting the jammy bootstrapping done is worthwhile
19:31:56 As mentioned I've been somewhat focused on mailman3 stuff but I think that is starting to solidify so I'm hoping to have time for an actual upgrade or two in the near future
19:32:03 But I didn't have anything else on this topic
19:32:08 #topic Mailman 3
19:32:13 #link https://review.opendev.org/c/opendev/system-config/+/851248 WIP change to deploy a mailman 3 instance
19:32:49 I think the deployment aspects of this change are largely done at this point. (though I may have discovered a new bug I'll push a silly workaround for shortly).
19:33:19 There is still a fair bit of work to do around figuring out how we want to configure mailman3 and what our list settings should look like. But I've got a skeleton framework for addressing that in ansible in the change as well
19:33:51 There is a held node 104.130.26.212 which we'll use to try and answer some of those questions. I'm enlisting fungi's help because it involves email and things that I just don't understand as well as others
19:34:08 If you are interested I think reviewing the change at this point and/or checking out the server would be helpful
19:35:20 yeah, i have a basic test scenario from my earlier mm3 poc i want to run through, creating test lists on multiple domains
19:35:30 manual testing, that is
19:35:55 also don't worry about breaking things. Holding a new node isn't difficult
19:36:19 definitely poke at it and see what we can improve. Ideally when we get around to doing the migration people will for the most part not notice other than that the UI and user database has changed
19:37:57 I am slightly concerned that the mailman service is going from simple and straightforward to giant django monolith. But the new thing should support our use cases better and it has a future so I'm rolling with it
19:38:17 django in particular is not the easiest thing to automate around which the ansible illustrates
19:39:44 yeah they love a big old turing complete configuration file
19:40:07 you could say the same of exim
19:40:36 * fungi won't speak ill of exim though
19:41:38 ha
19:41:48 I think that is all on mailman3 mostly just it is ready for your feedback and help :)
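One possible way to poke at the held node mentioned above, purely as a local testing convenience and not any kind of DNS change; the domain used is just an example of whichever list domain is being exercised:

    # Point the list domain at the held node locally (hypothetical workflow).
    echo '104.130.26.212 lists.opendev.org' | sudo tee -a /etc/hosts

    # Check that the Mailman 3 web UI answers over HTTPS; -k because the held
    # node will not have a production certificate for that name.
    curl -skI https://lists.opendev.org/ | head -n 1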
19:41:51 #topic Gitea 1.17
19:41:58 Gitea 1.17 has been out for about a week now
19:42:06 #link https://review.opendev.org/c/opendev/system-config/+/847204
19:42:12 We have a change to upgrade to it if we like.
19:42:26 However it looks like 1.17.1 is in the works and will include a number of bugfixes
19:42:34 #link https://github.com/go-gitea/gitea/milestone/122 1.17.1 Milestone is in progress
19:43:08 We're probably better off just waiting for that since there isn't a pressing need to upgrade right now. The change for 1.17.0 shouldn't be much different than the one for 1.17.1 though. The only difference I expect is the tag version in the docker file
19:43:16 that means if you want to review that now it won't be wasted effort
19:43:56 #topic Open Discussion
19:44:08 That was everything I had. Anything else before we find $meal?
19:46:01 i didn't have anything
19:46:27 nope. i still need to finish up the ansible-lint upgrades, but thanks for reviews on that one last week
19:46:54 #link https://github.com/ansible/ansible/issues/78423
19:47:08 if you're interested in the python/ansible versions interaction
19:48:45 wow ansible 5 will still talk to python2.6 on the target nodes
19:50:18 Sounds like that is it. Thank you everyone!
19:50:25 We'll be back here next week same time and location
19:50:26 thanks clarkb!
19:50:26 i don't think we ever had py26 support in zuul-jobs?
19:50:37 corvus: we didn't. Mostly just surprised that upstream does it
19:50:46 #endmeeting