19:01:30 <clarkb> #startmeeting infra
19:01:30 <opendevmeet> Meeting started Tue Aug  9 19:01:30 2022 UTC and is due to finish in 60 minutes.  The chair is clarkb. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:01:30 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
19:01:30 <opendevmeet> The meeting name has been set to 'infra'
19:01:47 <clarkb> #link https://lists.opendev.org/pipermail/service-discuss/2022-August/000351.html
19:01:59 <fungi> ahoy
19:02:07 <frickler> \o
19:02:10 <clarkb> #topic Announcements
19:02:31 <clarkb> First up we are currently in the middle of the opendev service coordinator nomination period. It will run through August 16, 2022
19:02:36 <clarkb> #link https://lists.opendev.org/pipermail/service-discuss/2022-July/000347.html for details.
19:02:44 <clarkb> If you are interested please go for it :)
19:03:05 <clarkb> The other thing to announce (which should've gone on the meeting agenda) is that the PTG is a virtual event now
19:04:15 <clarkb> That doesn't change a whole lot for us
19:04:32 <clarkb> But we should probably expect more people trying to use meetpad during that week (and ptgbot was going to be used either way)
19:04:48 <clarkb> More details on that change will be coming out as things get sorted out
19:06:27 <clarkb> #topic Updating Grafana Management Tooling
19:06:37 <clarkb> ianw: I see some but not all of the changes have merged?
19:06:41 <clarkb> #link https://review.opendev.org/q/topic:grafana-json
19:06:56 <clarkb> I think there were problems with newer grafana image updates (they release betas under the :latest tag?)
19:07:07 <clarkb> Any chance you and/or frickler can catch us up on this item?
19:07:14 <ianw> yes, unclear if that is a bug or a feature
19:07:18 <ianw> the release of betas
19:08:33 <ianw> anyway, i dug into that and it got fixed
19:08:36 <ianw> #link https://github.com/grafana/grafana/issues/53275
19:08:50 <ianw> the last thing is cleaning up the jobs
19:09:33 <clarkb> And that appears to be two changes that just need reviews
19:09:39 <clarkb> #link https://review.opendev.org/c/openstack/openstack-zuul-jobs/+/851951
19:09:46 <clarkb> #link https://review.opendev.org/c/openstack/project-config/+/851954
19:10:13 <ianw> i just went and switched their topics to grafana-json, sorry, to keep them together
19:10:28 <ianw> but yeah, a couple of eyes on that and i think this is done
19:10:30 <clarkb> https://grafana.opendev.org/d/f3089338b3/nodepool-dib-status?orgId=1 that does show different colors for build status now. Which is sort of what kicked this all off again
19:10:41 <clarkb> Thank you for getting this done
19:10:58 <ianw> heh, yeah, i had it before that but trying to fix that was what pushed me to get it in :)
19:11:19 <frickler> did we switch to running grafana:latest again? do we want that?
19:11:48 <ianw> frickler: i think we probably do want that, otherwise we fall behind and it's just more painful
19:11:50 <frickler> I'd actually prefer to avoid beta versions if possible
19:12:06 <frickler> but it's a thin line, I admit
19:12:27 <fungi> seems like they need not only a grafana:latest but also a grafana:greatest
19:12:39 <ianw> i feel like it gives us generally one problem to sort out at a time, instead of updating every X months/years and having many problems all together
19:12:48 <clarkb> Looks like they don't have an 8-series tag or similar that we can just hang out on? But ya I think chances are we'll just end up super out of date if we aren't on latest
19:12:52 <ianw> there was an issue opened about them releasing the beta ...
19:13:23 <ianw> #link https://github.com/grafana/grafana/discussions/47177
19:14:35 <clarkb> cool sounds like there are some people asking for a stable tag which would probably also work for us
19:14:37 <ianw> i guess ideal for us would be production on a stable, and we can run a -devel job
19:15:15 <ianw> i mean, it might be said we have better CI than upstream, since we found and bisected it before any of its testing noticed :)
19:15:57 <clarkb> heh, but ya seems like that discussion is headed the right direction for our needs. Hopefully they enact the requested changes
19:15:59 <ianw> (let they who have not released something buggy cast the first stone ... but we can certainly help :)
19:16:06 <fungi> given their business model, making it easier to run up-to-date stable systems may not be aligned with their agenda anyway
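(A minimal sketch of the pinned-tag approach discussed above, assuming the upstream grafana/grafana-oss image on Docker Hub; the specific tag is hypothetical, since at the time only :latest and per-release tags were published, not a maintained stable-series tag.)

```bash
# Sketch only: pin production to a specific release tag instead of :latest,
# while a separate periodic "-devel" job keeps exercising :latest.
# The image name and tag here are illustrative, not what system-config uses.
PINNED=grafana/grafana-oss:9.0.7          # hypothetical known-good release
docker pull "$PINNED"
docker pull grafana/grafana-oss:latest    # what the -devel job would track
```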
19:16:36 <clarkb> Anything else on this subject?
19:16:47 <ianw> not from me, thanks
19:16:56 <clarkb> #topic Bastion Host Updates
19:17:16 <clarkb> I haven't seen any movement on this (which is fine), but I've also been focused on mailman3 stuff so wanted to double check I didn't miss anything important
19:17:21 <clarkb> anything new to add to this?
19:17:59 <ianw> yeah i've started looking at getting ansible into a venv, to isolate things better
19:18:21 <ianw> #link https://review.opendev.org/q/topic:bridge-ansible-venv
19:18:34 <ianw> so far it's cleanups
19:18:45 <clarkb> But they are ready for review?
19:19:32 <ianw> those bits can be reviewed, they're independent
19:19:45 <clarkb> great. I'll put them on my list
19:20:00 <ianw> it is tangential, but related
19:20:02 <ianw> #link https://review.opendev.org/q/topic:gra1-bump-timeouts
19:20:20 <ianw> (because these changes are all running jobs that sometimes fail)
19:20:42 <clarkb> oh I completely missed there was a parent change to review there
19:20:50 <clarkb> That's what I get for doing code review first thing in the morning
19:20:53 <ianw> that became a bit of a yak shaving exercise because updating the borg job ran it, which made me see it was failing
19:22:27 <ianw> i also now have borg 1.2 updates on my todo list.  however the first item in the upstream upgrade check-list is "do you really want to run borg 1.2, the 1.1 branch is still getting fixes" ... so that didn't exactly scream "do this now" :)
19:22:51 <clarkb> I guess that is good of them to call out :)
19:23:13 <fungi> i missed the parent change as well
19:23:42 <ianw> i think we've seen that before, that old pip doesn't know to not try and install later packages?
19:24:04 <fungi> yeah, it's the abi3 stuff
19:24:17 <fungi> newer wheels can declare they support abi3 rather than specific interpreter versions
19:24:25 <fungi> but older pip knows nothing about abi3
19:24:39 <clarkb> there is also the thing where old pip doesn't know to check the required python version metadata on pypi
19:24:52 <fungi> yeah, if it's too old, there's that as well
19:24:56 <ianw> yeah i think its metadata here
19:25:05 <fungi> oof
19:25:39 <fungi> though usually not checking requires_python metadata means you end up trying to install too-new packages which lack support for the interpreter version you have
19:25:45 <clarkb> ya then that fails
19:25:53 <clarkb> er the software itself fails to run
19:26:37 <clarkb> in any case I'll need to review the changes to page in the context here
19:26:41 <ianw> it did make me think that ensure-pip in zuul-jobs exports the command to get a virtualenv
19:26:41 <clarkb> I'll try to do that soon
19:27:26 <ianw> which we could make "bash -c 'python -m venv <path>; <path>/bin/pip install --upgrade pip setuptools'" ... maybe
19:27:43 <ianw> i don't know if we could template in the path
19:27:59 <ianw> anyway, no need to discuss here, but something to keep an eye on i guess
19:28:25 <ianw> it's probably actually only bionic that has this issue with old versions, so impact is limited
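(Roughly the shape of the wrapper described above, as a sketch only; the venv path and package name are placeholders, not anything ensure-pip actually exports.)

```bash
# Create a venv and upgrade pip/setuptools before installing anything else,
# so the resolver understands requires_python metadata and abi3 wheel tags.
VENV=/path/to/venv                        # placeholder path
python3 -m venv "$VENV"
"$VENV"/bin/pip install --upgrade pip setuptools
"$VENV"/bin/pip install some-package      # placeholder for whatever actually needs installing
```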
19:28:41 <clarkb> ok
19:28:57 <clarkb> #topic Upgrading Bionic Servers to Focal/Jammy
19:29:02 <ianw> (just hitting us because our bridge is that ... so back to the topic :)
19:29:05 <clarkb> #link https://etherpad.opendev.org/p/opendev-bionic-server-upgrades Notes on the work that needs to be done.
19:29:26 <clarkb> The changes I had made to support mailman3 on jammy which would generally support jammy updates have landed
19:29:35 <clarkb> ianw: that borg work is also related ?
19:29:58 <ianw> yeah, there's a follow-on that adds borg testing on a jammy node.  no reason it wouldn't work, but good to cover it
19:30:11 <clarkb> got it and ++ to having test coverage there
19:30:32 <clarkb> I think if we can avoid updating to focal from bionic and jump to jammy we'll save ourselves future work so getting the jammy bootstrapping done is worthwhile
19:31:56 <clarkb> As mentioned I've been somewhat focused on mailman3 stuff but I think that is starting to solidify so I'm hoping to have time for an actual upgrade or two in the near future
19:32:03 <clarkb> But I didn't have anything else on this topic
19:32:08 <clarkb> #topic Mailman 3
19:32:13 <clarkb> #link https://review.opendev.org/c/opendev/system-config/+/851248 WIP change to deploy a mailman 3 instance
19:32:49 <clarkb> I think the deployment aspects of this change are largely done at this point. (though I may have discovered a new bug I'll push a silly workaround for shortly).
19:33:19 <clarkb> There is still a fair bit of work to do around figuring out how we want to configure mailman3 and what our list settings should look like. But I've got a skeleton framework for addressing that in ansible in the change as well
19:33:51 <clarkb> There is a held node 104.130.26.212 which we'll use to try and answer some of those questions. I'm enlisting fungi's help because it involves email and things that I just don't understand as well as others
19:34:08 <clarkb> If you are interested I think reviewing the change at this point and/or checking out the server would be helpful
19:35:20 <fungi> yeah, i have a basic test scenario from my earlier mm3 poc i want to run through, creating test lists on multiple domains
19:35:30 <fungi> manual testing, that is
19:35:55 <clarkb> also don't worry about breaking things. Holding a new node isn't difficult
19:36:19 <clarkb> definitely poke at it and see what we can improve. Ideally when we get around to doing the migration people will for the most part not notice other than that the UI and user database has changed
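(One possible way to poke at the held node, as a hedged sketch; the compose path and the mailman-core service name are assumptions based on the docker-mailman style images, and the test list name is made up.)

```bash
# Assumed service name and paths; adjust to whatever the change actually deploys.
ssh root@104.130.26.212
cd /path/to/compose/dir                          # placeholder for the compose directory
docker-compose exec mailman-core mailman info    # core version and REST API endpoint
docker-compose exec mailman-core mailman lists   # lists created so far
docker-compose exec mailman-core mailman create test@lists.example.org   # hypothetical test list
```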
19:37:57 <clarkb> I am slightly concerned that the mailman service is going from simple and straightforward to giant django monolith. But the new thing should support our use cases better and it has a future so I'm rolling with it
19:38:17 <clarkb> django in particular is not the easiest thing to automate around which the ansible illustrates
19:39:44 <ianw> yeah they love a big old turing complete configuration file
19:40:07 <fungi> you could say the same of exim
19:40:36 * fungi won't speak ill of exim though
19:41:38 <clarkb> ha
19:41:48 <clarkb> I think that is all on mailman3, mostly it is just ready for your feedback and help :)
19:41:51 <clarkb> #topic Gitea 1.17
19:41:58 <clarkb> Gitea 1.17 has been out for about a week now
19:42:06 <clarkb> #link https://review.opendev.org/c/opendev/system-config/+/847204
19:42:12 <clarkb> We have a change to upgrade to it if we like.
19:42:26 <clarkb> However it looks like 1.17.1 is in the works and will include a number of bugfixes
19:42:34 <clarkb> #link https://github.com/go-gitea/gitea/milestone/122 1.17.1 Milestone is in progress
19:43:08 <clarkb> We're probably better off just waiting for that since there isn't a pressing need to upgrade right now. The change for 1.17.0 shouldn't be much different than the one for 1.17.1 though. The only difference I expect is the tag version in the Dockerfile
19:43:16 <clarkb> that means if you want to review that now it won't be wasted effort
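(For anyone watching for the point release, a quick check against the upstream repo looks something like this.)

```bash
# List the v1.17.x tags that exist upstream so far:
git ls-remote --tags https://github.com/go-gitea/gitea.git 'v1.17*'
```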
19:43:56 <clarkb> #topic Open Discussion
19:44:08 <clarkb> That was everything I had. Anything else before we find $meal?
19:46:01 <fungi> i didn't have anything
19:46:27 <ianw> nope.  i still need to finish up the ansible-lint upgrades, but thanks for reviews on that one last week
19:46:54 <ianw> #link https://github.com/ansible/ansible/issues/78423
19:47:08 <ianw> if you're interested in the python/ansible versions interaction
19:48:45 <clarkb> wow ansible 5 will still talk to python2.6 on the target nodes
19:50:18 <clarkb> Sounds like that is it. Thank you everyone!
19:50:25 <clarkb> We'll be back here next week same time and location
19:50:26 <fungi> thanks clarkb!
19:50:26 <corvus> i don't think we ever had py26 support in zuul-jobs?
19:50:37 <clarkb> corvus: we didn't. Mostly just surprised that upstream does it
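(A quick way to see which interpreter ansible's discovery picks on a target; the host name is a placeholder.)

```bash
# Report the python version ansible ends up using on a given host:
ansible somehost -m setup -a 'filter=ansible_python_version'
```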
19:50:46 <clarkb> #endmeeting