15:04:48 <noonedeadpunk> #startmeeting openstack_ansible_meeting
15:04:48 <opendevmeet> Meeting started Tue Nov 23 15:04:48 2021 UTC and is due to finish in 60 minutes.  The chair is noonedeadpunk. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:04:48 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:04:48 <opendevmeet> The meeting name has been set to 'openstack_ansible_meeting'
15:04:54 <noonedeadpunk> #topic rollcall
15:05:05 <damiandabrowski[m]> hey!
15:06:13 <mgariepy> hey half here. \o/
15:08:20 <noonedeadpunk> \o/
15:09:48 <opendevreview> Dmitriy Rabotyagov proposed openstack/openstack-ansible master: Do not upgrade packages without upgrades  https://review.opendev.org/c/openstack/openstack-ansible/+/818939
15:09:51 <noonedeadpunk> #topic office hours
15:10:22 <noonedeadpunk> I've started writing the proxysql role, and today it became obvious to me that we won't be able to land it in time for the release
15:10:46 <noonedeadpunk> So I'd say we do it without proxysql this time, unfortunately :(
15:11:04 <noonedeadpunk> As it would involve a lot of changes, including db_setup across all roles
15:11:19 <noonedeadpunk> which we won't be able to test properly in the 2 weeks (or so) we have left
15:11:42 <noonedeadpunk> and in the end I think we're staying on galera 10.5 as well
15:12:28 <noonedeadpunk> I don't see how we can easily work around mysql_upgrade
15:12:51 <noonedeadpunk> the only thing I guess we can merge would be https://review.opendev.org/c/openstack/openstack-ansible-os_keystone/+/817390
15:13:04 <mgariepy> the issue is creating a small wave tho.
15:13:29 <mgariepy> for proxysql maybe we could backport it in beta when it's done?
15:15:27 <opendevreview> Dmitriy Rabotyagov proposed openstack/openstack-ansible-galera_server master: Update mariadb to 10.6.5  https://review.opendev.org/c/openstack/openstack-ansible-galera_server/+/817384
15:15:49 <noonedeadpunk> well, I will continue working on it and let's see where we will end up
15:16:01 <noonedeadpunk> as I expect pretty big, hard-to-backport changes...
15:16:28 <noonedeadpunk> since in the end we would need to manage mysql users differently
15:16:54 <noonedeadpunk> but let's see
15:17:24 <noonedeadpunk> maybe we will be able to backport it in beta before 24.1.0
15:17:26 <mgariepy> ping me when you have some patches up for review
15:17:31 <noonedeadpunk> sure
15:18:03 <noonedeadpunk> regarding keystone patch - for upgrade to pass we need to land https://review.opendev.org/c/openstack/openstack-ansible/+/818733
15:18:25 <noonedeadpunk> and I guess after that I will propose the role freeze patch for the milestone release
15:20:22 <noonedeadpunk> also we have these 2 wrongly backported patches https://review.opendev.org/c/openstack/openstack-ansible-lxc_hosts/+/818485 and https://review.opendev.org/c/openstack/openstack-ansible-lxc_hosts/+/818486
15:20:36 <noonedeadpunk> which make a symlink to a non-existent file ...
15:21:03 <admin1> bootstrap, hosts, infra all went ok .. but then on setup openstack, i get python_venv_build : Slurp up the constraints file for later re-deployment  .. file not found: /var/www/repo/os-releases/21.2.9/keystone-21.2.9-constraints.txt
15:21:08 <noonedeadpunk> oh, also Ussuri goes to EM, just in case
15:21:16 <noonedeadpunk> so we will have last release soon
15:21:20 <noonedeadpunk> and that would be it
15:23:25 <noonedeadpunk> admin1: this file should have been created by the previous task, named `Build constraints file for installation purposes`
15:23:36 <noonedeadpunk> https://opendev.org/openstack/ansible-role-python_venv_build/src/branch/master/tasks/python_venv_wheel_build.yml#L180
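(For cases like the one admin1 hit, a minimal debugging sketch - not part of the role - to confirm whether the constraints file was actually written on the repo host; the path is taken from the error message above, and `repo_all` is assumed to be the inventory group for the repo containers.)

    - hosts: repo_all            # assumed inventory group for the repo containers
      gather_facts: false
      tasks:
        - name: Stat the constraints file reported as missing above
          stat:
            path: /var/www/repo/os-releases/21.2.9/keystone-21.2.9-constraints.txt
          register: _constraints_file

        - name: Report whether the file is present
          debug:
            msg: "constraints file exists: {{ _constraints_file.stat.exists }}"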
15:24:58 <opendevreview> Dmitriy Rabotyagov proposed openstack/openstack-ansible stable/victoria: Set default for octavia_barbican_enabled  https://review.opendev.org/c/openstack/openstack-ansible/+/769179
15:30:59 <jrosser> apologies i'm in another meeting
15:33:23 <opendevreview> Dmitriy Rabotyagov proposed openstack/openstack-ansible-galera_server master: Update nariadb to 10.5.13  https://review.opendev.org/c/openstack/openstack-ansible-galera_server/+/818946
15:35:26 <mgariepy> have you seen any other timeout on galera 10.6.5 ?
15:35:43 <noonedeadpunk> nope, I guess I haven't
15:36:17 <opendevreview> Dmitriy Rabotyagov proposed openstack/openstack-ansible-galera_server master: Update mariadb to 10.5.13  https://review.opendev.org/c/openstack/openstack-ansible-galera_server/+/818946
15:36:54 <noonedeadpunk> I've just added a processlist dump to see if we really hit mariadb-upgrade
15:37:24 <noonedeadpunk> as I have a feeling that it might be unrelated to us running the upgrade...
15:37:40 <mgariepy> the logs didn't show that the upgrade was over..
15:37:49 <mgariepy> yeah hard to tell.
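(When the logs are inconclusive, a rough sketch for checking whether the upgrade is still running, from both the SQL and the OS side; `galera_all` as the target group and socket auth as root are assumptions about the environment.)

    - hosts: galera_all          # assumed inventory group for the galera containers
      gather_facts: false
      tasks:
        - name: Look for a running upgrade in the server process list
          command: mysql -e "SHOW FULL PROCESSLIST;"
          register: _processlist
          changed_when: false

        - name: Check for a mariadb-upgrade client process on the OS side
          command: pgrep -af mariadb-upgrade
          register: _upgrade_proc
          changed_when: false
          failed_when: false

        - name: Print both results
          debug:
            msg:
              - "{{ _processlist.stdout_lines }}"
              - "{{ _upgrade_proc.stdout_lines }}"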
15:38:04 <noonedeadpunk> because now we don't run the upgrade, but the process still gets stuck
15:38:21 <noonedeadpunk> huh, but maybe we perform a restart or something like that...
15:38:31 <noonedeadpunk> before the upgrade is finished
15:38:31 <mgariepy> the upgrade was launched by debian-start, no?
15:38:38 <noonedeadpunk> yes
15:39:04 <noonedeadpunk> but I mean that in https://review.opendev.org/c/openstack/openstack-ansible-galera_server/+/817384/6/tasks/galera_server_setup.yml we already have `when: ansible_facts['os_family'] | lower == 'redhat'`
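(Roughly what that conditional looks like in context - a simplified sketch, not the exact task from the patch: the explicit upgrade run is presumably limited to RedHat-family hosts because on Debian/Ubuntu the packaged debian-start script triggers it instead.)

    # simplified illustration only - the real task in the role has more handling
    - name: Run the MariaDB system table upgrade
      command: mariadb-upgrade
      when: ansible_facts['os_family'] | lower == 'redhat'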
15:39:13 <mgariepy> well, it's sad that the files created on install when no db is present still need upgrades..
15:40:37 <noonedeadpunk> maybe indeed ship our own version of debian-start?
15:40:58 <noonedeadpunk> or just ensure with lineinfile that the line affecting us is not present...
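(A hedged sketch of that lineinfile option; exactly which line in /etc/mysql/debian-start triggers the automatic upgrade is an assumption here and should be checked against the packaged script first.)

    - name: Drop the automatic upgrade call from debian-start
      lineinfile:
        path: /etc/mysql/debian-start
        # assumed pattern - verify the actual line in the packaged script
        regexp: 'upgrade_system_tables_if_necessary'
        state: absent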
15:41:49 <mgariepy> well is the order fixed ?
15:42:20 <mgariepy> as I recall, the variables were not in the correct order.
15:42:29 <noonedeadpunk> yeah, it was
15:42:46 <noonedeadpunk> which still prevents us from easily overriding the command in defaults
15:43:05 <noonedeadpunk> but coming back to your suggestion - we can mess with debian-start directly
15:43:08 <mgariepy> maybe we wait a bit and see how MariaDB will fix it?
15:43:34 <noonedeadpunk> yeah, that's probably the wisest choice here
15:44:43 <mgariepy> the issue is still generating traffic and more ppl are adding to it.
15:48:52 <noonedeadpunk> oh, there're updates I haven't read actually
15:49:05 <mgariepy> i do read all of them :)
15:49:13 <mgariepy> at least i try.
15:54:24 <noonedeadpunk> `It appears regularly in new or modified tests which are initially written optimistically, assuming it would work, but then we add all sorts of "wait till there is no mysql_upgrade process running [anymore]", and that does the trick.`
15:54:26 <noonedeadpunk> lol
15:57:00 <noonedeadpunk> mgariepy: has `SELECT GET_LOCK("mariadb-upgrade");` ever worked for you?
15:57:19 <mgariepy> I haven't checked that
15:57:32 <noonedeadpunk> I just have `ERROR 1582 (42000): Incorrect parameter count in the call to native function 'GET_LOCK'`
16:00:10 <noonedeadpunk> well, now I was able to get lock while mariadb-upgrade was running
16:00:36 <mgariepy> get_lock needs another param .. but well.
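(For reference, GET_LOCK() takes a lock name and a timeout in seconds, which is what the parameter-count error above is about; whether mariadb-upgrade actually holds a lock under that name is an assumption, and the result above suggests it does not. A task-level sketch of the corrected call:)

    - name: Try to acquire the assumed mariadb-upgrade named lock without waiting
      command: mysql -Ne "SELECT GET_LOCK('mariadb-upgrade', 0);"
      register: _lock_result
      changed_when: false
      # "1" means the lock was free, "0" means something else is holding it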
16:02:33 <noonedeadpunk> ah, yes...
16:02:39 <noonedeadpunk> I misread that I guess
16:03:00 <noonedeadpunk> #endmeeting