*** kiennt2609 is now known as kiennt | 03:45 | |
*** elvira2 is now known as elvira | 12:28 | |
tkajinam | Hi. It seems the release job of puppet-nova failed, likely because of a temporary issue with Puppet Forge. Can we attempt to rerun the job? https://zuul.opendev.org/t/openstack/build/6057ca178a8e46079589ac4a46aa0b3c | 13:02
noonedeadpunk | Hey folks! It looks to me like some mirrors for jammy are desynced. Maybe even this one - `deb http://mirror.iad3.inmotion.opendev.org:8080/MariaDB/mariadb-10.11.2/repo/ubuntu jammy main` | 13:07
fungi | tkajinam: sorry, i was responding to you in the #openstack-puppet channel. we're now talking about this problem in four channels (also #rdo and #openstack-release) | 13:07 |
noonedeadpunk | as jobs fail with ` E: Version '12.3.0-1ubuntu1~22.04' for 'libgcc1' was not found` https://zuul.opendev.org/t/openstack/build/a39cba90d870457ea01975131f576821 | 13:07 |
tkajinam | fungi, oh, sorry. I didn't notice it. let me move to #puppet-openstack | 13:08 |
fungi | noonedeadpunk: all our "mirrors" should have identical content, since they're just web front-ends to a distributed network filesystem | 13:08 |
noonedeadpunk | while ones, that use `deb http://mirror-int.ord.rax.opendev.org:8080/MariaDB/mariadb-10.11.2/repo/ubuntu jammy main` do pass https://zuul.opendev.org/t/openstack/build/90a3284262e54665bad9b7a339931fec | 13:08 |
noonedeadpunk | fungi: some caching on front-end? | 13:10 |
fungi | noonedeadpunk: only at the filesystem driver layer | 13:11 |
fungi | noonedeadpunk: oh, though this isn't our normal ubuntu mirrors? | 13:11 |
fungi | or maybe the :8080 is throwing me off | 13:11 |
fungi | i'll need to look into what vhost is served from that port, maybe it's not a normal reprepro mirror in our afs | 13:12 |
noonedeadpunk | I've just checked 3 failed jobs from today - all of them were from inmotion. Also checked 3-4 that passed today - not a single one was from inmotion, but rax or ovh | 13:14
noonedeadpunk | It could just be a coincidence, but I'm not sure how I can debug that thingy. | 13:14
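For reference, a minimal sketch in Python of one way to check for a desync: fetch the same package index from a failing and a passing mirror and compare checksums. The hostnames are taken from the messages above; note that the rax name quoted there is the provider-internal one, so it may only resolve from nodes inside that cloud.

```python
# Sketch only: compare the jammy-updates package index across two mirrors.
# Identical digests would support fungi's point that all mirrors serve the same
# AFS-backed content; differing digests would suggest a desync.
import hashlib
import urllib.request

MIRRORS = [
    "https://mirror.iad3.inmotion.opendev.org",  # provider where jobs failed
    "http://mirror-int.ord.rax.opendev.org",     # provider where jobs passed (internal name)
]
PATH = "/ubuntu/dists/jammy-updates/main/binary-amd64/Packages.gz"

for base in MIRRORS:
    with urllib.request.urlopen(base + PATH, timeout=30) as resp:
        print(base, hashlib.sha256(resp.read()).hexdigest())
```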
fungi | noonedeadpunk: okay, i found it. 8080 is a caching reverse web proxy to https://downloads.mariadb.com/MariaDB/ according to https://downloads.mariadb.com/MariaDB/ | 13:17 |
fungi | er, according to https://opendev.org/opendev/system-config/src/branch/master/playbooks/roles/mirror/templates/mirror.vhost.j2#L304-L307 | 13:17 |
fungi | so i suppose it's possible the mirror in inmotion cached a bad response from downloads.mariadb.com with a long ttl | 13:18 |
fungi | or there could be something else going on with the server's disk. i'll see if anything's out of sorts there | 13:18 |
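One quick way to test the stale-cache theory, sketched here under the assumption that the proxied repo follows the standard apt layout under the base URL from the log (which HTTP cache headers actually appear depends on how the vhost's caching is configured):

```python
# Sketch only: look at the response headers for the MariaDB InRelease file as
# served through the inmotion mirror's caching proxy. A large Age or an old
# Last-Modified compared to upstream would point at a stale cached copy.
import urllib.request

URL = ("http://mirror.iad3.inmotion.opendev.org:8080"
       "/MariaDB/mariadb-10.11.2/repo/ubuntu/dists/jammy/InRelease")

req = urllib.request.Request(URL, method="HEAD")
with urllib.request.urlopen(req, timeout=30) as resp:
    for name in ("Date", "Last-Modified", "Age", "Cache-Control", "X-Cache"):
        value = resp.headers.get(name)
        if value:
            print(f"{name}: {value}")
```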
noonedeadpunk | well, the ttl doesn't look that long according to the config. And I think we started seeing issues last week. At least I see the first failure dated 31st of July | 13:20
fungi | server seems happy enough | 13:20 |
noonedeadpunk | (that could be "proper" failure due to original mirror misbehaving) | 13:21 |
fungi | noonedeadpunk: looks like downloads.mariadb.com is a round-robin to four cloudflare cdn endpoints, so it's possible one of those is actually serving stale content to requests for its region (which might then only be impacting our inmotion environment) | 13:22 |
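That round-robin is easy to see from DNS; a minimal sketch follows (running it from the inmotion mirror versus elsewhere would show whether different regions resolve to different CDN endpoints):

```python
# Sketch only: list the addresses behind downloads.mariadb.com.
import socket

addrs = sorted({info[4][0] for info in socket.getaddrinfo(
    "downloads.mariadb.com", 443, type=socket.SOCK_STREAM)})
for addr in addrs:
    print(addr)
```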
noonedeadpunk | yeah, that's also a possibility ofc | 13:23 |
fungi | i'm seeing if i can find the same missing package version | 13:23 |
noonedeadpunk | fungi: well, actually, I think this package doesn't come from the mariadb mirror | 13:27
noonedeadpunk | https://paste.openstack.org/show/buBAKn2nSXBk1a40Ms5U/ | 13:27 |
noonedeadpunk | but in the output it's missing exactly the same package I have installed in my sandbox | 13:28
fungi | https://packages.ubuntu.com/libgcc1 | 13:31 |
noonedeadpunk | https://packages.ubuntu.com/jammy-updates/libgcc-s1 | 13:33 |
fungi | oh, virtual package for libgcc-s1 on jammy | 13:33 |
fungi | yep | 13:33 |
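The virtual-package relationship is easy to confirm on a jammy host; a minimal sketch, assuming python3-apt is installed, which asks the local apt cache what provides the name libgcc1:

```python
# Sketch only: show which real package provides the virtual name "libgcc1"
# and what its candidate version is on this host.
import apt  # from the python3-apt package

cache = apt.Cache()
for pkg in cache.get_providing_packages("libgcc1"):
    cand = pkg.candidate
    print(pkg.name, cand.version if cand else "(no candidate)")
```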
noonedeadpunk | yeah, I guess it's time to update this in the roles, but it should work.... | 13:34 |
fungi | and changelog says that backport was from may | 13:34 |
fungi | doesn't explain why it's breaking in only one provider though | 13:35 |
fungi | out of curiosity, why does that role pin exact package versions? ubuntu doesn't keep them in their respective suites indefinitely, so those versions are going to get rotated out for newer updates over time | 13:36 |
fungi | seems like you'll wind up with constant churn on those version pins | 13:37 |
noonedeadpunk | I think we're pinning only packages that come from third-party repos, like mariadb or rabbitmq, that have been there for quite a long time | 13:39
noonedeadpunk | Or well, we never had issues with mariadb being pinned, we had them with rabbit, but it's worth pinning them, as you can get incompatible erlang/rabbit quite easily | 13:40
fungi | "apt-get ... install install debconf-utils=1.5.79ubuntu1 libgcc1=12.3.0-1ubuntu1~22.04 libstdc++6=12.3.0-1ubuntu1~22.04 python3-pymysql=1.0.2-1ubuntu1 libmariadb-dev=1:10.11.2+maria~ubu2204 mariadb-client=1:10.11.2+maria~ubu2204 mariadb-backup=1:10.11.2+maria~ubu2204 mariadb-server=1:10.11.2+maria~ubu2204 socat=1.7.4.1-3ubuntu4" | 13:41 |
noonedeadpunk | or anytime you decide to extend a cluster - it can just fall apart | 13:41
fungi | like half of those are straight from the ubuntu main repository | 13:41 |
noonedeadpunk | fungi: ah, you're talking about that. It's Ansible. Have no idea why, but they've re-invented the system resolver for package versions | 13:41
fungi | oh, yeesh | 13:42 |
noonedeadpunk | And they just handle dependencies and provide exact versions to apt | 13:42
fungi | if their dep solver has subtle differences with apt's, i can imagine that would lead to some crazy failure conditions | 13:42 |
noonedeadpunk | I think they did that to support things like version comparison and ability to supply `>=` and stuff to apt module | 13:43 |
fungi | maybe they just use libapt to do it | 13:43 |
noonedeadpunk | oh, they had quite wild failures when you were crazy enough to use apt preferences.d instead of their resolver... | 13:43
fungi | but yeah, i would have expected they'd only pass through version pins specified in the task data | 13:43 |
fungi | anyway, still doesn't explain why it would be different in the inmotion environment | 13:44 |
noonedeadpunk | yeah, they actually do https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/apt.py#L493-L514 | 13:47 |
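Roughly speaking (this is an illustrative sketch, not the actual Ansible code at that link), the module parses the name=version requests itself and matches the requested version, wildcards included, against the versions apt reports, rather than handing the raw pin straight to apt's own resolver:

```python
# Illustrative sketch only: fnmatch-style matching of a requested version
# against the versions known for a package name.
from fnmatch import fnmatch


def select_version(requested, available):
    """Return the first known version matching the requested spec, or None."""
    matches = [v for v in available if fnmatch(v, requested)]
    return matches[0] if matches else None


print(select_version("1:10.11.*", ["1:10.11.2+maria~ubu2204"]))  # wildcard pin matches
print(select_version("12.3.0-1ubuntu1~22.04", []))  # no version known under that name -> the "was not found" case
```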
fungi | so circling back around to the beginning, libgcc1 (or libgcc-s1) 12.3.0-1ubuntu1~22.04 is being served from our global afs file tree so all the mirrors should have it, or none should, since they're all just serving content from the same read-only replica from the afs fileservers | 13:47 |
fungi | but i'll double-check that | 13:47 |
fungi | firstly, /afs/openstack.org/mirror/ubuntu/pool/main/g/gcc-12/libgcc-s1_12.3.0-1ubuntu1~22.04_amd64.deb exists | 13:50 |
fungi | https://mirror.iad3.inmotion.opendev.org/ubuntu/pool/main/g/gcc-12/libgcc-s1_12.3.0-1ubuntu1~22.04_amd64.deb is present | 13:52 |
fungi | also the package index for jammy-updates at https://mirror.iad3.inmotion.opendev.org/ubuntu/dists/jammy-updates/main/binary-amd64/Packages.gz has a libgcc-s1 12.3.0-1ubuntu1~22.04 entry with the desired "Provides: libgcc1 (= 1:12.3.0-1ubuntu1~22.04)" | 13:55 |
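That check can be scripted too; a minimal sketch (same index URL as in the earlier snippet) that pulls the package index from the inmotion mirror and prints the libgcc-s1 stanza, including the Provides line:

```python
# Sketch only: fetch the jammy-updates Packages.gz from the inmotion mirror and
# print the libgcc-s1 stanza, which should carry the
# "Provides: libgcc1 (= 1:12.3.0-1ubuntu1~22.04)" entry mentioned above.
import gzip
import urllib.request

URL = ("https://mirror.iad3.inmotion.opendev.org"
       "/ubuntu/dists/jammy-updates/main/binary-amd64/Packages.gz")

with urllib.request.urlopen(URL, timeout=60) as resp:
    text = gzip.decompress(resp.read()).decode("utf-8", errors="replace")

for stanza in text.split("\n\n"):
    if stanza.startswith("Package: libgcc-s1\n"):
        print(stanza)
        break
```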
noonedeadpunk | huh.... | 13:59 |
* noonedeadpunk confused now | 13:59 | |
noonedeadpunk | let me re-check again then... | 14:00 |
noonedeadpunk | thanks for checking on this | 14:03 |
fungi | noonedeadpunk: it may be something odd with the images for the nodes themselves. i see that the last time we successfully uploaded any ubuntu-jammy images to our providers was wednesday, and the last successful ubuntu-jammy upload to inmotion-iad3 was a week ago | 14:06 |
fungi | i'm going to switch to #opendev and start looking into why we're having problems uploading (or building) images | 14:06 |
opendevreview | Michael Johnson proposed openstack/project-config master: Allow designate-core as osc/sdk service-core https://review.opendev.org/c/openstack/project-config/+/890365 | 22:42 |
opendevreview | Michael Johnson proposed openstack/project-config master: Allow designate-core as osc/sdk service-core https://review.opendev.org/c/openstack/project-config/+/890365 | 22:48 |
opendevreview | Merged openstack/project-config master: Fix app-intel-ethernet-operator reviewers group https://review.opendev.org/c/openstack/project-config/+/890569 | 23:46 |