fungi | hopefully the docs volume will be done replicating in the next few minutes | 02:06 |
fungi | it just finished. running again to make sure it's a ~noop | 02:30 |
fungi | it's taking longer on the rerun than i would have expected | 02:32 |
fungi | okay, finished much faster that time | 02:55 |
fungi | the flock is now cancelled | 02:57 |
fungi | and i'm watching logs for the next cron pass | 02:58 |
fungi | the tarballs volume is taking a while to catch up | 03:22 |
opendevreview | Ian Wienand proposed openstack/diskimage-builder master: [wip] regenerate initramfs with FEDORA-2021-e4843341ca https://review.opendev.org/c/openstack/diskimage-builder/+/815385 | 03:32 |
*** ykarel|away is now known as ykarel | 04:44 | |
opendevreview | Ian Wienand proposed openstack/diskimage-builder master: fedora-container: regenerate initramfs for F34 https://review.opendev.org/c/openstack/diskimage-builder/+/815385 | 05:09 |
opendevreview | Ian Wienand proposed openstack/diskimage-builder master: dracut-regenerate: drop Python 2 packages https://review.opendev.org/c/openstack/diskimage-builder/+/815409 | 05:09 |
ianw | clarkb/fungi: ^ that's about my best guess for how to make a bootable f34 image | 05:09 |
*** pojadhav is now known as pojadhav|ruck | 05:34 | |
*** ysandeep|out is now known as ysandeep | 06:28 | |
opendevreview | Ananya proposed opendev/elastic-recheck rdo: DNM: WIP: ER bot with opensearch for upstream https://review.opendev.org/c/opendev/elastic-recheck/+/813250 | 06:54 |
frickler | ianw: this seems to be hard failing the arm64-bionic jobs. maybe add a "mkdir -p"? https://review.opendev.org/c/openstack/diskimage-builder/+/814081/9..13/roles/dib-functests/tasks/main.yaml | 06:54 |
opendevreview | daniel.pawlik proposed openstack/project-config master: Add project openstack/ci-log-processing https://review.opendev.org/c/openstack/project-config/+/815260 | 07:01 |
opendevreview | Ananya proposed opendev/elastic-recheck rdo: Fix ER bot to report back to gerrit with bug/error report https://review.opendev.org/c/opendev/elastic-recheck/+/805638 | 07:08 |
opendevreview | Ananya proposed opendev/elastic-recheck rdo: Fix ER bot to report back to gerrit with bug/error report https://review.opendev.org/c/opendev/elastic-recheck/+/805638 | 07:24 |
*** jpena|off is now known as jpena | 07:31 | |
ianw | frickler: hrm, https://review.opendev.org/c/openstack/diskimage-builder/+/814846 switches things to bullseye | 07:37 |
ianw | i'm just going to approve that through, as the other func tests and the nodepool image are all now bullseye | 07:39 |
opendevreview | Merged openstack/diskimage-builder master: Switch ARM64 testing to bullseye https://review.opendev.org/c/openstack/diskimage-builder/+/814846 | 08:54 |
*** ykarel is now known as ykarel|lunch | 08:55 | |
*** ysandeep is now known as ysandeep|lunch | 09:44 | |
*** ysandeep|lunch is now known as ysandeep | 10:33 | |
*** ykarel|lunch is now known as ykarel | 10:56 | |
*** dviroel|rover|out is now known as dviroel|rover | 11:11 | |
*** jpena is now known as jpena|lunch | 11:24 | |
*** tosky_ is now known as tosky | 11:25 | |
damiandabrowski[m] | hello, | 11:28 |
damiandabrowski[m] | 199.204.45.33 (review02.opendev.org) is listed in the b.barracudacentral.org DNSBL, so some of the mail servers using this blacklist (like mine) may not be able to receive emails from review.opendev.org. | 11:28 |
damiandabrowski[m] | Can I ask someone authorized to look at this and send removal request? https://barracudacentral.org/rbl/removal-request | 11:28 |
fungi | #status log Requested removal of review.opendev.org's IPv4 address from the barracudacentral.org RBL | 11:43 |
opendevstatus | fungi: finished logging | 11:43 |
fungi | damiandabrowski[m]: thanks for the heads up, hopefully it's just a one-time thing and the same user doesn't re-report gerrit notifications as spam | 11:44 |
*** pojadhav|ruck is now known as pojadhav|brb | 11:45 | |
fungi | BBR21635248847-93103-2709 is our confirmation number for that removal request | 11:45 |
damiandabrowski[m] | thank You! | 11:46 |
fungi | a couple of amusing things i noticed... their rbl search doesn't support ipv6 addresses, and also after submitting the removal request they recommend listing e-mail emitting domains with https://www.emailreg.org which is entirely unresponsive. makes me suspect the barracuda spam filtering "service" may be in a semi-abandoned state, or at least derelict | 11:48 |
*** ysandeep is now known as ysandeep|brb | 12:03 | |
*** pojadhav|brb is now known as pojadhav|ruck | 12:04 | |
*** jpena|lunch is now known as jpena | 12:25 | |
*** ysandeep|brb is now known as ysandeep | 12:25 | |
frickler | fungi: related question: do we check for bounces from review.o.o? I guess mail admins might consider continually sending mails to addresses that no longer exist also to be spam | 12:41 |
fungi | frickler: i don't think gerrit has a feature for that, but we could probably set up something once we have the account collisions fixed | 12:49 |
frickler | fungi: grep 550 /var/log/exim4/mainlog|cut -f5 -d\ |sort|uniq -c|wc => 130, with 1400 msgs in total. the question would be what action to take; disabling the affected accounts completely might be a bit harsh? | 12:59 |
fungi | yes, we could probably batch something to remove the e-mail addresses from their accounts, but i don't know what gerrit will do in such cases | 13:01 |
fungi | also we wouldn't want to take action on a single bounce, we'd need some sort of tracking to look for multiple bounces and no successful deliveries in recent history | 13:02 |
*** Guest3656 is now known as redrobot | 13:03 | |
opendevreview | Hervé Beraud proposed openstack/project-config master: Allow to check release approval for DPL teams https://review.opendev.org/c/openstack/project-config/+/815497 | 13:08 |
opendevreview | Hervé Beraud proposed openstack/project-config master: Allow to check release approval for DPL teams https://review.opendev.org/c/openstack/project-config/+/815497 | 13:09 |
opendevreview | Michal Nasiadka proposed openstack/diskimage-builder master: Add dnf versionlock support https://review.opendev.org/c/openstack/diskimage-builder/+/811945 | 13:21 |
*** lbragstad6 is now known as lbragstad | 13:28 | |
opendevreview | Merged opendev/elastic-recheck rdo: Changing no of days for query from 14 to 7 https://review.opendev.org/c/opendev/elastic-recheck/+/813795 | 13:53 |
*** sshnaidm_ is now known as sshnaidm | 14:03 | |
clarkb | fungi: frickler: removing emails from accounts is dangerous due to gerrit's consistency checking. That could cause us to create new accounts for people when they next log in. Instead I suspect the two options we can look at are disabling accounts (and then work with people who discover their accounts are disabled to fix them) or removing the preferred email address from the account | 14:35 |
clarkb | settings (this would need testing but I think gerrit may not send email if you don't have a preferred email set and the preferred email setting isn't as highly scrutinized by gerrit for account creation on login) | 14:35 |
clarkb | related: the user that has a problem with an email conflict did get back to me and confirmed they want us to clean up the new email from the old account, allowing gerrit to make a new account for them. I'll look into that after some morning meetings. If someone else wants to give that a go I'm willing to help too | 14:36 |
fungi | yeah, whatever avenue we take will need testing | 14:36 |
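For reference, the options clarkb outlines map onto Gerrit's documented accounts REST endpoints: listing or deleting an account's e-mail addresses, and marking an account inactive. The sketch below only illustrates those calls and would need the testing fungi mentions before any batch use; the credentials and account-id handling are placeholder assumptions.

```python
# Illustrative sketch of the Gerrit accounts REST calls discussed above;
# untested here, and the credentials below are placeholders.
import json
import requests

GERRIT = "https://review.opendev.org"
AUTH = ("admin-user", "http-password")  # placeholder HTTP credentials

def gerrit_json(resp):
    resp.raise_for_status()
    # Gerrit prefixes JSON responses with ")]}'" to defeat XSSI; strip it.
    return json.loads(resp.text.split("\n", 1)[1])

def list_emails(account_id):
    return gerrit_json(
        requests.get("%s/a/accounts/%s/emails" % (GERRIT, account_id),
                     auth=AUTH))

def delete_email(account_id, address):
    # Riskier option: clarkb notes removing addresses can upset Gerrit's
    # account consistency checks and lead to duplicate accounts on login.
    requests.delete("%s/a/accounts/%s/emails/%s"
                    % (GERRIT, account_id, address),
                    auth=AUTH).raise_for_status()

def deactivate_account(account_id):
    # Alternative discussed above: mark the account inactive instead.
    requests.delete("%s/a/accounts/%s/active" % (GERRIT, account_id),
                    auth=AUTH).raise_for_status()
```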
clarkb | I've approved the ci-log-processing repo creation change | 14:42 |
fungi | thanks! | 14:45 |
opendevreview | Merged openstack/project-config master: Add project openstack/ci-log-processing https://review.opendev.org/c/openstack/project-config/+/815260 | 14:55 |
venkatakrishnat | Hi, can someone please suggest the correct channel to discuss Devstack stack-related issues | 15:03 |
fungi | venkatakrishnat: #openstack-qa is where the devstack maintainers hang out | 15:03 |
venkatakrishnat | @fungi, thank you | 15:04 |
*** ysandeep is now known as ysandeep|out | 15:08 | |
*** pojadhav|ruck is now known as pojadhav|out | 15:15 | |
*** ykarel is now known as ykarel|away | 15:16 | |
*** marios is now known as marios|out | 15:54 | |
*** jpena is now known as jpena|off | 16:33 | |
opendevreview | Yuriy Shyyan proposed openstack/project-config master: Reinstating InMotion cloud https://review.opendev.org/c/openstack/project-config/+/815532 | 17:56 |
opendevreview | Martin Kopec proposed opendev/system-config master: Adjust RefStack build for osf->openinfra rename https://review.opendev.org/c/opendev/system-config/+/808480 | 18:00 |
opendevreview | Merged openstack/project-config master: Reinstating InMotion cloud https://review.opendev.org/c/openstack/project-config/+/815532 | 18:18 |
clarkb | fungi: https://keys.openpgp.org/about also seems to be a thing | 19:13 |
clarkb | oh wait that is what emailed us so I think you just typo'd in the other channel | 19:14 |
fungi | er, yep i mistyped in the meeting | 19:15 |
opendevreview | Jeremy Stanley proposed opendev/system-config master: Update artifact signing key management process https://review.opendev.org/c/opendev/system-config/+/815547 | 19:42 |
opendevreview | Jeremy Stanley proposed openstack/project-config master: Replace old Xena cycle signing key with Yoga https://review.opendev.org/c/openstack/project-config/+/815548 | 19:53 |
ianw | fungi: "something like a caff-style signature approval mechanism" .. oh wow, nostalgia ... it's probably been 15 years since I last attended a mass key-signing and apt-get installed signing-party | 20:00 |
ianw | it was so long ago that we all put up our licenses/passports on a big projector in the room and nobody even thought about "someone could take a high quality photo of this" | 20:01 |
fungi | indeed! | 20:02 |
fungi | but the point is that caff basically gives the keyholder control over choosing which third-party signatures get published | 20:03 |
fungi | which could alleviate the previous pollution issues with unsolicited keysig pollution | 20:04 |
fungi | er, previous security issues | 20:04 |
fungi | the keys.openpgp.org maintainers have indicated they'd be willing to support something like that, but somebody needs to do the work to implement it | 20:05 |
fungi | so it's not as if publication/distribution of third-party keysigs will never be a thing again, there's just no good option available for it right now | 20:06 |
opendevreview | Yuriy Shyyan proposed openstack/project-config master: Increasing max servers https://review.opendev.org/c/openstack/project-config/+/815552 | 20:18 |
yuriys | looking good today | 20:18 |
frickler | fungi: well we could publish a txt export of our collective signatures somewhere locally. while not quite as convenient for users, it may be comforting for some, like distro managers maybe? | 20:20 |
fungi | frickler: yeah, i think we can iterate on it, i just wanted to get the process to a usable state again so we could rotate keys as they're overdue | 20:22 |
fungi | we'd definitely need a bit more orchestration around it if we want to continue publishing our individual keysigs somewhere | 20:22 |
*** dviroel|rover is now known as dviroel|rover|afk | 20:42 | |
fungi | looks like 815552 would be good to put some more stress on the inmotion-iad3 environment | 21:35 |
clarkb | fungi: yuriys done | 21:36 |
fungi | https://grafana.opendev.org/d/4sdNjeXGk/nodepool-inmotion is showing error-clean so far | 21:37 |
yuriys | yep, and i've confirmed the nova-scheduler/placement rando properties are helping quite a bit | 21:37 |
yuriys | big thanks to fungi and melwitt for the research material on optimizing that | 21:38 |
corvus | clarkb: i'm seeing some gerrit http timeouts in the zuul scheduler log | 21:40 |
corvus | it just got a lot better | 21:40 |
corvus | zuul may have been exhibiting symptoms of either a slow gerrit or bad zuul<->gerrit network | 21:41 |
fungi | i see periodic gerrit http(s) timeouts from gertty too | 21:41 |
corvus | then probably the former | 21:41 |
clarkb | corvus: ok, so hopefully a transient issue and it will catch back up on the backlog of events now | 21:41 |
corvus | agree | 21:41 |
fungi | i've been suspecting issues reaching it, possibly limited to ipv6 | 21:41 |
clarkb | load average isn't bad on review02 right now | 21:41 |
clarkb | which is the symptom we've seen in the past for slow gerrit (huge load average spikes) | 21:42 |
clarkb | I guess we continue to monitor then but nothing worth panicking over yet :) | 21:42 |
fungi | i also get semi-frequent errors running git-review, where it acts like it times out trying to establish an ssh connection (again over ipv6) | 21:42 |
fungi | but yeah, i assumed it was a network issue in my house | 21:42 |
fungi | or with my isp's ipv6 routes | 21:43 |
johnsom | I am also noticing some slowness today. Also on IPv6. No errors, just slow to load. I am doing reviews, I will leave the dev console open and see if I get any hints. | 21:47 |
corvus | fungi: plot twist: gerrit was in the house the whole time! | 21:48 |
fungi | d'oh! | 21:54 |
opendevreview | Merged openstack/project-config master: Increasing max servers https://review.opendev.org/c/openstack/project-config/+/815552 | 21:55 |
fungi | yuriys: ^ it has deployed in the last few minutes | 22:39 |
yuriys | Yep! watching all the things :) | 22:43 |