| opendevreview | OpenStack Proposal Bot proposed openstack/project-config master: Normalize projects.yaml https://review.opendev.org/c/openstack/project-config/+/962557 | 02:25 |
|---|---|---|
| *** mrunge_ is now known as mrunge | 05:31 | |
| fungi | infra-root: i can't remember if we're still publishing anything to npm, but they're apparently overhauling upload authentication | 05:42 |
| fungi | https://github.com/orgs/community/discussions/174507 | 05:43 |
| fungi | supposedly in a few weeks, long-lived tokens are going away | 05:43 |
| *** ralonsoh_ is now known as ralonsoh | 07:46 | |
| Clark[m] | I can't think of anything we're currently publishing to npm. Even OpenStack did the weird python package route, so it isn't using npm directly | 09:30 |
| Clark[m] | I think gitea 12 may be slowish now (not as bad as 13 was before). 13 looks better too. I wonder if they are crawling all backends directly and we just notice some more depending on who gets balanced there. | 09:35 |
| Clark[m] | Probably more of a data point that we should consider blocking all direct access and force things to load balance | 09:35 |
| tonyb | I'm pretty sure it's Facebook going through the LB. last time I looked they had 100+ connections | 09:36 |
| tonyb | yeah I think after the summit we should look at disabling direct access | 09:37 |
| tonyb | Facebook doesn't claim to respect crawl-delay. I was thinking we could add `disallow: *commit*`, if it isn't already there, to maybe reduce the load a little | 09:39 |
| tonyb | but that of course assumes well behaved crawlers | 09:39 |
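A minimal sketch of the robots.txt idea discussed above. The paths and delay value are hypothetical, not the actual opendev.org configuration, and wildcard `Disallow` patterns are a common extension rather than part of the original robots.txt convention, so not every crawler honors them:

```
# Hypothetical robots.txt sketch: ask crawlers to skip per-commit
# views and slow down. Only helps with well-behaved crawlers.
User-agent: *
Crawl-delay: 2
Disallow: /*/commit/
```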
| fungi | i'm still in favor of completely blocking crawlers that don't respect our robots.txt | 09:43 |
| Clark[m] | Ya I think Facebook was the one that may have ignored crawl delay but a mix of direct and load balancer access may explain that | 09:47 |
| fungi | i wonder if we could serve a different robots.txt from backend names (deny *) than from the lb name | 09:49 |
| Clark[m] | Ooh that is an idea. I think we can with different vhosts with different server names? | 09:52 |
| fungi | yeah, that's what i'm thinking | 09:53 |
| Clark[m] | Or maybe a mod rewrite rule that serves one vs another based on the url | 09:54 |
| fungi | that may also work, and then we don't need multiple vhosts | 09:54 |
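One way the mod_rewrite variant could look in a single Apache vhost, as a sketch. The hostnames, file path, and pattern are assumptions for illustration only, not the real opendev.org setup:

```
# Hypothetical Apache sketch: when a request arrives under a backend
# hostname rather than the load balancer name, serve a deny-all
# robots.txt instead of the normal one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^backend[0-9]+\.example\.org$ [NC]
RewriteRule ^/robots\.txt$ /robots-deny-all.txt [L]
```

Here `/robots-deny-all.txt` would contain just `User-agent: *` and `Disallow: /`. The multiple-vhost approach mentioned earlier would achieve the same result without mod_rewrite, at the cost of duplicating vhost configuration.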
| opendevreview | James E. Blair proposed opendev/system-config master: static: redirect zuulci.org to zuul-ci.org https://review.opendev.org/c/opendev/system-config/+/964296 | 12:01 |
| opendevreview | Szymon Datko proposed zuul/zuul-jobs master: [bindep] Allow options for packages installation https://review.opendev.org/c/zuul/zuul-jobs/+/964297 | 13:09 |
| opendevreview | Szymon Datko proposed zuul/zuul-jobs master: [bindep] Allow options for packages installation https://review.opendev.org/c/zuul/zuul-jobs/+/964297 | 13:41 |
| opendevreview | Szymon Datko proposed zuul/zuul-jobs master: [bindep] Allow options for packages installation https://review.opendev.org/c/zuul/zuul-jobs/+/964297 | 13:48 |
| opendevreview | Szymon Datko proposed zuul/zuul-jobs master: [bindep] Allow options for packages installation https://review.opendev.org/c/zuul/zuul-jobs/+/964297 | 14:05 |
| opendevreview | James E. Blair proposed opendev/system-config master: static: redirect zuulci.org to zuul-ci.org https://review.opendev.org/c/opendev/system-config/+/964296 | 14:12 |
| tonyb | We could for sure do different vhosts, but I think gitea itself can serve a robots.txt, which could just be "go away" | 21:08 |
Generated by irclog2html.py 4.0.0 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!