opendevreview | Ghanshyam proposed openstack/nova master: Remove the Hyper-V driver https://review.opendev.org/c/openstack/nova/+/894466 | 02:53 |
---|---|---|
frickler | the new oslo.log release seems to need changes in nova unit tests, please check https://review.opendev.org/c/openstack/requirements/+/892928 | 07:00 |
bauzas | frickler: looking | 08:22 |
sean-k-mooney | frickler: we are past the non-client lib freeze | 09:12 |
sean-k-mooney | why is there a 5.3.0 happening now | 09:12 |
bauzas | my bad, I thought it was related to https://bugs.launchpad.net/ironic/+bug/2030976 but actually no, so indeed this isn't urgent, right? | 09:24 |
bauzas | anyway, this is u-c, not a nova change | 09:24 |
sean-k-mooney | it would break master when it becomes Caracal | 09:24 |
sean-k-mooney | the reason i'm asking is: do they intend 5.3.0 to be part of bobcat or not | 09:24 |
sean-k-mooney | we have not cut RC1 yet | 09:25 |
sean-k-mooney | so we should not really be merging any changes for this until after that is done | 09:26 |
bauzas | https://docs.openstack.org/releasenotes/oslo.log/unreleased.html | 09:26 |
bauzas | if this is just for deprecating os-win, meh | 09:26 |
sean-k-mooney | deprecations need to go out in a slurp anyway so i think this can wait | 09:27 |
bauzas | sean-k-mooney: but maybe frickler wants to merge this u-c change after RC1 ? | 09:27 |
sean-k-mooney | it must contain something else too | 09:27 |
sean-k-mooney | ya perhaps and that is fine | 09:27 |
sean-k-mooney | i would just prefer to hold this until RC1 is cut, but once it is we can correct whatever the unit test fallout is | 09:28 |
sean-k-mooney | https://github.com/openstack/oslo.log/compare/5.2.0...5.3.0 | 09:29 |
bauzas | https://github.com/openstack/oslo.log/compare/5.3.0...5.2.0 | 09:29 |
bauzas | heh | 09:29 |
sean-k-mooney | https://github.com/openstack/oslo.log/commit/6abf69e194c9dac13d26bca3e7ac1f710f9e26a0 | 09:29 |
sean-k-mooney | so they are dropping 3.8 support | 09:29 |
sean-k-mooney | that is why its breaking things | 09:30 |
sean-k-mooney | so this can't merge in bobcat | 09:30 |
bauzas | right | 09:30 |
bauzas | and the relnotes are actually silent, which is sad | 09:31 |
sean-k-mooney | i -1'd the uc patch | 09:32 |
sean-k-mooney | and linked to the relevant commits and upstream runtimes | 09:32 |
sean-k-mooney | funnily enough this is failing in the py310 job | 09:34 |
sean-k-mooney | https://6bd5f66fe830a1e3af93-5011cbf696878119f18f8b3f0098ae5d.ssl.cf1.rackcdn.com/892928/1/check/cross-nova-py310/9f15891/testr_results.html | 09:34 |
bauzas | sorry for lagging but this morning, I'm switching from 4G to fiber (which is back, woohooo) and testing a few things | 09:36 |
bauzas | I have an unifi USG as my home gateway and I'm tempted to stick with a passive HA mode with 4G :) | 09:36 |
elodilles | bauzas sean-k-mooney : the release is part of bobcat and it was released in time. when the release is out, a requirements patch (upper-constraint bump) is generated that tests compatibility; this seems to be failing now | 09:37 |
bauzas | elodilles: I don't disagree | 09:37 |
sean-k-mooney | elodilles: well as i noted that oslo tag increases the min python to 3.9 | 09:37 |
bauzas | elodilles: but we're just saying that we can't import it in our bobcat release | 09:37 |
sean-k-mooney | elodilles: so it cannot be included in bobcat without reverting that | 09:37 |
bauzas | anyway, kids taxi for a couple of mins | 09:37 |
sean-k-mooney | oh wait... | 09:38 |
sean-k-mooney | 2023.2 is bobcat | 09:39 |
sean-k-mooney | ok actually it can be included but i would still consider this late | 09:39 |
elodilles | sean-k-mooney: it didn't drop py38 support. py38 is still the min python | 09:39 |
sean-k-mooney | no it did https://github.com/openstack/oslo.log/commit/6abf69e194c9dac13d26bca3e7ac1f710f9e26a0 | 09:40 |
sean-k-mooney | python_requires = >=3.9 | 09:40 |
sean-k-mooney | but 2023.2 is bobcat and bobcat technically did drop 3.8 support | 09:40 |
sean-k-mooney | however we didn't put a min python_requires in nova | 09:40 |
sean-k-mooney | we did attempt to, but there were objections to that as it would break other projects that did not have that change | 09:41 |
elodilles | sean-k-mooney: all the py38 patches were reverted. here i see: https://opendev.org/openstack/oslo.log/src/commit/b5b8c30b0d925aa3d31b58932c94586631827b62/setup.cfg#L9 | 09:41 |
elodilles | * py38 dropping patches | 09:42 |
sean-k-mooney | oh, just reread the runtimes patch too: """All Python-based projects must additionally target and test against following Python versions: | 09:42 |
sean-k-mooney | Python 3.8 (available as default in Ubuntu 20.04)""" | 09:42 |
sean-k-mooney | ok so they reverted that | 09:42 |
elodilles | yepp | 09:42 |
sean-k-mooney | so i'm still trying to figure out why it's failing, as none of the failing test cases test/use oslo.log functionality | 09:43 |
sean-k-mooney | i think this is actually breaking due to privsep somehow | 09:43 |
elodilles | anyway, this needs to be sorted out together with oslo team, as Bobcat final release is in 3 weeks | 09:44 |
elodilles | I'd say if there is an easy fix in nova, then we should add that fix in nova ASAP, because that's easier than invalidating a release | 09:44 |
elodilles | sean-k-mooney: thanks for looking into the issue | 09:45 |
sean-k-mooney | oh, https://github.com/openstack/oslo.log/compare/5.2.0...5.3.0 is not in reverse chronological order | 09:46 |
sean-k-mooney | so ya i see the revert, i was confused by that | 09:46 |
elodilles | ++ | 09:47 |
sean-k-mooney | i'm not seeing any change there that could cause this, so i think it must be from a change in privsep | 09:47 |
sean-k-mooney | well | 09:47 |
sean-k-mooney | we do actually log here | 09:47 |
sean-k-mooney | https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L43-L90 | 09:47 |
sean-k-mooney | in the failing test we are making this raise an error https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L57 | 09:48 |
sean-k-mooney | specifically ValueError | 09:49 |
sean-k-mooney | and we are expecting to take this path https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L76-L78 | 09:49 |
sean-k-mooney | we mock os.write and assert it's not called here https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L61 | 09:50 |
sean-k-mooney | sorry off by one | 09:50 |
sean-k-mooney | https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L62 | 09:50 |
sean-k-mooney | i'm wondering if the LOG.error in the exception block is somehow counting as a call to os.write | 09:51 |
sean-k-mooney | in the unit test we should be using a StringIO buffer for logging in the test fixture | 09:51 |
sean-k-mooney | so there should not be any write calls to stdout or a file | 09:52 |
sean-k-mooney | i'll see if i can reproduce it locally, but it does not really make sense why this would pass on 5.2.0 and not 5.3.0 | 09:52 |
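For context, a minimal sketch of the test shape being described (hypothetical names, plain stdlib logging rather than oslo.log): os.write is mocked for the whole test and the error path is only expected to log. With stdlib logging this passes; the discussion above suggests that with oslo.log 5.3.0 the LOG.error call itself ends up invoking os.write, which is exactly what the assert_not_called() would catch.

```python
# Hedged sketch, not the actual nova test: the error path only logs, and the
# test globally mocks os.write and asserts it is never called.  Any library
# change that makes the logger call os.write internally trips that assert.
import logging
from unittest import mock

LOG = logging.getLogger(__name__)


def check_direct_io_support():
    """Stand-in for the nova.privsep.utils code path discussed above."""
    try:
        raise ValueError("simulated O_DIRECT probe failure")
    except ValueError as exc:
        LOG.error("Direct IO probe failed: %s", exc)  # logging only, no os.write
        return False


@mock.patch("os.write")
def test_error_path_does_not_write(mock_write):
    assert check_direct_io_support() is False
    mock_write.assert_not_called()  # passes with stdlib logging handlers


if __name__ == "__main__":
    test_error_path_does_not_write()
    print("ok")
```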
frickler | sorry, was away for a bit. in general, new openstack library releases are not affected by requirements freeze I think. in particular if merging them is delayed by other projects | 09:53 |
frickler | so oslo.log 5.3.0 is part of the bobcat release and thus should make it into u-c as well if possible | 09:54 |
frickler | ah, elodilles said that in between already | 09:55 |
sean-k-mooney | frickler: the final release is meant to happen before FF | 10:04 |
sean-k-mooney | that's why we have the non-client freeze before it | 10:04 |
sean-k-mooney | so we do not expect to have any new non-client release in u-c between FF and RC-1 unless it's an RC-1 fix | 10:04 |
sean-k-mooney | it looks like the release was actually done 13 days ago | 10:05 |
sean-k-mooney | so that would have been before FF | 10:05 |
frickler | yes, it is just the u-c bump that is lagging | 10:05 |
sean-k-mooney | right, which is not good because that kind of defeats the reason for the earlier release | 10:06 |
sean-k-mooney | but i guess in this case it just would have been better to highlight this earlier | 10:06 |
sean-k-mooney | i.e. that it was not passing | 10:07 |
sean-k-mooney | i'm running the tests with the older release now and then i'll manually unpin and see if this reproduces for me locally | 10:08 |
frickler | yes, handling by the requirements team could be improved if they had more participants I guess. it also isn't helped by the CI being unstable in general. I'm just trying to fill in some of the gaps where I can | 10:08 |
sean-k-mooney | so locally passes with 5.2.0 and fails with 5.3.0 | 10:10 |
sean-k-mooney | so thats a good start | 10:10 |
sean-k-mooney | looks like the same error too | 10:11 |
sean-k-mooney | although for some reason my tox execution seems to be hanging... | 10:11 |
sean-k-mooney | oh right the other test is a timeout exception | 10:16 |
sean-k-mooney | so it was just waiting on that | 10:16 |
opendevreview | Merged openstack/python-novaclient stable/2023.2: Update .gitreview for stable/2023.2 https://review.opendev.org/c/openstack/python-novaclient/+/894074 | 10:45 |
sean-k-mooney | this is kind of insane | 11:07 |
sean-k-mooney | apparently this LOG.error is causing the test to fail | 11:07 |
sean-k-mooney | https://github.com/openstack/nova/blob/master/nova/privsep/utils.py#L77C13-L78C62 | 11:07 |
sean-k-mooney | this passes https://paste.opendev.org/show/bUyh4M9k01rrYaP58223/ | 11:08 |
sean-k-mooney | but https://paste.opendev.org/show/b4wfwcpSB6XdjPZzC5UN/ fails | 11:08 |
sean-k-mooney | but the error is | 11:09 |
sean-k-mooney | AssertionError: Expected 'write' to not have been called. Called 1 times. | 11:09 |
sean-k-mooney | Calls: [call(7, b'X')]. | 11:09 |
sean-k-mooney | the commented out code | 11:10 |
sean-k-mooney | # m.write(b"x" * align_size) | 11:10 |
sean-k-mooney | # os.write(fd, m) | 11:10 |
sean-k-mooney | would write X to fd 7 | 11:10 |
sean-k-mooney | but it's commented out... | 11:10 |
sean-k-mooney | and the log message is not writing that | 11:10 |
sean-k-mooney | so there is something funky happening: https://github.com/openstack/oslo.log/compare/5.3.0...5.2.0 and https://github.com/openstack/oslo.log/compare/5.2.0...5.3.0 | 11:30 |
sean-k-mooney | should not both show patches, but they do | 11:30 |
elodilles | sean-k-mooney: now that i'm looking at the git tree, it shows that we had some issue with oslo.log at the antelope release, thus reverted the two patches, but we did it on the stable/2023.1 branch and not on master | 11:37 |
sean-k-mooney | ack | 11:39 |
sean-k-mooney | perhaps i should be diffing against a different base | 11:39 |
sean-k-mooney | the failure is definitely coming from the LOG.error call | 11:40 |
sean-k-mooney | but i can't see any patch that would cause this in those diffs | 11:40 |
elodilles | yes, 5.1.0..5.3.0, but it still does not show the reverts that we only merged on stable/2023.1 :/ | 11:40 |
elodilles | there are 3 patches reverted in 5.2.0 | 11:41 |
sean-k-mooney | just so we are on the same page https://paste.opendev.org/show/bHorJ0sp6UTAKfu2EbJ2/ passes | 11:41 |
sean-k-mooney | it's the LOG.error on line 34 | 11:42 |
sean-k-mooney | that is causing the issue, but the message it is logging and the assert that is failing do not make sense | 11:42 |
sean-k-mooney | no matter what i log it causes Calls: [call(7, b'X')] on os.write | 11:43 |
elodilles | sean-k-mooney: these were reverted on 5.2.0, but not in master (5.3.0): https://paste.opendev.org/show/bRU8ommJhreoVpdiWMRV/ | 11:45 |
elodilles | one of them has to be related to the issue | 11:45 |
sean-k-mooney | so the eventlet fix was added and removed, right | 11:46 |
sean-k-mooney | i'm going to clone oslo.log and pip install -e it into the env | 11:47 |
sean-k-mooney | and i guess effectively do a bisect | 11:47 |
frickler | so this whole issue is 6 months old and got successfully ignored? cool | 11:48 |
sean-k-mooney | no idea really. it's only causing two tests to fail | 11:49 |
sean-k-mooney | so i don't think it's actually breaking real code | 11:49 |
sean-k-mooney | however i'm not particularly happy with just modifying the test without understanding why this is breaking | 11:50 |
sean-k-mooney | as it's really not obvious why this is happening | 11:50 |
sean-k-mooney | and looking at the patch delta i would not expect this type of change between 5.2.0 and 5.3.0 | 11:50 |
elodilles | stephenfin: do you have any idea what could be a solution for this oslo.log issue? ^^^ | 11:53 |
sean-k-mooney | so it's broken by 94b9dc3 | 11:56 |
frickler | sean-k-mooney: 5.2.0 is a bugfix release only done on stable/2023.1, 5.1.0 is the initial release for 2023.1 | 11:56 |
sean-k-mooney | https://github.com/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c17 | 11:56 |
elodilles | yepp | 11:56 |
frickler | so 5.1.0..5.3.0 gives a better idea of what happened in master. though iiuc the issue is in 5.0.0..5.1.0 | 11:57 |
frickler | 94b9dc3 is 5.0.1, ack | 11:58 |
sean-k-mooney | so with that, calls to LOG.error are internally using os.write | 11:59 |
sean-k-mooney | presumably it's related to the lock it's taking | 11:59 |
sean-k-mooney | here https://github.com/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c17#diff-2c76d41c287653560e6e84f39ce877b4f9ba33c7b36db17194e7435cd54adfb0R41 | 11:59 |
sean-k-mooney | its this call on release | 12:00 |
sean-k-mooney | https://github.com/openstack/oslo.log/commit/94b9dc32ec1f52a582adbd97fe2847f7c87d6c17#diff-2c76d41c287653560e6e84f39ce877b4f9ba33c7b36db17194e7435cd54adfb0R108 | 12:01 |
sean-k-mooney | line 108 | 12:01 |
sean-k-mooney | that is causing the test failure specifically | 12:01 |
sean-k-mooney | the next commit makes this conditional | 12:01 |
sean-k-mooney | https://github.com/openstack/oslo.log/commit/de615d9370681a2834cebe88acfa81b919da340c | 12:01 |
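To make the failure mode concrete, here is an illustrative-only pipe-based mutex sketch (an assumption about the pattern, NOT the actual oslo.log code from the commits linked above): release() puts a token back by writing a byte to a pipe fd, so any code that logs under such a lock while os.write is mocked will record that call even though the code under test never calls os.write itself.

```python
# Illustrative pipe-based mutex (assumption: this mirrors the pattern, NOT the
# exact oslo.log implementation).  acquire() consumes a token from the pipe,
# release() writes one back -- so every lock/unlock cycle calls os.write.
import os
import threading


class PipeMutexSketch:
    def __init__(self):
        self.rfd, self.wfd = os.pipe()
        os.write(self.wfd, b'X')          # one token in the pipe == unlocked
        self._owner = None

    def acquire(self):
        os.read(self.rfd, 1)              # blocks until the token is available
        self._owner = threading.get_ident()

    def release(self):
        self._owner = None
        os.write(self.wfd, b'X')          # the os.write a mocked test would see

    def __enter__(self):
        self.acquire()
        return self

    def __exit__(self, *exc):
        self.release()


if __name__ == "__main__":
    mutex = PipeMutexSketch()
    with mutex:
        pass                              # acquire + release, one extra os.write
```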
sean-k-mooney | so i guess there are two ways to fix this | 12:02 |
sean-k-mooney | i could probably disable this fix in this test | 12:02 |
sean-k-mooney | or i might be able to mock the lock | 12:02 |
sean-k-mooney | the issue is we have a StandardLogging fixture in our base test class which handles all or most of the logging config | 12:03 |
sean-k-mooney | i can also mock the LOG var on the module i guess | 12:03 |
sean-k-mooney | we occasionally do that, so i would just replace LOG with a mock explicitly | 12:04 |
elodilles | so this issue only comes up in nova's tests, but could not cause any problem outside the test code, you mean? | 12:05 |
sean-k-mooney | it's not breaking any real code | 12:07 |
sean-k-mooney | it's just breaking the test case as we are mocking os.write and testing a function that uses it directly | 12:07 |
sean-k-mooney | and oslo.log is not meant to directly call os.write | 12:07 |
sean-k-mooney | or at least it did not previously | 12:07 |
sean-k-mooney | the "eventlet logging fix" patch adds a lock that internally calls os.write | 12:08 |
elodilles | hmmmm, i see | 12:10 |
sean-k-mooney | self.useFixture(fixtures.MonkeyPatch("nova.privsep.utils.LOG", mock.Mock())) | 12:11 |
sean-k-mooney | that should be the fix | 12:11 |
sean-k-mooney | ya that works | 12:11 |
sean-k-mooney | i'll push that up as a patch | 12:12 |
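For reference, a rough sketch of where that quoted fixture line would sit in a test case (assumes nova is importable; the real nova base class and test names differ): fixtures.MonkeyPatch swaps the module-level LOG for a mock for the duration of the test, so LOG.error never reaches real logging handlers and their os.write calls.

```python
# Rough sketch only -- the real nova test case and base class are different.
# fixtures.MonkeyPatch replaces nova.privsep.utils.LOG with a mock.Mock() for
# the lifetime of the test, so LOG.error never reaches real logging handlers.
import fixtures
import testtools
from unittest import mock


class SupportsDirectIOTestCase(testtools.TestCase):
    def setUp(self):
        super().setUp()
        self.useFixture(
            fixtures.MonkeyPatch("nova.privsep.utils.LOG", mock.Mock()))

    # ... the existing tests that mock os.write stay unchanged ...
```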
elodilles | sean-k-mooney: thanks! \o/ | 12:27 |
opendevreview | sean mooney proposed openstack/nova master: adapt to oslo.log changes https://review.opendev.org/c/openstack/nova/+/894538 | 13:24 |
sean-k-mooney | frickler: elodilles: bauzas: ^ that should fix that specific issue | 13:24 |
bauzas | just saw the patch | 13:24 |
sean-k-mooney | there may be more generic ways to address this, but i don't really want to do anything more invasive or global right now | 13:25 |
bauzas | sean-k-mooney: I wish oslo.log would have had an upgrades relnote | 13:26 |
sean-k-mooney | well they were not aware, i guess, that we were sensitive to things like calls to os.write | 13:26 |
sean-k-mooney | it does not break runtime code | 13:26 |
sean-k-mooney | with that said i need to test this with our functional tests as well | 13:27 |
sean-k-mooney | so im going to do that now | 13:27 |
bauzas | cool | 13:28 |
opendevreview | Merged openstack/python-novaclient stable/2023.2: Update TOX_CONSTRAINTS_FILE for stable/2023.2 https://review.opendev.org/c/openstack/python-novaclient/+/894076 | 13:53 |
sean-k-mooney | bauzas: so the functional tests also work fine with 5.3.0 | 14:22 |
sean-k-mooney | so it really was just those two specific unit tests that were impacted | 14:22 |
bauzas | I voted +2 | 14:50 |
bauzas | folks (esp. cores), I have a lot of changes that need to be merged before RC1 : https://etherpad.opendev.org/p/nova-bobcat-rc-potential | 14:51 |
bauzas | at least : https://review.opendev.org/c/openstack/nova/+/893742 https://review.opendev.org/c/openstack/nova/+/893744 https://review.opendev.org/c/openstack/nova/+/893749 | 14:52 |
bauzas | dansmith: sean-k-mooney: gmann: ^ | 14:53 |
dansmith | bauzas: yep I saw, I'll work through the etherpad when I'm done with my current thing | 14:56 |
bauzas | ++ | 14:56 |
sean-k-mooney | stephenfin: can you review https://review.opendev.org/c/openstack/nova/+/894538 | 15:10 |
stephenfin | sure | 15:10 |
sean-k-mooney | bauzas: i'll start looking at those now too | 15:11 |
stephenfin | sean-k-mooney: is the print intentional? Do we need it? | 15:11 |
sean-k-mooney | did we really have no api microversion this cycle | 15:11 |
sean-k-mooney | ok i thought we did | 15:11 |
bauzas | sean-k-mooney: no, we haven't, nor any RPC change | 15:12 |
bauzas | it was a very smooth release in terms of upgrade | 15:12 |
sean-k-mooney | this is both good and bad for the first slurp release | 15:12 |
sean-k-mooney | stephenfin: no it was not, good catch | 15:13 |
stephenfin | cool, can +2/+W once it's respin | 15:14 |
stephenfin | *respun | 15:14 |
bauzas | sean-k-mooney: I thought your print was needed | 15:14 |
sean-k-mooney | i can add a log for it | 15:14 |
sean-k-mooney | that was me just debugging things | 15:14 |
sean-k-mooney | the way we use pbr does not work with the debugger in vscode | 15:15 |
bauzas | cool, just respin then | 15:15 |
bauzas | hah | 15:15 |
bauzas | I personally would rather use pdb with an entrypoint I add | 15:15 |
bauzas | but I understand your ask for walking thru | 15:15 |
opendevreview | sean mooney proposed openstack/nova master: adapt to oslo.log changes https://review.opendev.org/c/openstack/nova/+/894538 | 15:16 |
stephenfin | thanks | 15:17 |
sean-k-mooney | bauzas: i mentioned this to you before but can we also try to land https://review.opendev.org/c/openstack/nova/+/860829 today | 15:26 |
bauzas | sean-k-mooney: this is just changing the way we query, right? | 15:33 |
bauzas | my only concern is how much it would impact the SQL query plan | 15:33 |
sean-k-mooney | we need to do it anyway | 15:33 |
bauzas | and what would be the impact in terms of performance | 15:33 |
bauzas | sean-k-mooney: because of SQLA 3, right? | 15:34 |
bauzas | S/3/2 | 15:34 |
sean-k-mooney | for 2 yes | 15:34 |
sean-k-mooney | we are changing the way we describe the join | 15:34 |
sean-k-mooney | but it should not materially change the query | 15:35 |
bauzas | that, I understood | 15:35 |
bauzas | ideally, if the execution plan isn't changing, that's a no-brainer | 15:35 |
bauzas | but I don't know this | 15:35 |
sean-k-mooney | it's been a while since i read the docs, but i'm not expecting it to modify the executed SQL at all | 15:35 |
sean-k-mooney | i read https://docs.sqlalchemy.org/en/14/orm/backref.html and stephen's blog before reviewing: https://that.guru/blog/sqlalchemy-relationships-without-foreign-keys/ | 15:37 |
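For illustration only (hypothetical models and columns, not the actual nova patch): the pattern from the blog post linked above is to describe the join explicitly with a primaryjoin and a foreign() annotation instead of relying on a database-level foreign key, which changes how the relationship is declared but should not change the SQL that gets emitted.

```python
# Hypothetical sketch of the pattern from the blog post linked above; the
# model and column names are illustrative, not nova's actual schema change.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base, foreign, relationship

Base = declarative_base()


class Instance(Base):
    __tablename__ = 'instances'
    id = Column(Integer, primary_key=True)
    uuid = Column(String(36), nullable=False)


class InstanceInfoCache(Base):
    __tablename__ = 'instance_info_caches'
    id = Column(Integer, primary_key=True)
    instance_uuid = Column(String(36), nullable=False)

    # No FK constraint exists, so the join condition is spelled out and the
    # "foreign" side is annotated explicitly rather than inferred.
    instance = relationship(
        Instance,
        primaryjoin=lambda: Instance.uuid == foreign(InstanceInfoCache.instance_uuid),
        uselist=False,
        viewonly=True,
    )
```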
bauzas | I hope you understand my concern : if we merge anything this late in the cycle that would impact the query time on some large and frequently called records like instance or instance info cache, operators would jump on our throats | 15:37 |
bauzas | stephenfin: around ? | 15:38 |
sean-k-mooney | bauzas: i do, but i don't think that is a reason not to merge this | 15:38 |
sean-k-mooney | no, he just left to head into the city | 15:38 |
sean-k-mooney | he will be around tomorrow | 15:38 |
bauzas | kk | 15:38 |
sean-k-mooney | we can chat about it in the team meeting tomorrow if you like | 15:38 |
bauzas | sean-k-mooney: and I assume you want to address this in this cycle so that the SQLA 2 bump is considered 100% complete for Bobcat ? | 15:39 |
sean-k-mooney | yes | 15:39 |
sean-k-mooney | i want that to be one of our cycle highlights, for what it's worth | 15:39 |
bauzas | in the prelude if you want | 15:39 |
bauzas | cycle highlights tend to be marketing-readable | 15:40 |
sean-k-mooney | well i would be happy even if it's in the normal release notes, since all of the above means we finished it :) | 15:40 |
bauzas | I mean, don't get me wrong | 15:40 |
bauzas | this looks like a quick win | 15:40 |
sean-k-mooney | well, support for sqlalchemy 2.0 is good but we need all the other projects to also support it before we can fully claim victory | 15:41 |
bauzas | and I wouldn't disagree merging such things | 15:41 |
sean-k-mooney | for what it's worth this was first proposed 11 months ago and we punted it once already | 15:42 |
bauzas | a long time ago, in a galaxy far far away, we had a DB job | 15:42 |
bauzas | that ensured our performance was stable | 15:42 |
bauzas | sean-k-mooney: I get your frustration and again, I'm not against merging it | 15:42 |
bauzas | I'm just asking for clarification | 15:42 |
bauzas | and visibility | 15:43 |
sean-k-mooney | yep i understand | 15:43 |
opendevreview | Merged openstack/nova master: doc: mark the maximum microversion for 2023.2 Bobcat https://review.opendev.org/c/openstack/nova/+/893742 | 15:48 |
opendevreview | sean mooney proposed openstack/nova-specs master: replace() argument 1 must be str, not _StrPath https://review.opendev.org/c/openstack/nova-specs/+/894553 | 16:11 |
sean-k-mooney | bauzas: ^ that will fix the nova-specs docs job | 16:12 |
opendevreview | Merged openstack/nova master: adapt to oslo.log changes https://review.opendev.org/c/openstack/nova/+/894538 | 16:42 |
opendevreview | Merged openstack/nova-specs master: replace() argument 1 must be str, not _StrPath https://review.opendev.org/c/openstack/nova-specs/+/894553 | 16:59 |
atmark | hello, is it possible to migrate a cell back to non-cell on an existing deployment? | 17:04 |
dansmith | atmark: there's no "non-cell" mode in nova | 17:07 |
dansmith | maybe explain more about what you're trying to do | 17:08 |
greatgatsby_ | Hello. Our host aggregates seem to get out of sync with our provider aggregates. There seems to be a `nova-manage placement sync_aggregates` command, but we're confused how they're getting out of sync in the first place. Any suggestions of what could be the cause? This is deployed via kolla-ansible yoga | 18:41 |
*** bauzas_ is now known as bauzas | 19:13 | |
*** bauzas_ is now known as bauzas | 19:27 | |
*** bauzas_ is now known as bauzas | 21:47 | |
*** bauzas_ is now known as bauzas | 22:04 | |
*** bauzas_ is now known as bauzas | 22:28 | |
*** bauzas_ is now known as bauzas | 23:05 | |
*** bauzas_ is now known as bauzas | 23:21 | |
*** bauzas_ is now known as bauzas | 23:35 |