Thursday, 2025-11-06

sean-k-mooneyfor what it's worth, i believe most llm tool authors and users consider the output to be the same as the compiler case.11:49
sean-k-mooneythe compiler case however has many decades of precedent11:50
sean-k-mooneywhich the llm equivalent does not11:50
sean-k-mooneywhen openstack is legally old enough to drink in most jurisdictions it is deployed in, we may have some light on that topic11:51
fungiwell, it's been plenty old enough to drink at home, it just needs a fake id to get into clubs14:42
clarkbsean-k-mooney: but not claude importantly which is the tool I see most people using when pushing code16:23
clarkbI think that is the hang up. I have no issue with people using tools that make it clear it is ok to use them this way. claude seems to assert ownership over its output that they convey to you only if you follow their rules16:24
sean-k-mooneyi don't read claude code as having the implication you do16:24
sean-k-mooneyalso that agreement applies to the anthropic api service16:25
sean-k-mooneynot the claude cli16:25
sean-k-mooneythe two are separate things; you can use the claude cli without any anthropic service agreement16:25
clarkbclaude cli isn't making api requests to the same apis with the same user account(s)?16:25
sean-k-mooneyit can but does not have to16:26
sean-k-mooneybut the service agreement is for my anthropic account16:26
sean-k-mooneynot the tool16:26
clarkbbut your anthropic account is what ultimately generates the code you push16:26
sean-k-mooneywhen i'm not using other models with it, yes16:27
clarkbthe frontend doesn't matter so much here as the license terms for the thing creating the output16:27
sean-k-mooneyi'm using the claude code cli with glm-4.6 as well; if i used the redhat-provided claude access, that would be using sonnet from google vertex16:27
sean-k-mooneywith no agreement with anthropic for that16:27
clarkband you'd annotate your commits saying glm-4.6 or sonnet instead of claude in those cases16:28
sean-k-mooneywell i would, and do put claude glm-4.6, yes16:28
sean-k-mooneyor claude sonnet sometimes16:28
clarkband then I'd have to go look up their license terms and determine if as the code reviewer I felt there were similar issues16:28
clarkbdoing a quick search in gerrit, claude and sonnet both show up but glm-4.6 does not (I could be doing something wrong with the search terms too fwiw)16:29
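[editor's note: a quick search like the one described above can be scripted against Gerrit's REST change-query endpoint (the `message:` search operator). This is a minimal sketch; the helper names are made up, and review.opendev.org is assumed as the OpenStack Gerrit instance.]

```python
import json
import urllib.parse

# Hypothetical helper: build the Gerrit REST URL for searching commit
# messages, using Gerrit's `message:` change-query operator.
def gerrit_search_url(base, term, limit=10):
    query = urllib.parse.quote(f'message:"{term}"')
    return f"{base}/changes/?q={query}&n={limit}"

# Gerrit prefixes its JSON responses with )]}' to defeat XSSI attacks,
# so the prefix has to be stripped before parsing.
def parse_gerrit_json(body):
    return json.loads(body.removeprefix(")]}'"))

print(gerrit_search_url("https://review.opendev.org", "glm-4.6"))

# Demonstrate the XSSI-prefix stripping on a canned response body.
sample = ')]}\'\n[{"project": "openstack/nova", "subject": "Example"}]'
print(parse_gerrit_json(sample)[0]["project"])
```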
sean-k-mooneyya but even when using anthropic's service i think people will differ on whether there is an issue16:29
sean-k-mooneyclarkb: i don't know if i have pushed anything with glm to gerrit16:29
clarkbyup understood. I've made my position clear I think with the comparison to being presented with code from a human under the same terms. I would not accept them in that case16:30
sean-k-mooneyi only got access since the summit16:30
clarkband I've yet to hear a compelling argument for why an llm producing code under the same terms can be ignored/accepted16:30
clarkbbut I can see that those arguments exist and aren't baseless16:30
clarkbthey just haven't convinced me yet16:30
sean-k-mooney:) and ya in this specific case i'm not trying to convince as much as just participate in this conversation16:31
sean-k-mooneyclarkb: ah it was my github repo https://github.com/SeanMooney/openstack-ai-style-guide/commit/5ae1e48f50ac008d8bdaeeaede76a0edc097bc4b16:34
sean-k-mooneyand using opencode in that case, but i have way too many tools installed currently :)16:35
sean-k-mooneyz.ai's terms use both clearer and less clear language16:41
sean-k-mooney"""Ownership of content. As between you and us, and to the fullest extent permitted by applicable law, you retain all rights, title, and interest in the Prompts you submit and the Outputs generated specifically at your request and provided to you as a response to your submitted Prompts. You acknowledge and agree that the Outputs generated may lack uniqueness and could be similar16:41
sean-k-mooneyor identical to Outputs generated for other users or any Third Party. Consequently, your rights in specific Outputs, if any, may not extend to Outputs generated at the request of other users or any Third Party."""16:41
clarkbfor open source that seems a lot less problematic since open source doesn't mind if other people also use the content16:42
clarkb(again I'm not a lawyer but my initial impression of that is something I would be far more comfortable with)16:43
sean-k-mooneyya they also have a clause where you explicitly grant them a non-exclusive license to use the generated content16:43
sean-k-mooneysection 3 of https://docs.z.ai/legal-agreement/terms-of-use#iv-content16:44
sean-k-mooneythey do not even transitively imply that they have any rights over the input or output16:44
sean-k-mooneybut by using their subscription plan you are also granting them rights via section 3 "Our Use of User Content"16:45
sean-k-mooney"""For enterprises and developers using API Services, we will not use your User Content for developing or improving Services unless you explicitly agree to such use.""16:46
sean-k-mooneythat is nice so if you are not using there hosted chat interface there is an out if your just using the api16:46
sean-k-mooneyhttps://docs.z.ai/legal-agreement/terms-of-use#4-ip-rights16:47
sean-k-mooneyit would be nice if anthropic provided such clear terms16:50
sean-k-mooneyhttps://github.com/zai-org/GLM-4 is an open-weight, apache2-licensed opensource model with similar performance to anthropic's sonnet 4.516:51
sean-k-mooneyactually that's the 4.0 link, but https://huggingface.co/zai-org/GLM-4.6 is 4.6; 4.0 is much less powerful but 4.6 is still open weight and opensource under the apache2 https://github.com/zai-org/GLM-4.516:54
sean-k-mooneyi have not had time to use it too much yet but i'm hoping this will be one of the first viable open models to use instead of sonnet16:55
sean-k-mooneybut even with the unsloth quantizations it's 714GB full size, or 96.8GB with the 1-bit quant https://huggingface.co/unsloth/GLM-4.6-GGUF so it's still too big for most people to run locally16:57
sean-k-mooneyglm 4.6 was only released on 2025-09-30 so i don't think many will have had time to try it yet16:59
sean-k-mooneyclarkb: i don't know if the emergence of open-weight models that are competitive helps your concerns or not, but i know i'm personally planning to see if it can replace sonnet in my own workflow over the next few months17:05
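[editor's note: the sizes above follow from simple arithmetic. A rough sketch, assuming the ~357B parameter count is inferred from the stated 714GB bf16 size (2 bytes per parameter) rather than taken from an official spec; real GGUF quantizations also carry per-block scale overhead, which is why the "1-bit" file is ~96.8GB rather than the naive figure below.]

```python
def weight_size_gb(n_params, bits_per_weight):
    """Naive size of the weight tensors alone, in decimal GB.

    Ignores quantization metadata (per-block scales, embeddings kept
    at higher precision, etc.), so it underestimates real GGUF files.
    """
    return n_params * bits_per_weight / 8 / 1e9

# ~357B params implied by the 714GB bf16 (16-bit) figure in the log.
n_params = 714e9 / 2

for bits in (16, 8, 4, 1):
    print(f"{bits:>2}-bit: ~{weight_size_gb(n_params, bits):,.0f} GB")
```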
clarkbsean-k-mooney: I think when the policy was developed I at least expected that the "make sure the model is compatible" (paraphrasing) would lead to more use of open weight models with fewer restrictions17:06
fungiyeah, as i mentioned yesterday, it was mostly written while osi was still working on solidifying the osaid, with a goal of encouraging use of open source ai (as deemed by osi)17:34
sean-k-mooneyfungi: fyi i suggested that joan send the notice to say the branch would be eol for watcher20:14
sean-k-mooneyi wanted to make sure it got cleaned up proactively because there was no one doing maintenance on it20:14
fungisean-k-mooney: yeah it was a good suggestion. my point was not that the watcher team is doing anything wrong there, more that if the process were working as intended they should never have needed to at all20:15
sean-k-mooneyya there was a little confusion when the new release liaison reached out to the release team20:16
sean-k-mooneyoriginally they thought we should be keeping the branch but i linked and explained how it works20:16
fungiat the moment we've got a ton of unmaintained branches with nobody explicitly agreeing to become caretakers, so those branches should be getting deleted anyway but that's clearly not been happening20:16
sean-k-mooneyyep when we revived watcher i wanted to be proactive to indicate that you should not look at the older ones20:17
sean-k-mooneys/look at/run in production/20:17
fungiwell, hopefully calling the branches "unmaintained" at least dissuades anyone from thinking they're suitable for future production use20:19

Generated by irclog2html.py 4.0.0 by Marius Gedminas - find it at https://mg.pov.lt/irclog2html/!