

The bigger issue is that it's being used in a GPLv3 project, which kind of isn't allowed.
I followed the links and I think the original argument being referenced has been twisted around a bit, game-of-telephone style. It isn't claiming that the GPL prohibits inclusion of LLM-generated code; it's more that they think an AI trained on GPL code violates the license when it happens to reproduce that code exactly:
it is readily apparent that GitHub Copilot is capable of returning, verbatim, already extant code (although it does attempt to synthesise novel code based on its training data). This immediately raises the issue, what happens when that code (such as the previous example) is licensed under a copyleft license such as the GPL or AGPL? How is the matter of copyright in this instance resolved?
https://github.com/ZDoom/gzdoom/issues/3395
https://www.fsf.org/licensing/copilot/on-the-nature-of-ai-code-copilots (section 5, "What About Copyright?")
It might also be the case that the GPL prohibits LLM-generated code somehow; I don't actually know. I just want to point out that no one has actually made that argument.


I use this and contributed a small bug fix to it the other year. It's pretty functional, although one thing I wish it had is a way to see quarantined/spam-labeled messages instead of the filter just blackholing them.