Is there any interest in getting local models to run with this? I’d rather not use Gemini; that way all the data can reside locally (and no login is required).

I’d be happy to work on this, though I’m a Python developer, not a TypeScript one.
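For what it’s worth, most local servers (Ollama, llama.cpp’s server) expose an OpenAI-compatible HTTP API, so the swap might be as small as pointing the client at localhost. Here’s a rough TypeScript sketch assuming Ollama’s default port; the function name, model tag, and request shape are illustrative, not this project’s actual code:

```typescript
// Hypothetical sketch: call a local OpenAI-compatible server instead of Gemini.
// Ollama's default endpoint is shown; llama.cpp's server works the same way.
async function localChat(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen2.5:32b", // whatever model the local server has pulled
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`local model error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```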
I personally love PWAs — why the hate for them? I think more apps should be PWAs instead.
Bookmarked and will come back to this. One thing that may be of interest to add is a config for AMD cards with 20 GB of VRAM. I’d suppose that would be Qwen 2.5 32B with maybe a less strict quant or something.
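For rough sizing: weights alone take about params × bits ÷ 8 bytes, so a 32B model at a 4-bit quant is around 16 GB before KV cache and activations. A quick sketch of that arithmetic in TypeScript (the overhead constant is a loose assumption, not a measurement):

```typescript
// Back-of-the-envelope VRAM estimate: weights = params * bits / 8 bytes,
// plus a rough allowance for KV cache and activations.
// The 2 GB overhead figure is an assumption, not a measurement.
function estimateVramGb(paramsBillions: number, bitsPerWeight: number, overheadGb = 2): number {
  const weightsGb = (paramsBillions * bitsPerWeight) / 8; // 1e9 params and 1e9 bytes/GB cancel
  return weightsGb + overheadGb;
}

console.log(estimateVramGb(32, 4)); // 18 -> fits a 20 GB card with a little headroom
console.log(estimateVramGb(32, 5)); // 22 -> too tight for 20 GB
```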
Also, it may be interesting to look at the AllenAI Molmo-related models. I’m kind of planning to do this myself but haven’t had time as yet.
I think a Creative Commons license would fail to apply to the source material, but using that material should be fair use in almost every context on Lemmy.
Ah, interesting — again, happy to help out if there’s anything I can contribute. I can make a feature request on GitHub if there’s interest.