I’ve been building MinimalChat for a while now, and based on the feedback I’ve received, it’s in a pretty decent place for general use. I figured I’d share it here for anyone who might be interested!

Quick Features Overview:

  • Mobile PWA Support: Install the site like a normal app on any device.
  • Any OpenAI-Formatted API: Works with LM Studio, OpenRouter, etc. (see the request sketch after this list).
  • Local Storage: All data is stored locally in the browser with minimal setup; in Docker, just enter a port and go.
  • Experimental Conversational Mode (GPT Models for now)
  • Basic File Upload and Storage Support: Files are stored locally in the browser.
  • Vision Support with Maintained Context
  • Regen/Edit Previous User Messages
  • Swap Models Anytime: Maintain conversational context while switching models.
  • Set/Save System Prompts: Set a system prompt, and it's saved to a list so prompts can be switched between easily.
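
For anyone unsure what "OpenAI-formatted" means in practice, here's a rough sketch of the request shape such servers accept. The base URL, API key, and model name are placeholders, not the app's actual code; LM Studio's local server defaults to port 1234, but check your own setup:

```ts
// Sketch of an OpenAI-style chat completion request. The base URL and
// model name are placeholders -- point them at whatever OpenAI-compatible
// server you run (LM Studio, OpenRouter, etc.).
const BASE_URL = "http://localhost:1234/v1"; // LM Studio's default local server

const response = await fetch(`${BASE_URL}/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_API_KEY", // most local servers ignore this
  },
  body: JSON.stringify({
    model: "your-model-name",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Hello!" },
    ],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```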

The idea is to make it essentially foolproof to deploy and set up while being generally full-featured and aesthetically pleasing. No additional databases or servers are needed; everything is contained and managed locally inside the web app itself.
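
To illustrate the "no database" point, browser-side persistence really only takes something like the sketch below. The names here (Conversation, the "conversations" key) are illustrative, not the app's actual storage schema:

```ts
// Sketch of browser-side persistence with localStorage -- no server or
// database involved. Shapes and keys are illustrative only.
interface Conversation {
  id: string;
  title: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

function saveConversations(conversations: Conversation[]): void {
  localStorage.setItem("conversations", JSON.stringify(conversations));
}

function loadConversations(): Conversation[] {
  const raw = localStorage.getItem("conversations");
  return raw ? (JSON.parse(raw) as Conversation[]) : [];
}
```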

It's another chat client in a sea of clients, but it's unique in its own ways, in my opinion. Enjoy! Feedback is always appreciated!

Self-Hosting Wiki Section: https://github.com/fingerthief/minimal-chat/wiki/Self-Hosting-With-Docker
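
For reference, deploying a containerized web app like this typically comes down to a single command. The image name and port mapping below are placeholders, not the project's actual values; see the wiki above for those:

```sh
# Illustrative only -- the real image name, tag, and internal port
# are documented in the wiki linked above.
docker run -d --name minimal-chat -p 8080:80 <image-name>
```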

  • SimpleDev@infosec.pub (OP) · 6 months ago

    This project is entirely web-based using Vue 3; it doesn't use LangChain. Honestly, I hadn't looked into it before, but I see they offer a JS library I could utilize. I'll definitely be looking into that!

    As a result, there is no LLM function calling currently, and from what I remember, apps like LM Studio don't support function calling when hosting models locally. Adding the ability to retrieve outside data, like searching the web and generating a response from the results, is definitely on my list.
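
    For context, OpenAI-style function calling means including a tools list in the request and then handling the model's tool calls. A rough sketch of the shape that would need wiring up, with a hypothetical search_web tool (not something the app implements):

    ```ts
    // Sketch of an OpenAI-style function-calling request body. The
    // "search_web" tool and its handler are hypothetical -- this just
    // shows the request shape.
    const body = {
      model: "gpt-4o",
      messages: [{ role: "user", content: "What's the latest Vue release?" }],
      tools: [
        {
          type: "function",
          function: {
            name: "search_web",
            description: "Search the web and return result snippets.",
            parameters: {
              type: "object",
              properties: { query: { type: "string" } },
              required: ["query"],
            },
          },
        },
      ],
    };
    // If the model responds with a tool_calls entry, the client runs the
    // search, appends the result as a "tool" role message, and calls the
    // model again to generate the final answer.
    ```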