Simple little web interface for creating characters and chatting with them. It's basically a single HTML file - no server. Share characters using a link (character data is stored within the URL itself). All chat data is stored in your browser using IndexedDB. Currently supports OpenAI APIs and ~any Hugging Face model.
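To illustrate the "character data is stored within the URL itself" idea, here is a minimal sketch of one way a link-based share could work, encoding the character as base64 JSON in the URL hash. The field names, hash key, and encoding are assumptions for illustration only and are not the repo's actual format.

```ts
// Hypothetical sketch of URL-based character sharing; the real project's
// field names and encoding scheme may differ.
interface CharacterCard {
  name: string;
  description: string;
  greeting: string;
}

// Encode the character as base64 JSON in the URL hash so the link itself
// carries the data and no server is needed.
function characterToShareLink(card: CharacterCard): string {
  const payload = btoa(encodeURIComponent(JSON.stringify(card)));
  return `${location.origin}${location.pathname}#character=${payload}`;
}

// Decode a character from the current URL hash, if one is present.
function characterFromShareLink(): CharacterCard | null {
  const match = location.hash.match(/#character=(.+)/);
  if (!match) return null;
  return JSON.parse(decodeURIComponent(atob(match[1]))) as CharacterCard;
}
```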
Attempting to link this to a local LLM. It connects and sends the prompt, and the local LLM parses it and responds correctly, but the character output remains blank. I've tried kobold.cpp and oobabooga text-generation-webui, both with the custom model config and with the optional proxy script. I get the same issue no matter the combination. Any ideas?
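One way to narrow this down is to probe the backend directly and inspect the raw JSON it returns, since a blank chat message while the backend logs a proper completion often means the frontend is looking for the reply text in a different field than the one the backend fills in. The sketch below is only a diagnostic, and it assumes an OpenAI-compatible `/v1/chat/completions` endpoint with kobold.cpp on its default port 5001 (oobabooga's OpenAI-compatible API typically listens on port 5000); adjust the URL for your setup.

```ts
// Diagnostic probe against a local OpenAI-compatible endpoint.
// Assumes kobold.cpp on its default port 5001; change the URL as needed.
async function probeLocalBackend(): Promise<void> {
  const res = await fetch("http://localhost:5001/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local",
      messages: [{ role: "user", content: "Say hello." }],
      max_tokens: 32,
    }),
  });
  const data = await res.json();
  // If the reply text is nested anywhere other than
  // choices[0].message.content, a frontend expecting that exact shape
  // will render an empty message even though the model responded.
  console.log(JSON.stringify(data, null, 2));
}

probeLocalBackend().catch(console.error);
```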