Heavily WIP feature.
Quickstart:
- Download koboldcpp: https://github.com/LostRuins/koboldcpp/releases
- Download any LLM model you want to run (GGUF format), for example:
https://huggingface.co/BeaverAI/Rocinante-12B-v2k-GGUF/blob/main/Rocinante-12B-v2k-Q6_K.gguf
https://huggingface.co/TheBloke/Silicon-Maid-7B-GGUF/blob/main/silicon-maid-7b.Q5_K_S.gguf
- Start koboldcpp with default settings: browse, select your model, launch.
- Do not touch the LLM settings in aiplayerbot.conf; they should all stay commented out (#) unless you know what you are doing.
- Start the server.
- Give the bot 'nc +ai chat'.
- Talk to the bot in say, yell, party, guild, or whisper.
- Watch the koboldcpp window react and generate a response.
- Chat away.
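If the bot stays silent, you can sanity-check the koboldcpp endpoint outside the game. A minimal sketch, assuming koboldcpp's default listen address of http://localhost:5001 and its KoboldAI-style API (the prompt text and max_length value are just examples):

```shell
# Request body for the KoboldAI-style /api/v1/generate endpoint.
payload='{"prompt": "Hello there!", "max_length": 40}'
echo "$payload"

# With koboldcpp running on its default port, you could send it like this:
# curl -s http://localhost:5001/api/v1/generate \
#   -H 'Content-Type: application/json' \
#   -d "$payload"
# And check which model is loaded:
# curl -s http://localhost:5001/api/v1/model
```

If the curl request returns generated text, the endpoint is fine and the problem is on the server/config side.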
(Small FYI: any LLM endpoint will do, but it will need proper config changes. If you get one working, let us know how so we can add it to this guide.)
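Pointing the mod at a different endpoint generally means uncommenting and editing the LLM entries in aiplayerbot.conf. A sketch of what that looks like; the key names below are illustrative placeholders, not the real config keys, so check the comments in your own aiplayerbot.conf for the actual ones:

```
# Illustrative placeholders only -- the real key names are documented
# in the comments of aiplayerbot.conf itself.
# LLM.ApiEndpoint = http://localhost:5001/api/v1/generate
# LLM.ApiKey     =    (usually only needed for hosted endpoints)
```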