Think-Act-Say llama.cpp Bash Script
Instructions
- Create a file called Example.sh, paste the code into it, and place the file in the examples folder of llama.cpp. If you run into permission issues, copy the permissions from an existing script such as chat.sh (reference: https://www.tecmint.com/copy-file-permissions-and-ownership-to-another-file-in-linux/); see the snippet after this list.
- Run it from the llama.cpp root, as usual (i.e. "./examples/Example.sh").
- Make sure to actually look at the parameters and tweak them so that they match the best model you can run and are optimized for your hardware. Remove "--n_parts 1" if you are not using Alpaca, GPT4All, or another LoRA-integrated model.
- Tweak the name variables and the text describing the world so it matches your own fantasy. Otherwise, enjoy this anal / redhead fetish dream.
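A minimal setup sketch (assuming GNU coreutils and that chat.sh is present in the examples folder; adjust paths to your setup):

touch ./examples/Example.sh                                   # create the file, then paste the code into it
chmod --reference=./examples/chat.sh ./examples/Example.sh    # copy the permission bits from chat.sh
# chmod +x ./examples/Example.sh                              # alternative if --reference is unavailable
./examples/Example.sh                                         # run from the llama.cpp root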
Code:
Updates
- 2023-03-29: Added "keep -9999" to the gen options on the advice of >>92404922.
- 2023-04-02: (1) Removed an unnecessary directory change on the advice of >>92513194. (2) Added the n_parts note because of >>92513645. (3) Changed keep to a value of -1 so the required context length is calculated on load (see the sketch below).
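For illustration only, a hedged sketch of gen options reflecting the updates above. The model path, thread count, context size, and prompt are placeholders, and the -m, -t, -c, -p, --keep, and --n_parts flags are assumed to match the llama.cpp main binary of that era; verify them against your build and the actual script code.

PROMPT="Your world description and character setup go here"   # placeholder prompt text

GEN_OPTIONS=(
    -m ./models/ggml-model-q4_0.bin    # placeholder model path
    -t 8                               # threads; match your CPU
    -c 2048                            # context size
    --keep -1                          # required context length is calculated on load
    --n_parts 1                        # remove if not using a LoRA-integrated model
)

./main "${GEN_OPTIONS[@]}" -p "$PROMPT"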