Ollama released a version that works on Windows, so I gave it a shot, and it worked quite well.
That gave me the push to install Ollama and run Mistral on Arch, and so far it is working pretty well.
Installation:
curl -fsSL https://ollama.com/install.sh | sh
A very painless installation on Linux.
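The install script also sets up a systemd service, so a quick sanity check (assuming the default service name and port) is to look at the service or just hit the local API, which should answer that Ollama is running:
systemctl status ollama
curl http://127.0.0.1:11434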
ollama pull mistral
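To try the model before wiring up any frontend, you can chat with it straight from the terminal. Without a prompt argument you get an interactive session (/bye exits); the prompt below is just an example:
ollama run mistral "Explain pacman vs yay in one paragraph"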
I also use open-webui, a web frontend for Ollama. It runs through Docker and worked out of the box.
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
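Since the container uses host networking, the UI should come up on open-webui's default port, i.e. http://localhost:8080 (the port is an assumption based on the image defaults). If it doesn't, the container logs usually say why:
docker logs -f open-webui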
So far Mistral is worse than ChatGPT (which runs GPT-3.5), but for random things it is good enough. I also tried Codellama, but that seemed slightly worse, and Mistral is more general purpose, so I'm sticking with it for now.