First, install Ollama from https://ollama.com/download
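On Linux, the download page also offers a one-line install script (shown here as it appears on ollama.com/download; verify it there before running):

    curl -fsSL https://ollama.com/install.sh | sh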
Download a model by running the command ollama pull [model name]. You'll find all available models here: https://ollama.com/library.
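For example, to fetch the Mistral 7B model from the library (a reasonable choice here, given the chat script is named mistralchat.py):

    ollama pull mistral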
Start the local Ollama server by running ollama serve.
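To check that the server is up, you can query its HTTP API, which listens on port 11434 by default; the /api/tags endpoint lists the models you have pulled:

    curl http://localhost:11434/api/tags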
Now create a Python virtual environment and install all the required packages, as sketched below.
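A minimal sketch of those steps on Linux/macOS, assuming the project ships a requirements.txt listing its dependencies:

    python -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt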
Running python mistralchat.py -h will show you all the available options.