Multi-model AI infrastructure for prompting multiple models and comparing their responses side by side.
Built with a Go backend, a vanilla JavaScript frontend, and AI models running across isolated VMs. Features side-by-side model comparison, real-time token streaming, and a modern space-themed UI.
You will first have to set up your locally running models. I used llama.cpp for this; you can find a .service file here as an example of the setup I use.
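For reference, a minimal systemd unit for llama.cpp's `llama-server` might look something like the sketch below. The binary path, model file, port, and user are all placeholders, not the repo's actual values — adjust them to your own setup:

```ini
[Unit]
Description=llama.cpp model server (example)
After=network.target

[Service]
# Placeholder paths and model — point these at your own build and .gguf file
ExecStart=/opt/llama.cpp/build/bin/llama-server \
    --model /opt/models/your-model.gguf \
    --host 127.0.0.1 \
    --port 8080
Restart=on-failure
User=llama

[Install]
WantedBy=multi-user.target
```

Drop it in `/etc/systemd/system/`, then enable it with `systemctl enable --now <name>.service`.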
- Clone

  ```bash
  git clone https://github.com/mengdotzip/ai.meng.zip
  cd ai.meng.zip
  ```

- Start backend

  ```bash
  cd backend && go run main.go
  ```

- Serve frontend

  ```bash
  cd frontend
  ```

  Then serve the files with something like nginx (or use https://github.com/mengdotzip/Mazarin if you are sigma).
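If you go the nginx route, a minimal server block could look like the sketch below. The domain, document root, API path, and backend port are assumptions for illustration — match them to your own deployment and to how the Go backend actually exposes its routes:

```nginx
server {
    listen 80;
    server_name ai.example.com;  # placeholder domain

    # Serve the static frontend files (placeholder path)
    root /var/www/ai.meng.zip/frontend;
    index index.html;

    # Proxy API calls to the Go backend (path and port are assumptions)
    location /api/ {
        proxy_pass http://127.0.0.1:8081;
        proxy_http_version 1.1;
        # Disable buffering so streamed tokens reach the browser immediately
        proxy_buffering off;
    }
}
```

Turning off `proxy_buffering` matters here: with buffering on, nginx would hold the backend's streamed tokens until the response completes, defeating real-time streaming.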
Don't forget to configure the backend to point to your own models; after that, everything should work.