An open source wireframe-to-app generator, powered by Llama 3.2 Vision and Groq.
- Llama 3.2 90B Vision from Meta for the vision model
- Llama 3.2 90B Text from Meta for the LLM
- Groq for LLM inference (see the sketch after this list)
- UploadThing for image storage
- Next.js app router with Tailwind
- Expo Snack SDK
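
For context, here is a minimal sketch of how a wireframe screenshot can be sent to a Llama 3.2 vision model through Groq's OpenAI-compatible chat API. It assumes the `groq-sdk` package and a model id such as `llama-3.2-90b-vision-preview`; it is an illustration, not code copied from this repo.

```ts
// Minimal sketch (not the repo's actual code): describe a wireframe image
// with a Llama 3.2 vision model via Groq so a text model can generate app code.
import Groq from "groq-sdk";

const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

export async function describeWireframe(imageUrl: string): Promise<string> {
  const completion = await groq.chat.completions.create({
    // Assumed model id; check Groq's docs for the current vision model names.
    model: "llama-3.2-90b-vision-preview",
    messages: [
      {
        role: "user",
        content: [
          {
            type: "text",
            text: "Describe this wireframe so another model can generate app code from it.",
          },
          { type: "image_url", image_url: { url: imageUrl } },
        ],
      },
    ],
  });
  return completion.choices[0]?.message?.content ?? "";
}
```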
- Clone the repo: `git clone https://github.com/mundume/quickchat-ai`
- Create a `.env.local` file and add your Groq API key: `GROQ_API_KEY=`
- Create an UploadThing account and add its credentials to your `.env.local` file. All required values are listed in the `.env.example` file (see the example below).
- Run `pnpm i` and `pnpm dev` to install dependencies and run locally.
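
A `.env.local` might look roughly like this. The Groq key name comes from the step above; the UploadThing key names here are assumptions, so copy the exact names from `.env.example`:

```bash
# Example .env.local (illustrative values)
GROQ_API_KEY=your_groq_api_key
UPLOADTHING_SECRET=your_uploadthing_secret   # assumed name; see .env.example
UPLOADTHING_APP_ID=your_uploadthing_app_id   # assumed name; see .env.example
```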
This project was inspired by Nutlope's amazing Napkins.dev.