- ❄️ Rukia Code Freezer Debugger Agent
- 🔬 Nemu Lab Architect Research Agent
- ✨ Orihime Bug Healer Refactor Agent
- 🎯 Makima Flow Controller Orchestrator
"The fastest AI-powered code generation platform, now with peer-to-peer collaboration"
Groq Coder is a production-grade acceleration engine that leverages the Groq LPU™ (Language Processing Unit) for blazing-fast inference. Unlike GPU-based solutions that batch requests, it streams tokens in real time, enabling developers to iterate at the speed of thought.
+ ⚡ Sub-50ms inference latency via Groq LPU
+ Real-time P2P collaboration without central servers
+ 🎯 800+ tokens/second streaming
+ End-to-end encrypted peer connections
+ Decentralized code sharing network

```mermaid
flowchart TB
    subgraph "Public Network"
        direction TB
        STUN[("STUN Server<br/>NAT Traversal")]
        TURN[("TURN Relay<br/>Fallback Route")]
    end

    subgraph "Peer A - Creator"
        direction TB
        UA[("Browser Client")]
        WA["WebRTC Agent"]
        LA["Local State<br/>IndexedDB"]
    end

    subgraph "Peer B - Collaborator"
        direction TB
        UB[("Browser Client")]
        WB["WebRTC Agent"]
        LB["Local State<br/>IndexedDB"]
    end

    subgraph "Signaling Layer"
        direction LR
        SIG["WebSocket<br/>Signaling Server"]
    end

    UA --> WA
    UB --> WB
    WA <-->|"SDP Offer/Answer"| SIG
    WB <-->|"SDP Offer/Answer"| SIG
    WA <-->|"ICE Candidates"| STUN
    WB <-->|"ICE Candidates"| STUN
    WA <-.->|"Relay (if needed)"| TURN
    WB <-.->|"Relay (if needed)"| TURN
    WA <===>|"Encrypted DataChannel<br/>Code + Cursor + State"| WB
    WA --> LA
    WB --> LB

    style UA fill:#ff6b00,stroke:#fff,color:#fff
    style UB fill:#00d4aa,stroke:#fff,color:#fff
    style SIG fill:#8b5cf6,stroke:#fff,color:#fff
    style STUN fill:#3b82f6,stroke:#fff,color:#fff
    style TURN fill:#ec4899,stroke:#fff,color:#fff
```
```text
              PEER-TO-PEER HANDSHAKE SEQUENCE

  PEER A                 SIGNALING                  PEER B
    │                        │                        │
    ├───── Create Offer ────>│                        │
    │                        ├──── Forward Offer ────>│
    │                        │                        │
    │                        │<──── Create Answer ────┤
    │<─── Forward Answer ────┤                        │
    │                        │                        │
    │<─────────── ICE Candidates Exchange ───────────>│
    │                        │                        │
    │<──────────────── DTLS Handshake ───────────────>│
    │                        │                        │
    │<──────────── ENCRYPTED DATA CHANNEL ───────────>│
    │                                                 │
  ┌─┴─┐                                             ┌─┴─┐
  │ A │<─────────── LIVE COLLABORATION ───────────>│ B │
  └───┘                                             └───┘
```
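The offer/answer flow above can be sketched as a small, pure signaling reducer. All names here (`SignalMessage`, `PeerState`, `applySignal`) are illustrative, not from the Groq Coder codebase; a real client feeds these messages into an `RTCPeerConnection` rather than a plain object:

```typescript
// Sketch of the signaling exchange as a pure state reducer.
type SignalMessage =
  | { kind: "offer"; sdp: string }
  | { kind: "answer"; sdp: string }
  | { kind: "candidate"; candidate: string };

interface PeerState {
  remoteSdp: string | null; // SDP received from the remote peer
  candidates: string[];     // ICE candidates seen so far
  phase: "idle" | "negotiating" | "connected";
}

function applySignal(state: PeerState, msg: SignalMessage): PeerState {
  switch (msg.kind) {
    case "offer":
    case "answer":
      // Remote SDP received: negotiation is under way.
      return { ...state, remoteSdp: msg.sdp, phase: "negotiating" };
    case "candidate": {
      const candidates = [...state.candidates, msg.candidate];
      // With remote SDP plus at least one candidate, the pair is
      // connectable (DTLS would run next in real WebRTC).
      const phase: PeerState["phase"] =
        state.remoteSdp !== null ? "connected" : state.phase;
      return { ...state, candidates, phase };
    }
  }
  return state; // unreachable; keeps noImplicitReturns happy
}
```

Modeling signaling as a reducer makes the handshake easy to unit-test without opening a single socket.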
| Channel | Purpose | Priority |
|---|---|---|
| `code-sync` | Real-time code delta sync | 🔴 Critical |
| `cursor-pos` | Cursor position broadcast | 🟡 High |
| `ai-stream` | AI response streaming | 🔴 Critical |
| `presence` | User presence/status | 🟢 Normal |
| `files` | Large file transfer | 🔵 Low |
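One way to realize the priority column is through per-channel reliability settings passed at `createDataChannel` time. The mapping below is a sketch: `channelOptions` and the specific retransmit numbers are assumptions, not taken from the source:

```typescript
// Illustrative mapping from channel label to RTCDataChannelInit-style
// options. The exact settings here are assumed, not from the repo.
interface ChannelInit {
  ordered: boolean;        // deliver messages in order?
  maxRetransmits?: number; // cap retries for loss-tolerant channels
}

function channelOptions(label: string): ChannelInit {
  switch (label) {
    case "code-sync":
    case "ai-stream":
      // Critical: ordered and fully reliable (no retransmit cap).
      return { ordered: true };
    case "cursor-pos":
    case "presence":
      // Positions and presence go stale fast: drop after a few retries
      // rather than block fresher updates behind a lost packet.
      return { ordered: false, maxRetransmits: 2 };
    case "files":
      // Bulk transfer: ordered so chunks reassemble correctly.
      return { ordered: true };
    default:
      return { ordered: true };
  }
}
```

Unordered, retransmit-capped channels let a stale cursor update be dropped instead of head-of-line blocking the next one.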
```mermaid
graph TB
    subgraph "Client Layer"
        direction TB
        UI["React UI<br/>Next.js 15"]
        Editor["Monaco Editor<br/>Code Workspace"]
        Preview["Live Preview<br/>Sandboxed iFrame"]
        P2P["P2P Module<br/>WebRTC"]
    end

    subgraph "Edge Layer"
        direction TB
        CDN["Vercel Edge<br/>Global CDN"]
        MW["Middleware<br/>Rate Limiting"]
        Auth["NextAuth.js<br/>OAuth/JWT"]
    end

    subgraph "Inference Engine"
        direction TB
        Groq["Groq LPU<br/>Primary Engine"]
        Cerebras["Cerebras<br/>Fallback"]
        DeepSeek["DeepSeek R1<br/>Reasoning Model"]
    end

    subgraph "Data Layer"
        direction TB
        Mongo["MongoDB Atlas<br/>Project Storage"]
        Redis["Upstash Redis<br/>Session Cache"]
        Vector["Vector DB<br/>Embeddings"]
    end

    subgraph "Real-time Layer"
        direction TB
        WS["WebSocket<br/>Live Updates"]
        SSE["SSE<br/>AI Streaming"]
        Signal["Signaling<br/>P2P Coordination"]
    end

    UI --> Editor
    UI --> Preview
    UI --> P2P
    Editor -->|"HTTP/2"| CDN
    CDN --> MW
    MW --> Auth
    Auth --> Groq
    Auth --> Cerebras
    Groq --> DeepSeek
    Auth --> Mongo
    Auth --> Redis
    Mongo --> Vector
    Editor --> WS
    Groq --> SSE
    P2P --> Signal

    style Groq fill:#ff6b00,stroke:#fff,color:#fff
    style P2P fill:#00d4aa,stroke:#fff,color:#fff
    style UI fill:#8b5cf6,stroke:#fff,color:#fff
```
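The Groq-primary / Cerebras-fallback edges in the diagram amount to a try-in-order router. A minimal sketch, where `Engine` and `withFallback` are hypothetical names rather than real exports of this project:

```typescript
// Sketch of primary/fallback inference routing: try each engine in
// order, return the first success, surface the last failure.
type Engine = (prompt: string) => Promise<string>;

async function withFallback(
  engines: Engine[],
  prompt: string,
): Promise<string> {
  let lastError: unknown;
  for (const engine of engines) {
    try {
      return await engine(prompt); // first engine to succeed wins
    } catch (err) {
      lastError = err;             // remember failure, try the next
    }
  }
  throw lastError ?? new Error("no engines configured");
}
```

Keeping the router engine-agnostic means adding a third provider is a one-line array change, not new control flow.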
```text
┌───────────────────────────────────────────────────────────┐
│                REQUEST PROCESSING PIPELINE                │
└───────────────────────────────────────────────────────────┘

┌───────────────┐     ┌───────────────┐     ┌───────────────┐
│  USER INPUT   │────>│ EDGE RUNTIME  │────>│  AUTH GUARD   │
│ "Build form"  │     │ validate req  │     │ check session │
└───────────────┘     └───────────────┘     └───────┬───────┘
                                                    │
                                                    ▼
┌───────────────┐     ┌───────────────┐     ┌───────────────┐
│ RATE LIMITER  │<────│ CONTEXT BUILD │<────│ PROMPT ENGINE │
│ Upstash Redis │     │  inject meta  │     │ system prompt │
└───────┬───────┘     └───────────────┘     └───────────────┘
        │
        ▼
┌───────────────┐     ┌───────────────┐     ┌───────────────┐
│   GROQ LPU    │────>│    STREAM     │────>│   SSE PUSH    │
│  800 tok/sec  │     │  transformer  │     │   to client   │
└───────────────┘     └───────────────┘     └───────┬───────┘
                                                    │
                                                    ▼
┌───────────────┐     ┌───────────────┐     ┌───────────────┐
│  ASYNC WRITE  │<────│  P2P FANOUT   │<────│    RENDER     │
│    MongoDB    │     │ sync to peers │     │ live preview  │
└───────────────┘     └───────────────┘     └───────────────┘
```
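The STREAM → SSE PUSH step delivers tokens to the browser as Server-Sent Events. A minimal client-side parser might look like this sketch; `parseSseChunk` is illustrative, and production code would typically use `EventSource` or `fetch` with a `ReadableStream` instead:

```typescript
// Minimal parser for a raw SSE chunk: extract `data:` payloads.
// Simplified; ignores `event:`, `id:`, and retry fields.
function parseSseChunk(chunk: string): string[] {
  const events: string[] = [];
  // Events are separated by a blank line per the SSE format.
  for (const block of chunk.split("\n\n")) {
    for (const line of block.split("\n")) {
      if (line.startsWith("data:")) {
        events.push(line.slice(5).trimStart());
      }
    }
  }
  return events;
}
```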
| Layer | Technology | Purpose | Latency Target |
|---|---|---|---|
| Client | Next.js 15 + React 19 | Server Components, Streaming | < 100ms FCP |
| Auth | NextAuth.js + JWT | OAuth, Session Management | < 50ms |
| Inference | Groq LPU | Primary AI Engine | < 50ms TTFB |
| Fallback | Cerebras WSE | Secondary Inference | < 150ms TTFB |
| Cache | Upstash Redis | Rate Limiting, Sessions | < 10ms |
| Database | MongoDB Atlas | Project Persistence | < 100ms |
| P2P | WebRTC + DataChannels | Real-time Collaboration | < 30ms RTT |
"Intelligence shouldn't be gated behind paywalls"
In an era of $20/month AI subscriptions, Groq Coder stands for accessibility:
| Belief | Our Commitment |
|---|---|
| Access is a Right | Every developer deserves SOTA tooling |
| Community > Corporation | Features come from users, not roadmaps |
| Transparency is Trust | See every prompt, every decision |
This project demonstrates:
- 🏗️ Full-Stack Mastery: MongoDB → GraphQL → React → Edge Runtime
- ⚡ Performance Obsession: Sub-50ms latency is a feature, not a goal
- 🤖 AI Integration: LLM context windows, streaming, graceful degradation
- 🔗 P2P Expertise: WebRTC, STUN/TURN, encrypted DataChannels
- 🎨 Product Sense: Onboarding, galleries, social features
Built by a single determined engineer to prove that high-performance AI apps are achievable.
Prerequisites:

```
Node.js >= 18.0.0
MongoDB Atlas Account
Groq API Key
```

Clone and install:

```bash
git clone https://github.com/ixchio/GroqCoder.git
cd GroqCoder
npm install
```

Copy the environment template and fill it in:

```bash
cp .env.example .env.local
```

```env
# Required
MONGODB_URI=mongodb+srv://...
GROQ_API_KEY=gsk_...

# OAuth (optional)
GITHUB_CLIENT_ID=...
GITHUB_CLIENT_SECRET=...

# P2P Signaling
NEXT_PUBLIC_SIGNALING_URL=wss://...
```

Start the dev server:

```bash
npm run dev
```

Visit http://localhost:3000 and start building!
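Missing variables otherwise surface as runtime failures, so a small startup check can save a debugging round-trip. This helper is hypothetical, not part of the repo:

```typescript
// Hypothetical startup check for the required variables listed above.
const REQUIRED = ["MONGODB_URI", "GROQ_API_KEY"] as const;

// Return the names of any required variables that are unset or empty.
function missingEnv(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((key) => !env[key]);
}
```

Calling `missingEnv(process.env)` at boot and failing fast with the list of missing names beats a cryptic Mongo connection error later.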
```text
┌─────────────────────────────────────────────────────────────────┐
│                        BENCHMARK RESULTS                        │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  Inference Latency (TTFB)   ████████████████████   42ms         │
│  Token Generation Rate      ████████████████████   823/s        │
│  P2P Connection Setup       ████████████████████   285ms        │
│  DataChannel RTT            ████████████████████   24ms         │
│  State Sync Latency         ████████████████████   12ms         │
│  Cold Start (Vercel Edge)   ████████████████████   180ms        │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```
We welcome contributions! Check out our Contributing Guide for details.
```bash
git checkout -b feature/amazing-feature
git commit -m 'Add amazing feature'
git push origin feature/amazing-feature
```