Groq Coder Llama

⚡ GROQ CODER ⚡

🔥 Your Imagination. Compiled. Instantly. 🔥

Powered By Groq • Next.js 15 • TypeScript • P2P Enabled • MIT License

🌐 Live Demo • 🚀 Quick Start • 🔗 P2P Flow • 🏗️ System Design


🎭 Meet The Squad

| Agent | Title | Role |
| --- | --- | --- |
| ⚡ Yoruichi | Speed Daemon | P2P Coordinator |
| 🗡️ Retsu | Code Samurai | Inference Engine |
| 🌊 Nelliel | Data Guardian | State Manager |
| 🐝 Soifon | Security Ninja | Auth Handler |
| 🌸 Yachiru | UX Specialist | Stream Router |

⚔️ AI Agents Squad

| Agent | Title | Role |
| --- | --- | --- |
| ❄️ Rukia | Code Freezer | Debugger Agent |
| 🔬 Nemu | Lab Architect | Research Agent |
| ✨ Orihime | Bug Healer | Refactor Agent |
| 🎯 Makima | Flow Controller | Orchestrator |

🚀 What is Groq Coder?

"The fastest AI-powered code generation platform, now with peer-to-peer collaboration"

Groq Coder is a production-grade code generation engine that leverages the Groq LPU™ (Language Processing Unit) for blazing-fast inference. Unlike GPU-based solutions that batch requests, it streams tokens to the client as they are generated, so developers can iterate at the speed of thought.

✨ Key Highlights

+ ⚡ Sub-50ms inference latency via Groq LPU
+ 🔗 Real-time P2P collaboration without central servers
+ 🎯 800+ tokens/second streaming
+ 🔐 End-to-end encrypted peer connections
+ 🌐 Decentralized code sharing network
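
Under the hood, the streaming highlight is mostly a matter of consuming Groq's streaming chat-completions API. A minimal server-side sketch using the `groq-sdk` package; the model name and prompts are illustrative, not this project's actual configuration:

```typescript
// Minimal sketch: stream a completion from Groq token by token.
// Model name and prompts are illustrative, not the app's actual configuration.
import Groq from "groq-sdk";

const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

export async function streamCompletion(prompt: string): Promise<void> {
  const stream = await groq.chat.completions.create({
    model: "llama-3.3-70b-versatile", // assumed model; use whichever Groq model fits
    messages: [
      { role: "system", content: "You are a senior TypeScript engineer." },
      { role: "user", content: prompt },
    ],
    stream: true, // tokens arrive as they are generated instead of in one response
  });

  for await (const chunk of stream) {
    const token = chunk.choices[0]?.delta?.content ?? "";
    process.stdout.write(token); // in the app this would be forwarded to the browser
  }
}
```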

🔗 Peer-to-Peer Architecture

The Decentralized Code Network

```mermaid
flowchart TB
    subgraph "🌐 Public Network"
        direction TB
        STUN[("🔄 STUN Server<br/>NAT Traversal")]
        TURN[("🔀 TURN Relay<br/>Fallback Route")]
    end

    subgraph "👤 Peer A - Creator"
        direction TB
        UA[("🖥️ Browser Client")]
        WA["📡 WebRTC Agent"]
        LA["💾 Local State<br/>IndexedDB"]
    end

    subgraph "👥 Peer B - Collaborator"
        direction TB
        UB[("🖥️ Browser Client")]
        WB["📡 WebRTC Agent"]
        LB["💾 Local State<br/>IndexedDB"]
    end

    subgraph "⚡ Signaling Layer"
        direction LR
        SIG["🔔 WebSocket<br/>Signaling Server"]
    end

    UA --> WA
    UB --> WB

    WA <-->|"📋 SDP Offer/Answer"| SIG
    WB <-->|"📋 SDP Offer/Answer"| SIG

    WA <-->|"🧊 ICE Candidates"| STUN
    WB <-->|"🧊 ICE Candidates"| STUN

    WA <-.->|"🔀 Relay (if needed)"| TURN
    WB <-.->|"🔀 Relay (if needed)"| TURN

    WA <===>|"🔐 Encrypted DataChannel<br/>Code + Cursor + State"| WB

    WA --> LA
    WB --> LB

    style UA fill:#ff6b00,stroke:#fff,color:#fff
    style UB fill:#00d4aa,stroke:#fff,color:#fff
    style SIG fill:#8b5cf6,stroke:#fff,color:#fff
    style STUN fill:#3b82f6,stroke:#fff,color:#fff
    style TURN fill:#ec4899,stroke:#fff,color:#fff
```
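
The signaling layer in this topology only relays SDP offers/answers and ICE candidates between peers in the same room; it never sees code. A minimal relay sketch using the `ws` package, where the room-scoped message shape is an assumption rather than the project's actual protocol:

```typescript
// Minimal WebSocket signaling relay: forwards SDP offers/answers and ICE candidates
// between peers in the same room. The { room, ... } message shape is an assumption.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const rooms = new Map<string, Set<WebSocket>>();

wss.on("connection", (socket) => {
  socket.on("message", (raw) => {
    const msg = JSON.parse(raw.toString()) as { room: string };

    // Track which sockets belong to which room.
    const peers = rooms.get(msg.room) ?? new Set<WebSocket>();
    peers.add(socket);
    rooms.set(msg.room, peers);

    // Relay the payload to every other peer in the room; the server never inspects code.
    for (const peer of peers) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(JSON.stringify(msg));
      }
    }
  });
});
```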

🔄 P2P Connection Flow

```text
                 PEER-TO-PEER HANDSHAKE SEQUENCE

  PEER A                      SIGNALING                      PEER B
    |                             |                             |
    |------- Create Offer ------->|                             |
    |                             |------- Forward Offer ------>|
    |                             |                             |
    |                             |<------ Create Answer -------|
    |<----- Forward Answer -------|                             |
    |                             |                             |
    |<================ ICE Candidates Exchange ================>|
    |                             |                             |
    |<==================== DTLS Handshake =====================>|
    |                             |                             |
    |<================ ENCRYPTED DATA CHANNEL =================>|
    |                             |                             |
    |<================== LIVE COLLABORATION ===================>|
```
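
In browser terms, the handshake above maps onto the standard RTCPeerConnection API. The sketch below shows only the offering side; `Signal` is a placeholder for whatever signaling transport the app actually uses, and the STUN server URL is an illustrative public one:

```typescript
// Offering peer: open a data channel, create an SDP offer, and trickle ICE candidates.
// `Signal` is a placeholder for the app's real signaling transport.
interface Signal {
  send(msg: unknown): void;
  onAnswer(cb: (answer: RTCSessionDescriptionInit) => void): void;
}

export async function connectAsOfferer(signal: Signal): Promise<RTCDataChannel> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // public STUN server for NAT traversal
  });

  // Creating the channel before the offer ensures it is negotiated in the SDP.
  const channel = pc.createDataChannel("code-sync", { ordered: true });

  // Trickle ICE: forward each candidate to the remote peer as it is discovered.
  pc.onicecandidate = (e) => {
    if (e.candidate) signal.send({ kind: "ice", candidate: e.candidate.toJSON() });
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signal.send({ kind: "offer", sdp: pc.localDescription });

  // Complete the handshake when the answer arrives; DTLS then encrypts the channel.
  signal.onAnswer((answer) => void pc.setRemoteDescription(answer));

  return channel;
}
```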

📡 Data Channel Protocol

| Channel | Purpose | Priority |
| --- | --- | --- |
| `code-sync` | Real-time code delta sync | 🔴 Critical |
| `cursor-pos` | Cursor position broadcast | 🟡 High |
| `ai-stream` | AI response streaming | 🔴 Critical |
| `presence` | User presence/status | 🟢 Normal |
| `files` | Large file transfer | 🔵 Low |
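
One way to honor those priorities is through each channel's WebRTC reliability settings: ordered, reliable delivery for the critical channels and unordered, best-effort delivery for high-frequency, low-stakes updates. A sketch under that assumption; the exact options used in the app may differ:

```typescript
// Sketch: map the table's priorities onto WebRTC reliability settings.
// Critical channels are ordered and reliable; frequent low-stakes updates tolerate loss.
export function createChannels(pc: RTCPeerConnection) {
  return {
    // Code deltas and AI tokens must arrive complete and in order.
    codeSync: pc.createDataChannel("code-sync", { ordered: true }),
    aiStream: pc.createDataChannel("ai-stream", { ordered: true }),

    // Cursor positions are superseded many times per second: drop stale packets
    // instead of retransmitting them.
    cursorPos: pc.createDataChannel("cursor-pos", { ordered: false, maxRetransmits: 0 }),

    // Presence pings are best-effort.
    presence: pc.createDataChannel("presence", { ordered: false, maxRetransmits: 0 }),

    // Large file transfer: reliable, chunked and reassembled at the application layer.
    files: pc.createDataChannel("files", { ordered: true }),
  };
}
```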

πŸ—οΈ System Design

Complete System Architecture

graph TB
    subgraph "🎨 Client Layer"
        direction TB
        UI["πŸ–ΌοΈ React UI<br/>Next.js 15"]
        Editor["πŸ“ Monaco Editor<br/>Code Workspace"]
        Preview["πŸ‘οΈ Live Preview<br/>Sandboxed iFrame"]
        P2P["πŸ”— P2P Module<br/>WebRTC"]
    end

    subgraph "🌐 Edge Layer"
        direction TB
        CDN["☁️ Vercel Edge<br/>Global CDN"]
        MW["πŸ›‘οΈ Middleware<br/>Rate Limiting"]
        Auth["πŸ” NextAuth.js<br/>OAuth/JWT"]
    end

    subgraph "⚑ Inference Engine"
        direction TB
        Groq["🧠 Groq LPU<br/>Primary Engine"]
        Cerebras["πŸ”„ Cerebras<br/>Fallback"]
        DeepSeek["πŸ’­ DeepSeek R1<br/>Reasoning Model"]
    end

    subgraph "πŸ’Ύ Data Layer"
        direction TB
        Mongo["πŸ—„οΈ MongoDB Atlas<br/>Project Storage"]
        Redis["⚑ Upstash Redis<br/>Session Cache"]
        Vector["πŸ” Vector DB<br/>Embeddings"]
    end

    subgraph "πŸ“‘ Real-time Layer"
        direction TB
        WS["πŸ”Œ WebSocket<br/>Live Updates"]
        SSE["πŸ“Š SSE<br/>AI Streaming"]
        Signal["πŸ”” Signaling<br/>P2P Coordination"]
    end

    UI --> Editor
    UI --> Preview
    UI --> P2P
    
    Editor -->|"HTTP/2"| CDN
    CDN --> MW
    MW --> Auth
    
    Auth --> Groq
    Auth --> Cerebras
    Groq --> DeepSeek
    
    Auth --> Mongo
    Auth --> Redis
    Mongo --> Vector
    
    Editor --> WS
    Groq --> SSE
    P2P --> Signal

    style Groq fill:#ff6b00,stroke:#fff,color:#fff
    style P2P fill:#00d4aa,stroke:#fff,color:#fff
    style UI fill:#8b5cf6,stroke:#fff,color:#fff
Loading
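
The Groq-to-SSE leg of that diagram can be served from a streaming route handler. A minimal Next.js App Router sketch; the route path, model name, and event framing are assumptions, not the repository's actual handler:

```typescript
// app/api/generate/route.ts: sketch of streaming AI output to the browser as SSE.
// Route path, model name, and event framing are illustrative assumptions.
import Groq from "groq-sdk";

const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const completion = await groq.chat.completions.create({
    model: "llama-3.3-70b-versatile",
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  const encoder = new TextEncoder();
  const body = new ReadableStream({
    async start(controller) {
      // Frame each token as an SSE "data:" event so the client can parse it incrementally.
      for await (const chunk of completion) {
        const token = chunk.choices[0]?.delta?.content ?? "";
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(token)}\n\n`));
      }
      controller.enqueue(encoder.encode("data: [DONE]\n\n"));
      controller.close();
    },
  });

  return new Response(body, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  });
}
```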

🧬 Request Lifecycle

```text
                   REQUEST PROCESSING PIPELINE

  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║   USER INPUT  ║────▶║  EDGE RUNTIME ║────▶║  AUTH GUARD   ║
  ║  "Build form" ║     ║  validate req ║     ║ check session ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
                                                      │
                                                      ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║ RATE LIMITER  ║◀────║ CONTEXT BUILD ║◀────║ PROMPT ENGINE ║
  ║ Upstash Redis ║     ║  inject meta  ║     ║ system prompt ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
          │
          ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║   GROQ LPU    ║────▶║    STREAM     ║────▶║   SSE PUSH    ║
  ║  800 tok/sec  ║     ║  transformer  ║     ║   to client   ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
                                                      │
                                                      ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║  ASYNC WRITE  ║◀────║  P2P FANOUT   ║◀────║    RENDER     ║
  ║    MongoDB    ║     ║ sync to peers ║     ║  live preview ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
```

🎯 Component Matrix

| Layer | Technology | Purpose | Latency Target |
| --- | --- | --- | --- |
| Client | Next.js 15 + React 19 | Server Components, Streaming | < 100ms FCP |
| Auth | NextAuth.js + JWT | OAuth, Session Management | < 50ms |
| Inference | Groq LPU | Primary AI Engine | < 50ms TTFB |
| Fallback | Cerebras WSE | Secondary Inference | < 150ms TTFB |
| Cache | Upstash Redis | Rate Limiting, Sessions | < 10ms |
| Database | MongoDB Atlas | Project Persistence | < 100ms |
| P2P | WebRTC + DataChannels | Real-time Collaboration | < 30ms RTT |
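
For the Inference and Fallback rows, graceful degradation can be as simple as walking an ordered list of OpenAI-compatible clients. This is a sketch, not the repository's actual failover logic; the base URLs and model names are assumptions:

```typescript
// Sketch: fall through from the primary engine to the fallback when a request fails.
// Both providers expose OpenAI-compatible APIs; base URLs and model names are assumptions.
import OpenAI from "openai";

const engines = [
  {
    name: "groq",
    client: new OpenAI({
      apiKey: process.env.GROQ_API_KEY,
      baseURL: "https://api.groq.com/openai/v1",
    }),
    model: "llama-3.3-70b-versatile",
  },
  {
    name: "cerebras",
    client: new OpenAI({
      apiKey: process.env.CEREBRAS_API_KEY,
      baseURL: "https://api.cerebras.ai/v1",
    }),
    model: "llama3.1-70b",
  },
];

export async function generate(prompt: string): Promise<string> {
  for (const { client, model } of engines) {
    try {
      const res = await client.chat.completions.create({
        model,
        messages: [{ role: "user", content: prompt }],
      });
      return res.choices[0]?.message?.content ?? "";
    } catch {
      // Timeout, 5xx, or rate limit on this engine: try the next one.
      continue;
    }
  }
  throw new Error("All inference engines failed");
}
```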

🔓 Why Open Source?

"Intelligence shouldn't be gated behind paywalls"

In an era of $20/month AI subscriptions, Groq Coder stands for accessibility:

| Belief | Our Commitment |
| --- | --- |
| 🌍 Access is a Right | Every developer deserves SOTA tooling |
| 👥 Community > Corporation | Features come from users, not roadmaps |
| 🔍 Transparency is Trust | See every prompt, every decision |

💼 For Recruiters

This project demonstrates:

  • 🏗️ Full-Stack Mastery: MongoDB → GraphQL → React → Edge Runtime
  • ⚡ Performance Obsession: sub-50ms latency is a shipped feature, not an aspiration
  • 🤖 AI Integration: LLM context windows, streaming, graceful degradation
  • 🔗 P2P Expertise: WebRTC, STUN/TURN, encrypted DataChannels
  • 🎨 Product Sense: Onboarding, galleries, social features

Built by a single determined engineer to prove that high-performance AI apps are achievable by a solo developer.


πŸ› οΈ Quick Start

Prerequisites

Node.js >= 18.0.0
MongoDB Atlas Account
Groq API Key

1. Clone & Install

git clone https://github.com/ixchio/GroqCoder.git
cd GroqCoder
npm install

2. Configure Environment

cp .env.example .env.local
# Required
MONGODB_URI=mongodb+srv://...
GROQ_API_KEY=gsk_...

# OAuth (optional)
GITHUB_CLIENT_ID=...
GITHUB_CLIENT_SECRET=...

# P2P Signaling
NEXT_PUBLIC_SIGNALING_URL=wss://...

3. Launch πŸš€

npm run dev

Visit http://localhost:3000 and start building!


📊 Performance Metrics

```text
                       BENCHMARK RESULTS

  Inference Latency (TTFB)      ████████░░░░░░░░░░░░   42 ms
  Token Generation Rate         ████████████████████  823 tok/s
  P2P Connection Setup          ██████░░░░░░░░░░░░░░  285 ms
  DataChannel RTT               █████░░░░░░░░░░░░░░░   24 ms
  State Sync Latency            ███░░░░░░░░░░░░░░░░░   12 ms
  Cold Start (Vercel Edge)      ████████░░░░░░░░░░░░  180 ms
```

🤝 Contributing

We welcome contributions! Check out our Contributing Guide for details.

```bash
git checkout -b feature/amazing-feature
git commit -m 'Add amazing feature'
git push origin feature/amazing-feature
```

🌟 Star this repo if you found it useful!


Built with ❤️ and ⚡ by a 10x Engineer

"The future of coding is decentralized, fast, and free."
