# Loom

A web-based fork of the original Loom: a tree-based writing interface powered by AI.

## Features
- Tree-Based Writing: Explore multiple narrative branches visually
- AI-Powered Generation: Generate continuations using multiple AI providers
- Multi-Provider Support: OpenAI, Anthropic, Ollama, and custom endpoints
- Real-Time Streaming: See AI responses as they're generated
- Interactive Tree Visualization: Powered by React Flow with horizontal layout
- Flexible Configuration:
  - Chat API (default) for modern models
  - Completions API for Ollama and legacy models
  - Configurable system prompts per model
  - Temperature-only generation for maximum compatibility
- Node Management:
  - Manual node creation
  - Reconnect nodes to change narrative flow
  - Bookmark important nodes
  - Editable node titles
- Persistence: Auto-save with local storage
- Modern UI: Skeuomorphic design with pastel yellow theme
## Prerequisites

- Node.js 18+ and npm
- API keys for your chosen provider (OpenAI, Anthropic, or Ollama)
## Installation

```bash
# Clone the repository
git clone https://github.com/amphetamarina/loom.git
cd loom

# Install dependencies
npm install

# Set up environment variables (optional)
cp .env.example .env
# Edit .env with your API keys

# Start development server
npm run dev
```

Visit http://localhost:5173 to start using Loom.
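Vite only exposes environment variables prefixed with `VITE_` to client code, read via `import.meta.env` (the key names appear under Development below). A minimal sketch of how a key can be resolved; the fallback helper is illustrative, not Loom's actual code:

```ts
// Vite statically replaces import.meta.env.VITE_* at build time.
const envKey: string | undefined = import.meta.env.VITE_OPENAI_API_KEY;

// Hypothetical helper mirroring "API Key: uses environment variable if not set".
function resolveApiKey(settingsKey?: string): string | undefined {
  return settingsKey || envKey;
}
```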
To build for production:

```bash
npm run build
npm run preview
```

## Configuration

### Adding a Model

- Open Settings (gear icon)
- Click "Adicionar Novo Modelo" ("Add New Model")
- Configure (a sketch of the resulting config shape follows this list):
  - Model Name: e.g. `gpt-4o`, `claude-3-5-sonnet`, `llama3`
  - Provider: OpenAI, Anthropic, Ollama, or Custom
  - API Type: `Chat / Messages API` for modern chat models (default), or `Completions API` for Ollama and legacy models
  - Base URL: (optional) Custom endpoint
  - API Key: (optional) Uses the environment variable if not set
  - System Prompt: (optional) Custom system instructions
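Put together, a saved model entry covers the fields above. A minimal sketch of the shape such a configuration could take; the field names are illustrative, not the actual types in settingsStore.ts:

```ts
// Hypothetical shape for a configured model; mirrors the Settings fields above.
type Provider = 'openai' | 'anthropic' | 'ollama' | 'custom';
type ApiType = 'chat' | 'completions';

interface ModelConfig {
  name: string;          // e.g. "gpt-4o", "claude-3-5-sonnet", "llama3"
  provider: Provider;
  apiType: ApiType;      // "chat" (default) or "completions"
  baseUrl?: string;      // optional custom endpoint
  apiKey?: string;       // falls back to the environment variable if unset
  systemPrompt?: string; // optional custom system instructions
}

const example: ModelConfig = {
  name: 'llama3',
  provider: 'ollama',
  apiType: 'completions',
  baseUrl: 'http://localhost:11434/v1',
};
```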
### Ollama Setup

For local inference with Ollama:

- Install and start Ollama
- Pull a model: `ollama pull llama3`
- In Loom Settings:
  - Provider: `Ollama`
  - API Type: `Completions API`
  - Model Name: `llama3`
  - Base URL: `http://localhost:11434/v1` (default)
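Ollama's OpenAI-compatible API also serves `GET /v1/models`, which makes a quick sanity check easy. A minimal sketch, assuming the default base URL above:

```ts
// Quick check that Ollama's OpenAI-compatible endpoint is reachable
// and that the model you configured has actually been pulled.
async function checkOllama(baseUrl = 'http://localhost:11434/v1'): Promise<void> {
  const res = await fetch(`${baseUrl}/models`);
  if (!res.ok) throw new Error(`Ollama not reachable: ${res.status}`);
  const { data } = await res.json(); // OpenAI-style { data: [{ id, ... }] }
  console.log('Available models:', data.map((m: { id: string }) => m.id));
}

checkOllama().catch(console.error);
```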
## Usage

- Start Writing: Type your initial prompt in the root node
- Generate: Click the lightning bolt icon to generate AI continuations
- Navigate: Click on nodes to explore different branches
- Edit: Double-click nodes to edit text
- Add Nodes: Use the Plus icon to manually add child nodes
- Reconnect: Use the Link icon to change a node's parent
- Save: Trees are auto-saved to browser storage
### Shortcuts

- Double-click tab title: Rename tree
- Double-click node: Edit text
- Enter: Save edits
- Escape: Cancel edits
## Tech Stack

- Framework: React 18 + TypeScript
- Build Tool: Vite 5
- State Management: Zustand with persistence
- AI SDK: Vercel AI SDK
- Visualization: React Flow 11
- Styling: Tailwind CSS 3
- Icons: Lucide React
## Project Structure

```
src/
├── components/          # React components
│   ├── TreeView.tsx     # Main tree visualization
│   ├── EditableNode.tsx # Interactive node component
│   ├── SettingsDialog.tsx
│   └── ...
├── services/            # Business logic
│   └── aiService.ts     # Multi-provider AI integration
├── stores/              # Zustand state management
│   ├── treeStore.ts     # Tree data and operations
│   └── settingsStore.ts # App settings and models
├── types/               # TypeScript definitions
└── hooks/               # Custom React hooks
```
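The tree lives in a Zustand store persisted to localStorage. A minimal sketch of that pattern; the node fields and storage key are illustrative, not the actual contents of treeStore.ts:

```ts
import { create } from 'zustand';
import { persist } from 'zustand/middleware';

// Hypothetical node shape: the tree flattened into a map keyed by id.
interface LoomNode {
  id: string;
  parentId: string | null;
  text: string;
  bookmarked: boolean;
}

interface TreeState {
  nodes: Record<string, LoomNode>;
  addNode: (node: LoomNode) => void;
  reconnect: (id: string, newParentId: string) => void;
}

export const useTreeStore = create<TreeState>()(
  persist(
    (set) => ({
      nodes: {},
      addNode: (node) =>
        set((s) => ({ nodes: { ...s.nodes, [node.id]: node } })),
      // Changing a node's parent rewires the narrative flow.
      reconnect: (id, newParentId) =>
        set((s) => ({
          nodes: { ...s.nodes, [id]: { ...s.nodes[id], parentId: newParentId } },
        })),
    }),
    { name: 'loom-tree' } // hypothetical localStorage key
  )
);
```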
## AI Integration

### Chat API

Uses the Vercel AI SDK's unified `streamText()` interface:

- OpenAI: `gpt-4o`, `gpt-4o-mini`, etc.
- Anthropic: `claude-3-5-sonnet`, `claude-3-opus`, etc.
- Custom: Any OpenAI-compatible endpoint
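Streaming a continuation through that interface might look like the following sketch (illustrative wiring, assuming the `ai` and `@ai-sdk/openai` packages; not the actual aiService.ts code):

```ts
import { streamText } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

// Illustrative: stream a continuation of the current node's text.
const openai = createOpenAI({ apiKey: import.meta.env.VITE_OPENAI_API_KEY });

const result = await streamText({
  model: openai('gpt-4o'),
  system: 'Continue the story in the same voice.', // per-model system prompt (illustrative)
  prompt: 'Once upon a time',                      // the node text being continued
  temperature: 1.0,                                // temperature-only generation
});

let generated = '';
for await (const delta of result.textStream) {
  generated += delta; // in Loom, this streams into the new child node in real time
}
```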
### Completions API

Direct HTTP streaming for legacy compatibility:

- Ollama: All models
- Custom: Any `/v1/completions`-compatible endpoint
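The legacy path can be as simple as a `fetch` against `/v1/completions` with `stream: true`, parsing the server-sent `data:` lines. A minimal sketch under those assumptions, not the exact aiService.ts code:

```ts
// Minimal OpenAI-style streaming completions client; works against
// Ollama's /v1/completions and other compatible endpoints.
async function streamCompletion(
  baseUrl: string,
  model: string,
  prompt: string,
  onToken: (t: string) => void,
): Promise<void> {
  const res = await fetch(`${baseUrl}/completions`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, prompt, stream: true, temperature: 1.0 }),
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
  let buffer = '';
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? ''; // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith('data:')) continue;
      const payload = line.slice(5).trim();
      if (!payload || payload === '[DONE]') continue;
      const token = JSON.parse(payload).choices?.[0]?.text;
      if (token) onToken(token);
    }
  }
}
```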
## Development

```bash
npm run dev     # Start development server
npm run build   # Build for production
npm run preview # Preview production build
npm run lint    # Lint code
```

Create a `.env` file:
```env
VITE_OPENAI_API_KEY=sk-...
VITE_ANTHROPIC_API_KEY=sk-ant-...
```

## Contributing

This is a personal fork focused on web-based accessibility. For the original Python/Tkinter version, see socketteer/loom.
## License

This project inherits its license from the original Loom project.
## Credits

- Original Loom: socketteer/loom
- Web fork: @amphetamarina
## Troubleshooting

Ollama not connecting:

- Ensure Ollama is running: `ollama serve`
- Check that the Base URL matches your Ollama endpoint
- Select the `Completions API` type (not Chat API)

Choosing an API type:

- Use the Chat API for OpenAI, Anthropic, and most cloud providers
- Use the Completions API for Ollama, legacy models, and custom endpoints
Data persistence:

- Trees are saved to localStorage
- Export important work as JSON (coming soon); a manual workaround is sketched below
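Until JSON export lands, the persisted tree can be pulled straight out of localStorage from the browser console. A sketch, assuming a hypothetical storage key; check DevTools → Application → Local Storage for the real one:

```ts
// Run in the browser console on the Loom tab. The key name is hypothetical;
// look it up under DevTools → Application → Local Storage.
const raw = localStorage.getItem('loom-tree');
if (raw) {
  console.log(JSON.parse(raw)); // copy this object out as your JSON backup
}
```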
Happy writing! 🌳✨