SeerAI is an intelligent research assistant plugin for Zotero 7 that integrates AI-powered chat, advanced search, and data extraction capabilities directly into your research workflow. Chat with your papers, extract structured data, and accelerate your literature review with a local-first, privacy-focused design.
- Contextual Conversations: Chat with AI about your selected papers with full context awareness.
- Smart Context Priority: Automatically prioritizes content sources:
  - Zotero Notes (OCR notes and other notes; highest priority)
  - Indexed PDF Text (fast and efficient, but consumes many tokens and can hit context limits)
  - OCR (fallback for scanned documents with no indexed text)
- Multi-paper Support: Add multiple papers to a single conversation for comparative analysis.
- Streaming Responses: Real-time, token-by-token response rendering.
- Markdown & Math: Responses are formatted with syntax highlighting and LaTeX math support.
- Vision Support: Paste images directly into chat for multimodal analysis.
- Enhanced Keybindings:
  - Enter: Insert new line
  - Shift+Enter: Send message
- Web Search: Integrated Firecrawl support for finding full-text content.
- Semantic Scholar Agent: Advanced paper search with filtering (Year, Venue, Citation Count).
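The Smart Context Priority above is a simple fallback chain. A minimal TypeScript sketch (type and function names are illustrative, not SeerAI's internal API):

```typescript
// Context-source fallback: notes win, then indexed text, then OCR.
type ContextSource = "notes" | "indexed-pdf" | "ocr" | "none";

interface PaperContext {
  notes: string[];      // Zotero notes, including OCR notes (highest priority)
  indexedText?: string; // full text from Zotero's PDF index (fast, token-heavy)
  hasPdf: boolean;      // a PDF attachment exists, so OCR is possible
}

function pickContextSource(paper: PaperContext): ContextSource {
  if (paper.notes.length > 0) return "notes";
  if (paper.indexedText) return "indexed-pdf";
  if (paper.hasPdf) return "ocr"; // fallback for scanned documents
  return "none";
}
```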
- Smart Import:
- PDF Discovery: Automatically finds and attaches PDFs during import.
- Source Link: Fallback to source links if PDFs are unavailable.
- Status Indicators: Clear feedback on import status (⬇️ Importing, ✅ Imported, ⚠️ Failed).
- Structured Extraction: Extract specific data points from multiple papers into a comparative table.
- AI-Powered Columns: Define custom columns with AI prompts (e.g., "Methodology", "Sample Size").
- Inline Editing: Edit column titles and prompts directly in the table.
- One-Click Generation: Generate data for individual cells or entire columns instantly.
- Side Strip Actions: Unified controls for adding, removing columns, generating triggers, and settings.
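Conceptually, each AI-powered column pairs a header with a prompt, and generating a cell means asking the model that prompt against one paper's text. A hedged sketch (names are hypothetical, not SeerAI's actual code):

```typescript
// One column = one reusable extraction question.
interface TableColumn {
  title: string;  // e.g. "Sample Size"
  prompt: string; // e.g. "What is the sample size?"
}

// Build the per-cell prompt sent to the model: the column's question
// followed by the paper's extracted text.
function buildCellPrompt(column: TableColumn, paperText: string): string {
  return `${column.prompt}\n\n---\n\n${paperText}`;
}
```

Generating an entire column then just maps this over every paper row in the table.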
- Flexible OCR Options:
- Mistral OCR: High-quality cloud OCR (Recommended).
- DataLab.to: Reliable cloud-based extraction.
- Local Marker: Run your own local OCR server for free, private processing.
- Auto-Processing: Automatically processes unindexed PDFs when needed.
- Model Presets: Pre-configured settings for popular providers:
- OpenAI (GPT-5, o3)
- Anthropic (Claude Sonnet 4.5)
- Google (Gemini 3 Pro)
- DeepSeek, Mistral, Groq, OpenRouter
- Local Models (OpenAI-compatible endpoints, Ollama, LM Studio)
  - 12-16 GB VRAM: Qwen3-4B-Thinking-2507
  - 24-32 GB VRAM: gpt-oss-20b
  - 48-64 GB VRAM: QwQ-32B
  - 96-128 GB VRAM: Qwen3-Next-80B-A3B-Instruct
- Per-Conversation Models: Switch models dynamically based on the task complexity.
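The VRAM tiers above can be read as a simple lookup. This helper is purely illustrative (the thresholds mirror the list; the function is not part of the plugin):

```typescript
// Local model suggestions by minimum available VRAM, largest first.
const LOCAL_MODEL_BY_VRAM: [number, string][] = [
  [96, "Qwen3-Next-80B-A3B-Instruct"],
  [48, "QwQ-32B"],
  [24, "gpt-oss-20b"],
  [12, "Qwen3-4B-Thinking-2507"],
];

function suggestLocalModel(vramGb: number): string | undefined {
  for (const [minGb, model] of LOCAL_MODEL_BY_VRAM) {
    if (vramGb >= minGb) return model;
  }
  return undefined; // below 12 GB: no preset suggestion
}
```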
- Download the latest release (`.xpi` file) from Releases.
- In Zotero, go to Tools → Add-ons.
- Click the gear icon ⚙️ and select Install Add-on From File....
- Select the downloaded `.xpi` file.
- Restart Zotero.
```bash
# Clone the repository
git clone https://github.com/dralkh/seerai.git
cd seerai

# Install dependencies
npm install

# Build the plugin
npm run build

# The .xpi file will be generated in the root directory
```

Go to Zotero → Settings → seerai to configure your AI providers and services.
Use the Add Configuration button to set up your AI models.
- Presets: Select from built-in presets (OpenAI, Anthropic, Ollama, etc.) for quick setup.
- Custom: Manually configure API URL, Key, and Model ID for any OpenAI-compatible provider.
- Default: Set a preferred model as your default for new conversations.
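For a custom provider, the configuration boils down to four values plus a default flag. An illustrative shape (field names are assumptions, not SeerAI's stored preference keys; the Ollama URL and model are just an example):

```typescript
// Minimal shape of an OpenAI-compatible provider configuration.
interface ProviderConfig {
  name: string;
  apiUrl: string;   // base URL of the OpenAI-compatible endpoint
  apiKey: string;
  modelId: string;
  isDefault: boolean;
}

const ollamaLocal: ProviderConfig = {
  name: "Ollama (local)",
  apiUrl: "http://localhost:11434/v1", // Ollama's OpenAI-compatible API
  apiKey: "ollama",                    // Ollama ignores the key's value
  modelId: "qwen3:4b",
  isDefault: true,
};

// New conversations start on the default config, else the first one.
function pickModel(configs: ProviderConfig[]): ProviderConfig | undefined {
  return configs.find((c) => c.isDefault) ?? configs[0];
}
```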
Choose your preferred text extraction engine:
- Mistral OCR: Requires Mistral API Key. Best for accuracy.
- Cloud (DataLab.to): Requires DataLab API Key.
- Local Marker Server: Requires running a local Python server.
  - URL: `http://localhost:8001` (default)
  - See the Marker Project for setup.
- Semantic Scholar: Add your API Key for higher rate limits and faster searches.
- Firecrawl: Add an API key to enable deep web search capabilities, or point to a local instance (see GitHub).
- Select a paper (or multiple) in your library.
- Open the SeerAI sidebar tab.
- (Optional) Customize context inclusions via the settings icon (Abstracts, Notes).
- Type your question or use templates from the Prompt Library (Book icon).
- Open the Tables tab in the main view.
- Click `+` on the side strip to add a new column.
- Define the column header and AI prompt (e.g., "What is the sample size?").
- Drag and drop papers into the table.
- Click Generate on cells to extract data.
- Access via the Book icon in chat.
- Use built-in templates (Summarize, Critique, Compare).
- Create custom templates with placeholders:
  - `!`: Saved Prompts
  - `/`: Papers
  - `^`: Folders
  - `~`: Tags
  - `@`: Authors
  - `#`: Topics
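Resolving these placeholders amounts to tokenizing the template by each word's first character. A sketch (the placeholder characters come from the list above; the tokenizer itself is illustrative):

```typescript
// Placeholder sigils from the Prompt Library, mapped to token kinds.
const PLACEHOLDERS: Record<string, string> = {
  "!": "saved-prompt",
  "/": "paper",
  "^": "folder",
  "~": "tag",
  "@": "author",
  "#": "topic",
};

interface Token { kind: string; value: string }

// Split e.g. "Compare /PaperA with /PaperB by @Smith" into typed tokens.
function tokenize(template: string): Token[] {
  const tokens: Token[] = [];
  for (const word of template.split(/\s+/)) {
    const kind = PLACEHOLDERS[word[0]];
    tokens.push(kind ? { kind, value: word.slice(1) } : { kind: "text", value: word });
  }
  return tokens;
}
```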
The following advanced features are proposed to enhance SeerAI's capabilities. They are currently on the idea board.
Enhanced search functionality to help users find relevant literature more effectively.
- Autocomplete: Intelligent suggestions for tags, creators, and collections as you type.
- Complex Queries: Support for boolean logic (AND/OR) and nested search conditions (e.g., "Title contains X AND Year > 2020").
- Field-Specific Search: Dedicated filters for titles, authors, years, and tags.
- Voice, Transcription, Embedding Integration: Support for OpenAI-compatible embedding, voice, and transcription models (e.g., `text-embedding-3-small`, local Ollama embeddings).
- Contextual Retrieval: Find papers based on conceptual similarity rather than just exact text matches.
- In-Memory Vector Store: Fast, local indexing of session-relevant papers for semantic analysis.
- RAG: Used as a fallback when the conversation reaches 80% of the model's context limit.
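The proposed in-memory vector store reduces to cosine similarity over embedding vectors. A minimal sketch (the vectors would come from any OpenAI-compatible embedding model; class and method names are invented for illustration):

```typescript
// Cosine similarity between two equal-length embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

class VectorStore {
  private items: { id: string; vec: number[] }[] = [];

  add(id: string, vec: number[]): void {
    this.items.push({ id, vec });
  }

  // Return the ids of the k nearest papers by cosine similarity.
  query(vec: number[], k: number): string[] {
    return [...this.items]
      .sort((x, y) => cosine(y.vec, vec) - cosine(x.vec, vec))
      .slice(0, k)
      .map((x) => x.id);
  }
}
```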
- Verifier Button: One-click verification to check all extracted data against source text.
- Confidence Scores: AI-generated confidence ratings for each extracted data point.
- Source Highlighting: Click a cell to see the exact passage in the paper where the data came from.
- URL Discovery: Use the Firecrawl API for PDF discovery.
- Citation referencing within tables and chat on generation.
- MCP Connectors: UI revamp.
- Node.js 18+
- Zotero 7
The codebase follows a modular architecture:
```
seerai/
├── addon/                      # Zotero integration files (XUL/XHTML)
├── src/
│   ├── modules/                # Core feature modules
│   │   ├── chat/               # Chat engine & state
│   │   ├── assistant.ts        # Main assistant logic
│   │   ├── firecrawl.ts        # Firecrawl integration
│   │   ├── ocr.ts              # OCR implementation
│   │   ├── openai.ts           # LLM client implementation
│   │   ├── semanticScholar.ts  # Semantic Scholar integration
│   │   └── preferenceScript.ts # Preferences logic
│   ├── utils/                  # Utility functions
│   └── hooks.ts                # Zotero event listeners
└── package.json
```
```bash
npm start        # Start dev server with hot reload
npm run build    # Build for production
npm run lint:fix # Fix code style issues
```

Contributions are welcome!
- Fork the repo.
- Create a feature branch (`git checkout -b feature/MyFeature`).
- Commit changes (`git commit -m 'Add MyFeature'`).
- Push to the branch (`git push origin feature/MyFeature`).
- Open a Pull Request.
MIT License - see LICENSE for details.