Modular AI-assisted developer toolkit that combines deterministic rule-based analysis with local LLM reasoning to interpret stack traces, classify failures, and generate structured debugging workflows.


DevCompanion

DevCompanion is a modular, explainable AI-assisted developer toolkit designed to help engineers diagnose and resolve complex technical issues. Unlike generic chatbots, DevCompanion uses a structured routing engine and specialized analysis modules to provide deterministic, high-confidence insights.

Project Overview

DevCompanion is built on a "Deterministic First, AI Second" philosophy. It prioritizes rule-based analysis for reliability and uses local Large Language Models (LLMs) via Ollama for optional enhancement.

Key Features

  • Modular Architecture: Easily extendable with new analysis modules.
  • Smart Problem Router: Automatically detects input types (stack traces, application logs, or configuration files) and routes them to the correct module.
  • Deterministic Analysis: Rule-based scoring for identifying failure layers and detecting anomalies or security risks.
  • Buddy Mode (Interactive Chat): A specialized AI partner that understands the deterministic analysis and helps you resolve the issue step-by-step.
  • Explainable Execution Trace: Full transparency into how the tool reached its conclusions.
  • Local AI Support: Optional integration with Ollama for root cause hypothesis generation.
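
To make the routing idea concrete, here is a minimal sketch of how input-type detection might work. The function name and the exact heuristics are illustrative assumptions, not the actual contents of `router.ts`:

```typescript
// Hypothetical input-type detection (the real router.ts may use different rules).
type InputKind = "stacktrace" | "log" | "config";

function detectInputKind(text: string): InputKind {
  // Stack traces: "at func (file:line:col)" frames or Python traceback markers.
  if (
    /^\s+at .+\(.+:\d+:\d+\)/m.test(text) ||
    /Traceback \(most recent call last\)/.test(text)
  ) {
    return "stacktrace";
  }
  // Config files: most non-empty lines look like "key=value" or "key: value".
  const lines = text.split("\n").filter((l) => l.trim().length > 0);
  const configLike = lines.filter((l) => /^\s*[\w.-]+\s*[:=]/.test(l)).length;
  if (lines.length > 0 && configLike / lines.length > 0.8) {
    return "config";
  }
  // Fallback: treat everything else as an application log.
  return "log";
}
```

A router built this way stays deterministic: the same input always maps to the same module, and each rule match can be recorded in the execution trace.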

Architecture

devcompanion/
│
├── main.ts         # CLI Entry Point
├── router.ts       # Input Detection & Routing
├── state.ts        # Central Execution State
├── modules/        # Specialized Analysis Modules
│   ├── stacktrace.ts
│   ├── log.ts      # NEW: Log Analysis
│   ├── config.ts   # NEW: Configuration Analysis
│   └── base_module.ts
├── ai/             # AI Enhancement Layer
│   └── ai_engine.ts
└── utils/          # Supporting Utilities
    ├── parser.ts
    ├── scoring.ts
    └── formatter.ts
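
Each analysis module plugs into a shared contract. The interface and class below are a hypothetical sketch of what `base_module.ts` might look like; the names and fields are assumptions for illustration, and the toy rule is deliberately simplified:

```typescript
// Hypothetical contract for analysis modules; the real base_module.ts may differ.
interface AnalysisResult {
  predictedLayer: string;
  confidence: number;                  // 0..1
  layerScores: Record<string, number>; // rule-based score per layer
  trace: string[];                     // explainable execution trace entries
}

abstract class BaseModule {
  abstract readonly name: string;
  // Deterministic, rule-based pass only; LLM enhancement happens elsewhere.
  abstract analyze(input: string): AnalysisResult;
}

// Toy module showing the shape of a concrete implementation.
class DemoStackTraceModule extends BaseModule {
  readonly name = "DemoStackTraceModule";
  analyze(input: string): AnalysisResult {
    const backendHit = /controller\.|HTTP 5\d\d/.test(input);
    const layerScores: Record<string, number> = {
      Backend: backendHit ? 0.6 : 0.2,
      Frontend: backendHit ? 0.2 : 0.5,
    };
    const predictedLayer = backendHit ? "Backend" : "Frontend";
    return {
      predictedLayer,
      confidence: layerScores[predictedLayer],
      layerScores,
      trace: [`[${this.name}] backend signals matched: ${backendHit}`],
    };
  }
}
```

Because every module returns the same result shape, the router can dispatch to any of them and the formatter can render a uniform report.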

CLI Usage

Analyze a Stack Trace

npx tsx src/devcompanion/main.ts analyze path/to/error.txt

Options

  • --ai: Enable local LLM reasoning via Ollama (requires Ollama running locally).
  • --verbose: Show the full execution trace, including routing and scoring details.
  • --json: Output the entire state in JSON format for integration with other tools.

Chat with DevCompanion (Buddy Mode)

npx tsx src/devcompanion/main.ts chat path/to/error.txt "How do I fix this database connection issue?"

Sample Output


=== DevCompanion Analysis ===

Predicted Layer: Backend
Confidence: 62.0%

Layer Breakdown:
  Frontend: 20%
  Backend: 60%
  Database: 5%
  Network: 10%
  Auth: 5%

Reproduction Plan:
1. Check server-side logs for detailed error messages.
2. Verify API endpoint availability using cURL or Postman.
3. Restart the backend service to clear transient states.

Debug Plan:
1. Trace the request through middleware and controllers.
2. Check for resource leaks (memory, file handles, connections).
3. Verify environment variables and configuration files.

=== Execution Trace ===
[Main] Loaded input from path/to/error.txt
[Router] Detected stacktrace (confidence 0.82)
[StackTraceModule] Starting deterministic analysis...
[StackTraceModule] Matched signals: HTTP 5xx, controller.js
[Scoring] Backend score increased to 60%
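
The scoring step shown in the trace can be sketched as a weighted signal table whose matches are normalized into percentages. The signal patterns and weights below are illustrative assumptions, not the actual rules in `scoring.ts`:

```typescript
// Illustrative signal table; the real scoring.ts rules are not shown here.
const SIGNALS: Array<{ pattern: RegExp; layer: string; weight: number }> = [
  { pattern: /HTTP 5\d\d/, layer: "Backend", weight: 3 },
  { pattern: /controller\.(js|ts)/, layer: "Backend", weight: 3 },
  { pattern: /ECONNREFUSED|ETIMEDOUT/, layer: "Network", weight: 2 },
  { pattern: /SQL|deadlock/i, layer: "Database", weight: 2 },
];

function scoreLayers(input: string): Record<string, number> {
  // Start each layer with a small prior so no layer is ever exactly zero.
  const raw: Record<string, number> = {
    Frontend: 1, Backend: 1, Database: 1, Network: 1, Auth: 1,
  };
  for (const s of SIGNALS) {
    if (s.pattern.test(input)) raw[s.layer] += s.weight;
  }
  // Normalize to percentages so the layer breakdown is comparable across inputs.
  const total = Object.values(raw).reduce((a, b) => a + b, 0);
  const pct: Record<string, number> = {};
  for (const [layer, v] of Object.entries(raw)) {
    pct[layer] = Math.round((v / total) * 100);
  }
  return pct;
}
```

Because the table is static and the arithmetic is deterministic, the same input always produces the same breakdown, which is what makes the execution trace reproducible.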

Roadmap

  • Prompt Linter Module: Help engineers write better prompts for LLMs.
  • Repo Analyzer Module: Scan repositories for common configuration issues.
  • Network Traffic Analyzer: Analyze PCAP or HAR files for network issues.

Local Setup

Prerequisites

  • Node.js and npm (the CLI is run via npx tsx).
  • Ollama (optional, only needed for the local LLM features).

Installation

  1. Clone the repository.
  2. Install dependencies:
    npm install

Ollama Setup (Optional)

  1. Install Ollama from ollama.com.
  2. Pull the Llama3 model:
    ollama pull llama3
  3. Ensure Ollama is running at http://localhost:11434.
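
Once Ollama is running, the AI layer can talk to its local `/api/generate` endpoint. The sketch below is a minimal example of such a call; the prompt wording and function names are illustrative assumptions, not the actual contents of `ai_engine.ts`:

```typescript
// Sketch of an optional Ollama call; prompt text and names are illustrative.
interface OllamaPayload {
  model: string;
  prompt: string;
  stream: boolean;
}

function buildOllamaPayload(analysisSummary: string): OllamaPayload {
  return {
    model: "llama3",
    prompt:
      "Given this deterministic analysis, suggest a root-cause hypothesis:\n" +
      analysisSummary,
    stream: false, // request one JSON response instead of a token stream
  };
}

async function generateHypothesis(analysisSummary: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildOllamaPayload(analysisSummary)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: HTTP ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

Keeping the LLM call behind a single function like this preserves the "Deterministic First, AI Second" split: if Ollama is unavailable, the rule-based analysis still runs unchanged.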

Developed by Aphator

