eco-solver is a TypeScript / NestJS service that listens for, validates, and fulfils on-chain intents across multiple EVM chains. It orchestrates BullMQ workers, Redis, MongoDB, AWS KMS signers and more – all wrapped in modular NestJS components.
- Intent life-cycle automation – create ⇒ validate ⇒ solve ⇒ fulfil.
- Multi-chain support through `viem` + Alchemy RPC / WS endpoints.
- Secure signing via AWS KMS (EOA & ERC-4337 smart-wallet accounts).
- Liquidity aggregation / rebalancing powered by LiFi & custom providers.
- Distributed queues with BullMQ + Redis; workers scale independently.
- Observability – Pino JSON logging, Nest Terminus health-checks, Feature-flags (LaunchDarkly).
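The create ⇒ validate ⇒ solve ⇒ fulfil pipeline can be pictured as a small state machine. The sketch below is illustrative only; the type and field names are assumptions, not the actual eco-solver domain model:

```typescript
// Hypothetical intent life-cycle sketch (names are illustrative,
// not eco-solver's real types): create => validate => solve => fulfil.
type IntentStatus = 'created' | 'validated' | 'solved' | 'fulfilled' | 'failed'

interface Intent {
  hash: string
  sourceChain: number
  destChain: number
  status: IntentStatus
}

// Each stage only accepts intents coming from the previous one;
// 'fulfilled' and 'failed' are terminal.
const NEXT: Record<IntentStatus, IntentStatus | null> = {
  created: 'validated',
  validated: 'solved',
  solved: 'fulfilled',
  fulfilled: null,
  failed: null,
}

function advance(intent: Intent): Intent {
  const next = NEXT[intent.status]
  if (next === null) throw new Error(`intent ${intent.hash} is terminal`)
  return { ...intent, status: next }
}
```

In the real service each transition is handled by a BullMQ worker, so the stages can scale independently.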
- Prerequisites
- MCP (Model Context Protocol) Integrations
- Setup
- Typical pnpm Scripts
- API Quick-Start
- Contributing
## Prerequisites

- Node.js v20.19.0 (use nvm)
- pnpm v9.x (managed by Corepack)
- Docker & Docker Compose v3.8+
- AWS credentials (SSO or IAM) for Secrets Manager & KMS access
Redis & MongoDB can run locally or via Docker (see below).
## MCP (Model Context Protocol) Integrations

This project includes MCP server integrations for enhanced development tooling:
### MongoDB MCP Server

- Development: Connects to the local MongoDB instance
- Production: Connects to production MongoDB (requires an AWS VPN connection)
- Configuration: Uses `.mcp.json` with connection strings from environment variables

### GitHub MCP Server

- Purpose: GitHub repository management and operations
- Requirements: GitHub Personal Access Token must be set as the `GITHUB_PERSONAL_ACCESS_TOKEN` environment variable
- Usage: Enables GitHub operations through MCP tools
When setting up this repository for the first time, follow these steps to configure MCP services:
- AWS VPN Connection: Must be active and connected for production/preprod database access
- Docker: Must be running locally for GitHub MCP service
- Environment Variables: Required for service authentication
- env-cmd: Tool for loading environment variables from config files (install globally or use via npx)
1. Verify Prerequisites:

   ```bash
   # Check AWS VPN is connected (verify internal network access)

   # Check Docker is running
   docker --version

   # Install env-cmd globally (optional - can also use npx)
   npm install -g env-cmd
   # OR verify npx can access it
   npx env-cmd --version
   ```
2. Configure Environment Variables:

   ```bash
   # Set GitHub Personal Access Token for GitHub MCP
   export GITHUB_PERSONAL_ACCESS_TOKEN=your_github_token_here

   # Verify environment files exist (these should be in the repo)
   ls -la .env*
   # Should show: .env-cmdrc, .env.preprod, .env.prod
   ```
3. Test MCP Services: The following services are configured in `.mcp.json`:

   - mongodb-preprod: Uses `env-cmd -e preprod` to load the preprod MongoDB connection from `.env-cmdrc`
   - mongodb-prod: Uses `env-cmd -e prod` to load the production MongoDB connection from `.env-cmdrc`
   - github: Uses Docker to run the GitHub MCP server with your personal access token

   The `env-cmd` tool reads environment-specific configuration from `.env-cmdrc` and applies it before running `mongodb-mcp-server`.
4. Verify Connection Strings: Connection strings are automatically loaded from:

   - `.env-cmdrc` (JSON format with preprod/prod environments)
   - `.env.preprod` and `.env.prod` (traditional .env format)

   These files contain the MongoDB Atlas connection strings for the MCP user.
- MongoDB Connection Issues: Ensure the AWS VPN is active and you can reach the internal MongoDB clusters
- GitHub MCP Issues: Verify `GITHUB_PERSONAL_ACCESS_TOKEN` is set and Docker is running
- Service Not Found: Check that `mongodb-mcp-server` is available via `npx -y mongodb-mcp-server`
- MongoDB credentials are read-only MCP user credentials, not production write access
- GitHub token should have minimal required permissions for repository operations
- All MCP services run locally and do not expose external endpoints
The MCP configuration is defined in `.mcp.json` at the project root.
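To make the file's shape concrete, it can be modeled as a typed object following the standard MCP client config layout (`mcpServers` entries with a command, args, and optional env). The specific commands and the Docker image name below are assumptions based on the services described above, not the repository's actual file:

```typescript
// Illustrative model of a .mcp.json file; the concrete entries are
// assumptions, not eco-solver's real configuration.
interface McpServer {
  command: string
  args: string[]
  env?: Record<string, string>
}

const mcpConfig: { mcpServers: Record<string, McpServer> } = {
  mcpServers: {
    // env-cmd loads the preprod connection string before starting the server
    'mongodb-preprod': {
      command: 'env-cmd',
      args: ['-e', 'preprod', 'npx', '-y', 'mongodb-mcp-server'],
    },
    // the GitHub MCP server runs inside Docker (image name is hypothetical)
    github: {
      command: 'docker',
      args: ['run', '-i', '--rm', '-e', 'GITHUB_PERSONAL_ACCESS_TOKEN', 'ghcr.io/github/github-mcp-server'],
      env: { GITHUB_PERSONAL_ACCESS_TOKEN: '${GITHUB_PERSONAL_ACCESS_TOKEN}' },
    },
  },
}
```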
## Setup

```bash
# Clone repo and enter directory
git clone <repository-url>
cd eco-solver

# Node & pnpm
nvm install 20.19.0
nvm use 20.19.0
corepack enable

# Install packages
pnpm install
```

eco-solver pulls sensitive config from Secrets Manager. The easiest way to authenticate locally is AWS SSO:
```bash
aws configure sso                # choose your SSO profile, e.g. eco-dev
aws sso login --profile eco-dev
export AWS_PROFILE=eco-dev
```

Configuration values are merged from multiple sources:
- `config/default.json` – sane defaults
- `config/<NODE_ENV>.json` – per-environment overrides
- Env vars (highest priority)
- AWS Secrets Manager JSON – fetched on boot via `EcoConfigService`
See the `config/` folder for the full list & structure.
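A layered merge like this resolves key-by-key, with later sources overriding earlier ones. The sketch below is a minimal illustration of that semantics; it is not `EcoConfigService`'s actual implementation, and the config keys are made up:

```typescript
// Minimal deep-merge sketch: later sources override earlier ones
// key-by-key, recursing into nested objects.
type Config = { [key: string]: unknown }

function isObject(v: unknown): v is Config {
  return typeof v === 'object' && v !== null && !Array.isArray(v)
}

function deepMerge(base: Config, override: Config): Config {
  const out: Config = { ...base }
  for (const [k, v] of Object.entries(override)) {
    const current = out[k]
    out[k] = isObject(current) && isObject(v) ? deepMerge(current, v) : v
  }
  return out
}

// Lowest to highest priority, mirroring the list above (keys are illustrative).
const defaults: Config = { redis: { host: 'localhost', port: 6379 }, logLevel: 'info' }
const perEnv: Config = { logLevel: 'debug' }
const envVars: Config = { redis: { host: 'redis.internal' } }

const merged = [defaults, perEnv, envVars].reduce(deepMerge, {} as Config)
// redis.host is overridden by the env var layer while redis.port survives from defaults
```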
Compose profiles are required because every service in `docker-compose.yml` is associated with one:

| Profile | Starts | When to use |
|---|---|---|
| `db` | `mongodb`, `redis` | Run the Node server on your host machine while relying on local Docker databases. |
| `app` | `app` | Useful when you already have Mongo/Redis running elsewhere. |
| `all` | `app`, `mongodb`, `redis` | Complete stack for one-shot testing. |

If you omit `--profile …`, no services are started at all; this is the expected behaviour when every service is gated by a profile.
Example flows:
```bash
# ▸ Full stack (API + DBs) in the background
docker compose --profile all up -d

# ▸ Only Mongo & Redis
docker compose --profile db up -d

# ▸ Only the NestJS API (assumes DBs available)
docker compose --profile app up --build   # --build to pick up local code changes

# Stop everything
docker compose --profile all down
```

Internally, the `app` service mounts `./src` and `./config` as bind-volumes and runs `pnpm start:dev`, so code changes are live-reloaded.
A few environment variables are forwarded into the container via `docker-compose.yml`:

```yaml
AWS_PROFILE: ${AWS_PROFILE}           # whichever profile you logged in with (see AWS section)
NODE_ENV: ${NODE_ENV:-development}    # environment used to run
NODE_CONFIG: |                        # in-container overrides for DB / Redis
  { ... }
```

Make sure you export `AWS_PROFILE` in your shell before launching Compose so the container can load credentials from the mounted `~/.aws` folder.
## Typical pnpm Scripts

```bash
pnpm start:dev      # Hot-reload dev mode
pnpm start          # Compile & run once

pnpm test           # all unit tests
pnpm test --watch
```

| Script | Purpose |
|---|---|
| `pnpm build` | Compile TypeScript into `dist/` |
| `pnpm cli` | Invoke commander CLI utilities (balance check, transfer, etc.) |
| `pnpm lint` / `lint:fix` | ESLint code quality |
| `pnpm prettier` / `prettier:fix` | Code formatting |
| `pnpm test:cov` | Unit-test coverage report |
## API Quick-Start

Once the server is up (`localhost:3000` by default):

```bash
# Get balances for all solver wallets (flattened JSON)
curl "http://localhost:3000/api/v1/balance?flat=true"

# Request a quote (payload abbreviated)
curl -X POST http://localhost:3000/api/v1/quote \
  -H 'Content-Type: application/json' \
  -d '{
        "sourceChain": 1,
        "destChain": 137,
        "tokenIn": "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606e48",
        "amount": "1000000"
      }'
```

Swagger UI is auto-generated at http://localhost:3000/api.
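For clients building the quote payload programmatically, a basic shape check helps catch malformed requests before they hit the API. The interface below mirrors the field names in the example request; the validation rules themselves are assumptions, not the server's actual DTO validation:

```typescript
// Hypothetical client-side shape of the quote payload shown above.
interface QuoteRequest {
  sourceChain: number // EVM chain id of the source chain
  destChain: number   // EVM chain id of the destination chain
  tokenIn: string     // ERC-20 token address (0x-prefixed, 20 bytes)
  amount: string      // token amount in base units, as a decimal string
}

// Returns a list of validation errors; empty means the payload looks sane.
function validateQuoteRequest(q: QuoteRequest): string[] {
  const errors: string[] = []
  if (!Number.isInteger(q.sourceChain) || q.sourceChain <= 0)
    errors.push('sourceChain must be a positive chain id')
  if (!Number.isInteger(q.destChain) || q.destChain <= 0)
    errors.push('destChain must be a positive chain id')
  if (!/^0x[0-9a-fA-F]{40}$/.test(q.tokenIn))
    errors.push('tokenIn must be a 20-byte hex address')
  if (!/^[0-9]+$/.test(q.amount))
    errors.push('amount must be an unsigned integer string')
  return errors
}
```

Note that the `tokenIn` value in the curl example above is abbreviated, so it would not pass the strict 40-hex-character address check here.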