Train your AI self, amplify you, bridge the world

Second Me

Homepage Report Discord Twitter Reddit

Our Vision

Companies like OpenAI built "Super AI" that threatens human independence. We crave individuality: AI that amplifies, not erases, you.

We're challenging that with "Second Me": an open-source prototype where you craft your own AI self, a new AI species that preserves you, delivers your context, and defends your interests.

It's locally trained and hosted, so your data stays under your control, yet globally connected, scaling your intelligence across an AI network. Beyond that, it's your AI identity interface: a bold standard that links your AI to the world, sparks collaboration among AI selves, and underpins tomorrow's truly native AI apps.

Join us. Tech enthusiasts, AI pros, domain experts: Second Me is your launchpad to extend your mind into the digital horizon.

Key Features

Train Your AI Self with AI-Native Memory (Paper)

Start training your Second Me today with your own memories! Using Hierarchical Memory Modeling (HMM) and the Me-Alignment Algorithm, your AI self captures your identity, understands your context, and reflects you authentically.

Scale Your Intelligence on the Second Me Network

Launch your AI self from your laptop onto our decentralized network, where anyone or any app can connect with your permission, sharing your context as your digital identity.

Build Tomorrow's Apps with Second Me

Roleplay: Your AI self switches personas to represent you in different scenarios.
AI Space: Collaborate with other Second Mes to spark ideas or solve problems.

100% Privacy and Control

Unlike traditional centralized AI systems, Second Me ensures that your information and intelligence remain local and completely private.

Getting Started and Staying Tuned with Us

Star and watch the repository to receive all release notifications from GitHub without delay!

Quick Start

🍎 Option 1: Local Setup (macOS with Apple Silicon)

Prerequisites
  • Python 3.12 or higher
  • Xcode Command Line Tools
Installing Xcode Command Line Tools

If you haven't installed Xcode Command Line Tools yet, you can install them by running:

xcode-select --install

After installation, you may need to accept the license agreement:

sudo xcodebuild -license accept
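
Before continuing, you can sanity-check both prerequisites from the terminal. This snippet is ours, not part of the repository, and assumes python3 is on your PATH:

```shell
# Check Python version: the project requires 3.12 or higher
python3 -c 'import sys; ok = sys.version_info >= (3, 12); print("Python OK" if ok else "Python too old: need 3.12+")'
# Prints the tools path when the Xcode Command Line Tools are installed
xcode-select -p 2>/dev/null || echo "Xcode Command Line Tools not found; run: xcode-select --install"
```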
Setup Steps
  1. Clone the repository

    git clone [email protected]:Mindverse/Second-Me.git
    cd Second-Me
  2. Set up the environment

    Choose one of the following options:

    Option A: For users with existing conda environment

    If you already have conda installed:

    1. Create a new environment from our environment file:

      conda env create -f environment.yml   # This will create an environment named 'second-me'
      conda activate second-me
    2. Set the custom conda mode in .env:

      CUSTOM_CONDA_MODE=true
    3. Run setup:

      make setup
    Option B: For new users

    If you're new or want a fresh environment:

    make setup

    This command will automatically:

    • Install all required system dependencies (including conda if not present)
    • Create a new Python environment named 'second-me'
    • Build llama.cpp
    • Set up frontend environment
  3. Start the service

    make start

🐳 Option 2: Docker Setup (For Linux & Windows users)


Docker provides a consistent environment across different operating systems.

Prerequisites
  • Docker and Docker Compose installed on your system

Important: You must install both Docker and Docker Compose before proceeding. If you haven't installed them yet, follow the official installation guides for your platform.

Setup Steps
  1. Clone the repository
git clone [email protected]:Mindverse/Second-Me.git
cd Second-Me
  2. Build the Docker images
make docker-build
  3. Start the containers
make docker-up
  4. To stop the containers when you're done
make docker-down
Other Useful Docker Commands
  • Restart all services
make docker-restart-all
  • Rebuild and restart only the backend
make docker-restart-backend
  • Rebuild and restart only the frontend
make docker-restart-frontend
  • Note: if you are using Apple Silicon and want to run Docker commands directly, set the PLATFORM environment variable to apple and the DOCKER_BACKEND_DOCKERFILE environment variable to Dockerfile.backend.apple. For example:
PLATFORM=apple DOCKER_BACKEND_DOCKERFILE=Dockerfile.backend.apple docker-compose up -d --build

🖥️ Option 3: Manual Setup (Cross-Platform Guide)


In this section, we explore how to deploy both the frontend and backend on a single server, as well as how to enable cross-server communication between the frontend and backend using separate servers.

✅ Prerequisites
  • Miniforge/Miniconda
📦 Install Dependencies

The following scripts are sourced from scripts/setup.sh and scripts/start_local.sh.

🐍 Python Environment Setup with Conda and Poetry

We recommend managing the Python environment with Miniconda and handling dependencies with Poetry. While Conda and Poetry are independent tools, they can be used together effectively:

  • Conda provides flexible and isolated environment management.
  • Poetry offers strict and declarative dependency management.

Below is a step-by-step example of combining them:

# Set up Python Environment
conda create -n secondme python=3.12
conda activate secondme

# (Recommended) Install Poetry inside the Conda environment
# This avoids using system-wide Poetry and keeps dependencies isolated
pip install poetry

# (Optional) Use a custom Python package index (e.g., the TUNA mirror for better speed in China)
poetry source add --priority=primary tuna https://pypi.tuna.tsinghua.edu.cn/simple

poetry install --no-root --no-interaction

# Install specific version of GraphRAG from local archive
# ⚠️ Adjust the path separator based on your OS (e.g., \ on Windows, / on Unix)
pip install --force-reinstall dependencies/graphrag-1.2.1.dev27.tar.gz
# Install Frontend Dependencies 
cd lpm_frontend
npm install
cd ..

# Build llama.cpp Dependencies 
unzip -q dependencies/llama.cpp.zip
cd llama.cpp
mkdir -p build && cd build
cmake ..
cmake --build . --config Release
cd ../..
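
After the build finishes, you can sanity-check that binaries were produced. The path below follows from the cmake steps above; the exact layout can vary by platform:

```shell
# The Release build usually places binaries under llama.cpp/build/bin
ls llama.cpp/build/bin 2>/dev/null || echo "no build output yet; re-run the cmake steps above"
```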
Run Servers
# Initialize SQL Database
mkdir -p "./data/sqlite"
cat docker/sqlite/init.sql | sqlite3 ./data/sqlite/lpm.db

# Initialize ChromaDB Database
mkdir -p logs
python docker/app/init_chroma.py

# Start the Backend Server (develop mode)
python -m flask run --host=0.0.0.0 --port=8002 >> "logs/backend.log" 2>&1
# If deploying in a production environment, please use `nohup` and `disown` commands to keep it running persistently in the background.

# Start the Frontend Server (Open Another Terminal Shell)
cd lpm_frontend
npm run build
npm run start
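
To confirm the database initialized correctly, you can list the tables created by the init script (uses the same sqlite3 CLI and database path as the SQL initialization step above):

```shell
# Lists the tables created by docker/sqlite/init.sql; empty output means init did not run
sqlite3 ./data/sqlite/lpm.db ".tables"
```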

ℹ️ Note: If the frontend and backend are deployed on separate servers, make sure to configure the HOST_ADDRESS in the .env file accordingly.
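
For example, to point a separately deployed frontend at the backend host, you can rewrite the HOST_ADDRESS entry in place. The address below is a placeholder; substitute your backend server's address:

```shell
# Replace an existing HOST_ADDRESS line in .env (192.168.1.50 is illustrative)
sed -i.bak 's/^HOST_ADDRESS=.*/HOST_ADDRESS=192.168.1.50/' .env
# Show the resulting value
grep '^HOST_ADDRESS=' .env
```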

Accessing the Service

After starting the service (either with local setup or Docker), open your browser and visit:

http://localhost:3000
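
To check reachability from a script instead of the browser, a small probe (assuming curl is installed):

```shell
# Probe the frontend port; prints a status line either way and never aborts the script
if curl -sf -o /dev/null http://localhost:3000; then
  echo "Second Me frontend is reachable"
else
  echo "frontend not reachable yet; check that the service started and the port is open"
fi
```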

View help and more commands

make help

Important Notes

  1. Ensure you have sufficient disk space (at least 10GB recommended)
  2. If using local setup with an existing conda environment, ensure there are no conflicting package versions
  3. First startup may take a few minutes to download and install dependencies
  4. Some commands may require sudo privileges
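
The disk-space recommendation can be checked from the shell; a portable sketch using POSIX df -Pk (1K blocks):

```shell
# Report whether at least 10GB is free on the current filesystem
avail_kb="$(df -Pk . | awk 'NR==2 {print $4}')"
if [ "$avail_kb" -ge $((10 * 1024 * 1024)) ]; then
  echo "sufficient disk space: ${avail_kb} KB free"
else
  echo "warning: less than 10GB free (${avail_kb} KB)"
fi
```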

Troubleshooting

If you encounter issues, check:

  1. For local setup: Python and Node.js versions meet requirements
  2. For local setup: You're in the correct conda environment
  3. All dependencies are properly installed
  4. System firewall allows the application to use required ports
  5. For Docker setup: Docker daemon is running and you have sufficient permissions
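
For the firewall/ports item, you can test whether the application's ports (3000 for the frontend and 8002 for the backend, per this guide) are already bound, using a short python3 probe:

```shell
# connect_ex returns 0 only when something is listening on the port
for port in 3000 8002; do
  if python3 -c "import socket, sys; s = socket.socket(); sys.exit(0 if s.connect_ex(('127.0.0.1', $port)) == 0 else 1)"; then
    echo "port $port: in use"
  else
    echo "port $port: free"
  fi
done
```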

Tutorial and Use Cases

πŸ› οΈ Feel free to follow User tutorial to build your Second Me.

💡 Check out the links below to see how Second Me can be used in real-life scenarios:

Join the Community

Coming Soon

The following features have been completed internally and are being gradually integrated into the open-source project. For detailed experimental results and technical specifications, please refer to our Technical Report.

Model Enhancement Features

  • Long Chain-of-Thought Training Pipeline: Enhanced reasoning capabilities through extended thought process training
  • Direct Preference Optimization for L2 Model: Improved alignment with user preferences and intent
  • Data Filtering for Training: Advanced techniques for higher quality training data selection
  • Apple Silicon Support: Native support for Apple Silicon processors with MLX Training and Serving capabilities

Product Features

  • Natural Language Memory Summarization: Intuitive memory organization in natural language format

Contributing

We welcome contributions to Second Me! Whether you're interested in fixing bugs, adding new features, or improving documentation, please check out our Contributing Guide for detailed development information. You can also support Second Me by sharing your experience with it in your community, at tech conferences, or on social media.

Contributors

We would like to express our gratitude to all the individuals who have contributed to Second Me! If you're interested in contributing to the future of intelligence uploading, whether through code, documentation, or ideas, please feel free to submit a pull request to our repository: Second-Me.

Made with contrib.rocks.

Acknowledgements

This work leverages the power of the open-source community.

For data synthesis, we utilized GraphRAG from Microsoft.

For model deployment, we utilized llama.cpp, which provides efficient inference capabilities.

Our base models primarily come from the Qwen2.5 series.

We also want to extend our sincere gratitude to all users who have experienced Second Me. We recognize that there is significant room for optimization throughout the entire pipeline, and we are fully committed to iterative improvements to ensure everyone can enjoy the best possible experience locally.

License

Second Me is open source software licensed under the Apache License 2.0. See the LICENSE file for more details.

Star History

Star History Chart
