Peca

This repository contains Peca, a simple Django chatbot project built around a configurable agent pipeline.

Setup

  1. Create and activate a Python virtual environment:
python3 -m venv venv
source venv/bin/activate
  2. Install required dependencies:
pip install --upgrade pip
pip install -r requirements.txt

Once the dependencies are installed, you can run the project's tests to ensure everything is set up correctly:

cd peca_chatbot
python manage.py test

If you plan to use the built-in OpenAI client, set the OPENAI_API_KEY environment variable before running the server so API requests can be authenticated. You can place this variable (and any other configuration values) in a .env file located either at the repository root or inside peca_chatbot/. For example:

OPENAI_API_KEY=sk-your-key

An example.env file is provided for reference. Copy it to .env and replace the placeholder values with your own keys before running the server. The included startup scripts and Django entry points automatically load this file if it exists in either location.
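A common way to implement this kind of automatic loading is the python-dotenv package. The snippet below is only a hypothetical sketch under that assumption; the package choice and paths are not taken from the project's actual entry points and may differ:

# Hypothetical illustration of loading a .env file from either location.
# Assumes the python-dotenv package; the real entry points may differ.
import os
from pathlib import Path
from dotenv import load_dotenv

PROJECT_DIR = Path(__file__).resolve().parent   # e.g. peca_chatbot/
REPO_ROOT = PROJECT_DIR.parent                  # repository root

for candidate in (REPO_ROOT / ".env", PROJECT_DIR / ".env"):
    if candidate.exists():
        load_dotenv(candidate)                  # does not override variables already set (override=False by default)

api_key = os.environ.get("OPENAI_API_KEY")      # now visible to the OpenAI client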

  3. Apply migrations and start the development server from the peca_chatbot directory:
cd peca_chatbot
python manage.py migrate
python manage.py runserver
  4. (Optional) Create an administrative user with the is_admin flag already set (an illustrative sketch of this command follows):
python manage.py createadmin
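For orientation, a Django management command of this kind typically looks like the sketch below. It is illustrative only and assumes the custom user model exposes username and is_admin fields and accepts the --username/--password options shown; the real command in core/management/commands/ may differ:

# Illustrative sketch of a createadmin management command (not the
# project's actual implementation). Assumes the custom user model
# exposes an is_admin boolean field.
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Create an administrative user with the is_admin flag set."

    def add_arguments(self, parser):
        parser.add_argument("--username", default="admin")      # assumed option
        parser.add_argument("--password", default="changeme")   # assumed option

    def handle(self, *args, **options):
        User = get_user_model()
        user, _ = User.objects.get_or_create(username=options["username"])
        user.set_password(options["password"])
        user.is_admin = True                                     # assumed custom flag
        user.save()
        self.stdout.write(self.style.SUCCESS(f"Admin user '{user.username}' is ready."))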

Optional setup script

The peca_chatbot/peca_chatbot/setup.sh script performs all of the above steps automatically and creates a superuser. It also sets the DJANGO_SETTINGS_MODULE=peca_chatbot.settings environment variable. If you prefer, run it from the repository root:

bash peca_chatbot/peca_chatbot/setup.sh

Codebase overview

This repository follows a typical Django layout. The peca_chatbot directory contains the main project, while the core app houses the bulk of the code:

  • core/models.py – defines users, personalities, agents and chat records (a rough sketch of these models follows this list).
  • core/views.py – user and admin views for chatting and managing data.
  • core/utils.py – the agent pipeline and helper classes for LLM clients.
  • core/templates/ – HTML templates used by the views.
  • core/management/commands/ – custom management command createadmin.
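The authoritative schema lives in core/models.py; as orientation only, the sketch below shows the kind of models the descriptions above imply. Field names and relations are assumptions made for illustration, not the project's actual definitions:

# Hypothetical sketch of the models described above; field names and
# relations are illustrative, not the project's actual schema.
from django.db import models

class Personality(models.Model):
    name = models.CharField(max_length=100)
    prompt = models.TextField()                     # initial prompt for new chats
    is_active = models.BooleanField(default=True)   # only active personalities are picked

class Agent(models.Model):
    name = models.CharField(max_length=100)
    order = models.PositiveIntegerField(default=0)  # position in the pipeline
    is_active = models.BooleanField(default=True)

class Chat(models.Model):
    personality = models.ForeignKey(Personality, on_delete=models.CASCADE)
    created_at = models.DateTimeField(auto_now_add=True)
    summary = models.TextField(blank=True)          # final pipeline output

class AgentStep(models.Model):
    chat = models.ForeignKey(Chat, on_delete=models.CASCADE)
    agent = models.ForeignKey(Agent, on_delete=models.CASCADE)
    output = models.TextField()                     # recorded output of this agent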

Admin usage

The root URL / shows a simple landing page with a button that links to /start/. Upon successful login you will be redirected to /start/ unless a specific next parameter is supplied.

After creating an admin account you can log in at /login/ and access /admin-panel/. The admin interface allows you to:

  1. Create and reorder the agents that process each chat message.
  2. Define personalities that provide an initial prompt for new chats.
  3. Browse past chats in the database and filter them by personality and date.
  4. Use the debug chat page to inspect each agent step in detail (admin-only).
  5. Export a CSV file containing each chat message for external analysis.

Flow of operations

  1. A user initiates a chat at /start/; a Chat instance is created with a randomly selected active personality.
  2. Messages posted to /chat/<chat_id>/ are stored, and the run_pipeline function runs all active agents sequentially (a sketch of this pipeline follows the list).
  3. Each agent receives the previous output as input (the first agent also gets the personality prompt). The output of every agent is recorded as an AgentStep.
  4. The final output is returned to the user and saved as the chat summary.
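run_pipeline is implemented in core/utils.py; the sketch below only illustrates the sequential hand-off described above. The function signature and the llm_client call are assumptions made for the example, not the actual code:

# Illustrative sketch of a sequential agent pipeline (the real
# run_pipeline in core/utils.py may be structured differently).
from core.models import AgentStep

def run_pipeline(chat, user_message, agents, llm_client):
    """Run each active agent in order, feeding each the previous output."""
    # The first agent also sees the personality prompt.
    current_input = f"{chat.personality.prompt}\n\n{user_message}"
    for agent in agents:                                    # assumed ordered, active agents
        output = llm_client.complete(agent, current_input)  # hypothetical LLM client call
        AgentStep.objects.create(chat=chat, agent=agent, output=output)
        current_input = output                              # next agent consumes this output
    chat.summary = current_input                            # final output becomes the chat summary
    chat.save()
    return current_input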

For detailed developer notes see docs/DEVELOPER_GUIDE.md. For a walkthrough of the web interface see docs/USER_MANUAL.md.
