

OLLAMACHAT

Project Overview

OLLAMACHAT is a web application that allows users to chat with various Large Language Models (LLMs). It's built with ASP.NET Core and features a clean, modern architecture. The application supports multiple LLMs and uses a local SQLite database to store chat history.

Features

  • Chat Interface: A simple, intuitive web interface for chatting with LLMs.
  • Multi-Model Support: Works with multiple LLMs, including models from any OpenAI-compatible provider.
  • Chat History: Stores chat history in a local SQLite database.
  • Background Job Processing: Long-running requests are processed in the background, with SignalR used to communicate between the frontend and backend.
  • API Documentation: Includes Swagger documentation for the HTTP API.
  • SSE MCP Support: MCP servers can be configured in appsettings, including a custom Authorization header per server.

Technologies

  • .NET 7
  • ASP.NET Core
  • Entity Framework Core
  • SQLite
  • SignalR
  • Swagger
  • Razor Pages
  • Minimal APIs (slated for replacement)
  • Mediator
  • ModelContextProtocol.Core

Configuration

The application's configuration is stored in appsettings.json. The following settings can be configured:

  • Urls: The URL(s) the application listens on.
  • ConnectionStrings:OLLAMACHAT: The connection string for the SQLite database.
  • OpenAISettings:ApiKey: Your OpenAI API key.
  • OpenAISettings:Models: The list of LLMs to make available in the application.
  • McpServers: A list of SSE MCP servers to connect to; the chat will call their tools when needed.
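
Based on the keys above, a minimal appsettings.json might look like the sketch below. The connection string, model names, server URL, and header value are placeholders, and the exact property names inside each McpServers entry are assumptions that should be checked against the project's configuration classes:

```json
{
  "Urls": "http://localhost:5001",
  "ConnectionStrings": {
    "OLLAMACHAT": "Data Source=ollamachat.db"
  },
  "OpenAISettings": {
    "ApiKey": "sk-<your-key>",
    "Models": [ "gpt-4o-mini", "llama3" ]
  },
  "McpServers": [
    {
      "Url": "https://mcp.example.com/sse",
      "AuthorizationHeader": "Bearer <token>"
    }
  ]
}
```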

How to Run

  1. Clone the repository:
    git clone https://github.com/Prikalel/OLLAMACHAT.git
  2. Navigate to the web project directory:
    cd OLLAMACHAT/Source/OLLAMACHAT.Web
  3. Restore dependencies:
    dotnet restore
  4. Update the appsettings.json file:
    • Set your OpenAI API key in OpenAISettings:ApiKey.
  5. Run the application:
    dotnet run
  6. Open your browser and navigate to the URL specified in appsettings.json (default is http://localhost:5001).

SAST Tools

PVS-Studio - static analyzer for C, C++, C#, and Java code.
