
tinker-chat

[Screenshot]

This chat app supports GPT from OpenAI or your own local LLM.

GPT from OpenAI

To use GPT from OpenAI, set the environment variable OPENAI_API_KEY to your API key.
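
For example, in a POSIX shell (the value below is only a placeholder for your own key):

export OPENAI_API_KEY=your-api-key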

Local LLM

To use llama.cpp as the local inference engine, first launch its server with a quantized model such as Phi-3 Mini, e.g.:

/path/to/llama.cpp/server -m Phi-3-mini-4k-instruct-q4.gguf

Before launching the demo, set the environment variable OPENAI_API_BASE:

export OPENAI_API_BASE=http://127.0.0.1:8080
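
To verify that the server is reachable before starting the demo, you can send a test request to its OpenAI-compatible chat endpoint (assuming your llama.cpp build exposes /v1/chat/completions, as recent versions do; the prompt is arbitrary):

curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'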

Demo

With Node.js >= v18:

npm install
npm start

and open localhost:5000 in a web browser.
