tinyllama

Here are 110 public repositories matching this topic...

onenm_local_llm is a Flutter plugin that simplifies on-device language model inference on Android using llama.cpp. It removes the complexity of setting up native runtimes, model loading, and inference pipelines, so developers can integrate local AI into their apps through a simple API.

  • Updated Mar 19, 2026
  • C++
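The plugin above handles model loading and the inference pipeline, but one detail any TinyLlama integration still faces is prompt formatting: the TinyLlama-1.1B-Chat models use the Zephyr-style chat template. As an illustration only (in Python, not the plugin's Dart API, which is not shown here), a minimal prompt builder might look like:

```python
def format_tinyllama_chat(system: str, user: str) -> str:
    """Build a prompt in the Zephyr-style chat template used by
    TinyLlama-1.1B-Chat: each system/user turn ends with </s>,
    and the prompt ends with an open assistant turn for the
    model to complete."""
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{user}</s>\n"
        f"<|assistant|>\n"
    )

prompt = format_tinyllama_chat(
    "You are a helpful assistant.",
    "Name one use of on-device LLMs.",
)
```

The resulting string is what would be passed to the underlying llama.cpp runtime as the raw prompt.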

Terminal Commander AI is a smart, natural-language terminal assistant that converts English instructions into safe, executable shell commands. It supports ROS operations, multi-terminal launching, command explanations, and history — powered by a local TinyLlama LLM.

  • Updated Jun 14, 2025
  • Python
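Turning LLM output into "safe, executable shell commands" implies some validation step before anything runs. The repository's actual rules are not shown here; as a hedged sketch, one common approach is to gate the generated command on a first-token blocklist (the function and command names below are hypothetical):

```python
import shlex

# Hypothetical blocklist of destructive commands; a real assistant
# would use a more nuanced policy (argument inspection, dry runs,
# user confirmation).
BLOCKED_COMMANDS = {"rm", "mkfs", "dd", "shutdown", "reboot"}

def is_safe_command(command: str) -> bool:
    """Reject empty/unparseable commands and those whose first
    token is on the blocklist."""
    try:
        tokens = shlex.split(command)
    except ValueError:  # e.g. unbalanced quotes from the model
        return False
    if not tokens:
        return False
    return tokens[0] not in BLOCKED_COMMANDS
```

With a gate like this, the assistant can refuse or ask for confirmation instead of executing a dangerous suggestion verbatim.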
