Starred repositories
AirPods liberated from Apple's ecosystem.
String decryption plugin powered by the Unicorn engine
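The usual shape of such a plugin: map the binary's decrypt routine into emulated memory, point its arguments at the encrypted bytes, run it, and read the plaintext back out. A minimal runnable sketch with the Unicorn Python bindings, using a hand-written x86 XOR loop as a stand-in for a routine lifted from a real target:

```python
from unicorn import Uc, UC_ARCH_X86, UC_MODE_32
from unicorn.x86_const import UC_X86_REG_ECX, UC_X86_REG_EDX

CODE_ADDR, DATA_ADDR = 0x1000, 0x2000  # arbitrary page-aligned addresses
# Stand-in decrypt routine: xor byte [ecx], 0x5A; inc ecx; dec edx; jnz loop
CODE = b"\x80\x31\x5a\x41\x4a\x75\xf9"
encrypted = bytes(b ^ 0x5A for b in b"hello, unicorn")

mu = Uc(UC_ARCH_X86, UC_MODE_32)
mu.mem_map(CODE_ADDR, 0x1000)
mu.mem_map(DATA_ADDR, 0x1000)
mu.mem_write(CODE_ADDR, CODE)
mu.mem_write(DATA_ADDR, encrypted)
mu.reg_write(UC_X86_REG_ECX, DATA_ADDR)       # arg 0: buffer pointer
mu.reg_write(UC_X86_REG_EDX, len(encrypted))  # arg 1: buffer length
mu.emu_start(CODE_ADDR, CODE_ADDR + len(CODE))
print(bytes(mu.mem_read(DATA_ADDR, len(encrypted))))  # b'hello, unicorn'
```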
Source code and binary obfuscation across multiple languages and platforms, including 250+ tools and 600+ posts
An implementation of Shazam's song recognition algorithm.
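The heart of that algorithm (Wang, 2003) fits in a few lines: pick out spectrogram peaks to form a "constellation map", then hash pairs of nearby peaks as (freq1, freq2, time-delta). A rough sketch; the window size, peak neighborhood, and fan-out below are arbitrary choices, not the repo's values:

```python
import numpy as np
from scipy.ndimage import maximum_filter
from scipy.signal import spectrogram

def fingerprint(samples, rate=44100, fan_out=5):
    _, _, S = spectrogram(samples, fs=rate, nperseg=4096)
    S = np.log1p(S)
    # Constellation map: points that dominate their local time-frequency patch.
    peaks = np.argwhere((S == maximum_filter(S, size=20)) & (S > S.mean()))
    peaks = peaks[np.argsort(peaks[:, 1])]  # sort by time bin
    hashes = []
    for i, (f1, t1) in enumerate(peaks):
        for f2, t2 in peaks[i + 1 : i + 1 + fan_out]:  # pair with the next few peaks
            hashes.append(((int(f1), int(f2), int(t2 - t1)), int(t1)))  # (hash, anchor time)
    return hashes
```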
A tool that can share DeviceOwner permissions with other applications.
Collection of leaked system prompts
FULL Augment Code, Claude Code, Cluely, CodeBuddy, Comet, Cursor, Devin AI, Junie, Kiro, Leap.new, Lovable, Manus, NotionAI, Orchids.app, Perplexity, Poke, Qoder, Replit, Same.dev, Trae, Traycer AI…
prime is a framework for efficient, globally distributed training of AI models over the internet.
Unofficial implementation of Titans, SOTA memory for transformers, in PyTorch
Unofficial implementation of "Scalable-Softmax Is Superior for Attention"
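The paper's trick is small enough to state inline: SSMax swaps softmax's base e for the context length n, so exp(z_i) becomes n^(s·z_i), which reduces to an ordinary softmax over logits scaled by s·log n. A minimal PyTorch sketch; s is a learnable per-head scalar in the paper, and the default here is a placeholder:

```python
import math
import torch

def ssmax(scores: torch.Tensor, s: float = 1.0) -> torch.Tensor:
    # Scaling grows with context length n, so attention stays peaked
    # instead of flattening out as more positions compete.
    n = scores.size(-1)  # number of attended positions
    return torch.softmax(s * math.log(n) * scores, dim=-1)
```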
BlinkDL / minGPT-tuned
Forked from karpathy/minGPT
A *tuned* minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
🚀 Efficiently (pre)train foundation models with native PyTorch features, including FSDP for training and the SDPA implementation of FlashAttention-2.
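Both building blocks are stock PyTorch, so the combination can be sketched in a few lines; the attention module below is a generic placeholder, not the repo's model code:

```python
import torch
import torch.nn.functional as F
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

class CausalSelfAttention(torch.nn.Module):
    def __init__(self, dim: int = 512, n_heads: int = 8):
        super().__init__()
        self.n_heads = n_heads
        self.qkv = torch.nn.Linear(dim, 3 * dim)
        self.proj = torch.nn.Linear(dim, dim)

    def forward(self, x):
        B, T, C = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k, v = (t.view(B, T, self.n_heads, -1).transpose(1, 2) for t in (q, k, v))
        # SDPA dispatches to a FlashAttention-2 kernel when inputs are eligible.
        y = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        return self.proj(y.transpose(1, 2).reshape(B, T, C))

# After torch.distributed is initialized, FSDP shards params/grads/optimizer state:
# model = FSDP(CausalSelfAttention().cuda())
```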
Research projects built on top of Transformers
verl: Volcano Engine Reinforcement Learning for LLMs
Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax
🚀 Efficient implementations of state-of-the-art linear attention models
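For reference, the math these kernels accelerate: replace softmax(QKᵀ)V, which is quadratic in sequence length, with φ(Q)(φ(K)ᵀV), which is linear for a fixed feature map φ. A non-causal PyTorch sketch using the elu(x)+1 map of Katharopoulos et al. (2020); the repo's fused, chunked kernels handle the causal variant efficiently:

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    # q, k, v: (batch, time, heads, dim); non-causal form for clarity.
    q, k = F.elu(q) + 1, F.elu(k) + 1                 # positive feature map phi
    kv = torch.einsum("bthd,bthe->bhde", k, v)        # sum_t phi(k_t) v_t^T
    z = 1.0 / (torch.einsum("bthd,bhd->bth", q, k.sum(dim=1)) + eps)
    return torch.einsum("bthd,bhde,bth->bthe", q, kv, z)
```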
An all-in-one hooking framework, symbol rebinder, and memory manager for jailed iOS.
calflops is designed to calculate FLOPs, MACs, and parameters of various neural networks, such as Linear, CNN, RNN, GCN, and Transformer models (BERT, LLaMA, and other large language models)
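Typical usage, following the pattern in its README; the model and input shape here are placeholders:

```python
import torchvision.models as models
from calflops import calculate_flops

model = models.resnet18()
flops, macs, params = calculate_flops(
    model=model,
    input_shape=(1, 3, 224, 224),  # batch of one 224x224 RGB image
    output_as_string=True,
)
print(flops, macs, params)  # e.g. roughly "3.6 GFLOPS", "1.8 GMACs", "11.7 M"
```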
Minimalistic large language model 3D-parallelism training
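"3D" here means composing data, tensor, and pipeline parallelism, with every GPU rank assigned one coordinate along each axis. A toy sketch of the bookkeeping; the axis ordering below is one common convention, not necessarily this repo's:

```python
def rank_to_coords(rank: int, tp: int, pp: int, dp: int) -> dict:
    # world_size must equal tp * pp * dp; tp varies fastest, dp slowest.
    return {
        "tp": rank % tp,            # tensor-parallel shard
        "pp": (rank // tp) % pp,    # pipeline stage
        "dp": rank // (tp * pp),    # data-parallel replica
    }

print(rank_to_coords(13, tp=2, pp=4, dp=2))  # {'tp': 1, 'pp': 2, 'dp': 1}
```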
The simplest, fastest repository for training/finetuning medium-sized GPTs.
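The loop at the core of a repo like that is worth seeing stripped down. A self-contained sketch of the batch-sampling and optimization skeleton, with random tokens and a trivial stand-in model where the GPT would go:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
data = torch.randint(0, 65, (10_000,))  # stand-in token stream, 65-symbol vocab
block_size, batch_size = 64, 8

def get_batch():
    ix = torch.randint(len(data) - block_size, (batch_size,))
    x = torch.stack([data[i : i + block_size] for i in ix])          # inputs
    y = torch.stack([data[i + 1 : i + 1 + block_size] for i in ix])  # next-token targets
    return x, y

model = torch.nn.Sequential(torch.nn.Embedding(65, 128), torch.nn.Linear(128, 65))
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
for step in range(100):
    x, y = get_batch()
    loss = F.cross_entropy(model(x).view(-1, 65), y.view(-1))
    opt.zero_grad(set_to_none=True)
    loss.backward()
    opt.step()
```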