Vix.cpp


🌍 What is Vix?

Vix is a next-generation offline-first, peer-to-peer, ultra-fast runtime for modern C++.

Its goal is clear:

A runtime capable of running apps the way Node, Deno, or Bun do,
but engineered for unstable, low-quality, real-world networks.

Vix is more than a backend framework:
it is a modular runtime, designed for distributed applications, edge systems, offline devices, and environments where traditional cloud frameworks fail.

Inspired by FastAPI, Vue.js, React, and modern runtimes, but rebuilt from scratch in C++20 for raw speed and full control.


⚡ Benchmarks (Updated Dec 2025)

All benchmarks were executed with wrk (8 threads, 800 connections, 30 seconds) on the same machine:
Ubuntu 24.04, Intel Xeon, C++20 optimized build, logging disabled.

Results represent steady-state throughput on a simple "OK" endpoint.
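
The exact handler behind that endpoint is not reproduced here; as a rough sketch built from the routing API shown later in this README, the /bench route used in the wrk commands below could be as small as:

app.get("/bench", [](Request req, Response res) {
    // Tiny plain-text body so the benchmark measures the HTTP pipeline
    // rather than response-building work.
    res.type("text/plain").send("OK");
});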


🚀 Requests per second

Framework               Requests/sec            Avg Latency    Transfer/sec
⭐ Vix.cpp (v1.12.3)     ~98,942 (pinned CPU)    7.3–10.8 ms    ~13.8 MB/s
Vix.cpp (default run)   81,300–81,400           9.7–10.8 ms    ≈11.3 MB/s
Go (Fiber)              81,336                  0.67 ms        10.16 MB/s
Deno                    ~48,868                 16.34 ms       ~6.99 MB/s
Node.js (Fastify)       4,220                   16.00 ms       0.97 MB/s
PHP (Slim)              2,804                   16.87 ms       0.49 MB/s
Crow (C++)              1,149                   41.60 ms       0.35 MB/s
FastAPI (Python)        752                     63.71 ms       0.11 MB/s

🔥 New record: when pinned to a single core (taskset -c 2),
Vix.cpp reaches ~99k req/s, surpassing Go and matching the fastest C++ microframeworks.


πŸ“ Notes

✔ Why Vix.cpp reaches Go-level performance

  • zero-cost abstractions
  • custom ThreadPool tuned for HTTP workloads (sketched below)
  • optimized HTTP pipeline
  • fast-path routing
  • Beast-based IO
  • minimal memory allocations
  • predictable threading model
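
To make the ThreadPool point concrete, here is a minimal, generic sketch of a fixed-size worker pool of the kind an HTTP server uses to run handlers off the accept loop. This is an illustration only, not Vix's actual implementation, which is tuned further for HTTP workloads:

#include <condition_variable>
#include <cstddef>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal fixed-size thread pool (illustrative sketch).
class ThreadPool {
public:
    explicit ThreadPool(std::size_t n) {
        for (std::size_t i = 0; i < n; ++i)
            workers_.emplace_back([this] { worker_loop(); });
    }

    ~ThreadPool() {
        {
            std::lock_guard<std::mutex> lk(m_);
            stop_ = true;
        }
        cv_.notify_all();
        for (auto &t : workers_) t.join();
    }

    // Enqueue one unit of work (e.g. "parse and answer this request").
    void post(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lk(m_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }

private:
    void worker_loop() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return stop_ || !jobs_.empty(); });
                if (stop_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job(); // run the handler on this worker thread
        }
    }

    std::vector<std::thread> workers_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool stop_ = false;
};

Sizing the pool close to the number of physical cores (and pinning, as in the taskset runs above) is what keeps the threading model predictable under load.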

🦕 Deno benchmark (reference)

$ wrk -t8 -c800 -d30s --latency http://127.0.0.1:8000
Requests/sec: 48,868.73

✔ Vix.cpp recommended benchmark mode

When benchmarking from inside the Vix.cpp repository (using the built-in example):

cd ~/vixcpp/vix
export VIX_LOG_LEVEL=critical
export VIX_LOG_ASYNC=false

# Run the optimized example server
vix run example main

Then, in another terminal:

wrk -t8 -c800 -d30s --latency http://127.0.0.1:8080/bench

If you want CPU pinning for more stable results:

taskset -c 2 ./build/main
wrk -t8 -c800 -d30s --latency http://127.0.0.1:8080/bench

🏁 Result: ~98,942 req/s

✔ Fast-path routing gives +1–3%

Use /fastbench to bypass RequestHandler overhead.
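
In practice, "bypassing RequestHandler overhead" means an exact-match route can be answered before the general routing and middleware machinery runs. The snippet below is a generic illustration of that idea, not Vix's internal code:

#include <functional>
#include <string>
#include <string_view>
#include <unordered_map>

// Fast-path dispatch sketch: exact-match routes are looked up in a flat table
// before the full router (parameter parsing, middleware, ...) is invoked.
using FastHandler = std::function<std::string()>;

std::string dispatch(std::string_view target,
                     const std::unordered_map<std::string, FastHandler> &fast_routes,
                     const std::function<std::string(std::string_view)> &full_router)
{
    auto it = fast_routes.find(std::string(target));
    if (it != fast_routes.end())
        return it->second();      // fast path: no parameters, no middleware chain
    return full_router(target);   // general path
}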


🧭 Quick Example

#include <vix.hpp>
using namespace Vix;

int main() {
    App app;

    app.get("/", [](auto&, auto& res) {
        res.json({ "message", "Hello world" });
    });

    app.run(8080);
}

QueryBuilder ORM

QueryBuilder qb;
qb.raw("UPDATE users SET age=? WHERE email=?")
  .param(29)
  .param("[email protected]");
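
The same raw/param calls work for reads: values bind positionally to the ? placeholders, so user data never ends up inside the SQL string itself. How the built query is executed depends on the connection you configure, so only the building step is sketched here:

QueryBuilder qb;
qb.raw("SELECT id, name FROM users WHERE age > ?")
  .param(18); // 18 binds to the first (and only) ? placeholder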

Minimal HTTP + WebSocket Server

This example shows the smallest fully working HTTP + WS hybrid server.

Features

  • Basic GET route
  • Simple WS connection handling
  • Auto-start server

Example (summary)

#include <vix.hpp>
#include <vix/websocket/AttachedRuntime.hpp>

using namespace vix;

int main()
{
    auto bundle = vix::make_http_and_ws("config/config.json");
    auto &[app, ws] = bundle;

    app.get("/", [](const Request &, Response &res)
            { res.json({"framework", "Vix.cpp",
                        "message", "HTTP + WebSocket example (basic) πŸš€"}); });

    ws.on_open([&ws](auto &session)
               {
        (void)session;

        ws.broadcast_json("chat.system", {
            "user", "server",
            "text", "Welcome to Vix WebSocket! πŸ‘‹"
        }); });

    vix::run_http_and_ws(app, ws, 8080);

    return 0;
}

Minimal WebSocket Client

auto client = Client::create("localhost", "9090", "/");

client->on_open([] {
    std::cout << "Connected!" << std::endl;
});

client->send("chat.message", {"text", "Hello world!"});

1. Hello World (JSON)

app.get("/", [](Request req, Response res) {
    return json::o("message", "Hello from Vix");
});

2. Route Parameters

app.get("/users/{id}", [](Request req, Response res) {
    auto id = req.param("id");
    return json::o("user_id", id);
});

3. Query Parameters

app.get("/search", [](Request req, Response res) {
    auto q = req.query_value("q", "none");
    auto page = req.query_value("page", "1");

    return json::o(
        "query", q,
        "page", page
    );
});

4. Automatic Status + Payload (FastAPI style)

app.get("/missing", [](Request req, Response res) {
    return std::pair{
        404,
        json::o("error", "Not found")
    };
});

5. Redirect

app.get("/go", [](Request req, Response res) {
    res.redirect("https://vixcpp.com");
});

6. Automatic Status Message

app.get("/forbidden", [](Request req, Response res) {
    res.status(403).send();
});

7. POST JSON Body

app.post("/echo", [](Request req, Response res) {
    return json::o(
        "received", req.json()
    );
});

8. Typed JSON Parsing

struct UserInput {
    std::string name;
    int age;
};

app.post("/users", [](Request req, Response res) {
    UserInput input = req.json_as<UserInput>();

    return std::pair{
        201,
        json::o(
            "name", input.name,
            "age", input.age
        )
    };
});

9. Headers

app.get("/headers", [](Request req, Response res) {
    res.header("X-App", "Vix")
       .type("text/plain")
       .send("Hello headers");
});

10. Request-Scoped State

app.get("/state", [](Request req, Response res) {
    req.set_state<int>(42);

    return json::o(
        "value", req.state<int>()
    );
});

11. Void Handler

app.get("/manual", [](Request req, Response res) {
    res.status(200)
       .json(json::o("ok", true));
});

12. Params Map Access

app.get("/items/{id}", [](Request req, Response res) {
    const auto& params = req.params();
    return json::o("id", params.at("id"));
});

13. 204 No Content
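
Note: delete is a reserved keyword in C++, so if this exact spelling does not compile with your Vix version, look for the App routing method that maps to the HTTP DELETE verb.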

app.delete("/items/{id}", [](Request req, Response res) {
    res.status(204).send();
});

🧱 Why Vix Exists

Cloud-first frameworks assume:

  • stable networks
  • predictable latency
  • always-online connectivity

But in most of the world, this is not reality.

Vix is built for:

✔ Offline-first

Applications continue functioning even without internet.

✔ Peer-to-Peer

Nodes sync and communicate locally without a central server.

✔ Ultra-Fast Native Execution

C++20 + Asio + zero-overhead abstractions.


🧩 Key Features

  • 🌍 Offline-first runtime
  • 🔗 P2P-ready communication model
  • ⚙️ Async HTTP server
  • 🧭 Expressive routing
  • 💾 ORM for MySQL/SQLite
  • 🧠 Middleware system
  • 📡 WebSocket engine
  • 🧰 Modular design
  • 🚀 Developer experience similar to Node/Deno/Bun
  • ⚡ 80k+ requests/sec performance

🚀 Getting Started

To set up Vix.cpp on your system:

git clone https://github.com/vixcpp/vix.git
cd vix
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build -j
./build/hello_routes

🧰 Example (CLI Project)

Once installed, you can generate a new project using the CLI:

vix new myapp
cd myapp
vix build
vix run

Other common commands:

vix dev file.cpp
vix run file.cpp
vix orm migrate

🎯 Script Mode – Run .cpp Files Directly

Vix can execute a single .cpp file like a script, without creating a full project.

vix run file.cpp
vix dev file.cpp

✔ How it works

  • Generates a temporary CMake project under:
    ./.vix-scripts/<filename>/
  • Compiles the .cpp file as a standalone executable
  • Runs it immediately
  • Stops cleanly on Ctrl+C (no gmake noise)

Example:

~/myapp/test$ vix run server.cpp
Script mode: compiling server.cpp
Using script build directory:
  • .vix-scripts/server

✔ Build succeeded
[I] Server running on port 8080
^C
ℹ Server interrupted by user (SIGINT)

📚 Documentation

📦 Module Documentation Index

📊 Summary

Across these benchmarks, Vix.cpp sits among the fastest modern backend runtimes: it matches or exceeds high-performance frameworks like Go Fiber and outperforms Deno, Node.js, PHP, Python, and even some C++ frameworks such as Crow.

Vix.cpp = the C++ runtime pushing boundaries.


🤝 Contributing

Contributions are welcome!
Please read the contributing guidelines.


🪪 License

Licensed under the MIT License.
