kbot (C++)
CMake-based C++ toolchain for kbot: HTML/HTTP/JSON utilities, length-prefixed JSON IPC, optional UDS/TCP worker for Node orchestrators, and LLM chat via liboai (OpenRouter, OpenAI, Ollama-compatible servers, etc.). The main binary is kbot (kbot.exe on Windows).
Prerequisites
| Requirement | Notes |
|---|---|
| CMake | ≥ 3.20 |
| C++ compiler | C++17 (MSVC, GCC, Clang) |
| Git | For FetchContent dependencies |
| Node.js | Optional; for orchestrator/ IPC integration tests (npm run test:ipc) |
On Windows, use a Developer Command Prompt or PowerShell with MSVC in PATH. Git Bash helps if you use shell scripts under scripts/.
Quick start (build)
From this directory (packages/kbot/cpp):
npm install # optional; only needed if you use npm scripts
npm run build
Artifacts go to dist/ (e.g. dist/kbot.exe, test tools).
Equivalent CMake:
cmake --preset dev
cmake --build --preset dev
Presets
| Preset | Role |
|---|---|
| dev | Debug, static ipc + kbot libraries (default) |
| release | Release build |
| dev-dll | Debug with ipc.dll and kbot.dll (IPC_BUILD_SHARED=ON, POLYMECH_KBOT_SHARED=ON) |
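For orientation, a minimal sketch of what the dev entry in CMakePresets.json might look like. The field values are assumed from the table above and the build/dev path used elsewhere in this README; the actual CMakePresets.json in the repo is authoritative:

```json
{
  "version": 6,
  "configurePresets": [
    {
      "name": "dev",
      "binaryDir": "${sourceDir}/build/dev",
      "cacheVariables": { "CMAKE_BUILD_TYPE": "Debug" }
    }
  ],
  "buildPresets": [
    { "name": "dev", "configurePreset": "dev" }
  ]
}
```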
cmake --preset dev-dll
cmake --build --preset dev-dll --config Debug
Place ipc.dll and kbot.dll next to kbot.exe (or on PATH) when using the DLL configuration.
npm scripts (reference)
| Script | Purpose |
|---|---|
| npm run build | Configure dev + build |
| npm run build:release | Release preset |
| npm run test | ctest in build/dev |
| npm run clean | Remove build/ and dist/ |
| npm run test:ipc | Node UDS IPC integration test |
| npm run worker | Run worker (stdio IPC) |
Installation
Install the CLI and headers into a prefix (e.g. local tree or system root):
cmake --install build/dev --prefix "C:/path/to/install"
This installs:
- bin/: kbot (runtime)
- include/polymech/: kbot.h, llm_client.h, polymech_export.h, cmd_kbot.h
- include/ipc/: ipc.h, ipc_export.h
- lib/: import libraries / archives (depending on static vs shared)
Library layout is defined in packages/kbot/CMakeLists.txt and packages/ipc/CMakeLists.txt.
CMake options (libraries)
| Cache variable | Effect |
|---|---|
| IPC_BUILD_SHARED | Build ipc as a shared library (OFF by default) |
| POLYMECH_KBOT_SHARED | Build kbot as a shared library (OFF by default) |
Static builds define IPC_STATIC_BUILD / POLYMECH_STATIC_BUILD for consumers via INTERFACE compile definitions. Shared builds export IPC_API / POLYMECH_API (see ipc_export.h, polymech_export.h).
CLI overview
Top-level:
kbot --help
kbot -v,--version
kbot --log-level debug|info|warn|error
Subcommands
| Command | Description |
|---|---|
| parse <html> | Parse HTML and list elements |
| select <html> <selector> | CSS-select elements |
| config <file> | Load and print a TOML file |
| fetch <url> | HTTP GET |
| json <input> | Prettify JSON |
| db [-c config] [table] [-l limit] | Supabase / DB helper (uses config/postgres.toml by default) |
| worker [--uds <arg>] | IPC worker (see below) |
| kbot ai ... / kbot run ... | AI and run pipelines (setup_cmd_kbot; use kbot kbot ai --help) |
Worker mode (kbot worker)
Used by orchestrators and tests.
- Stdio IPC (length-prefixed JSON frames on stdin/stdout): kbot worker
- UDS / TCP (Windows: TCP port string, e.g. 4001; Unix: socket path): kbot worker --uds 4001
Framing: [uint32 LE length][UTF-8 JSON object with id, type, payload]. Message types include ping, job, kbot-ai, kbot-run, shutdown, etc. See src/main.cpp and orchestrator/test-ipc.mjs.
kbot kbot (nested)
CLI for AI tasks and run configurations:
kbot kbot ai --help
kbot kbot run --help
Example:
kbot kbot ai --prompt "Hello" --config config/postgres.toml
API keys are typically resolved from config/postgres.toml ([services]).
Using in other CMake projects
There is no single find_package(kbot) config yet. Practical options:
1. Same repository / superbuild (recommended)
Add this repo’s cpp tree as a subdirectory from a parent CMakeLists.txt so FetchContent and internal targets (logger, json, ipc, oai, kbot, …) resolve once. Then:
target_link_libraries(your_app PRIVATE ipc kbot)
kbot pulls in logger, json, liboai (oai) per packages/kbot/CMakeLists.txt.
2. Install prefix + explicit IMPORTED libraries
After cmake --install, link the import libraries under lib/ and add include/ to your include paths for ipc and polymech. You must still satisfy transitive dependencies (oai, logger, json, …) from the same build/install of this project, or duplicate their build; option 1 is usually easier.
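A hedged sketch of option 2. The target names, library file names, and prefix here are assumptions based on the install layout described above (kbot.lib / libkbot.a etc. vary by platform and static-vs-shared configuration), and transitive dependencies must be added the same way:

```cmake
# Hypothetical consumer CMakeLists.txt fragment against an install prefix.
set(KBOT_PREFIX "C:/path/to/install")

add_library(kbot_imported STATIC IMPORTED)
set_target_properties(kbot_imported PROPERTIES
  IMPORTED_LOCATION "${KBOT_PREFIX}/lib/kbot.lib"          # libkbot.a on Unix
  INTERFACE_INCLUDE_DIRECTORIES "${KBOT_PREFIX}/include")

add_library(ipc_imported STATIC IMPORTED)
set_target_properties(ipc_imported PROPERTIES
  IMPORTED_LOCATION "${KBOT_PREFIX}/lib/ipc.lib"           # libipc.a on Unix
  INTERFACE_INCLUDE_DIRECTORIES "${KBOT_PREFIX}/include")

target_link_libraries(your_app PRIVATE kbot_imported ipc_imported)
```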
3. Minimal example: IPC framing only
If you only need ipc::encode / ipc::decode (and can build logger + json the same way this project does), mirror packages/ipc/CMakeLists.txt:
cmake_minimum_required(VERSION 3.20)
project(myapp CXX)
set(CMAKE_CXX_STANDARD 17)
add_subdirectory(path/to/polymech-mono/packages/kbot/cpp/packages/logger)
add_subdirectory(path/to/polymech-mono/packages/kbot/cpp/packages/json)
add_subdirectory(path/to/polymech-mono/packages/kbot/cpp/packages/ipc)
add_executable(myapp main.cpp)
target_link_libraries(myapp PRIVATE ipc)
main.cpp (stdio-style framing helpers):
#include <iostream>
#include <ipc/ipc.h>
int main() {
ipc::Message msg{"1", "ping", "{}"};
auto frame = ipc::encode(msg);
// frame: 4-byte LE length + JSON object bytes
ipc::Message roundtrip;
if (frame.size() > 4 &&
ipc::decode(frame.data() + 4, frame.size() - 4, roundtrip)) {
std::cout << roundtrip.type << "\n"; // ping
}
return 0;
}
4. Example: LLM pipeline API (kbot library)
Headers: kbot.h, llm_client.h, polymech_export.h. You need a valid API key and options (see KBotOptions in kbot.h).
#include <iostream>
#include "kbot.h"
#include "llm_client.h"
int main() {
polymech::kbot::KBotOptions opts;
opts.prompt = "Say hello in one sentence.";
opts.api_key = "YOUR_KEY";
opts.router = "openrouter";
opts.model = "openai/gpt-4o-mini";
polymech::kbot::LLMClient client(opts);
polymech::kbot::LLMResponse r = client.execute_chat(opts.prompt);
if (r.success) {
std::cout << r.text << "\n";
} else {
std::cerr << r.error << "\n";
return 1;
}
return 0;
}
Or use the callback-based pipeline:
polymech::kbot::KBotCallbacks cb;
cb.onEvent = [](const std::string& type, const std::string& json) {
std::cout << type << ": " << json << "\n";
};
return polymech::kbot::run_kbot_ai_pipeline(opts, cb);
Link kbot (and its public dependencies). cmd_kbot.h entry points (run_kbot_ai_ipc, run_cmd_kbot_uds, …) are implemented in src/cmd_kbot*.cpp in this project; to reuse them, compile those sources into your binary or vendor the logic.
Node / IPC tests
Integration tests live under orchestrator/ (see comments in orchestrator/test-ipc.mjs). Typical run from cpp/:
npm run test:ipc
Classifier batch (semantic distances vs JobViewer labels):
npm run test:ipc:classifier
npm run test:ipc:classifier:openrouter
Stress mode repeats the same batched kbot-ai call N times on one worker and prints per-run wall time and token usage (when present), followed by min / max / avg / p50 / p95 and total (Σ) tokens. N defaults to 5 for the OpenRouter stress script:
npm run test:ipc:classifier:openrouter:stress
npm run test:ipc:classifier -- -r openrouter -m openai/gpt-4o-mini --backend remote -n 3
KBOT_CLASSIFIER_STRESS_RUNS=10 npm run test:ipc:classifier:openrouter:stress
Requires a built dist/kbot.exe (or kbot on Unix). Set API keys via config/postgres.toml for OpenRouter.
License
See LICENSE in this directory.