maintenance love :)

This commit is contained in:
lovebird 2025-04-17 16:42:03 +02:00
parent 0ec7f89cbc
commit 237e045262
15 changed files with 24661 additions and 5888 deletions


@ -109,141 +109,156 @@ When creating content
- always add links
- when sending emails, always add 'Best regards, [Your Name]'
```
# Main Commands
The primary way to interact with `kbot` for processing tasks is by invoking it with a prompt and various options. While often used implicitly, this typically corresponds to the `run` command.
## Running Tasks
```bash
kbot run [options...] "Your prompt here..."
# or simply (if 'run' is the default):
kbot [options...] "Your prompt here..."
```
This command executes the main AI processing pipeline based on the provided prompt and options. Key aspects controlled by options include:
* **Input:** Specified via `--include` (files, directories, web URLs), `--path`.
* **Task:** Defined by the `--prompt`.
* **Behavior:** Controlled by `--mode` (e.g., `tools`, `completion`).
* **Output:** Directed using `--dst` or `--output`.
* **Model & API:** Configured with `--model`, `--router`, `--api_key`, etc.
Refer to [Parameters](./parameters.md) and [Modes](./modes.md) for detailed options.
## Utility Commands
Other potential utility commands might include:
* `kbot fetch`: Fetch updated information, such as the latest available models.
* `kbot init`: Initialize a directory or project for use with `kbot` (e.g., create default config files).
* `kbot help-md`: Generate extended help documentation in Markdown format.
* `kbot examples`: Show example usage patterns.
*(Note: Availability and exact behavior of utility commands may vary.)*
# Command Line Parameters
This document describes the command line parameters available for `kbot`.
**Note:** Many parameters support environment variable substitution (e.g., `${VAR_NAME}`).
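The exact expansion rules are implementation-specific, but conceptually a `${VAR_NAME}` placeholder resolves from the environment, analogous to shell parameter expansion (illustrative sketch; `MODEL_OVERRIDE` is a placeholder variable name, not a kbot convention):

```shell
# Illustrative only: a ${VAR_NAME} placeholder in a parameter value
# resolves from the environment, like shell expansion.
export MODEL_OVERRIDE="openai/gpt-4o"
echo "Using model: ${MODEL_OVERRIDE}"
```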
## Core Parameters
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `prompt` | The main instruction or question for the AI. Can be a string, a file path (e.g., `file:./my_prompt.md`), or an environment variable. | - | Yes (or implied by context) |
| `model` | AI model ID to use for processing (e.g., `openai/gpt-4o`). See available models via helper functions or router documentation. | Depends on router/config | No |
| `router` | The API provider to use (`openai` or `openrouter`). | `openrouter` | No |
| `mode` | The operational mode. See [Modes](./modes.md) for details. | `tools` | No |
## Input & File Selection
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `path` | Target directory for local file operations or context. | `.` | No |
| `include` | Specify input files or content. Accepts comma-separated glob patterns (e.g., `src/**/*.ts`), file paths, directory paths, or **web URLs** (e.g., `https://example.com/page`). | `[]` | No |
| `query` | JSONPath query to extract specific data from input objects (often used with structured input files). | `null` | No |
## Output & Formatting
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `output` | Output path for modified files (primarily for `tools` mode operations like refactoring). | - | No |
| `dst` | Destination path/filename for the main result (primarily for `completion` or `assistant` mode). Supports `${MODEL_NAME}` and `${ROUTER}` substitutions. | - | No |
| `format` | Defines the desired structure for the AI's output. Can be a Zod schema object, a Zod schema string, a JSON schema string, or a path to a JSON schema file (e.g., `file:./schema.json`). Ensures the output conforms to the specified structure. | - | No |
| `filters` | Post-processing filters applied to the output (primarily `completion` mode with `--dst`). Can be a comma-separated string of filter names (e.g., `unwrapMarkdown,trim`). | `''` | No |
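For instance, a minimal JSON Schema file referenced via `--format=file:./schema.json` could look like this (illustrative; the field names are placeholders, not a kbot requirement):

```json
{
  "type": "object",
  "properties": {
    "summary": { "type": "string" },
    "tags": { "type": "array", "items": { "type": "string" } }
  },
  "required": ["summary"]
}
```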
## Tool Usage
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `tools` | Comma-separated list of tool names or paths to custom tool files to enable. | (List of default tools) | No |
| `disable` | Comma-separated list of tool *categories* to disable (e.g., `filesystem,git`). | `[]` | No |
| `disableTools` | Comma-separated list of specific tool *names* to disable. | `[]` | No |
## Iteration & Advanced Control
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `each` | Iterate the task over multiple items. Accepts a GLOB pattern, path to a JSON file (array), or comma-separated strings. The current item is available as the `${ITEM}` variable in other parameters (e.g., `--dst="${ITEM}-output.md"`). Can be used to test different models (e.g., `--each="openai/gpt-3.5-turbo,openai/gpt-4o"`). | - | No |
| `variables` | Define custom key-value variables for use in prompts or other parameters (e.g., `--variables.PROJECT_NAME=MyProject`). Access via `${variableName}`. | `{}` | No |
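When `--each` is given a comma-separated model list, each listed model becomes `${ITEM}` for one run. The fan-out behaves like this plain shell loop (illustrative only, not kbot's actual implementation; output naming is a sketch):

```shell
# Each listed model id becomes ${ITEM} for one task run;
# ${ITEM} can then appear in --dst, --model, or the prompt.
for ITEM in "openai/gpt-3.5-turbo" "openai/gpt-4o"; do
  echo "run: model=${ITEM} dst=${ITEM}-output.md"
done
```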
## Configuration & Authentication
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `api_key` | Explicit API key for the selected router. Overrides keys from config files. | - | No |
| `baseURL` | Custom base URL for the API endpoint (e.g., for local LLMs via Ollama). Set automatically for known routers or can be specified directly. | - | No |
| `config` | Path to a JSON configuration file containing API keys and potentially other settings. | - | No |
| `profile` | Path to a profile file (JSON or .env format) for loading environment-specific variables. | - | No |
| `env` | Specifies the environment section to use within the profile file. | `default` | No |
| `preferences` | Path to a preferences file (e.g., containing user details like location, email). Used to provide context to the AI. | (System-specific default, often `~/.kbot/Preferences`) | No |
## Debugging & Logging
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `logLevel` | Logging verbosity level (e.g., 0=error, 4=debug). | `4` | No |
| `logs` | Directory to store log files and temporary outputs (like `params.json`). | `./logs` | No |
| `dry` | Perform a dry run: log parameters and configurations without executing the AI request. | `false` | No |
| `dump` | Path to generate a script file representing the current command invocation. | - | No |
# Advanced Topics
This section covers more advanced usage patterns and concepts.
## Processing Multiple Items (`--each`)
Instead of relying on external scripting for batch processing, `kbot` provides the built-in `--each` parameter. This allows you to iterate a task over multiple inputs efficiently.
**How it Works:**
The `--each` parameter accepts:
* A comma-separated list of strings (e.g., `--each="file1.txt,file2.txt"`).
* A file path to a JSON file containing an array of strings.
* A GLOB pattern matching multiple files (e.g., `--each="./src/**/*.ts"`).
* A list of model IDs to test a prompt against different models (e.g., `--each="openai/gpt-4o,anthropic/claude-3.5-sonnet"`).
**Using the `${ITEM}` Variable:**
Within the loop initiated by `--each`, the current item being processed is available as the `${ITEM}` variable. You can use this variable in other parameters, such as `--dst`, `--include`, or within the `--prompt` itself.
**Example: Generating Documentation for Multiple Files**
```bash
kbot --each "./src/modules/*.ts" \
  --dst "./docs/api/${ITEM}.md" \
  --prompt "Generate API documentation in Markdown format for the module defined in ${ITEM}"
```
This command will:
1. Find all `.ts` files in `./src/modules/`.
2. For each file (e.g., `moduleA.ts`):
   * Set `${ITEM}` to the file path (`./src/modules/moduleA.ts`).
   * Execute `kbot` with the prompt, including the specific file via `${ITEM}`.
   * Save the output to `./docs/api/./src/modules/moduleA.ts.md` (note: path handling may vary).
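The iteration above can be sketched as a plain shell loop (illustrative; `kbot` performs this internally, and actual output paths may differ):

```shell
# Stand-in files so the loop below is self-contained.
mkdir -p src/modules docs/api
touch src/modules/moduleA.ts src/modules/moduleB.ts

for ITEM in src/modules/*.ts; do
  name=$(basename "$ITEM" .ts)
  # kbot would run the documentation prompt against $ITEM here.
  echo "document $ITEM -> docs/api/$name.md"
done
```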
Refer to the [Examples](./examples.md#iterating-with---each) for more use cases.
## Choosing a Transformation Method: `transform` vs. `createIterator`
When transforming data structures (often JSON) using LLMs, you have two primary approaches:
1. **`transform` Helper Function:**
   * **Pros:** Simple, minimal setup, good for basic field transformations.
   * **Cons:** Less control over network, caching, logging details.
   * **Use Case:** Quickly applying straightforward transformations to data fields without needing deep customization.
2. **`createIterator` Factory:**
   * **Pros:** Full control over network options (retries, concurrency), caching (namespace, expiration), logging, custom transformer logic, and callbacks (`onTransform`, `onTransformed`).
   * **Cons:** More verbose setup required.
   * **Use Case:** Complex transformations requiring fine-tuned control over the entire process, advanced caching strategies, or integration with custom logging/transformation logic.
Consult the [Iterator Documentation](./iterator.md) for detailed explanations and code examples of both methods.


@ -1,12 +1,6 @@
{
"timestamp": 1744900912154,
"models": [
{
"id": "gpt-4o-realtime-preview-2024-12-17",
"object": "model",
"created": 1733945430,
"owned_by": "system"
},
{
"id": "gpt-4o-audio-preview-2024-12-17",
"object": "model",
@ -19,36 +13,54 @@
"created": 1698785189,
"owned_by": "system"
},
{
"id": "text-embedding-3-large",
"object": "model",
"created": 1705953180,
"owned_by": "system"
},
{
"id": "dall-e-2",
"object": "model",
"created": 1698798177,
"owned_by": "system"
},
{
"id": "o4-mini-2025-04-16",
"object": "model",
"created": 1744133506,
"owned_by": "system"
},
{
"id": "gpt-4o-audio-preview-2024-10-01",
"object": "model",
"created": 1727389042,
"owned_by": "system"
},
{
"id": "o4-mini",
"object": "model",
"created": 1744225351,
"owned_by": "system"
},
{
"id": "gpt-4.1-nano",
"object": "model",
"created": 1744321707,
"owned_by": "system"
},
{
"id": "gpt-4.1-nano-2025-04-14",
"object": "model",
"created": 1744321025,
"owned_by": "system"
},
{
"id": "gpt-4o-realtime-preview-2024-10-01",
"object": "model",
"created": 1727131766,
"owned_by": "system"
},
{
"id": "gpt-4o-transcribe",
"object": "model",
"created": 1742068463,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-transcribe",
"object": "model",
"created": 1742068596,
"owned_by": "system"
},
{
"id": "gpt-4o-realtime-preview",
"object": "model",
@ -62,9 +74,9 @@
"owned_by": "system"
},
{
"id": "gpt-4-turbo-preview",
"object": "model",
"created": 1706037777,
"owned_by": "system"
},
{
@ -74,9 +86,9 @@
"owned_by": "system"
},
{
"id": "gpt-4-0125-preview",
"object": "model",
"created": 1706037612,
"owned_by": "system"
},
{
@ -91,12 +103,6 @@
"created": 1671217299,
"owned_by": "openai-internal"
},
{
"id": "omni-moderation-latest",
"object": "model",
"created": 1731689265,
"owned_by": "system"
},
{
"id": "tts-1-hd",
"object": "model",
@ -127,6 +133,12 @@
"created": 1734387380,
"owned_by": "system"
},
{
"id": "gpt-4.1-mini",
"object": "model",
"created": 1744318173,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-realtime-preview-2024-12-17",
"object": "model",
@ -146,15 +158,9 @@
"owned_by": "system"
},
{
"id": "gpt-4.1-mini-2025-04-14",
"object": "model",
"created": 1744317547,
"owned_by": "system"
},
{
@ -163,6 +169,12 @@
"created": 1699053241,
"owned_by": "system"
},
{
"id": "chatgpt-4o-latest",
"object": "model",
"created": 1723515131,
"owned_by": "system"
},
{
"id": "davinci-002",
"object": "model",
@ -175,12 +187,24 @@
"created": 1698959748,
"owned_by": "system"
},
{
"id": "gpt-4o-search-preview",
"object": "model",
"created": 1741388720,
"owned_by": "system"
},
{
"id": "gpt-4-turbo",
"object": "model",
"created": 1712361441,
"owned_by": "system"
},
{
"id": "gpt-4o-realtime-preview-2024-12-17",
"object": "model",
"created": 1733945430,
"owned_by": "system"
},
{
"id": "gpt-3.5-turbo-instruct",
"object": "model",
@ -194,9 +218,9 @@
"owned_by": "openai"
},
{
"id": "gpt-4-1106-preview",
"object": "model",
"created": 1698957206,
"owned_by": "system"
},
{
@ -217,24 +241,12 @@
"created": 1677532384,
"owned_by": "openai-internal"
},
{
"id": "gpt-3.5-turbo-0125",
"object": "model",
"created": 1706048358,
"owned_by": "system"
},
{
"id": "gpt-4o-2024-05-13",
"object": "model",
"created": 1715368132,
"owned_by": "system"
},
{
"id": "gpt-3.5-turbo-16k",
"object": "model",
"created": 1683758102,
"owned_by": "openai-internal"
},
{
"id": "gpt-4-turbo-2024-04-09",
"object": "model",
@ -242,10 +254,10 @@
"owned_by": "system"
},
{
"id": "gpt-3.5-turbo-16k",
"object": "model",
"created": 1683758102,
"owned_by": "openai-internal"
},
{
"id": "o1-preview",
@ -259,30 +271,24 @@
"created": 1686588896,
"owned_by": "openai"
},
{
"id": "gpt-4o-search-preview",
"object": "model",
"created": 1741388720,
"owned_by": "system"
},
{
"id": "o1-2024-12-17",
"object": "model",
"created": 1734326976,
"owned_by": "system"
},
{
"id": "o1-pro",
"object": "model",
"created": 1742251791,
"owned_by": "system"
},
{
"id": "o1",
"object": "model",
"created": 1734375816,
"owned_by": "system"
},
{
"id": "o1-pro",
"object": "model",
"created": 1742251791,
"owned_by": "system"
},
{
"id": "o1-pro-2025-03-19",
"object": "model",
@ -307,18 +313,6 @@
"created": 1741388170,
"owned_by": "system"
},
{
"id": "o3-mini",
"object": "model",
"created": 1737146383,
"owned_by": "system"
},
{
"id": "o3-mini-2025-01-31",
"object": "model",
"created": 1738010200,
"owned_by": "system"
},
{
"id": "tts-1",
"object": "model",
@ -337,12 +331,30 @@
"created": 1705948997,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-tts",
"object": "model",
"created": 1742403959,
"owned_by": "system"
},
{
"id": "gpt-4o",
"object": "model",
"created": 1715367049,
"owned_by": "system"
},
{
"id": "o3-mini",
"object": "model",
"created": 1737146383,
"owned_by": "system"
},
{
"id": "o3-mini-2025-01-31",
"object": "model",
"created": 1738010200,
"owned_by": "system"
},
{
"id": "gpt-4o-mini",
"object": "model",
@ -355,12 +367,36 @@
"created": 1722814719,
"owned_by": "system"
},
{
"id": "gpt-4.1",
"object": "model",
"created": 1744316542,
"owned_by": "system"
},
{
"id": "gpt-4o-transcribe",
"object": "model",
"created": 1742068463,
"owned_by": "system"
},
{
"id": "gpt-4.1-2025-04-14",
"object": "model",
"created": 1744315746,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-2024-07-18",
"object": "model",
"created": 1721172717,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-transcribe",
"object": "model",
"created": 1742068596,
"owned_by": "system"
},
{
"id": "o1-mini",
"object": "model",
@ -373,11 +409,23 @@
"created": 1734115920,
"owned_by": "system"
},
{
"id": "gpt-3.5-turbo-0125",
"object": "model",
"created": 1706048358,
"owned_by": "system"
},
{
"id": "o1-mini-2024-09-12",
"object": "model",
"created": 1725648979,
"owned_by": "system"
},
{
"id": "omni-moderation-latest",
"object": "model",
"created": 1731689265,
"owned_by": "system"
}
]
}

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@ -109,105 +109,156 @@ When creating content
- always add links
- when sending emails, always add 'Best regards, [Your Name]'
```
## Commands
# Main Commands
### Prompt
The primary way to interact with `kbot` for processing tasks is by invoking it with a prompt and various options. While often used implicitly, this typically corresponds to the `run` command.
```kbot "create Astro minimal boilerplate, use starlight theme. Install dependencies via NPM tool"```
## Running Tasks
### Fetch latest models
```bash
kbot run [options...] "Your prompt here..."
# or simply (if 'run' is the default):
kbot [options...] "Your prompt here..."
```
```kbot fetch```
This command executes the main AI processing pipeline based on the provided prompt and options. Key aspects controlled by options include:
### Print examples
* **Input:** Specified via `--include` (files, directories, web URLs), `--path`.
* **Task:** Defined by the `--prompt`.
* **Behavior:** Controlled by `--mode` (e.g., `tools`, `completion`).
* **Output:** Directed using `--dst` or `--output`.
* **Model & API:** Configured with `--model`, `--router`, `--api_key`, etc.
```kbot examples```
Refer to [Parameters](./parameters.md) and [Modes](./modes.md) for detailed options.
### Print extended help
## Utility Commands
```kbot help-md```
Other potential utility commands might include:
### Initialize folder
* `kbot fetch`: Fetch updated information, such as the latest available models.
* `kbot init`: Initialize a directory or project for use with `kbot` (e.g., create default config files).
* `kbot help-md`: Generate extended help documentation in Markdown format.
* `kbot examples`: Show example usage patterns.
```kbot init```
### Internal : Build
```kbot build```
*(Note: Availability and exact behavior of utility commands may vary.)*
# Command Line Parameters
This document describes all available command line parameters.
This document describes the command line parameters available for `kbot`.
**Note:** Many parameters support environment variable substitution (e.g., `${VAR_NAME}`).
## Core Parameters
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `path` | Target directory | `.` | No |
| `prompt` | The prompt. Supports file paths and environment variables | `./prompt.md` | No |
| `output` | Optional output path for modified files (Tool mode only) | - | No |
| `dst` | Optional destination path for the result, will substitute ${MODEL} and ${ROUTER} in the path. | - | No |
| `model` | AI model to use for processing | `anthropic/claude-3.5-sonnet` | No |
| `router` | Router to use: openai or openrouter | `openrouter` | No |
| `mode` | Chat completion mode: "completion" (without tools) or "tools" | `tools` | No |
| `prompt` | The main instruction or question for the AI. Can be a string, a file path (e.g., `file:./my_prompt.md`), or an environment variable. | - | Yes (or implied by context) |
| `model` | AI model ID to use for processing (e.g., `openai/gpt-4o`). See available models via helper functions or router documentation. | Depends on router/config | No |
| `router` | The API provider to use. | `openrouter` | No |
| `mode` | The operational mode. See [Modes](./modes.md) for details. | `tools` | No |
## Advanced Parameters
## Input & File Selection
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `each` | Target directory | `.` | No |
| `dry` | Dry run - only write out parameters without making API calls | `false` | No |
| `path` | Target directory for local file operations or context. | `.` | No |
| `include` | Specify input files or content. Accepts comma-separated glob patterns (e.g., `src/**/*.ts`), file paths, directory paths, or **web URLs** (e.g., `https://example.com/page`). | `[]` | No |
| `query` | JSONPath query to extract specific data from input objects (often used with structured input files). | `null` | No |
## File Selection & Tools
## Output & Formatting
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `include` | Glob patterns to match files for processing. Supports multiple patterns, e.g. `--include=src/*.tsx,src/*.ts --include=package.json` | - | No |
| `disable` | Disable tools categories | `[]` | No |
| `disableTools` | List of specific tools to disable | `[]` | No |
| `output` | Output path for modified files (primarily for `tools` mode operations like refactoring). | - | No |
| `dst` | Destination path/filename for the main result (primarily for `completion` or `assistant` mode). Supports `${MODEL_NAME}` and `${ROUTER}` substitutions. | - | No |
| `format` | Defines the desired structure for the AI's output. Can be a Zod schema object, a Zod schema string, a JSON schema string, or a path to a JSON schema file (e.g., `file:./schema.json`). Ensures the output conforms to the specified structure. | - | No |
| `filters` | Post-processing filters applied to the output (primarily `completion` mode with `--dst`). Can be a comma-separated string of filter names (e.g., `unwrapMarkdown,trim`). | `''` | No |
## Configuration & Profiles
## Tool Usage
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `profile` | Path to profile for variables. Supports environment variables | `${POLYMECH-ROOT}/profile.json` | No |
| `env` | Environment (in profile) | `default` | No |
| `config` | Path to JSON configuration file (API keys). Supports environment variables | - | No |
| `preferences` | Path to preferences file (location, email, gender, etc). Supports environment variables | `./.kbot/preferences.md` | No |
| `tools` | Comma-separated list of tool names or paths to custom tool files to enable. | (List of default tools) | No |
| `disable` | Comma-separated list of tool *categories* to disable (e.g., `filesystem,git`). | `[]` | No |
| `disableTools` | Comma-separated list of specific tool *names* to disable. | `[]` | No |
## Iteration & Advanced Control
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `each` | Iterate the task over multiple items. Accepts a GLOB pattern, path to a JSON file (array), or comma-separated strings. The current item is available as the `${ITEM}` variable in other parameters (e.g., `--dst="${ITEM}-output.md"`). Can be used to test different models (e.g., `--each="openai/gpt-3.5-turbo,openai/gpt-4o"`). | - | No |
| `variables` | Define custom key-value variables for use in prompts or other parameters (e.g., `--variables.PROJECT_NAME=MyProject`). Access via `${variableName}`. | `{}` | No |
## Configuration & Authentication
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `api_key` | Explicit API key for the selected router. Overrides keys from config files. | - | No |
| `baseURL` | Custom base URL for the API endpoint (e.g., for local LLMs via Ollama). Set automatically for known routers or can be specified directly. | - | No |
| `config` | Path to a JSON configuration file containing API keys and potentially other settings. | - | No |
| `profile` | Path to a profile file (JSON or .env format) for loading environment-specific variables. | - | No |
| `env` | Specifies the environment section to use within the profile file. | `default` | No |
| `preferences` | Path to a preferences file (e.g., containing user details like location, email). Used to provide context to the AI. | (System-specific default, often `~/.kbot/Preferences`) | No |
## Debugging & Logging
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `logLevel` | Logging level for the application (0-4) | `2` | No |
| `logs` | Logging directory | `./.kbot` | No |
| `dump` | Create a script | - | No |
| `logLevel` | Logging verbosity level (e.g., 0=error, 4=debug). | `4` | No |
| `logs` | Directory to store log files and temporary outputs (like `params.json`). | `./logs` | No |
| `dry` | Perform a dry run: log parameters and configurations without executing the AI request. | `false` | No |
| `dump` | Path to generate a script file representing the current command invocation. | - | No |
# Advanced Topics
# Working on Larger Directories
This section covers more advanced usage patterns and concepts.
Since LLMs (Large Language Models) and providers are limited to very small 'context windows', it's necessary to feed them with smaller chunks instead. This document explains how to process larger directories efficiently.
## Processing Multiple Items (`--each`)
## Directory Processing Example
Instead of relying on external scripting for batch processing, `kbot` provides the built-in `--each` parameter. This allows you to iterate a task over multiple inputs efficiently.
Here's an example of how to walk through files and process them:
**How it Works:**
The `--each` parameter accepts:
* A comma-separated list of strings (e.g., `--each="file1.txt,file2.txt"`).
* A file path to a JSON file containing an array of strings.
* A GLOB pattern matching multiple files (e.g., `--each="./src/**/*.ts"`).
* A list of model IDs to test a prompt against different models (e.g., `--each="openai/gpt-4o,anthropic/claude-3.5-sonnet"`).
**Using the `${ITEM}` Variable:**
Within the loop initiated by `--each`, the current item being processed is available as the `${ITEM}` variable. You can use this variable in other parameters, such as `--dst`, `--include`, or within the `--prompt` itself.
**Example: Generating Documentation for Multiple Files**
```bash
osr-cli each --main='kbot \"read ${KEY} and translate to german, save in docs/language code/filename.md\" --include=\"${REL}\" --include=\".kbot/preferences.md\"' --list="./docs/*.md" --cwd=.
kbot --each "./src/modules/*.ts" \
--dst "./docs/api/${ITEM}.md" \
--prompt "Generate API documentation in Markdown format for the module defined in ${ITEM}"
```
### Parameter Explanation
This command will:
- `each`: Command to process multiple files iteratively
- `--main`: The main command (`kbot`) to execute for each file
- `--include=\"${REL}\"` instructs kbot to include the current selected path
- `--include=\".kbot/preferences.md\"` instructs kbot to include additional preferences about the task (eg: translation specifics)
- `--list`: Specifies the file pattern to match
- Supports include patterns (e.g., `"./docs/*.md"`)
- `--cwd`: Sets the current working directory for the command execution. Default is the current directory (`.`)
1. Find all `.ts` files in `./src/modules/`.
2. For each file (e.g., `moduleA.ts`):
* Set `${ITEM}` to the file path (`./src/modules/moduleA.ts`).
* Execute `kbot` with the prompt, including the specific file via `${ITEM}`.
* Save the output to `./docs/api/./src/modules/moduleA.ts.md` (Note: path handling might vary).
**Note** requires `@plastichub/osr-cli-commons` to be installed globally:
Refer to the [Examples](./examples.md#iterating-with---each) for more use cases.
```bash
npm i -g @plastichub/osr-cli-commons
```
## Choosing a Transformation Method: `transform` vs. `createIterator`
When transforming data structures (often JSON) using LLMs, you have two primary approaches:
1. **`transform` Helper Function:**
* **Pros:** Simple, minimal setup, good for basic field transformations.
* **Cons:** Less control over network, caching, logging details.
* **Use Case:** Quickly applying straightforward transformations to data fields without needing deep customization.
2. **`createIterator` Factory:**
* **Pros:** Full control over network options (retries, concurrency), caching (namespace, expiration), logging, custom transformer logic, and callbacks (`onTransform`, `onTransformed`).
* **Cons:** More verbose setup required.
* **Use Case:** Complex transformations requiring fine-tuned control over the entire process, advanced caching strategies, or integration with custom logging/transformation logic.
Consult the [Iterator Documentation](./iterator.md) for detailed explanations and code examples of both methods.
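The contrast can be sketched in code. This is an illustrative mock, not the actual `kbot` API: the real `transform` and `createIterator` signatures may differ, and the retry/caching machinery is elided here.

```typescript
// Sketch only: mock shapes illustrating the helper-vs-factory trade-off.
// The real kbot `transform` / `createIterator` signatures may differ.
type Transformer<T> = (item: T) => Promise<T> | T;

// 1) Simple helper: one call, no knobs.
async function transform<T>(items: T[], fn: Transformer<T>): Promise<T[]> {
  return Promise.all(items.map((item) => Promise.resolve(fn(item))));
}

// 2) Factory: returns a runner wired with network/cache/logging options.
interface IteratorOptions<T> {
  retries?: number;                  // network control
  cacheNamespace?: string;           // caching control
  onTransform?: (item: T) => void;   // callback before each item
  onTransformed?: (item: T) => void; // callback after each item
}

function createIterator<T>(fn: Transformer<T>, opts: IteratorOptions<T> = {}) {
  return async function run(items: T[]): Promise<T[]> {
    const out: T[] = [];
    for (const item of items) {
      opts.onTransform?.(item);
      // retries and cache lookups would wrap this call in the real library
      const result = await Promise.resolve(fn(item));
      opts.onTransformed?.(result);
      out.push(result);
    }
    return out;
  };
}
```

The helper costs a single call; the factory front-loads configuration in exchange for hooks into every stage of the run.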

File diff suppressed because one or more lines are too long

View File

@@ -1,12 +1,12 @@
 {
   "name": "@plastichub/kbot",
-  "version": "1.1.25",
+  "version": "1.1.26",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "@plastichub/kbot",
-      "version": "1.1.25",
+      "version": "1.1.26",
       "license": "ISC",
       "dependencies": {
         "node-emoji": "^2.2.0"

View File

@@ -1,6 +1,6 @@
 {
   "name": "@plastichub/kbot",
-  "version": "1.1.25",
+  "version": "1.1.26",
   "main": "main_node.js",
   "author": "",
   "license": "ISC",
"license": "ISC",

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1,65 +1,73 @@
export enum E_OPENAI_MODEL {
MODEL_GPT_4O_REALTIME_PREVIEW_2024_12_17 = "gpt-4o-realtime-preview-2024-12-17",
MODEL_GPT_4O_AUDIO_PREVIEW_2024_12_17 = "gpt-4o-audio-preview-2024-12-17",
MODEL_DALL_E_3 = "dall-e-3",
MODEL_TEXT_EMBEDDING_3_LARGE = "text-embedding-3-large",
MODEL_DALL_E_2 = "dall-e-2",
MODEL_O4_MINI_2025_04_16 = "o4-mini-2025-04-16",
MODEL_GPT_4O_AUDIO_PREVIEW_2024_10_01 = "gpt-4o-audio-preview-2024-10-01",
MODEL_O4_MINI = "o4-mini",
MODEL_GPT_4_1_NANO = "gpt-4.1-nano",
MODEL_GPT_4_1_NANO_2025_04_14 = "gpt-4.1-nano-2025-04-14",
MODEL_GPT_4O_REALTIME_PREVIEW_2024_10_01 = "gpt-4o-realtime-preview-2024-10-01",
MODEL_GPT_4O_TRANSCRIBE = "gpt-4o-transcribe",
MODEL_GPT_4O_MINI_TRANSCRIBE = "gpt-4o-mini-transcribe",
MODEL_GPT_4O_REALTIME_PREVIEW = "gpt-4o-realtime-preview",
MODEL_BABBAGE_002 = "babbage-002",
MODEL_GPT_4O_MINI_TTS = "gpt-4o-mini-tts",
MODEL_GPT_4_TURBO_PREVIEW = "gpt-4-turbo-preview",
MODEL_TTS_1_HD_1106 = "tts-1-hd-1106",
MODEL_TEXT_EMBEDDING_3_LARGE = "text-embedding-3-large",
MODEL_GPT_4_0125_PREVIEW = "gpt-4-0125-preview",
MODEL_GPT_4 = "gpt-4",
MODEL_TEXT_EMBEDDING_ADA_002 = "text-embedding-ada-002",
MODEL_OMNI_MODERATION_LATEST = "omni-moderation-latest",
MODEL_TTS_1_HD = "tts-1-hd",
MODEL_GPT_4O_MINI_AUDIO_PREVIEW = "gpt-4o-mini-audio-preview",
MODEL_GPT_4O_AUDIO_PREVIEW = "gpt-4o-audio-preview",
MODEL_O1_PREVIEW_2024_09_12 = "o1-preview-2024-09-12",
MODEL_GPT_4O_MINI_REALTIME_PREVIEW = "gpt-4o-mini-realtime-preview",
MODEL_GPT_4_1_MINI = "gpt-4.1-mini",
MODEL_GPT_4O_MINI_REALTIME_PREVIEW_2024_12_17 = "gpt-4o-mini-realtime-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_INSTRUCT_0914 = "gpt-3.5-turbo-instruct-0914",
MODEL_GPT_4O_MINI_SEARCH_PREVIEW = "gpt-4o-mini-search-preview",
MODEL_GPT_4_TURBO_PREVIEW = "gpt-4-turbo-preview",
MODEL_GPT_4_0125_PREVIEW = "gpt-4-0125-preview",
MODEL_GPT_4_1_MINI_2025_04_14 = "gpt-4.1-mini-2025-04-14",
MODEL_TTS_1_1106 = "tts-1-1106",
MODEL_CHATGPT_4O_LATEST = "chatgpt-4o-latest",
MODEL_DAVINCI_002 = "davinci-002",
MODEL_GPT_3_5_TURBO_1106 = "gpt-3.5-turbo-1106",
MODEL_GPT_4O_SEARCH_PREVIEW = "gpt-4o-search-preview",
MODEL_GPT_4_TURBO = "gpt-4-turbo",
MODEL_GPT_4O_REALTIME_PREVIEW_2024_12_17 = "gpt-4o-realtime-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_INSTRUCT = "gpt-3.5-turbo-instruct",
MODEL_GPT_3_5_TURBO = "gpt-3.5-turbo",
MODEL_CHATGPT_4O_LATEST = "chatgpt-4o-latest",
MODEL_GPT_4_1106_PREVIEW = "gpt-4-1106-preview",
MODEL_GPT_4O_MINI_SEARCH_PREVIEW_2025_03_11 = "gpt-4o-mini-search-preview-2025-03-11",
MODEL_GPT_4O_2024_11_20 = "gpt-4o-2024-11-20",
MODEL_WHISPER_1 = "whisper-1",
MODEL_GPT_3_5_TURBO_0125 = "gpt-3.5-turbo-0125",
MODEL_GPT_4O_2024_05_13 = "gpt-4o-2024-05-13",
MODEL_GPT_3_5_TURBO_16K = "gpt-3.5-turbo-16k",
MODEL_GPT_4_TURBO_2024_04_09 = "gpt-4-turbo-2024-04-09",
MODEL_GPT_4_1106_PREVIEW = "gpt-4-1106-preview",
MODEL_GPT_3_5_TURBO_16K = "gpt-3.5-turbo-16k",
MODEL_O1_PREVIEW = "o1-preview",
MODEL_GPT_4_0613 = "gpt-4-0613",
MODEL_GPT_4O_SEARCH_PREVIEW = "gpt-4o-search-preview",
MODEL_O1_2024_12_17 = "o1-2024-12-17",
MODEL_O1_PRO = "o1-pro",
MODEL_O1 = "o1",
MODEL_O1_PRO = "o1-pro",
MODEL_O1_PRO_2025_03_19 = "o1-pro-2025-03-19",
MODEL_GPT_4_5_PREVIEW = "gpt-4.5-preview",
MODEL_GPT_4_5_PREVIEW_2025_02_27 = "gpt-4.5-preview-2025-02-27",
MODEL_GPT_4O_SEARCH_PREVIEW_2025_03_11 = "gpt-4o-search-preview-2025-03-11",
MODEL_O3_MINI = "o3-mini",
MODEL_O3_MINI_2025_01_31 = "o3-mini-2025-01-31",
MODEL_TTS_1 = "tts-1",
MODEL_OMNI_MODERATION_2024_09_26 = "omni-moderation-2024-09-26",
MODEL_TEXT_EMBEDDING_3_SMALL = "text-embedding-3-small",
MODEL_GPT_4O_MINI_TTS = "gpt-4o-mini-tts",
MODEL_GPT_4O = "gpt-4o",
MODEL_O3_MINI = "o3-mini",
MODEL_O3_MINI_2025_01_31 = "o3-mini-2025-01-31",
MODEL_GPT_4O_MINI = "gpt-4o-mini",
MODEL_GPT_4O_2024_08_06 = "gpt-4o-2024-08-06",
MODEL_GPT_4_1 = "gpt-4.1",
MODEL_GPT_4O_TRANSCRIBE = "gpt-4o-transcribe",
MODEL_GPT_4_1_2025_04_14 = "gpt-4.1-2025-04-14",
MODEL_GPT_4O_MINI_2024_07_18 = "gpt-4o-mini-2024-07-18",
MODEL_GPT_4O_MINI_TRANSCRIBE = "gpt-4o-mini-transcribe",
MODEL_O1_MINI = "o1-mini",
MODEL_GPT_4O_MINI_AUDIO_PREVIEW_2024_12_17 = "gpt-4o-mini-audio-preview-2024-12-17",
MODEL_O1_MINI_2024_09_12 = "o1-mini-2024-09-12"
MODEL_GPT_3_5_TURBO_0125 = "gpt-3.5-turbo-0125",
MODEL_O1_MINI_2024_09_12 = "o1-mini-2024-09-12",
MODEL_OMNI_MODERATION_LATEST = "omni-moderation-latest"
}

View File

@@ -1,7 +1,13 @@
export enum E_OPENROUTER_MODEL_FREE {
MODEL_FREE_SHISA_AI_SHISA_V2_LLAMA3_3_70B_FREE = "shisa-ai/shisa-v2-llama3.3-70b:free",
MODEL_FREE_ARLIAI_QWQ_32B_ARLIAI_RPR_V1_FREE = "arliai/qwq-32b-arliai-rpr-v1:free",
MODEL_FREE_AGENTICA_ORG_DEEPCODER_14B_PREVIEW_FREE = "agentica-org/deepcoder-14b-preview:free",
MODEL_FREE_MOONSHOTAI_KIMI_VL_A3B_THINKING_FREE = "moonshotai/kimi-vl-a3b-thinking:free",
MODEL_FREE_NVIDIA_LLAMA_3_1_NEMOTRON_NANO_8B_V1_FREE = "nvidia/llama-3.1-nemotron-nano-8b-v1:free",
MODEL_FREE_NVIDIA_LLAMA_3_3_NEMOTRON_SUPER_49B_V1_FREE = "nvidia/llama-3.3-nemotron-super-49b-v1:free",
MODEL_FREE_NVIDIA_LLAMA_3_1_NEMOTRON_ULTRA_253B_V1_FREE = "nvidia/llama-3.1-nemotron-ultra-253b-v1:free",
MODEL_FREE_META_LLAMA_LLAMA_4_MAVERICK_FREE = "meta-llama/llama-4-maverick:free",
MODEL_FREE_META_LLAMA_LLAMA_4_SCOUT_FREE = "meta-llama/llama-4-scout:free",
MODEL_FREE_OPENROUTER_QUASAR_ALPHA = "openrouter/quasar-alpha",
MODEL_FREE_DEEPSEEK_DEEPSEEK_V3_BASE_FREE = "deepseek/deepseek-v3-base:free",
MODEL_FREE_ALLENAI_MOLMO_7B_D_FREE = "allenai/molmo-7b-d:free",
MODEL_FREE_BYTEDANCE_RESEARCH_UI_TARS_72B_FREE = "bytedance-research/ui-tars-72b:free",
@@ -24,7 +30,6 @@ export enum E_OPENROUTER_MODEL_FREE {
MODEL_FREE_NOUSRESEARCH_DEEPHERMES_3_LLAMA_3_8B_PREVIEW_FREE = "nousresearch/deephermes-3-llama-3-8b-preview:free",
MODEL_FREE_COGNITIVECOMPUTATIONS_DOLPHIN3_0_R1_MISTRAL_24B_FREE = "cognitivecomputations/dolphin3.0-r1-mistral-24b:free",
MODEL_FREE_COGNITIVECOMPUTATIONS_DOLPHIN3_0_MISTRAL_24B_FREE = "cognitivecomputations/dolphin3.0-mistral-24b:free",
MODEL_FREE_GOOGLE_GEMINI_2_0_PRO_EXP_02_05_FREE = "google/gemini-2.0-pro-exp-02-05:free",
MODEL_FREE_QWEN_QWEN2_5_VL_72B_INSTRUCT_FREE = "qwen/qwen2.5-vl-72b-instruct:free",
MODEL_FREE_MISTRALAI_MISTRAL_SMALL_24B_INSTRUCT_2501_FREE = "mistralai/mistral-small-24b-instruct-2501:free",
MODEL_FREE_DEEPSEEK_DEEPSEEK_R1_DISTILL_QWEN_32B_FREE = "deepseek/deepseek-r1-distill-qwen-32b:free",
@@ -52,9 +57,5 @@ export enum E_OPENROUTER_MODEL_FREE {
MODEL_FREE_MISTRALAI_MISTRAL_NEMO_FREE = "mistralai/mistral-nemo:free",
MODEL_FREE_GOOGLE_GEMMA_2_9B_IT_FREE = "google/gemma-2-9b-it:free",
MODEL_FREE_MISTRALAI_MISTRAL_7B_INSTRUCT_FREE = "mistralai/mistral-7b-instruct:free",
MODEL_FREE_MICROSOFT_PHI_3_MINI_128K_INSTRUCT_FREE = "microsoft/phi-3-mini-128k-instruct:free",
MODEL_FREE_MICROSOFT_PHI_3_MEDIUM_128K_INSTRUCT_FREE = "microsoft/phi-3-medium-128k-instruct:free",
MODEL_FREE_OPENCHAT_OPENCHAT_7B_FREE = "openchat/openchat-7b:free",
MODEL_FREE_UNDI95_TOPPY_M_7B_FREE = "undi95/toppy-m-7b:free",
MODEL_FREE_HUGGINGFACEH4_ZEPHYR_7B_BETA_FREE = "huggingfaceh4/zephyr-7b-beta:free"
}

View File

@@ -1,10 +1,27 @@
export enum E_OPENROUTER_MODEL {
MODEL_OPENAI_O4_MINI_HIGH = "openai/o4-mini-high",
MODEL_OPENAI_O3 = "openai/o3",
MODEL_OPENAI_O4_MINI = "openai/o4-mini",
MODEL_SHISA_AI_SHISA_V2_LLAMA3_3_70B_FREE = "shisa-ai/shisa-v2-llama3.3-70b:free",
MODEL_QWEN_QWEN2_5_CODER_7B_INSTRUCT = "qwen/qwen2.5-coder-7b-instruct",
MODEL_OPENAI_GPT_4_1 = "openai/gpt-4.1",
MODEL_OPENAI_GPT_4_1_MINI = "openai/gpt-4.1-mini",
MODEL_OPENAI_GPT_4_1_NANO = "openai/gpt-4.1-nano",
MODEL_ELEUTHERAI_LLEMMA_7B = "eleutherai/llemma_7b",
MODEL_ALFREDPROS_CODELLAMA_7B_INSTRUCT_SOLIDITY = "alfredpros/codellama-7b-instruct-solidity",
MODEL_ARLIAI_QWQ_32B_ARLIAI_RPR_V1_FREE = "arliai/qwq-32b-arliai-rpr-v1:free",
MODEL_AGENTICA_ORG_DEEPCODER_14B_PREVIEW_FREE = "agentica-org/deepcoder-14b-preview:free",
MODEL_MOONSHOTAI_KIMI_VL_A3B_THINKING_FREE = "moonshotai/kimi-vl-a3b-thinking:free",
MODEL_X_AI_GROK_3_MINI_BETA = "x-ai/grok-3-mini-beta",
MODEL_X_AI_GROK_3_BETA = "x-ai/grok-3-beta",
MODEL_NVIDIA_LLAMA_3_1_NEMOTRON_NANO_8B_V1_FREE = "nvidia/llama-3.1-nemotron-nano-8b-v1:free",
MODEL_NVIDIA_LLAMA_3_3_NEMOTRON_SUPER_49B_V1_FREE = "nvidia/llama-3.3-nemotron-super-49b-v1:free",
MODEL_NVIDIA_LLAMA_3_1_NEMOTRON_ULTRA_253B_V1_FREE = "nvidia/llama-3.1-nemotron-ultra-253b-v1:free",
MODEL_META_LLAMA_LLAMA_4_MAVERICK_FREE = "meta-llama/llama-4-maverick:free",
MODEL_META_LLAMA_LLAMA_4_MAVERICK = "meta-llama/llama-4-maverick",
MODEL_META_LLAMA_LLAMA_4_SCOUT_FREE = "meta-llama/llama-4-scout:free",
MODEL_META_LLAMA_LLAMA_4_SCOUT = "meta-llama/llama-4-scout",
MODEL_GOOGLE_GEMINI_2_5_PRO_PREVIEW_03_25 = "google/gemini-2.5-pro-preview-03-25",
MODEL_OPENROUTER_QUASAR_ALPHA = "openrouter/quasar-alpha",
MODEL_ALL_HANDS_OPENHANDS_LM_32B_V0_1 = "all-hands/openhands-lm-32b-v0.1",
MODEL_MISTRAL_MINISTRAL_8B = "mistral/ministral-8b",
MODEL_DEEPSEEK_DEEPSEEK_V3_BASE_FREE = "deepseek/deepseek-v3-base:free",
@@ -25,7 +42,6 @@ export enum E_OPENROUTER_MODEL {
MODEL_OPEN_R1_OLYMPICCODER_7B_FREE = "open-r1/olympiccoder-7b:free",
MODEL_OPEN_R1_OLYMPICCODER_32B_FREE = "open-r1/olympiccoder-32b:free",
MODEL_STEELSKULL_L3_3_ELECTRA_R1_70B = "steelskull/l3.3-electra-r1-70b",
MODEL_ALLENAI_OLMO_2_0325_32B_INSTRUCT = "allenai/olmo-2-0325-32b-instruct",
MODEL_GOOGLE_GEMMA_3_1B_IT_FREE = "google/gemma-3-1b-it:free",
MODEL_GOOGLE_GEMMA_3_4B_IT_FREE = "google/gemma-3-4b-it:free",
MODEL_GOOGLE_GEMMA_3_4B_IT = "google/gemma-3-4b-it",
@@ -36,7 +52,6 @@ export enum E_OPENROUTER_MODEL {
MODEL_COHERE_COMMAND_A = "cohere/command-a",
MODEL_OPENAI_GPT_4O_MINI_SEARCH_PREVIEW = "openai/gpt-4o-mini-search-preview",
MODEL_OPENAI_GPT_4O_SEARCH_PREVIEW = "openai/gpt-4o-search-preview",
MODEL_TOKYOTECH_LLM_LLAMA_3_1_SWALLOW_70B_INSTRUCT_V0_3 = "tokyotech-llm/llama-3.1-swallow-70b-instruct-v0.3",
MODEL_REKAAI_REKA_FLASH_3_FREE = "rekaai/reka-flash-3:free",
MODEL_GOOGLE_GEMMA_3_27B_IT_FREE = "google/gemma-3-27b-it:free",
MODEL_GOOGLE_GEMMA_3_27B_IT = "google/gemma-3-27b-it",
@@ -50,7 +65,6 @@ export enum E_OPENROUTER_MODEL {
MODEL_DEEPSEEK_DEEPSEEK_R1_ZERO_FREE = "deepseek/deepseek-r1-zero:free",
MODEL_QWEN_QWQ_32B_FREE = "qwen/qwq-32b:free",
MODEL_QWEN_QWQ_32B = "qwen/qwq-32b",
MODEL_QWEN_QWEN2_5_32B_INSTRUCT = "qwen/qwen2.5-32b-instruct",
MODEL_MOONSHOTAI_MOONLIGHT_16B_A3B_INSTRUCT_FREE = "moonshotai/moonlight-16b-a3b-instruct:free",
MODEL_NOUSRESEARCH_DEEPHERMES_3_LLAMA_3_8B_PREVIEW_FREE = "nousresearch/deephermes-3-llama-3-8b-preview:free",
MODEL_OPENAI_GPT_4_5_PREVIEW = "openai/gpt-4.5-preview",
@@ -66,7 +80,6 @@ export enum E_OPENROUTER_MODEL {
MODEL_OPENAI_O3_MINI_HIGH = "openai/o3-mini-high",
MODEL_DEEPSEEK_DEEPSEEK_R1_DISTILL_LLAMA_8B = "deepseek/deepseek-r1-distill-llama-8b",
MODEL_GOOGLE_GEMINI_2_0_FLASH_001 = "google/gemini-2.0-flash-001",
MODEL_GOOGLE_GEMINI_2_0_PRO_EXP_02_05_FREE = "google/gemini-2.0-pro-exp-02-05:free",
MODEL_QWEN_QWEN_VL_PLUS = "qwen/qwen-vl-plus",
MODEL_AION_LABS_AION_1_0 = "aion-labs/aion-1.0",
MODEL_AION_LABS_AION_1_0_MINI = "aion-labs/aion-1.0-mini",
@@ -209,9 +222,7 @@ export enum E_OPENROUTER_MODEL {
MODEL_MISTRALAI_MISTRAL_7B_INSTRUCT = "mistralai/mistral-7b-instruct",
MODEL_MISTRALAI_MISTRAL_7B_INSTRUCT_V0_3 = "mistralai/mistral-7b-instruct-v0.3",
MODEL_NOUSRESEARCH_HERMES_2_PRO_LLAMA_3_8B = "nousresearch/hermes-2-pro-llama-3-8b",
MODEL_MICROSOFT_PHI_3_MINI_128K_INSTRUCT_FREE = "microsoft/phi-3-mini-128k-instruct:free",
MODEL_MICROSOFT_PHI_3_MINI_128K_INSTRUCT = "microsoft/phi-3-mini-128k-instruct",
MODEL_MICROSOFT_PHI_3_MEDIUM_128K_INSTRUCT_FREE = "microsoft/phi-3-medium-128k-instruct:free",
MODEL_MICROSOFT_PHI_3_MEDIUM_128K_INSTRUCT = "microsoft/phi-3-medium-128k-instruct",
MODEL_NEVERSLEEP_LLAMA_3_LUMIMAID_70B = "neversleep/llama-3-lumimaid-70b",
MODEL_GOOGLE_GEMINI_FLASH_1_5 = "google/gemini-flash-1.5",
@@ -252,16 +263,13 @@ export enum E_OPENROUTER_MODEL {
MODEL_COGNITIVECOMPUTATIONS_DOLPHIN_MIXTRAL_8X7B = "cognitivecomputations/dolphin-mixtral-8x7b",
MODEL_GOOGLE_GEMINI_PRO_VISION = "google/gemini-pro-vision",
MODEL_GOOGLE_GEMINI_PRO = "google/gemini-pro",
MODEL_MISTRALAI_MIXTRAL_8X7B = "mistralai/mixtral-8x7b",
MODEL_MISTRALAI_MIXTRAL_8X7B_INSTRUCT = "mistralai/mixtral-8x7b-instruct",
MODEL_OPENCHAT_OPENCHAT_7B_FREE = "openchat/openchat-7b:free",
MODEL_OPENCHAT_OPENCHAT_7B = "openchat/openchat-7b",
MODEL_NEVERSLEEP_NOROMAID_20B = "neversleep/noromaid-20b",
MODEL_ANTHROPIC_CLAUDE_2_1_BETA = "anthropic/claude-2.1:beta",
MODEL_ANTHROPIC_CLAUDE_2_1 = "anthropic/claude-2.1",
MODEL_ANTHROPIC_CLAUDE_2_BETA = "anthropic/claude-2:beta",
MODEL_ANTHROPIC_CLAUDE_2 = "anthropic/claude-2",
MODEL_UNDI95_TOPPY_M_7B_FREE = "undi95/toppy-m-7b:free",
MODEL_UNDI95_TOPPY_M_7B = "undi95/toppy-m-7b",
MODEL_ALPINDALE_GOLIATH_120B = "alpindale/goliath-120b",
MODEL_OPENROUTER_AUTO = "openrouter/auto",

View File

@@ -25,6 +25,7 @@ export interface IKBotOptions {

01-ai/yi-large | paid
aetherwiing/mn-starcannon-12b | paid
agentica-org/deepcoder-14b-preview:free | free
ai21/jamba-1-5-large | paid
ai21/jamba-1-5-mini | paid
ai21/jamba-1.6-large | paid
@@ -34,8 +35,8 @@ export interface IKBotOptions {
aion-labs/aion-1.0-mini | paid
aion-labs/aion-rp-llama-3.1-8b | paid
jondurbin/airoboros-l2-70b | paid
alfredpros/codellama-7b-instruct-solidity | paid
allenai/molmo-7b-d:free | free
allenai/olmo-2-0325-32b-instruct | paid
amazon/nova-lite-v1 | paid
amazon/nova-micro-v1 | paid
amazon/nova-pro-v1 | paid
@@ -62,6 +63,7 @@ export interface IKBotOptions {
anthropic/claude-2.0:beta | paid
anthropic/claude-2.1 | paid
anthropic/claude-2.1:beta | paid
arliai/qwq-32b-arliai-rpr-v1:free | free
openrouter/auto | paid
bytedance-research/ui-tars-72b:free | free
cohere/command | paid
@@ -93,6 +95,7 @@ export interface IKBotOptions {
cognitivecomputations/dolphin-mixtral-8x22b | paid
cognitivecomputations/dolphin3.0-mistral-24b:free | free
cognitivecomputations/dolphin3.0-r1-mistral-24b:free | free
eleutherai/llemma_7b | paid
eva-unit-01/eva-llama-3.33-70b | paid
eva-unit-01/eva-qwen-2.5-32b | paid
eva-unit-01/eva-qwen-2.5-72b | paid
@@ -107,7 +110,6 @@ export interface IKBotOptions {
google/gemini-2.0-flash-lite-001 | paid
google/gemini-2.0-flash-thinking-exp-1219:free | free
google/gemini-2.0-flash-thinking-exp:free | free
google/gemini-2.0-pro-exp-02-05:free | free
google/gemini-2.5-pro-exp-03-25:free | free
google/gemini-2.5-pro-preview-03-25 | paid
google/gemini-pro | paid
@@ -166,9 +168,7 @@ export interface IKBotOptions {
microsoft/phi-4 | paid
microsoft/phi-4-multimodal-instruct | paid
microsoft/phi-3-medium-128k-instruct | paid
microsoft/phi-3-medium-128k-instruct:free | free
microsoft/phi-3-mini-128k-instruct | paid
microsoft/phi-3-mini-128k-instruct:free | free
microsoft/phi-3.5-mini-128k-instruct | paid
sophosympatheia/midnight-rose-70b | paid
minimax/minimax-01 | paid
@@ -196,11 +196,11 @@ export interface IKBotOptions {
mistralai/mistral-small-3.1-24b-instruct | paid
mistralai/mistral-small-3.1-24b-instruct:free | free
mistralai/mixtral-8x22b-instruct | paid
mistralai/mixtral-8x7b | paid
mistralai/mixtral-8x7b-instruct | paid
mistralai/pixtral-12b | paid
mistralai/pixtral-large-2411 | paid
mistralai/mistral-saba | paid
moonshotai/kimi-vl-a3b-thinking:free | free
moonshotai/moonlight-16b-a3b-instruct:free | free
gryphe/mythomax-l2-13b | paid
neversleep/llama-3-lumimaid-70b | paid
@@ -217,6 +217,9 @@ export interface IKBotOptions {
nousresearch/hermes-2-pro-llama-3-8b | paid
nvidia/llama-3.1-nemotron-70b-instruct | paid
nvidia/llama-3.1-nemotron-70b-instruct:free | free
nvidia/llama-3.1-nemotron-nano-8b-v1:free | free
nvidia/llama-3.1-nemotron-ultra-253b-v1:free | free
nvidia/llama-3.3-nemotron-super-49b-v1:free | free
open-r1/olympiccoder-32b:free | free
open-r1/olympiccoder-7b:free | free
openai/chatgpt-4o-latest | paid
@@ -233,6 +236,9 @@ export interface IKBotOptions {
openai/gpt-4-turbo | paid
openai/gpt-4-1106-preview | paid
openai/gpt-4-turbo-preview | paid
openai/gpt-4.1 | paid
openai/gpt-4.1-mini | paid
openai/gpt-4.1-nano | paid
openai/gpt-4.5-preview | paid
openai/gpt-4o | paid
openai/gpt-4o-2024-05-13 | paid
@@ -249,10 +255,12 @@ export interface IKBotOptions {
openai/o1-preview | paid
openai/o1-preview-2024-09-12 | paid
openai/o1-pro | paid
openai/o3 | paid
openai/o3-mini | paid
openai/o3-mini-high | paid
openai/o4-mini | paid
openai/o4-mini-high | paid
openchat/openchat-7b | paid
openchat/openchat-7b:free | free
all-hands/openhands-lm-32b-v0.1 | paid
perplexity/llama-3.1-sonar-large-128k-online | paid
perplexity/llama-3.1-sonar-small-128k-online | paid
@@ -263,14 +271,13 @@ export interface IKBotOptions {
perplexity/sonar-reasoning | paid
perplexity/sonar-reasoning-pro | paid
pygmalionai/mythalion-13b | paid
openrouter/quasar-alpha | paid
qwen/qwen-2-72b-instruct | paid
qwen/qwen-vl-max | paid
qwen/qwen-vl-plus | paid
qwen/qwen-max | paid
qwen/qwen-plus | paid
qwen/qwen-turbo | paid
qwen/qwen2.5-32b-instruct | paid
qwen/qwen2.5-coder-7b-instruct | paid
qwen/qwen2.5-vl-32b-instruct | paid
qwen/qwen2.5-vl-32b-instruct:free | free
qwen/qwen2.5-vl-3b-instruct:free | free
@@ -299,13 +306,12 @@ export interface IKBotOptions {
sao10k/l3.1-70b-hanami-x1 | paid
sao10k/l3.1-euryale-70b | paid
sao10k/l3.3-euryale-70b | paid
shisa-ai/shisa-v2-llama3.3-70b:free | free
raifle/sorcererlm-8x22b | paid
steelskull/l3.3-electra-r1-70b | paid
tokyotech-llm/llama-3.1-swallow-70b-instruct-v0.3 | paid
thedrummer/anubis-pro-105b-v1 | paid
thedrummer/skyfall-36b-v2 | paid
undi95/toppy-m-7b | paid
undi95/toppy-m-7b:free | free
scb10x/llama3.1-typhoon2-70b-instruct | paid
scb10x/llama3.1-typhoon2-8b-instruct | paid
thedrummer/unslopnemo-12b | paid
@@ -313,6 +319,8 @@ export interface IKBotOptions {
microsoft/wizardlm-2-8x22b | paid
x-ai/grok-2-1212 | paid
x-ai/grok-2-vision-1212 | paid
x-ai/grok-3-beta | paid
x-ai/grok-3-mini-beta | paid
x-ai/grok-beta | paid
x-ai/grok-vision-beta | paid
xwin-lm/xwin-lm-70b | paid
@@ -337,6 +345,12 @@ export interface IKBotOptions {
gpt-4-turbo
gpt-4-turbo-2024-04-09
gpt-4-turbo-preview
gpt-4.1
gpt-4.1-2025-04-14
gpt-4.1-mini
gpt-4.1-mini-2025-04-14
gpt-4.1-nano
gpt-4.1-nano-2025-04-14
gpt-4.5-preview
gpt-4.5-preview-2025-02-27
gpt-4o
@@ -372,6 +386,8 @@ export interface IKBotOptions {
o1-pro-2025-03-19
o3-mini
o3-mini-2025-01-31
o4-mini
o4-mini-2025-04-16
omni-moderation-2024-09-26
omni-moderation-latest
text-embedding-3-large