maintenance love :)

This commit is contained in:
lovebird 2025-04-21 14:48:15 +02:00
parent 20693330dd
commit 59673bb1a2
19 changed files with 1021 additions and 315 deletions

View File

@ -1,34 +1,264 @@
# @plastichub/kbot
AI-powered command-line tool for code modifications and project management that supports multiple AI models and routers.
## Overview
```mermaid
graph TD
Start[Start Process] --> Setup[Setup Options]
Setup --> Messages[Complete Messages]
Messages --> Params[Construct Parameters]
Params --> Request[Execute Request]
Request --> Result[Process Result]
```
KBot is a powerful CLI tool that helps developers automate code modifications, handle project management tasks, and integrate with various AI models for intelligent code and content assistance.
## Quick Start
### Installation Steps
KBot requires Node.js to run. It's recommended to use Node.js version 18 or higher.
1. Visit the official [Node.js website](https://nodejs.org/)
2. Download the LTS (Long Term Support) version for your operating system
3. Follow the installation wizard
4. Verify installation by opening a terminal and running:
```bash
node --version
npm --version
```
### API Keys
KBot supports both OpenRouter and OpenAI APIs. You'll need at least one of these set up.
#### OpenRouter API (Recommended)
1. Visit [OpenRouter](https://openrouter.ai/)
2. Sign up for an account
3. Navigate to the API Keys section
4. Create a new API key
#### OpenAI API (Optional)
1. Go to [OpenAI's platform](https://platform.openai.com/)
2. Create an account or sign in
3. Navigate to API keys section
4. Create a new secret key
### Installation via npm
```bash
npm install -g @plastichub/kbot
```
## Configuration
### Architecture
The core engine wires together option processing, message aggregation, and request execution:
```mermaid
graph TD
CLI[CLI Interface] -->|Uses| Core[Core Engine]
Core -->|Manages| Options[Options Processor]
Core -->|Handles| Messages[Message Aggregator]
Core -->|Dispatches| Request[Request Executor]
```
### API Keys Setup
Create the configuration file at `$HOME/.osr/.config.json` (or set the `OSR_CONFIG` environment variable to the path of a `config.json`):
```json
{
  "openrouter": {
    "key": "your-openrouter-key"
  },
  "openai": {
    "key": "your-openai-key"
  },
  "email": {
    "newsletter": {
      "host": "host.org",
      "port": 465,
      "debug": true,
      "transactionLog": true,
      "auth": {
        "user": "foo@bar.com",
        "pass": "pass"
      }
    }
  },
  "google": {
    "cse": "custom search engine id",
    "api_key": "google custom search api key"
  },
  "serpapi": {
    "key": "your SerpAPI key (optional, used for web searches (Places, Google Maps))"
  },
  "deepseek": {
    "key": "your DeepSeek API key (optional)"
  }
}
```
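If you keep the file outside the default location, point kbot at it via `OSR_CONFIG`. A minimal shell sketch (the path and key value are placeholders):

```shell
# Write a minimal config to a custom location and export OSR_CONFIG.
# The key value is a placeholder - substitute your real OpenRouter key.
dir="$(mktemp -d)"
cat > "$dir/config.json" <<'EOF'
{
  "openrouter": { "key": "your-openrouter-key" }
}
EOF
export OSR_CONFIG="$dir/config.json"
echo "OSR_CONFIG=$OSR_CONFIG"
```

Exporting the variable in your shell profile makes the custom location persistent across sessions.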
### Preferences Setup
Optionally, create `.kbot/preferences.md` in your project directory to customize AI interactions:
```markdown
## My Preferences
Gender : male
Location : New York, USA (eg: `send me all saunas next to me`)
Language : English
Occupation : software developer, Typescript
Age : 30+
## Contacts
My email address : example@email.com (eg: `send me latest hacker news`)
My wife's email address ("Anne") : example@email.com (eg: `send email to my wife, with latest local news`)
## Content
When creating content:
- always use Markdown
- always add links
- when sending emails, always add 'Best regards, [Your Name]'
```
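The preferences file can be bootstrapped from the shell. A minimal sketch (the field values are examples, not required keys; run from your project root, a temp directory is used here so the sketch is side-effect free):

```shell
# Create .kbot/preferences.md in the current project directory.
cd "$(mktemp -d)"
mkdir -p .kbot
cat > .kbot/preferences.md <<'EOF'
## My Preferences
Language : English
Location : New York, USA
EOF
cat .kbot/preferences.md
```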
# Main Commands
The primary way to interact with `kbot` for processing tasks is by invoking it with a prompt and various options. While often used implicitly, this typically corresponds to the `run` command.
## Running Tasks
```bash
kbot run [options...] "Your prompt here..."
# or simply (if 'run' is the default):
kbot [options...] "Your prompt here..."
```
This command executes the main AI processing pipeline based on the provided prompt and options. Key aspects controlled by options include:
* **Input:** Specified via `--include` (files, directories, web URLs), `--path`.
* **Task:** Defined by the `--prompt`.
* **Behavior:** Controlled by `--mode` (e.g., `tools`, `completion`).
* **Output:** Directed using `--dst` or `--output`.
* **Model & API:** Configured with `--model`, `--router`, `--api_key`, etc.
Refer to [Parameters](./parameters.md) and [Modes](./modes.md) for detailed options.
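Putting these options together, a hypothetical invocation might look like the following (a sketch only: it assumes `kbot` is installed and an API key is configured, and the paths and prompt are illustrative):

```bash
# Refactor TypeScript sources in tools mode, writing results to ./out
kbot run --include "src/**/*.ts" \
  --mode tools \
  --model "openai/gpt-4o" \
  --prompt "Add missing JSDoc comments to exported functions" \
  --output ./out
```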
## Utility Commands
Other potential utility commands might include:
* `kbot fetch`: Fetch updated information, such as the latest available models.
* `kbot init`: Initialize a directory or project for use with `kbot` (e.g., create default config files).
* `kbot help-md`: Generate extended help documentation in Markdown format.
* `kbot examples`: Show example usage patterns.
*(Note: Availability and exact behavior of utility commands may vary.)*
# Command Line Parameters
This document describes the command line parameters available for `kbot`.
**Note:** Many parameters support environment variable substitution (e.g., `${VAR_NAME}`).
## Core Parameters
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `prompt` | The main instruction or question for the AI. Can be a string, a file path (e.g., `file:./my_prompt.md`), or an environment variable. | - | Yes (or implied by context) |
| `model` | AI model ID to use for processing (e.g., `openai/gpt-4o`). See available models via helper functions or router documentation. | Depends on router/config | No |
| `router` | The API provider to use. | `openrouter` | No |
| `mode` | The operational mode. See [Modes](./modes.md) for details. | `tools` | No |
## Input & File Selection
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `path` | Target directory for local file operations or context. | `.` | No |
| `include` | Specify input files or content. Accepts comma-separated glob patterns (e.g., `src/**/*.ts`), file paths, directory paths, or **web URLs** (e.g., `https://example.com/page`). | `[]` | No |
| `query` | JSONPath query to extract specific data from input objects (often used with structured input files). | `null` | No |
## Output & Formatting
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `output` | Output path for modified files (primarily for `tools` mode operations like refactoring). | - | No |
| `dst` | Destination path/filename for the main result (primarily for `completion` or `assistant` mode). Supports `${MODEL_NAME}` and `${ROUTER}` substitutions. | - | No |
| `format` | Defines the desired structure for the AI's output. Can be a Zod schema object, a Zod schema string, a JSON schema string, or a path to a JSON schema file (e.g., `file:./schema.json`). Ensures the output conforms to the specified structure. | - | No |
| `filters` | Post-processing filters applied to the output (primarily `completion` mode with `--dst`). Can be a comma-separated string of filter names (e.g., `unwrapMarkdown,trim`). | `''` | No |
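As a hedged sketch of how the output parameters combine (file names are illustrative; note the single quotes around the `--dst` value, which keep `${MODEL_NAME}` from being expanded by the shell so that kbot can substitute it):

```bash
kbot --mode completion \
  --include "README.md" \
  --prompt "Summarize this project in one paragraph" \
  --dst './summary-${MODEL_NAME}.md' \
  --filters "unwrapMarkdown,trim"
```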
## Tool Usage
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `tools` | Comma-separated list of tool names or paths to custom tool files to enable. | (List of default tools) | No |
| `disable` | Comma-separated list of tool *categories* to disable (e.g., `filesystem,git`). | `[]` | No |
| `disableTools` | Comma-separated list of specific tool *names* to disable. | `[]` | No |
## Iteration & Advanced Control
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `each` | Iterate the task over multiple items. Accepts a GLOB pattern, path to a JSON file (array), or comma-separated strings. The current item is available as the `${ITEM}` variable in other parameters (e.g., `--dst="${ITEM}-output.md"`). Can be used to test different models (e.g., `--each="openai/gpt-3.5-turbo,openai/gpt-4o"`). | - | No |
| `variables` | Define custom key-value variables for use in prompts or other parameters (e.g., `--variables.PROJECT_NAME=MyProject`). Access via `${variableName}`. | `{}` | No |
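A sketch combining both parameters (the file list, variable name, and prompt are illustrative; `${ITEM}` and `${PROJECT_NAME}` are single-quoted so the shell passes them through for kbot to substitute):

```bash
kbot --each "src/a.ts,src/b.ts" \
  --variables.PROJECT_NAME=MyProject \
  --dst './notes-${ITEM}.md' \
  --prompt 'Summarize ${ITEM} for the ${PROJECT_NAME} handbook'
```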
## Configuration & Authentication
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `api_key` | Explicit API key for the selected router. Overrides keys from config files. | - | No |
| `baseURL` | Custom base URL for the API endpoint (e.g., for local LLMs via Ollama). Set automatically for known routers or can be specified directly. | - | No |
| `config` | Path to a JSON configuration file containing API keys and potentially other settings. | - | No |
| `profile` | Path to a profile file (JSON or .env format) for loading environment-specific variables. | - | No |
| `env` | Specifies the environment section to use within the profile file. | `default` | No |
| `preferences` | Path to a preferences file (e.g., containing user details like location, email). Used to provide context to the AI. | (System-specific default, often `~/.kbot/Preferences`) | No |
## Debugging & Logging
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| `logLevel` | Logging verbosity level (e.g., 0=error, 4=debug). | `4` | No |
| `logs` | Directory to store log files and temporary outputs (like `params.json`). | `./logs` | No |
| `dry` | Perform a dry run: log parameters and configurations without executing the AI request. | `false` | No |
| `dump` | Path to generate a script file representing the current command invocation. | - | No |
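Before running a costly request, the flags above can be combined into a dry run that only logs the resolved configuration. A sketch (flag values as documented in the table; assumes `kbot` is installed):

```bash
# Log resolved parameters to ./logs without executing the AI request
kbot --dry \
  --logLevel 4 \
  --logs ./logs \
  --model "openai/gpt-4o" \
  --prompt "Test prompt"
```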
# Advanced Topics
This section covers more advanced usage patterns and concepts.
## Processing Multiple Items (`--each`)
Instead of relying on external scripting for batch processing, `kbot` provides the built-in `--each` parameter. This allows you to iterate a task over multiple inputs efficiently.
**How it Works:**
The `--each` parameter accepts:
* A comma-separated list of strings (e.g., `--each="file1.txt,file2.txt"`).
* A file path to a JSON file containing an array of strings.
* A GLOB pattern matching multiple files (e.g., `--each="./src/**/*.ts"`).
* A list of model IDs to test a prompt against different models (e.g., `--each="openai/gpt-4o,anthropic/claude-3.5-sonnet"`).
**Using the `${ITEM}` Variable:**
Within the loop initiated by `--each`, the current item being processed is available as the `${ITEM}` variable. You can use this variable in other parameters, such as `--dst`, `--include`, or within the `--prompt` itself.
**Example: Generating Documentation for Multiple Files**
```bash
kbot --each "./src/modules/*.ts" \
  --dst './docs/api/${ITEM}.md' \
  --prompt 'Generate API documentation in Markdown format for the module defined in ${ITEM}'
```
This command will:
1. Find all `.ts` files in `./src/modules/`.
2. For each file (e.g., `moduleA.ts`):
* Set `${ITEM}` to the file path (`./src/modules/moduleA.ts`).
* Execute `kbot` with the prompt, including the specific file via `${ITEM}`.
* Save the output to `./docs/api/./src/modules/moduleA.ts.md` (Note: path handling might vary).
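The iteration semantics can be mimicked in plain shell to see how `${ITEM}` expands (illustration only; kbot performs this loop internally in a single invocation):

```shell
# Each iteration binds ITEM and substitutes it into the dst and prompt templates.
for ITEM in ./src/modules/moduleA.ts ./src/modules/moduleB.ts; do
  echo "kbot --dst \"./docs/api/${ITEM}.md\" --prompt \"Document ${ITEM}\""
done
```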
Refer to the [Examples](./examples.md#iterating-with---each) for more use cases.
## Choosing a Transformation Method: `transform` vs. `createIterator`
When transforming data structures (often JSON) using LLMs, you have two primary approaches:
1. **`transform` Helper Function:**
* **Pros:** Simple, minimal setup, good for basic field transformations.
* **Cons:** Less control over network, caching, logging details.
* **Use Case:** Quickly applying straightforward transformations to data fields without needing deep customization.
2. **`createIterator` Factory:**
* **Pros:** Full control over network options (retries, concurrency), caching (namespace, expiration), logging, custom transformer logic, and callbacks (`onTransform`, `onTransformed`).
* **Cons:** More verbose setup required.
* **Use Case:** Complex transformations requiring fine-tuned control over the entire process, advanced caching strategies, or integration with custom logging/transformation logic.
Consult the [Iterator Documentation](./iterator.md) for detailed explanations and code examples of both methods.

View File

@ -1,5 +1,5 @@
{
"timestamp": 1744959640848,
"timestamp": 1745239682440,
"models": [
{
"id": "gpt-4o-audio-preview-2024-12-17",
@ -73,12 +73,6 @@
"created": 1692634615,
"owned_by": "system"
},
{
"id": "tts-1-hd-1106",
"object": "model",
"created": 1699053533,
"owned_by": "system"
},
{
"id": "gpt-4",
"object": "model",
@ -91,42 +85,12 @@
"created": 1671217299,
"owned_by": "openai-internal"
},
{
"id": "o1-2024-12-17",
"object": "model",
"created": 1734326976,
"owned_by": "system"
},
{
"id": "o1-pro-2025-03-19",
"object": "model",
"created": 1742251504,
"owned_by": "system"
},
{
"id": "o1",
"object": "model",
"created": 1734375816,
"owned_by": "system"
},
{
"id": "tts-1-hd",
"object": "model",
"created": 1699046015,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-audio-preview",
"object": "model",
"created": 1734387424,
"owned_by": "system"
},
{
"id": "o1-pro",
"object": "model",
"created": 1742251791,
"owned_by": "system"
},
{
"id": "gpt-4o-audio-preview",
"object": "model",
@ -139,6 +103,18 @@
"created": 1725648865,
"owned_by": "system"
},
{
"id": "o1-pro",
"object": "model",
"created": 1742251791,
"owned_by": "system"
},
{
"id": "o1-2024-12-17",
"object": "model",
"created": 1734326976,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-realtime-preview",
"object": "model",
@ -169,6 +145,12 @@
"created": 1741391161,
"owned_by": "system"
},
{
"id": "o1",
"object": "model",
"created": 1734375816,
"owned_by": "system"
},
{
"id": "gpt-4.1-mini-2025-04-14",
"object": "model",
@ -176,9 +158,9 @@
"owned_by": "system"
},
{
"id": "tts-1-1106",
"id": "o1-pro-2025-03-19",
"object": "model",
"created": 1699053241,
"created": 1742251504,
"owned_by": "system"
},
{
@ -229,12 +211,24 @@
"created": 1677610602,
"owned_by": "openai"
},
{
"id": "gpt-4-turbo-preview",
"object": "model",
"created": 1706037777,
"owned_by": "system"
},
{
"id": "gpt-4o-mini-search-preview-2025-03-11",
"object": "model",
"created": 1741390858,
"owned_by": "system"
},
{
"id": "gpt-4-0125-preview",
"object": "model",
"created": 1706037612,
"owned_by": "system"
},
{
"id": "gpt-4o-2024-11-20",
"object": "model",
@ -284,15 +278,15 @@
"owned_by": "system"
},
{
"id": "gpt-4-turbo-preview",
"id": "o3-mini",
"object": "model",
"created": 1706037777,
"created": 1737146383,
"owned_by": "system"
},
{
"id": "gpt-4-0125-preview",
"id": "o3-mini-2025-01-31",
"object": "model",
"created": 1706037612,
"created": 1738010200,
"owned_by": "system"
},
{
@ -307,12 +301,6 @@
"created": 1741388170,
"owned_by": "system"
},
{
"id": "tts-1",
"object": "model",
"created": 1681940951,
"owned_by": "openai-internal"
},
{
"id": "omni-moderation-2024-09-26",
"object": "model",
@ -331,6 +319,12 @@
"created": 1742403959,
"owned_by": "system"
},
{
"id": "tts-1-hd",
"object": "model",
"created": 1699046015,
"owned_by": "system"
},
{
"id": "gpt-4o",
"object": "model",
@ -338,15 +332,9 @@
"owned_by": "system"
},
{
"id": "o3-mini",
"id": "tts-1-hd-1106",
"object": "model",
"created": 1737146383,
"owned_by": "system"
},
{
"id": "o3-mini-2025-01-31",
"object": "model",
"created": 1738010200,
"created": 1699053533,
"owned_by": "system"
},
{
@ -415,12 +403,24 @@
"created": 1725648979,
"owned_by": "system"
},
{
"id": "tts-1",
"object": "model",
"created": 1681940951,
"owned_by": "openai-internal"
},
{
"id": "gpt-4-1106-preview",
"object": "model",
"created": 1698957206,
"owned_by": "system"
},
{
"id": "tts-1-1106",
"object": "model",
"created": 1699053241,
"owned_by": "system"
},
{
"id": "omni-moderation-latest",
"object": "model",

View File

@ -1,5 +1,5 @@
{
"timestamp": 1744959641066,
"timestamp": 1745239682608,
"models": [
{
"id": "google/gemini-2.5-pro-preview-03-25",
@ -2652,7 +2652,7 @@
"name": "Llama Guard 3 8B",
"created": 1739401318,
"description": "Llama Guard 3 is a Llama-3.1-8B pretrained model, fine-tuned for content safety classification. Similar to previous versions, it can be used to classify content in both LLM inputs (prompt classification) and in LLM responses (response classification). It acts as an LLM it generates text in its output that indicates whether a given prompt or response is safe or unsafe, and if unsafe, it also lists the content categories violated.\n\nLlama Guard 3 was aligned to safeguard against the MLCommons standardized hazards taxonomy and designed to support Llama 3.1 capabilities. Specifically, it provides content moderation in 8 languages, and was optimized to support safety and security for search and code interpreter tool calls.\n",
"context_length": 8192,
"context_length": 131072,
"architecture": {
"modality": "text->text",
"input_modalities": [
@ -2665,15 +2665,15 @@
"instruct_type": "none"
},
"pricing": {
"prompt": "0.0000002",
"completion": "0.0000002",
"prompt": "0.0000001",
"completion": "0.0000001",
"request": "0",
"image": "0",
"web_search": "0",
"internal_reasoning": "0"
},
"top_provider": {
"context_length": 8192,
"context_length": 131072,
"max_completion_tokens": null,
"is_moderated": false
},
@ -3776,38 +3776,6 @@
},
"per_request_limits": null
},
{
"id": "sao10k/l3.1-70b-hanami-x1",
"name": "Sao10K: Llama 3.1 70B Hanami x1",
"created": 1736302854,
"description": "This is [Sao10K](/sao10k)'s experiment over [Euryale v2.2](/sao10k/l3.1-euryale-70b).",
"context_length": 16000,
"architecture": {
"modality": "text->text",
"input_modalities": [
"text"
],
"output_modalities": [
"text"
],
"tokenizer": "Llama3",
"instruct_type": null
},
"pricing": {
"prompt": "0.000003",
"completion": "0.000003",
"request": "0",
"image": "0",
"web_search": "0",
"internal_reasoning": "0"
},
"top_provider": {
"context_length": 16000,
"max_completion_tokens": null,
"is_moderated": false
},
"per_request_limits": null
},
{
"id": "deepseek/deepseek-chat:free",
"name": "DeepSeek: DeepSeek V3 (free)",
@ -5606,7 +5574,7 @@
"name": "Meta: Llama 3.2 90B Vision Instruct",
"created": 1727222400,
"description": "The Llama 90B Vision model is a top-tier, 90-billion-parameter multimodal model designed for the most challenging visual reasoning and language tasks. It offers unparalleled accuracy in image captioning, visual question answering, and advanced image-text comprehension. Pre-trained on vast multimodal datasets and fine-tuned with human feedback, the Llama 90B Vision is engineered to handle the most demanding image-based AI tasks.\n\nThis model is perfect for industries requiring cutting-edge multimodal AI capabilities, particularly those dealing with complex, real-time visual and textual analysis.\n\nClick here for the [original model card](https://github.com/meta-llama/llama-models/blob/main/models/llama3_2/MODEL_CARD_VISION.md).\n\nUsage of this model is subject to [Meta's Acceptable Use Policy](https://www.llama.com/llama3/use-policy/).",
"context_length": 4096,
"context_length": 131072,
"architecture": {
"modality": "text+image->text",
"input_modalities": [
@ -5620,16 +5588,16 @@
"instruct_type": "llama3"
},
"pricing": {
"prompt": "0.0000008",
"completion": "0.0000016",
"prompt": "0.0000009",
"completion": "0.0000009",
"request": "0",
"image": "0.0051456",
"image": "0.001301",
"web_search": "0",
"internal_reasoning": "0"
},
"top_provider": {
"context_length": 4096,
"max_completion_tokens": 4096,
"context_length": 131072,
"max_completion_tokens": null,
"is_moderated": false
},
"per_request_limits": null
@ -6480,6 +6448,38 @@
},
"per_request_limits": null
},
{
"id": "meta-llama/llama-3.1-405b:free",
"name": "Meta: Llama 3.1 405B (base) (free)",
"created": 1722556800,
"description": "Meta's latest class of model (Llama 3.1) launched with a variety of sizes & flavors. This is the base 405B pre-trained version.\n\nIt has demonstrated strong performance compared to leading closed-source models in human evaluations.\n\nTo read more about the model release, [click here](https://ai.meta.com/blog/meta-llama-3/). Usage of this model is subject to [Meta's Acceptable Use Policy](https://llama.meta.com/llama3/use-policy/).",
"context_length": 64000,
"architecture": {
"modality": "text->text",
"input_modalities": [
"text"
],
"output_modalities": [
"text"
],
"tokenizer": "Llama3",
"instruct_type": "none"
},
"pricing": {
"prompt": "0",
"completion": "0",
"request": "0",
"image": "0",
"web_search": "0",
"internal_reasoning": "0"
},
"top_provider": {
"context_length": 64000,
"max_completion_tokens": null,
"is_moderated": false
},
"per_request_limits": null
},
{
"id": "meta-llama/llama-3.1-405b",
"name": "Meta: Llama 3.1 405B (base)",
@ -6645,7 +6645,7 @@
"name": "Meta: Llama 3.1 8B Instruct",
"created": 1721692800,
"description": "Meta's latest class of model (Llama 3.1) launched with a variety of sizes & flavors. This 8B instruct-tuned version is fast and efficient.\n\nIt has demonstrated strong performance compared to leading closed-source models in human evaluations.\n\nTo read more about the model release, [click here](https://ai.meta.com/blog/meta-llama-3-1/). Usage of this model is subject to [Meta's Acceptable Use Policy](https://llama.meta.com/llama3/use-policy/).",
"context_length": 131072,
"context_length": 16384,
"architecture": {
"modality": "text->text",
"input_modalities": [
@ -6659,15 +6659,15 @@
},
"pricing": {
"prompt": "0.00000002",
"completion": "0.000000045",
"completion": "0.00000003",
"request": "0",
"image": "0",
"web_search": "0",
"internal_reasoning": "0"
},
"top_provider": {
"context_length": 131072,
"max_completion_tokens": 8192,
"context_length": 16384,
"max_completion_tokens": 16384,
"is_moderated": false
},
"per_request_limits": null
@ -6950,8 +6950,8 @@
"instruct_type": "chatml"
},
"pricing": {
"prompt": "0.0000015",
"completion": "0.00000225",
"prompt": "0.000004",
"completion": "0.000006",
"request": "0",
"image": "0",
"web_search": "0",
@ -6959,7 +6959,7 @@
},
"top_provider": {
"context_length": 16384,
"max_completion_tokens": 1024,
"max_completion_tokens": 4096,
"is_moderated": false
},
"per_request_limits": null
@ -7468,8 +7468,8 @@
"instruct_type": "llama3"
},
"pricing": {
"prompt": "0.000003375",
"completion": "0.0000045",
"prompt": "0.000004",
"completion": "0.000006",
"request": "0",
"image": "0",
"web_search": "0",
@ -7477,7 +7477,7 @@
},
"top_provider": {
"context_length": 8192,
"max_completion_tokens": 2048,
"max_completion_tokens": 4096,
"is_moderated": false
},
"per_request_limits": null
@ -9202,38 +9202,6 @@
},
"per_request_limits": null
},
{
"id": "xwin-lm/xwin-lm-70b",
"name": "Xwin 70B",
"created": 1697328000,
"description": "Xwin-LM aims to develop and open-source alignment tech for LLMs. Our first release, built-upon on the [Llama2](/models/${Model.Llama_2_13B_Chat}) base models, ranked TOP-1 on AlpacaEval. Notably, it's the first to surpass [GPT-4](/models/${Model.GPT_4}) on this benchmark. The project will be continuously updated.",
"context_length": 8192,
"architecture": {
"modality": "text->text",
"input_modalities": [
"text"
],
"output_modalities": [
"text"
],
"tokenizer": "Llama2",
"instruct_type": "airoboros"
},
"pricing": {
"prompt": "0.00000375",
"completion": "0.00000375",
"request": "0",
"image": "0",
"web_search": "0",
"internal_reasoning": "0"
},
"top_provider": {
"context_length": 8192,
"max_completion_tokens": 512,
"is_moderated": false
},
"per_request_limits": null
},
{
"id": "openai/gpt-3.5-turbo-instruct",
"name": "OpenAI: GPT-3.5 Turbo Instruct",

View File

@ -11,22 +11,21 @@ export declare enum E_OPENAI_MODEL {
MODEL_GPT_4O_REALTIME_PREVIEW_2024_10_01 = "gpt-4o-realtime-preview-2024-10-01",
MODEL_GPT_4O_REALTIME_PREVIEW = "gpt-4o-realtime-preview",
MODEL_BABBAGE_002 = "babbage-002",
MODEL_GPT_4_TURBO_PREVIEW = "gpt-4-turbo-preview",
MODEL_TTS_1_HD_1106 = "tts-1-hd-1106",
MODEL_GPT_4_0125_PREVIEW = "gpt-4-0125-preview",
MODEL_GPT_4 = "gpt-4",
MODEL_TEXT_EMBEDDING_ADA_002 = "text-embedding-ada-002",
MODEL_TTS_1_HD = "tts-1-hd",
MODEL_GPT_4O_MINI_AUDIO_PREVIEW = "gpt-4o-mini-audio-preview",
MODEL_GPT_4O_AUDIO_PREVIEW = "gpt-4o-audio-preview",
MODEL_O1_PREVIEW_2024_09_12 = "o1-preview-2024-09-12",
MODEL_O1_PRO = "o1-pro",
MODEL_O1_2024_12_17 = "o1-2024-12-17",
MODEL_GPT_4O_MINI_REALTIME_PREVIEW = "gpt-4o-mini-realtime-preview",
MODEL_GPT_4_1_MINI = "gpt-4.1-mini",
MODEL_GPT_4O_MINI_REALTIME_PREVIEW_2024_12_17 = "gpt-4o-mini-realtime-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_INSTRUCT_0914 = "gpt-3.5-turbo-instruct-0914",
MODEL_GPT_4O_MINI_SEARCH_PREVIEW = "gpt-4o-mini-search-preview",
MODEL_O1 = "o1",
MODEL_GPT_4_1_MINI_2025_04_14 = "gpt-4.1-mini-2025-04-14",
MODEL_TTS_1_1106 = "tts-1-1106",
MODEL_O1_PRO_2025_03_19 = "o1-pro-2025-03-19",
MODEL_CHATGPT_4O_LATEST = "chatgpt-4o-latest",
MODEL_DAVINCI_002 = "davinci-002",
MODEL_GPT_3_5_TURBO_1106 = "gpt-3.5-turbo-1106",
@ -35,8 +34,9 @@ export declare enum E_OPENAI_MODEL {
MODEL_GPT_4O_REALTIME_PREVIEW_2024_12_17 = "gpt-4o-realtime-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_INSTRUCT = "gpt-3.5-turbo-instruct",
MODEL_GPT_3_5_TURBO = "gpt-3.5-turbo",
MODEL_GPT_4_1106_PREVIEW = "gpt-4-1106-preview",
MODEL_GPT_4_TURBO_PREVIEW = "gpt-4-turbo-preview",
MODEL_GPT_4O_MINI_SEARCH_PREVIEW_2025_03_11 = "gpt-4o-mini-search-preview-2025-03-11",
MODEL_GPT_4_0125_PREVIEW = "gpt-4-0125-preview",
MODEL_GPT_4O_2024_11_20 = "gpt-4o-2024-11-20",
MODEL_WHISPER_1 = "whisper-1",
MODEL_GPT_4O_2024_05_13 = "gpt-4o-2024-05-13",
@ -44,20 +44,17 @@ export declare enum E_OPENAI_MODEL {
MODEL_GPT_3_5_TURBO_16K = "gpt-3.5-turbo-16k",
MODEL_O1_PREVIEW = "o1-preview",
MODEL_GPT_4_0613 = "gpt-4-0613",
MODEL_O1_2024_12_17 = "o1-2024-12-17",
MODEL_O1 = "o1",
MODEL_O1_PRO = "o1-pro",
MODEL_O1_PRO_2025_03_19 = "o1-pro-2025-03-19",
MODEL_GPT_4_5_PREVIEW = "gpt-4.5-preview",
MODEL_O3_MINI = "o3-mini",
MODEL_O3_MINI_2025_01_31 = "o3-mini-2025-01-31",
MODEL_GPT_4_5_PREVIEW_2025_02_27 = "gpt-4.5-preview-2025-02-27",
MODEL_GPT_4O_SEARCH_PREVIEW_2025_03_11 = "gpt-4o-search-preview-2025-03-11",
MODEL_TTS_1 = "tts-1",
MODEL_OMNI_MODERATION_2024_09_26 = "omni-moderation-2024-09-26",
MODEL_TEXT_EMBEDDING_3_SMALL = "text-embedding-3-small",
MODEL_GPT_4O_MINI_TTS = "gpt-4o-mini-tts",
MODEL_TTS_1_HD = "tts-1-hd",
MODEL_GPT_4O = "gpt-4o",
MODEL_O3_MINI = "o3-mini",
MODEL_O3_MINI_2025_01_31 = "o3-mini-2025-01-31",
MODEL_TTS_1_HD_1106 = "tts-1-hd-1106",
MODEL_GPT_4O_MINI = "gpt-4o-mini",
MODEL_GPT_4O_2024_08_06 = "gpt-4o-2024-08-06",
MODEL_GPT_4_1 = "gpt-4.1",
@ -69,5 +66,8 @@ export declare enum E_OPENAI_MODEL {
MODEL_GPT_4O_MINI_AUDIO_PREVIEW_2024_12_17 = "gpt-4o-mini-audio-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_0125 = "gpt-3.5-turbo-0125",
MODEL_O1_MINI_2024_09_12 = "o1-mini-2024-09-12",
MODEL_TTS_1 = "tts-1",
MODEL_GPT_4_1106_PREVIEW = "gpt-4-1106-preview",
MODEL_TTS_1_1106 = "tts-1-1106",
MODEL_OMNI_MODERATION_LATEST = "omni-moderation-latest"
}

View File

@ -12,22 +12,21 @@ export var E_OPENAI_MODEL;
E_OPENAI_MODEL["MODEL_GPT_4O_REALTIME_PREVIEW_2024_10_01"] = "gpt-4o-realtime-preview-2024-10-01";
E_OPENAI_MODEL["MODEL_GPT_4O_REALTIME_PREVIEW"] = "gpt-4o-realtime-preview";
E_OPENAI_MODEL["MODEL_BABBAGE_002"] = "babbage-002";
E_OPENAI_MODEL["MODEL_GPT_4_TURBO_PREVIEW"] = "gpt-4-turbo-preview";
E_OPENAI_MODEL["MODEL_TTS_1_HD_1106"] = "tts-1-hd-1106";
E_OPENAI_MODEL["MODEL_GPT_4_0125_PREVIEW"] = "gpt-4-0125-preview";
E_OPENAI_MODEL["MODEL_GPT_4"] = "gpt-4";
E_OPENAI_MODEL["MODEL_TEXT_EMBEDDING_ADA_002"] = "text-embedding-ada-002";
E_OPENAI_MODEL["MODEL_TTS_1_HD"] = "tts-1-hd";
E_OPENAI_MODEL["MODEL_GPT_4O_MINI_AUDIO_PREVIEW"] = "gpt-4o-mini-audio-preview";
E_OPENAI_MODEL["MODEL_GPT_4O_AUDIO_PREVIEW"] = "gpt-4o-audio-preview";
E_OPENAI_MODEL["MODEL_O1_PREVIEW_2024_09_12"] = "o1-preview-2024-09-12";
E_OPENAI_MODEL["MODEL_O1_PRO"] = "o1-pro";
E_OPENAI_MODEL["MODEL_O1_2024_12_17"] = "o1-2024-12-17";
E_OPENAI_MODEL["MODEL_GPT_4O_MINI_REALTIME_PREVIEW"] = "gpt-4o-mini-realtime-preview";
E_OPENAI_MODEL["MODEL_GPT_4_1_MINI"] = "gpt-4.1-mini";
E_OPENAI_MODEL["MODEL_GPT_4O_MINI_REALTIME_PREVIEW_2024_12_17"] = "gpt-4o-mini-realtime-preview-2024-12-17";
E_OPENAI_MODEL["MODEL_GPT_3_5_TURBO_INSTRUCT_0914"] = "gpt-3.5-turbo-instruct-0914";
E_OPENAI_MODEL["MODEL_GPT_4O_MINI_SEARCH_PREVIEW"] = "gpt-4o-mini-search-preview";
E_OPENAI_MODEL["MODEL_O1"] = "o1";
E_OPENAI_MODEL["MODEL_GPT_4_1_MINI_2025_04_14"] = "gpt-4.1-mini-2025-04-14";
E_OPENAI_MODEL["MODEL_TTS_1_1106"] = "tts-1-1106";
E_OPENAI_MODEL["MODEL_O1_PRO_2025_03_19"] = "o1-pro-2025-03-19";
E_OPENAI_MODEL["MODEL_CHATGPT_4O_LATEST"] = "chatgpt-4o-latest";
E_OPENAI_MODEL["MODEL_DAVINCI_002"] = "davinci-002";
E_OPENAI_MODEL["MODEL_GPT_3_5_TURBO_1106"] = "gpt-3.5-turbo-1106";
@ -36,8 +35,9 @@ export var E_OPENAI_MODEL;
E_OPENAI_MODEL["MODEL_GPT_4O_REALTIME_PREVIEW_2024_12_17"] = "gpt-4o-realtime-preview-2024-12-17";
E_OPENAI_MODEL["MODEL_GPT_3_5_TURBO_INSTRUCT"] = "gpt-3.5-turbo-instruct";
E_OPENAI_MODEL["MODEL_GPT_3_5_TURBO"] = "gpt-3.5-turbo";
E_OPENAI_MODEL["MODEL_GPT_4_1106_PREVIEW"] = "gpt-4-1106-preview";
E_OPENAI_MODEL["MODEL_GPT_4_TURBO_PREVIEW"] = "gpt-4-turbo-preview";
E_OPENAI_MODEL["MODEL_GPT_4O_MINI_SEARCH_PREVIEW_2025_03_11"] = "gpt-4o-mini-search-preview-2025-03-11";
E_OPENAI_MODEL["MODEL_GPT_4_0125_PREVIEW"] = "gpt-4-0125-preview";
E_OPENAI_MODEL["MODEL_GPT_4O_2024_11_20"] = "gpt-4o-2024-11-20";
E_OPENAI_MODEL["MODEL_WHISPER_1"] = "whisper-1";
E_OPENAI_MODEL["MODEL_GPT_4O_2024_05_13"] = "gpt-4o-2024-05-13";
@ -45,20 +45,17 @@ export var E_OPENAI_MODEL;
E_OPENAI_MODEL["MODEL_GPT_3_5_TURBO_16K"] = "gpt-3.5-turbo-16k";
E_OPENAI_MODEL["MODEL_O1_PREVIEW"] = "o1-preview";
E_OPENAI_MODEL["MODEL_GPT_4_0613"] = "gpt-4-0613";
E_OPENAI_MODEL["MODEL_O1_2024_12_17"] = "o1-2024-12-17";
E_OPENAI_MODEL["MODEL_O1"] = "o1";
E_OPENAI_MODEL["MODEL_O1_PRO"] = "o1-pro";
E_OPENAI_MODEL["MODEL_O1_PRO_2025_03_19"] = "o1-pro-2025-03-19";
E_OPENAI_MODEL["MODEL_GPT_4_5_PREVIEW"] = "gpt-4.5-preview";
E_OPENAI_MODEL["MODEL_O3_MINI"] = "o3-mini";
E_OPENAI_MODEL["MODEL_O3_MINI_2025_01_31"] = "o3-mini-2025-01-31";
E_OPENAI_MODEL["MODEL_GPT_4_5_PREVIEW_2025_02_27"] = "gpt-4.5-preview-2025-02-27";
E_OPENAI_MODEL["MODEL_GPT_4O_SEARCH_PREVIEW_2025_03_11"] = "gpt-4o-search-preview-2025-03-11";
E_OPENAI_MODEL["MODEL_TTS_1"] = "tts-1";
E_OPENAI_MODEL["MODEL_OMNI_MODERATION_2024_09_26"] = "omni-moderation-2024-09-26";
E_OPENAI_MODEL["MODEL_TEXT_EMBEDDING_3_SMALL"] = "text-embedding-3-small";
E_OPENAI_MODEL["MODEL_GPT_4O_MINI_TTS"] = "gpt-4o-mini-tts";
E_OPENAI_MODEL["MODEL_TTS_1_HD"] = "tts-1-hd";
E_OPENAI_MODEL["MODEL_GPT_4O"] = "gpt-4o";
E_OPENAI_MODEL["MODEL_O3_MINI"] = "o3-mini";
E_OPENAI_MODEL["MODEL_O3_MINI_2025_01_31"] = "o3-mini-2025-01-31";
E_OPENAI_MODEL["MODEL_TTS_1_HD_1106"] = "tts-1-hd-1106";
E_OPENAI_MODEL["MODEL_GPT_4O_MINI"] = "gpt-4o-mini";
E_OPENAI_MODEL["MODEL_GPT_4O_2024_08_06"] = "gpt-4o-2024-08-06";
E_OPENAI_MODEL["MODEL_GPT_4_1"] = "gpt-4.1";
@ -70,6 +67,9 @@ export var E_OPENAI_MODEL;
E_OPENAI_MODEL["MODEL_GPT_4O_MINI_AUDIO_PREVIEW_2024_12_17"] = "gpt-4o-mini-audio-preview-2024-12-17";
E_OPENAI_MODEL["MODEL_GPT_3_5_TURBO_0125"] = "gpt-3.5-turbo-0125";
E_OPENAI_MODEL["MODEL_O1_MINI_2024_09_12"] = "o1-mini-2024-09-12";
E_OPENAI_MODEL["MODEL_TTS_1"] = "tts-1";
E_OPENAI_MODEL["MODEL_GPT_4_1106_PREVIEW"] = "gpt-4-1106-preview";
E_OPENAI_MODEL["MODEL_TTS_1_1106"] = "tts-1-1106";
E_OPENAI_MODEL["MODEL_OMNI_MODERATION_LATEST"] = "omni-moderation-latest";
})(E_OPENAI_MODEL || (E_OPENAI_MODEL = {}));
//# sourceMappingURL=data:application/json;base64,eyJ2ZXJzaW9uIjozLCJmaWxlIjoib3BlbmFpLW1vZGVscy5qcyIsInNvdXJjZVJvb3QiOiIiLCJzb3VyY2VzIjpbIi4uLy4uLy4uL3NyYy9tb2RlbHMvY2FjaGUvb3BlbmFpLW1vZGVscy50cyJdLCJuYW1lcyI6W10sIm1hcHBpbmdzIjoiQUFBQSxNQUFNLENBQU4sSUFBWSxjQXdFWDtBQXhFRCxXQUFZLGNBQWM7SUFDeEIsMkZBQXlFLENBQUE7SUFDekUsNkNBQTJCLENBQUE7SUFDM0IseUVBQXVELENBQUE7SUFDdkQsNkNBQTJCLENBQUE7SUFDM0IsaUVBQStDLENBQUE7SUFDL0MsMkZBQXlFLENBQUE7SUFDekUsMkNBQXlCLENBQUE7SUFDekIscURBQW1DLENBQUE7SUFDbkMsMkVBQXlELENBQUE7SUFDekQsaUdBQStFLENBQUE7SUFDL0UsMkVBQXlELENBQUE7SUFDekQsbURBQWlDLENBQUE7SUFDakMsbUVBQWlELENBQUE7SUFDakQsdURBQXFDLENBQUE7SUFDckMsaUVBQStDLENBQUE7SUFDL0MsdUNBQXFCLENBQUE7SUFDckIseUVBQXVELENBQUE7SUFDdkQsNkNBQTJCLENBQUE7SUFDM0IsK0VBQTZELENBQUE7SUFDN0QscUVBQW1ELENBQUE7SUFDbkQsdUVBQXFELENBQUE7SUFDckQscUZBQW1FLENBQUE7SUFDbkUscURBQW1DLENBQUE7SUFDbkMsMkdBQXlGLENBQUE7SUFDekYsbUZBQWlFLENBQUE7SUFDakUsaUZBQStELENBQUE7SUFDL0QsMkVBQXlELENBQUE7SUFDekQsaURBQStCLENBQUE7SUFDL0IsK0RBQTZDLENBQUE7SUFDN0MsbURBQWlDLENBQUE7SUFDakMsaUVBQStDLENBQUE7SUFDL0MsdUVBQXFELENBQUE7SUFDckQsbURBQWlDLENBQUE7SUFDakMsaUdBQStFLENBQUE7SUFDL0UseUVBQXVELENBQUE7SUFDdkQsdURBQXFDLENBQUE7SUFDckMsaUVBQStDLENBQUE7SUFDL0MsdUdBQXFGLENBQUE7SUFDckYsK0RBQTZDLENBQUE7SUFDN0MsK0NBQTZCLENBQUE7SUFDN0IsK0RBQTZDLENBQUE7SUFDN0MseUVBQXVELENBQUE7SUFDdkQsK0RBQTZDLENBQUE7SUFDN0MsaURBQStCLENBQUE7SUFDL0IsaURBQStCLENBQUE7SUFDL0IsdURBQXFDLENBQUE7SUFDckMsaUNBQWUsQ0FBQTtJQUNmLHlDQUF1QixDQUFBO0lBQ3ZCLCtEQUE2QyxDQUFBO0lBQzdDLDJEQUF5QyxDQUFBO0lBQ3pDLGlGQUErRCxDQUFBO0lBQy9ELDZGQUEyRSxDQUFBO0lBQzNFLHVDQUFxQixDQUFBO0lBQ3JCLGlGQUErRCxDQUFBO0lBQy9ELHlFQUF1RCxDQUFBO0lBQ3ZELDJEQUF5QyxDQUFBO0lBQ3pDLHlDQUF1QixDQUFBO0lBQ3ZCLDJDQUF5QixDQUFBO0lBQ3pCLGlFQUErQyxDQUFBO0lBQy9DLG1EQUFpQyxDQUFBO0lBQ2pDLCtEQUE2QyxDQUFBO0lBQzdDLDJDQUF5QixDQUFBO0lBQ3pCLCtEQUE2QyxDQUFBO0lBQzdDLGlFQUErQyxDQUFBO0lBQy9DLHlFQUF1RCxDQUFBO0lBQ3ZELHlFQUF1RCxDQUFBO0lBQ3ZELDJDQUF5QixDQUFBO0lBQ3pCLHFHQUFtRixDQUFBO0lBQ25GLGlFQUErQyxDQUFBO0lBQy9DLGlFQUErQyxDQUFBO0lBQy9DLHlFQUF1RCxDQUFBO0FBQ3pELENBQU
MsRUF4RVcsY0FBYyxLQUFkLGNBQWMsUUF3RXpCIn0=
//# sourceMappingURL=data:application/json;base64,eyJ2ZXJzaW9uIjozLCJmaWxlIjoib3BlbmFpLW1vZGVscy5qcyIsInNvdXJjZVJvb3QiOiIiLCJzb3VyY2VzIjpbIi4uLy4uLy4uL3NyYy9tb2RlbHMvY2FjaGUvb3BlbmFpLW1vZGVscy50cyJdLCJuYW1lcyI6W10sIm1hcHBpbmdzIjoiQUFBQSxNQUFNLENBQU4sSUFBWSxjQXdFWDtBQXhFRCxXQUFZLGNBQWM7SUFDeEIsMkZBQXlFLENBQUE7SUFDekUsNkNBQTJCLENBQUE7SUFDM0IseUVBQXVELENBQUE7SUFDdkQsNkNBQTJCLENBQUE7SUFDM0IsaUVBQStDLENBQUE7SUFDL0MsMkZBQXlFLENBQUE7SUFDekUsMkNBQXlCLENBQUE7SUFDekIscURBQW1DLENBQUE7SUFDbkMsMkVBQXlELENBQUE7SUFDekQsaUdBQStFLENBQUE7SUFDL0UsMkVBQXlELENBQUE7SUFDekQsbURBQWlDLENBQUE7SUFDakMsdUNBQXFCLENBQUE7SUFDckIseUVBQXVELENBQUE7SUFDdkQsK0VBQTZELENBQUE7SUFDN0QscUVBQW1ELENBQUE7SUFDbkQsdUVBQXFELENBQUE7SUFDckQseUNBQXVCLENBQUE7SUFDdkIsdURBQXFDLENBQUE7SUFDckMscUZBQW1FLENBQUE7SUFDbkUscURBQW1DLENBQUE7SUFDbkMsMkdBQXlGLENBQUE7SUFDekYsbUZBQWlFLENBQUE7SUFDakUsaUZBQStELENBQUE7SUFDL0QsaUNBQWUsQ0FBQTtJQUNmLDJFQUF5RCxDQUFBO0lBQ3pELCtEQUE2QyxDQUFBO0lBQzdDLCtEQUE2QyxDQUFBO0lBQzdDLG1EQUFpQyxDQUFBO0lBQ2pDLGlFQUErQyxDQUFBO0lBQy9DLHVFQUFxRCxDQUFBO0lBQ3JELG1EQUFpQyxDQUFBO0lBQ2pDLGlHQUErRSxDQUFBO0lBQy9FLHlFQUF1RCxDQUFBO0lBQ3ZELHVEQUFxQyxDQUFBO0lBQ3JDLG1FQUFpRCxDQUFBO0lBQ2pELHVHQUFxRixDQUFBO0lBQ3JGLGlFQUErQyxDQUFBO0lBQy9DLCtEQUE2QyxDQUFBO0lBQzdDLCtDQUE2QixDQUFBO0lBQzdCLCtEQUE2QyxDQUFBO0lBQzdDLHlFQUF1RCxDQUFBO0lBQ3ZELCtEQUE2QyxDQUFBO0lBQzdDLGlEQUErQixDQUFBO0lBQy9CLGlEQUErQixDQUFBO0lBQy9CLDJEQUF5QyxDQUFBO0lBQ3pDLDJDQUF5QixDQUFBO0lBQ3pCLGlFQUErQyxDQUFBO0lBQy9DLGlGQUErRCxDQUFBO0lBQy9ELDZGQUEyRSxDQUFBO0lBQzNFLGlGQUErRCxDQUFBO0lBQy9ELHlFQUF1RCxDQUFBO0lBQ3ZELDJEQUF5QyxDQUFBO0lBQ3pDLDZDQUEyQixDQUFBO0lBQzNCLHlDQUF1QixDQUFBO0lBQ3ZCLHVEQUFxQyxDQUFBO0lBQ3JDLG1EQUFpQyxDQUFBO0lBQ2pDLCtEQUE2QyxDQUFBO0lBQzdDLDJDQUF5QixDQUFBO0lBQ3pCLCtEQUE2QyxDQUFBO0lBQzdDLGlFQUErQyxDQUFBO0lBQy9DLHlFQUF1RCxDQUFBO0lBQ3ZELHlFQUF1RCxDQUFBO0lBQ3ZELDJDQUF5QixDQUFBO0lBQ3pCLHFHQUFtRixDQUFBO0lBQ25GLGlFQUErQyxDQUFBO0lBQy9DLGlFQUErQyxDQUFBO0lBQy9DLHVDQUFxQixDQUFBO0lBQ3JCLGlFQUErQyxDQUFBO0lBQy9DLGlEQUErQixDQUFBO0lBQy9CLHlFQUF1RCxDQUFBO0FBQ3pELENBQU
MsRUF4RVcsY0FBYyxLQUFkLGNBQWMsUUF3RXpCIn0=


@@ -1,4 +1,6 @@
export declare enum E_OPENROUTER_MODEL_FREE {
MODEL_FREE_THUDM_GLM_Z1_32B_FREE = "thudm/glm-z1-32b:free",
MODEL_FREE_THUDM_GLM_4_32B_FREE = "thudm/glm-4-32b:free",
MODEL_FREE_SHISA_AI_SHISA_V2_LLAMA3_3_70B_FREE = "shisa-ai/shisa-v2-llama3.3-70b:free",
MODEL_FREE_ARLIAI_QWQ_32B_ARLIAI_RPR_V1_FREE = "arliai/qwq-32b-arliai-rpr-v1:free",
MODEL_FREE_AGENTICA_ORG_DEEPCODER_14B_PREVIEW_FREE = "agentica-org/deepcoder-14b-preview:free",
@@ -53,6 +55,7 @@ export declare enum E_OPENROUTER_MODEL_FREE {
MODEL_FREE_QWEN_QWEN_2_5_72B_INSTRUCT_FREE = "qwen/qwen-2.5-72b-instruct:free",
MODEL_FREE_QWEN_QWEN_2_5_VL_7B_INSTRUCT_FREE = "qwen/qwen-2.5-vl-7b-instruct:free",
MODEL_FREE_GOOGLE_GEMINI_FLASH_1_5_8B_EXP = "google/gemini-flash-1.5-8b-exp",
MODEL_FREE_META_LLAMA_LLAMA_3_1_405B_FREE = "meta-llama/llama-3.1-405b:free",
MODEL_FREE_META_LLAMA_LLAMA_3_1_8B_INSTRUCT_FREE = "meta-llama/llama-3.1-8b-instruct:free",
MODEL_FREE_MISTRALAI_MISTRAL_NEMO_FREE = "mistralai/mistral-nemo:free",
MODEL_FREE_GOOGLE_GEMMA_2_9B_IT_FREE = "google/gemma-2-9b-it:free",


@@ -1,5 +1,7 @@
export var E_OPENROUTER_MODEL_FREE;
(function (E_OPENROUTER_MODEL_FREE) {
E_OPENROUTER_MODEL_FREE["MODEL_FREE_THUDM_GLM_Z1_32B_FREE"] = "thudm/glm-z1-32b:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_THUDM_GLM_4_32B_FREE"] = "thudm/glm-4-32b:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_SHISA_AI_SHISA_V2_LLAMA3_3_70B_FREE"] = "shisa-ai/shisa-v2-llama3.3-70b:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_ARLIAI_QWQ_32B_ARLIAI_RPR_V1_FREE"] = "arliai/qwq-32b-arliai-rpr-v1:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_AGENTICA_ORG_DEEPCODER_14B_PREVIEW_FREE"] = "agentica-org/deepcoder-14b-preview:free";
@@ -54,10 +56,11 @@ export var E_OPENROUTER_MODEL_FREE;
E_OPENROUTER_MODEL_FREE["MODEL_FREE_QWEN_QWEN_2_5_72B_INSTRUCT_FREE"] = "qwen/qwen-2.5-72b-instruct:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_QWEN_QWEN_2_5_VL_7B_INSTRUCT_FREE"] = "qwen/qwen-2.5-vl-7b-instruct:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_GOOGLE_GEMINI_FLASH_1_5_8B_EXP"] = "google/gemini-flash-1.5-8b-exp";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_META_LLAMA_LLAMA_3_1_405B_FREE"] = "meta-llama/llama-3.1-405b:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_META_LLAMA_LLAMA_3_1_8B_INSTRUCT_FREE"] = "meta-llama/llama-3.1-8b-instruct:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_MISTRALAI_MISTRAL_NEMO_FREE"] = "mistralai/mistral-nemo:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_GOOGLE_GEMMA_2_9B_IT_FREE"] = "google/gemma-2-9b-it:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_MISTRALAI_MISTRAL_7B_INSTRUCT_FREE"] = "mistralai/mistral-7b-instruct:free";
E_OPENROUTER_MODEL_FREE["MODEL_FREE_HUGGINGFACEH4_ZEPHYR_7B_BETA_FREE"] = "huggingfaceh4/zephyr-7b-beta:free";
})(E_OPENROUTER_MODEL_FREE || (E_OPENROUTER_MODEL_FREE = {}));


@@ -1,4 +1,9 @@
export declare enum E_OPENROUTER_MODEL {
MODEL_GOOGLE_GEMINI_2_5_PRO_PREVIEW_03_25 = "google/gemini-2.5-pro-preview-03-25",
MODEL_THUDM_GLM_Z1_32B_FREE = "thudm/glm-z1-32b:free",
MODEL_THUDM_GLM_4_32B_FREE = "thudm/glm-4-32b:free",
MODEL_GOOGLE_GEMINI_2_5_FLASH_PREVIEW = "google/gemini-2.5-flash-preview",
MODEL_GOOGLE_GEMINI_2_5_FLASH_PREVIEW_THINKING = "google/gemini-2.5-flash-preview:thinking",
MODEL_OPENAI_O4_MINI_HIGH = "openai/o4-mini-high",
MODEL_OPENAI_O3 = "openai/o3",
MODEL_OPENAI_O4_MINI = "openai/o4-mini",
@@ -21,7 +26,6 @@ export declare enum E_OPENROUTER_MODEL {
MODEL_META_LLAMA_LLAMA_4_MAVERICK = "meta-llama/llama-4-maverick",
MODEL_META_LLAMA_LLAMA_4_SCOUT_FREE = "meta-llama/llama-4-scout:free",
MODEL_META_LLAMA_LLAMA_4_SCOUT = "meta-llama/llama-4-scout",
MODEL_GOOGLE_GEMINI_2_5_PRO_PREVIEW_03_25 = "google/gemini-2.5-pro-preview-03-25",
MODEL_ALL_HANDS_OPENHANDS_LM_32B_V0_1 = "all-hands/openhands-lm-32b-v0.1",
MODEL_MISTRAL_MINISTRAL_8B = "mistral/ministral-8b",
MODEL_DEEPSEEK_DEEPSEEK_V3_BASE_FREE = "deepseek/deepseek-v3-base:free",
@@ -111,7 +115,6 @@ export declare enum E_OPENROUTER_MODEL {
MODEL_MINIMAX_MINIMAX_01 = "minimax/minimax-01",
MODEL_MISTRALAI_CODESTRAL_2501 = "mistralai/codestral-2501",
MODEL_MICROSOFT_PHI_4 = "microsoft/phi-4",
MODEL_SAO10K_L3_1_70B_HANAMI_X1 = "sao10k/l3.1-70b-hanami-x1",
MODEL_DEEPSEEK_DEEPSEEK_CHAT_FREE = "deepseek/deepseek-chat:free",
MODEL_DEEPSEEK_DEEPSEEK_CHAT = "deepseek/deepseek-chat",
MODEL_GOOGLE_GEMINI_2_0_FLASH_THINKING_EXP_1219_FREE = "google/gemini-2.0-flash-thinking-exp-1219:free",
@@ -194,6 +197,7 @@ export declare enum E_OPENROUTER_MODEL {
MODEL_SAO10K_L3_LUNARIS_8B = "sao10k/l3-lunaris-8b",
MODEL_AETHERWIING_MN_STARCANNON_12B = "aetherwiing/mn-starcannon-12b",
MODEL_OPENAI_GPT_4O_2024_08_06 = "openai/gpt-4o-2024-08-06",
MODEL_META_LLAMA_LLAMA_3_1_405B_FREE = "meta-llama/llama-3.1-405b:free",
MODEL_META_LLAMA_LLAMA_3_1_405B = "meta-llama/llama-3.1-405b",
MODEL_NOTHINGIISREAL_MN_CELESTE_12B = "nothingiisreal/mn-celeste-12b",
MODEL_PERPLEXITY_LLAMA_3_1_SONAR_SMALL_128K_ONLINE = "perplexity/llama-3.1-sonar-small-128k-online",
@@ -278,7 +282,6 @@ export declare enum E_OPENROUTER_MODEL {
MODEL_GOOGLE_PALM_2_CHAT_BISON_32K = "google/palm-2-chat-bison-32k",
MODEL_GOOGLE_PALM_2_CODECHAT_BISON_32K = "google/palm-2-codechat-bison-32k",
MODEL_JONDURBIN_AIROBOROS_L2_70B = "jondurbin/airoboros-l2-70b",
MODEL_XWIN_LM_XWIN_LM_70B = "xwin-lm/xwin-lm-70b",
MODEL_OPENAI_GPT_3_5_TURBO_INSTRUCT = "openai/gpt-3.5-turbo-instruct",
MODEL_MISTRALAI_MISTRAL_7B_INSTRUCT_V0_1 = "mistralai/mistral-7b-instruct-v0.1",
MODEL_PYGMALIONAI_MYTHALION_13B = "pygmalionai/mythalion-13b",

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -1,12 +1,12 @@
{
"name": "@plastichub/kbot",
"version": "1.1.26",
"version": "1.1.27",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "@plastichub/kbot",
"version": "1.1.26",
"version": "1.1.27",
"license": "ISC",
"dependencies": {
"node-emoji": "^2.2.0"


@@ -1,6 +1,6 @@
{
"name": "@plastichub/kbot",
"version": "1.1.26",
"version": "1.1.27",
"main": "main_node.js",
"author": "",
"license": "ISC",


@@ -11,22 +11,21 @@ export enum E_OPENAI_MODEL {
MODEL_GPT_4O_REALTIME_PREVIEW_2024_10_01 = "gpt-4o-realtime-preview-2024-10-01",
MODEL_GPT_4O_REALTIME_PREVIEW = "gpt-4o-realtime-preview",
MODEL_BABBAGE_002 = "babbage-002",
MODEL_GPT_4_TURBO_PREVIEW = "gpt-4-turbo-preview",
MODEL_TTS_1_HD_1106 = "tts-1-hd-1106",
MODEL_GPT_4_0125_PREVIEW = "gpt-4-0125-preview",
MODEL_GPT_4 = "gpt-4",
MODEL_TEXT_EMBEDDING_ADA_002 = "text-embedding-ada-002",
MODEL_TTS_1_HD = "tts-1-hd",
MODEL_GPT_4O_MINI_AUDIO_PREVIEW = "gpt-4o-mini-audio-preview",
MODEL_GPT_4O_AUDIO_PREVIEW = "gpt-4o-audio-preview",
MODEL_O1_PREVIEW_2024_09_12 = "o1-preview-2024-09-12",
MODEL_O1_PRO = "o1-pro",
MODEL_O1_2024_12_17 = "o1-2024-12-17",
MODEL_GPT_4O_MINI_REALTIME_PREVIEW = "gpt-4o-mini-realtime-preview",
MODEL_GPT_4_1_MINI = "gpt-4.1-mini",
MODEL_GPT_4O_MINI_REALTIME_PREVIEW_2024_12_17 = "gpt-4o-mini-realtime-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_INSTRUCT_0914 = "gpt-3.5-turbo-instruct-0914",
MODEL_GPT_4O_MINI_SEARCH_PREVIEW = "gpt-4o-mini-search-preview",
MODEL_O1 = "o1",
MODEL_GPT_4_1_MINI_2025_04_14 = "gpt-4.1-mini-2025-04-14",
MODEL_TTS_1_1106 = "tts-1-1106",
MODEL_O1_PRO_2025_03_19 = "o1-pro-2025-03-19",
MODEL_CHATGPT_4O_LATEST = "chatgpt-4o-latest",
MODEL_DAVINCI_002 = "davinci-002",
MODEL_GPT_3_5_TURBO_1106 = "gpt-3.5-turbo-1106",
@@ -35,8 +34,9 @@ export enum E_OPENAI_MODEL {
MODEL_GPT_4O_REALTIME_PREVIEW_2024_12_17 = "gpt-4o-realtime-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_INSTRUCT = "gpt-3.5-turbo-instruct",
MODEL_GPT_3_5_TURBO = "gpt-3.5-turbo",
MODEL_GPT_4_1106_PREVIEW = "gpt-4-1106-preview",
MODEL_GPT_4_TURBO_PREVIEW = "gpt-4-turbo-preview",
MODEL_GPT_4O_MINI_SEARCH_PREVIEW_2025_03_11 = "gpt-4o-mini-search-preview-2025-03-11",
MODEL_GPT_4_0125_PREVIEW = "gpt-4-0125-preview",
MODEL_GPT_4O_2024_11_20 = "gpt-4o-2024-11-20",
MODEL_WHISPER_1 = "whisper-1",
MODEL_GPT_4O_2024_05_13 = "gpt-4o-2024-05-13",
@@ -44,20 +44,17 @@ export enum E_OPENAI_MODEL {
MODEL_GPT_3_5_TURBO_16K = "gpt-3.5-turbo-16k",
MODEL_O1_PREVIEW = "o1-preview",
MODEL_GPT_4_0613 = "gpt-4-0613",
MODEL_O1_2024_12_17 = "o1-2024-12-17",
MODEL_O1 = "o1",
MODEL_O1_PRO = "o1-pro",
MODEL_O1_PRO_2025_03_19 = "o1-pro-2025-03-19",
MODEL_GPT_4_5_PREVIEW = "gpt-4.5-preview",
MODEL_O3_MINI = "o3-mini",
MODEL_O3_MINI_2025_01_31 = "o3-mini-2025-01-31",
MODEL_GPT_4_5_PREVIEW_2025_02_27 = "gpt-4.5-preview-2025-02-27",
MODEL_GPT_4O_SEARCH_PREVIEW_2025_03_11 = "gpt-4o-search-preview-2025-03-11",
MODEL_TTS_1 = "tts-1",
MODEL_OMNI_MODERATION_2024_09_26 = "omni-moderation-2024-09-26",
MODEL_TEXT_EMBEDDING_3_SMALL = "text-embedding-3-small",
MODEL_GPT_4O_MINI_TTS = "gpt-4o-mini-tts",
MODEL_TTS_1_HD = "tts-1-hd",
MODEL_GPT_4O = "gpt-4o",
MODEL_O3_MINI = "o3-mini",
MODEL_O3_MINI_2025_01_31 = "o3-mini-2025-01-31",
MODEL_TTS_1_HD_1106 = "tts-1-hd-1106",
MODEL_GPT_4O_MINI = "gpt-4o-mini",
MODEL_GPT_4O_2024_08_06 = "gpt-4o-2024-08-06",
MODEL_GPT_4_1 = "gpt-4.1",
@@ -69,5 +66,8 @@ export enum E_OPENAI_MODEL {
MODEL_GPT_4O_MINI_AUDIO_PREVIEW_2024_12_17 = "gpt-4o-mini-audio-preview-2024-12-17",
MODEL_GPT_3_5_TURBO_0125 = "gpt-3.5-turbo-0125",
MODEL_O1_MINI_2024_09_12 = "o1-mini-2024-09-12",
MODEL_TTS_1 = "tts-1",
MODEL_GPT_4_1106_PREVIEW = "gpt-4-1106-preview",
MODEL_TTS_1_1106 = "tts-1-1106",
MODEL_OMNI_MODERATION_LATEST = "omni-moderation-latest"
}


@@ -1,4 +1,6 @@
export enum E_OPENROUTER_MODEL_FREE {
MODEL_FREE_THUDM_GLM_Z1_32B_FREE = "thudm/glm-z1-32b:free",
MODEL_FREE_THUDM_GLM_4_32B_FREE = "thudm/glm-4-32b:free",
MODEL_FREE_SHISA_AI_SHISA_V2_LLAMA3_3_70B_FREE = "shisa-ai/shisa-v2-llama3.3-70b:free",
MODEL_FREE_ARLIAI_QWQ_32B_ARLIAI_RPR_V1_FREE = "arliai/qwq-32b-arliai-rpr-v1:free",
MODEL_FREE_AGENTICA_ORG_DEEPCODER_14B_PREVIEW_FREE = "agentica-org/deepcoder-14b-preview:free",
@@ -53,6 +55,7 @@ export enum E_OPENROUTER_MODEL_FREE {
MODEL_FREE_QWEN_QWEN_2_5_72B_INSTRUCT_FREE = "qwen/qwen-2.5-72b-instruct:free",
MODEL_FREE_QWEN_QWEN_2_5_VL_7B_INSTRUCT_FREE = "qwen/qwen-2.5-vl-7b-instruct:free",
MODEL_FREE_GOOGLE_GEMINI_FLASH_1_5_8B_EXP = "google/gemini-flash-1.5-8b-exp",
MODEL_FREE_META_LLAMA_LLAMA_3_1_405B_FREE = "meta-llama/llama-3.1-405b:free",
MODEL_FREE_META_LLAMA_LLAMA_3_1_8B_INSTRUCT_FREE = "meta-llama/llama-3.1-8b-instruct:free",
MODEL_FREE_MISTRALAI_MISTRAL_NEMO_FREE = "mistralai/mistral-nemo:free",
MODEL_FREE_GOOGLE_GEMMA_2_9B_IT_FREE = "google/gemma-2-9b-it:free",


@@ -1,4 +1,9 @@
export enum E_OPENROUTER_MODEL {
MODEL_GOOGLE_GEMINI_2_5_PRO_PREVIEW_03_25 = "google/gemini-2.5-pro-preview-03-25",
MODEL_THUDM_GLM_Z1_32B_FREE = "thudm/glm-z1-32b:free",
MODEL_THUDM_GLM_4_32B_FREE = "thudm/glm-4-32b:free",
MODEL_GOOGLE_GEMINI_2_5_FLASH_PREVIEW = "google/gemini-2.5-flash-preview",
MODEL_GOOGLE_GEMINI_2_5_FLASH_PREVIEW_THINKING = "google/gemini-2.5-flash-preview:thinking",
MODEL_OPENAI_O4_MINI_HIGH = "openai/o4-mini-high",
MODEL_OPENAI_O3 = "openai/o3",
MODEL_OPENAI_O4_MINI = "openai/o4-mini",
@@ -21,7 +26,6 @@ export enum E_OPENROUTER_MODEL {
MODEL_META_LLAMA_LLAMA_4_MAVERICK = "meta-llama/llama-4-maverick",
MODEL_META_LLAMA_LLAMA_4_SCOUT_FREE = "meta-llama/llama-4-scout:free",
MODEL_META_LLAMA_LLAMA_4_SCOUT = "meta-llama/llama-4-scout",
MODEL_GOOGLE_GEMINI_2_5_PRO_PREVIEW_03_25 = "google/gemini-2.5-pro-preview-03-25",
MODEL_ALL_HANDS_OPENHANDS_LM_32B_V0_1 = "all-hands/openhands-lm-32b-v0.1",
MODEL_MISTRAL_MINISTRAL_8B = "mistral/ministral-8b",
MODEL_DEEPSEEK_DEEPSEEK_V3_BASE_FREE = "deepseek/deepseek-v3-base:free",
@@ -111,7 +115,6 @@ export enum E_OPENROUTER_MODEL {
MODEL_MINIMAX_MINIMAX_01 = "minimax/minimax-01",
MODEL_MISTRALAI_CODESTRAL_2501 = "mistralai/codestral-2501",
MODEL_MICROSOFT_PHI_4 = "microsoft/phi-4",
MODEL_SAO10K_L3_1_70B_HANAMI_X1 = "sao10k/l3.1-70b-hanami-x1",
MODEL_DEEPSEEK_DEEPSEEK_CHAT_FREE = "deepseek/deepseek-chat:free",
MODEL_DEEPSEEK_DEEPSEEK_CHAT = "deepseek/deepseek-chat",
MODEL_GOOGLE_GEMINI_2_0_FLASH_THINKING_EXP_1219_FREE = "google/gemini-2.0-flash-thinking-exp-1219:free",
@@ -194,6 +197,7 @@ export enum E_OPENROUTER_MODEL {
MODEL_SAO10K_L3_LUNARIS_8B = "sao10k/l3-lunaris-8b",
MODEL_AETHERWIING_MN_STARCANNON_12B = "aetherwiing/mn-starcannon-12b",
MODEL_OPENAI_GPT_4O_2024_08_06 = "openai/gpt-4o-2024-08-06",
MODEL_META_LLAMA_LLAMA_3_1_405B_FREE = "meta-llama/llama-3.1-405b:free",
MODEL_META_LLAMA_LLAMA_3_1_405B = "meta-llama/llama-3.1-405b",
MODEL_NOTHINGIISREAL_MN_CELESTE_12B = "nothingiisreal/mn-celeste-12b",
MODEL_PERPLEXITY_LLAMA_3_1_SONAR_SMALL_128K_ONLINE = "perplexity/llama-3.1-sonar-small-128k-online",
@@ -278,7 +282,6 @@ export enum E_OPENROUTER_MODEL {
MODEL_GOOGLE_PALM_2_CHAT_BISON_32K = "google/palm-2-chat-bison-32k",
MODEL_GOOGLE_PALM_2_CODECHAT_BISON_32K = "google/palm-2-codechat-bison-32k",
MODEL_JONDURBIN_AIROBOROS_L2_70B = "jondurbin/airoboros-l2-70b",
MODEL_XWIN_LM_XWIN_LM_70B = "xwin-lm/xwin-lm-70b",
MODEL_OPENAI_GPT_3_5_TURBO_INSTRUCT = "openai/gpt-3.5-turbo-instruct",
MODEL_MISTRALAI_MISTRAL_7B_INSTRUCT_V0_1 = "mistralai/mistral-7b-instruct-v0.1",
MODEL_PYGMALIONAI_MYTHALION_13B = "pygmalionai/mythalion-13b",


@@ -38,6 +38,16 @@ interface BaseCrawlOptions {
includeExternal?: boolean; // Corresponds to --include-external
maxPages?: number; // Corresponds to --max-pages
stream?: boolean; // Corresponds to --stream
// --- New Config Options ---
headless?: boolean; // Defaults to true (Python default), set false for --no-headless
userAgent?: string; // Corresponds to --user-agent
textMode?: boolean; // Corresponds to --text-mode
lightMode?: boolean; // Corresponds to --light-mode
waitFor?: string; // Corresponds to --wait-for
screenshot?: boolean; // Corresponds to --screenshot
pdf?: boolean; // Corresponds to --pdf
jsCode?: string; // Corresponds to --js-code
wordCountThreshold?: number; // Corresponds to --word-count-threshold
}
// Options specific to JSON output mode (schema is now path)
@@ -80,7 +90,17 @@ export async function crawlAndExtract(url: string, options: CrawlOptions): Promi
maxDepth,
includeExternal,
maxPages,
stream
stream,
// --- Destructure New Config Options ---
headless,
userAgent,
textMode,
lightMode,
waitFor,
screenshot,
pdf,
jsCode,
wordCountThreshold
} = options;
if (!url) {
@@ -148,6 +168,37 @@ export async function crawlAndExtract(url: string, options: CrawlOptions): Promi
}
}
// --- Add New Config Arguments ---
// Note: headless defaults to true in Python script via argparse.
// Only add --no-headless if the user explicitly sets headless: false.
if (headless === false) {
args.push('--no-headless');
}
if (userAgent) {
args.push('--user-agent', userAgent);
}
if (textMode) {
args.push('--text-mode');
}
if (lightMode) {
args.push('--light-mode');
}
if (waitFor) {
args.push('--wait-for', waitFor);
}
if (screenshot) {
args.push('--screenshot');
}
if (pdf) {
args.push('--pdf');
}
if (jsCode) {
args.push('--js-code', jsCode);
}
if (wordCountThreshold !== undefined && wordCountThreshold !== null) {
args.push('--word-count-threshold', String(wordCountThreshold));
}
return new Promise(async (resolve, reject) => {
console.log(`Spawning: ${pythonExecutable} ${args.join(' ')} (output to: ${tempFilePath})`);
const env = { ...process.env, PYTHONIOENCODING: 'UTF-8' };


@@ -30,7 +30,10 @@ def load_json_config(config_path):
async def main(url, schema_path, strategy_type, output_mode, output_file,
browser_config_path, crawler_config_path, bypass_cache, verbose,
# --- Add Deep Crawl Arguments ---
deep_crawl_strategy_name, max_depth, include_external, max_pages, stream_results):
deep_crawl_strategy_name, max_depth, include_external, max_pages, stream_results,
# --- Add New Config Arguments ---
headless, user_agent, text_mode, light_mode,
wait_for, screenshot, pdf, js_code, word_count_threshold):
output_dir = os.path.dirname(output_file)
if output_dir and not os.path.exists(output_dir):
@@ -45,6 +48,22 @@ async def main(url, schema_path, strategy_type, output_mode, output_file,
# --- Prepare CrawlerRunConfig arguments ---
run_config_kwargs = crawler_config.copy() # Start with crawler config file content
# --- Apply relevant CLI args to CrawlerRunConfig ---
if wait_for:
run_config_kwargs['wait_for'] = wait_for
if screenshot:
run_config_kwargs['screenshot'] = True
if pdf:
run_config_kwargs['pdf'] = True
if js_code:
run_config_kwargs['js_code'] = js_code
if word_count_threshold is not None:
run_config_kwargs['word_count_threshold'] = word_count_threshold
# --- Set default wait_until if not provided ---
if 'wait_until' not in run_config_kwargs:
run_config_kwargs['wait_until'] = 'networkidle'
# Set cache mode ONLY if bypassCache is true
if bypass_cache:
run_config_kwargs['cache_mode'] = CacheMode.BYPASS
@@ -120,6 +139,17 @@ async def main(url, schema_path, strategy_type, output_mode, output_file,
crawler_kwargs = browser_config.copy() # Start with browser config file content
crawler_kwargs['verbose'] = verbose
# --- Apply relevant CLI args to BrowserConfig/AsyncWebCrawler ---
# Note: headless default is True in argparse, so we only override if False
if headless is False:
crawler_kwargs['headless'] = False
if user_agent:
crawler_kwargs['user_agent'] = user_agent
if text_mode:
crawler_kwargs['text_mode'] = True
if light_mode:
crawler_kwargs['light_mode'] = True
try:
# Create crawler instance
# Only pass kwargs if the loaded config was not empty
@@ -301,6 +331,59 @@ if __name__ == "__main__":
help='Process and potentially output results as they become available (streaming).'
)
# --- Add New Config Arguments to Parser ---
parser.add_argument(
'--no-headless',
action='store_false',
dest='headless', # Sets args.headless to False if flag exists
help='Run browser in visible mode (default is headless).'
# Default is implicitly True because action is store_false
)
parser.add_argument(
'--user-agent',
type=str,
default=None,
help='Specify a custom User-Agent string.'
)
parser.add_argument(
'--text-mode',
action='store_true',
help='Enable text mode (disables images, etc.).'
)
parser.add_argument(
'--light-mode',
action='store_true',
help='Enable light mode (performance optimization).'
)
parser.add_argument(
'--wait-for',
type=str,
default=None,
help='CSS selector or JS expression to wait for before extraction.'
)
parser.add_argument(
'--screenshot',
action='store_true',
help='Capture a screenshot of the page.'
)
parser.add_argument(
'--pdf',
action='store_true',
help='Generate a PDF of the page.'
)
parser.add_argument(
'--js-code',
type=str,
default=None,
help='JavaScript code to execute on the page before extraction.'
)
parser.add_argument(
'--word-count-threshold',
type=int,
default=None,
help='Minimum word count for markdown content blocks.'
)
args = parser.parse_args()
if args.output_mode == 'json' and not args.schema_path:
@@ -324,5 +407,15 @@ if __name__ == "__main__":
max_depth=args.max_depth,
include_external=args.include_external,
max_pages=args.max_pages,
stream_results=args.stream_results
stream_results=args.stream_results,
# Pass new config args
headless=args.headless,
user_agent=args.user_agent,
text_mode=args.text_mode,
light_mode=args.light_mode,
wait_for=args.wait_for,
screenshot=args.screenshot,
pdf=args.pdf,
js_code=args.js_code,
word_count_threshold=args.word_count_threshold
))
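The Python-side changes above assemble `CrawlerRunConfig` kwargs in three layers: the crawler config file content first, CLI overrides on top, and finally a `networkidle` default for `wait_until` only when neither source set it. That precedence can be sketched as follows (illustrative names, not the script's API; the real code is the Python shown in the diff):

```typescript
// Sketch of the run-config assembly order used by the crawler script:
// file config < CLI overrides < defaults-for-unset-keys.
type RunConfig = Record<string, unknown>;

interface CliOverrides {
  waitFor?: string;
  screenshot?: boolean;
  pdf?: boolean;
  jsCode?: string;
  wordCountThreshold?: number;
}

function assembleRunConfig(fileConfig: RunConfig, cli: CliOverrides): RunConfig {
  const cfg: RunConfig = { ...fileConfig };            // start with config file content
  if (cli.waitFor) cfg['wait_for'] = cli.waitFor;      // CLI args win over the file
  if (cli.screenshot) cfg['screenshot'] = true;
  if (cli.pdf) cfg['pdf'] = true;
  if (cli.jsCode) cfg['js_code'] = cli.jsCode;
  if (cli.wordCountThreshold != null) cfg['word_count_threshold'] = cli.wordCountThreshold;
  if (!('wait_until' in cfg)) cfg['wait_until'] = 'networkidle'; // default only if unset
  return cfg;
}
```

So `assembleRunConfig({ wait_until: 'load' }, { screenshot: true })` keeps the file's `wait_until: 'load'` and adds `screenshot: true`, while an empty file config falls back to `wait_until: 'networkidle'`.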