1. Persistence Setup: Optimized the Dockerfile to manually create the /root/.config/conn/.folder file
pointing to /config. This avoids running the conn command during the build process and ensures a
cleaner setup.
2. Copilot UI Fix: Resolved a double-escaping bug in the terminal bottom bar. Device prompts (like
6WIND-PE1>) will now render correctly instead of showing HTML entities such as `&gt;`.
3. AI Model Update: Updated the default engineer model in connpy/ai.py to
gemini/gemini-3.1-flash-lite, removing the deprecated -preview suffix.
4. Standardized Timeouts: Unified all default timeouts to 20 seconds across the board. This includes
direct execution (run/test), modern playbooks (v2), and classic task-based playbooks (v1).
5. Documentation Update: Regenerated the full documentation site in the docs/ directory using pdoc to
reflect the latest changes.
6. Cleanup: Removed all debug prints from connpy/core.py and restored the docker/logs/.gitignore
file.
@@ -0,0 +1,22 @@
.git
__pycache__
*.pyc
*.pyo
*.pyd
.pytest_cache
.venv
venv
env
node_modules
dist
build
*.egg-info
docker
docker-compose.yml
.gemini
.github
docs
scratch
testall
testremote
automation-template.yaml
@@ -164,3 +164,7 @@ connpy_roadmap.md
MULTI_USER_PLAN.md
COPILOT_PLAN.md
ARCHITECTURAL_DEBT_REFACTOR.md

#themes
nord.yml
theme.py
@@ -0,0 +1,8 @@
include LICENSE
include README.md
include requirements.txt
recursive-include connpy/core_plugins *
recursive-include connpy/proto *
recursive-include connpy/grpc *.proto
recursive-exclude * __pycache__
recursive-exclude * *.py[co]
@@ -9,519 +9,182 @@

[](https://github.com/fluzzi/connpy/blob/main/LICENSE)
[](https://pypi.org/pypi/connpy/)

**Connpy** is a powerful Connection Manager and Network Automation Platform for Linux, Mac, and Docker. It provides a unified interface for **SSH, SFTP, Telnet, kubectl, Docker pods, and AWS SSM**.

The v6 release introduces the **AI Copilot**, an interactive terminal assistant that understands your network context and helps you manage your infrastructure more intelligently.

## 🤖 AI Copilot (New in v6)
The AI Copilot is deeply integrated into your terminal workflow:
- **Terminal Context Awareness**: The Copilot can "see" your screen output, helping you diagnose errors or analyze command results in real-time.
- **Hybrid Multi-Agent System**: Automatically escalates complex tasks between the **Network Engineer** (execution) and the **Network Architect** (strategy).
- **MCP Integration**: Dynamically load tools from external providers (6WIND, AWS, etc.) via the Model Context Protocol.
- **Interactive Chat**: Launch with `conn ai` for a collaborative troubleshooting session.

## Core Features
- **Multi-Protocol**: Native support for SSH, SFTP, Telnet, kubectl, Docker exec, and AWS SSM.
- **Context Management**: Set regex-based contexts to manage specific nodes across different environments (work, home, clients).
- **Advanced Inventory**:
  - Organize nodes in folders (`@folder`) and subfolders (`@subfolder@folder`).
  - Use Global Profiles (`@profilename`) to manage shared credentials easily.
  - Bulk creation, copying, moving, and export/import of nodes.
- **Modern UI**: High-performance terminal experience with `prompt-toolkit`, including:
  - Fuzzy search integration with `fzf`.
  - Advanced tab completion.
  - Syntax highlighting and customizable themes.
- **Automation Engine**: Run parallel tasks and playbooks on multiple devices with variable support.
- **Plugin System**: Build and execute custom Python scripts locally or on a remote gRPC server.
- **gRPC Architecture**: Fully decoupled Client/Server model for distributed management.
- **Privacy & Sync**: Local-first encrypted storage (RSA/OAEP) with optional Google Drive backup.

## Installation

```bash
pip install connpy
```

### Run it in Windows/Linux using Docker
```bash
git clone https://github.com/fluzzi/connpy
cd connpy
docker compose build

# Run it like a native app (completely silent)
docker compose --log-level ERROR run --rm --remove-orphans connpy-app [command]

# Pro Tip: Add this alias for a 100% native experience from any folder
alias conn='docker compose -f /path/to/connpy/docker-compose.yml --log-level ERROR run --rm --remove-orphans connpy-app'
```

---

## 🔒 Privacy & Integration

### Privacy Policy

Connpy is committed to protecting your privacy:
- **Local Storage**: All server addresses, usernames, and passwords are encrypted and stored **only** on your machine. No data is transmitted to our servers.
- **Data Access**: Data is used solely for managing and automating your connections.

### Google Integration
Used strictly for backup:
- **Backup**: Sync your encrypted configuration with your Google Drive account.
- **Scoped Access**: Connpy only accesses its own backup files.

For more detailed information, please read our [Privacy Policy](https://connpy.gederico.dynu.net/fluzzi32/connpy/src/branch/main/PRIVATE_POLICY.md).

---

## Usage

### Features
- Manage connections using SSH, SFTP, Telnet, kubectl, Docker exec, and AWS SSM.
- Set contexts to manage specific nodes from specific contexts (work/home/clients/etc).
- Generate profiles and reference them from nodes using `@profilename`, so you don't need to edit multiple nodes when changing passwords or other information.
- Store nodes under `@folder` or `@subfolder@folder` to organize your devices; reference them as `node@subfolder@folder` or `node@folder`.
- If you have too many nodes, get a completion script using `conn config --completion`, or use fzf by installing pyfzf and running `conn config --fzf true`.
- Create in bulk, copy, move, export, and import nodes for easy management.
- Run automation scripts on network devices.
- Use AI with a multi-agent system (Engineer/Architect) to manage devices. Supports any LLM provider via litellm (OpenAI, Anthropic, Google, etc.), with streaming responses, interactive chat, and extensible plugin tools.
- Add plugins with your own scripts, and execute them remotely.
- Fully decoupled gRPC Client/Server architecture.
- Unified UI with syntax highlighting and theming.
- Much more!

### Usage:
```bash
usage: conn [-h] [--add | --del | --mod | --show | --debug] [node|folder] [--sftp]
            conn {profile,move,mv,copy,cp,list,ls,bulk,export,import,ai,run,api,plugin,config,sync,context} ...

positional arguments:
  node|folder           node[@subfolder][@folder]
                        Connect to specific node or show all matching nodes
  [@subfolder][@folder]
                        Show all available connections globally or in specified path

options:
  -h, --help            show this help message and exit
  -v, --version         Show version
  -a, --add             Add new node[@subfolder][@folder] or [@subfolder]@folder
  -r, --del, --rm       Delete node[@subfolder][@folder] or [@subfolder]@folder
  -e, --mod, --edit     Modify node[@subfolder][@folder]
  -s, --show            Show node[@subfolder][@folder]
  -d, --debug           Display all connections steps
  -t, --sftp            Connects using sftp instead of ssh
  --service-mode        Set the backend service mode (local or remote)
  --remote              Connect to a remote connpy service via gRPC
  --theme               UI Output theme (dark, light, or path)

Commands:
  profile               Manage profiles
  move (mv)             Move node
  copy (cp)             Copy node
  list (ls)             List profiles, nodes or folders
  bulk                  Add nodes in bulk
  export                Export connection folder to Yaml file
  import                Import connection folder to config from Yaml file
  ai                    Make request to an AI
  run                   Run scripts or commands on nodes
  api                   Start and stop connpy api
  plugin                Manage plugins
  config                Manage app config
  sync                  Sync config with Google
  context               Manage contexts with regex matching
```

### Manage profiles:
```bash
usage: conn profile [-h] (--add | --del | --mod | --show) profile

positional arguments:
  profile            Name of profile to manage

options:
  -h, --help         show this help message and exit
  -a, --add          Add new profile
  -r, --del, --rm    Delete profile
  -e, --mod, --edit  Modify profile
  -s, --show         Show profile
```

### Basic Examples:
```bash
# Add a folder and subfolder
conn --add @office
conn --add @datacenter@office

# Add a node with a profile
conn --add server1@datacenter@office --profile @myuser

# Connect to a node (fuzzy match)
conn server1

# Start the AI Copilot
conn ai

# Run a command on all nodes in a folder
conn run @office "uptime"
```

### Examples:
```bash
#Add new profile
conn profile --add office-user
#Add new folder
conn --add @office
#Add new subfolder
conn --add @datacenter@office
#Add node to subfolder
conn --add server@datacenter@office
#Add node to folder
conn --add pc@office
#Show node information
conn --show server@datacenter@office
#Connect to nodes
conn pc@office
conn server
#Create and set new context
conn context -a office .*@office
conn context --set office
#Run a command in a node
conn run server ls -la
```

---

## 🔌 Plugin System
Connpy supports a robust plugin architecture where scripts can run transparently on a remote gRPC server.

### Structure
Plugins must be Python files containing:
- **Class `Parser`**: Defines `argparse` arguments.
- **Class `Entrypoint`**: Execution logic.
- **Class `Preload`**: (Optional) Hooks and modifications to the core app.

See the [Plugin Requirements section](#plugin-requirements-for-connpy) for full technical details.

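A minimal sketch of this structure (the "hello" command and its greeting logic are illustrative; only the `Parser`/`Entrypoint`/`Preload` contract comes from the plugin requirements):

```python
import argparse

class Parser:
    def __init__(self):
        # Connpy uses this parser to build CLI help and tab completion
        self.parser = argparse.ArgumentParser(description="Example hello plugin")
        self.parser.add_argument("name", help="Who to greet")

class Entrypoint:
    def __init__(self, args, parser, connapp):
        # args: parsed arguments; parser: the ArgumentParser above;
        # connapp: the running Connpy application instance
        print(f"Hello, {args.name}!")

class Preload:
    def __init__(self, connapp):
        # Optional: hooks and modifications to the core app go here
        pass
```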
---

## Plugin Requirements for Connpy

### Remote Plugin Execution
When Connpy operates in remote mode, plugins are executed **transparently on the server**:
- The client automatically downloads the plugin source code (`Parser` class context) to generate the local `argparse` structure and provide autocompletion.
- The execution phase (`Entrypoint` class) is redirected via gRPC streams to execute in the server's memory, ensuring the plugin runs securely against the server's inventory without passing sensitive data to the client.
- You can manage remote plugins using the `--remote` flag (e.g. `connpy plugin --add myplugin script.py --remote`).

### General Structure
- The plugin script must be a Python file.
- Only the following top-level elements are allowed in the plugin script:
  - Class definitions
  - Function definitions
  - Import statements
  - The `if __name__ == "__main__":` block for standalone execution
  - Pass statements

### Specific Class Requirements
The plugin script must define specific classes with particular attributes and methods. Each class serves a distinct role within the plugin's architecture:

1. **Class `Parser`**:
   - **Purpose**: Handles parsing of command-line arguments.
   - **Requirements**:
     - Must contain only one method: `__init__`.
     - The `__init__` method must initialize at least one attribute:
       - `self.parser`: An instance of `argparse.ArgumentParser`.
2. **Class `Entrypoint`**:
   - **Purpose**: Acts as the entry point for plugin execution, utilizing parsed arguments and integrating with the main application.
   - **Requirements**:
     - Must have an `__init__` method that accepts exactly three parameters besides `self`:
       - `args`: Arguments passed to the plugin.
       - The parser instance (typically `self.parser` from the `Parser` class).
       - The Connapp instance to interact with the Connpy app.
3. **Class `Preload`**:
   - **Purpose**: Performs any necessary preliminary setup or configuration independent of the main parsing and entry logic.
   - **Requirements**:
     - Contains at least an `__init__` method that accepts the parameter `connapp` besides `self`.

### Class Dependencies and Combinations
- **Dependencies**:
  - `Parser` and `Entrypoint` are interdependent and must both be present if one is included.
  - `Preload` is independent and may exist alone or alongside the other classes.
- **Valid Combinations**:
  - `Parser` and `Entrypoint` together.
  - `Preload` alone.
  - All three classes (`Parser`, `Entrypoint`, `Preload`).

### Preload Modifications and Hooks

In the `Preload` class of the plugin system, you can customize the behavior of existing classes and methods within the application through a robust hooking system. This documentation explains how to use the `modify`, `register_pre_hook`, and `register_post_hook` methods to tailor plugin functionality to your needs.

#### Modifying Classes with `modify`
The `modify` method allows you to alter instances of a class at the time they are created or after their creation. This is particularly useful for setting or modifying configuration settings, altering default behaviors, or adding new functionality to existing classes without changing the original class definitions.

- **Usage**: Modify a class to include additional configurations or changes.
- **Modify Method Signature**:
  - `modify(modification_method)`: A function that is invoked with an instance of the class as its argument. This function should perform any modifications directly on this instance.
- **Modification Method Signature**:
  - **Arguments**:
    - `cls`: This function accepts a single argument, the class instance, which it then modifies.
- **Modifiable Classes**:
  - `connapp.config`
  - `connapp.node`
  - `connapp.nodes`
  - `connapp.ai`
- ```python
  def modify_config(cls):
      # Example modification: adding a new attribute or modifying an existing one
      cls.new_attribute = 'New Value'

  class Preload:
      def __init__(self, connapp):
          # Applying modification to the config class instance
          connapp.config.modify(modify_config)
  ```

#### Implementing Method Hooks
Two methods allow you to define custom logic to be executed before (`register_pre_hook`) or after (`register_post_hook`) the main logic of a method. This is particularly useful for logging, auditing, preprocessing inputs, postprocessing outputs, or adding functionality.

- **Usage**: Register hooks to methods to execute additional logic before or after the main method execution.
- **Registration Methods Signature**:
  - `register_pre_hook(pre_hook_method)`: A function that is invoked before the main method is executed. This function should do preprocessing of the arguments.
  - `register_post_hook(post_hook_method)`: A function that is invoked after the main method is executed. This function should do postprocessing of the outputs.
- **Method Signatures for Pre-Hooks**:
  - `pre_hook_method(*args, **kwargs)`
  - **Arguments**:
    - `*args`, `**kwargs`: The arguments and keyword arguments that will be passed to the method being hooked. The pre-hook function has the opportunity to inspect and modify these arguments before they are passed to the main method.
  - **Return**:
    - Must return a tuple `(args, kwargs)`, which will be used as the new arguments for the main method. If the original arguments are not modified, the function should return them as received.
- **Method Signatures for Post-Hooks**:
  - `post_hook_method(*args, **kwargs)`
  - **Arguments**:
    - `*args`, `**kwargs`: The arguments and keyword arguments that were passed to the main method.
    - `kwargs["result"]`: The value returned by the main method. This allows the post-hook to inspect and even alter the result before it is returned to the original caller.
  - **Return**:
    - Can return a modified result, which will replace the original result of the main method, or simply return `kwargs["result"]` to return the original method result.
- ```python
  def pre_processing_hook(*args, **kwargs):
      print("Pre-processing logic here")
      # Modify arguments or perform any checks
      return args, kwargs  # Return modified or unmodified args and kwargs

  def post_processing_hook(*args, **kwargs):
      print("Post-processing logic here")
      # Modify the result or perform any final logging or cleanup
      return kwargs["result"]  # Return the modified or unmodified result

  class Preload:
      def __init__(self, connapp):
          # Registering a pre-hook
          connapp.ai.some_method.register_pre_hook(pre_processing_hook)

          # Registering a post-hook
          connapp.node.another_method.register_post_hook(post_processing_hook)
  ```

### Executable Block
- The plugin script can include an executable block:
  - `if __name__ == "__main__":`
  - This block allows the plugin to be run as a standalone script for testing or independent use.

### Command Completion Support
Plugins can provide intelligent **tab completion**. There are two supported methods, with the tree-based approach being the most modern and recommended:

1. **Tree-based Completion (Recommended)**: Define `_connpy_tree(info)` returning a navigation dictionary.
2. **Legacy Completion**: Define `_connpy_completion(wordsnumber, words, info)`.

#### 1. Tree-based Completion (Recommended)

Define a function called `_connpy_tree` that returns a declarative navigation tree. This method is highly efficient, supports complex state loops, and is very simple to implement for most use cases.

```python
def _connpy_tree(info=None):
    nodes = info.get("nodes", [])
    return {
        "__exclude_used__": True,  # Filter out words already typed
        "__extra__": nodes,  # Suggest nodes at this level
        "--format": ["json", "yaml", "table"],  # Fixed suggestions
        "*": {  # Wildcard matches any positional word
            "interface1": None,
            "interface2": None,
            "--verbose": None
        }
    }
```

- **Keys**: Literal completions (exact matches).
- **`*` Key**: A wildcard that matches any positional word typed by the user.
- **`__extra__`**: A list or a callable `(words) -> list` that adds dynamic suggestions.
- **`__exclude_used__`**: (Boolean) If True, automatically filters out words already present in the command line.

#### 2. Legacy Function-based Completion

For backward compatibility or highly custom logic, you can define `_connpy_completion`.

```python
def _connpy_completion(wordsnumber, words, info=None):
    if wordsnumber == 3:
        return ["--help", "--verbose", "start", "stop"]
    elif wordsnumber == 4 and words[2] == "start":
        return info["nodes"]  # Suggest node names
    return []
```

| Parameter | Description |
|-----------|-------------|
| `wordsnumber` | Integer indicating the total number of words on the command line. For plugins, this typically starts at 3. |
| `words` | A list of tokens (words) already typed. `words[0]` is always the name of the plugin. |
| `info` | A dictionary of structured context data (`nodes`, `folders`, `profiles`, `config`). |

> In this example, if the user types `connpy myplugin start ` and presses Tab, it will suggest node names.

### Handling Unknown Arguments

Plugins can choose to accept and process unknown arguments that are **not explicitly defined** in the parser. To enable this behavior, the plugin must define the following hidden argument in its `Parser` class:

```python
self.parser.add_argument(
    "--unknown-args",
    action="store_true",
    default=True,
    help=argparse.SUPPRESS
)
```

#### Behavior:

- When this argument is present, Connpy will parse the known arguments and capture any extra (unknown) ones.
- These unknown arguments will be passed to the plugin as `args.unknown_args` inside the `Entrypoint`.
- If the user does not pass any unknown arguments, `args.unknown_args` will contain the default value (`True`, unless overridden).

#### Example:

If a plugin accepts unknown tcpdump flags like this:

```bash
connpy myplugin -nn -s0
```

And defines the hidden `--unknown-args` flag as shown above, then:

- `args.unknown_args` inside `Entrypoint.__init__()` will be: `['-nn', '-s0']`

> This allows the plugin to receive and process arguments intended for external tools (e.g., `tcpdump`) without argparse raising an error.

#### Note:

If a plugin does **not** define `--unknown-args`, any extra arguments passed will cause argparse to fail with an unrecognized arguments error.

### Script Verification
- The `verify_script` method in `plugins.py` is used to check the plugin script's compliance with these standards.
- Non-compliant scripts will be rejected to ensure consistency and proper functionality within the plugin system.

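Connpy's actual validation lives in `verify_script` inside `plugins.py`; as a rough illustration of how AST-based top-level checking can work (the function name `check_top_level` and the exact rule set here are a simplified sketch, not Connpy's implementation):

```python
import ast

# Allowed top-level node types: classes, functions, imports,
# `if` blocks (covering `if __name__ == "__main__":`), and `pass`.
ALLOWED = (ast.ClassDef, ast.FunctionDef, ast.Import, ast.ImportFrom, ast.If, ast.Pass)

def check_top_level(source: str) -> bool:
    """Return True if the script only contains allowed top-level statements."""
    tree = ast.parse(source)
    return all(isinstance(node, ALLOWED) for node in tree.body)

good = "import argparse\nclass Parser:\n    pass\n"
bad = "print('side effect at import time')\n"
```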
### Example Script

For a practical example of how to write a compatible plugin script, please refer to the following example:

[Example Plugin Script](https://github.com/fluzzi/awspy)

This script demonstrates the required structure and implementation details according to the plugin system's standards.

## Automation module usage
### Standalone module
```python
import connpy
router = connpy.node("uniqueName", "1.1.1.1", user="admin", password="password")
router.run(["show ip int brief"])
print(router.output)
hasip = router.test("show ip int brief", "1.1.1.1")
if hasip:
    print("Router has ip 1.1.1.1")
else:
    print("router does not have ip 1.1.1.1")
```

### Using manager configuration
```python
import connpy
conf = connpy.configfile()
device = conf.getitem("router@office")
router = connpy.node("unique name", **device, config=conf)
result = router.run("show ip int brief")
print(result)
```
### Running parallel tasks on multiple devices
```python
import connpy
conf = connpy.configfile()
# You can get the nodes from the config from a folder, filtering within it
nodes = conf.getitem("@office", ["router1", "router2", "router3"])
# You can also get each node individually:
nodes = {}
nodes["router1"] = conf.getitem("router1@office")
nodes["router2"] = conf.getitem("router2@office")
nodes["router10"] = conf.getitem("router10@datacenter")
# Also, you can create the nodes manually:
nodes = {}
nodes["router1"] = {"host": "1.1.1.1", "user": "user", "password": "password1"}
nodes["router2"] = {"host": "1.1.1.2", "user": "user", "password": "password2"}
nodes["router3"] = {"host": "1.1.1.2", "user": "user", "password": "password3"}
# Finally you run some tasks on the nodes
mynodes = connpy.nodes(nodes, config=conf)
result = mynodes.test(["show ip int br"], "1.1.1.2")
for i in result:
    print("---" + i + "---")
    print(result[i])
    print()
# Or for one specific node
mynodes.router1.run(["term len 0", "show run"], folder="/home/user/logs")
```
### Using variables
```python
import connpy
config = connpy.configfile()
nodes = config.getitem("@office", ["router1", "router2", "router3"])
commands = []
commands.append("config t")
commands.append("interface lo {id}")
commands.append("ip add {ip} {mask}")
commands.append("end")
variables = {}
variables["router1@office"] = {"ip": "10.57.57.1"}
variables["router2@office"] = {"ip": "10.57.57.2"}
variables["router3@office"] = {"ip": "10.57.57.3"}
variables["__global__"] = {"id": "57"}
variables["__global__"]["mask"] = "255.255.255.255"
expected = "!"
routers = connpy.nodes(nodes, config=config)
routers.run(commands, variables)
routers.test("ping {ip}", expected, variables)
for key in routers.result:
    print(key, ' ---> ', ("pass" if routers.result[key] else "fail"))
```
### Using AI
The AI module uses a multi-agent architecture with an **Engineer** (fast execution) and an **Architect** (strategic reasoning). It supports any LLM provider through [litellm](https://github.com/BerriAI/litellm).

```python
import connpy
conf = connpy.configfile()
# Uses models and API keys from config, or override them:
myai = connpy.ai(conf, engineer_model="gemini/gemini-2.5-flash", engineer_api_key="your-key")
result = myai.ask("go to router1 and show me the running configuration")
print(result["response"])
# Streaming is enabled by default for CLI, disable for programmatic use:
result = myai.ask("show interfaces on all routers", stream=False)
print(result["response"])
```

#### AI Plugin Tool Registration
Plugins can extend the AI system by registering custom tools via the `Preload` class:
```python
def _register_my_tools(ai_instance):
    tool_def = {
        "type": "function",
        "function": {
            "name": "my_custom_tool",
            "description": "Does something useful.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"]
            }
        }
    }
    ai_instance.register_ai_tool(
        tool_definition=tool_def,
        handler=my_handler_function,
        target="engineer",  # or "architect" or "both"
        engineer_prompt="- My tool: does X.",
        architect_prompt=" * My tool (my_custom_tool)."
    )

class Preload:
    def __init__(self, connapp):
        connapp.ai.modify(_register_my_tools)
```
## gRPC Service Architecture
Connpy features a completely decoupled gRPC Client/Server architecture. You can run Connpy as a standalone background service and connect to it remotely via the CLI or other clients.

### 1. Start the Server
Start the gRPC service by running:
```bash
connpy api -s 50051
```
The server will handle all configurations, connections, AI sessions, and plugin execution locally on the machine it runs on.

### 2. Connect the Client
Configure your local CLI client to connect to the remote server:
```bash
connpy config --service-mode remote
connpy config --remote-host localhost:50051
```
Once configured, all commands (`connpy node`, `connpy list`, `connpy ai`, etc.) will execute transparently on the remote server via thin-client proxies. You can revert to standalone execution at any time by running `connpy config --service-mode local`.

### Programmatic Access (gRPC & SOA)
|
||||
If you wish to build your own application (Web, Desktop, or Scripts) using the Connpy backend, you can use the `ServiceProvider` to interact with either a local or remote service transparently.
|
||||
|
||||
```python
|
||||
import connpy
|
||||
from connpy.services.provider import ServiceProvider
|
||||
|
||||
# Initialize local config
|
||||
config = connpy.configfile()
|
||||
|
||||
# Connect to the remote gRPC service
|
||||
services = ServiceProvider(
|
||||
config,
|
||||
mode="remote",
|
||||
remote_host="localhost:50051"
|
||||
)
|
||||
|
||||
# Use any service (the logic is identical to local mode)
|
||||
nodes = services.nodes.list_nodes()
|
||||
for name in nodes:
|
||||
print(f"Found node: {name}")
|
||||
|
||||
# Run a command remotely via streaming
|
||||
for chunk in services.execution.run_commands(nodes=["server1"], commands=["uptime"]):
|
||||
print(chunk["output"], end="")
|
||||
```
|
||||
|
||||
|
||||
---
|
||||
*For detailed developer notes and plugin hooks documentation, see the [Documentation](https://fluzzi.github.io/connpy/).*
|
||||
|
||||
+139 -433
@@ -1,476 +1,182 @@
#!/usr/bin/env python3
'''
# Connpy

<p align="center">
<img src="https://nginx.gederico.dynu.net/images/CONNPY-resized.png" alt="App Logo">
</p>

[](https://pypi.org/pypi/connpy/)
[](https://pypi.org/pypi/connpy/)
[](https://github.com/fluzzi/connpy/blob/main/LICENSE)
[](https://pypi.org/pypi/connpy/)

### Usage

**Connpy** is a powerful Connection Manager and Network Automation Platform for Linux, Mac, and Docker. It provides a unified interface for **SSH, SFTP, Telnet, kubectl, Docker pods, and AWS SSM**.

The v6 release introduces the **AI Copilot**, an interactive terminal assistant that understands your network context and helps you manage your infrastructure more intelligently.

## 🤖 AI Copilot (New in v6)

The AI Copilot is deeply integrated into your terminal workflow:
- **Terminal Context Awareness**: The Copilot can "see" your screen output, helping you diagnose errors or analyze command results in real-time.
- **Hybrid Multi-Agent System**: Automatically escalates complex tasks between the **Network Engineer** (execution) and the **Network Architect** (strategy).
- **MCP Integration**: Dynamically load tools from external providers (6WIND, AWS, etc.) via the Model Context Protocol.
- **Interactive Chat**: Launch with `conn ai` for a collaborative troubleshooting session.
## Core Features

- **Multi-Protocol**: Native support for SSH, SFTP, Telnet, kubectl, Docker exec, and AWS SSM.
- **Context Management**: Set regex-based contexts to manage specific nodes across different environments (work, home, clients).
- **Advanced Inventory**:
  - Organize nodes in folders (`@folder`) and subfolders (`@subfolder@folder`).
  - Use Global Profiles (`@profilename`) to manage shared credentials easily.
  - Bulk creation, copying, moving, and export/import of nodes.
- **Modern UI**: High-performance terminal experience with `prompt-toolkit`, including:
  - Fuzzy search integration with `fzf`.
  - Advanced tab completion.
  - Syntax highlighting and customizable themes.
- **Automation Engine**: Run parallel tasks and playbooks on multiple devices with variable support.
- **Plugin System**: Build and execute custom Python scripts locally or on a remote gRPC server.
- **gRPC Architecture**: Fully decoupled Client/Server model for distributed management.
- **Privacy & Sync**: Local-first encrypted storage (RSA/OAEP) with optional Google Drive backup.

## Installation

```bash
pip install connpy
```

### Run it in Windows/Linux using Docker

```bash
git clone https://github.com/fluzzi/connpy
cd connpy
docker compose build

# Run it like a native app (completely silent)
docker compose --log-level ERROR run --rm --remove-orphans connpy-app [command]

# Pro Tip: Add this alias for a 100% native experience from any folder
alias conn='docker compose -f /path/to/connpy/docker-compose.yml --log-level ERROR run --rm --remove-orphans connpy-app'
```

---

## 🔒 Privacy & Integration

### Privacy Policy

Connpy is committed to protecting your privacy:
- **Local Storage**: All server addresses, usernames, and passwords are encrypted and stored **only** on your machine. No data is transmitted to our servers.
- **Data Access**: Data is used solely for managing and automating your connections.

### Google Integration

Used strictly for backup:
- **Backup**: Sync your encrypted configuration with your Google Drive account.
- **Scoped Access**: Connpy only accesses its own backup files.

---

## Usage
```bash
usage: conn [-h] [--add | --del | --mod | --show | --debug] [node|folder] [--sftp]
       conn {profile,move,mv,copy,cp,list,ls,bulk,export,import,ai,run,api,plugin,config,sync,context} ...

positional arguments:
  node|folder          node[@subfolder][@folder]
                       Connect to specific node or show all matching nodes
                       [@subfolder][@folder]
                       Show all available connections globally or in specified path

options:
  -h, --help           show this help message and exit
  -v, --version        Show version
  -a, --add            Add new node[@subfolder][@folder] or [@subfolder]@folder
  -r, --del, --rm      Delete node[@subfolder][@folder] or [@subfolder]@folder
  -e, --mod, --edit    Modify node[@subfolder][@folder]
  -s, --show           Show node[@subfolder][@folder]
  -d, --debug          Display all connection steps
  -t, --sftp           Connects using sftp instead of ssh
  --service-mode       Set the backend service mode (local or remote)
  --remote             Connect to a remote connpy service via gRPC
  --theme              UI Output theme (dark, light, or path)

Commands:
  profile              Manage profiles
  move(mv)             Move node
  copy(cp)             Copy node
  list(ls)             List profiles, nodes or folders
  bulk                 Add nodes in bulk
  export               Export connection folder to Yaml file
  import               Import connection folder to config from Yaml file
  ai                   Make request to an AI
  run                  Run scripts or commands on nodes
  api                  Start and stop connpy api
  plugin               Manage plugins
  config               Manage app config
  sync                 Sync config with Google
  context              Manage contexts with regex matching
```

### Manage profiles

```
usage: conn profile [-h] (--add | --del | --mod | --show) profile

positional arguments:
  profile            Name of profile to manage

options:
  -h, --help         show this help message and exit
  -a, --add          Add new profile
  -r, --del, --rm    Delete profile
  -e, --mod, --edit  Modify profile
  -s, --show         Show profile
```

### Basic Examples

```bash
# Add a folder and subfolder
conn --add @office
conn --add @datacenter@office

# Add a node with a profile
conn --add server1@datacenter@office --profile @myuser

# Connect to a node (fuzzy match)
conn server1

# Start the AI Copilot
conn ai

# Run a command on all nodes in a folder
conn run @office "uptime"
```

### Examples

```bash
# Add new profile
conn profile --add office-user
# Add new folder
conn --add @office
# Add new subfolder
conn --add @datacenter@office
# Add node to subfolder
conn --add server@datacenter@office
# Add node to folder
conn --add pc@office
# Show node information
conn --show server@datacenter@office
# Connect to nodes
conn pc@office
conn server
# Create and set new context
conn context -a office .*@office
conn context --set office
# Run a command in a node
conn run server ls -la
```

---
## Plugin Requirements for Connpy

### Remote Plugin Execution

When Connpy operates in remote mode, plugins are executed **transparently on the server**:
- The client automatically downloads the plugin source code (`Parser` class context) to generate the local `argparse` structure and provide autocompletion.
- The execution phase (`Entrypoint` class) is redirected via gRPC streams to execute in the server's memory, ensuring the plugin runs securely against the server's inventory without passing sensitive data to the client.
- You can manage remote plugins using the `--remote` flag (e.g. `connpy plugin --add myplugin script.py --remote`).

### General Structure

- The plugin script must be a Python file.
- Only the following top-level elements are allowed in the plugin script:
  - Class definitions
  - Function definitions
  - Import statements
  - The `if __name__ == "__main__":` block for standalone execution
  - Pass statements

### Specific Class Requirements

- The plugin script must define specific classes with particular attributes and methods. Each class serves a distinct role within the plugin's architecture:
  1. **Class `Parser`**:
     - **Purpose**: Handles parsing of command-line arguments.
     - **Requirements**:
       - Must contain only one method: `__init__`.
       - The `__init__` method must initialize at least one attribute:
         - `self.parser`: An instance of `argparse.ArgumentParser`.
  2. **Class `Entrypoint`**:
     - **Purpose**: Acts as the entry point for plugin execution, utilizing parsed arguments and integrating with the main application.
     - **Requirements**:
       - Must have an `__init__` method that accepts exactly three parameters besides `self`:
         - `args`: Arguments passed to the plugin.
         - The parser instance (typically `self.parser` from the `Parser` class).
         - The Connapp instance to interact with the Connpy app.
  3. **Class `Preload`**:
     - **Purpose**: Performs any necessary preliminary setup or configuration independent of the main parsing and entry logic.
     - **Requirements**:
       - Contains at least an `__init__` method that accepts the parameter `connapp` besides `self`.

### Class Dependencies and Combinations

- **Dependencies**:
  - `Parser` and `Entrypoint` are interdependent and must both be present if one is included.
  - `Preload` is independent and may exist alone or alongside the other classes.
- **Valid Combinations**:
  - `Parser` and `Entrypoint` together.
  - `Preload` alone.
  - All three classes (`Parser`, `Entrypoint`, `Preload`).
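Putting the requirements above together, a minimal plugin that satisfies all three classes could look like this (a sketch; the `connapp` interactions are intentionally left empty):

```python
import argparse

class Parser:
    def __init__(self):
        # Requirement: expose self.parser as an argparse.ArgumentParser
        self.parser = argparse.ArgumentParser(
            prog="myplugin", description="Example connpy plugin"
        )
        self.parser.add_argument("action", choices=["start", "stop"])

class Entrypoint:
    def __init__(self, args, parser, connapp):
        # args: parsed arguments, parser: the Parser's argparse instance,
        # connapp: the running Connpy application
        self.args = args
        self.parser = parser
        self.connapp = connapp

class Preload:
    def __init__(self, connapp):
        # Optional one-time setup, independent of parsing/entry logic
        pass

if __name__ == "__main__":
    p = Parser()
    Entrypoint(p.parser.parse_args(["start"]), p.parser, connapp=None)
```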
### Preload Modifications and Hooks

In the `Preload` class of the plugin system, you have the ability to customize the behavior of existing classes and methods within the application through a robust hooking system. This documentation explains how to use the `modify`, `register_pre_hook`, and `register_post_hook` methods to tailor plugin functionality to your needs.

#### Modifying Classes with `modify`

The `modify` method allows you to alter instances of a class at the time they are created or after their creation. This is particularly useful for setting or modifying configuration settings, altering default behaviors, or adding new functionalities to existing classes without changing the original class definitions.

- **Usage**: Modify a class to include additional configurations or changes.
- **Modify Method Signature**:
  - `modify(modification_method)`: A function that is invoked with an instance of the class as its argument. This function should perform any modifications directly on this instance.
- **Modification Method Signature**:
  - **Arguments**:
    - `cls`: This function accepts a single argument, the class instance, which it then modifies.
- **Modifiable Classes**:
  - `connapp.config`
  - `connapp.node`
  - `connapp.nodes`
  - `connapp.ai`
- ```python
  def modify_config(cls):
      # Example modification: adding a new attribute or modifying an existing one
      cls.new_attribute = 'New Value'

  class Preload:
      def __init__(self, connapp):
          # Applying modification to the config class instance
          connapp.config.modify(modify_config)
  ```

#### Implementing Method Hooks

There are two methods that allow you to define custom logic to be executed before (`register_pre_hook`) or after (`register_post_hook`) the main logic of a method. This is particularly useful for logging, auditing, preprocessing inputs, postprocessing outputs, or adding functionalities.

- **Usage**: Register hooks to methods to execute additional logic before or after the main method execution.
- **Registration Methods Signature**:
  - `register_pre_hook(pre_hook_method)`: A function that is invoked before the main method is executed. This function should do preprocessing of the arguments.
  - `register_post_hook(post_hook_method)`: A function that is invoked after the main method is executed. This function should do postprocessing of the outputs.
- **Method Signatures for Pre-Hooks**:
  - `pre_hook_method(*args, **kwargs)`
  - **Arguments**:
    - `*args`, `**kwargs`: The arguments and keyword arguments that will be passed to the method being hooked. The pre-hook function has the opportunity to inspect and modify these arguments before they are passed to the main method.
  - **Return**:
    - Must return a tuple `(args, kwargs)`, which will be used as the new arguments for the main method. If the original arguments are not modified, the function should return them as received.
- **Method Signatures for Post-Hooks**:
  - `post_hook_method(*args, **kwargs)`
  - **Arguments**:
    - `*args`, `**kwargs`: The arguments and keyword arguments that were passed to the main method.
    - `kwargs["result"]`: The value returned by the main method. This allows the post-hook to inspect and even alter the result before it is returned to the original caller.
  - **Return**:
    - Can return a modified result, which will replace the original result of the main method, or simply return `kwargs["result"]` to return the original method result.
- ```python
  def pre_processing_hook(*args, **kwargs):
      print("Pre-processing logic here")
      # Modify arguments or perform any checks
      return args, kwargs  # Return modified or unmodified args and kwargs

  def post_processing_hook(*args, **kwargs):
      print("Post-processing logic here")
      # Modify the result or perform any final logging or cleanup
      return kwargs["result"]  # Return the modified or unmodified result

  class Preload:
      def __init__(self, connapp):
          # Registering a pre-hook
          connapp.ai.some_method.register_pre_hook(pre_processing_hook)

          # Registering a post-hook
          connapp.node.another_method.register_post_hook(post_processing_hook)
  ```
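Conceptually, the registration methods wrap the hooked method like the decorator below (a simplified sketch of the mechanism, not connpy's actual implementation):

```python
def hookable(func):
    """Minimal stand-in for connpy's hook machinery: pre-hooks may rewrite
    (args, kwargs), post-hooks may rewrite the result."""
    pre_hooks, post_hooks = [], []

    def wrapper(*args, **kwargs):
        for pre in pre_hooks:
            args, kwargs = pre(*args, **kwargs)
        result = func(*args, **kwargs)
        for post in post_hooks:
            result = post(*args, result=result, **kwargs)
        return result

    wrapper.register_pre_hook = pre_hooks.append
    wrapper.register_post_hook = post_hooks.append
    return wrapper

@hookable
def greet(name):
    return f"hello {name}"

# Pre-hook rewrites the argument, post-hook rewrites the result
greet.register_pre_hook(lambda *a, **kw: ((a[0].upper(),), kw))
greet.register_post_hook(lambda *a, **kw: kw["result"] + "!")
print(greet("router1"))  # -> hello ROUTER1!
```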

### Executable Block

- The plugin script can include an executable block:
  - `if __name__ == "__main__":`
  - This block allows the plugin to be run as a standalone script for testing or independent use.

### Command Completion Support

Plugins can provide intelligent **tab completion** by defining autocompletion logic. There are two supported methods, with the tree-based approach being the most modern and recommended.
#### 1. Tree-based Completion (Recommended)

Define a function called `_connpy_tree` that returns a declarative navigation tree. This method is highly efficient, supports complex state loops, and is very simple to implement for most use cases.

```python
def _connpy_tree(info=None):
    nodes = info.get("nodes", [])
    return {
        "__exclude_used__": True,                # Filter out words already typed
        "__extra__": nodes,                      # Suggest nodes at this level
        "--format": ["json", "yaml", "table"],   # Fixed suggestions
        "*": {                                   # Wildcard matches any positional word
            "interface1": None,
            "interface2": None,
            "--verbose": None
        }
    }
```

- **Keys**: Literal completions (exact matches).
- **`*` Key**: A wildcard that matches any positional word typed by the user.
- **`__extra__`**: A list or a callable `(words) -> list` that adds dynamic suggestions.
- **`__exclude_used__`**: (Boolean) If True, automatically filters out words already present in the command line.
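To make those semantics concrete, here is a sketch of how such a tree could be resolved into suggestions for the next word (`resolve_tree` is a hypothetical helper, not connpy's internal resolver):

```python
def resolve_tree(tree, words):
    """Walk the completion tree with the words typed so far and return
    the suggestions for the next word."""
    node = tree
    for word in words:
        if not isinstance(node, dict):
            return []
        if word in node:
            node = node[word]
        elif "*" in node:
            node = node["*"]  # wildcard level
        else:
            return []
    if isinstance(node, list):
        return list(node)  # e.g. the fixed values of "--format"
    if not isinstance(node, dict):
        return []
    suggestions = [k for k in node if not k.startswith("__") and k != "*"]
    extra = node.get("__extra__", [])
    suggestions += list(extra(words) if callable(extra) else extra)
    if node.get("__exclude_used__"):
        suggestions = [s for s in suggestions if s not in words]
    return suggestions

tree = {
    "__exclude_used__": True,
    "__extra__": ["router1"],
    "--format": ["json", "yaml", "table"],
    "*": {"--verbose": None},
}
print(resolve_tree(tree, []))            # -> ['--format', 'router1']
print(resolve_tree(tree, ["--format"]))  # -> ['json', 'yaml', 'table']
print(resolve_tree(tree, ["router1"]))   # -> ['--verbose']
```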

#### 2. Legacy Function-based Completion

For backward compatibility or highly custom logic, you can define `_connpy_completion`.

```python
def _connpy_completion(wordsnumber, words, info=None):
    if wordsnumber == 3:
        return ["--help", "--verbose", "start", "stop"]
    elif wordsnumber == 4 and words[2] == "start":
        return info["nodes"]  # Suggest node names
    return []
```

| Parameter | Description |
|----------------|-------------|
| `wordsnumber` | Integer indicating the total number of words on the command line. For plugins, this typically starts at 3. |
| `words` | A list of tokens (words) already typed. `words[0]` is always the name of the plugin. |
| `info` | A dictionary of structured context data (`nodes`, `folders`, `profiles`, `config`). |

> In this example, if the user types `connpy myplugin start ` and presses Tab, it will suggest node names.

### Handling Unknown Arguments

Plugins can choose to accept and process unknown arguments that are **not explicitly defined** in the parser. To enable this behavior, the plugin must define the following hidden argument in its `Parser` class:

```python
self.parser.add_argument(
    "--unknown-args",
    action="store_true",
    default=True,
    help=argparse.SUPPRESS
)
```

#### Behavior

- When this argument is present, Connpy will parse the known arguments and capture any extra (unknown) ones.
- These unknown arguments will be passed to the plugin as `args.unknown_args` inside the `Entrypoint`.
- If the user does not pass any unknown arguments, `args.unknown_args` will contain the default value (`True`, unless overridden).

#### Example

If a plugin accepts unknown tcpdump flags like this:

```bash
connpy myplugin -nn -s0
```

And defines the hidden `--unknown-args` flag as shown above, then:

- `args.unknown_args` inside `Entrypoint.__init__()` will be: `['-nn', '-s0']`

> This allows the plugin to receive and process arguments intended for external tools (e.g., `tcpdump`) without argparse raising an error.

#### Note

If a plugin does **not** define `--unknown-args`, any extra arguments passed will cause argparse to fail with an unrecognized arguments error.

### Script Verification

- The `verify_script` method in `plugins.py` is used to check the plugin script's compliance with these standards.
- Non-compliant scripts will be rejected to ensure consistency and proper functionality within the plugin system.
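A simplified sketch of the top-level check such verification could perform with the standard `ast` module (the real `verify_script` in `plugins.py` also validates class names, methods, and signatures):

```python
import ast

# Allowed top-level node types; ast.If stands in for the __main__ guard
# (this simplified check does not inspect the guard's condition).
ALLOWED = (ast.ClassDef, ast.FunctionDef, ast.Import, ast.ImportFrom,
           ast.If, ast.Pass)

def check_top_level(source):
    """Return True if the script contains only allowed top-level elements."""
    return all(isinstance(node, ALLOWED) for node in ast.parse(source).body)

good = "import os\n\nclass Parser:\n    pass\n\nif __name__ == '__main__':\n    pass\n"
bad = "x = 1\n"  # top-level assignment is not allowed
print(check_top_level(good), check_top_level(bad))  # -> True False
```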
### Example Script

For a practical example of how to write a compatible plugin script, please refer to the following example:

[Example Plugin Script](https://github.com/fluzzi/awspy)

This script demonstrates the required structure and implementation details according to the plugin system's standards.

## gRPC Service Architecture

Connpy features a completely decoupled gRPC Client/Server architecture. You can run Connpy as a standalone background service and connect to it remotely via the CLI or other clients.

### 1. Start the Server

Start the gRPC service by running:

```bash
connpy api -s 50051
```

The server will handle all configurations, connections, AI sessions, and plugin execution locally on the machine it runs on.

### 2. Connect the Client

Configure your local CLI client to connect to the remote server:

```bash
connpy config --service-mode remote
connpy config --remote-host localhost:50051
```

Once configured, all commands (`connpy node`, `connpy list`, `connpy ai`, etc.) will execute transparently on the remote server via thin-client proxies. You can revert to standalone execution at any time by running `connpy config --service-mode local`.

### Programmatic Access (gRPC & SOA)

Developers can build their own applications using the Connpy backend by utilizing the `ServiceProvider`:

```python
from connpy.services.provider import ServiceProvider

services = ServiceProvider(config, mode="remote", remote_host="localhost:50051")
nodes = services.nodes.list_nodes()
```
## Automation module

The automation module lets you run commands and tests on one or many nodes programmatically.

### Standalone module

```python
import connpy

router = connpy.node("uniqueName", "ip/host", user="user", password="pass")
router.run(["term len 0", "show run"])

router = connpy.node("uniqueName", "1.1.1.1", user="admin")
router.run(["show ip int brief"])
print(router.output)

hasip = router.test("show ip int brief", "1.1.1.1")
if hasip:
    print("Router has ip 1.1.1.1")
else:
    print("Router does not have ip 1.1.1.1")
```

### Using manager configuration

```python
import connpy

conf = connpy.configfile()
device = conf.getitem("server@office")
server = connpy.node("unique name", **device, config=conf)
result = server.run(["cd /", "ls -la"])
print(result)
```

### Running parallel tasks

```python
import connpy

conf = connpy.configfile()

# You can get the nodes from the config from a folder, filtering in it
nodes = conf.getitem("@office", ["router1", "router2", "router3"])

# You can also get each node individually:
nodes = {}
nodes["router1"] = conf.getitem("router1@office")
nodes["router2"] = conf.getitem("router2@office")
nodes["router10"] = conf.getitem("router10@datacenter")

# Also, you can create the nodes manually:
nodes = {}
nodes["router1"] = {"host": "1.1.1.1", "user": "user", "password": "pass1"}
nodes["router2"] = {"host": "1.1.1.2", "user": "user", "password": "pass2"}
nodes["router3"] = {"host": "1.1.1.2", "user": "user", "password": "pass3"}

# Finally you run some tasks on the nodes
mynodes = connpy.nodes(nodes, config=conf)
result = mynodes.test(["show ip int br"], "1.1.1.2")
for i in result:
    print("---" + i + "---")
    print(result[i])
    print()

# Or for one specific node
mynodes.router1.run(["term len 0", "show run"], folder="/home/user/logs")
```

### Using variables

```python
import connpy

config = connpy.configfile()
nodes = config.getitem("@office", ["router1", "router2", "router3"])

commands = []
commands.append("config t")
commands.append("interface lo {id}")
commands.append("ip add {ip} {mask}")
commands.append("end")

variables = {}
variables["router1@office"] = {"ip": "10.57.57.1"}
variables["router2@office"] = {"ip": "10.57.57.2"}
variables["router3@office"] = {"ip": "10.57.57.3"}
variables["__global__"] = {"id": "57"}
variables["__global__"]["mask"] = "255.255.255.255"

expected = "!"
routers = connpy.nodes(nodes, config=config)
routers.run(commands, variables)
routers.test("ping {ip}", expected, variables)
for key in routers.result:
    print(key, ' ---> ', ("pass" if routers.result[key] else "fail"))
```

### Parallel Tasks with Variables

```python
import connpy

config = connpy.configfile()
nodes = config.getitem("@office", ["router1", "router2"])
routers = connpy.nodes(nodes, config=config)

variables = {
    "router1@office": {"id": "1"},
    "__global__": {"mask": "255.255.255.0"}
}
routers.run(["interface lo{id}", "ip address 10.0.0.{id} {mask}"], variables)
```
### AI Programmatic Use

```python
import connpy

conf = connpy.configfile()

# Uses models and API keys from config, or override them:
myai = connpy.ai(conf, engineer_model="gemini/gemini-2.5-flash", engineer_api_key="your-key")
result = myai.ask("go to router1 and show me the running configuration")
print(result["response"])

# Streaming is enabled by default for CLI, disable for programmatic use:
result = myai.ask("show interfaces on all routers", stream=False)
print(result["response"])

# Or simply rely on the configured defaults:
myai = connpy.ai(connpy.configfile())
response = myai.ask("What is the status of the BGP neighbors in the office?")
```

#### AI Plugin Tool Registration

Plugins can register custom tools with the AI system using `register_ai_tool()` in their `Preload` class:

```python
def _register_my_tools(ai_instance):
    tool_def = {
        "type": "function",
        "function": {
            "name": "my_custom_tool",
            "description": "Does something useful.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"]
            }
        }
    }
    ai_instance.register_ai_tool(
        tool_definition=tool_def,
        handler=my_handler_function,
        target="engineer",  # or "architect" or "both"
        engineer_prompt="- My tool: does X.",
        architect_prompt=" * My tool (my_custom_tool)."
    )


class Preload:
    def __init__(self, connapp):
        connapp.ai.modify(_register_my_tools)
```

## Developer Notes (SOA Architecture)

As of version 2.0, Connpy has migrated to a **Service-Oriented Architecture (SOA)**:
- **`connpy/cli/`**: Contains all CLI handlers. These are responsible for argument parsing, user interaction (via `inquirer`), and visual output (via `printer`).
- **`connpy/services/`**: Contains pure logic services (Node, Profile, Execution, etc.).
- **Zero-Print Policy**: Services must never use `print()`. All output must be returned as data structures or generators to the caller (CLI handlers).
- **ServiceProvider**: Access services via `connapp.services`. This allows transparent switching between local and remote (gRPC) backends without modifying CLI logic.
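The Zero-Print Policy can be illustrated with a toy service (a hypothetical class, not one of the real services in `connpy/services/`): the service yields structured chunks and leaves all rendering to the CLI handler:

```python
from typing import Iterator

class ExecutionServiceSketch:
    """Toy service: returns data, never prints."""

    def run_commands(self, nodes, commands) -> Iterator[dict]:
        for node in nodes:
            for command in commands:
                # A real service would open the connection here; we fake output.
                yield {"node": node, "command": command,
                       "output": f"{node}$ {command}\n"}

# CLI handler side: rendering is the caller's responsibility.
service = ExecutionServiceSketch()
for chunk in service.run_commands(["server1"], ["uptime"]):
    print(chunk["output"], end="")
```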

---

*For detailed developer notes and plugin hooks documentation, see the [Documentation](https://fluzzi.github.io/connpy/).*
'''
from .core import node, nodes
from .configfile import configfile
+1 -1
@@ -1 +1 @@
__version__ = "6.0.0b7"
__version__ = "6.0.0b8"
+38 -3
@@ -118,7 +118,7 @@ class ai:
         aiconfig = self.config.config.get("ai", {})

         # Models (priority: argument -> config -> default)
-        self.engineer_model = engineer_model or aiconfig.get("engineer_model") or "gemini/gemini-3.1-flash-lite-preview"
+        self.engineer_model = engineer_model or aiconfig.get("engineer_model") or "gemini/gemini-3.1-flash-lite"
         self.architect_model = architect_model or aiconfig.get("architect_model") or "anthropic/claude-sonnet-4-6"

         # API keys (priority: argument -> config)
@@ -1303,6 +1303,8 @@ class ai:
         node_info = node_info or {}
         os_info = node_info.get("os", "unknown")
         node_name = node_info.get("name", "unknown")
+        persona = node_info.get("persona", "engineer")
+        memories = node_info.get("memories", [])

         vendor_reference = ""
         if os_info and os_info != "unknown":
@@ -1315,6 +1317,30 @@ class ai:
             except Exception:
                 pass

+        if persona == "architect":
+            system_prompt = f"""Role: NETWORK ARCHITECT. You act as a senior strategic advisor during a live SSH session.
+Rules:
+1. Answer the user's question directly based on the Terminal Context.
+2. Focus on the "why" and "how". Analyze topologies, design patterns, and validate configurations.
+3. Do NOT provide commands to execute unless specifically requested. Instead, explain the consequences and best practices.
+4. Keep your guide concise and authoritative.
+5. You MUST output your response in the following strict format:
+<guide>
+Your brief tactical guide in markdown.
+</guide>
+<commands>
+</commands>
+<risk>
+low
+</risk>
+6. Risk level is usually "low" for read-only/no commands.
+
+Terminal Context:
+{terminal_buffer}
+
+Device OS: {os_info}
+Node: {node_name}"""
+        else:
-        system_prompt = f"""Role: TERMINAL COPILOT. You assist a network engineer during a live SSH session.
+            system_prompt = f"""Role: TERMINAL COPILOT. You assist a network engineer during a live SSH session.
 Rules:
 1. Answer the user's question directly based on the Terminal Context.
@@ -1343,6 +1369,11 @@ Node: {node_name}"""
         if vendor_reference:
             system_prompt += f"\n\nVendor Command Reference:\n{vendor_reference}"

+        if memories:
+            system_prompt += "\n\nSession Memory (Important Facts):\n"
+            for m in memories:
+                system_prompt += f"- {m}\n"
+
         # Fetch MCP tools for the current OS
         mcp_tools = []
         try:
@@ -1362,14 +1393,18 @@ Node: {node_name}"""
         iteration = 0
         max_iterations = 5  # Allow up to 5 iterations for tool usage

+        # Use models based on persona
+        current_model = self.architect_model if persona == "architect" else self.engineer_model
+        current_key = self.architect_key if persona == "architect" else self.engineer_key
+
         try:
             while iteration < max_iterations:
                 iteration += 1
                 response = await acompletion(
-                    model=self.engineer_model,
+                    model=current_model,
                     messages=messages,
                     tools=mcp_tools if mcp_tools else None,
-                    api_key=self.engineer_key,
+                    api_key=current_key,
                     stream=True
                 )
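The persona routing in the hunks above boils down to a pair of conditional lookups: defaults are resolved argument-first, and the architect persona swaps in the heavier model at call time. A minimal sketch of that selection logic (the class and method names here are illustrative, not the shipped API; the default model strings come from the diff):

```python
# Sketch of persona-based model selection, mirroring the ai.py diff above.
# PersonaRouter and model_for() are hypothetical names for illustration only.

class PersonaRouter:
    def __init__(self, aiconfig=None, engineer_model=None, architect_model=None):
        aiconfig = aiconfig or {}
        # Priority: explicit argument -> config file -> hard-coded default
        self.engineer_model = engineer_model or aiconfig.get("engineer_model") or "gemini/gemini-3.1-flash-lite"
        self.architect_model = architect_model or aiconfig.get("architect_model") or "anthropic/claude-sonnet-4-6"

    def model_for(self, persona: str) -> str:
        # "architect" gets the strategic model; anything else falls back to engineer
        return self.architect_model if persona == "architect" else self.engineer_model

router = PersonaRouter(aiconfig={"engineer_model": "gemini/custom"})
print(router.model_for("engineer"))   # config override wins → gemini/custom
print(router.model_for("architect"))  # no override → anthropic/claude-sonnet-4-6
```

The same ternary is applied to the API key in the diff, so a persona switch changes both credentials and model in one place.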
@@ -121,7 +121,7 @@ class RunHandler:
             commands=commands,
             variables=variables,
             parallel=options.get("parallel", 10),
-            timeout=options.get("timeout", 10),
+            timeout=options.get("timeout", 20),
             folder=folder,
             prompt=prompt,
             on_node_complete=_on_run_complete
@@ -155,7 +155,7 @@ class RunHandler:
             expected=expected,
             variables=variables,
             parallel=options.get("parallel", 10),
-            timeout=options.get("timeout", 10),
+            timeout=options.get("timeout", 20),
             folder=folder,
             prompt=prompt,
             on_node_complete=_on_test_complete
+198 -24
@@ -1,6 +1,7 @@
 import os
 import re
 import sys
+import time
 import asyncio
 import fcntl
 import termios
@@ -22,12 +23,19 @@ from connpy.utils import log_cleaner
 from ..services.ai_service import AIService

 class CopilotInterface:
-    def __init__(self, config, history=None, pt_input=None, pt_output=None, rich_file=None):
+    def __init__(self, config, history=None, pt_input=None, pt_output=None, rich_file=None, session_state=None):
         self.config = config
         self.history = history or InMemoryHistory()
         self.pt_input = pt_input
         self.pt_output = pt_output
         self.ai_service = AIService(config)
+        self.session_state = session_state if session_state is not None else {
+            'persona': 'engineer',
+            'trust_mode': False,
+            'memories': [],
+            'os': None,
+            'prompt': None
+        }

         if rich_file:
             self.console = Console(theme=connpy_theme, force_terminal=True, file=rich_file)
@@ -36,6 +44,17 @@ class CopilotInterface:

         self.mode_range, self.mode_single, self.mode_lines = 0, 1, 2

+    def _get_theme_color(self, style_name: str, fallback: str = "white") -> str:
+        """Extract a hex or ANSI color name from the active rich theme."""
+        try:
+            style = connpy_theme.styles.get(style_name)
+            if style and style.color:
+                # If it's a standard color like 'green', Rich might return its hex triplet
+                if style.color.is_default: return fallback
+                return style.color.triplet.hex if style.color.triplet else style.color.name
+        except: pass
+        return fallback
+
     async def run_session(self,
                           raw_bytes: bytes,
                           cmd_byte_positions: List[tuple],
@@ -60,7 +79,9 @@ class CopilotInterface:
             'total_lines': len(buffer.split('\n')),
             'context_lines': min(50, len(buffer.split('\n'))),
             'context_mode': self.mode_range,
-            'cancelled': False
+            'cancelled': False,
+            'toolbar_msg': '',
+            'msg_expiry': 0
         }

         # 1. Visual Separation
@@ -90,6 +111,11 @@ class CopilotInterface:
             event.app.invalidate()
         @bindings.add('tab')
         def _(event):
+            buf = event.current_buffer
+            # If typing a slash command (no spaces yet), use tab to autocomplete inline
+            if buf.text.startswith('/') and ' ' not in buf.text:
+                buf.complete_next()
+            else:
                 state['context_mode'] = (state['context_mode'] + 1) % 3
             event.app.invalidate()
         @bindings.add('escape', eager=True)
@@ -111,22 +137,100 @@ class CopilotInterface:
             return preview + "\n" + log_cleaner(active_raw.decode(errors='replace'))

         def get_prompt_text():
+            import html
+            # Always use the user_prompt color for the Ask prompt
+            color = self._get_theme_color("user_prompt", "cyan")
+
             if state['context_mode'] == self.mode_lines:
-                return HTML(f"<ansicyan>Ask [Ctx: {state['context_lines']}/{state['total_lines']}L]: </ansicyan>")
+                text = html.escape(f"Ask [Ctx: {state['context_lines']}/{state['total_lines']}L]: ")
+                return HTML(f'<style fg="{color}">{text}</style>')
             active = get_active_buffer()
             lines_count = len(active.split('\n'))
             mode_str = {self.mode_range: "Range", self.mode_single: "Cmd"}[state['context_mode']]
-            return HTML(f"<ansicyan>Ask [{mode_str} {state['context_cmd']} ~{lines_count}L]: </ansicyan>")
+            text = html.escape(f"Ask [{mode_str} {state['context_cmd']} ~{lines_count}L]: ")
+            return HTML(f'<style fg="{color}">{text}</style>')

         from prompt_toolkit.application.current import get_app

         def get_toolbar():
+            import html
             app = get_app()
+            c_warning = self._get_theme_color("warning", "yellow")
+
             if app and app.current_buffer:
                 text = app.current_buffer.text
+                # Only show command help while typing the first command and there are no spaces yet
+                if text.startswith('/') and ' ' not in text:
+                    commands = ['/os', '/prompt', '/architect', '/engineer', '/trust', '/untrust', '/memorize', '/clear']
+                    matches = [c for c in commands if c.startswith(text.lower())]
+                    if matches:
+                        m_text = html.escape(f"Available: {' '.join(matches)}")
+                        return HTML(f'<style fg="{c_warning}">{m_text}</style>' + " " * 20)

             m_label = {self.mode_range: "RANGE", self.mode_single: "SINGLE", self.mode_lines: "LINES"}[state['context_mode']]
             if state['context_mode'] == self.mode_lines:
-                return HTML(f"<ansigray>\u25b6 Ctrl+\u2191/\u2193 adjusts by 50 lines [Tab: {m_label}]</ansigray>")
+                base_str = f'\u25b6 Ctrl+\u2191/\u2193 adjusts by 50 lines [Tab: {m_label}]'
             else:
                 idx = max(0, state['total_cmds'] - state['context_cmd'])
-                return HTML(f"<ansigray>\u25b6 {blocks[idx][1]} [Tab: {m_label}]</ansigray>")
+                desc = blocks[idx][1]
+                base_str = f'\u25b6 {desc} [Tab: {m_label}]'
+
+            # Wrap base_str in a style to maintain consistency and avoid glitches
+            # The fg color will be inherited from the global bottom-toolbar style if not specified here
+            base_html = f'<span>{html.escape(base_str)}</span>'
+
+            res_html = base_html
+            if state.get('toolbar_msg'):
+                if time.time() < state.get('msg_expiry', 0):
+                    msg = html.escape(state['toolbar_msg'])
+                    res_html = f'<style fg="{c_warning}">⚙️ {msg}</style> | ' + base_html
+                else:
+                    state['toolbar_msg'] = ''
+
+            # Pad with spaces to ensure the line is cleared when the message disappears
+            return HTML(res_html + " " * 20)
+
+        from prompt_toolkit.completion import Completer, Completion
+        class SlashCommandCompleter(Completer):
+            def get_completions(self, document, complete_event):
+                text = document.text_before_cursor
+                if text.startswith('/'):
+                    parts = text.split()
+                    # Only autocomplete the first word
+                    if len(parts) <= 1 or (len(parts) == 1 and not text.endswith(' ')):
+                        cmd_part = parts[0] if parts else text
+                        commands = [
+                            ('/os', 'Set device OS (e.g. cisco_ios)'),
+                            ('/prompt', 'Override prompt regex'),
+                            ('/architect', 'Switch to Architect persona'),
+                            ('/engineer', 'Switch to Engineer persona'),
+                            ('/trust', 'Enable auto-execute'),
+                            ('/untrust', 'Disable auto-execute'),
+                            ('/memorize', 'Add fact to memory'),
+                            ('/clear', 'Clear memory')
+                        ]
+                        for cmd, desc in commands:
+                            if cmd.startswith(cmd_part.lower()):
+                                yield Completion(cmd, start_position=-len(cmd_part), display_meta=desc)
+
+        copilot_completer = SlashCommandCompleter()

         while True:
             # 2. Ask question
-            session = PromptSession(history=self.history)
+            from prompt_toolkit.styles import Style
+            c_contrast = self._get_theme_color("contrast", "gray")
+            ui_style = Style.from_dict({
+                'bottom-toolbar': f'fg:{c_contrast}',
+            })
+
+            session = PromptSession(
+                history=self.history,
+                input=self.pt_input,
+                output=self.pt_output,
+                completer=copilot_completer,
+                reserve_space_for_menu=0,
+                style=ui_style
+            )
             try:
                 # Use an inner try/finally to ensure that if something fails in prompt_async,
                 # we are not left with the terminal in a broken state.
@@ -139,19 +243,66 @@ class CopilotInterface:
                     state['cancelled'] = True
                     question = ""

-                if state['cancelled'] or not question.strip() or question.strip().lower() == 'cancel':
+                if state['cancelled'] or not question.strip() or question.strip().lower() in ['cancel', 'exit', 'quit']:
                     return "cancel", None, None

+                # 3. Process input via AIService
+                directive = self.ai_service.process_copilot_input(question, self.session_state)
+
+                if directive["action"] == "state_update":
+                    state['toolbar_msg'] = directive['message']
+                    state['msg_expiry'] = time.time() + 3  # 3 second timeout
+
+                    async def delayed_refresh():
+                        await asyncio.sleep(3.1)
+                        # Only invalidate if the message hasn't been replaced by a newer one
+                        if state.get('toolbar_msg') == directive['message']:
+                            state['toolbar_msg'] = ''  # Explicitly clear
+                            try:
+                                from prompt_toolkit.application.current import get_app
+                                app = get_app()
+                                if app: app.invalidate()
+                            except: pass
+                    asyncio.create_task(delayed_refresh())
+
+                    # Move the cursor up and clear the line so the new prompt replaces the previous one
+                    sys.stdout.write('\x1b[1A\x1b[2K')
+                    sys.stdout.flush()
+                    continue
+                else:
+                    # Clear the toolbar message when a real question is asked
+                    state['toolbar_msg'] = ''
+
+                clean_question = directive.get("clean_prompt", question)
+                overrides = directive.get("overrides", {})
+
+                # Merge node_info with session_state and overrides
+                merged_node_info = node_info.copy()
+                if self.session_state['os']: merged_node_info['os'] = self.session_state['os']
+                if self.session_state['prompt']: merged_node_info['prompt'] = self.session_state['prompt']
+                merged_node_info['persona'] = self.session_state['persona']
+                merged_node_info['trust'] = self.session_state['trust_mode']
+                merged_node_info['memories'] = list(self.session_state['memories'])
+
+                for k, v in overrides.items():
+                    merged_node_info[k] = v
+
                 # Enrich question
                 past = self.history.get_strings()
                 if len(past) > 1:
-                    history_text = "\n".join(f"- {q}" for q in past[-6:-1])
-                    question = f"Previous questions:\n{history_text}\n\nCurrent Question:\n{question}"
+                    clean_past = [q for q in past[-6:-1] if not q.startswith('/')]
+                    if clean_past:
+                        history_text = "\n".join(f"- {q}" for q in clean_past)
+                        clean_question = f"Previous questions:\n{history_text}\n\nCurrent Question:\n{clean_question}"

                 # 3. AI Execution
+                # Use the persona from overrides (one-shot) or from session state
+                active_persona = merged_node_info.get('persona', self.session_state.get('persona', 'engineer'))
+                persona_color = self._get_theme_color(active_persona, fallback="cyan")
+
                 active_buffer = get_active_buffer()
                 live_text = "Thinking..."
-                panel = Panel(live_text, title="[bold cyan]Copilot Guide[/bold cyan]", border_style="cyan")
+                panel = Panel(live_text, title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color)

                 def on_chunk(text):
                     nonlocal live_text
@@ -160,12 +311,12 @@ class CopilotInterface:

                 with Live(panel, console=self.console, refresh_per_second=10) as live:
                     def update_live(t):
-                        live.update(Panel(Markdown(t), title="[bold cyan]Copilot Guide[/bold cyan]", border_style="cyan"))
+                        live.update(Panel(Markdown(t), title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color))

                     wrapped_chunk = lambda t: (on_chunk(t), update_live(live_text))

                     # Check for interruption during the AI call
-                    ai_task = asyncio.create_task(on_ai_call(active_buffer, question, wrapped_chunk))
+                    ai_task = asyncio.create_task(on_ai_call(active_buffer, clean_question, wrapped_chunk, merged_node_info))

                     try:
                         while not ai_task.done():
@@ -180,27 +331,39 @@ class CopilotInterface:

                 # 4. Handle result
                 if live_text == "Thinking..." and result.get("guide"):
-                    self.console.print(Panel(Markdown(result["guide"]), title="[bold cyan]Copilot Guide[/bold cyan]", border_style="cyan"))
+                    self.console.print(Panel(Markdown(result["guide"]), title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color))

                 commands = result.get("commands", [])
                 if not commands:
-                    return "cancel", None, None
+                    self.console.print("")
+                    return "continue", None, None

                 risk = result.get("risk_level", "low")
-                style = {"low": "green", "high": "yellow", "destructive": "red"}.get(risk, "green")
-                cmd_text = "\n".join(f"  {i+1}. {c}" for i, c in enumerate(commands))
-                self.console.print(Panel(cmd_text, title=f"[bold {style}]Suggested Commands [{risk.upper()}][/bold {style}]", border_style=style))
+                risk_style = {"low": "success", "high": "warning", "destructive": "error"}.get(risk, "success")
+                style_color = self._get_theme_color(risk_style, fallback="green")

-                confirm_session = PromptSession()
+                cmd_text = "\n".join(f"  {i+1}. {c}" for i, c in enumerate(commands))
+                # Explicitly use 'bold style_color' for both the title and the border to ensure maximum consistency
+                self.console.print(Panel(cmd_text, title=f"[bold {style_color}]Suggested Commands [{risk.upper()}][/bold {style_color}]", border_style=f"bold {style_color}"))

+                if merged_node_info.get('trust', False) and risk != "destructive":
+                    self.console.print(f"[dim]⚙️ Auto-executing (Trust Mode)[/dim]")
+                    return "send_all", commands, None
+
+                confirm_session = PromptSession(input=self.pt_input, output=self.pt_output)
                 c_bindings = KeyBindings()
                 @c_bindings.add('escape', eager=True)
                 @c_bindings.add('c-c')
                 def _(ev): ev.app.exit(result='n')

+                import html
                 try:
-                    action = await confirm_session.prompt_async(HTML(f"<ansi{style}>Send? (y/n/e/range) [n]: </ansi{style}>"), key_bindings=c_bindings)
+                    p_text = html.escape(f"Send? (y/n/e/range) [n]: ")
+                    # Use the EXACT same style_color and force bold="true" for prompt_toolkit
+                    action = await confirm_session.prompt_async(HTML(f'<style fg="{style_color}" bold="true">{p_text}</style>'), key_bindings=c_bindings)
                 except (KeyboardInterrupt, EOFError):
-                    action = "n"
+                    self.console.print("")
+                    return "continue", None, None

                 def parse_indices(text, max_len):
                     """Helper to parse '1-3, 5, 7' into [0, 1, 2, 4, 6]."""
@@ -247,18 +410,29 @@ class CopilotInterface:
                     @e_bindings.add('escape')
                     def _(ev): ev.app.exit(result='')

+                    c_edit = self._get_theme_color("user_prompt", "cyan")
+                    import html
+                    e_text = html.escape("Edit (Ctrl+Enter or Esc+Enter to submit):\n")
+                    try:
                         edited = await confirm_session.prompt_async(
-                            HTML("<ansicyan>Edit (Ctrl+Enter or Esc+Enter to submit):\n</ansicyan>"),
+                            HTML(f'<style fg="{c_edit}">{e_text}</style>'),
                             default=target, multiline=True, key_bindings=e_bindings
                         )
-                    if edited.strip():
+                    except (KeyboardInterrupt, EOFError):
+                        self.console.print("")
+                        return "continue", None, None
+
+                    if edited and edited.strip():
                         # Split by lines to ensure core.py applies a delay between each command
                         lines = [l.strip() for l in edited.split('\n') if l.strip()]
                         return "custom", None, lines
-                    return "cancel", None, None
+
+                    self.console.print("")
+                    return "continue", None, None

                 return "cancel", None, None

             finally:
                 state['cancelled'] = True
         self.console.print("[dim]Returning to session...[/dim]")
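The `parse_indices` helper appears in the hunk above only as a signature and docstring. Its contract ("parse '1-3, 5, 7' into [0, 1, 2, 4, 6]") is clear enough to sketch one possible implementation, matching the range-parsing logic used elsewhere in this commit (this is a sketch, not necessarily the shipped code):

```python
def parse_indices(text, max_len):
    """Helper to parse '1-3, 5, 7' into [0, 1, 2, 4, 6]."""
    selected = set()
    for part in text.split(','):
        part = part.strip()
        if not part:
            continue
        if '-' in part:
            start_str, end_str = part.split('-', 1)
            # User input is 1-based and the range end is inclusive,
            # so indices run from start-1 up to end-1.
            selected.update(range(int(start_str) - 1, int(end_str)))
        else:
            selected.add(int(part) - 1)
    # Drop anything outside the valid command list
    return sorted(i for i in selected if 0 <= i < max_len)

print(parse_indices("1-3, 5, 7", 10))  # → [0, 1, 2, 4, 6]
```

Using a set makes duplicate selections like `"1,1,2"` harmless, and the final bounds check silently ignores out-of-range numbers rather than raising.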
@@ -169,11 +169,21 @@ def _build_tree(nodes, folders, profiles, plugins, configdir):
     }

     # State Machine Definitions
+    mcp_dict = {
+        "list": None,
+        "add": {"*": {"*": {"*": None}}},  # name url [os]
+        "remove": {"*": None},
+        "enable": {"*": None},
+        "disable": {"*": None},
+        "--help": None, "-h": None
+    }
+
     ai_dict = {"__exclude_used__": True, "--help": None, "-h": None}
     for opt in ["--engineer-model", "--engineer-api-key", "--architect-model", "--architect-api-key"]:
         ai_dict[opt] = {"*": ai_dict}  # takes a value, loops back
     for opt in ["--debug", "--trust", "--list", "--list-sessions", "--session", "--resume", "--delete", "--delete-session", "-y"]:
         ai_dict[opt] = ai_dict  # takes no value, loops back
+    ai_dict["--mcp"] = mcp_dict
     ai_dict["*"] = ai_dict

     mv_state = {"__extra__": _nodes, "--help": None, "-h": None}
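The `ai_dict` built above is deliberately self-referential: value-taking options point back at the same dict through a `{"*": ai_dict}` hop, and flags point directly back at it, so options can be combined in any order while terminal entries (`None`) end the walk. A toy walker makes the idea concrete (the `accepts` function is hypothetical, written only to illustrate the traversal; the dict shape mirrors the diff):

```python
# Toy completion-state walker over a self-referential option dict,
# in the spirit of ai_dict above. accepts() is illustrative, not shipped code.

ai_dict = {"--help": None, "-h": None}
for opt in ["--engineer-model", "--architect-model"]:
    ai_dict[opt] = {"*": ai_dict}   # takes a value, then loops back
for opt in ["--debug", "--trust"]:
    ai_dict[opt] = ai_dict          # takes no value, loops back

_MISSING = object()

def accepts(state, tokens):
    """Return True if the token sequence is a valid path through the machine."""
    for tok in tokens:
        if state is None:
            return False            # previous token was terminal; nothing may follow
        nxt = state.get(tok, _MISSING)
        if nxt is _MISSING:
            nxt = state.get("*", _MISSING)  # wildcard matches free-form values
        if nxt is _MISSING:
            return False
        state = nxt
    return True

print(accepts(ai_dict, ["--debug", "--engineer-model", "some-model", "--trust"]))  # → True
print(accepts(ai_dict, ["--help", "--debug"]))                                     # → False
```

Because the loops mutate `ai_dict` in place, every back-reference sees the fully built dict, including entries added after the reference was stored.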
+4 -4
@@ -89,6 +89,10 @@ class connapp:
         if hasattr(self.services.nodes, "list_folders") and hasattr(self.services.nodes.list_folders, "register_post_hook"):
             self.services.nodes.list_folders.register_post_hook(self.services.context.filter_node_list)

+        # Apply the theme from config if it exists, before any remote connection attempts
+        user_theme = self.config.config.get("theme", {})
+        self._apply_app_theme(user_theme)
+
         # Populate data via services
         try:
             self.nodes_list = self.services.nodes.list_nodes()
@@ -152,10 +156,6 @@ class connapp:

         configfile._saveconfig.register_post_hook(auto_sync_hook)

-        # Apply the theme from config if it exists
-        user_theme = self.config.config.get("theme", {})
-        self._apply_app_theme(user_theme)
-
     def _apply_app_theme(self, styles):
         """Unified method to apply the theme to the printer and help formatter."""
         active_styles = printer.apply_theme(styles)
+12 -6
@@ -35,8 +35,6 @@ def copilot_terminal_mode():
         new_settings[1] = new_settings[1] | termios.OPOST
         termios.tcsetattr(fd, termios.TCSANOW, new_settings)

         yield
     except Exception:
         yield
     finally:
         try:
@@ -610,20 +608,24 @@ class node:

         async def handler(buffer, node_info, stream, child_fd, cmd_byte_positions=None):
             try:
-                interface = CopilotInterface(config, history=getattr(stream, 'copilot_history', None))
+                interface = CopilotInterface(
+                    config,
+                    history=getattr(stream, 'copilot_history', None),
+                    session_state=getattr(stream, 'copilot_state', None)
+                )
                 # Save history back to the stream for persistence in the current session
                 stream.copilot_history = interface.history
+                stream.copilot_state = interface.session_state

                 ai_service = AIService(config)

-                async def on_ai_call(active_buffer, question, chunk_callback):
+                async def on_ai_call(active_buffer, question, chunk_callback, merged_node_info):
                     return await ai_service.aask_copilot(
                         active_buffer,
                         question,
-                        node_info=node_info,
+                        node_info=merged_node_info,
                         chunk_callback=chunk_callback
                     )

                 # Get raw bytes from BytesIO
                 raw_bytes = self.mylog.getvalue()

@@ -637,12 +639,16 @@ class node:

             try:
                 with copilot_terminal_mode():
-                    action, commands, custom_cmd = await interface.run_session(
-                        raw_bytes=raw_bytes,
-                        cmd_byte_positions=cmd_byte_positions,
-                        node_info=node_info,
-                        on_ai_call=on_ai_call
-                    )
+                    while True:
+                        action, commands, custom_cmd = await interface.run_session(
+                            raw_bytes=raw_bytes,
+                            cmd_byte_positions=cmd_byte_positions,
+                            node_info=node_info,
+                            on_ai_call=on_ai_call
+                        )
+                        if action == "continue":
+                            continue
+                        break
             finally:
                 # Restart the terminal reader to return to interactive SSH/Telnet mode
                 if hasattr(stream, 'start_reading'):
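`copilot_terminal_mode` above yields in both the `try` and the `except` branch: if the termios adjustment fails (for example, when stdin is not a real TTY), the wrapped block still runs, and the `finally` clause still attempts a restore. The pattern in isolation (a sketch with generic setup/teardown callables standing in for the termios calls; `best_effort_mode` is an illustrative name):

```python
from contextlib import contextmanager

@contextmanager
def best_effort_mode(setup, teardown):
    """Apply setup around a block, but never let a setup failure skip the block."""
    try:
        setup()
        yield            # body runs with setup applied
    except Exception:
        yield            # setup failed before the first yield: run the body anyway
    finally:
        try:
            teardown()   # always attempt to restore state
        except Exception:
            pass

def bad_setup():
    raise OSError("not a tty")

with best_effort_mode(bad_setup, lambda: None):
    print("body still runs")  # → body still runs
```

Note the limitation this pattern inherits: the `except` only protects against failures raised before the first `yield` (i.e. during setup); an exception raised by the body itself propagates out through the generator as usual.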
File diff suppressed because one or more lines are too long
+22 -42
@@ -223,6 +223,7 @@ class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):
                     copilot_node_info_json=node_info_json
                 ))

+                while True:
                     # 2. Await the question from the client via the copilot_queue
                     import threading
                     def preload_ai_deps():
@@ -234,11 +235,19 @@ class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):

                     try:
                         req_data = await asyncio.wait_for(remote_stream.copilot_queue.get(), timeout=120)
-                        if "question" not in req_data or not req_data["question"] or req_data["question"] == "CANCEL":
+                        if not req_data: return
+                        if "question" not in req_data or not req_data["question"] or req_data["question"] == "CANCEL" or req_data.get("action") == "cancel":
                             os.write(child_fd, b'\x15\r')
                             return
                         question = req_data["question"]
+
+                        merged_node_info_str = req_data.get("node_info_json", "")
+                        if merged_node_info_str:
+                            try:
+                                merged_node_info = json.loads(merged_node_info_str)
+                                node_info.update(merged_node_info)
+                            except: pass
+
                         context_buffer = req_data.get("context_buffer", "")
                         if context_buffer.startswith('{"context_start_pos"'):
                             try:
@@ -278,10 +287,10 @@ class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):
                     if wait_action_task in done:
                         req_data = wait_action_task.result()
                         ai_task.cancel()
-                        if req_data.get("question") == "CANCEL" or req_data.get("action") == "cancel":
+                        if req_data.get("action") == "cancel" or req_data.get("question") == "CANCEL":
                             os.write(child_fd, b'\x15\r')
-                        return
+                            return
+                        continue  # Loop back instead of returning, to keep the session alive
                     else:
                         wait_action_task.cancel()
                         result = ai_task.result()
@@ -297,10 +306,15 @@ class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):
                     # 5. Wait for user action
                     try:
                         action_data = await asyncio.wait_for(remote_stream.copilot_queue.get(), timeout=60)
-                        if "action" not in action_data or not action_data["action"] or action_data["action"] == "cancel":
+                        if not action_data: return
+                        action = action_data.get("action", "cancel")
+
+                        if action == "continue":
+                            continue  # Loop back for the next question
+
+                        if action == "cancel":
                             os.write(child_fd, b'\x15\r')
                             return
-                        action = action_data["action"]
                     except asyncio.TimeoutError:
                         os.write(child_fd, b'\x15\r')
                         return
@@ -320,6 +334,7 @@ class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):
                             os.write(child_fd, (cmd + "\n").encode())
                             response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=cmd))
                             await asyncio.sleep(0.8)
+                        return
                     elif action.startswith("custom:"):
                         custom_cmds = action[7:]
                         os.write(child_fd, b'\x15')
@@ -336,45 +351,10 @@ class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):
                             os.write(child_fd, (cmd.strip() + "\n").encode())
                             response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=cmd.strip()))
                             await asyncio.sleep(0.8)
-                    elif action not in ('cancel', 'n', 'no'):
-                        # Handle numbers and ranges like "1,2,4-6"
-                        try:
-                            commands = result.get("commands", [])
-                            selected_indices = set()
-                            for part in action.split(','):
-                                part = part.strip()
-                                if not part: continue
-                                if '-' in part:
-                                    start_str, end_str = part.split('-', 1)
-                                    start = int(start_str) - 1
-                                    end = int(end_str) - 1
-                                    for i in range(start, end + 1):
-                                        selected_indices.add(i)
-                                else:
-                                    selected_indices.add(int(part) - 1)
-
-                            valid_indices = sorted([i for i in selected_indices if 0 <= i < len(commands)])
-                            if valid_indices:
-                                os.write(child_fd, b'\x15')
-                                await asyncio.sleep(0.1)
-
-                                # Prepend the screen length command to avoid pagination
-                                if "screen_length_command" in n.tags:
-                                    os.write(child_fd, (n.tags["screen_length_command"] + "\n").encode())
-                                    response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=n.tags["screen_length_command"]))
-                                    await asyncio.sleep(0.8)
-
-                                for idx in valid_indices:
-                                    os.write(child_fd, (commands[idx] + "\n").encode())
-                                    response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=commands[idx]))
-                                    await asyncio.sleep(0.8)
-                                return
-                            else:
-                                os.write(child_fd, b'\x15\r')
-                        except (ValueError, IndexError):
-                            os.write(child_fd, b'\x15\r')
+                        return
                     else:
                         # Cancelled or invalid action
                         os.write(child_fd, b'\x15\r')
+                        return

             asyncio.run(n._async_interact_loop(remote_stream, resize_callback, copilot_handler=remote_copilot_handler))
         except Exception as e:
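The servicer loop above is a queue-driven protocol: the client pushes dicts carrying either a `question` or an `action` onto `copilot_queue`, and the server keeps looping until it sees a cancel, injecting commands into the PTY for anything else. A stripped-down model of that request loop (names and message shapes here are illustrative, not the gRPC API):

```python
import asyncio

async def copilot_server_loop(queue, send):
    """Minimal model of the servicer loop: serve questions until a cancel arrives."""
    while True:
        req = await asyncio.wait_for(queue.get(), timeout=120)
        if not req or req.get("action") == "cancel" or req.get("question") == "CANCEL":
            return "cancelled"
        if req.get("action") == "continue":
            continue                       # client wants another round of questions
        send(f"answer:{req['question']}")  # stand-in for streaming the AI reply

async def demo():
    q = asyncio.Queue()
    out = []
    for msg in ({"question": "show version"}, {"action": "continue"}, {"action": "cancel"}):
        q.put_nowait(msg)
    result = await copilot_server_loop(q, out.append)
    return result, out

print(asyncio.run(demo()))  # → ('cancelled', ['answer:show version'])
```

The key design point the diff introduces is the `continue` action: instead of tearing down the handler after each answer, the loop returns to the top and waits for the next message, which is what keeps a multi-question copilot session alive over one gRPC stream.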
@@ -51,16 +51,22 @@ class NodeStub:
                     pause_generator()

                     termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
-                    interface = CopilotInterface(self.config, history=getattr(self, 'copilot_history', None))
+                    interface = CopilotInterface(
+                        self.config,
+                        history=getattr(self, 'copilot_history', None),
+                        session_state=getattr(self, 'copilot_state', None)
+                    )
                     self.copilot_history = interface.history
+                    self.copilot_state = interface.session_state

                     node_info = json.loads(res.copilot_node_info_json) if res.copilot_node_info_json else {}

-                    async def on_ai_call_remote(active_buffer, question, chunk_callback):
+                    async def on_ai_call_remote(active_buffer, question, chunk_callback, merged_node_info):
                         # Send the request to the server
                         request_queue.put(connpy_pb2.InteractRequest(
                             copilot_question=question,
-                            copilot_context_buffer=active_buffer
+                            copilot_context_buffer=active_buffer,
+                            copilot_node_info_json=json.dumps(merged_node_info)
                         ))
                         # Wait for chunks from the server
                         while True:
@@ -76,13 +82,21 @@ class NodeStub:

                     # Wrap in an async loop
                     async def run_remote_copilot():
-                        return await interface.run_session(
+                        while True:
+                            action, commands, custom_cmd = await interface.run_session(
                                 raw_bytes=bytes(client_buffer_bytes),
                                 cmd_byte_positions=cmd_byte_positions,
                                 node_info=node_info,
                                 on_ai_call=on_ai_call_remote
                             )
+
+                            if action == "continue":
+                                # Send a continue signal to the server to loop back for another question
+                                request_queue.put(connpy_pb2.InteractRequest(copilot_action="continue"))
+                                continue
+
+                            return action, commands, custom_cmd

                     with copilot_terminal_mode():
                         action, commands, custom_cmd = asyncio.run(run_remote_copilot())
+50
-20
@@ -46,8 +46,9 @@ def _get_local():
|
||||
_local.console = None
|
||||
if not hasattr(_local, 'err_console'):
|
||||
_local.err_console = None
|
||||
if not hasattr(_local, 'theme'):
|
||||
_local.theme = None
|
||||
if not hasattr(_local, 'theme') or _local.theme is None:
|
||||
from rich.theme import Theme
|
||||
_local.theme = Theme(_global_active_styles)
|
||||
return _local
|
||||
|
||||
def set_thread_stream(stream):
|
||||
@@ -69,23 +70,45 @@ def get_original_stderr():
|
||||
|
||||
# Centralized design system
|
||||
STYLES = {
|
||||
"info": "cyan",
|
||||
"warning": "yellow",
|
||||
"error": "red",
|
||||
"success": "green",
|
||||
"debug": "dim",
|
||||
"header": "bold cyan",
|
||||
"key": "bold cyan",
|
||||
"border": "cyan",
|
||||
"pass": "bold green",
|
||||
"fail": "bold red",
|
||||
"engineer": "blue",
|
||||
"architect": "medium_purple",
|
||||
"ai_status": "bold green",
|
||||
"user_prompt": "bold cyan",
|
||||
"unavailable": "orange3",
|
||||
"info": "#00ffff", # Cyan
|
||||
"warning": "#ffff00", # Yellow
|
||||
"error": "#ff0000", # Red
|
||||
"success": "#00ff00", # Green
|
||||
"debug": "#888888",
|
||||
"header": "bold #00ffff",
|
||||
"key": "bold #00ffff",
|
||||
"border": "#00ffff",
|
||||
"pass": "bold #00ff00",
|
||||
"fail": "bold #ff0000",
|
||||
"engineer": "#5fafff", # Sky Blue (lighter than pure blue)
|
||||
"architect": "#9370db", # Medium Purple
|
||||
"ai_status": "bold #00ff00",
|
||||
"user_prompt": "bold #00afd7", # Deep Sky Blue / Soft Cyan
|
||||
"unavailable": "#d78700",
|
||||
"contrast": "#bbbbbb",
|
||||
}
|
||||
|
||||
LIGHT_THEME = {
|
||||
"info": "#00008b", # Navy Blue
|
||||
"warning": "#d78700", # Orange
|
||||
"error": "#cd0000", # Dark Red
|
||||
"success": "#006400", # Dark Green
|
||||
"debug": "#777777",
|
||||
"header": "bold #00008b",
|
||||
"key": "bold #00008b",
|
||||
"border": "#00008b",
|
||||
"pass": "bold #006400",
|
||||
"fail": "bold #cd0000",
|
||||
"engineer": "#00008b",
|
||||
"architect": "#8b008b", # Dark Magenta
|
||||
"ai_status": "bold #006400",
|
||||
"user_prompt": "bold #00008b",
|
||||
"unavailable": "#666666",
|
||||
"contrast": "#777777",
|
||||
}
|
||||
|
||||
_global_active_styles = STYLES.copy()
|
||||
|
||||
def _get_console():
|
||||
local = _get_local()
|
||||
|
||||
@@ -171,7 +194,7 @@ def connpy_theme():
    local = _get_local()
    if local.theme is None:
        from rich.theme import Theme
        local.theme = Theme(STYLES)
        local.theme = Theme(_global_active_styles)
    return local.theme

def apply_theme(user_styles=None):
@@ -179,6 +202,7 @@ def apply_theme(user_styles=None):
    Updates the global console themes with user-defined styles.
    If a style is missing in user_styles, it falls back to the default in STYLES.
    """
    global _global_active_styles
    local = _get_local()
    from rich.theme import Theme

@@ -190,6 +214,7 @@ def apply_theme(user_styles=None):
        if key in active_styles:
            active_styles[key] = value

    _global_active_styles = active_styles
    local.theme = Theme(active_styles)
    if local.console:
        local.console.push_theme(local.theme)
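The merge semantics of `apply_theme` can be sketched standalone (a minimal, hypothetical version: the real function also rebuilds the Rich `Theme` and pushes it onto the live console):

```python
# Minimal sketch of the apply_theme merge rule, assuming the defaults dict
# shape shown above: user values override known keys, unknown keys are dropped.
DEFAULT_STYLES = {"info": "#00ffff", "error": "#ff0000", "success": "#00ff00"}

def merge_styles(user_styles=None):
    """Overlay user-defined styles on the defaults; unknown keys are ignored."""
    active = DEFAULT_STYLES.copy()
    for key, value in (user_styles or {}).items():
        if key in active:
            active[key] = value
    return active
```

Filtering on `key in active` is what keeps junk keys in a theme file from leaking into the console theme.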
@@ -202,10 +227,15 @@ def _format_multiline(tag, message, style=None):
    message = str(message)
    lines = message.splitlines()
    if not lines:
        return f"[{style}]\\[{tag}][/{style}]" if style else f"\\[{tag}]"
        if style:
            return f"[{style}]\\[{tag}][/{style}]"
        return f"\\[{tag}]"

    # Apply style to the tag if provided
    styled_tag = f"[{style}]\\[{tag}][/{style}]" if style else f"\\[{tag}]"
    if style:
        # Include brackets in the styling
        styled_tag = f"[{style}]\\[{tag}][/{style}]"
    formatted = [f"{styled_tag} {lines[0]}"]

    # Indent subsequent lines
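A standalone sketch of the whole formatting rule the hunk above touches (the indent width is an assumption; the real helper may align differently):

```python
def format_multiline(tag, message, style=None):
    # Standalone re-creation of the helper above: the first line carries the
    # (optionally styled) [tag] prefix; later lines are indented to line up.
    lines = str(message).splitlines()
    styled_tag = f"[{style}]\\[{tag}][/{style}]" if style else f"\\[{tag}]"
    if not lines:
        return styled_tag
    indent = " " * (len(tag) + 3)  # assumed width of "[tag] "
    return "\n".join([f"{styled_tag} {lines[0]}"] + [indent + l for l in lines[1:]])
```

The `\\[` escape keeps Rich from treating the literal `[tag]` prefix as markup while the surrounding `[style]…[/style]` pair is still interpreted.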
@@ -462,7 +492,7 @@ class _ThemeProxy:
        local = _get_local()
        if local.theme is None:
            from rich.theme import Theme
            local.theme = Theme(STYLES)
            local.theme = Theme(_global_active_styles)
        return getattr(local.theme, name)

connpy_theme = _ThemeProxy()

@@ -95,6 +95,7 @@ message InteractRequest {
  string copilot_question = 8;
  string copilot_action = 9;
  string copilot_context_buffer = 10;
  string copilot_node_info_json = 13;
}

message InteractResponse {

@@ -45,6 +45,65 @@ class AIService(BaseService):
            blocks.append((pos, preview[:80]))
        return blocks

    def process_copilot_input(self, input_text: str, session_state: dict) -> dict:
        """Parses slash commands and manages session state. Returns directive dict."""
        text = input_text.strip()
        if not text.startswith('/'):
            return {"action": "execute", "clean_prompt": text, "overrides": {}}

        parts = text.split(maxsplit=1)
        cmd = parts[0].lower()
        args = parts[1] if len(parts) > 1 else ""

        # 1. State Commands (Persistent)
        if cmd == "/os":
            if args:
                session_state['os'] = args
                return {"action": "state_update", "message": f"OS context changed to {args}"}
        elif cmd == "/prompt":
            if args:
                session_state['prompt'] = args
                return {"action": "state_update", "message": f"Prompt regex changed to {args}"}
        elif cmd == "/memorize":
            if args:
                session_state['memories'].append(args)
                return {"action": "state_update", "message": f"Memory added: {args}"}
        elif cmd == "/clear":
            session_state['memories'] = []
            return {"action": "state_update", "message": "Memory cleared"}

        # 2. Hybrid Commands
        elif cmd == "/architect":
            if not args:
                session_state['persona'] = 'architect'
                return {"action": "state_update", "message": "Persona set to Architect"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "architect"}}

        elif cmd == "/engineer":
            if not args:
                session_state['persona'] = 'engineer'
                return {"action": "state_update", "message": "Persona set to Engineer"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "engineer"}}

        elif cmd == "/trust":
            if not args:
                session_state['trust_mode'] = True
                return {"action": "state_update", "message": "Auto-execute (trust) enabled for session"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"trust": True}}

        elif cmd == "/untrust":
            if not args:
                session_state['trust_mode'] = False
                return {"action": "state_update", "message": "Auto-execute (trust) disabled for session"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"trust": False}}

        # Unknown command, execute normally
        return {"action": "execute", "clean_prompt": text, "overrides": {}}

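The directive contract is easiest to see with a few calls; this is a trimmed standalone re-creation of two of the rules above (one state command, one hybrid command), not the full method:

```python
def process_copilot_input(input_text, session_state):
    # Trimmed standalone version of the parser above: one state command (/os)
    # and one hybrid command (/architect); everything else executes as-is.
    text = input_text.strip()
    if not text.startswith('/'):
        return {"action": "execute", "clean_prompt": text, "overrides": {}}
    parts = text.split(maxsplit=1)
    cmd, args = parts[0].lower(), parts[1] if len(parts) > 1 else ""
    if cmd == "/os" and args:
        session_state['os'] = args
        return {"action": "state_update", "message": f"OS context changed to {args}"}
    if cmd == "/architect":
        if not args:
            session_state['persona'] = 'architect'
            return {"action": "state_update", "message": "Persona set to Architect"}
        return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "architect"}}
    return {"action": "execute", "clean_prompt": text, "overrides": {}}
```

The split between `state_update` (bare command mutates the session) and `execute` with `overrides` (command plus arguments applies one-shot) is what lets the same slash command work both persistently and inline.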
    def ask(self, input_text, dryrun=False, chat_history=None, status=None, debug=False, session_id=None, console=None, chunk_callback=None, confirm_handler=None, trust=False, **overrides):
        """Send a prompt to the AI agent."""
        from connpy.ai import ai

@@ -70,6 +70,10 @@ class ConfigService(BaseService):
        if not isinstance(user_styles, dict):
            raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

        # Support both direct styles and nested under 'theme' key
        if "theme" in user_styles and isinstance(user_styles["theme"], dict):
            user_styles = user_styles["theme"]

        # Filter for valid styles only (prevent junk in config)
        valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

+9 -3
@@ -162,13 +162,19 @@ class RemoteStream:
            if req.cols > 0 and req.rows > 0:
                if self.resize_callback:
                    self._loop.call_soon_threadsafe(self.resize_callback, req.rows, req.cols)
            # Copilot dispatching
            copilot_msg = {}
            if getattr(req, "copilot_question", ""):
                self._loop.call_soon_threadsafe(self.copilot_queue.put_nowait, {
                copilot_msg.update({
                    "question": req.copilot_question,
                    "context_buffer": getattr(req, "copilot_context_buffer", "")
                    "context_buffer": getattr(req, "copilot_context_buffer", ""),
                    "node_info_json": getattr(req, "copilot_node_info_json", "")
                })
            if getattr(req, "copilot_action", ""):
                self._loop.call_soon_threadsafe(self.copilot_queue.put_nowait, {"action": req.copilot_action})
                copilot_msg["action"] = req.copilot_action

            if copilot_msg:
                self._loop.call_soon_threadsafe(self.copilot_queue.put_nowait, copilot_msg)
            if req.stdin_data:
                self._loop.call_soon_threadsafe(self._reader_queue.put_nowait, req.stdin_data)
        except Exception:

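The net effect of the diff above is that question and action now travel in a single queue message instead of two; a minimal sketch of just that merge (the function name is hypothetical, and the real code enqueues via the event loop):

```python
def build_copilot_msg(question="", action="", context_buffer="", node_info_json=""):
    # Mirror of the merging logic above: optional question fields and an
    # optional action are combined into a single message dict.
    msg = {}
    if question:
        msg.update({
            "question": question,
            "context_buffer": context_buffer,
            "node_info_json": node_info_json,
        })
    if action:
        msg["action"] = action
    return msg  # the caller enqueues it only when non-empty
```

Merging first and enqueueing once avoids the consumer seeing a question and its action as two unrelated events.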
+13 -5
@@ -1,9 +1,17 @@
version: "3.8"
services:
  connpy-app:
    build: .
    image: connpy-app
    image: connpy:latest
    container_name: connpy
    # Essential for terminal interactivity
    stdin_open: true
    tty: true
    environment:
      - TERM=xterm-256color
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - ./docker/connpy/:/app
      - ./docker/logs/:/logs
      - ./docker/ssh/:/root/.ssh/
      - ./docker/config:/config
      - ./docker/ssh:/root/.ssh
      - /var/run/docker.sock:/var/run/docker.sock
    # No default command is defined so that 'run' feels more natural

+58 -14
@@ -1,21 +1,65 @@
# Use the official python image
# connpy v6.0.0b8 - Modern Network Automation Environment (Local Build)
FROM python:3.11-slim

FROM python:3.11-alpine as connpy-app
LABEL description="Connpy: AI-Driven Network Automation & Intelligence Platform"

# Terminal and Python configuration
ENV DEBIAN_FRONTEND=noninteractive \
    PYTHONUNBUFFERED=1 \
    TERM=xterm-256color

# Set the entrypoint
# Set the working directory
WORKDIR /app

# Install any additional dependencies
RUN apk update && apk add --no-cache openssh fzf fzf-tmux ncurses bash
RUN pip3 install connpy
RUN connpy config --configfolder /app
# 1. Base system tools
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl \
    git \
    openssh-client \
    fzf \
    ncurses-bin \
    bash \
    procps \
    unzip \
    ca-certificates \
    gnupg \
    iputils-ping \
    telnet \
    && rm -rf /var/lib/apt/lists/*

#AUTH
RUN ssh-keygen -A
RUN mkdir /root/.ssh && \
    chmod 700 /root/.ssh
# 2. Install the Docker CLI (for connpy's docker plugin)
RUN install -m 0755 -d /etc/apt/keyrings && \
    curl -fsSL https://download.docker.com/linux/debian/gpg | gpg --dearmor -o /etc/apt/keyrings/docker.gpg && \
    echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/debian $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
    tee /etc/apt/sources.list.d/docker.list > /dev/null && \
    apt-get update && apt-get install -y docker-ce-cli && \
    rm -rf /var/lib/apt/lists/*

# 3. Install kubectl (for connpy's k8s plugin)
RUN curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/$(dpkg --print-architecture)/kubectl" && \
    install -o root -g root -m 0755 kubectl /usr/local/bin/kubectl && \
    rm kubectl

#Set the entrypoint
ENTRYPOINT ["connpy"]
# 4. Install the AWS CLI and Session Manager Plugin (universal x86_64/ARM64)
RUN ARCH=$(uname -m) && \
    if [ "$ARCH" = "x86_64" ]; then AWS_ARCH="x86_64"; else AWS_ARCH="aarch64"; fi && \
    curl "https://awscli.amazonaws.com/awscli-exe-linux-$AWS_ARCH.zip" -o "awscliv2.zip" && \
    unzip awscliv2.zip && ./aws/install && rm -rf awscliv2.zip aws/ && \
    if [ "$ARCH" = "x86_64" ]; then \
        curl "https://s3.amazonaws.com/session-manager-downloads/plugin/latest/ubuntu_64bit/session-manager-plugin.deb" -o "ssm.deb"; \
    else \
        curl "https://s3.amazonaws.com/session-manager-downloads/plugin/latest/ubuntu_arm64/session-manager-plugin.deb" -o "ssm.deb"; \
    fi && \
    dpkg -i ssm.deb && rm ssm.deb

# 5. Copy the local code and install dependencies
COPY . .
RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir .

# 6. Persistence setup
# Create the folder and the .folder pointer so that connpy uses /config
RUN mkdir -p /config /root/.ssh /root/.config/conn && chmod 700 /root/.ssh && \
    echo -n "/config" > /root/.config/conn/.folder

# Direct entrypoint into connpy
ENTRYPOINT ["conn"]

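The persistence setup boils down to a one-line pointer file; a quick way to sanity-check the same idea under a throwaway prefix (`/tmp/demo` is illustrative; `printf '%s'` is the portable equivalent of the Dockerfile's `echo -n`):

```shell
# Recreate the pointer the Dockerfile writes, then confirm its exact content.
mkdir -p /tmp/demo/.config/conn
printf '%s' "/config" > /tmp/demo/.config/conn/.folder
cat /tmp/demo/.config/conn/.folder
```

The file must contain exactly `/config` with no trailing newline, which is why `echo -n` / `printf '%s'` is used rather than plain `echo`.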
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.ai_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -78,6 +78,9 @@ el.replaceWith(d);
            printer.error(str(e))
            return

        if args.mcp is not None:
            return self.configure_mcp(args)

        # Determine the session_id to resume
        session_id = None
        if args.resume:
@@ -156,7 +159,7 @@ el.replaceWith(d);
            try:
                user_query = Prompt.ask("[user_prompt]User[/user_prompt]")
                if not user_query.strip(): continue
                if user_query.lower() in ['exit', 'quit', 'bye']: break
                if user_query.lower() in ['exit', 'quit', 'bye', 'cancel']: break

                with console.status("[ai_status]Agent is thinking...") as status:
                    result = self.app.myai.ask(user_query, chat_history=history, status=status, debug=args.debug, trust=args.trust, **self.ai_overrides)
@@ -179,11 +182,245 @@ el.replaceWith(d);
                    console.print(f"[debug]Tokens: {u['total']} (Input: {u['input']}, Output: {u['output']})[/debug]")
            except (KeyboardInterrupt, EOFError):
                console.print("\n[dim]Session closed.[/dim]")
                break</code></pre>
                break

def configure_mcp(self, args):
|
||||
"""Handle MCP server configuration via CLI tokens or interactive wizard."""
|
||||
mcp_args = args.mcp
|
||||
|
||||
# 1. Non-interactive CLI Mode (if arguments are provided)
|
||||
if mcp_args:
|
||||
action = mcp_args[0].lower()
|
||||
|
||||
if action == "list":
|
||||
settings = self.app.services.config_svc.get_settings()
|
||||
mcp_servers = settings.get("ai", {}).get("mcp_servers", {})
|
||||
if not mcp_servers:
|
||||
printer.info("No MCP servers configured.")
|
||||
else:
|
||||
columns = ["Name", "URL", "Enabled", "Auto-load OS"]
|
||||
rows = []
|
||||
for name, cfg in mcp_servers.items():
|
||||
rows.append([
|
||||
name,
|
||||
cfg.get("url", ""),
|
||||
"[green]Yes[/green]" if cfg.get("enabled", True) else "[red]No[/red]",
|
||||
cfg.get("auto_load_on_os", "Any")
|
||||
])
|
||||
printer.table("Configured MCP Servers", columns, rows)
|
||||
return
|
||||
|
||||
elif action == "add":
|
||||
if len(mcp_args) < 3:
|
||||
printer.error("Usage: connpy ai --mcp add <name> <url> [os_filter]")
|
||||
return
|
||||
name, url = mcp_args[1], mcp_args[2]
|
||||
os_filter = mcp_args[3] if len(mcp_args) > 3 else None
|
||||
try:
|
||||
self.app.services.ai.configure_mcp(name, url=url, auto_load_on_os=os_filter)
|
||||
printer.success(f"MCP server '{name}' added/updated.")
|
||||
except Exception as e:
|
||||
printer.error(str(e))
|
||||
return
|
||||
|
||||
elif action == "remove":
|
||||
if len(mcp_args) < 2:
|
||||
printer.error("Usage: connpy ai --mcp remove <name>")
|
||||
return
|
||||
name = mcp_args[1]
|
||||
try:
|
||||
self.app.services.ai.configure_mcp(name, remove=True)
|
||||
printer.success(f"MCP server '{name}' removed.")
|
||||
except Exception as e:
|
||||
printer.error(str(e))
|
||||
return
|
||||
|
||||
elif action in ["enable", "disable"]:
|
||||
if len(mcp_args) < 2:
|
||||
printer.error(f"Usage: connpy ai --mcp {action} <name>")
|
||||
return
|
||||
name = mcp_args[1]
|
||||
enabled = (action == "enable")
|
||||
try:
|
||||
self.app.services.ai.configure_mcp(name, enabled=enabled)
|
||||
printer.success(f"MCP server '{name}' {'enabled' if enabled else 'disabled'}.")
|
||||
except Exception as e:
|
||||
printer.error(str(e))
|
||||
return
|
||||
|
||||
else:
|
||||
printer.error(f"Unknown MCP action: {action}")
|
||||
printer.info("Available actions: list, add, remove, enable, disable")
|
||||
return
|
||||
|
||||
# 2. Interactive Wizard Mode (if no arguments provided)
|
||||
# Import forms dynamically to avoid circular dependencies if any
|
||||
if not hasattr(self.app, "cli_forms"):
|
||||
from .forms import Forms
|
||||
self.app.cli_forms = Forms(self.app)
|
||||
|
||||
settings = self.app.services.config_svc.get_settings()
|
||||
mcp_servers = settings.get("ai", {}).get("mcp_servers", {})
|
||||
|
||||
result = self.app.cli_forms.mcp_wizard(mcp_servers)
|
||||
if not result:
|
||||
return
|
||||
|
||||
action = result["action"]
|
||||
try:
|
||||
if action == "list":
|
||||
# Recursive call to the non-interactive list logic
|
||||
args.mcp = ["list"]
|
||||
return self.configure_mcp(args)
|
||||
|
||||
elif action == "add":
|
||||
self.app.services.ai.configure_mcp(
|
||||
result["name"],
|
||||
url=result["url"],
|
||||
enabled=result["enabled"],
|
||||
auto_load_on_os=result["os"]
|
||||
)
|
||||
printer.success(f"MCP server '{result['name']}' saved.")
|
||||
|
||||
elif action == "update": # Used for toggle
|
||||
self.app.services.ai.configure_mcp(
|
||||
result["name"],
|
||||
enabled=result["enabled"]
|
||||
)
|
||||
printer.success(f"MCP server '{result['name']}' updated.")
|
||||
|
||||
elif action == "remove":
|
||||
self.app.services.ai.configure_mcp(result["name"], remove=True)
|
||||
printer.success(f"MCP server '{result['name']}' removed.")
|
||||
|
||||
except Exception as e:
|
||||
printer.error(str(e))</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.ai_handler.AIHandler.configure_mcp"><code class="name flex">
<span>def <span class="ident">configure_mcp</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def configure_mcp(self, args):
    """Handle MCP server configuration via CLI tokens or interactive wizard."""
    mcp_args = args.mcp

    # 1. Non-interactive CLI Mode (if arguments are provided)
    if mcp_args:
        action = mcp_args[0].lower()

        if action == "list":
            settings = self.app.services.config_svc.get_settings()
            mcp_servers = settings.get("ai", {}).get("mcp_servers", {})
            if not mcp_servers:
                printer.info("No MCP servers configured.")
            else:
                columns = ["Name", "URL", "Enabled", "Auto-load OS"]
                rows = []
                for name, cfg in mcp_servers.items():
                    rows.append([
                        name,
                        cfg.get("url", ""),
                        "[green]Yes[/green]" if cfg.get("enabled", True) else "[red]No[/red]",
                        cfg.get("auto_load_on_os", "Any")
                    ])
                printer.table("Configured MCP Servers", columns, rows)
            return

        elif action == "add":
            if len(mcp_args) < 3:
                printer.error("Usage: connpy ai --mcp add <name> <url> [os_filter]")
                return
            name, url = mcp_args[1], mcp_args[2]
            os_filter = mcp_args[3] if len(mcp_args) > 3 else None
            try:
                self.app.services.ai.configure_mcp(name, url=url, auto_load_on_os=os_filter)
                printer.success(f"MCP server '{name}' added/updated.")
            except Exception as e:
                printer.error(str(e))
            return

        elif action == "remove":
            if len(mcp_args) < 2:
                printer.error("Usage: connpy ai --mcp remove <name>")
                return
            name = mcp_args[1]
            try:
                self.app.services.ai.configure_mcp(name, remove=True)
                printer.success(f"MCP server '{name}' removed.")
            except Exception as e:
                printer.error(str(e))
            return

        elif action in ["enable", "disable"]:
            if len(mcp_args) < 2:
                printer.error(f"Usage: connpy ai --mcp {action} <name>")
                return
            name = mcp_args[1]
            enabled = (action == "enable")
            try:
                self.app.services.ai.configure_mcp(name, enabled=enabled)
                printer.success(f"MCP server '{name}' {'enabled' if enabled else 'disabled'}.")
            except Exception as e:
                printer.error(str(e))
            return

        else:
            printer.error(f"Unknown MCP action: {action}")
            printer.info("Available actions: list, add, remove, enable, disable")
            return

    # 2. Interactive Wizard Mode (if no arguments provided)
    # Import forms dynamically to avoid circular dependencies if any
    if not hasattr(self.app, "cli_forms"):
        from .forms import Forms
        self.app.cli_forms = Forms(self.app)

    settings = self.app.services.config_svc.get_settings()
    mcp_servers = settings.get("ai", {}).get("mcp_servers", {})

    result = self.app.cli_forms.mcp_wizard(mcp_servers)
    if not result:
        return

    action = result["action"]
    try:
        if action == "list":
            # Recursive call to the non-interactive list logic
            args.mcp = ["list"]
            return self.configure_mcp(args)

        elif action == "add":
            self.app.services.ai.configure_mcp(
                result["name"],
                url=result["url"],
                enabled=result["enabled"],
                auto_load_on_os=result["os"]
            )
            printer.success(f"MCP server '{result['name']}' saved.")

        elif action == "update":  # Used for toggle
            self.app.services.ai.configure_mcp(
                result["name"],
                enabled=result["enabled"]
            )
            printer.success(f"MCP server '{result['name']}' updated.")

        elif action == "remove":
            self.app.services.ai.configure_mcp(result["name"], remove=True)
            printer.success(f"MCP server '{result['name']}' removed.")

    except Exception as e:
        printer.error(str(e))</code></pre>
</details>
<div class="desc"><p>Handle MCP server configuration via CLI tokens or interactive wizard.</p></div>
</dd>
<dt id="connpy.cli.ai_handler.AIHandler.dispatch"><code class="name flex">
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
</code></dt>
@@ -211,6 +448,9 @@ el.replaceWith(d);
            printer.error(str(e))
            return

        if args.mcp is not None:
            return self.configure_mcp(args)

        # Determine the session_id to resume
        session_id = None
        if args.resume:
@@ -283,7 +523,7 @@ el.replaceWith(d);
            try:
                user_query = Prompt.ask("[user_prompt]User[/user_prompt]")
                if not user_query.strip(): continue
                if user_query.lower() in ['exit', 'quit', 'bye']: break
                if user_query.lower() in ['exit', 'quit', 'bye', 'cancel']: break

                with console.status("[ai_status]Agent is thinking...") as status:
                    result = self.app.myai.ask(user_query, chat_history=history, status=status, debug=args.debug, trust=args.trust, **self.ai_overrides)
@@ -356,6 +596,7 @@ el.replaceWith(d);
<li>
<h4><code><a title="connpy.cli.ai_handler.AIHandler" href="#connpy.cli.ai_handler.AIHandler">AIHandler</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.ai_handler.AIHandler.configure_mcp" href="#connpy.cli.ai_handler.AIHandler.configure_mcp">configure_mcp</a></code></li>
<li><code><a title="connpy.cli.ai_handler.AIHandler.dispatch" href="#connpy.cli.ai_handler.AIHandler.dispatch">dispatch</a></code></li>
<li><code><a title="connpy.cli.ai_handler.AIHandler.interactive_chat" href="#connpy.cli.ai_handler.AIHandler.interactive_chat">interactive_chat</a></code></li>
<li><code><a title="connpy.cli.ai_handler.AIHandler.single_question" href="#connpy.cli.ai_handler.AIHandler.single_question">single_question</a></code></li>
@@ -367,7 +608,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.api_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -193,7 +193,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.config_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -482,7 +482,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.context_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -249,7 +249,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

+176 -3
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.forms API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -249,11 +249,183 @@ el.replaceWith(d);
        if "tags" in answer and not answer["tags"].startswith("@") and answer["tags"]:
            answer["tags"] = ast.literal_eval(answer["tags"])

        return answer</code></pre>
        return answer

    def mcp_wizard(self, mcp_servers):
        """Interactive wizard to manage MCP servers."""
        from .helpers import theme

        while True:
            options = [
                ("List Configured Servers", "list"),
                ("Add/Update Server", "add"),
                ("Enable/Disable Server", "toggle"),
                ("Remove Server", "remove"),
                ("Back", "exit")
            ]

            questions = [
                inquirer.List("action", message="MCP Configuration", choices=options)
            ]

            answers = inquirer.prompt(questions, theme=theme)
            if not answers or answers["action"] == "exit":
                return None

            action = answers["action"]

            if action == "list":
                if not mcp_servers:
                    print("\nNo MCP servers configured.\n")
                else:
                    return {"action": "list"}

            elif action == "add":
                questions = [
                    inquirer.Text("name", message="Server Name (identifier)"),
                    inquirer.Text("url", message="SSE URL (e.g., http://localhost:8000/sse)"),
                    inquirer.Confirm("enabled", message="Enabled?", default=True),
                    inquirer.Text("auto_load_os", message="Auto-load on specific OS (blank for any)")
                ]
                answers = inquirer.prompt(questions, theme=theme)
                if answers:
                    return {
                        "action": "add",
                        "name": answers["name"],
                        "url": answers["url"],
                        "enabled": answers["enabled"],
                        "os": answers["auto_load_os"]
                    }

            elif action == "toggle":
                if not mcp_servers:
                    print("\nNo servers to toggle.\n")
                    continue

                choices = []
                for name, cfg in mcp_servers.items():
                    status = "[Enabled]" if cfg.get("enabled", True) else "[Disabled]"
                    choices.append((f"{name} {status}", name))

                questions = [
                    inquirer.List("name", message="Select server to toggle", choices=choices + [("Cancel", None)])
                ]
                answers = inquirer.prompt(questions, theme=theme)
                if answers and answers["name"]:
                    current = mcp_servers[answers["name"]].get("enabled", True)
                    return {
                        "action": "update",
                        "name": answers["name"],
                        "enabled": not current
                    }

            elif action == "remove":
                if not mcp_servers:
                    print("\nNo servers to remove.\n")
                    continue

                questions = [
                    inquirer.List("name", message="Select server to remove", choices=list(mcp_servers.keys()) + ["Cancel"])
                ]
                answers = inquirer.prompt(questions, theme=theme)
                if answers and answers["name"] != "Cancel":
                    return {"action": "remove", "name": answers["name"]}
        return None</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.forms.Forms.mcp_wizard"><code class="name flex">
<span>def <span class="ident">mcp_wizard</span></span>(<span>self, mcp_servers)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def mcp_wizard(self, mcp_servers):
    """Interactive wizard to manage MCP servers."""
    from .helpers import theme

    while True:
        options = [
            ("List Configured Servers", "list"),
            ("Add/Update Server", "add"),
            ("Enable/Disable Server", "toggle"),
            ("Remove Server", "remove"),
            ("Back", "exit")
        ]

        questions = [
            inquirer.List("action", message="MCP Configuration", choices=options)
        ]

        answers = inquirer.prompt(questions, theme=theme)
        if not answers or answers["action"] == "exit":
            return None

        action = answers["action"]

        if action == "list":
            if not mcp_servers:
                print("\nNo MCP servers configured.\n")
            else:
                return {"action": "list"}

        elif action == "add":
            questions = [
                inquirer.Text("name", message="Server Name (identifier)"),
                inquirer.Text("url", message="SSE URL (e.g., http://localhost:8000/sse)"),
                inquirer.Confirm("enabled", message="Enabled?", default=True),
                inquirer.Text("auto_load_os", message="Auto-load on specific OS (blank for any)")
            ]
            answers = inquirer.prompt(questions, theme=theme)
            if answers:
                return {
                    "action": "add",
                    "name": answers["name"],
                    "url": answers["url"],
                    "enabled": answers["enabled"],
                    "os": answers["auto_load_os"]
                }

        elif action == "toggle":
            if not mcp_servers:
                print("\nNo servers to toggle.\n")
                continue

            choices = []
            for name, cfg in mcp_servers.items():
                status = "[Enabled]" if cfg.get("enabled", True) else "[Disabled]"
                choices.append((f"{name} {status}", name))

            questions = [
                inquirer.List("name", message="Select server to toggle", choices=choices + [("Cancel", None)])
            ]
            answers = inquirer.prompt(questions, theme=theme)
            if answers and answers["name"]:
                current = mcp_servers[answers["name"]].get("enabled", True)
                return {
                    "action": "update",
                    "name": answers["name"],
                    "enabled": not current
                }

        elif action == "remove":
            if not mcp_servers:
                print("\nNo servers to remove.\n")
                continue

            questions = [
                inquirer.List("name", message="Select server to remove", choices=list(mcp_servers.keys()) + ["Cancel"])
            ]
            answers = inquirer.prompt(questions, theme=theme)
            if answers and answers["name"] != "Cancel":
                return {"action": "remove", "name": answers["name"]}
    return None</code></pre>
</details>
<div class="desc"><p>Interactive wizard to manage MCP servers.</p></div>
</dd>
<dt id="connpy.cli.forms.Forms.questions_bulk"><code class="name flex">
<span>def <span class="ident">questions_bulk</span></span>(<span>self, nodes='', hosts='')</span>
</code></dt>
@@ -505,6 +677,7 @@ el.replaceWith(d);
<li>
<h4><code><a title="connpy.cli.forms.Forms" href="#connpy.cli.forms.Forms">Forms</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.forms.Forms.mcp_wizard" href="#connpy.cli.forms.Forms.mcp_wizard">mcp_wizard</a></code></li>
<li><code><a title="connpy.cli.forms.Forms.questions_bulk" href="#connpy.cli.forms.Forms.questions_bulk">questions_bulk</a></code></li>
<li><code><a title="connpy.cli.forms.Forms.questions_edit" href="#connpy.cli.forms.Forms.questions_edit">questions_edit</a></code></li>
<li><code><a title="connpy.cli.forms.Forms.questions_nodes" href="#connpy.cli.forms.Forms.questions_nodes">questions_nodes</a></code></li>
@@ -517,7 +690,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.help_text API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -303,7 +303,7 @@ tasks:
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.helpers API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -207,7 +207,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.import_export_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -272,7 +272,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -92,6 +92,10 @@ el.replaceWith(d);
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.cli.terminal_ui" href="terminal_ui.html">connpy.cli.terminal_ui</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.cli.validators" href="validators.html">connpy.cli.validators</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
@@ -130,6 +134,7 @@ el.replaceWith(d);
|
||||
<li><code><a title="connpy.cli.profile_handler" href="profile_handler.html">connpy.cli.profile_handler</a></code></li>
|
||||
<li><code><a title="connpy.cli.run_handler" href="run_handler.html">connpy.cli.run_handler</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler" href="sync_handler.html">connpy.cli.sync_handler</a></code></li>
|
||||
<li><code><a title="connpy.cli.terminal_ui" href="terminal_ui.html">connpy.cli.terminal_ui</a></code></li>
|
||||
<li><code><a title="connpy.cli.validators" href="validators.html">connpy.cli.validators</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
@@ -137,7 +142,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.node_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -606,7 +606,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.plugin_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -385,7 +385,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.profile_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -314,7 +314,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.run_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -169,7 +169,7 @@ el.replaceWith(d);
|
||||
commands=commands,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
timeout=options.get("timeout", 20),
|
||||
folder=folder,
|
||||
prompt=prompt,
|
||||
on_node_complete=_on_run_complete
|
||||
@@ -203,7 +203,7 @@ el.replaceWith(d);
|
||||
expected=expected,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
timeout=options.get("timeout", 20),
|
||||
folder=folder,
|
||||
prompt=prompt,
|
||||
on_node_complete=_on_test_complete
|
||||
@@ -260,7 +260,7 @@ el.replaceWith(d);
|
||||
commands=commands,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
timeout=options.get("timeout", 20),
|
||||
folder=folder,
|
||||
prompt=prompt,
|
||||
on_node_complete=_on_run_complete
|
||||
@@ -294,7 +294,7 @@ el.replaceWith(d);
|
||||
expected=expected,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
timeout=options.get("timeout", 20),
|
||||
folder=folder,
|
||||
prompt=prompt,
|
||||
on_node_complete=_on_test_complete
|
||||
@@ -454,7 +454,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.sync_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -427,7 +427,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -0,0 +1,899 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.terminal_ui API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.cli.terminal_ui</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.cli.terminal_ui.CopilotInterface"><code class="flex name class">
|
||||
<span>class <span class="ident">CopilotInterface</span></span>
|
||||
<span>(</span><span>config,<br>history=None,<br>pt_input=None,<br>pt_output=None,<br>rich_file=None,<br>session_state=None)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class CopilotInterface:
|
||||
def __init__(self, config, history=None, pt_input=None, pt_output=None, rich_file=None, session_state=None):
|
||||
self.config = config
|
||||
self.history = history or InMemoryHistory()
|
||||
self.pt_input = pt_input
|
||||
self.pt_output = pt_output
|
||||
self.ai_service = AIService(config)
|
||||
self.session_state = session_state if session_state is not None else {
|
||||
'persona': 'engineer',
|
||||
'trust_mode': False,
|
||||
'memories': [],
|
||||
'os': None,
|
||||
'prompt': None
|
||||
}
|
||||
|
||||
if rich_file:
|
||||
self.console = Console(theme=connpy_theme, force_terminal=True, file=rich_file)
|
||||
else:
|
||||
self.console = Console(theme=connpy_theme)
|
||||
|
||||
self.mode_range, self.mode_single, self.mode_lines = 0, 1, 2
|
||||
|
||||
def _get_theme_color(self, style_name: str, fallback: str = "white") -> str:
|
||||
"""Extract Hex or ANSI color name from the active rich theme."""
|
||||
try:
|
||||
style = connpy_theme.styles.get(style_name)
|
||||
if style and style.color:
|
||||
# If it's a standard color like 'green', Rich might return its hex triplet
|
||||
if style.color.is_default: return fallback
|
||||
return style.color.triplet.hex if style.color.triplet else style.color.name
|
||||
except: pass
|
||||
return fallback
|
||||
|
||||
async def run_session(self,
|
||||
raw_bytes: bytes,
|
||||
cmd_byte_positions: List[tuple],
|
||||
node_info: dict,
|
||||
on_ai_call: Callable):
|
||||
"""
|
||||
Runs the interactive Copilot session.
|
||||
on_ai_call: async function(active_buffer, question) -> result_dict
|
||||
"""
|
||||
from rich.rule import Rule
|
||||
|
||||
try:
|
||||
# Prepare UI state
|
||||
buffer = log_cleaner(raw_bytes.decode(errors='replace'))
|
||||
blocks = self.ai_service.build_context_blocks(raw_bytes, cmd_byte_positions, node_info)
|
||||
last_line = buffer.split('\n')[-1].strip() if buffer.strip() else "(prompt)"
|
||||
blocks.append((len(raw_bytes), last_line[:80]))
|
||||
|
||||
state = {
|
||||
'context_cmd': 1,
|
||||
'total_cmds': len(blocks),
|
||||
'total_lines': len(buffer.split('\n')),
|
||||
'context_lines': min(50, len(buffer.split('\n'))),
|
||||
'context_mode': self.mode_range,
|
||||
'cancelled': False,
|
||||
'toolbar_msg': '',
|
||||
'msg_expiry': 0
|
||||
}
|
||||
|
||||
# 1. Visual Separation
|
||||
self.console.print("") # Actual line break
|
||||
self.console.print(Rule(title="[bold cyan] AI TERMINAL COPILOT [/bold cyan]", style="cyan"))
|
||||
self.console.print(Panel(
|
||||
"[dim]Type your question. Enter to send, Escape/Ctrl+C to cancel.\n"
|
||||
"Tab to change context mode. Ctrl+\u2191/\u2193 to adjust context. \u2191\u2193 for question history.[/dim]",
|
||||
border_style="cyan"
|
||||
))
|
||||
self.console.print("\n") # Small gap before the copilot prompt
|
||||
|
||||
bindings = KeyBindings()
|
||||
@bindings.add('c-up')
|
||||
def _(event):
|
||||
if state['context_mode'] == self.mode_lines:
|
||||
state['context_lines'] = min(state['context_lines'] + 50, state['total_lines'])
|
||||
else:
|
||||
state['context_cmd'] = min(state['context_cmd'] + 1, state['total_cmds'])
|
||||
event.app.invalidate()
|
||||
@bindings.add('c-down')
|
||||
def _(event):
|
||||
if state['context_mode'] == self.mode_lines:
|
||||
state['context_lines'] = max(state['context_lines'] - 50, min(50, state['total_lines']))
|
||||
else:
|
||||
state['context_cmd'] = max(state['context_cmd'] - 1, 1)
|
||||
event.app.invalidate()
|
||||
@bindings.add('tab')
|
||||
def _(event):
|
||||
buf = event.current_buffer
|
||||
# If typing a slash command (no spaces yet), use tab to autocomplete inline
|
||||
if buf.text.startswith('/') and ' ' not in buf.text:
|
||||
buf.complete_next()
|
||||
else:
|
||||
state['context_mode'] = (state['context_mode'] + 1) % 3
|
||||
event.app.invalidate()
|
||||
@bindings.add('escape', eager=True)
|
||||
@bindings.add('c-c')
|
||||
def _(event):
|
||||
state['cancelled'] = True
|
||||
event.app.exit(result='')
|
||||
|
||||
def get_active_buffer():
|
||||
if state['context_mode'] == self.mode_lines:
|
||||
return '\n'.join(buffer.split('\n')[-state['context_lines']:])
|
||||
idx = max(0, state['total_cmds'] - state['context_cmd'])
|
||||
start, preview = blocks[idx]
|
||||
if state['context_mode'] == self.mode_single and idx + 1 < state['total_cmds']:
|
||||
end = blocks[idx + 1][0]
|
||||
active_raw = raw_bytes[start:end]
|
||||
else:
|
||||
active_raw = raw_bytes[start:]
|
||||
return preview + "\n" + log_cleaner(active_raw.decode(errors='replace'))
|
||||
|
||||
def get_prompt_text():
|
||||
import html
|
||||
# Always use user_prompt color for the Ask prompt
|
||||
color = self._get_theme_color("user_prompt", "cyan")
|
||||
|
||||
if state['context_mode'] == self.mode_lines:
|
||||
text = html.escape(f"Ask [Ctx: {state['context_lines']}/{state['total_lines']}L]: ")
|
||||
return HTML(f'<style fg="{color}">{text}</style>')
|
||||
active = get_active_buffer()
|
||||
lines_count = len(active.split('\n'))
|
||||
mode_str = {self.mode_range: "Range", self.mode_single: "Cmd"}[state['context_mode']]
|
||||
text = html.escape(f"Ask [{mode_str} {state['context_cmd']} ~{lines_count}L]: ")
|
||||
return HTML(f'<style fg="{color}">{text}</style>')
|
||||
|
||||
from prompt_toolkit.application.current import get_app
|
||||
|
||||
def get_toolbar():
|
||||
import html
|
||||
app = get_app()
|
||||
c_warning = self._get_theme_color("warning", "yellow")
|
||||
|
||||
if app and app.current_buffer:
|
||||
text = app.current_buffer.text
|
||||
# Only show command help while the first command is being typed and there are no spaces yet
|
||||
if text.startswith('/') and ' ' not in text:
|
||||
commands = ['/os', '/prompt', '/architect', '/engineer', '/trust', '/untrust', '/memorize', '/clear']
|
||||
matches = [c for c in commands if c.startswith(text.lower())]
|
||||
if matches:
|
||||
m_text = html.escape(f"Available: {' '.join(matches)}")
|
||||
return HTML(f'<style fg="{c_warning}">{m_text}</style>' + " " * 20)
|
||||
|
||||
m_label = {self.mode_range: "RANGE", self.mode_single: "SINGLE", self.mode_lines: "LINES"}[state['context_mode']]
|
||||
if state['context_mode'] == self.mode_lines:
|
||||
base_str = f'\u25b6 Ctrl+\u2191/\u2193 adjusts by 50 lines [Tab: {m_label}]'
|
||||
else:
|
||||
idx = max(0, state['total_cmds'] - state['context_cmd'])
|
||||
desc = blocks[idx][1]
|
||||
base_str = f'\u25b6 {desc} [Tab: {m_label}]'
|
||||
|
||||
# Wrap base_str in a style to maintain consistency and avoid glitches
|
||||
# The fg color will be inherited from bottom-toolbar global style if not specified here
|
||||
base_html = f'<span>{html.escape(base_str)}</span>'
|
||||
|
||||
res_html = base_html
|
||||
if state.get('toolbar_msg'):
|
||||
if time.time() < state.get('msg_expiry', 0):
|
||||
msg = html.escape(state['toolbar_msg'])
|
||||
res_html = f'<style fg="{c_warning}">⚙️ {msg}</style> | ' + base_html
|
||||
else:
|
||||
state['toolbar_msg'] = ''
|
||||
|
||||
# Pad with spaces to ensure the line is cleared when the message disappears
|
||||
return HTML(res_html + " " * 20)
|
||||
|
||||
from prompt_toolkit.completion import Completer, Completion
|
||||
class SlashCommandCompleter(Completer):
|
||||
def get_completions(self, document, complete_event):
|
||||
text = document.text_before_cursor
|
||||
if text.startswith('/'):
|
||||
parts = text.split()
|
||||
# Only autocomplete the first word
|
||||
if len(parts) <= 1 or (len(parts) == 1 and not text.endswith(' ')):
|
||||
cmd_part = parts[0] if parts else text
|
||||
commands = [
|
||||
('/os', 'Set device OS (e.g. cisco_ios)'),
|
||||
('/prompt', 'Override prompt regex'),
|
||||
('/architect', 'Switch to Architect persona'),
|
||||
('/engineer', 'Switch to Engineer persona'),
|
||||
('/trust', 'Enable auto-execute'),
|
||||
('/untrust', 'Disable auto-execute'),
|
||||
('/memorize', 'Add fact to memory'),
|
||||
('/clear', 'Clear memory')
|
||||
]
|
||||
for cmd, desc in commands:
|
||||
if cmd.startswith(cmd_part.lower()):
|
||||
yield Completion(cmd, start_position=-len(cmd_part), display_meta=desc)
|
||||
|
||||
copilot_completer = SlashCommandCompleter()
|
||||
|
||||
while True:
|
||||
# 2. Ask question
|
||||
from prompt_toolkit.styles import Style
|
||||
c_contrast = self._get_theme_color("contrast", "gray")
|
||||
ui_style = Style.from_dict({
|
||||
'bottom-toolbar': f'fg:{c_contrast}',
|
||||
})
|
||||
|
||||
session = PromptSession(
|
||||
history=self.history,
|
||||
input=self.pt_input,
|
||||
output=self.pt_output,
|
||||
completer=copilot_completer,
|
||||
reserve_space_for_menu=0,
|
||||
style=ui_style
|
||||
)
|
||||
try:
|
||||
# Use an inner try/finally to ensure that if something fails in prompt_async,
|
||||
# the terminal is not left in a strange state.
|
||||
question = await session.prompt_async(
|
||||
get_prompt_text,
|
||||
key_bindings=bindings,
|
||||
bottom_toolbar=get_toolbar
|
||||
)
|
||||
except (KeyboardInterrupt, EOFError):
|
||||
state['cancelled'] = True
|
||||
question = ""
|
||||
|
||||
if state['cancelled'] or not question.strip() or question.strip().lower() in ['cancel', 'exit', 'quit']:
|
||||
return "cancel", None, None
|
||||
|
||||
# 3. Process Input via AIService
|
||||
directive = self.ai_service.process_copilot_input(question, self.session_state)
|
||||
|
||||
if directive["action"] == "state_update":
|
||||
state['toolbar_msg'] = directive['message']
|
||||
state['msg_expiry'] = time.time() + 3 # 3 seconds timeout
|
||||
|
||||
async def delayed_refresh():
|
||||
await asyncio.sleep(3.1)
|
||||
# Only invalidate if the message hasn't been replaced by a newer one
|
||||
if state.get('toolbar_msg') == directive['message']:
|
||||
state['toolbar_msg'] = '' # Explicitly clear
|
||||
try:
|
||||
from prompt_toolkit.application.current import get_app
|
||||
app = get_app()
|
||||
if app: app.invalidate()
|
||||
except: pass
|
||||
asyncio.create_task(delayed_refresh())
|
||||
|
||||
# Move the cursor up and clear the line so the new prompt replaces the previous one
|
||||
sys.stdout.write('\x1b[1A\x1b[2K')
|
||||
sys.stdout.flush()
|
||||
continue
|
||||
else:
|
||||
# Clear the toolbar message when a real question is asked
|
||||
state['toolbar_msg'] = ''
|
||||
|
||||
clean_question = directive.get("clean_prompt", question)
|
||||
overrides = directive.get("overrides", {})
|
||||
|
||||
# Merge node_info with session_state and overrides
|
||||
merged_node_info = node_info.copy()
|
||||
if self.session_state['os']: merged_node_info['os'] = self.session_state['os']
|
||||
if self.session_state['prompt']: merged_node_info['prompt'] = self.session_state['prompt']
|
||||
merged_node_info['persona'] = self.session_state['persona']
|
||||
merged_node_info['trust'] = self.session_state['trust_mode']
|
||||
merged_node_info['memories'] = list(self.session_state['memories'])
|
||||
|
||||
for k, v in overrides.items():
|
||||
merged_node_info[k] = v
|
||||
|
||||
# Enrich question
|
||||
past = self.history.get_strings()
|
||||
if len(past) > 1:
|
||||
clean_past = [q for q in past[-6:-1] if not q.startswith('/')]
|
||||
if clean_past:
|
||||
history_text = "\n".join(f"- {q}" for q in clean_past)
|
||||
clean_question = f"Previous questions:\n{history_text}\n\nCurrent Question:\n{clean_question}"
|
||||
|
||||
# 3. AI Execution
|
||||
# Use persona from overrides (one-shot) or from session state
|
||||
active_persona = merged_node_info.get('persona', self.session_state.get('persona', 'engineer'))
|
||||
persona_color = self._get_theme_color(active_persona, fallback="cyan")
|
||||
|
||||
active_buffer = get_active_buffer()
|
||||
live_text = "Thinking..."
|
||||
panel = Panel(live_text, title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color)
|
||||
|
||||
def on_chunk(text):
|
||||
nonlocal live_text
|
||||
if live_text == "Thinking...": live_text = ""
|
||||
live_text += text
|
||||
|
||||
with Live(panel, console=self.console, refresh_per_second=10) as live:
|
||||
def update_live(t):
|
||||
live.update(Panel(Markdown(t), title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color))
|
||||
|
||||
wrapped_chunk = lambda t: (on_chunk(t), update_live(live_text))
|
||||
|
||||
# Check for interruption during AI call
|
||||
ai_task = asyncio.create_task(on_ai_call(active_buffer, clean_question, wrapped_chunk, merged_node_info))
|
||||
|
||||
try:
|
||||
while not ai_task.done():
|
||||
await asyncio.sleep(0.05)
|
||||
result = await ai_task
|
||||
except asyncio.CancelledError:
|
||||
return "cancel", None, None
|
||||
|
||||
if not result or result.get("error"):
|
||||
if result and result.get("error"): self.console.print(f"[red]Error: {result['error']}[/red]")
|
||||
return "cancel", None, None
|
||||
|
||||
# 4. Handle result
|
||||
if live_text == "Thinking..." and result.get("guide"):
|
||||
self.console.print(Panel(Markdown(result["guide"]), title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color))
|
||||
|
||||
commands = result.get("commands", [])
|
||||
if not commands:
|
||||
self.console.print("")
|
||||
return "continue", None, None
|
||||
|
||||
risk = result.get("risk_level", "low")
|
||||
risk_style = {"low": "success", "high": "warning", "destructive": "error"}.get(risk, "success")
|
||||
style_color = self._get_theme_color(risk_style, fallback="green")
|
||||
|
||||
cmd_text = "\n".join(f" {i+1}. {c}" for i, c in enumerate(commands))
|
||||
# Explicitly use 'bold style_color' for both TITLE and BORDER to ensure maximum consistency
|
||||
self.console.print(Panel(cmd_text, title=f"[bold {style_color}]Suggested Commands [{risk.upper()}][/bold {style_color}]", border_style=f"bold {style_color}"))
|
||||
|
||||
if merged_node_info.get('trust', False) and risk != "destructive":
|
||||
self.console.print(f"[dim]⚙️ Auto-executing (Trust Mode)[/dim]")
|
||||
return "send_all", commands, None
|
||||
|
||||
confirm_session = PromptSession(input=self.pt_input, output=self.pt_output)
|
||||
c_bindings = KeyBindings()
|
||||
@c_bindings.add('escape', eager=True)
|
||||
@c_bindings.add('c-c')
|
||||
def _(ev): ev.app.exit(result='n')
|
||||
|
||||
import html
|
||||
try:
|
||||
p_text = html.escape(f"Send? (y/n/e/range) [n]: ")
|
||||
# Use the EXACT same style_color and force bold="true" for Prompt-Toolkit
|
||||
action = await confirm_session.prompt_async(HTML(f'<style fg="{style_color}" bold="true">{p_text}</style>'), key_bindings=c_bindings)
|
||||
except (KeyboardInterrupt, EOFError):
|
||||
self.console.print("")
|
||||
return "continue", None, None
|
||||
|
||||
def parse_indices(text, max_len):
|
||||
"""Helper to parse '1-3, 5, 7' into [0, 1, 2, 4, 6]."""
|
||||
indices = []
|
||||
# Replace commas with spaces and split
|
||||
parts = text.replace(',', ' ').split()
|
||||
for part in parts:
|
||||
if '-' in part:
|
||||
try:
|
||||
start, end = map(int, part.split('-'))
|
||||
# Ensure inclusive and 0-indexed
|
||||
indices.extend(range(start-1, end))
|
||||
except: continue
|
||||
elif part.isdigit():
|
||||
indices.append(int(part)-1)
|
||||
# Filter valid indices and remove duplicates
|
||||
return [i for i in sorted(set(indices)) if 0 <= i < max_len]
|
||||
|
||||
action_l = (action or "n").lower().strip()
|
||||
if action_l in ('y', 'yes', 'all'):
|
||||
return "send_all", commands, None
|
||||
|
||||
# Check for numeric selection (e.g., "1, 2-4")
|
||||
if re.match(r'^[0-9,\-\s]+$', action_l):
|
||||
selected_idxs = parse_indices(action_l, len(commands))
|
||||
if selected_idxs:
|
||||
return "send_all", [commands[i] for i in selected_idxs], None
|
||||
|
||||
elif action_l.startswith('e'):
|
||||
# Check if it's a selective edit like 'e1-2'
|
||||
selection_str = action_l[1:].strip()
|
||||
if selection_str:
|
||||
idxs = parse_indices(selection_str, len(commands))
|
||||
cmds_to_edit = [commands[i] for i in idxs] if idxs else commands
|
||||
else:
|
||||
cmds_to_edit = commands
|
||||
|
||||
target = "\n".join(cmds_to_edit)
|
||||
e_bindings = KeyBindings()
|
||||
@e_bindings.add('c-j')
|
||||
def _(ev): ev.app.exit(result=ev.app.current_buffer.text)
|
||||
@e_bindings.add('escape', 'enter')
|
||||
def _(ev): ev.app.exit(result=ev.app.current_buffer.text)
|
||||
@e_bindings.add('escape')
|
||||
def _(ev): ev.app.exit(result='')
|
||||
|
||||
c_edit = self._get_theme_color("user_prompt", "cyan")
|
||||
import html
|
||||
e_text = html.escape("Edit (Ctrl+Enter or Esc+Enter to submit):\n")
|
||||
try:
|
||||
edited = await confirm_session.prompt_async(
|
||||
HTML(f'<style fg="{c_edit}">{e_text}</style>'),
|
||||
default=target, multiline=True, key_bindings=e_bindings
|
||||
)
|
||||
except (KeyboardInterrupt, EOFError):
|
||||
self.console.print("")
|
||||
return "continue", None, None
|
||||
|
||||
if edited and edited.strip():
|
||||
# Split by lines to ensure core.py applies delay between each command
|
||||
lines = [l.strip() for l in edited.split('\n') if l.strip()]
|
||||
return "custom", None, lines
|
||||
|
||||
self.console.print("")
|
||||
return "continue", None, None
|
||||
|
||||
return "cancel", None, None
|
||||
|
||||
finally:
|
||||
state['cancelled'] = True
|
||||
self.console.print("[dim]Returning to session...[/dim]")</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.terminal_ui.CopilotInterface.run_session"><code class="name flex">
<span>async def <span class="ident">run_session</span></span>(<span>self,<br>raw_bytes: bytes,<br>cmd_byte_positions: List[tuple],<br>node_info: dict,<br>on_ai_call: Callable)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">async def run_session(self,
                      raw_bytes: bytes,
                      cmd_byte_positions: List[tuple],
                      node_info: dict,
                      on_ai_call: Callable):
    """
    Runs the interactive Copilot session.
    on_ai_call: async function(active_buffer, question) -> result_dict
    """
    from rich.rule import Rule

    try:
        # Prepare UI state
        buffer = log_cleaner(raw_bytes.decode(errors='replace'))
        blocks = self.ai_service.build_context_blocks(raw_bytes, cmd_byte_positions, node_info)
        last_line = buffer.split('\n')[-1].strip() if buffer.strip() else "(prompt)"
        blocks.append((len(raw_bytes), last_line[:80]))

        state = {
            'context_cmd': 1,
            'total_cmds': len(blocks),
            'total_lines': len(buffer.split('\n')),
            'context_lines': min(50, len(buffer.split('\n'))),
            'context_mode': self.mode_range,
            'cancelled': False,
            'toolbar_msg': '',
            'msg_expiry': 0
        }

        # 1. Visual Separation
        self.console.print("")  # Actual line break
        self.console.print(Rule(title="[bold cyan] AI TERMINAL COPILOT [/bold cyan]", style="cyan"))
        self.console.print(Panel(
            "[dim]Type your question. Enter to send, Escape/Ctrl+C to cancel.\n"
            "Tab to change context mode. Ctrl+\u2191/\u2193 to adjust context. \u2191\u2193 for question history.[/dim]",
            border_style="cyan"
        ))
        self.console.print("\n")  # Small gap before the copilot prompt

        bindings = KeyBindings()
        @bindings.add('c-up')
        def _(event):
            if state['context_mode'] == self.mode_lines:
                state['context_lines'] = min(state['context_lines'] + 50, state['total_lines'])
            else:
                state['context_cmd'] = min(state['context_cmd'] + 1, state['total_cmds'])
            event.app.invalidate()
        @bindings.add('c-down')
        def _(event):
            if state['context_mode'] == self.mode_lines:
                state['context_lines'] = max(state['context_lines'] - 50, min(50, state['total_lines']))
            else:
                state['context_cmd'] = max(state['context_cmd'] - 1, 1)
            event.app.invalidate()
        @bindings.add('tab')
        def _(event):
            buf = event.current_buffer
            # If typing a slash command (no spaces yet), use tab to autocomplete inline
            if buf.text.startswith('/') and ' ' not in buf.text:
                buf.complete_next()
            else:
                state['context_mode'] = (state['context_mode'] + 1) % 3
            event.app.invalidate()
        @bindings.add('escape', eager=True)
        @bindings.add('c-c')
        def _(event):
            state['cancelled'] = True
            event.app.exit(result='')

        def get_active_buffer():
            if state['context_mode'] == self.mode_lines:
                return '\n'.join(buffer.split('\n')[-state['context_lines']:])
            idx = max(0, state['total_cmds'] - state['context_cmd'])
            start, preview = blocks[idx]
            if state['context_mode'] == self.mode_single and idx + 1 < state['total_cmds']:
                end = blocks[idx + 1][0]
                active_raw = raw_bytes[start:end]
            else:
                active_raw = raw_bytes[start:]
            return preview + "\n" + log_cleaner(active_raw.decode(errors='replace'))

        def get_prompt_text():
            import html
            # Always use user_prompt color for the Ask prompt
            color = self._get_theme_color("user_prompt", "cyan")

            if state['context_mode'] == self.mode_lines:
                text = html.escape(f"Ask [Ctx: {state['context_lines']}/{state['total_lines']}L]: ")
                return HTML(f'<style fg="{color}">{text}</style>')
            active = get_active_buffer()
            lines_count = len(active.split('\n'))
            mode_str = {self.mode_range: "Range", self.mode_single: "Cmd"}[state['context_mode']]
            text = html.escape(f"Ask [{mode_str} {state['context_cmd']} ~{lines_count}L]: ")
            return HTML(f'<style fg="{color}">{text}</style>')

        from prompt_toolkit.application.current import get_app

        def get_toolbar():
            import html
            app = get_app()
            c_warning = self._get_theme_color("warning", "yellow")

            if app and app.current_buffer:
                text = app.current_buffer.text
                # Only show command help while typing the first command and there are no spaces
                if text.startswith('/') and ' ' not in text:
                    commands = ['/os', '/prompt', '/architect', '/engineer', '/trust', '/untrust', '/memorize', '/clear']
                    matches = [c for c in commands if c.startswith(text.lower())]
                    if matches:
                        m_text = html.escape(f"Available: {' '.join(matches)}")
                        return HTML(f'<style fg="{c_warning}">{m_text}</style>' + " " * 20)

            m_label = {self.mode_range: "RANGE", self.mode_single: "SINGLE", self.mode_lines: "LINES"}[state['context_mode']]
            if state['context_mode'] == self.mode_lines:
                base_str = f'\u25b6 Ctrl+\u2191/\u2193 adjusts by 50 lines [Tab: {m_label}]'
            else:
                idx = max(0, state['total_cmds'] - state['context_cmd'])
                desc = blocks[idx][1]
                base_str = f'\u25b6 {desc} [Tab: {m_label}]'

            # Wrap base_str in a style to maintain consistency and avoid glitches
            # The fg color will be inherited from bottom-toolbar global style if not specified here
            base_html = f'<span>{html.escape(base_str)}</span>'

            res_html = base_html
            if state.get('toolbar_msg'):
                if time.time() < state.get('msg_expiry', 0):
                    msg = html.escape(state['toolbar_msg'])
                    res_html = f'<style fg="{c_warning}">⚙️ {msg}</style> | ' + base_html
                else:
                    state['toolbar_msg'] = ''

            # Pad with spaces to ensure the line is cleared when the message disappears
            return HTML(res_html + " " * 20)

        from prompt_toolkit.completion import Completer, Completion
        class SlashCommandCompleter(Completer):
            def get_completions(self, document, complete_event):
                text = document.text_before_cursor
                if text.startswith('/'):
                    parts = text.split()
                    # Only autocomplete the first word
                    if len(parts) <= 1 or (len(parts) == 1 and not text.endswith(' ')):
                        cmd_part = parts[0] if parts else text
                        commands = [
                            ('/os', 'Set device OS (e.g. cisco_ios)'),
                            ('/prompt', 'Override prompt regex'),
                            ('/architect', 'Switch to Architect persona'),
                            ('/engineer', 'Switch to Engineer persona'),
                            ('/trust', 'Enable auto-execute'),
                            ('/untrust', 'Disable auto-execute'),
                            ('/memorize', 'Add fact to memory'),
                            ('/clear', 'Clear memory')
                        ]
                        for cmd, desc in commands:
                            if cmd.startswith(cmd_part.lower()):
                                yield Completion(cmd, start_position=-len(cmd_part), display_meta=desc)

        copilot_completer = SlashCommandCompleter()

        while True:
            # 2. Ask question
            from prompt_toolkit.styles import Style
            c_contrast = self._get_theme_color("contrast", "gray")
            ui_style = Style.from_dict({
                'bottom-toolbar': f'fg:{c_contrast}',
            })

            session = PromptSession(
                history=self.history,
                input=self.pt_input,
                output=self.pt_output,
                completer=copilot_completer,
                reserve_space_for_menu=0,
                style=ui_style
            )
            try:
                # Use an inner try/except to ensure that if prompt_async fails,
                # the terminal is not left in a broken state.
                question = await session.prompt_async(
                    get_prompt_text,
                    key_bindings=bindings,
                    bottom_toolbar=get_toolbar
                )
            except (KeyboardInterrupt, EOFError):
                state['cancelled'] = True
                question = ""

            if state['cancelled'] or not question.strip() or question.strip().lower() in ['cancel', 'exit', 'quit']:
                return "cancel", None, None

            # 3. Process Input via AIService
            directive = self.ai_service.process_copilot_input(question, self.session_state)

            if directive["action"] == "state_update":
                state['toolbar_msg'] = directive['message']
                state['msg_expiry'] = time.time() + 3  # 3 seconds timeout

                async def delayed_refresh():
                    await asyncio.sleep(3.1)
                    # Only invalidate if the message hasn't been replaced by a newer one
                    if state.get('toolbar_msg') == directive['message']:
                        state['toolbar_msg'] = ''  # Explicitly clear
                        try:
                            from prompt_toolkit.application.current import get_app
                            app = get_app()
                            if app: app.invalidate()
                        except Exception: pass
                asyncio.create_task(delayed_refresh())

                # Move the cursor up and clear the line so the new prompt replaces the previous one
                sys.stdout.write('\x1b[1A\x1b[2K')
                sys.stdout.flush()
                continue
            else:
                # Clear the toolbar message when a real question is asked
                state['toolbar_msg'] = ''

            clean_question = directive.get("clean_prompt", question)
            overrides = directive.get("overrides", {})

            # Merge node_info with session_state and overrides
            merged_node_info = node_info.copy()
            if self.session_state['os']: merged_node_info['os'] = self.session_state['os']
            if self.session_state['prompt']: merged_node_info['prompt'] = self.session_state['prompt']
            merged_node_info['persona'] = self.session_state['persona']
            merged_node_info['trust'] = self.session_state['trust_mode']
            merged_node_info['memories'] = list(self.session_state['memories'])

            for k, v in overrides.items():
                merged_node_info[k] = v

            # Enrich question
            past = self.history.get_strings()
            if len(past) > 1:
                clean_past = [q for q in past[-6:-1] if not q.startswith('/')]
                if clean_past:
                    history_text = "\n".join(f"- {q}" for q in clean_past)
                    clean_question = f"Previous questions:\n{history_text}\n\nCurrent Question:\n{clean_question}"

            # 4. AI Execution
            # Use persona from overrides (one-shot) or from session state
            active_persona = merged_node_info.get('persona', self.session_state.get('persona', 'engineer'))
            persona_color = self._get_theme_color(active_persona, fallback="cyan")

            active_buffer = get_active_buffer()
            live_text = "Thinking..."
            panel = Panel(live_text, title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color)

            def on_chunk(text):
                nonlocal live_text
                if live_text == "Thinking...": live_text = ""
                live_text += text

            with Live(panel, console=self.console, refresh_per_second=10) as live:
                def update_live(t):
                    live.update(Panel(Markdown(t), title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color))

                wrapped_chunk = lambda t: (on_chunk(t), update_live(live_text))

                # Check for interruption during AI call
                ai_task = asyncio.create_task(on_ai_call(active_buffer, clean_question, wrapped_chunk, merged_node_info))

                try:
                    while not ai_task.done():
                        await asyncio.sleep(0.05)
                    result = await ai_task
                except asyncio.CancelledError:
                    return "cancel", None, None

            if not result or result.get("error"):
                if result and result.get("error"): self.console.print(f"[red]Error: {result['error']}[/red]")
                return "cancel", None, None

            # 5. Handle result
            if live_text == "Thinking..." and result.get("guide"):
                self.console.print(Panel(Markdown(result["guide"]), title=f"[bold {persona_color}]Copilot Guide[/bold {persona_color}]", border_style=persona_color))

            commands = result.get("commands", [])
            if not commands:
                self.console.print("")
                return "continue", None, None

            risk = result.get("risk_level", "low")
            risk_style = {"low": "success", "high": "warning", "destructive": "error"}.get(risk, "success")
            style_color = self._get_theme_color(risk_style, fallback="green")

            cmd_text = "\n".join(f" {i+1}. {c}" for i, c in enumerate(commands))
            # Explicitly use 'bold style_color' for both TITLE and BORDER to ensure maximum consistency
            self.console.print(Panel(cmd_text, title=f"[bold {style_color}]Suggested Commands [{risk.upper()}][/bold {style_color}]", border_style=f"bold {style_color}"))

            if merged_node_info.get('trust', False) and risk != "destructive":
                self.console.print("[dim]⚙️ Auto-executing (Trust Mode)[/dim]")
                return "send_all", commands, None

            confirm_session = PromptSession(input=self.pt_input, output=self.pt_output)
            c_bindings = KeyBindings()
            @c_bindings.add('escape', eager=True)
            @c_bindings.add('c-c')
            def _(ev): ev.app.exit(result='n')

            import html
            try:
                p_text = html.escape("Send? (y/n/e/range) [n]: ")
                # Use the EXACT same style_color and force bold="true" for Prompt-Toolkit
                action = await confirm_session.prompt_async(HTML(f'<style fg="{style_color}" bold="true">{p_text}</style>'), key_bindings=c_bindings)
            except (KeyboardInterrupt, EOFError):
                self.console.print("")
                return "continue", None, None

            def parse_indices(text, max_len):
                """Helper to parse '1-3, 5, 7' into [0, 1, 2, 4, 6]."""
                indices = []
                # Replace commas with spaces and split
                parts = text.replace(',', ' ').split()
                for part in parts:
                    if '-' in part:
                        try:
                            start, end = map(int, part.split('-'))
                            # Ensure inclusive and 0-indexed
                            indices.extend(range(start-1, end))
                        except ValueError: continue
                    elif part.isdigit():
                        indices.append(int(part)-1)
                # Filter valid indices and remove duplicates
                return [i for i in sorted(set(indices)) if 0 <= i < max_len]

            action_l = (action or "n").lower().strip()
            if action_l in ('y', 'yes', 'all'):
                return "send_all", commands, None

            # Check for numeric selection (e.g., "1, 2-4")
            if re.match(r'^[0-9,\-\s]+$', action_l):
                selected_idxs = parse_indices(action_l, len(commands))
                if selected_idxs:
                    return "send_all", [commands[i] for i in selected_idxs], None

            elif action_l.startswith('e'):
                # Check if it's a selective edit like 'e1-2'
                selection_str = action_l[1:].strip()
                if selection_str:
                    idxs = parse_indices(selection_str, len(commands))
                    cmds_to_edit = [commands[i] for i in idxs] if idxs else commands
                else:
                    cmds_to_edit = commands

                target = "\n".join(cmds_to_edit)
                e_bindings = KeyBindings()
                @e_bindings.add('c-j')
                def _(ev): ev.app.exit(result=ev.app.current_buffer.text)
                @e_bindings.add('escape', 'enter')
                def _(ev): ev.app.exit(result=ev.app.current_buffer.text)
                @e_bindings.add('escape')
                def _(ev): ev.app.exit(result='')

                c_edit = self._get_theme_color("user_prompt", "cyan")
                import html
                e_text = html.escape("Edit (Ctrl+Enter or Esc+Enter to submit):\n")
                try:
                    edited = await confirm_session.prompt_async(
                        HTML(f'<style fg="{c_edit}">{e_text}</style>'),
                        default=target, multiline=True, key_bindings=e_bindings
                    )
                except (KeyboardInterrupt, EOFError):
                    self.console.print("")
                    return "continue", None, None

                if edited and edited.strip():
                    # Split by lines to ensure core.py applies delay between each command
                    lines = [l.strip() for l in edited.split('\n') if l.strip()]
                    return "custom", None, lines

            self.console.print("")
            return "continue", None, None

        return "cancel", None, None

    finally:
        state['cancelled'] = True
        self.console.print("[dim]Returning to session...[/dim]")</code></pre>
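The listing escapes dynamic text with `html.escape` before wrapping it in prompt_toolkit `HTML(...)` markup; this is what keeps device prompts such as `6WIND-PE1>` from being interpreted as tags (or double-escaped into literal `&gt;` on screen). A minimal stdlib-only illustration of the escape/unescape round trip:

```python
import html

# Angle brackets in device prompts must be escaped exactly once before
# being embedded in prompt_toolkit HTML markup.
prompt = "6WIND-PE1>"
escaped = html.escape(prompt)
print(escaped)                  # 6WIND-PE1&gt;
print(html.unescape(escaped))   # 6WIND-PE1>
```

Escaping twice would yield `6WIND-PE1&amp;gt;`, which renders as the literal text `&gt;` — the double-escaping bug this release fixes.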
</details>
<div class="desc"><p>Runs the interactive Copilot session.
on_ai_call: async function(active_buffer, question) -> result_dict</p></div>
</dd>
</dl>
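The confirmation prompt accepts selections such as `1, 2-4`, which the `parse_indices` helper in the source resolves to 0-based command indices. A standalone copy for illustration (behaviour matches the listing above):

```python
def parse_indices(text, max_len):
    """Parse a selection string like '1-3, 5, 7' into 0-based indices."""
    indices = []
    # Treat commas as spaces, then handle each token
    for part in text.replace(',', ' ').split():
        if '-' in part:
            try:
                start, end = map(int, part.split('-'))
                # Inclusive range, shifted to 0-indexed
                indices.extend(range(start - 1, end))
            except ValueError:
                continue
        elif part.isdigit():
            indices.append(int(part) - 1)
    # Deduplicate, sort, and drop out-of-bounds indices
    return [i for i in sorted(set(indices)) if 0 <= i < max_len]

print(parse_indices("1-3, 5, 7", 10))  # [0, 1, 2, 4, 6]
```

Invalid tokens are silently skipped and out-of-range indices are filtered, so a stray entry never raises mid-session.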
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.terminal_ui.CopilotInterface" href="#connpy.cli.terminal_ui.CopilotInterface">CopilotInterface</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.terminal_ui.CopilotInterface.run_session" href="#connpy.cli.terminal_ui.CopilotInterface.run_session">run_session</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.validators API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -508,7 +508,7 @@ el.replaceWith(d);
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.grpc_layer.connpy_pb2 API documentation</title>
|
||||
<meta name="description" content="Generated protocol buffer code.">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -62,7 +62,7 @@ el.replaceWith(d);
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.AIResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>The type of the None singleton.</p></div>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
@@ -81,7 +81,7 @@ el.replaceWith(d);
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.AskRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>The type of the None singleton.</p></div>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
@@ -100,7 +100,7 @@ el.replaceWith(d);
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.BoolResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>The type of the None singleton.</p></div>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
@@ -119,7 +119,45 @@ el.replaceWith(d);
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.BulkRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>The type of the None singleton.</p></div>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.CopilotRequest"><code class="flex name class">
|
||||
<span>class <span class="ident">CopilotRequest</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>A ProtocolMessage</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li>google._upb._message.Message</li>
|
||||
<li>google.protobuf.message.Message</li>
|
||||
</ul>
|
||||
<h3>Class variables</h3>
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.CopilotRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.CopilotResponse"><code class="flex name class">
|
||||
<span>class <span class="ident">CopilotResponse</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>A ProtocolMessage</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li>google._upb._message.Message</li>
|
||||
<li>google.protobuf.message.Message</li>
|
||||
</ul>
|
||||
<h3>Class variables</h3>
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.CopilotResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
@@ -138,7 +176,7 @@ el.replaceWith(d);
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.DeleteRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>The type of the None singleton.</p></div>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
@@ -157,7 +195,7 @@ el.replaceWith(d);
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.ExportRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>The type of the None singleton.</p></div>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
@@ -176,7 +214,7 @@ el.replaceWith(d);
|
||||
<dl>
|
||||
<dt id="connpy.grpc_layer.connpy_pb2.FilterRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -195,7 +233,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.FullReplaceRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -214,7 +252,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.IdRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -233,7 +271,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.IntRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -252,7 +290,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.InteractRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -271,7 +309,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.InteractResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -290,7 +328,26 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.ListRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc_layer.connpy_pb2.MCPRequest"><code class="flex name class">
<span>class <span class="ident">MCPRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.MCPRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -309,7 +366,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.MessageValue.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -328,7 +385,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.MoveRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -347,7 +404,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.NodeRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -366,7 +423,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.NodeRunResult.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -385,7 +442,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.PluginRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -404,7 +461,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.ProfileRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -423,7 +480,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.ProviderRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -442,7 +499,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.RunRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -461,7 +518,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.ScriptRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -480,7 +537,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.StringRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -499,7 +556,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.StringResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -518,7 +575,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.StructRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -537,7 +594,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.StructResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -556,7 +613,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.TestRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -575,7 +632,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.UpdateRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -594,7 +651,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.connpy_pb2.ValueResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -638,6 +695,18 @@ el.replaceWith(d);
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc_layer.connpy_pb2.CopilotRequest" href="#connpy.grpc_layer.connpy_pb2.CopilotRequest">CopilotRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc_layer.connpy_pb2.CopilotRequest.DESCRIPTOR" href="#connpy.grpc_layer.connpy_pb2.CopilotRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc_layer.connpy_pb2.CopilotResponse" href="#connpy.grpc_layer.connpy_pb2.CopilotResponse">CopilotResponse</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc_layer.connpy_pb2.CopilotResponse.DESCRIPTOR" href="#connpy.grpc_layer.connpy_pb2.CopilotResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc_layer.connpy_pb2.DeleteRequest" href="#connpy.grpc_layer.connpy_pb2.DeleteRequest">DeleteRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc_layer.connpy_pb2.DeleteRequest.DESCRIPTOR" href="#connpy.grpc_layer.connpy_pb2.DeleteRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
@@ -692,6 +761,12 @@ el.replaceWith(d);
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc_layer.connpy_pb2.MCPRequest" href="#connpy.grpc_layer.connpy_pb2.MCPRequest">MCPRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc_layer.connpy_pb2.MCPRequest.DESCRIPTOR" href="#connpy.grpc_layer.connpy_pb2.MCPRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc_layer.connpy_pb2.MessageValue" href="#connpy.grpc_layer.connpy_pb2.MessageValue">MessageValue</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc_layer.connpy_pb2.MessageValue.DESCRIPTOR" href="#connpy.grpc_layer.connpy_pb2.MessageValue.DESCRIPTOR">DESCRIPTOR</a></code></li>
@@ -793,7 +868,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

File diff suppressed because it is too large
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc_layer API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -102,7 +102,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc_layer.remote_plugin_pb2 API documentation</title>
<meta name="description" content="Generated protocol buffer code.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -62,7 +62,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.remote_plugin_pb2.IdRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -81,7 +81,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.remote_plugin_pb2.OutputChunk.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -100,7 +100,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.remote_plugin_pb2.PluginInvokeRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -119,7 +119,7 @@ el.replaceWith(d);
<dl>
<dt id="connpy.grpc_layer.remote_plugin_pb2.StringResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"><p>The type of the None singleton.</p></div>
<div class="desc"></div>
</dd>
</dl>
</dd>
@@ -168,7 +168,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc_layer.remote_plugin_pb2_grpc API documentation</title>
<meta name="description" content="Client and server classes corresponding to protobuf-defined services.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -366,7 +366,7 @@ def invoke_plugin(request,
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc_layer.server API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -96,7 +96,7 @@ el.replaceWith(d);
interceptors = [LoggingInterceptor()] if debug else []
server = grpc.server(futures.ThreadPoolExecutor(max_workers=10), interceptors=interceptors)

connpy_pb2_grpc.add_NodeServiceServicer_to_server(NodeServicer(config), server)
connpy_pb2_grpc.add_NodeServiceServicer_to_server(NodeServicer(config, debug=debug), server)
connpy_pb2_grpc.add_ProfileServiceServicer_to_server(ProfileServicer(config), server)
connpy_pb2_grpc.add_ConfigServiceServicer_to_server(ConfigServicer(config), server)
plugin_servicer = PluginServicer(config)
@@ -245,6 +245,22 @@ el.replaceWith(d);
res = self.service.confirm(request.value)
return connpy_pb2.BoolResponse(value=res)

@handle_errors
def ask_copilot(self, request, context):
import json
node_info = json.loads(request.node_info_json) if request.node_info_json else None
result = self.service.ask_copilot(
request.terminal_buffer,
request.user_question,
node_info
)
return connpy_pb2.CopilotResponse(
commands=result.get("commands", []),
guide=result.get("guide", ""),
risk_level=result.get("risk_level", "low"),
error=result.get("error") or ""
)

@handle_errors
def list_sessions(self, request, context):
return connpy_pb2.ValueResponse(data=to_value(self.service.list_sessions()))
@@ -259,6 +275,17 @@ el.replaceWith(d);
self.service.configure_provider(request.provider, request.model, request.api_key)
return Empty()

@handle_errors
def configure_mcp(self, request, context):
self.service.configure_mcp(
request.name,
url=request.url or None,
enabled=request.enabled,
auto_load_on_os=request.auto_load_on_os or None,
remove=request.remove
)
return Empty()

@handle_errors
def load_session_data(self, request, context):
return connpy_pb2.StructResponse(data=to_struct(self.service.load_session_data(request.value)))</code></pre>
@@ -273,6 +300,8 @@ el.replaceWith(d);
<li><code><b><a title="connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer" href="connpy_pb2_grpc.html#connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer">AIServiceServicer</a></b></code>:
<ul class="hlist">
<li><code><a title="connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.ask" href="connpy_pb2_grpc.html#connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.ask">ask</a></code></li>
<li><code><a title="connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.ask_copilot" href="connpy_pb2_grpc.html#connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.ask_copilot">ask_copilot</a></code></li>
<li><code><a title="connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.configure_mcp" href="connpy_pb2_grpc.html#connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.configure_mcp">configure_mcp</a></code></li>
<li><code><a title="connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.configure_provider" href="connpy_pb2_grpc.html#connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.configure_provider">configure_provider</a></code></li>
<li><code><a title="connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.confirm" href="connpy_pb2_grpc.html#connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.confirm">confirm</a></code></li>
<li><code><a title="connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.delete_session" href="connpy_pb2_grpc.html#connpy.grpc_layer.connpy_pb2_grpc.AIServiceServicer.delete_session">delete_session</a></code></li>
@@ -618,7 +647,7 @@ interceptor chooses to service this RPC, or None otherwise.</p></div>
</dd>
<dt id="connpy.grpc_layer.server.NodeServicer"><code class="flex name class">
<span>class <span class="ident">NodeServicer</span></span>
<span>(</span><span>config)</span>
<span>(</span><span>config, debug=False)</span>
</code></dt>
<dd>
<details class="source">
@@ -626,8 +655,13 @@ interceptor chooses to service this RPC, or None otherwise.</p></div>
<span>Expand source code</span>
</summary>
<pre><code class="python">class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):
def __init__(self, config):
def __init__(self, config, debug=False):
self.service = NodeService(config)
self.server_debug = debug
if debug:
from rich.console import Console
from ..printer import connpy_theme, get_original_stdout
self.server_console = Console(theme=connpy_theme, file=get_original_stdout())

@handle_errors
def interact_node(self, request_iterator, context):
@@ -650,8 +684,8 @@ interceptor chooses to service this RPC, or None otherwise.</p></div>
sftp = first_req.sftp
debug = first_req.debug

if debug:
printer.console.print(f"[debug][DEBUG][/debug] gRPC interact_node request for: [bold cyan]{unique_id}[/bold cyan]")
if self.server_debug:
self.server_console.print(f"[debug][DEBUG][/debug] gRPC interact_node request for: [bold cyan]{unique_id}[/bold cyan]")

if first_req.connection_params_json:
import json
@@ -710,7 +744,39 @@ interceptor chooses to service this RPC, or None otherwise.</p></div>
if sftp:
n.protocol = "sftp"

connect = n._connect(debug=debug)
# Build a logger that captures debug messages as ANSI-colored bytes for the client
debug_chunks = []
if debug:
from io import StringIO
from rich.console import Console as RichConsole
from ..printer import connpy_theme
from .. import printer as _printer

def remote_logger(msg_type, message):
buf = StringIO()
c = RichConsole(file=buf, force_terminal=True, width=120, theme=connpy_theme)
if msg_type == "debug":
c.print(_printer._format_multiline("i", f"[DEBUG] {message}", style="info"))
elif msg_type == "success":
c.print(_printer._format_multiline("✓", message, style="success"))
elif msg_type == "error":
c.print(_printer._format_multiline("✗", message, style="error"))
else:
c.print(str(message))
rendered = buf.getvalue()
if rendered:
# Raw TTY needs \r\n instead of \n
rendered = rendered.replace('\n', '\r\n')
debug_chunks.append(rendered.encode())
else:
remote_logger = None

connect = n._connect(debug=debug, logger=remote_logger)

# Send debug output to client before checking result (always show the command)
for chunk in debug_chunks:
yield connpy_pb2.InteractResponse(stdout_data=chunk)

if connect != True:
yield connpy_pb2.InteractResponse(success=False, error_message=str(connect))
return
@@ -737,7 +803,160 @@ interceptor chooses to service this RPC, or None otherwise.</p></div>
except Exception:
pass

asyncio.run(n._async_interact_loop(remote_stream, resize_callback))
async def remote_copilot_handler(buffer, node_info, stream, child_fd, cmd_byte_positions=None):
import json
import asyncio
import os

if node_info is None:
node_info = {}

node_info_json = json.dumps(node_info)

# Convert buffer to string if it's bytes for the preview
preview_str = buffer[-200:].decode(errors='replace') if isinstance(buffer, bytes) else str(buffer)[-200:]

# 1. Send prompt to client
response_queue.put(connpy_pb2.InteractResponse(
copilot_prompt=True,
copilot_buffer_preview=preview_str,
copilot_node_info_json=node_info_json
))

while True:
# 2. Await the question from client via the copilot_queue
import threading
def preload_ai_deps():
try:
import litellm
except Exception:
pass
threading.Thread(target=preload_ai_deps, daemon=True).start()

try:
req_data = await asyncio.wait_for(remote_stream.copilot_queue.get(), timeout=120)
if not req_data: return
if "question" not in req_data or not req_data["question"] or req_data["question"] == "CANCEL" or req_data.get("action") == "cancel":
os.write(child_fd, b'\x15\r')
return
question = req_data["question"]

merged_node_info_str = req_data.get("node_info_json", "")
if merged_node_info_str:
try:
merged_node_info = json.loads(merged_node_info_str)
node_info.update(merged_node_info)
except: pass

context_buffer = req_data.get("context_buffer", "")
if context_buffer.startswith('{"context_start_pos"'):
try:
parsed = json.loads(context_buffer)
start_pos = parsed["context_start_pos"]
selected_raw = raw_bytes[start_pos:]
context_buffer = n._logclean(selected_raw.decode(errors='replace'), var=True)
except Exception:
context_buffer = buffer
elif not context_buffer:
context_buffer = buffer
except asyncio.TimeoutError:
os.write(child_fd, b'\x15\r')
return

# 3. Call AI Service with streaming
from ..services.ai_service import AIService
service = AIService(self.service.config)

def chunk_callback(chunk_text):
if chunk_text:
response_queue.put(connpy_pb2.InteractResponse(
copilot_stream_chunk=chunk_text
))

# Create a clean version of node_info for the AI to save tokens and match local CLI behavior
ai_node_info = {k: v for k, v in node_info.items() if k not in ("context_blocks", "full_buffer")}

ai_task = asyncio.create_task(service.aask_copilot(context_buffer, question, ai_node_info, chunk_callback=chunk_callback))
wait_action_task = asyncio.create_task(remote_stream.copilot_queue.get())

done, pending = await asyncio.wait(
[ai_task, wait_action_task],
return_when=asyncio.FIRST_COMPLETED
)

if wait_action_task in done:
req_data = wait_action_task.result()
ai_task.cancel()
if req_data.get("action") == "cancel" or req_data.get("question") == "CANCEL":
os.write(child_fd, b'\x15\r')
return
continue # Loop back instead of returning to keep session alive
else:
wait_action_task.cancel()
result = ai_task.result()
if not result:
os.write(child_fd, b'\x15\r')
return

# 4. Send response back to client
response_queue.put(connpy_pb2.InteractResponse(
copilot_response_json=json.dumps(result)
))

# 5. Wait for user action
try:
action_data = await asyncio.wait_for(remote_stream.copilot_queue.get(), timeout=60)
if not action_data: return
action = action_data.get("action", "cancel")

if action == "continue":
continue # Loop back for next question

if action == "cancel":
os.write(child_fd, b'\x15\r')
return
except asyncio.TimeoutError:
os.write(child_fd, b'\x15\r')
return

if action == "send_all":
commands = result.get("commands", [])
os.write(child_fd, b'\x15') # Ctrl+U to clear line
await asyncio.sleep(0.1)

# Prepend screen length command to avoid pagination
if "screen_length_command" in n.tags:
os.write(child_fd, (n.tags["screen_length_command"] + "\n").encode())
response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=n.tags["screen_length_command"]))
await asyncio.sleep(0.8)

for cmd in commands:
os.write(child_fd, (cmd + "\n").encode())
response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=cmd))
await asyncio.sleep(0.8)
return
elif action.startswith("custom:"):
custom_cmds = action[7:]
os.write(child_fd, b'\x15')
await asyncio.sleep(0.1)

# Prepend screen length command to avoid pagination
if "screen_length_command" in n.tags:
os.write(child_fd, (n.tags["screen_length_command"] + "\n").encode())
response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=n.tags["screen_length_command"]))
await asyncio.sleep(0.8)

for cmd in custom_cmds.split('\n'):
if cmd.strip():
os.write(child_fd, (cmd.strip() + "\n").encode())
response_queue.put(connpy_pb2.InteractResponse(copilot_injected_command=cmd.strip()))
await asyncio.sleep(0.8)
return
else:
os.write(child_fd, b'\x15\r')
return

asyncio.run(n._async_interact_loop(remote_stream, resize_callback, copilot_handler=remote_copilot_handler))
except Exception as e:
pass
finally:
@@ -746,14 +965,19 @@ interceptor chooses to service this RPC, or None otherwise.</p></div>

t_loop = threading.Thread(target=run_async_loop, daemon=True)
t_loop.start()

def response_generator():
while True:
data = response_queue.get()
if data is None:
if debug:
printer.console.print(f"[debug][DEBUG][/debug] gRPC interact_node session closed for: [bold cyan]{unique_id}[/bold cyan]")
if self.server_debug:
self.server_console.print(f"[debug][DEBUG][/debug] gRPC interact_node session closed for: [bold cyan]{unique_id}[/bold cyan]")
break
if isinstance(data, connpy_pb2.InteractResponse):
yield data
else:
yield connpy_pb2.InteractResponse(stdout_data=data)
yield from response_generator()

@handle_errors
def list_nodes(self, request, context):
f = request.filter_str if request.filter_str else None
@@ -1319,7 +1543,7 @@ interceptor chooses to service this RPC, or None otherwise.</p></div>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc_layer.stubs API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -200,21 +200,33 @@ el.replaceWith(d);

if response.debug_message:
if debug:
if live_display:
try: live_display.stop()
except: pass
if status:
try: status.stop()
except: pass
printer.console.print(Text.from_ansi(response.debug_message))
if status:
if live_display:
try: live_display.start()
except: pass
elif status:
try: status.start()
except: pass
continue

if response.important_message:
if live_display:
try: live_display.stop()
except: pass
if status:
try: status.stop()
except: pass
printer.console.print(Text.from_ansi(response.important_message))
if status:
if live_display:
try: live_display.start()
except: pass
elif status:
try: status.start()
except: pass
continue
@@ -223,14 +235,33 @@ el.replaceWith(d);
if response.text_chunk:
|
||||
full_content += response.text_chunk
|
||||
|
||||
if status and not debug:
|
||||
# Update the spinner line with a preview of the response
|
||||
preview = full_content.replace("\n", " ").strip()
|
||||
if len(preview) > 60: preview = preview[:57] + "..."
|
||||
status.update(f"[ai_status]{preview}")
|
||||
if not live_display:
|
||||
if status:
|
||||
try: status.stop()
|
||||
except: pass
|
||||
|
||||
from rich.console import Console as RichConsole
|
||||
from ..printer import connpy_theme, get_original_stdout
|
||||
stable_console = RichConsole(theme=connpy_theme, file=get_original_stdout())
|
||||
|
||||
# We default to Engineer title during stream, final result will correct it if needed
|
||||
live_display = Live(
|
||||
Panel(Markdown(full_content), title="[bold engineer]Network Engineer[/bold engineer]", border_style="engineer", expand=False),
|
||||
console=stable_console,
|
||||
refresh_per_second=8,
|
||||
transient=False
|
||||
)
|
||||
live_display.start()
|
||||
else:
|
||||
live_display.update(
|
||||
Panel(Markdown(full_content), title="[bold engineer]Network Engineer[/bold engineer]", border_style="engineer", expand=False)
|
||||
)
|
||||
continue
|
||||

        if response.is_final:
            if live_display:
                try: live_display.stop()
                except: pass
            # Final stop for status to ensure it disappears before the panel
            if status:
                try: status.stop()
@@ -242,9 +273,12 @@ el.replaceWith(d);
            role_label = "Network Architect" if responder == "architect" else "Network Engineer"
            title = f"[bold {alias}]{role_label}[/bold {alias}]"

            # Always print the final Panel
            content_to_print = full_content or final_result.get("response", "")
            if content_to_print:
                if live_display:
                    # Re-render the final frame with correct title/colors
                    live_display.update(Panel(Markdown(content_to_print), title=title, border_style=alias, expand=False))
                else:
                    printer.console.print(Panel(Markdown(content_to_print), title=title, border_style=alias, expand=False))
            break
    except Exception as e:
@@ -277,6 +311,17 @@ el.replaceWith(d);
        req = connpy_pb2.ProviderRequest(provider=provider, model=model or "", api_key=api_key or "")
        self.stub.configure_provider(req)

    @handle_errors
    def configure_mcp(self, name, url=None, enabled=True, auto_load_on_os=None, remove=False):
        req = connpy_pb2.MCPRequest(
            name=name,
            url=url or "",
            enabled=enabled,
            auto_load_on_os=auto_load_on_os or "",
            remove=remove
        )
        self.stub.configure_mcp(req)

    @handle_errors
    def load_session_data(self, session_id):
        return from_struct(self.stub.load_session_data(connpy_pb2.StringRequest(value=session_id)).data)</code></pre>
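The spinner preview in the streaming listing above collapses the accumulated response into a single line and truncates it to fit beside the status spinner. A minimal sketch of that logic as a standalone helper (the helper name is illustrative; the 60/57 limits mirror the listing):

```python
def make_preview(full_content: str, limit: int = 60) -> str:
    # Collapse newlines so the preview fits on the single spinner line.
    preview = full_content.replace("\n", " ").strip()
    if len(preview) > limit:
        # Reserve three characters for the trailing ellipsis.
        preview = preview[:limit - 3] + "..."
    return preview
```

This keeps the spinner line exactly `limit` characters or shorter regardless of how much streamed text has accumulated.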
@@ -393,21 +438,33 @@ def ask(self, input_text, dryrun=False, chat_history=None, session_id=None, debu

        if response.debug_message:
            if debug:
                if live_display:
                    try: live_display.stop()
                    except: pass
                if status:
                    try: status.stop()
                    except: pass
                printer.console.print(Text.from_ansi(response.debug_message))
                if status:
                    if live_display:
                        try: live_display.start()
                        except: pass
                    elif status:
                        try: status.start()
                        except: pass
            continue

        if response.important_message:
            if live_display:
                try: live_display.stop()
                except: pass
            if status:
                try: status.stop()
                except: pass
            printer.console.print(Text.from_ansi(response.important_message))
            if status:
                if live_display:
                    try: live_display.start()
                    except: pass
                elif status:
                    try: status.start()
                    except: pass
            continue
@@ -416,14 +473,33 @@ def ask(self, input_text, dryrun=False, chat_history=None, session_id=None, debu
        if response.text_chunk:
            full_content += response.text_chunk

            if status and not debug:
                # Update the spinner line with a preview of the response
                preview = full_content.replace("\n", " ").strip()
                if len(preview) > 60: preview = preview[:57] + "..."
                status.update(f"[ai_status]{preview}")
            if not live_display:
                if status:
                    try: status.stop()
                    except: pass

                from rich.console import Console as RichConsole
                from ..printer import connpy_theme, get_original_stdout
                stable_console = RichConsole(theme=connpy_theme, file=get_original_stdout())

                # We default to Engineer title during stream, final result will correct it if needed
                live_display = Live(
                    Panel(Markdown(full_content), title="[bold engineer]Network Engineer[/bold engineer]", border_style="engineer", expand=False),
                    console=stable_console,
                    refresh_per_second=8,
                    transient=False
                )
                live_display.start()
            else:
                live_display.update(
                    Panel(Markdown(full_content), title="[bold engineer]Network Engineer[/bold engineer]", border_style="engineer", expand=False)
                )
            continue

        if response.is_final:
            if live_display:
                try: live_display.stop()
                except: pass
            # Final stop for status to ensure it disappears before the panel
            if status:
                try: status.stop()
@@ -435,9 +511,12 @@ def ask(self, input_text, dryrun=False, chat_history=None, session_id=None, debu
            role_label = "Network Architect" if responder == "architect" else "Network Engineer"
            title = f"[bold {alias}]{role_label}[/bold {alias}]"

            # Always print the final Panel
            content_to_print = full_content or final_result.get("response", "")
            if content_to_print:
                if live_display:
                    # Re-render the final frame with correct title/colors
                    live_display.update(Panel(Markdown(content_to_print), title=title, border_style=alias, expand=False))
                else:
                    printer.console.print(Panel(Markdown(content_to_print), title=title, border_style=alias, expand=False))
            break
    except Exception as e:
@@ -455,6 +534,27 @@ def ask(self, input_text, dryrun=False, chat_history=None, session_id=None, debu
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.grpc_layer.stubs.AIStub.configure_mcp"><code class="name flex">
<span>def <span class="ident">configure_mcp</span></span>(<span>self, name, url=None, enabled=True, auto_load_on_os=None, remove=False)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">@handle_errors
def configure_mcp(self, name, url=None, enabled=True, auto_load_on_os=None, remove=False):
    req = connpy_pb2.MCPRequest(
        name=name,
        url=url or "",
        enabled=enabled,
        auto_load_on_os=auto_load_on_os or "",
        remove=remove
    )
    self.stub.configure_mcp(req)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.grpc_layer.stubs.AIStub.configure_provider"><code class="name flex">
<span>def <span class="ident">configure_provider</span></span>(<span>self, provider, model=None, api_key=None)</span>
</code></dt>
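`configure_mcp` above maps optional Python arguments onto protobuf string fields with `x or ""`, since proto3 string fields cannot hold `None`. A minimal sketch of that normalization step in isolation (the function name and returned dict are illustrative, not part of the stub API):

```python
def normalize_mcp_fields(name, url=None, enabled=True, auto_load_on_os=None, remove=False):
    # Protobuf string fields cannot carry None, so optional arguments
    # are coerced to empty strings before the request is built.
    return {
        "name": name,
        "url": url or "",
        "enabled": enabled,
        "auto_load_on_os": auto_load_on_os or "",
        "remove": remove,
    }
```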
@@ -924,15 +1024,98 @@ def set_reserved_names(self, names):
        self.remote_host = remote_host
        self.config = config

    def _handle_remote_copilot(self, res, request_queue, response_queue, client_buffer_bytes, cmd_byte_positions, pause_generator, resume_generator, old_tty):
        import json, asyncio, termios, sys, tty, queue
        from ..core import copilot_terminal_mode
        from . import connpy_pb2

        pause_generator()

        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
        interface = CopilotInterface(
            self.config,
            history=getattr(self, 'copilot_history', None),
            session_state=getattr(self, 'copilot_state', None)
        )
        self.copilot_history = interface.history
        self.copilot_state = interface.session_state

        node_info = json.loads(res.copilot_node_info_json) if res.copilot_node_info_json else {}

        async def on_ai_call_remote(active_buffer, question, chunk_callback, merged_node_info):
            # Send request to server
            request_queue.put(connpy_pb2.InteractRequest(
                copilot_question=question,
                copilot_context_buffer=active_buffer,
                copilot_node_info_json=json.dumps(merged_node_info)
            ))
            # Wait for chunks from server
            while True:
                try:
                    chunk_res = response_queue.get(timeout=0.1)
                    if chunk_res is None: return {"error": "Server disconnected"}
                    if chunk_res.copilot_stream_chunk:
                        chunk_callback(chunk_res.copilot_stream_chunk)
                    elif chunk_res.copilot_response_json:
                        return json.loads(chunk_res.copilot_response_json)
                except queue.Empty:
                    await asyncio.sleep(0.05)

        # Wrap in async loop
        async def run_remote_copilot():
            while True:
                action, commands, custom_cmd = await interface.run_session(
                    raw_bytes=bytes(client_buffer_bytes),
                    cmd_byte_positions=cmd_byte_positions,
                    node_info=node_info,
                    on_ai_call=on_ai_call_remote
                )

                if action == "continue":
                    # Send continue signal to server to loop back for another question
                    request_queue.put(connpy_pb2.InteractRequest(copilot_action="continue"))
                    continue

                return action, commands, custom_cmd

        with copilot_terminal_mode():
            action, commands, custom_cmd = asyncio.run(run_remote_copilot())

        # Prepare final action for server
        action_sent = "cancel"
        if action == "send_all" and commands:
            # In remote mode, send the selected commands as a custom block
            # so the server executes exactly what the user picked (e.g., selection '1')
            action_sent = f"custom:{chr(10).join(commands)}"
        elif action == "custom" and custom_cmd:
            action_sent = f"custom:{chr(10).join(custom_cmd)}"
        request_queue.put(connpy_pb2.InteractRequest(copilot_action=action_sent))
        resume_generator()
        tty.setraw(sys.stdin.fileno())
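The copilot action strings above encode the user's selected commands as a single `custom:`-prefixed payload, joined with newlines via `chr(10)`. A sketch of that encoding plus a matching decode (the decode side is an assumption for illustration; only the encode appears in the listing):

```python
def encode_copilot_action(action, commands=None):
    # "send_all" and "custom" both travel as a custom block so the
    # server executes exactly the commands the user picked.
    if action in ("send_all", "custom") and commands:
        return "custom:" + "\n".join(commands)
    return "cancel"

def decode_copilot_action(action_sent):
    # Hypothetical inverse: split the payload back into commands.
    if action_sent.startswith("custom:"):
        return action_sent[len("custom:"):].split("\n")
    return []
```

Round-tripping a command list through encode/decode yields the original list, which is what lets the server replay the exact selection.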
    @handle_errors
    def connect_node(self, unique_id, sftp=False, debug=False, logger=None):
        import sys
        import select
        import tty
        import termios
        import queue
        import os
        import threading

        request_queue = queue.Queue()
        client_buffer_bytes = bytearray()
        cmd_byte_positions = [(0, None)]
        pause_stdin = [False]
        wake_r, wake_w = os.pipe()

        def pause_generator():
            pause_stdin[0] = True
            os.write(wake_w, b'\x00')

        def resume_generator():
            pause_stdin[0] = False

        def request_generator():
            cols, rows = 80, 24
            try:
@@ -946,12 +1129,31 @@ def set_reserved_names(self, names):
            )

            while True:
                r, _, _ = select.select([sys.stdin.fileno()], [], [])
                if r:
                try:
                    while True:
                        req = request_queue.get_nowait()
                        if req is None:
                            return
                        yield req
                except queue.Empty:
                    pass

                if pause_stdin[0]:
                    import time
                    time.sleep(0.05)
                    continue

                r, _, _ = select.select([sys.stdin.fileno(), wake_r], [], [], 0.05)
                if wake_r in r:
                    os.read(wake_r, 1)
                    continue
                if sys.stdin.fileno() in r and not pause_stdin[0]:
                    try:
                        data = os.read(sys.stdin.fileno(), 1024)
                        if not data:
                            break
                        if b'\r' in data or b'\n' in data:
                            cmd_byte_positions.append((len(client_buffer_bytes), None))
                        yield connpy_pb2.InteractRequest(stdin_data=data)
                    except OSError:
                        break
@@ -969,30 +1171,77 @@ def set_reserved_names(self, names):

        old_tty = termios.tcgetattr(sys.stdin)
        try:
            import time
            tty.setraw(sys.stdin.fileno())
            response_iterator = self.stub.interact_node(request_generator())

            # First response is connection status
            import queue
            response_queue = queue.Queue()

            def response_consumer():
                try:
                    first_res = next(response_iterator)
                    if first_res.success:
                        for r in response_iterator:
                            response_queue.put(r)
                except Exception:
                    pass
                response_queue.put(None)

            t_consumer = threading.Thread(target=response_consumer, daemon=True)
            t_consumer.start()

            # First phase: Wait for connection status, print early data
            try:
                while True:
                    res = response_queue.get()
                    if res is None:
                        return
                    if res.stdout_data:
                        data = res.stdout_data
                        if debug:
                            data = data.replace(b'\x1b[H\x1b[2J', b'').replace(b'\x1bc', b'').replace(b'\x1b[3J', b'')
                        os.write(sys.stdout.fileno(), data)

                    if res.success:
                        # Connection established on server, show success message
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.success(conn_msg)
                        pause_stdin[0] = False
                        tty.setraw(sys.stdin.fileno())
                    else:
                        break

                    if res.error_message:
                        # Connection failed on server
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.error(f"Connection failed: {first_res.error_message}")
                        printer.error(f"Connection failed: {res.error_message}")
                        return
            except StopIteration:
            except queue.Empty:
                return

            for res in response_iterator:
            # Second phase: Stream active session
            # Clear screen filter is only applied before success (Phase 1).
            # Once the user has a prompt, Ctrl+L must work normally.
            while True:
                res = response_queue.get()
                if res is None:
                    break
                if res.copilot_prompt:
                    self._handle_remote_copilot(
                        res, request_queue, response_queue,
                        client_buffer_bytes, cmd_byte_positions,
                        pause_generator, resume_generator, old_tty
                    )
                    continue

                if res.copilot_injected_command:
                    cmd_byte_positions.append((len(client_buffer_bytes), res.copilot_injected_command))

                if res.stdout_data:
                    os.write(sys.stdout.fileno(), res.stdout_data)
                    client_buffer_bytes.extend(res.stdout_data)
        finally:
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
            os.close(wake_r)
            os.close(wake_w)
    @handle_errors
    def connect_dynamic(self, connection_params, debug=False):
@@ -1000,10 +1249,23 @@ def set_reserved_names(self, names):
        import select
        import tty
        import termios
        import queue
        import os
        import json

        params_json = json.dumps(connection_params)
        request_queue = queue.Queue()
        client_buffer_bytes = bytearray()
        cmd_byte_positions = [(0, None)]
        pause_stdin = [False]
        wake_r, wake_w = os.pipe()

        def pause_generator():
            pause_stdin[0] = True
            os.write(wake_w, b'\x00')

        def resume_generator():
            pause_stdin[0] = False

        def request_generator():
            cols, rows = 80, 24
@@ -1019,12 +1281,31 @@ def set_reserved_names(self, names):
            )

            while True:
                r, _, _ = select.select([sys.stdin.fileno()], [], [])
                if r:
                try:
                    while True:
                        req = request_queue.get_nowait()
                        if req is None:
                            return
                        yield req
                except queue.Empty:
                    pass

                if pause_stdin[0]:
                    import time
                    time.sleep(0.05)
                    continue

                r, _, _ = select.select([sys.stdin.fileno(), wake_r], [], [], 0.05)
                if wake_r in r:
                    os.read(wake_r, 1)
                    continue
                if sys.stdin.fileno() in r and not pause_stdin[0]:
                    try:
                        data = os.read(sys.stdin.fileno(), 1024)
                        if not data:
                            break
                        if b'\r' in data or b'\n' in data:
                            cmd_byte_positions.append((len(client_buffer_bytes), None))
                        yield connpy_pb2.InteractRequest(stdin_data=data)
                    except OSError:
                        break
@@ -1043,30 +1324,75 @@ def set_reserved_names(self, names):

        old_tty = termios.tcgetattr(sys.stdin)
        try:
            import time
            tty.setraw(sys.stdin.fileno())
            response_iterator = self.stub.interact_node(request_generator())

            # First response is connection status
            import queue
            response_queue = queue.Queue()

            def response_consumer():
                try:
                    first_res = next(response_iterator)
                    if first_res.success:
                        for r in response_iterator:
                            response_queue.put(r)
                except Exception:
                    pass
                response_queue.put(None)

            t_consumer = threading.Thread(target=response_consumer, daemon=True)
            t_consumer.start()

            # First phase: Wait for connection status, print early data
            try:
                while True:
                    res = response_queue.get()
                    if res is None:
                        return
                    if res.stdout_data:
                        data = res.stdout_data
                        if debug:
                            data = data.replace(b'\x1b[H\x1b[2J', b'').replace(b'\x1bc', b'').replace(b'\x1b[3J', b'')
                        os.write(sys.stdout.fileno(), data)

                    if res.success:
                        # Connection established on server, show success message
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.success(conn_msg)
                        pause_stdin[0] = False
                        tty.setraw(sys.stdin.fileno())
                    else:
                        break

                    if res.error_message:
                        # Connection failed on server
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.error(f"Connection failed: {first_res.error_message}")
                        printer.error(f"Connection failed: {res.error_message}")
                        return
            except StopIteration:
            except queue.Empty:
                return

            for res in response_iterator:
            # Second phase: Stream active session
            while True:
                res = response_queue.get()
                if res is None:
                    break
                if res.copilot_prompt:
                    self._handle_remote_copilot(
                        res, request_queue, response_queue,
                        client_buffer_bytes, cmd_byte_positions,
                        pause_generator, resume_generator, old_tty
                    )
                    continue

                if res.copilot_injected_command:
                    cmd_byte_positions.append((len(client_buffer_bytes), res.copilot_injected_command))

                if res.stdout_data:
                    os.write(sys.stdout.fileno(), res.stdout_data)
                    client_buffer_bytes.extend(res.stdout_data)
        finally:
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
            os.close(wake_r)
            os.close(wake_w)

    @MethodHook
    @handle_errors
@@ -1220,10 +1546,23 @@ def connect_dynamic(self, connection_params, debug=False):
        import select
        import tty
        import termios
        import queue
        import os
        import json

        params_json = json.dumps(connection_params)
        request_queue = queue.Queue()
        client_buffer_bytes = bytearray()
        cmd_byte_positions = [(0, None)]
        pause_stdin = [False]
        wake_r, wake_w = os.pipe()

        def pause_generator():
            pause_stdin[0] = True
            os.write(wake_w, b'\x00')

        def resume_generator():
            pause_stdin[0] = False

        def request_generator():
            cols, rows = 80, 24
@@ -1239,12 +1578,31 @@ def connect_dynamic(self, connection_params, debug=False):
            )

            while True:
                r, _, _ = select.select([sys.stdin.fileno()], [], [])
                if r:
                try:
                    while True:
                        req = request_queue.get_nowait()
                        if req is None:
                            return
                        yield req
                except queue.Empty:
                    pass

                if pause_stdin[0]:
                    import time
                    time.sleep(0.05)
                    continue

                r, _, _ = select.select([sys.stdin.fileno(), wake_r], [], [], 0.05)
                if wake_r in r:
                    os.read(wake_r, 1)
                    continue
                if sys.stdin.fileno() in r and not pause_stdin[0]:
                    try:
                        data = os.read(sys.stdin.fileno(), 1024)
                        if not data:
                            break
                        if b'\r' in data or b'\n' in data:
                            cmd_byte_positions.append((len(client_buffer_bytes), None))
                        yield connpy_pb2.InteractRequest(stdin_data=data)
                    except OSError:
                        break
@@ -1263,30 +1621,75 @@ def connect_dynamic(self, connection_params, debug=False):

        old_tty = termios.tcgetattr(sys.stdin)
        try:
            import time
            tty.setraw(sys.stdin.fileno())
            response_iterator = self.stub.interact_node(request_generator())

            # First response is connection status
            import queue
            response_queue = queue.Queue()

            def response_consumer():
                try:
                    first_res = next(response_iterator)
                    if first_res.success:
                        for r in response_iterator:
                            response_queue.put(r)
                except Exception:
                    pass
                response_queue.put(None)

            t_consumer = threading.Thread(target=response_consumer, daemon=True)
            t_consumer.start()

            # First phase: Wait for connection status, print early data
            try:
                while True:
                    res = response_queue.get()
                    if res is None:
                        return
                    if res.stdout_data:
                        data = res.stdout_data
                        if debug:
                            data = data.replace(b'\x1b[H\x1b[2J', b'').replace(b'\x1bc', b'').replace(b'\x1b[3J', b'')
                        os.write(sys.stdout.fileno(), data)

                    if res.success:
                        # Connection established on server, show success message
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.success(conn_msg)
                        pause_stdin[0] = False
                        tty.setraw(sys.stdin.fileno())
                    else:
                        break

                    if res.error_message:
                        # Connection failed on server
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.error(f"Connection failed: {first_res.error_message}")
                        printer.error(f"Connection failed: {res.error_message}")
                        return
            except StopIteration:
            except queue.Empty:
                return

            for res in response_iterator:
            # Second phase: Stream active session
            while True:
                res = response_queue.get()
                if res is None:
                    break
                if res.copilot_prompt:
                    self._handle_remote_copilot(
                        res, request_queue, response_queue,
                        client_buffer_bytes, cmd_byte_positions,
                        pause_generator, resume_generator, old_tty
                    )
                    continue

                if res.copilot_injected_command:
                    cmd_byte_positions.append((len(client_buffer_bytes), res.copilot_injected_command))

                if res.stdout_data:
                    os.write(sys.stdout.fileno(), res.stdout_data)
                    client_buffer_bytes.extend(res.stdout_data)
        finally:
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)</code></pre>
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
            os.close(wake_r)
            os.close(wake_w)</code></pre>
</details>
<div class="desc"></div>
</dd>
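The `wake_r`/`wake_w` pipe threaded through the listings above is the classic self-pipe trick: `pause_generator` writes a single byte so a thread blocked in `select` wakes immediately instead of waiting out its timeout. A standalone sketch of the pattern (a second pipe pair stands in for stdin; the function name is illustrative):

```python
import os
import select

def wait_for_input(data_fd, wake_fd, timeout=0.05):
    # Block until real input arrives, a wake byte arrives, or we time out.
    r, _, _ = select.select([data_fd, wake_fd], [], [], timeout)
    if wake_fd in r:
        os.read(wake_fd, 1)  # drain the wake byte so it fires only once
        return "woken"
    if data_fd in r:
        return "data"
    return "timeout"

# Demo: a pipe pair stands in for stdin.
data_r, data_w = os.pipe()
wake_r, wake_w = os.pipe()
os.write(wake_w, b"\x00")              # pause_generator() equivalent
print(wait_for_input(data_r, wake_r))  # -> woken
os.write(data_w, b"x")
print(wait_for_input(data_r, wake_r))  # -> data
```

Without the wake pipe, a `select` on stdin alone would sit for the full timeout before noticing the pause flag; the wake byte makes the handoff to the copilot handler immediate.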
@@ -1304,9 +1707,23 @@ def connect_node(self, unique_id, sftp=False, debug=False, logger=None):
        import select
        import tty
        import termios
        import queue
        import os
        import threading

        request_queue = queue.Queue()
        client_buffer_bytes = bytearray()
        cmd_byte_positions = [(0, None)]
        pause_stdin = [False]
        wake_r, wake_w = os.pipe()

        def pause_generator():
            pause_stdin[0] = True
            os.write(wake_w, b'\x00')

        def resume_generator():
            pause_stdin[0] = False

        def request_generator():
            cols, rows = 80, 24
            try:
@@ -1320,12 +1737,31 @@ def connect_node(self, unique_id, sftp=False, debug=False, logger=None):
            )

            while True:
                r, _, _ = select.select([sys.stdin.fileno()], [], [])
                if r:
                try:
                    while True:
                        req = request_queue.get_nowait()
                        if req is None:
                            return
                        yield req
                except queue.Empty:
                    pass

                if pause_stdin[0]:
                    import time
                    time.sleep(0.05)
                    continue

                r, _, _ = select.select([sys.stdin.fileno(), wake_r], [], [], 0.05)
                if wake_r in r:
                    os.read(wake_r, 1)
                    continue
                if sys.stdin.fileno() in r and not pause_stdin[0]:
                    try:
                        data = os.read(sys.stdin.fileno(), 1024)
                        if not data:
                            break
                        if b'\r' in data or b'\n' in data:
                            cmd_byte_positions.append((len(client_buffer_bytes), None))
                        yield connpy_pb2.InteractRequest(stdin_data=data)
                    except OSError:
                        break
@@ -1343,30 +1779,77 @@ def connect_node(self, unique_id, sftp=False, debug=False, logger=None):

        old_tty = termios.tcgetattr(sys.stdin)
        try:
            import time
            tty.setraw(sys.stdin.fileno())
            response_iterator = self.stub.interact_node(request_generator())

            # First response is connection status
            import queue
            response_queue = queue.Queue()

            def response_consumer():
                try:
                    first_res = next(response_iterator)
                    if first_res.success:
                        for r in response_iterator:
                            response_queue.put(r)
                except Exception:
                    pass
                response_queue.put(None)

            t_consumer = threading.Thread(target=response_consumer, daemon=True)
            t_consumer.start()

            # First phase: Wait for connection status, print early data
            try:
                while True:
                    res = response_queue.get()
                    if res is None:
                        return
                    if res.stdout_data:
                        data = res.stdout_data
                        if debug:
                            data = data.replace(b'\x1b[H\x1b[2J', b'').replace(b'\x1bc', b'').replace(b'\x1b[3J', b'')
                        os.write(sys.stdout.fileno(), data)

                    if res.success:
                        # Connection established on server, show success message
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.success(conn_msg)
                        pause_stdin[0] = False
                        tty.setraw(sys.stdin.fileno())
                    else:
                        break

                    if res.error_message:
                        # Connection failed on server
                        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
                        printer.error(f"Connection failed: {first_res.error_message}")
                        printer.error(f"Connection failed: {res.error_message}")
                        return
            except StopIteration:
            except queue.Empty:
                return

            for res in response_iterator:
            # Second phase: Stream active session
            # Clear screen filter is only applied before success (Phase 1).
            # Once the user has a prompt, Ctrl+L must work normally.
            while True:
                res = response_queue.get()
                if res is None:
                    break
                if res.copilot_prompt:
                    self._handle_remote_copilot(
                        res, request_queue, response_queue,
                        client_buffer_bytes, cmd_byte_positions,
                        pause_generator, resume_generator, old_tty
                    )
                    continue

                if res.copilot_injected_command:
                    cmd_byte_positions.append((len(client_buffer_bytes), res.copilot_injected_command))

                if res.stdout_data:
                    os.write(sys.stdout.fileno(), res.stdout_data)
                    client_buffer_bytes.extend(res.stdout_data)
        finally:
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)</code></pre>
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)
            os.close(wake_r)
            os.close(wake_w)</code></pre>
</details>
<div class="desc"></div>
</dd>
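The `response_consumer` pattern repeated in the listings above drains a blocking gRPC response iterator on a daemon thread into a `queue.Queue`, using `None` as the end-of-stream sentinel so the main loop can multiplex it with other work. A minimal sketch with a plain iterator standing in for the gRPC stream (the helper name is illustrative):

```python
import queue
import threading

def start_consumer(response_iterator):
    # Forward every response into a queue; None signals end of stream.
    q = queue.Queue()

    def consumer():
        try:
            for r in response_iterator:
                q.put(r)
        except Exception:
            pass  # any stream error simply ends the stream
        q.put(None)

    threading.Thread(target=consumer, daemon=True).start()
    return q

# Usage: drain until the sentinel.
q = start_consumer(iter(["a", "b"]))
items = []
while True:
    item = q.get()
    if item is None:
        break
    items.append(item)
```

Moving the blocking iteration onto its own thread is what lets the client loop also service copilot prompts and pause/resume signals without stalling the stream.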
@@ -2036,6 +2519,7 @@ def stop_api(self):
<h4><code><a title="connpy.grpc_layer.stubs.AIStub" href="#connpy.grpc_layer.stubs.AIStub">AIStub</a></code></h4>
<ul class="two-column">
<li><code><a title="connpy.grpc_layer.stubs.AIStub.ask" href="#connpy.grpc_layer.stubs.AIStub.ask">ask</a></code></li>
<li><code><a title="connpy.grpc_layer.stubs.AIStub.configure_mcp" href="#connpy.grpc_layer.stubs.AIStub.configure_mcp">configure_mcp</a></code></li>
<li><code><a title="connpy.grpc_layer.stubs.AIStub.configure_provider" href="#connpy.grpc_layer.stubs.AIStub.configure_provider">configure_provider</a></code></li>
<li><code><a title="connpy.grpc_layer.stubs.AIStub.confirm" href="#connpy.grpc_layer.stubs.AIStub.confirm">confirm</a></code></li>
<li><code><a title="connpy.grpc_layer.stubs.AIStub.delete_session" href="#connpy.grpc_layer.stubs.AIStub.delete_session">delete_session</a></code></li>
@@ -2130,7 +2614,7 @@ def stop_api(self):
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc_layer.utils API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -138,7 +138,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
+989 -585 File diff suppressed because it is too large
@@ -0,0 +1,349 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.mcp_client API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.mcp_client</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.mcp_client.MCPClientManager"><code class="flex name class">
<span>class <span class="ident">MCPClientManager</span></span>
<span>(</span><span>config=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class MCPClientManager:
    """Manages MCP SSE client connections for connpy."""

    _instance = None
    _lock = threading.Lock()

    def __new__(cls, *args, **kwargs):
        with cls._lock:
            if cls._instance is None:
                cls._instance = super(MCPClientManager, cls).__new__(cls)
                cls._instance._initialized = False
            return cls._instance

    def __init__(self, config=None):
        if self._initialized:
            return
        self.config = config
        self.sessions: Dict[str, Dict[str, Any]] = {}  # name -> {session, stack}
        self.tool_cache: Dict[str, List[Dict[str, Any]]] = {}
        self._connecting: Dict[str, asyncio.Future] = {}
        self._initialized = True

    async def get_tools_for_llm(self, os_filter: Optional[str] = None) -> List[Dict[str, Any]]:
        """
        Fetches tools from enabled MCP servers that match the OS filter.
        """
        if not MCP_AVAILABLE:
            return []

        all_llm_tools = []
        try:
            mcp_config = self.config.config.get("ai", {}).get("mcp_servers", {})
        except Exception:
            return []

        async def _fetch(name, cfg):
            if not cfg.get("enabled", True): return []

            # Filter by OS if specified in config (primarily used for copilot strict matching)
            auto_os = cfg.get("auto_load_on_os")
            if os_filter is not None and auto_os and os_filter.lower() != auto_os.lower():
                return []

            try:
                session = await self._ensure_connected(name, cfg)
                if session:
                    if name in self.tool_cache: return self.tool_cache[name]
                    llm_tools = await self._fetch_tools_as_openai(name, session)
                    self.tool_cache[name] = llm_tools
                    return llm_tools
            except Exception:
                pass
            return []

        tasks = [ _fetch(name, cfg) for name, cfg in mcp_config.items() ]

        if tasks:
            results = await asyncio.gather(*tasks)
            for tools in results:
                all_llm_tools.extend(tools)

        return all_llm_tools

    async def _ensure_connected(self, name: str, cfg: Dict[str, Any]) -> Optional[Any]:
        if not MCP_AVAILABLE: return None

        if name in self.sessions and self.sessions[name].get("session"):
            return self.sessions[name]["session"]

        url = cfg.get("url")
        if not url:
            return None

        if name in self._connecting:
            try:
                return await asyncio.wait_for(asyncio.shield(self._connecting[name]), timeout=10.0)
            except Exception:
                return None

        loop = asyncio.get_running_loop()
        fut = loop.create_future()
        self._connecting[name] = fut

        try:
            from contextlib import AsyncExitStack
            stack = AsyncExitStack()

            async def _do_connect():
                read, write = await stack.enter_async_context(sse_client(url))
                session = await stack.enter_async_context(ClientSession(read, write))
                await session.initialize()
                return session

            session = await asyncio.wait_for(_do_connect(), timeout=15.0)
            self.sessions[name] = {"session": session, "stack": stack}
            fut.set_result(session)
            return session
        except Exception:
            fut.set_result(None)
            return None
        finally:
            if name in self._connecting:
                del self._connecting[name]

    async def _fetch_tools_as_openai(self, server_name: str, session: Any) -> List[Dict[str, Any]]:
        try:
            result = await asyncio.wait_for(session.list_tools(), timeout=5.0)
            openai_tools = []
            for tool in result.tools:
                # Use mcp_ prefix to ensure valid function name for LiteLLM/Gemini
                prefixed_name = f"mcp_{server_name}__{tool.name}"
                openai_tools.append({
                    "type": "function",
                    "function": {
                        "name": prefixed_name,
                        "description": f"[{server_name}] {tool.description}",
                        "parameters": tool.inputSchema
                    }
                })
            return openai_tools
        except Exception:
            return []

    async def call_tool(self, full_tool_name: str, arguments: Dict[str, Any]) -> Any:
        """Calls an MCP tool and returns text result."""
        if not MCP_AVAILABLE:
            return "Error: MCP SDK is not installed."

        if "__" not in full_tool_name:
            return f"Error: Tool {full_tool_name} is not a valid MCP tool."

        clean_name = full_tool_name[4:] if full_tool_name.startswith("mcp_") else full_tool_name
        server_name, tool_name = clean_name.split("__", 1)

        if server_name not in self.sessions:
            return f"Error: MCP server {server_name} is not connected."

        session = self.sessions[server_name]["session"]
        try:
            result = await asyncio.wait_for(session.call_tool(tool_name, arguments), timeout=60.0)
            text_outputs = [content.text for content in result.content if hasattr(content, "text")]
            return "\n".join(text_outputs) if text_outputs else str(result)
        except Exception as e:
            return f"Error calling tool {tool_name} on {server_name}: {str(e)}"

    async def shutdown(self):
        """Close all SSE connections."""
        for name, data in self.sessions.items():
            stack = data.get("stack")
            if stack:
                await stack.aclose()
        self.sessions = {}</code></pre>
</details>
<div class="desc"><p>Manages MCP SSE client connections for connpy.</p></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.mcp_client.MCPClientManager.call_tool"><code class="name flex">
<span>async def <span class="ident">call_tool</span></span>(<span>self, full_tool_name: str, arguments: Dict[str, Any]) ‑> Any</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">async def call_tool(self, full_tool_name: str, arguments: Dict[str, Any]) -> Any:
    """Calls an MCP tool and returns text result."""
    if not MCP_AVAILABLE:
        return "Error: MCP SDK is not installed."

    if "__" not in full_tool_name:
        return f"Error: Tool {full_tool_name} is not a valid MCP tool."

    clean_name = full_tool_name[4:] if full_tool_name.startswith("mcp_") else full_tool_name
    server_name, tool_name = clean_name.split("__", 1)

    if server_name not in self.sessions:
        return f"Error: MCP server {server_name} is not connected."

    session = self.sessions[server_name]["session"]
    try:
        result = await asyncio.wait_for(session.call_tool(tool_name, arguments), timeout=60.0)
        text_outputs = [content.text for content in result.content if hasattr(content, "text")]
        return "\n".join(text_outputs) if text_outputs else str(result)
    except Exception as e:
        return f"Error calling tool {tool_name} on {server_name}: {str(e)}"</code></pre>
</details>
<div class="desc"><p>Calls an MCP tool and returns text result.</p></div>
</dd>
<dt id="connpy.mcp_client.MCPClientManager.get_tools_for_llm"><code class="name flex">
<span>async def <span class="ident">get_tools_for_llm</span></span>(<span>self, os_filter: str | None = None) ‑> List[Dict[str, Any]]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">async def get_tools_for_llm(self, os_filter: Optional[str] = None) -> List[Dict[str, Any]]:
    """
    Fetches tools from enabled MCP servers that match the OS filter.
    """
    if not MCP_AVAILABLE:
        return []

    all_llm_tools = []
    try:
        mcp_config = self.config.config.get("ai", {}).get("mcp_servers", {})
    except Exception:
        return []

    async def _fetch(name, cfg):
        if not cfg.get("enabled", True): return []

        # Filter by OS if specified in config (primarily used for copilot strict matching)
        auto_os = cfg.get("auto_load_on_os")
        if os_filter is not None and auto_os and os_filter.lower() != auto_os.lower():
            return []

        try:
            session = await self._ensure_connected(name, cfg)
            if session:
                if name in self.tool_cache: return self.tool_cache[name]
                llm_tools = await self._fetch_tools_as_openai(name, session)
                self.tool_cache[name] = llm_tools
                return llm_tools
        except Exception:
            pass
        return []

    tasks = [ _fetch(name, cfg) for name, cfg in mcp_config.items() ]

    if tasks:
        results = await asyncio.gather(*tasks)
        for tools in results:
            all_llm_tools.extend(tools)

    return all_llm_tools</code></pre>
</details>
<div class="desc"><p>Fetches tools from enabled MCP servers that match the OS filter.</p></div>
</dd>
<dt id="connpy.mcp_client.MCPClientManager.shutdown"><code class="name flex">
<span>async def <span class="ident">shutdown</span></span>(<span>self)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">async def shutdown(self):
    """Close all SSE connections."""
    for name, data in self.sessions.items():
        stack = data.get("stack")
        if stack:
            await stack.aclose()
    self.sessions = {}</code></pre>
</details>
<div class="desc"><p>Close all SSE connections.</p></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy" href="index.html">connpy</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.mcp_client.MCPClientManager" href="#connpy.mcp_client.MCPClientManager">MCPClientManager</a></code></h4>
<ul class="">
<li><code><a title="connpy.mcp_client.MCPClientManager.call_tool" href="#connpy.mcp_client.MCPClientManager.call_tool">call_tool</a></code></li>
<li><code><a title="connpy.mcp_client.MCPClientManager.get_tools_for_llm" href="#connpy.mcp_client.MCPClientManager.get_tools_for_llm">get_tools_for_llm</a></code></li>
<li><code><a title="connpy.mcp_client.MCPClientManager.shutdown" href="#connpy.mcp_client.MCPClientManager.shutdown">shutdown</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.proto API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -60,7 +60,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.ai_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -58,6 +58,104 @@ el.replaceWith(d);
<pre><code class="python">class AIService(BaseService):
    """Business logic for interacting with AI agents and LLM configurations."""

    def build_context_blocks(self, raw_bytes: bytes, cmd_byte_positions: list, node_info: dict) -> list:
        """Identifies command blocks in the terminal history."""
        blocks = []
        if not (cmd_byte_positions and len(cmd_byte_positions) >= 2 and raw_bytes):
            return blocks

        default_prompt = r'>$|#$|\$$|>.$|#.$|\$.$'
        device_prompt = node_info.get("prompt", default_prompt) if isinstance(node_info, dict) else default_prompt
        prompt_re_str = re.sub(r'(?<!\\)\$', '', device_prompt)
        try:
            prompt_re = re.compile(prompt_re_str)
        except Exception:
            prompt_re = re.compile(re.sub(r'(?<!\\)\$', '', default_prompt))

        for i in range(1, len(cmd_byte_positions)):
            pos, known_cmd = cmd_byte_positions[i]
            prev_pos = cmd_byte_positions[i-1][0]

            if known_cmd:
                prev_chunk = raw_bytes[prev_pos:pos]
                prev_cleaned = log_cleaner(prev_chunk.decode(errors='replace'))
                prev_lines = [l for l in prev_cleaned.split('\n') if l.strip()]
                prompt_text = prev_lines[-1].strip() if prev_lines else ""
                preview = f"{prompt_text}{known_cmd}" if prompt_text else known_cmd
                blocks.append((pos, preview[:80]))
            else:
                chunk = raw_bytes[prev_pos:pos]
                cleaned = log_cleaner(chunk.decode(errors='replace'))
                lines = [l for l in cleaned.split('\n') if l.strip()]
                preview = lines[-1].strip() if lines else ""

                if preview:
                    match = prompt_re.search(preview)
                    if match:
                        cmd_text = preview[match.end():].strip()
                        if cmd_text:
                            blocks.append((pos, preview[:80]))
        return blocks

    def process_copilot_input(self, input_text: str, session_state: dict) -> dict:
        """Parses slash commands and manages session state. Returns directive dict."""
        text = input_text.strip()
        if not text.startswith('/'):
            return {"action": "execute", "clean_prompt": text, "overrides": {}}

        parts = text.split(maxsplit=1)
        cmd = parts[0].lower()
        args = parts[1] if len(parts) > 1 else ""

        # 1. State Commands (Persistent)
        if cmd == "/os":
            if args:
                session_state['os'] = args
            return {"action": "state_update", "message": f"OS context changed to {args}"}
        elif cmd == "/prompt":
            if args:
                session_state['prompt'] = args
            return {"action": "state_update", "message": f"Prompt regex changed to {args}"}
        elif cmd == "/memorize":
            if args:
                session_state['memories'].append(args)
            return {"action": "state_update", "message": f"Memory added: {args}"}
        elif cmd == "/clear":
            session_state['memories'] = []
            return {"action": "state_update", "message": "Memory cleared"}

        # 2. Hybrid Commands
        elif cmd == "/architect":
            if not args:
                session_state['persona'] = 'architect'
                return {"action": "state_update", "message": "Persona set to Architect"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "architect"}}

        elif cmd == "/engineer":
            if not args:
                session_state['persona'] = 'engineer'
                return {"action": "state_update", "message": "Persona set to Engineer"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "engineer"}}

        elif cmd == "/trust":
            if not args:
                session_state['trust_mode'] = True
                return {"action": "state_update", "message": "Auto-execute (trust) enabled for session"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"trust": True}}

        elif cmd == "/untrust":
            if not args:
                session_state['trust_mode'] = False
                return {"action": "state_update", "message": "Auto-execute (trust) disabled for session"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"trust": False}}

        # Unknown command, execute normally
        return {"action": "execute", "clean_prompt": text, "overrides": {}}

    def ask(self, input_text, dryrun=False, chat_history=None, status=None, debug=False, session_id=None, console=None, chunk_callback=None, confirm_handler=None, trust=False, **overrides):
        """Send a prompt to the AI agent."""
        from connpy.ai import ai
@@ -71,6 +169,21 @@ el.replaceWith(d);
        agent = ai(self.config, console=console)
        return agent.confirm(input_text)

    def ask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
        """Ask the AI copilot for terminal assistance."""
        from connpy.ai import ai, run_ai_async
        agent = ai(self.config)
        future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
        return future.result()

    async def aask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
        """Ask the AI copilot for terminal assistance asynchronously."""
        from connpy.ai import ai, run_ai_async
        import asyncio
        agent = ai(self.config)
        future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
        return await asyncio.wrap_future(future)


    def list_sessions(self):
        """Return a list of all saved AI sessions."""
@@ -99,6 +212,40 @@ el.replaceWith(d);
        self.config.config["ai"] = settings
        self.config._saveconfig(self.config.file)

    def configure_mcp(self, name, url=None, enabled=None, auto_load_on_os=None, remove=False):
        """Update MCP server settings in the configuration with smart merging."""
        ai_settings = self.config.config.get("ai", {})
        mcp_servers = ai_settings.get("mcp_servers", {})

        if remove:
            if name in mcp_servers:
                del mcp_servers[name]
        else:
            # Get existing or new
            server_cfg = mcp_servers.get(name, {})

            # Partial updates
            if url is not None:
                server_cfg["url"] = url

            if enabled is not None:
                server_cfg["enabled"] = bool(enabled)
            elif "enabled" not in server_cfg:
                server_cfg["enabled"] = True  # Default for new entries

            if auto_load_on_os is not None:
                if auto_load_on_os == "":  # Explicit clear
                    if "auto_load_on_os" in server_cfg:
                        del server_cfg["auto_load_on_os"]
                else:
                    server_cfg["auto_load_on_os"] = auto_load_on_os

            mcp_servers[name] = server_cfg

        ai_settings["mcp_servers"] = mcp_servers
        self.config.config["ai"] = ai_settings
        self.config._saveconfig(self.config.file)

    def load_session_data(self, session_id):
        """Load a session's raw data by ID."""
        from connpy.ai import ai
@@ -118,6 +265,24 @@ el.replaceWith(d);
</ul>
<h3>Methods</h3>
<dl>
<dt id="connpy.services.ai_service.AIService.aask_copilot"><code class="name flex">
<span>async def <span class="ident">aask_copilot</span></span>(<span>self, terminal_buffer, user_question, node_info=None, chunk_callback=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">async def aask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
    """Ask the AI copilot for terminal assistance asynchronously."""
    from connpy.ai import ai, run_ai_async
    import asyncio
    agent = ai(self.config)
    future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
    return await asyncio.wrap_future(future)</code></pre>
</details>
<div class="desc"><p>Ask the AI copilot for terminal assistance asynchronously.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.ask"><code class="name flex">
<span>def <span class="ident">ask</span></span>(<span>self,<br>input_text,<br>dryrun=False,<br>chat_history=None,<br>status=None,<br>debug=False,<br>session_id=None,<br>console=None,<br>chunk_callback=None,<br>confirm_handler=None,<br>trust=False,<br>**overrides)</span>
</code></dt>
@@ -134,6 +299,116 @@ el.replaceWith(d);
</details>
<div class="desc"><p>Send a prompt to the AI agent.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.ask_copilot"><code class="name flex">
<span>def <span class="ident">ask_copilot</span></span>(<span>self, terminal_buffer, user_question, node_info=None, chunk_callback=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def ask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
    """Ask the AI copilot for terminal assistance."""
    from connpy.ai import ai, run_ai_async
    agent = ai(self.config)
    future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
    return future.result()</code></pre>
</details>
<div class="desc"><p>Ask the AI copilot for terminal assistance.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.build_context_blocks"><code class="name flex">
<span>def <span class="ident">build_context_blocks</span></span>(<span>self, raw_bytes: bytes, cmd_byte_positions: list, node_info: dict) ‑> list</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def build_context_blocks(self, raw_bytes: bytes, cmd_byte_positions: list, node_info: dict) -> list:
    """Identifies command blocks in the terminal history."""
    blocks = []
    if not (cmd_byte_positions and len(cmd_byte_positions) >= 2 and raw_bytes):
        return blocks

    default_prompt = r'>$|#$|\$$|>.$|#.$|\$.$'
    device_prompt = node_info.get("prompt", default_prompt) if isinstance(node_info, dict) else default_prompt
    prompt_re_str = re.sub(r'(?<!\\)\$', '', device_prompt)
    try:
        prompt_re = re.compile(prompt_re_str)
    except Exception:
        prompt_re = re.compile(re.sub(r'(?<!\\)\$', '', default_prompt))

    for i in range(1, len(cmd_byte_positions)):
        pos, known_cmd = cmd_byte_positions[i]
        prev_pos = cmd_byte_positions[i-1][0]

        if known_cmd:
            prev_chunk = raw_bytes[prev_pos:pos]
            prev_cleaned = log_cleaner(prev_chunk.decode(errors='replace'))
            prev_lines = [l for l in prev_cleaned.split('\n') if l.strip()]
            prompt_text = prev_lines[-1].strip() if prev_lines else ""
            preview = f"{prompt_text}{known_cmd}" if prompt_text else known_cmd
            blocks.append((pos, preview[:80]))
        else:
            chunk = raw_bytes[prev_pos:pos]
            cleaned = log_cleaner(chunk.decode(errors='replace'))
            lines = [l for l in cleaned.split('\n') if l.strip()]
            preview = lines[-1].strip() if lines else ""

            if preview:
                match = prompt_re.search(preview)
                if match:
                    cmd_text = preview[match.end():].strip()
                    if cmd_text:
                        blocks.append((pos, preview[:80]))
    return blocks</code></pre>
</details>
<div class="desc"><p>Identifies command blocks in the terminal history.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.configure_mcp"><code class="name flex">
<span>def <span class="ident">configure_mcp</span></span>(<span>self, name, url=None, enabled=None, auto_load_on_os=None, remove=False)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def configure_mcp(self, name, url=None, enabled=None, auto_load_on_os=None, remove=False):
    """Update MCP server settings in the configuration with smart merging."""
    ai_settings = self.config.config.get("ai", {})
    mcp_servers = ai_settings.get("mcp_servers", {})

    if remove:
        if name in mcp_servers:
            del mcp_servers[name]
    else:
        # Get existing or new
        server_cfg = mcp_servers.get(name, {})

        # Partial updates
        if url is not None:
            server_cfg["url"] = url

        if enabled is not None:
            server_cfg["enabled"] = bool(enabled)
        elif "enabled" not in server_cfg:
            server_cfg["enabled"] = True  # Default for new entries

        if auto_load_on_os is not None:
            if auto_load_on_os == "":  # Explicit clear
                if "auto_load_on_os" in server_cfg:
                    del server_cfg["auto_load_on_os"]
            else:
                server_cfg["auto_load_on_os"] = auto_load_on_os

        mcp_servers[name] = server_cfg

    ai_settings["mcp_servers"] = mcp_servers
    self.config.config["ai"] = ai_settings
    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"><p>Update MCP server settings in the configuration with smart merging.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.configure_provider"><code class="name flex">
<span>def <span class="ident">configure_provider</span></span>(<span>self, provider, model=None, api_key=None)</span>
</code></dt>
@@ -223,6 +498,75 @@ el.replaceWith(d);
|
||||
</details>
|
||||
<div class="desc"><p>Load a session's raw data by ID.</p></div>
|
||||
</dd>
|
||||
<dt id="connpy.services.ai_service.AIService.process_copilot_input"><code class="name flex">
|
||||
<span>def <span class="ident">process_copilot_input</span></span>(<span>self, input_text: str, session_state: dict) ‑> dict</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def process_copilot_input(self, input_text: str, session_state: dict) -> dict:
    """Parses slash commands and manages session state. Returns directive dict."""
    text = input_text.strip()
    if not text.startswith('/'):
        return {"action": "execute", "clean_prompt": text, "overrides": {}}

    parts = text.split(maxsplit=1)
    cmd = parts[0].lower()
    args = parts[1] if len(parts) > 1 else ""

    # 1. State Commands (Persistent)
    if cmd == "/os":
        if args:
            session_state['os'] = args
            return {"action": "state_update", "message": f"OS context changed to {args}"}
    elif cmd == "/prompt":
        if args:
            session_state['prompt'] = args
            return {"action": "state_update", "message": f"Prompt regex changed to {args}"}
    elif cmd == "/memorize":
        if args:
            session_state['memories'].append(args)
            return {"action": "state_update", "message": f"Memory added: {args}"}
    elif cmd == "/clear":
        session_state['memories'] = []
        return {"action": "state_update", "message": "Memory cleared"}

    # 2. Hybrid Commands
    elif cmd == "/architect":
        if not args:
            session_state['persona'] = 'architect'
            return {"action": "state_update", "message": "Persona set to Architect"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "architect"}}

    elif cmd == "/engineer":
        if not args:
            session_state['persona'] = 'engineer'
            return {"action": "state_update", "message": "Persona set to Engineer"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "engineer"}}

    elif cmd == "/trust":
        if not args:
            session_state['trust_mode'] = True
            return {"action": "state_update", "message": "Auto-execute (trust) enabled for session"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"trust": True}}

    elif cmd == "/untrust":
        if not args:
            session_state['trust_mode'] = False
            return {"action": "state_update", "message": "Auto-execute (trust) disabled for session"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"trust": False}}

    # Unknown command, execute normally
    return {"action": "execute", "clean_prompt": text, "overrides": {}}</code></pre>
</details>
<div class="desc"><p>Parses slash commands and manages session state. Returns directive dict.</p></div>
</dd>
</dl>
<h3>Inherited members</h3>
<ul class="hlist">
@@ -250,13 +594,18 @@ el.replaceWith(d);
<ul>
<li>
<h4><code><a title="connpy.services.ai_service.AIService" href="#connpy.services.ai_service.AIService">AIService</a></code></h4>
<ul class="two-column">
<ul class="">
<li><code><a title="connpy.services.ai_service.AIService.aask_copilot" href="#connpy.services.ai_service.AIService.aask_copilot">aask_copilot</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.ask" href="#connpy.services.ai_service.AIService.ask">ask</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.ask_copilot" href="#connpy.services.ai_service.AIService.ask_copilot">ask_copilot</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.build_context_blocks" href="#connpy.services.ai_service.AIService.build_context_blocks">build_context_blocks</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.configure_mcp" href="#connpy.services.ai_service.AIService.configure_mcp">configure_mcp</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.configure_provider" href="#connpy.services.ai_service.AIService.configure_provider">configure_provider</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.confirm" href="#connpy.services.ai_service.AIService.confirm">confirm</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.delete_session" href="#connpy.services.ai_service.AIService.delete_session">delete_session</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.list_sessions" href="#connpy.services.ai_service.AIService.list_sessions">list_sessions</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.load_session_data" href="#connpy.services.ai_service.AIService.load_session_data">load_session_data</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.process_copilot_input" href="#connpy.services.ai_service.AIService.process_copilot_input">process_copilot_input</a></code></li>
</ul>
</li>
</ul>
@@ -265,7 +614,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.base API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -152,7 +152,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.config_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -117,6 +117,10 @@ el.replaceWith(d);
if not isinstance(user_styles, dict):
    raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

# Support both direct styles and nested under 'theme' key
if "theme" in user_styles and isinstance(user_styles["theme"], dict):
    user_styles = user_styles["theme"]

# Filter for valid styles only (prevent junk in config)
valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

@@ -174,6 +178,10 @@ el.replaceWith(d);
if not isinstance(user_styles, dict):
    raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

# Support both direct styles and nested under 'theme' key
if "theme" in user_styles and isinstance(user_styles["theme"], dict):
    user_styles = user_styles["theme"]

# Filter for valid styles only (prevent junk in config)
valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

@@ -311,7 +319,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.context_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -370,7 +370,7 @@ def current_context(self) -> str:
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.exceptions API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -268,7 +268,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.execution_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -64,7 +64,7 @@ el.replaceWith(d);
        commands: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        timeout: int = 20,
        folder: Optional[str] = None,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
@@ -112,7 +112,7 @@ el.replaceWith(d);
        expected: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        timeout: int = 20,
        folder: Optional[str] = None,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
@@ -189,7 +189,7 @@ el.replaceWith(d);
        "commands": playbook["commands"],
        "variables": playbook.get("variables"),
        "parallel": options.get("parallel", parallel),
        "timeout": playbook.get("timeout", options.get("timeout", 10)),
        "timeout": playbook.get("timeout", options.get("timeout", 20)),
        "prompt": options.get("prompt"),
        "name": playbook.get("name", "Task")
    }
@@ -244,7 +244,7 @@ el.replaceWith(d);
<div class="desc"><p>Run a plain-text script containing one command per line.</p></div>
</dd>
<dt id="connpy.services.execution_service.ExecutionService.run_commands"><code class="name flex">
<span>def <span class="ident">run_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 10,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, str]</span>
<span>def <span class="ident">run_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 20,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, str]</span>
</code></dt>
<dd>
<details class="source">
@@ -257,7 +257,7 @@ el.replaceWith(d);
        commands: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        timeout: int = 20,
        folder: Optional[str] = None,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
@@ -339,7 +339,7 @@ el.replaceWith(d);
        "commands": playbook["commands"],
        "variables": playbook.get("variables"),
        "parallel": options.get("parallel", parallel),
        "timeout": playbook.get("timeout", options.get("timeout", 10)),
        "timeout": playbook.get("timeout", options.get("timeout", 20)),
        "prompt": options.get("prompt"),
        "name": playbook.get("name", "Task")
    }
@@ -360,7 +360,7 @@ el.replaceWith(d);
<div class="desc"><p>Run a structured Connpy YAML automation playbook (from path or content).</p></div>
</dd>
<dt id="connpy.services.execution_service.ExecutionService.test_commands"><code class="name flex">
<span>def <span class="ident">test_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>expected: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 10,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, Dict[str, bool]]</span>
<span>def <span class="ident">test_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>expected: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 20,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, Dict[str, bool]]</span>
</code></dt>
<dd>
<details class="source">
@@ -374,7 +374,7 @@ el.replaceWith(d);
        expected: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        timeout: int = 20,
        folder: Optional[str] = None,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
@@ -449,7 +449,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.import_export_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -361,7 +361,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

+414 -19
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -113,6 +113,104 @@ el.replaceWith(d);
<pre><code class="python">class AIService(BaseService):
    """Business logic for interacting with AI agents and LLM configurations."""

    def build_context_blocks(self, raw_bytes: bytes, cmd_byte_positions: list, node_info: dict) -> list:
        """Identifies command blocks in the terminal history."""
        blocks = []
        if not (cmd_byte_positions and len(cmd_byte_positions) >= 2 and raw_bytes):
            return blocks

        default_prompt = r'>$|#$|\$$|>.$|#.$|\$.$'
        device_prompt = node_info.get("prompt", default_prompt) if isinstance(node_info, dict) else default_prompt
        prompt_re_str = re.sub(r'(?<!\\)\$', '', device_prompt)
        try:
            prompt_re = re.compile(prompt_re_str)
        except Exception:
            prompt_re = re.compile(re.sub(r'(?<!\\)\$', '', default_prompt))

        for i in range(1, len(cmd_byte_positions)):
            pos, known_cmd = cmd_byte_positions[i]
            prev_pos = cmd_byte_positions[i-1][0]

            if known_cmd:
                prev_chunk = raw_bytes[prev_pos:pos]
                prev_cleaned = log_cleaner(prev_chunk.decode(errors='replace'))
                prev_lines = [l for l in prev_cleaned.split('\n') if l.strip()]
                prompt_text = prev_lines[-1].strip() if prev_lines else ""
                preview = f"{prompt_text}{known_cmd}" if prompt_text else known_cmd
                blocks.append((pos, preview[:80]))
            else:
                chunk = raw_bytes[prev_pos:pos]
                cleaned = log_cleaner(chunk.decode(errors='replace'))
                lines = [l for l in cleaned.split('\n') if l.strip()]
                preview = lines[-1].strip() if lines else ""

                if preview:
                    match = prompt_re.search(preview)
                    if match:
                        cmd_text = preview[match.end():].strip()
                        if cmd_text:
                            blocks.append((pos, preview[:80]))
        return blocks

    def process_copilot_input(self, input_text: str, session_state: dict) -> dict:
        """Parses slash commands and manages session state. Returns directive dict."""
        text = input_text.strip()
        if not text.startswith('/'):
            return {"action": "execute", "clean_prompt": text, "overrides": {}}

        parts = text.split(maxsplit=1)
        cmd = parts[0].lower()
        args = parts[1] if len(parts) > 1 else ""

        # 1. State Commands (Persistent)
        if cmd == "/os":
            if args:
                session_state['os'] = args
                return {"action": "state_update", "message": f"OS context changed to {args}"}
        elif cmd == "/prompt":
            if args:
                session_state['prompt'] = args
                return {"action": "state_update", "message": f"Prompt regex changed to {args}"}
        elif cmd == "/memorize":
            if args:
                session_state['memories'].append(args)
                return {"action": "state_update", "message": f"Memory added: {args}"}
        elif cmd == "/clear":
            session_state['memories'] = []
            return {"action": "state_update", "message": "Memory cleared"}

        # 2. Hybrid Commands
        elif cmd == "/architect":
            if not args:
                session_state['persona'] = 'architect'
                return {"action": "state_update", "message": "Persona set to Architect"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "architect"}}

        elif cmd == "/engineer":
            if not args:
                session_state['persona'] = 'engineer'
                return {"action": "state_update", "message": "Persona set to Engineer"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "engineer"}}

        elif cmd == "/trust":
            if not args:
                session_state['trust_mode'] = True
                return {"action": "state_update", "message": "Auto-execute (trust) enabled for session"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"trust": True}}

        elif cmd == "/untrust":
            if not args:
                session_state['trust_mode'] = False
                return {"action": "state_update", "message": "Auto-execute (trust) disabled for session"}
            else:
                return {"action": "execute", "clean_prompt": args, "overrides": {"trust": False}}

        # Unknown command, execute normally
        return {"action": "execute", "clean_prompt": text, "overrides": {}}

    def ask(self, input_text, dryrun=False, chat_history=None, status=None, debug=False, session_id=None, console=None, chunk_callback=None, confirm_handler=None, trust=False, **overrides):
        """Send a prompt to the AI agent."""
        from connpy.ai import ai
@@ -126,6 +224,21 @@ el.replaceWith(d);
        agent = ai(self.config, console=console)
        return agent.confirm(input_text)

    def ask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
        """Ask the AI copilot for terminal assistance."""
        from connpy.ai import ai, run_ai_async
        agent = ai(self.config)
        future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
        return future.result()

    async def aask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
        """Ask the AI copilot for terminal assistance asynchronously."""
        from connpy.ai import ai, run_ai_async
        import asyncio
        agent = ai(self.config)
        future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
        return await asyncio.wrap_future(future)


    def list_sessions(self):
        """Return a list of all saved AI sessions."""
@@ -154,6 +267,40 @@ el.replaceWith(d);
        self.config.config["ai"] = settings
        self.config._saveconfig(self.config.file)

    def configure_mcp(self, name, url=None, enabled=None, auto_load_on_os=None, remove=False):
        """Update MCP server settings in the configuration with smart merging."""
        ai_settings = self.config.config.get("ai", {})
        mcp_servers = ai_settings.get("mcp_servers", {})

        if remove:
            if name in mcp_servers:
                del mcp_servers[name]
        else:
            # Get existing or new
            server_cfg = mcp_servers.get(name, {})

            # Partial updates
            if url is not None:
                server_cfg["url"] = url

            if enabled is not None:
                server_cfg["enabled"] = bool(enabled)
            elif "enabled" not in server_cfg:
                server_cfg["enabled"] = True # Default for new entries

            if auto_load_on_os is not None:
                if auto_load_on_os == "": # Explicit clear
                    if "auto_load_on_os" in server_cfg:
                        del server_cfg["auto_load_on_os"]
                else:
                    server_cfg["auto_load_on_os"] = auto_load_on_os

            mcp_servers[name] = server_cfg

        ai_settings["mcp_servers"] = mcp_servers
        self.config.config["ai"] = ai_settings
        self.config._saveconfig(self.config.file)

    def load_session_data(self, session_id):
        """Load a session's raw data by ID."""
        from connpy.ai import ai
@@ -173,6 +320,24 @@ el.replaceWith(d);
</ul>
<h3>Methods</h3>
<dl>
<dt id="connpy.services.AIService.aask_copilot"><code class="name flex">
<span>async def <span class="ident">aask_copilot</span></span>(<span>self, terminal_buffer, user_question, node_info=None, chunk_callback=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">async def aask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
    """Ask the AI copilot for terminal assistance asynchronously."""
    from connpy.ai import ai, run_ai_async
    import asyncio
    agent = ai(self.config)
    future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
    return await asyncio.wrap_future(future)</code></pre>
</details>
<div class="desc"><p>Ask the AI copilot for terminal assistance asynchronously.</p></div>
</dd>
<dt id="connpy.services.AIService.ask"><code class="name flex">
<span>def <span class="ident">ask</span></span>(<span>self,<br>input_text,<br>dryrun=False,<br>chat_history=None,<br>status=None,<br>debug=False,<br>session_id=None,<br>console=None,<br>chunk_callback=None,<br>confirm_handler=None,<br>trust=False,<br>**overrides)</span>
</code></dt>
@@ -189,6 +354,116 @@ el.replaceWith(d);
</details>
<div class="desc"><p>Send a prompt to the AI agent.</p></div>
</dd>
<dt id="connpy.services.AIService.ask_copilot"><code class="name flex">
<span>def <span class="ident">ask_copilot</span></span>(<span>self, terminal_buffer, user_question, node_info=None, chunk_callback=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def ask_copilot(self, terminal_buffer, user_question, node_info=None, chunk_callback=None):
    """Ask the AI copilot for terminal assistance."""
    from connpy.ai import ai, run_ai_async
    agent = ai(self.config)
    future = run_ai_async(agent.aask_copilot(terminal_buffer, user_question, node_info, chunk_callback=chunk_callback))
    return future.result()</code></pre>
</details>
<div class="desc"><p>Ask the AI copilot for terminal assistance.</p></div>
</dd>
<dt id="connpy.services.AIService.build_context_blocks"><code class="name flex">
<span>def <span class="ident">build_context_blocks</span></span>(<span>self, raw_bytes: bytes, cmd_byte_positions: list, node_info: dict) ‑> list</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def build_context_blocks(self, raw_bytes: bytes, cmd_byte_positions: list, node_info: dict) -> list:
    """Identifies command blocks in the terminal history."""
    blocks = []
    if not (cmd_byte_positions and len(cmd_byte_positions) >= 2 and raw_bytes):
        return blocks

    default_prompt = r'>$|#$|\$$|>.$|#.$|\$.$'
    device_prompt = node_info.get("prompt", default_prompt) if isinstance(node_info, dict) else default_prompt
    prompt_re_str = re.sub(r'(?<!\\)\$', '', device_prompt)
    try:
        prompt_re = re.compile(prompt_re_str)
    except Exception:
        prompt_re = re.compile(re.sub(r'(?<!\\)\$', '', default_prompt))

    for i in range(1, len(cmd_byte_positions)):
        pos, known_cmd = cmd_byte_positions[i]
        prev_pos = cmd_byte_positions[i-1][0]

        if known_cmd:
            prev_chunk = raw_bytes[prev_pos:pos]
            prev_cleaned = log_cleaner(prev_chunk.decode(errors='replace'))
            prev_lines = [l for l in prev_cleaned.split('\n') if l.strip()]
            prompt_text = prev_lines[-1].strip() if prev_lines else ""
            preview = f"{prompt_text}{known_cmd}" if prompt_text else known_cmd
            blocks.append((pos, preview[:80]))
        else:
            chunk = raw_bytes[prev_pos:pos]
            cleaned = log_cleaner(chunk.decode(errors='replace'))
            lines = [l for l in cleaned.split('\n') if l.strip()]
            preview = lines[-1].strip() if lines else ""

            if preview:
                match = prompt_re.search(preview)
                if match:
                    cmd_text = preview[match.end():].strip()
                    if cmd_text:
                        blocks.append((pos, preview[:80]))
    return blocks</code></pre>
</details>
<div class="desc"><p>Identifies command blocks in the terminal history.</p></div>
</dd>
<dt id="connpy.services.AIService.configure_mcp"><code class="name flex">
<span>def <span class="ident">configure_mcp</span></span>(<span>self, name, url=None, enabled=None, auto_load_on_os=None, remove=False)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def configure_mcp(self, name, url=None, enabled=None, auto_load_on_os=None, remove=False):
    """Update MCP server settings in the configuration with smart merging."""
    ai_settings = self.config.config.get("ai", {})
    mcp_servers = ai_settings.get("mcp_servers", {})

    if remove:
        if name in mcp_servers:
            del mcp_servers[name]
    else:
        # Get existing or new
        server_cfg = mcp_servers.get(name, {})

        # Partial updates
        if url is not None:
            server_cfg["url"] = url

        if enabled is not None:
            server_cfg["enabled"] = bool(enabled)
        elif "enabled" not in server_cfg:
            server_cfg["enabled"] = True # Default for new entries

        if auto_load_on_os is not None:
            if auto_load_on_os == "": # Explicit clear
                if "auto_load_on_os" in server_cfg:
                    del server_cfg["auto_load_on_os"]
            else:
                server_cfg["auto_load_on_os"] = auto_load_on_os

        mcp_servers[name] = server_cfg

    ai_settings["mcp_servers"] = mcp_servers
    self.config.config["ai"] = ai_settings
    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"><p>Update MCP server settings in the configuration with smart merging.</p></div>
</dd>
<dt id="connpy.services.AIService.configure_provider"><code class="name flex">
<span>def <span class="ident">configure_provider</span></span>(<span>self, provider, model=None, api_key=None)</span>
</code></dt>
@@ -278,6 +553,75 @@ el.replaceWith(d);
|
||||
</details>
|
||||
<div class="desc"><p>Load a session's raw data by ID.</p></div>
|
||||
</dd>
|
||||
<dt id="connpy.services.AIService.process_copilot_input"><code class="name flex">
<span>def <span class="ident">process_copilot_input</span></span>(<span>self, input_text: str, session_state: dict) ‑> dict</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def process_copilot_input(self, input_text: str, session_state: dict) -> dict:
    """Parses slash commands and manages session state. Returns directive dict."""
    text = input_text.strip()
    if not text.startswith('/'):
        return {"action": "execute", "clean_prompt": text, "overrides": {}}

    parts = text.split(maxsplit=1)
    cmd = parts[0].lower()
    args = parts[1] if len(parts) > 1 else ""

    # 1. State Commands (Persistent)
    if cmd == "/os":
        if args:
            session_state['os'] = args
        return {"action": "state_update", "message": f"OS context changed to {args}"}
    elif cmd == "/prompt":
        if args:
            session_state['prompt'] = args
        return {"action": "state_update", "message": f"Prompt regex changed to {args}"}
    elif cmd == "/memorize":
        if args:
            session_state['memories'].append(args)
        return {"action": "state_update", "message": f"Memory added: {args}"}
    elif cmd == "/clear":
        session_state['memories'] = []
        return {"action": "state_update", "message": "Memory cleared"}

    # 2. Hybrid Commands
    elif cmd == "/architect":
        if not args:
            session_state['persona'] = 'architect'
            return {"action": "state_update", "message": "Persona set to Architect"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "architect"}}

    elif cmd == "/engineer":
        if not args:
            session_state['persona'] = 'engineer'
            return {"action": "state_update", "message": "Persona set to Engineer"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"persona": "engineer"}}

    elif cmd == "/trust":
        if not args:
            session_state['trust_mode'] = True
            return {"action": "state_update", "message": "Auto-execute (trust) enabled for session"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"trust": True}}

    elif cmd == "/untrust":
        if not args:
            session_state['trust_mode'] = False
            return {"action": "state_update", "message": "Auto-execute (trust) disabled for session"}
        else:
            return {"action": "execute", "clean_prompt": args, "overrides": {"trust": False}}

    # Unknown command, execute normally
    return {"action": "execute", "clean_prompt": text, "overrides": {}}</code></pre>
</details>
<div class="desc"><p>Parses slash commands and manages session state. Returns directive dict.</p></div>
</dd>
</dl>
<h3>Inherited members</h3>
<ul class="hlist">
@@ -359,6 +703,10 @@ el.replaceWith(d);
if not isinstance(user_styles, dict):
    raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

+# Support both direct styles and nested under 'theme' key
+if "theme" in user_styles and isinstance(user_styles["theme"], dict):
+    user_styles = user_styles["theme"]
+
# Filter for valid styles only (prevent junk in config)
valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

@@ -416,6 +764,10 @@ el.replaceWith(d);
if not isinstance(user_styles, dict):
    raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

+# Support both direct styles and nested under 'theme' key
+if "theme" in user_styles and isinstance(user_styles["theme"], dict):
+    user_styles = user_styles["theme"]
+
# Filter for valid styles only (prevent junk in config)
valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

@@ -590,7 +942,7 @@ el.replaceWith(d);
commands: List[str],
variables: Optional[Dict[str, Any]] = None,
parallel: int = 10,
-timeout: int = 10,
+timeout: int = 20,
folder: Optional[str] = None,
prompt: Optional[str] = None,
on_node_complete: Optional[Callable] = None,
@@ -638,7 +990,7 @@ el.replaceWith(d);
expected: List[str],
variables: Optional[Dict[str, Any]] = None,
parallel: int = 10,
-timeout: int = 10,
+timeout: int = 20,
folder: Optional[str] = None,
prompt: Optional[str] = None,
on_node_complete: Optional[Callable] = None,
@@ -715,7 +1067,7 @@ el.replaceWith(d);
"commands": playbook["commands"],
"variables": playbook.get("variables"),
"parallel": options.get("parallel", parallel),
-"timeout": playbook.get("timeout", options.get("timeout", 10)),
+"timeout": playbook.get("timeout", options.get("timeout", 20)),
"prompt": options.get("prompt"),
"name": playbook.get("name", "Task")
}
@@ -770,7 +1122,7 @@ el.replaceWith(d);
<div class="desc"><p>Run a plain-text script containing one command per line.</p></div>
</dd>
<dt id="connpy.services.ExecutionService.run_commands"><code class="name flex">
-<span>def <span class="ident">run_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 10,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, str]</span>
+<span>def <span class="ident">run_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 20,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, str]</span>
</code></dt>
<dd>
<details class="source">
@@ -783,7 +1135,7 @@ el.replaceWith(d);
commands: List[str],
variables: Optional[Dict[str, Any]] = None,
parallel: int = 10,
-timeout: int = 10,
+timeout: int = 20,
folder: Optional[str] = None,
prompt: Optional[str] = None,
on_node_complete: Optional[Callable] = None,
@@ -865,7 +1217,7 @@ el.replaceWith(d);
"commands": playbook["commands"],
"variables": playbook.get("variables"),
"parallel": options.get("parallel", parallel),
-"timeout": playbook.get("timeout", options.get("timeout", 10)),
+"timeout": playbook.get("timeout", options.get("timeout", 20)),
"prompt": options.get("prompt"),
"name": playbook.get("name", "Task")
}
@@ -886,7 +1238,7 @@ el.replaceWith(d);
<div class="desc"><p>Run a structured Connpy YAML automation playbook (from path or content).</p></div>
</dd>
<dt id="connpy.services.ExecutionService.test_commands"><code class="name flex">
-<span>def <span class="ident">test_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>expected: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 10,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, Dict[str, bool]]</span>
+<span>def <span class="ident">test_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>expected: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 20,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None,<br>name: str | None = None) ‑> Dict[str, Dict[str, bool]]</span>
</code></dt>
<dd>
<details class="source">
@@ -900,7 +1252,7 @@ el.replaceWith(d);
expected: List[str],
variables: Optional[Dict[str, Any]] = None,
parallel: int = 10,
-timeout: int = 10,
+timeout: int = 20,
folder: Optional[str] = None,
prompt: Optional[str] = None,
on_node_complete: Optional[Callable] = None,
@@ -2231,14 +2583,26 @@ el.replaceWith(d);
from rich.console import Console

-buf = io.StringIO()
+import queue
+import threading
+
+q = queue.Queue()
+
+class QueueIO(io.StringIO):
+    def write(self, s):
+        q.put(s)
+        return len(s)
+    def flush(self):
+        pass
+
+buf = QueueIO()
old_console = printer._get_console()
old_err_console = printer._get_err_console()

def run_plugin():
    printer.set_thread_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_err_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_stream(buf)

    try:
        if hasattr(module, "Entrypoint"):
            module.Entrypoint(args, parser, app)
@@ -2250,9 +2614,16 @@ el.replaceWith(d);
printer.set_thread_console(old_console)
printer.set_thread_err_console(old_err_console)
printer.set_thread_stream(None)
+q.put(None)
+
-for line in buf.getvalue().splitlines(keepends=True):
-    yield line</code></pre>
+t = threading.Thread(target=run_plugin, daemon=True)
+t.start()
+
+while True:
+    item = q.get()
+    if item is None:
+        break
+    yield item</code></pre>
</details>
<div class="desc"><p>Business logic for enabling, disabling, and listing plugins.</p>
<p>Initialize the service.</p>
@@ -2507,14 +2878,26 @@ el.replaceWith(d);
from rich.console import Console

-buf = io.StringIO()
+import queue
+import threading
+
+q = queue.Queue()
+
+class QueueIO(io.StringIO):
+    def write(self, s):
+        q.put(s)
+        return len(s)
+    def flush(self):
+        pass
+
+buf = QueueIO()
old_console = printer._get_console()
old_err_console = printer._get_err_console()

def run_plugin():
    printer.set_thread_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_err_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_stream(buf)

    try:
        if hasattr(module, "Entrypoint"):
            module.Entrypoint(args, parser, app)
@@ -2526,9 +2909,16 @@ el.replaceWith(d);
printer.set_thread_console(old_console)
printer.set_thread_err_console(old_err_console)
printer.set_thread_stream(None)
+q.put(None)
+
-for line in buf.getvalue().splitlines(keepends=True):
-    yield line</code></pre>
+t = threading.Thread(target=run_plugin, daemon=True)
+t.start()
+
+while True:
+    item = q.get()
+    if item is None:
+        break
+    yield item</code></pre>
</details>
<div class="desc"></div>
</dd>
@@ -3259,13 +3649,18 @@ el.replaceWith(d);
<ul>
<li>
<h4><code><a title="connpy.services.AIService" href="#connpy.services.AIService">AIService</a></code></h4>
-<ul class="two-column">
+<ul class="">
<li><code><a title="connpy.services.AIService.aask_copilot" href="#connpy.services.AIService.aask_copilot">aask_copilot</a></code></li>
<li><code><a title="connpy.services.AIService.ask" href="#connpy.services.AIService.ask">ask</a></code></li>
<li><code><a title="connpy.services.AIService.ask_copilot" href="#connpy.services.AIService.ask_copilot">ask_copilot</a></code></li>
<li><code><a title="connpy.services.AIService.build_context_blocks" href="#connpy.services.AIService.build_context_blocks">build_context_blocks</a></code></li>
<li><code><a title="connpy.services.AIService.configure_mcp" href="#connpy.services.AIService.configure_mcp">configure_mcp</a></code></li>
<li><code><a title="connpy.services.AIService.configure_provider" href="#connpy.services.AIService.configure_provider">configure_provider</a></code></li>
<li><code><a title="connpy.services.AIService.confirm" href="#connpy.services.AIService.confirm">confirm</a></code></li>
<li><code><a title="connpy.services.AIService.delete_session" href="#connpy.services.AIService.delete_session">delete_session</a></code></li>
<li><code><a title="connpy.services.AIService.list_sessions" href="#connpy.services.AIService.list_sessions">list_sessions</a></code></li>
<li><code><a title="connpy.services.AIService.load_session_data" href="#connpy.services.AIService.load_session_data">load_session_data</a></code></li>
<li><code><a title="connpy.services.AIService.process_copilot_input" href="#connpy.services.AIService.process_copilot_input">process_copilot_input</a></code></li>
</ul>
</li>
<li>
@@ -3377,7 +3772,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

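The `process_copilot_input` source shown in the services diff above turns every copilot input into a directive dict. A minimal standalone sketch of that dispatch pattern (hypothetical function name, only a subset of the commands, not connpy's exact code) looks like:

```python
def parse_copilot_input(text, session_state):
    """Sketch of the slash-command directive pattern from the diff above (illustrative only)."""
    text = text.strip()
    if not text.startswith('/'):
        # Plain text: pass straight through for execution, no overrides
        return {"action": "execute", "clean_prompt": text, "overrides": {}}
    parts = text.split(maxsplit=1)
    cmd = parts[0].lower()
    args = parts[1] if len(parts) > 1 else ""
    if cmd == "/os":
        # Persistent state command: mutate the session, return a status message
        if args:
            session_state["os"] = args
        return {"action": "state_update", "message": f"OS context changed to {args}"}
    if cmd == "/trust":
        if not args:
            # Bare command toggles session-wide state
            session_state["trust_mode"] = True
            return {"action": "state_update", "message": "Auto-execute (trust) enabled for session"}
        # Hybrid form: one-shot override applied to this prompt only
        return {"action": "execute", "clean_prompt": args, "overrides": {"trust": True}}
    # Unknown command: execute normally
    return {"action": "execute", "clean_prompt": text, "overrides": {}}
```

The hybrid form is the interesting design choice: the same command either mutates session state (no arguments) or acts as a one-shot override that leaves the session untouched (with arguments).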
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.node_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -786,7 +786,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.plugin_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -284,14 +284,26 @@ el.replaceWith(d);
from rich.console import Console

-buf = io.StringIO()
+import queue
+import threading
+
+q = queue.Queue()
+
+class QueueIO(io.StringIO):
+    def write(self, s):
+        q.put(s)
+        return len(s)
+    def flush(self):
+        pass
+
+buf = QueueIO()
old_console = printer._get_console()
old_err_console = printer._get_err_console()

def run_plugin():
    printer.set_thread_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_err_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_stream(buf)

    try:
        if hasattr(module, "Entrypoint"):
            module.Entrypoint(args, parser, app)
@@ -303,9 +315,16 @@ el.replaceWith(d);
printer.set_thread_console(old_console)
printer.set_thread_err_console(old_err_console)
printer.set_thread_stream(None)
+q.put(None)
+
-for line in buf.getvalue().splitlines(keepends=True):
-    yield line</code></pre>
+t = threading.Thread(target=run_plugin, daemon=True)
+t.start()
+
+while True:
+    item = q.get()
+    if item is None:
+        break
+    yield item</code></pre>
</details>
<div class="desc"><p>Business logic for enabling, disabling, and listing plugins.</p>
<p>Initialize the service.</p>
@@ -560,14 +579,26 @@ el.replaceWith(d);
from rich.console import Console

-buf = io.StringIO()
+import queue
+import threading
+
+q = queue.Queue()
+
+class QueueIO(io.StringIO):
+    def write(self, s):
+        q.put(s)
+        return len(s)
+    def flush(self):
+        pass
+
+buf = QueueIO()
old_console = printer._get_console()
old_err_console = printer._get_err_console()

def run_plugin():
    printer.set_thread_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_err_console(Console(file=buf, theme=printer.connpy_theme, force_terminal=True))
    printer.set_thread_stream(buf)

    try:
        if hasattr(module, "Entrypoint"):
            module.Entrypoint(args, parser, app)
@@ -579,9 +610,16 @@ el.replaceWith(d);
printer.set_thread_console(old_console)
printer.set_thread_err_console(old_err_console)
printer.set_thread_stream(None)
+q.put(None)
+
-for line in buf.getvalue().splitlines(keepends=True):
-    yield line</code></pre>
+t = threading.Thread(target=run_plugin, daemon=True)
+t.start()
+
+while True:
+    item = q.get()
+    if item is None:
+        break
+    yield item</code></pre>
</details>
<div class="desc"></div>
</dd>
@@ -671,7 +709,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

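The plugin-service hunks above replace a buffered `io.StringIO` capture with a queue-backed writer so plugin output streams line by line instead of arriving all at once. The core of that pattern, stripped of the connpy-specific console plumbing (the worker callable and function name here are illustrative, not connpy's API), can be sketched as:

```python
import io
import queue
import threading

def stream_worker_output(work):
    """Yield text chunks as a worker thread produces them (sketch of the pattern above).

    `work` is any callable that writes to the file-like object it receives.
    """
    q = queue.Queue()

    class QueueIO(io.StringIO):
        # write() feeds the queue instead of accumulating in the buffer
        def write(self, s):
            q.put(s)
            return len(s)
        def flush(self):
            pass

    buf = QueueIO()

    def run():
        try:
            work(buf)
        finally:
            q.put(None)  # sentinel: worker finished

    threading.Thread(target=run, daemon=True).start()
    while True:
        item = q.get()
        if item is None:
            break
        yield item
```

The consumer blocks on `q.get()` and terminates on the `None` sentinel, so output order is preserved and the generator ends exactly when the worker does.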
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.profile_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -429,7 +429,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.provider API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -164,7 +164,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.sync_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -964,7 +964,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.system_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -325,7 +325,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.conftest API documentation</title>
<meta name="description" content="Shared fixtures for connpy tests …">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -258,7 +258,7 @@ def tmp_config_dir(tmp_path):
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -48,6 +48,10 @@ el.replaceWith(d);
<dd>
<div class="desc"><p>Tests for connpy.ai module.</p></div>
</dd>
+<dt><code class="name"><a title="connpy.tests.test_ai_copilot" href="test_ai_copilot.html">connpy.tests.test_ai_copilot</a></code></dt>
+<dd>
+<div class="desc"></div>
+</dd>
<dt><code class="name"><a title="connpy.tests.test_capture" href="test_capture.html">connpy.tests.test_capture</a></code></dt>
<dd>
<div class="desc"><p>Tests for connpy.core_plugins.capture</p></div>
@@ -131,6 +135,7 @@ el.replaceWith(d);
<ul>
<li><code><a title="connpy.tests.conftest" href="conftest.html">connpy.tests.conftest</a></code></li>
<li><code><a title="connpy.tests.test_ai" href="test_ai.html">connpy.tests.test_ai</a></code></li>
+<li><code><a title="connpy.tests.test_ai_copilot" href="test_ai_copilot.html">connpy.tests.test_ai_copilot</a></code></li>
<li><code><a title="connpy.tests.test_capture" href="test_capture.html">connpy.tests.test_capture</a></code></li>
<li><code><a title="connpy.tests.test_completion" href="test_completion.html">connpy.tests.test_completion</a></code></li>
<li><code><a title="connpy.tests.test_configfile" href="test_configfile.html">connpy.tests.test_configfile</a></code></li>
@@ -152,7 +157,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
-<meta name="generator" content="pdoc3 0.11.6">
+<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_ai API documentation</title>
<meta name="description" content="Tests for connpy.ai module.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -1731,7 +1731,7 @@ def myai(self, ai_config, mock_litellm):
</nav>
</main>
<footer id="footer">
-<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
+<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -0,0 +1,315 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.tests.test_ai_copilot API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.tests.test_ai_copilot</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-functions">Functions</h2>
<dl>
<dt id="connpy.tests.test_ai_copilot.mock_acompletion"><code class="name flex">
<span>def <span class="ident">mock_acompletion</span></span>(<span>)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">@pytest.fixture
def mock_acompletion():
    # Patch acompletion inside connpy.ai.aask_copilot
    with patch('litellm.acompletion') as mock:
        yield mock</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.tests.test_ai_copilot.test_aask_copilot_fallback"><code class="name flex">
<span>def <span class="ident">test_aask_copilot_fallback</span></span>(<span>mock_acompletion)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def test_aask_copilot_fallback(mock_acompletion):
    agent = ai(DummyConfig())

    # Setup mock response for streaming
    class MockDelta:
        def __init__(self, content):
            self.content = content

    class MockChoice:
        def __init__(self, content):
            self.delta = MockDelta(content)

    class MockChunk:
        def __init__(self, content):
            self.choices = [MockChoice(content)]

    async def mock_ac(*args, **kwargs):
        return MockAsyncIterator([
            MockChunk("Here is some text response instead of tool call.")
        ])

    mock_acompletion.side_effect = mock_ac

    async def run_test():
        return await agent.aask_copilot("Router#", "What do I do?")

    result = asyncio.run(run_test())

    if result["error"]:
        print(f"ERROR OCCURRED: {result['error']}")

    assert result["error"] is None
    assert result["guide"] == "Here is some text response instead of tool call."
    assert result["risk_level"] == "low"</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.tests.test_ai_copilot.test_aask_copilot_tool_call"><code class="name flex">
<span>def <span class="ident">test_aask_copilot_tool_call</span></span>(<span>mock_acompletion)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def test_aask_copilot_tool_call(mock_acompletion):
    agent = ai(DummyConfig())

    # Setup mock response for streaming
    class MockDelta:
        def __init__(self, content):
            self.content = content

    class MockChoice:
        def __init__(self, content):
            self.delta = MockDelta(content)

    class MockChunk:
        def __init__(self, content):
            self.choices = [MockChoice(content)]

    # acompletion is awaited and returns an async iterator
    async def mock_ac(*args, **kwargs):
        return MockAsyncIterator([
            MockChunk("<guide>Check the interfaces and running config.</guide>"),
            MockChunk("<commands>\nshow ip int br\nshow run\n</commands>"),
            MockChunk("<risk>low</risk>")
        ])

    mock_acompletion.side_effect = mock_ac

    async def run_test():
        return await agent.aask_copilot("Router#", "What do I do?")

    result = asyncio.run(run_test())

    if result["error"]:
        print(f"ERROR OCCURRED: {result['error']}")

    assert result["error"] is None
    assert result["guide"] == "Check the interfaces and running config."
    assert result["risk_level"] == "low"
    assert result["commands"] == ["show ip int br", "show run"]</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.tests.test_ai_copilot.test_ingress_task_interception"><code class="name flex">
<span>def <span class="ident">test_ingress_task_interception</span></span>(<span>)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def test_ingress_task_interception():
    async def run_test():
        c = node("test_node", "1.2.3.4")
        c.mylog = MagicMock()
        c.mylog.getvalue.return_value = b"Some session log"
        c.unique = "test_node"
        c.host = "1.2.3.4"
        c.tags = {"os": "cisco_ios"}

        class MockStream:
            def __init__(self):
                self.data = [b"a", b"b", b"\x00", b"c", b""]
            async def read(self):
                if self.data:
                    return self.data.pop(0)
                return b""
            def setup(self, resize_callback):
                pass

        stream = MockStream()

        called_copilot = False
        async def mock_handler(buffer, node_info, s, child_fd):
            nonlocal called_copilot
            called_copilot = True
            assert buffer == "Some session log"
            assert node_info["os"] == "cisco_ios"

        c.child = MagicMock()
        c.child.child_fd = 123
        c.child.after = b""
        c.child.buffer = b""

        async def mock_ingress():
            while True:
                data = await stream.read()
                if not data:
                    break

                if mock_handler and b'\x00' in data:
                    buffer = c.mylog.getvalue().decode()
                    node_info = {"name": getattr(c, 'unique', 'unknown'), "host": getattr(c, 'host', 'unknown')}
                    if isinstance(getattr(c, 'tags', None), dict):
                        node_info["os"] = c.tags.get("os", "unknown")
                    await mock_handler(buffer, node_info, stream, c.child.child_fd)
                    continue

        await mock_ingress()
        assert called_copilot

    asyncio.run(run_test())</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.tests.test_ai_copilot.test_logclean_ansi"><code class="name flex">
<span>def <span class="ident">test_logclean_ansi</span></span>(<span>)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def test_logclean_ansi():
    c = node("test_node", "1.2.3.4")
    raw = "Router#\x1b[K\x1b[m show ip"
    clean = c._logclean(raw, var=True)
    assert "\x1b" not in clean</code></pre>
</details>
<div class="desc"></div>
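`node._logclean` is internal to connpy; the property the test above checks (no ESC bytes survive cleaning) can be illustrated with a generic ANSI escape-sequence stripper — a standalone sketch, not connpy's actual implementation:

```python
import re

# Generic CSI escape-sequence pattern (illustrative only; connpy's
# _logclean does more than this, e.g. carriage-return handling).
ANSI_RE = re.compile(r"\x1b\[[0-9;?]*[ -/]*[@-~]")

def strip_ansi(text: str) -> str:
    # Remove every ESC [ ... final-byte sequence from the text.
    return ANSI_RE.sub("", text)

raw = "Router#\x1b[K\x1b[m show ip"
clean = strip_ansi(raw)  # "Router# show ip"
```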
</dd>
</dl>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.tests.test_ai_copilot.DummyConfig"><code class="flex name class">
<span>class <span class="ident">DummyConfig</span></span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class DummyConfig:
    def __init__(self):
        self.config = {"ai": {"engineer_api_key": "test_key", "engineer_model": "test_model"}}
        self.defaultdir = "/tmp"</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.tests.test_ai_copilot.MockAsyncIterator"><code class="flex name class">
<span>class <span class="ident">MockAsyncIterator</span></span>
<span>(</span><span>items)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class MockAsyncIterator:
    def __init__(self, items):
        self.items = items
    def __aiter__(self):
        return self
    async def __anext__(self):
        if not self.items:
            raise StopAsyncIteration
        return self.items.pop(0)</code></pre>
</details>
<div class="desc"></div>
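The iterator above can be exercised on its own; a minimal, self-contained sketch (standard library only, class body reproduced from the source above) of how the streaming tests consume it:

```python
import asyncio

class MockAsyncIterator:
    """Async iterator that replays a fixed list of items, as used by the tests."""
    def __init__(self, items):
        self.items = items

    def __aiter__(self):
        return self

    async def __anext__(self):
        if not self.items:
            raise StopAsyncIteration
        return self.items.pop(0)

async def collect():
    # Consume the iterator the same way the streaming code consumes chunks.
    chunks = []
    async for chunk in MockAsyncIterator(["a", "b", "c"]):
        chunks.append(chunk)
    return chunks

result = asyncio.run(collect())  # ['a', 'b', 'c']
```

Because `__anext__` pops from the front of the list, the iterator yields items in order and terminates cleanly once the list is exhausted.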
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.tests" href="index.html">connpy.tests</a></code></li>
</ul>
</li>
<li><h3><a href="#header-functions">Functions</a></h3>
<ul class="">
<li><code><a title="connpy.tests.test_ai_copilot.mock_acompletion" href="#connpy.tests.test_ai_copilot.mock_acompletion">mock_acompletion</a></code></li>
<li><code><a title="connpy.tests.test_ai_copilot.test_aask_copilot_fallback" href="#connpy.tests.test_ai_copilot.test_aask_copilot_fallback">test_aask_copilot_fallback</a></code></li>
<li><code><a title="connpy.tests.test_ai_copilot.test_aask_copilot_tool_call" href="#connpy.tests.test_ai_copilot.test_aask_copilot_tool_call">test_aask_copilot_tool_call</a></code></li>
<li><code><a title="connpy.tests.test_ai_copilot.test_ingress_task_interception" href="#connpy.tests.test_ai_copilot.test_ingress_task_interception">test_ingress_task_interception</a></code></li>
<li><code><a title="connpy.tests.test_ai_copilot.test_logclean_ansi" href="#connpy.tests.test_ai_copilot.test_logclean_ansi">test_logclean_ansi</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.tests.test_ai_copilot.DummyConfig" href="#connpy.tests.test_ai_copilot.DummyConfig">DummyConfig</a></code></h4>
</li>
<li>
<h4><code><a title="connpy.tests.test_ai_copilot.MockAsyncIterator" href="#connpy.tests.test_ai_copilot.MockAsyncIterator">MockAsyncIterator</a></code></h4>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_capture API documentation</title>
<meta name="description" content="Tests for connpy.core_plugins.capture">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -245,7 +245,7 @@ def mock_connapp():
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_completion API documentation</title>
<meta name="description" content="Tests for connpy.completion module.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -257,7 +257,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_configfile API documentation</title>
<meta name="description" content="Tests for connpy.configfile module.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -2005,7 +2005,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_connapp API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -699,7 +699,7 @@ def test_run(mock_run_commands, app):
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_core API documentation</title>
<meta name="description" content="Tests for connpy.core module — node and nodes classes.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -1369,7 +1369,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_execution_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -142,7 +142,7 @@ Regression: ExecutionService.test_commands currently ignores on_node_complete.</
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_grpc_layer API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -581,8 +581,8 @@ def test_interact_node_uses_passed_name(self, mock_node, servicer):

        mock_resp = MagicMock()
        mock_resp.success = True
        mock_resp.stdout_data = b''
        stub.stub.interact_node.return_value = iter([mock_resp])

        with patch("connpy.printer.success") as mock_success:
            with patch("sys.stdin.fileno", return_value=0):
                mock_select.return_value = ([], [], [])
@@ -626,8 +626,8 @@ def test_connect_dynamic_msg_formatting_ssm(self, mock_select, mock_read, mock_s

        mock_resp = MagicMock()
        mock_resp.success = True
        mock_resp.stdout_data = b''
        stub.stub.interact_node.return_value = iter([mock_resp])

        with patch("connpy.printer.success") as mock_success:
            with patch("sys.stdin.fileno", return_value=0):
                mock_select.return_value = ([], [], [])
@@ -709,7 +709,7 @@ def test_connect_dynamic_msg_formatting_ssm(self, mock_select, mock_read, mock_s
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_hooks API documentation</title>
<meta name="description" content="Tests for connpy.hooks module — MethodHook and ClassHook.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -673,7 +673,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_node_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -178,7 +178,7 @@ Regression: connapp._mod calls add_node instead of update_node.</p></div>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_plugins API documentation</title>
<meta name="description" content="Tests for connpy.plugins module.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -917,7 +917,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_printer API documentation</title>
<meta name="description" content="Tests for connpy.printer module.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -459,7 +459,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_printer_concurrency API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -148,7 +148,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_profile_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -192,7 +192,7 @@ Regression: ProfileService currently doesn't resolve inheritance within profiles
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_provider API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -139,7 +139,7 @@ el.replaceWith(d);
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

@@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.6">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.tests.test_sync API documentation</title>
<meta name="description" content="Tests for connpy.services.sync_service">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
@@ -354,7 +354,7 @@ def test_perform_restore(self, mock_remove, mock_dirname, mock_exists, MockZipFi
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
@@ -3,7 +3,7 @@
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.6">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.tunnels API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
@@ -94,6 +94,24 @@ el.replaceWith(d);
|
||||
# signal handling not supported on some loops (e.g., Windows Proactor)
|
||||
pass
|
||||
|
||||
def stop_reading(self):
|
||||
"""Temporarily stop reading from stdin."""
|
||||
if self._loop and self.stdin_fd is not None:
|
||||
try:
|
||||
self._loop.remove_reader(self.stdin_fd)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def start_reading(self):
|
||||
"""Resume reading from stdin."""
|
||||
if self._loop and self.stdin_fd is not None:
|
||||
try:
|
||||
# Ensure we don't add it twice
|
||||
self._loop.remove_reader(self.stdin_fd)
|
||||
except Exception:
|
||||
pass
|
||||
self._loop.add_reader(self.stdin_fd, self._read_ready)
|
||||
|
||||
def teardown(self):
|
||||
if self._loop:
|
||||
try:
|
||||
@@ -216,6 +234,44 @@ Handles terminal raw mode, async I/O, and SIGWINCH signals.</p></div>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.tunnels.LocalStream.start_reading"><code class="name flex">
|
||||
<span>def <span class="ident">start_reading</span></span>(<span>self)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def start_reading(self):
|
||||
"""Resume reading from stdin."""
|
||||
if self._loop and self.stdin_fd is not None:
|
||||
try:
|
||||
# Ensure we don't add it twice
|
||||
self._loop.remove_reader(self.stdin_fd)
|
||||
except Exception:
|
||||
pass
|
||||
self._loop.add_reader(self.stdin_fd, self._read_ready)</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Resume reading from stdin.</p></div>
|
||||
</dd>
|
||||
<dt id="connpy.tunnels.LocalStream.stop_reading"><code class="name flex">
|
||||
<span>def <span class="ident">stop_reading</span></span>(<span>self)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def stop_reading(self):
|
||||
"""Temporarily stop reading from stdin."""
|
||||
if self._loop and self.stdin_fd is not None:
|
||||
try:
|
||||
self._loop.remove_reader(self.stdin_fd)
|
||||
except Exception:
|
||||
pass</code></pre>
|
</details>
<div class="desc"><p>Temporarily stop reading from stdin.</p></div>
</dd>
<dt id="connpy.tunnels.LocalStream.teardown"><code class="name flex">
<span>def <span class="ident">teardown</span></span>(<span>self)</span>
</code></dt>
@@ -293,6 +349,7 @@ Handles terminal raw mode, async I/O, and SIGWINCH signals.</p></div>
        self.response_queue = response_queue
        self.running = True
        self._reader_queue = asyncio.Queue()
        self.copilot_queue = asyncio.Queue()
        self.resize_callback = None
        self._loop = None
        self.t = None
@@ -309,6 +366,19 @@ Handles terminal raw mode, async I/O, and SIGWINCH signals.</p></div>
            if req.cols > 0 and req.rows > 0:
                if self.resize_callback:
                    self._loop.call_soon_threadsafe(self.resize_callback, req.rows, req.cols)
            # Copilot dispatching
            copilot_msg = {}
            if getattr(req, "copilot_question", ""):
                copilot_msg.update({
                    "question": req.copilot_question,
                    "context_buffer": getattr(req, "copilot_context_buffer", ""),
                    "node_info_json": getattr(req, "copilot_node_info_json", "")
                })
            if getattr(req, "copilot_action", ""):
                copilot_msg["action"] = req.copilot_action

            if copilot_msg:
                self._loop.call_soon_threadsafe(self.copilot_queue.put_nowait, copilot_msg)
            if req.stdin_data:
                self._loop.call_soon_threadsafe(self._reader_queue.put_nowait, req.stdin_data)
        except Exception:
@@ -374,6 +444,19 @@ Bridges the blocking gRPC iterators with the async _async_interact_loop.</p></div>
            if req.cols > 0 and req.rows > 0:
                if self.resize_callback:
                    self._loop.call_soon_threadsafe(self.resize_callback, req.rows, req.cols)
            # Copilot dispatching
            copilot_msg = {}
            if getattr(req, "copilot_question", ""):
                copilot_msg.update({
                    "question": req.copilot_question,
                    "context_buffer": getattr(req, "copilot_context_buffer", ""),
                    "node_info_json": getattr(req, "copilot_node_info_json", "")
                })
            if getattr(req, "copilot_action", ""):
                copilot_msg["action"] = req.copilot_action

            if copilot_msg:
                self._loop.call_soon_threadsafe(self.copilot_queue.put_nowait, copilot_msg)
            if req.stdin_data:
                self._loop.call_soon_threadsafe(self._reader_queue.put_nowait, req.stdin_data)
        except Exception:
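The reader-thread hunks above hand messages from a blocking thread over to the asyncio loop with `call_soon_threadsafe` plus `Queue.put_nowait`. A minimal, self-contained sketch of that hand-off pattern (the worker and message here are illustrative, not connpy's gRPC iterator):

```python
import asyncio
import threading

async def main():
    loop = asyncio.get_running_loop()
    copilot_queue = asyncio.Queue()

    # A blocking worker thread (standing in for the gRPC request iterator)
    # must schedule the put on the loop thread: asyncio.Queue is not
    # thread-safe on its own, but call_soon_threadsafe is.
    def worker():
        msg = {"question": "what does this prompt mean?"}
        loop.call_soon_threadsafe(copilot_queue.put_nowait, msg)

    t = threading.Thread(target=worker)
    t.start()
    msg = await asyncio.wait_for(copilot_queue.get(), timeout=2)
    t.join()
    return msg

result = asyncio.run(main())
print(result["question"])
```

The same shape works for the stdin_data path: the only thread-safe entry point into the loop is `call_soon_threadsafe`.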
@@ -438,9 +521,11 @@ Bridges the blocking gRPC iterators with the async _async_interact_loop.</p></div>
<ul>
<li>
<h4><code><a title="connpy.tunnels.LocalStream" href="#connpy.tunnels.LocalStream">LocalStream</a></code></h4>
<ul class="">
<ul class="two-column">
<li><code><a title="connpy.tunnels.LocalStream.read" href="#connpy.tunnels.LocalStream.read">read</a></code></li>
<li><code><a title="connpy.tunnels.LocalStream.setup" href="#connpy.tunnels.LocalStream.setup">setup</a></code></li>
<li><code><a title="connpy.tunnels.LocalStream.start_reading" href="#connpy.tunnels.LocalStream.start_reading">start_reading</a></code></li>
<li><code><a title="connpy.tunnels.LocalStream.stop_reading" href="#connpy.tunnels.LocalStream.stop_reading">stop_reading</a></code></li>
<li><code><a title="connpy.tunnels.LocalStream.teardown" href="#connpy.tunnels.LocalStream.teardown">teardown</a></code></li>
<li><code><a title="connpy.tunnels.LocalStream.write" href="#connpy.tunnels.LocalStream.write">write</a></code></li>
</ul>
@@ -460,7 +545,7 @@ Bridges the blocking gRPC iterators with the async _async_interact_loop.</p></div>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.6</a>.</p>
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>

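`start_reading` above removes the stdin reader before re-adding it, so a repeated call never leaves a stale registration. A runnable sketch of that idempotent `add_reader` pattern on a plain pipe (the `ensure_reader` helper is ours, not connpy's API; Unix-only, since `add_reader` needs a selector event loop):

```python
import asyncio
import os

def ensure_reader(loop, fd, callback):
    # Same defensive pattern as LocalStream.start_reading: remove first,
    # ignoring the case where the fd was never registered.
    try:
        loop.remove_reader(fd)
    except Exception:
        pass
    loop.add_reader(fd, callback)

async def main():
    r, w = os.pipe()
    os.set_blocking(r, False)
    loop = asyncio.get_running_loop()
    received = []
    ensure_reader(loop, r, lambda: received.append(os.read(r, 1024)))
    ensure_reader(loop, r, lambda: received.append(os.read(r, 1024)))  # safe to repeat
    os.write(w, b"prompt> ")
    await asyncio.sleep(0.1)
    loop.remove_reader(r)
    os.close(r)
    os.close(w)
    return received

data = asyncio.run(main())
print(data)
```

Because the reader is removed before being re-added, the callback fires once per readiness event even after repeated `ensure_reader` calls.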
@@ -0,0 +1,130 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.utils API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.utils</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-functions">Functions</h2>
|
||||
<dl>
|
||||
<dt id="connpy.utils.log_cleaner"><code class="name flex">
|
||||
<span>def <span class="ident">log_cleaner</span></span>(<span>data: str) ‑> str</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def log_cleaner(data: str) -> str:
    """
    Stateless utility to remove ANSI sequences and process cursor movements.
    """
    if not data:
        return ""

    lines = data.split('\n')
    cleaned_lines = []

    # Regex to capture: ANSI sequences, control characters (\r, \b, etc), and plain text chunks
    token_re = re.compile(r'(\x1B(?:[@-Z\\-_]|\[[0-?]*[ -/ ]*[@-~])|\r|\b|\x7f|[\x00-\x1F]|[^\x1B\r\b\x7f\x00-\x1F]+)')

    for line in lines:
        buffer = []
        cursor = 0

        for token in token_re.findall(line):
            if token == '\r':
                cursor = 0
            elif token in ('\b', '\x7f'):
                if cursor > 0:
                    cursor -= 1
            elif token == '\x1B[D': # Left Arrow
                if cursor > 0:
                    cursor -= 1
            elif token == '\x1B[C': # Right Arrow
                if cursor < len(buffer):
                    cursor += 1
            elif token == '\x1B[K': # Clear to end of line
                buffer = buffer[:cursor]
            elif token.startswith('\x1B'):
                continue
            elif len(token) == 1 and ord(token) < 32:
                continue
            else:
                for char in token:
                    if cursor == len(buffer):
                        buffer.append(char)
                    else:
                        buffer[cursor] = char
                    cursor += 1
        cleaned_lines.append("".join(buffer))

    return "\n".join(cleaned_lines).replace('\n\n', '\n').strip()</code></pre>
</details>
|
||||
<div class="desc"><p>Stateless utility to remove ANSI sequences and process cursor movements.</p></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy" href="index.html">connpy</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-functions">Functions</a></h3>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.utils.log_cleaner" href="#connpy.utils.log_cleaner">log_cleaner</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
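The new `connpy.utils` page above documents `log_cleaner`. As a quick sanity check of the documented behavior, here is a condensed copy of the listed function (arrow-key handling omitted for brevity) exercised on a few typical terminal captures:

```python
import re

def log_cleaner(data: str) -> str:
    """Condensed from the documented source: strip ANSI sequences and
    replay \r and backspace cursor movement on a virtual line buffer."""
    if not data:
        return ""
    cleaned_lines = []
    # ANSI sequences, control characters (\r, \b, ...), and plain text chunks
    token_re = re.compile(r'(\x1B(?:[@-Z\\-_]|\[[0-?]*[ -/ ]*[@-~])|\r|\b|\x7f|[\x00-\x1F]|[^\x1B\r\b\x7f\x00-\x1F]+)')
    for line in data.split('\n'):
        buffer = []
        cursor = 0
        for token in token_re.findall(line):
            if token == '\r':                       # carriage return: back to column 0
                cursor = 0
            elif token in ('\b', '\x7f'):           # backspace / delete
                if cursor > 0:
                    cursor -= 1
            elif token == '\x1B[K':                 # clear to end of line
                buffer = buffer[:cursor]
            elif token.startswith('\x1B'):          # any other escape: drop it
                continue
            elif len(token) == 1 and ord(token) < 32:
                continue
            else:                                   # plain text: overwrite at cursor
                for char in token:
                    if cursor == len(buffer):
                        buffer.append(char)
                    else:
                        buffer[cursor] = char
                    cursor += 1
        cleaned_lines.append("".join(buffer))
    return "\n".join(cleaned_lines).replace('\n\n', '\n').strip()

print(log_cleaner("\x1b[31mshow version\x1b[0m"))   # colors stripped -> "show version"
print(log_cleaner("hello\x08\x08world"))            # backspaces replayed -> "helworld"
print(log_cleaner("abc\rX"))                        # \r overwrite -> "Xbc"
```

This is why device prompts captured through the terminal render cleanly in logs: cursor movement is replayed rather than left as raw control bytes.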
@@ -13,5 +13,10 @@ protobuf>=6.31.1,<7.0.0
google-api-python-client>=2.125.0
google-auth-oauthlib>=1.2.0
google-auth-httplib2>=0.2.0
prompt-toolkit>=3.0.0
mcp>=1.2.0
aiohttp>=3.9.0
httpx>=0.27.0
requests>=2.31.0
pytest>=8.0.0
pytest-mock>=3.12.0

@@ -18,12 +18,16 @@ classifiers =
|
||||
Topic :: System :: Networking
|
||||
Intended Audience :: Telecommunications Industry
|
||||
Programming Language :: Python :: 3
|
||||
Programming Language :: Python :: 3.10
|
||||
Programming Language :: Python :: 3.11
|
||||
Programming Language :: Python :: 3.12
|
||||
Natural Language :: English
|
||||
Operating System :: MacOS
|
||||
Operating System :: Unix
|
||||
|
||||
[options]
|
||||
packages = find:
|
||||
python_requires = >=3.10
|
||||
install_requires =
|
||||
rich>=13.7.1
|
||||
rich-argparse>=1.4.0
|
||||
@@ -40,6 +44,11 @@ install_requires =
|
||||
google-api-python-client>=2.125.0
|
||||
google-auth-oauthlib>=1.2.0
|
||||
google-auth-httplib2>=0.2.0
|
||||
prompt-toolkit>=3.0.0
|
||||
mcp>=1.2.0
|
||||
aiohttp>=3.9.0
|
||||
httpx>=0.27.0
|
||||
requests>=2.31.0
|
||||
|
||||
[options.entry_points]
|
||||
console_scripts =
|
||||
|
||||