feat: major architectural refactor to 5.1b1 - Service Layer, gRPC & Agent evolution (fragmented secrets)
@@ -145,3 +145,8 @@ package.json
# Development docs
connpy_roadmap.md
testall/
testremote/
*.db
*.patch
scratch.py
@@ -0,0 +1,123 @@
# Architecture Plan: Creating a Service Layer in Connpy

This document details the step-by-step plan to refactor `connpy` and extract the current business logic (coupled inside `connapp.py` and `api.py`) into a clean, reusable **Service Layer**.

## 🎯 Goals

1. **Decouple the CLI (`connapp.py`)**: The CLI should only parse arguments (`argparse`), prompt the user for input (`inquirer`, `rich.prompt`), and render output to the screen (`rich`).
2. **Decouple the API (`api.py`)**: The current Flask API and the future gRPC API should only expose endpoints and delegate execution to the underlying layer.
3. **Centralize business logic**: Every operation on nodes, profiles, configuration, command execution, AI, plugins, and import/export will live in the new service layer. This guarantees that running an action from the local CLI, the remote CLI, or the API produces **exactly the same behavior**.

---
## 🏗️ 1. Service Layer Structure

We will create a new `connpy/services/` package that groups the domain responsibilities. Based on all the commands in `connapp.py`, the structure will be:

```text
connpy/
└── services/
    ├── __init__.py
    ├── node_service.py          # Node/folder CRUD, bulk, move, copy and list
    ├── profile_service.py       # Profile CRUD
    ├── execution_service.py     # Parallel command execution (ad-hoc, scripts, yaml, test)
    ├── import_export_service.py # Import/export of configuration to YAML
    ├── ai_service.py            # Agent (Claude/LLMs) interactions and their configuration
    ├── plugin_service.py        # Enable, disable and list plugins
    ├── config_service.py        # Global app configuration (case, fzf, idletime)
    ├── system_service.py        # Lifecycle control (start/stop local API)
    └── exceptions.py            # Business exceptions (e.g. NodeNotFoundError)
```

---
## 🛠️ 2. Service Design (Complete Use Cases)

Below is the detailed list of services, mapping every feature of the current application:

### 1. `NodeService`
Handles all interaction with `configfile` related to the network topology (nodes and folders).

- `list_nodes(filter: str/list) -> list`: Returns the list of nodes (`list` command).
- `list_folders(filter: str/list) -> list`: Returns the list of folders.
- `get_node_details(unique: str) -> dict`: Returns a node's configuration (`node show`).
- `add_node(unique: str, data: dict) -> None`: Adds a new node (`node -a`).
- `update_node(unique: str, data: dict) -> None`: Modifies a node (`node -e`).
- `delete_node(unique: str) -> None`: Deletes a node (`node -r`).
- `move_node(src: str, dst: str) -> None`: Renames or moves nodes to other folders (`move`).
- `copy_node(src: str, dst: str) -> None`: Duplicates an existing node (`copy`).
- `bulk_add_nodes(folder: str, nodes_data: list) -> dict`: Logic for mass node creation (`bulk`).

### 2. `ProfileService`

- `list_profiles() -> list`: Lists the available profiles (`list`).
- `get_profile(name: str) -> dict`: Shows a profile (`profile show`).
- `add_profile(name: str, data: dict) -> None`: Adds a profile (`profile -a`).
- `update_profile(name: str, data: dict) -> None`: Modifies a profile (`profile mod`).
- `delete_profile(name: str) -> None`: Deletes a profile (`profile -r`).

### 3. `ExecutionService`

Wraps the `core.nodes` class for connections and command dispatch, abstracting it away from `sys.stdout` and `print` calls.

- `run_commands(nodes_list: list, commands: list) -> dict`: Calls nodes in parallel and returns a dictionary with the results (`run`).
- `test_commands(nodes_list: list, commands: list, expected: str) -> dict`: Validates the expected output.
- `run_cli_script(nodes_list: list, script_path: str) -> dict`: Reads and runs a plain script on the nodes.
- `run_yaml_playbook(playbook_path: str) -> dict`: Runs the complex logic defined in a YAML file.

### 4. `ImportExportService`

- `export_to_yaml(folder_name: str, output_path: str) -> None`: Safely exports the full configuration of a folder (`export`).
- `import_from_yaml(yaml_path: str, destination_folder: str) -> dict`: Parses and imports nodes from a YAML file, making sure there are no critical collisions (`import`).

### 5. `PluginService`

- `list_plugins() -> list`: Returns the state of every detected plugin (enabled/disabled) (`plugin`).
- `enable_plugin(name: str) -> None`: Enables a plugin in the configuration.
- `disable_plugin(name: str) -> None`: Disables a plugin in the configuration.

### 6. `ConfigService`

- `update_setting(key: str, value: any) -> None`: Generic or specific update of settings (fzf, case, idletime, configfolder) in the `configfile` (`config`).
- `get_settings() -> dict`: Returns the current global settings.

### 7. `AIService`

Wraps `connpy.ai.ai`.

- `ask(input_text: str, dryrun: bool, chat_history: list) -> dict/str`: Sends a query to the Agent (`ai`).
- `confirm(input_text: str) -> bool`: Safety mechanism.
- `configure_provider(provider: str, model: str, api_key: str) -> None`: Stores the OpenAI/Anthropic/Google configuration in config (`config openai/anthropic/google`).

### 8. `SystemService`

- `start_api(host: str, port: int) -> None`: Starts the API daemon or process (`api start`).
- `stop_api() -> None`: Stops the local process (`api stop`).
- `status_api() -> dict`: Returns the state of the local process.

---

## 🔌 3. About Plugins (Core Plugins)

Core plugins (such as `sync.py`) add their own `subparsers` directly to the CLI (e.g. `sync start`, `sync backup`, `sync restore`).

- **Plugin architecture**: To keep the service layer clean, plugins should instantiate their own Service when they need complex logic (e.g. a `GoogleSyncService` defined inside `core_plugins/sync.py`), or call the core services defined above. The application's plugin engine is left untouched, but plugin code should move towards Service Layer calls whenever it touches node data.

---

## 🚀 4. Updated Implementation Phases

### Phase 1: Skeleton and Data Models

1. Create the `connpy/services/` directory and the files listed above.
2. Define `exceptions.py` with errors such as `NodeNotFoundError`, `ProfileNotFoundError`, `DuplicateEntityError`.
3. Create `connpy/services/__init__.py`, which will expose the services so they can be imported easily (`from connpy.services import NodeService, ExecutionService`).

### Phase 2: CRUD and Configuration Migration

1. Refactor the CLI and the API to instantiate and use `NodeService`, `ProfileService`, `ConfigService` and `PluginService`.
2. All input-validation code (`_questions_nodes`, `_type_node`) stays in `connapp.py`, since it belongs to the Presentation/CLI layer, but the cleaned dictionaries are handed to the Service for final persistence.

### Phase 3: Import/Export and AI Migration

1. Extract the YAML logic into `ImportExportService`.
2. Move the API key configuration into `AIService`.

### Phase 4: Execution Migration (the most complex change)

1. Decouple `core.nodes` so it can return consolidated state (dictionaries with each node's command output) instead of printing asynchronously to the screen with `printer`.
2. Integrate `ExecutionService` into the `run`, `node` (connect), `test`, etc. commands.
3. The CLI will consume the results returned by the `ExecutionService` and format them with `rich`.

### Phase 5: Client/Server Preparation (gRPC/remote REST)

1. With the services fully isolated, when the CLI runs in "remote mode" it will inject a Remote Client that implements the same interfaces (the same `NodeService` methods) but serializes requests to the API instead of accessing the local encrypted configuration file directly.
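The interface-parity idea can be sketched as follows. The gRPC stub and its `ListNodes` call are hypothetical; the point is that the CLI depends only on the shared interface, never on the backend behind it:

```python
# Sketch: local service and remote proxy share the same interface,
# so the CLI never knows which backend it is talking to.

class LocalNodeService:
    def __init__(self, data):
        self._data = data  # stands in for the local encrypted configfile

    def list_nodes(self, filter=None):
        return [n for n in sorted(self._data) if not filter or filter in n]

class RemoteNodeService:
    def __init__(self, stub):
        self._stub = stub  # would be a generated gRPC stub in practice

    def list_nodes(self, filter=None):
        return self._stub.ListNodes(filter)  # hypothetical RPC

class FakeStub:
    """Test double simulating the server side of the RPC."""
    def __init__(self, data):
        self._data = data

    def ListNodes(self, filter):
        return [n for n in sorted(self._data) if not filter or filter in n]

def render(service):
    # CLI code: depends only on the list_nodes interface.
    return ", ".join(service.list_nodes())
```

Swapping `LocalNodeService` for `RemoteNodeService` changes nothing in the CLI code path.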

---

## ✅ Success Checklist

- [ ] No `print()`, `console.print()` or `Prompt.ask()` may exist inside the `services/` package.
- [ ] Every exception raised by `services/` must be handled visually by the consuming layer (`connapp.py` renders them, `api.py` returns 400/500 JSON).
- [ ] Make sure local behavior (CLI without networking) suffers no performance loss.
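As an illustration of the second checklist item, `api.py` could translate business exceptions into HTTP responses with a small mapping function. This is a sketch; the real route handlers would call it from inside a try/except:

```python
# Sketch: map service-layer exceptions to (status, json_body) pairs,
# keeping the services themselves free of any presentation logic.

class NodeNotFoundError(Exception):
    pass

def to_http(exc):
    """Translate a business exception into an HTTP status and JSON body."""
    if isinstance(exc, NodeNotFoundError):
        return 404, {"error": "node not found", "detail": str(exc)}
    if isinstance(exc, ValueError):
        return 400, {"error": "bad request", "detail": str(exc)}
    return 500, {"error": "internal error"}
```

The CLI would catch the same exceptions and render them with `rich` instead.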
@@ -59,7 +59,9 @@ For more detailed information, please read our [Privacy Policy](https://connpy.g
- Use AI with a multi-agent system (Engineer/Architect) to manage devices.
  Supports any LLM provider via litellm (OpenAI, Anthropic, Google, etc.).
  Features streaming responses, interactive chat, and extensible plugin tools.
- Add plugins with your own scripts, and execute them remotely.
- Fully decoupled gRPC Client/Server architecture.
- Unified UI with syntax highlighting and theming.
- Much more!

### Usage:
@@ -82,6 +84,9 @@ options:
  -s, --show            Show node[@subfolder][@folder]
  -d, --debug           Display all connection steps
  -t, --sftp            Connects using sftp instead of ssh
  --service-mode        Set the backend service mode (local or remote)
  --remote              Connect to a remote connpy service via gRPC
  --theme               UI Output theme (dark, light, or path)

Commands:
  profile               Manage profiles
@@ -141,6 +146,12 @@ options:
```
## Plugin Requirements for Connpy

### Remote Plugin Execution
When Connpy operates in remote mode, plugins are executed **transparently on the server**:
- The client automatically downloads the plugin source code (`Parser` class context) to generate the local `argparse` structure and provide autocompletion.
- The execution phase (`Entrypoint` class) is redirected via gRPC streams to execute in the server's memory, ensuring the plugin runs securely against the server's inventory without passing sensitive data to the client.
- You can manage remote plugins using the `--remote` flag (e.g. `connpy plugin --add myplugin script.py --remote`).

### General Structure
- The plugin script must be a Python file.
- Only the following top-level elements are allowed in the plugin script:
@@ -256,46 +267,37 @@ There are 2 methods that allows you to define custom logic to be executed before

### Command Completion Support

Plugins can provide intelligent **tab completion** by defining autocompletion logic. There are two supported methods, with the tree-based approach being the most modern and recommended.

#### 1. Tree-based Completion (Recommended)

Define a function called `_connpy_tree` that returns a declarative navigation tree. This method is highly efficient, supports complex state loops, and is very simple to implement for most use cases.

```python
def _connpy_tree(info=None):
    nodes = info.get("nodes", [])
    return {
        "__exclude_used__": True,              # Filter out words already typed
        "__extra__": nodes,                    # Suggest nodes at this level
        "--format": ["json", "yaml", "table"], # Fixed suggestions
        "*": {                                 # Wildcard matches any positional word
            "interface1": None,
            "interface2": None,
            "--verbose": None
        }
    }
```

#### Parameters
- **Keys**: Literal completions (exact matches).
- **`*` Key**: A wildcard that matches any positional word typed by the user.
- **`__extra__`**: A list or a callable `(words) -> list` that adds dynamic suggestions.
- **`__exclude_used__`**: (Boolean) If True, automatically filters out words already present in the command line.
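As an illustration of how such a tree could be resolved, here is a simplified walker. This is an assumption about the semantics, not Connpy's actual resolver, and it handles only literals, `*`, `__extra__` and `__exclude_used__`:

```python
def complete(tree, words):
    """Walk a _connpy_tree-style dict and return suggestions for the next word."""
    node = tree
    for w in words:
        if isinstance(node, dict) and w in node:
            node = node[w]          # literal match descends one level
        elif isinstance(node, dict) and "*" in node:
            node = node["*"]        # wildcard swallows any positional word
        else:
            return []
    if isinstance(node, (list, tuple)):   # leaf list = fixed suggestions
        suggestions = list(node)
    elif isinstance(node, dict):
        suggestions = [k for k in node if not k.startswith("__") and k != "*"]
        extra = node.get("__extra__", [])
        suggestions += list(extra(words) if callable(extra) else extra)
        if node.get("__exclude_used__"):
            suggestions = [s for s in suggestions if s not in words]
    else:                                 # None = nothing further to suggest
        return []
    return sorted(suggestions)
```

For the example tree above, an empty command line would suggest `--format` plus the node names, and a word matched by `*` would then suggest `interface1`, `interface2` and `--verbose`.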
#### 2. Legacy Function-based Completion

For backward compatibility or highly custom logic, you can define `_connpy_completion`.

#### Contents of `info`

The `info` dictionary contains helpful context to generate completions:

```python
info = {
    "config": config_dict,     # The full loaded configuration
    "nodes": node_list,        # List of all known node names
    "folders": folder_list,    # List of all defined folder names
    "profiles": profile_list,  # List of all profile names
    "plugins": plugin_list     # List of all plugin names
}
```

You can use this data to generate suggestions based on the current input.

#### Return Value

The function must return a list of suggestion strings to be presented to the user.

#### Example

```python
def _connpy_completion(wordsnumber, words, info=None):
    if wordsnumber == 3:
        return ["--help", "--verbose", "start", "stop"]
@@ -306,6 +308,12 @@ def _connpy_completion(wordsnumber, words, info=None):
    return []
```

| Parameter | Description |
|----------------|-------------|
| `wordsnumber` | Integer indicating the total number of words on the command line. For plugins, this typically starts at 3. |
| `words` | A list of tokens (words) already typed. `words[0]` is always the name of the plugin. |
| `info` | A dictionary of structured context data (`nodes`, `folders`, `profiles`, `config`). |

> In this example, if the user types `connpy myplugin start ` and presses Tab, it will suggest node names.

### Handling Unknown Arguments
@@ -471,111 +479,49 @@ class Preload:
    def __init__(self, connapp):
        connapp.ai.modify(_register_my_tools)
```

## gRPC Service Architecture
Connpy features a completely decoupled gRPC Client/Server architecture. You can run Connpy as a standalone background service and connect to it remotely via the CLI or other clients.

### 1. Start the Server
Start the gRPC service by running:
```bash
connpy api -s 50051
```
The server will handle all configurations, connections, AI sessions, and plugin execution locally on the machine it runs on.

### 2. Connect the Client
Configure your local CLI client to connect to the remote server:
```bash
connpy config --service-mode remote
connpy config --remote-host localhost:50051
```
Once configured, all commands (`connpy node`, `connpy list`, `connpy ai`, etc.) will execute transparently on the remote server via thin-client proxies. You can revert back to standalone execution at any time by running `connpy config --service-mode local`.

### Programmatic Access (gRPC & SOA)
If you wish to build your own application (Web, Desktop, or Scripts) using the Connpy backend, you can use the `ServiceProvider` to interact with either a local or remote service transparently.

```python
import connpy
from connpy.services.provider import ServiceProvider

# Initialize local config
config = connpy.configfile()

# Connect to the remote gRPC service
services = ServiceProvider(
    config,
    mode="remote",
    remote_host="localhost:50051"
)

# Use any service (the logic is identical to local mode)
nodes = services.nodes.list_nodes()
for name in nodes:
    print(f"Found node: {name}")

# Run a command remotely via streaming
for chunk in services.execution.run_commands(nodes=["server1"], commands=["uptime"]):
    print(chunk["output"], end="")
```

## http API
With the Connpy API you can run commands on devices using http requests.

### 1. List Nodes

**Endpoint**: `/list_nodes`

**Method**: `POST`

**Description**: This route returns a list of nodes. It can also filter the list based on a given keyword.

#### Request Body:

```json
{
  "filter": "<keyword>"
}
```

* `filter` (optional): A keyword to filter the list of nodes. It returns only the nodes that contain the keyword. If not provided, the route will return the entire list of nodes.

#### Response:

- A JSON array containing the filtered list of nodes.

---

### 2. Get Nodes

**Endpoint**: `/get_nodes`

**Method**: `POST`

**Description**: This route returns a dictionary of nodes with all their attributes. It can also filter the nodes based on a given keyword.

#### Request Body:

```json
{
  "filter": "<keyword>"
}
```

* `filter` (optional): A keyword to filter the nodes. It returns only the nodes that contain the keyword. If not provided, the route will return the entire list of nodes.

#### Response:

- A JSON array containing the filtered nodes.

---

### 3. Run Commands

**Endpoint**: `/run_commands`

**Method**: `POST`

**Description**: This route runs commands on selected nodes based on the provided action, nodes, and commands. It also supports executing tests by providing expected results.

#### Request Body:

```json
{
  "action": "<action>",
  "nodes": "<nodes>",
  "commands": "<commands>",
  "expected": "<expected>",
  "options": "<options>"
}
```

* `action` (required): The action to be performed. Possible values: `run` or `test`.
* `nodes` (required): A list of nodes or a single node on which the commands will be executed. The nodes can be specified as individual node names or a node group with the `@` prefix. Node groups can also be specified as arrays with a list of nodes inside the group.
* `commands` (required): A list of commands to be executed on the specified nodes.
* `expected` (optional, only used when the action is `test`): A single expected result for the test.
* `options` (optional): Array of options to pass to the run command; available options are `prompt`, `parallel`, and `timeout`.

#### Response:

- A JSON object with the results of the executed commands on the nodes.

---
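As a sketch, a run request could be issued from Python like this. The base URL and port are assumptions for a locally running API; the payload mirrors the schema above:

```python
import json
# urllib keeps the example dependency-free; `requests` works equally well.
from urllib import request

payload = {
    "action": "run",
    "nodes": ["@lab"],             # a node group, using the @ prefix
    "commands": ["show version"],
    "options": ["parallel"],
}

def run_commands(base_url="http://localhost:8048"):  # assumed host/port
    """POST the JSON body to /run_commands and decode the reply."""
    req = request.Request(
        base_url + "/run_commands",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```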

### 4. Ask AI

**Endpoint**: `/ask_ai`

**Method**: `POST`

**Description**: This route sends a request to the AI multi-agent system which will analyze it, execute commands on devices if needed, and return the result. Supports any LLM provider configured via litellm.

#### Request Body:

```json
{
  "input": "<user input request>",
  "dryrun": true or false
}
```

* `input` (required): The user input requesting the AI to perform an action on some devices or get the devices list.
* `dryrun` (optional): If set to true, it returns the parameters needed to run the request but does not run it. Default is false.

#### Response:

- A JSON array containing the action to run, its parameters, and the result of the action.
+71
-134
@@ -17,7 +17,9 @@ Connpy is a SSH, SFTP, Telnet, kubectl, and Docker pod connection manager and au
- Run automation scripts on network devices.
- Use AI with a multi-agent system (Engineer/Architect) to help you manage your devices.
  Supports any LLM provider via litellm (OpenAI, Anthropic, Google, etc.).
- Add plugins with your own scripts, and execute them remotely.
- Fully decoupled gRPC Client/Server architecture.
- Unified UI with syntax highlighting and theming.
- Much more!

### Usage
@@ -40,6 +42,9 @@ options:
  -s, --show            Show node[@subfolder][@folder]
  -d, --debug           Display all connection steps
  -t, --sftp            Connects using sftp instead of ssh
  --service-mode        Set the backend service mode (local or remote)
  --remote              Connect to a remote connpy service via gRPC
  --theme               UI Output theme (dark, light, or path)

Commands:
  profile               Manage profiles
@@ -98,6 +103,13 @@ options:
  conn run server ls -la
```
## Plugin Requirements for Connpy

### Remote Plugin Execution
When Connpy operates in remote mode, plugins are executed **transparently on the server**:
- The client automatically downloads the plugin source code (`Parser` class context) to generate the local `argparse` structure and provide autocompletion.
- The execution phase (`Entrypoint` class) is redirected via gRPC streams to execute in the server's memory, ensuring the plugin runs securely against the server's inventory without passing sensitive data to the client.
- You can manage remote plugins using the `--remote` flag (e.g. `connpy plugin --add myplugin script.py --remote`).

### General Structure
- The plugin script must be a Python file.
- Only the following top-level elements are allowed in the plugin script:
@@ -212,46 +224,37 @@ There are 2 methods that allows you to define custom logic to be executed before

### Command Completion Support

Plugins can provide intelligent **tab completion** by defining autocompletion logic. There are two supported methods, with the tree-based approach being the most modern and recommended.

#### 1. Tree-based Completion (Recommended)

Define a function called `_connpy_tree` that returns a declarative navigation tree. This method is highly efficient, supports complex state loops, and is very simple to implement for most use cases.

```python
def _connpy_tree(info=None):
    nodes = info.get("nodes", [])
    return {
        "__exclude_used__": True,              # Filter out words already typed
        "__extra__": nodes,                    # Suggest nodes at this level
        "--format": ["json", "yaml", "table"], # Fixed suggestions
        "*": {                                 # Wildcard matches any positional word
            "interface1": None,
            "interface2": None,
            "--verbose": None
        }
    }
```

#### Parameters
- **Keys**: Literal completions (exact matches).
- **`*` Key**: A wildcard that matches any positional word typed by the user.
- **`__extra__`**: A list or a callable `(words) -> list` that adds dynamic suggestions.
- **`__exclude_used__`**: (Boolean) If True, automatically filters out words already present in the command line.
#### 2. Legacy Function-based Completion

For backward compatibility or highly custom logic, you can define `_connpy_completion`.

#### Contents of `info`

The `info` dictionary contains helpful context to generate completions:

```python
info = {
    "config": config_dict,     # The full loaded configuration
    "nodes": node_list,        # List of all known node names
    "folders": folder_list,    # List of all defined folder names
    "profiles": profile_list,  # List of all profile names
    "plugins": plugin_list     # List of all plugin names
}
```

You can use this data to generate suggestions based on the current input.

#### Return Value

The function must return a list of suggestion strings to be presented to the user.

#### Example

```python
def _connpy_completion(wordsnumber, words, info=None):
    if wordsnumber == 3:
        return ["--help", "--verbose", "start", "stop"]
@@ -262,6 +265,12 @@ def _connpy_completion(wordsnumber, words, info=None):
    return []
```

| Parameter | Description |
|----------------|-------------|
| `wordsnumber` | Integer indicating the total number of words on the command line. For plugins, this typically starts at 3. |
| `words` | A list of tokens (words) already typed. `words[0]` is always the name of the plugin. |
| `info` | A dictionary of structured context data (`nodes`, `folders`, `profiles`, `config`). |

> In this example, if the user types `connpy myplugin start ` and presses Tab, it will suggest node names.

### Handling Unknown Arguments
@@ -313,112 +322,33 @@ For a practical example of how to write a compatible plugin script, please refer

This script demonstrates the required structure and implementation details according to the plugin system's standards.

## gRPC Service Architecture
Connpy features a completely decoupled gRPC Client/Server architecture. You can run Connpy as a standalone background service and connect to it remotely via the CLI or other clients.

### 1. Start the Server
Start the gRPC service by running:
```bash
connpy api -s 50051
```
The server will handle all configurations, connections, AI sessions, and plugin execution locally on the machine it runs on.

### 2. Connect the Client
Configure your local CLI client to connect to the remote server:
```bash
connpy config --service-mode remote
connpy config --remote-host localhost:50051
```
Once configured, all commands (`connpy node`, `connpy list`, `connpy ai`, etc.) will execute transparently on the remote server via thin-client proxies. You can revert back to standalone execution at any time by running `connpy config --service-mode local`.

### Programmatic Access (gRPC & SOA)
Developers can build their own applications using the Connpy backend by utilizing the `ServiceProvider`:

```python
from connpy.services.provider import ServiceProvider

services = ServiceProvider(config, mode="remote", remote_host="localhost:50051")
nodes = services.nodes.list_nodes()
```

## http API
With the Connpy API you can run commands on devices using http requests.

### 1. List Nodes

**Endpoint**: `/list_nodes`

**Method**: `POST`

**Description**: This route returns a list of nodes. It can also filter the list based on a given keyword.

#### Request Body:

```json
{
  "filter": "<keyword>"
}
```

* `filter` (optional): A keyword to filter the list of nodes. It returns only the nodes that contain the keyword. If not provided, the route will return the entire list of nodes.

#### Response:

- A JSON array containing the filtered list of nodes.

---

### 2. Get Nodes

**Endpoint**: `/get_nodes`

**Method**: `POST`

**Description**: This route returns a dictionary of nodes with all their attributes. It can also filter the nodes based on a given keyword.

#### Request Body:

```json
{
  "filter": "<keyword>"
}
```

* `filter` (optional): A keyword to filter the nodes. It returns only the nodes that contain the keyword. If not provided, the route will return the entire list of nodes.

#### Response:

- A JSON array containing the filtered nodes.

---

### 3. Run Commands

**Endpoint**: `/run_commands`

**Method**: `POST`

**Description**: This route runs commands on selected nodes based on the provided action, nodes, and commands. It also supports executing tests by providing expected results.

#### Request Body:

```json
{
  "action": "<action>",
  "nodes": "<nodes>",
  "commands": "<commands>",
  "expected": "<expected>",
  "options": "<options>"
}
```

* `action` (required): The action to be performed. Possible values: `run` or `test`.
* `nodes` (required): A list of nodes or a single node on which the commands will be executed. The nodes can be specified as individual node names or a node group with the `@` prefix. Node groups can also be specified as arrays with a list of nodes inside the group.
* `commands` (required): A list of commands to be executed on the specified nodes.
* `expected` (optional, only used when the action is `test`): A single expected result for the test.
* `options` (optional): Array of options to pass to the run command; available options are `prompt`, `parallel`, and `timeout`.

#### Response:

- A JSON object with the results of the executed commands on the nodes.

---
|
||||
### 4. Ask AI
|
||||
|
||||
**Endpoint**: `/ask_ai`
|
||||
|
||||
**Method**: `POST`
|
||||
|
||||
**Description**: This route sends to chatgpt IA a request that will parse it into an understandable output for the application and then run the request.
|
||||
|
||||
#### Request Body:
|
||||
|
||||
```json
|
||||
{
|
||||
"input": "<user input request>",
|
||||
"dryrun": true or false
|
||||
}
|
||||
```
|
||||
|
||||
* `input` (required): The user input requesting the AI to perform an action on some devices or get the devices list.
|
||||
* `dryrun` (optional): If set to true, it will return the parameters to run the request but it won't run it. default is false.
|
||||
|
||||
#### Response:
|
||||
|
||||
- A JSON array containing the action to run and the parameters and the result of the action.
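A `/ask_ai` request with `dryrun` enabled can be built the same way. The base URL is a placeholder, and the HTTP call is commented out since it needs a running server:

```python
import json

# Placeholder base URL; point this at your running connpy API instance.
BASE_URL = "http://localhost:8048"

# dryrun=True previews the parsed action without executing it.
payload = {
    "input": "show me the bgp summary on all border routers",
    "dryrun": True,
}

body = json.dumps(payload)
print(body)

# To send it for real (requires the `requests` package and a running server):
# import requests
# resp = requests.post(f"{BASE_URL}/ask_ai", json=payload)
# print(resp.json())
```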

## Automation module

The automation module
@@ -534,6 +464,13 @@ class Preload:
    def __init__(self, connapp):
        connapp.ai.modify(_register_my_tools)
```

## Developer Notes (SOA Architecture)

As of version 2.0, Connpy has migrated to a **Service-Oriented Architecture (SOA)**:

- **`connpy/cli/`**: Contains all CLI handlers. These are responsible for argument parsing, user interaction (via `inquirer`), and visual output (via `printer`).
- **`connpy/services/`**: Contains pure logic services (Node, Profile, Execution, etc.).
- **Zero-Print Policy**: Services must never use `print()`. All output must be returned as data structures or generators to the caller (CLI handlers).
- **ServiceProvider**: Access services via `connapp.services`. This allows transparent switching between local and remote (gRPC) backends without modifying CLI logic.
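The Zero-Print Policy can be sketched with hypothetical names (the real services live in `connpy/services/`; this is an illustration of the pattern, not code from the repository):

```python
class NodeService:
    """Pure logic layer: returns data, never prints."""

    def __init__(self, nodes):
        self._nodes = nodes

    def list_nodes(self, keyword=None):
        # Return a data structure; rendering is the caller's job.
        if keyword is None:
            return list(self._nodes)
        return [n for n in self._nodes if keyword in n]


def cli_list_handler(service, keyword=None):
    """CLI layer: the only place where output happens."""
    for name in service.list_nodes(keyword):
        print(name)


service = NodeService(["router1", "router2", "switch1"])
cli_list_handler(service, keyword="router")
```

Because the service only returns data, the same `NodeService` can sit behind a local call or a remote gRPC backend without any change to the CLI handler.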
'''
from .core import node,nodes
from .configfile import configfile
+1
-1
@@ -1 +1 @@
__version__ = "5.0b6"
__version__ = "5.1b1"

+226
-132
@@ -1,4 +1,5 @@
import os
import sys
import json
import re
import datetime
@@ -23,11 +24,20 @@ console = printer.console
class ai:
    """Hybrid Multi-Agent System: Selective Escalation with Role Persistence."""

    SAFE_COMMANDS = [r'^show\s+', r'^ls\s*', r'^cat\s+', r'^ip\s+route\s+show', r'^ip\s+addr\s+show', r'^ip\s+link\s+show', r'^pwd$', r'^hostname$', r'^uname', r'^df\s*', r'^free\s*', r'^ps\s*', r'^ping\s+', r'^traceroute\s+']
    SAFE_COMMANDS = [
        r'^show\s+', r'^ls\s*', r'^cat\s+', r'^ip\s+', r'^pwd$', r'^hostname$', r'^uname',
        r'^df\s*', r'^free\s*', r'^ps\s*', r'^ping\s+', r'^traceroute\s+', r'^whois\s+',
        r'^kubectl\s+(get|describe|version|logs|top|explain|cluster-info|api-resources|api-versions)\s+',
        r'^systemctl\s+status\s+', r'^journalctl\s+'
    ]

    def __init__(self, config, org=None, api_key=None, engineer_model=None, architect_model=None, engineer_api_key=None, architect_api_key=None):
    def __init__(self, config, org=None, api_key=None, engineer_model=None, architect_model=None, engineer_api_key=None, architect_api_key=None, console=None, confirm_handler=None, trust=False):
        self.config = config
        self.trusted_session = False  # Trust mode for the entire session
        self.console = console or printer.console
        self.confirm_handler = confirm_handler or self._local_confirm_handler
        self.trusted_session = trust  # Trust mode for the entire session
        self.interrupted = False

        # 1. Load generic configuration
        aiconfig = self.config.config.get("ai", {})
@@ -39,13 +49,12 @@ class ai:
        # API Keys (Priority: Argument -> Config)
        self.engineer_key = engineer_api_key or aiconfig.get("engineer_api_key")
        self.architect_key = architect_api_key or aiconfig.get("architect_api_key")

        # Validate configuration
        if not self.engineer_key:
            raise ValueError("Engineer API key not configured. Use 'conn config ai engineer_api_key <key>' to set it.")
        if not self.architect_key:
            console.print("[yellow]Warning: Architect API key not configured. Architect will be unavailable.[/yellow]")
            console.print("[yellow]Use 'conn config ai architect_api_key <key>' to enable it.[/yellow]")

        # Custom Trusted Commands Regexes
        custom_trusted = aiconfig.get("trusted_commands", [])
        if isinstance(custom_trusted, str):
            custom_trusted = [c.strip() for c in custom_trusted.split(",") if c.strip()]
        self.safe_commands = list(self.SAFE_COMMANDS) + (custom_trusted if isinstance(custom_trusted, list) else [])

        # Limits
        self.max_history = 30
@@ -71,9 +80,9 @@ class ai:
        except FileNotFoundError:
            self.long_term_memory = ""
        except PermissionError as e:
            console.print(f"[yellow]Warning: Cannot read AI memory file: {e}[/yellow]")
            self.console.print(f"[warning]Warning: Cannot read AI memory file: {e}[/warning]")
        except Exception as e:
            console.print(f"[yellow]Warning: Failed to load AI memory: {e}[/yellow]")
            self.console.print(f"[warning]Warning: Failed to load AI memory: {e}[/warning]")

        # Session Management
        self.sessions_dir = os.path.join(self.config.defaultdir, "ai_sessions")
@@ -82,20 +91,9 @@ class ai:
        self.session_path = None

        # Agnostic base prompts
        self._engineer_base_prompt = dedent(f"""
        Role: TECHNICAL EXECUTION ENGINE.
        Expertise: Universal Networking (Cisco, Nokia, Juniper, 6wind, etc.).

        Rules:
        - BE FAST: Execute tools directly to provide swift technical answers.
        - AUTONOMY: Proactively use iterative tool calls (list_nodes, run_commands) to find the root cause.
        - BATCH OPERATIONS: When working on multiple devices, call tools in parallel (multiple tool_calls in same response).
        - COMPLETE MISSIONS: Execute ALL steps of a mission before reporting back. Don't stop halfway.
        - DIAGRAM: Use ASCII art or Unicode box-drawing characters directly in your responses to visualize topologies or paths when helpful.
        - EVIDENCE: Include 'Key Snippets' from tool outputs. Be token-efficient.
        - NO WANDERING: Do not speculate. If stuck, report attempts.
        - SAFETY: When you use 'run_commands' with configuration commands, the system automatically prompts the user for confirmation. Just execute - don't ask permission first.

        architect_instructions = ""
        if self.architect_key:
            architect_instructions = """
        CRITICAL - CONSULT vs ESCALATE:
        - ALWAYS use 'consult_architect' for: Configuration planning, design decisions, complex troubleshooting.
        Examples: "consultalo con el arquitecto", "preguntale al arquitecto", "que opina el arquitecto"
@@ -106,8 +104,33 @@ class ai:
        After escalation, you hand over control completely.

        - DEFAULT: When in doubt, use 'consult_architect'. Escalation is rare.
        """
        else:
            architect_instructions = """
        CRITICAL - ARCHITECT UNAVAILABLE:
        - The Strategic Reasoning Engine (Architect) is currently UNAVAILABLE because its API key is not configured.
        - DO NOT attempt to consult or escalate to the architect.
        - If the user asks to consult the architect, inform them that the Architect is offline and offer to help them directly to the best of your abilities.
        """

        self._engineer_base_prompt = dedent(f"""
        Role: TECHNICAL EXECUTION ENGINE.
        Expertise: Universal Networking (Cisco, Nokia, Juniper, 6wind, etc.).

        Network Context: {self.long_term_memory if self.long_term_memory else "Empty."}
        Rules:
        - BE FAST AND EXTREMELY CONCISE: Provide direct answers. No filler words, no decorative language, no polite pleasantries. Save output tokens at all costs.
        - KNOWLEDGE FIRST: For general networking questions (AS numbers, protocol details, standards, generic commands), use your internal knowledge. ONLY use tools when the user's specific infrastructure data is required.
        - INVENTORY ONLY: 'run_commands', 'list_nodes', and 'get_node_info' are ONLY for interacting with the user's inventory.
        - BROADCAST RESTRICTION: Avoid using filter '.*' in 'run_commands' unless the user explicitly requests a global action. Try to target specific nodes or groups based on the conversation.
        - AUTONOMY: Proactively use iterative tool calls to find the root cause of infrastructure issues.
        - BATCH OPERATIONS: When working on multiple devices, call tools in parallel.
        - COMPLETE MISSIONS: Execute ALL steps of a mission before reporting back.
        - DIAGRAM: Use ASCII art or Unicode box-drawing characters directly in your responses to visualize topologies or paths when helpful.
        - EVIDENCE: Include 'Key Snippets' from tool outputs. Be token-efficient.
        - NO WANDERING: Do not speculate. If stuck, report attempts.
        - SAFETY: When you use 'run_commands' with configuration commands, the system automatically prompts the user for confirmation. Just execute - don't ask permission first.
        {architect_instructions}
        Network Context: {{self.long_term_memory if self.long_term_memory else "Empty."}}
        """).strip()

        self._architect_base_prompt = dedent(f"""
@@ -115,6 +138,7 @@ class ai:
        Expertise: Network Architecture, Complex Troubleshooting, and Design Validation.

        Rules:
        - CONCISENESS IS MANDATORY: Strip out fluff, decorative language, and filler words. Provide direct, tactical instructions and analysis to save output tokens.
        - STRATEGY: Define technical missions for the Engineer.
        - DIAGRAM: Use ASCII art or Unicode box-drawing characters in your responses to visualize topologies, traffic paths, or logic flows.
        - ENGINEER CAPABILITIES: Your Engineer can:
@@ -137,6 +161,11 @@ class ai:
        Network Context: {self.long_term_memory if self.long_term_memory else "Empty."}
        """).strip()

    def _local_confirm_handler(self, prompt, default="n"):
        """Default confirmation handler using rich.prompt."""
        from rich.prompt import Prompt
        return Prompt.ask(prompt, default=default)

    @property
    def engineer_system_prompt(self):
        """Build engineer system prompt with plugin extensions."""
@@ -177,57 +206,65 @@ class ai:
        if status_formatter:
            self.tool_status_formatters[name] = status_formatter

    def _stream_completion(self, model, messages, tools, api_key, status=None, label="", debug=False, **kwargs):
    def _stream_completion(self, model, messages, tools, api_key, status=None, label="", debug=False, chunk_callback=None, **kwargs):
        """Stream a completion call, rendering styled Markdown in real-time.

        Returns (response, streamed) where:
        - response: reconstructed ModelResponse (same as non-streaming)
        - streamed: True if text was rendered to console during streaming
        """
        from rich.live import Live

        stream_resp = completion(model=model, messages=messages, tools=tools, api_key=api_key, stream=True, **kwargs)

        chunks = []
        full_content = ""
        is_streaming_text = False
        has_tool_calls = False
        live_display = None

        # Determine styling based on current brain
        role_label = "Network Architect" if "architect" in label.lower() else "Network Engineer"
        border = "medium_purple" if "architect" in label.lower() else "blue"
        title = f"[bold {border}]{role_label}[/bold {border}]"

        alias = "architect" if "architect" in label.lower() else "engineer"
        title = f"[bold {alias}]{role_label}[/bold {alias}]"
        border = alias

        try:
            for chunk in stream_resp:
                chunks.append(chunk)
                delta = chunk.choices[0].delta

                # Detect tool calls
                if hasattr(delta, 'tool_calls') and delta.tool_calls:
                    has_tool_calls = True

                # Stream text content with styled rendering
                if hasattr(delta, 'content') and delta.content and not debug:
                if hasattr(delta, 'content') and delta.content:
                    full_content += delta.content

                    if not is_streaming_text:
                        # Stop spinner before starting live display
                        if status:
                            status.stop()
                        live_display = Live(
                            Panel(Markdown(full_content), title=title, border_style=border, expand=False),
                            console=console,
                            refresh_per_second=8,
                            transient=False
                        )
                        live_display.start()
                        is_streaming_text = True
                    else:
                        live_display.update(
                            Panel(Markdown(full_content), title=title, border_style=border, expand=False)
                        )

                    if chunk and chunk_callback:
                        # Check for remote interruption during streaming
                        if hasattr(self, "interrupted") and self.interrupted:
                            raise KeyboardInterrupt
                        chunk_callback(delta.content)

                    if not debug and not chunk_callback:
                        if not is_streaming_text:
                            # Stop spinner before starting live display
                            if status:
                                status.stop()
                            live_display = Live(
                                Panel(Markdown(full_content), title=title, border_style=border, expand=False),
                                console=self.console,
                                refresh_per_second=8,
                                transient=False
                            )
                            live_display.start()
                            is_streaming_text = True
                        else:
                            live_display.update(
                                Panel(Markdown(full_content), title=title, border_style=border, expand=False)
                            )
        except Exception as e:
            if not chunks:
                raise
@@ -297,6 +334,7 @@ class ai:
        3. Orphaned tool_calls at the end are removed
        4. Orphaned tool responses without a preceding tool_call are removed
        5. Incompatible metadata like cache_control is stripped for non-Anthropic models
        6. Enforces strict alternating history to prevent BadRequestError on Gemini.
        """
        if not messages:
            return messages
@@ -309,8 +347,10 @@ class ai:

            # Convert content list to plain string if it's a system message with caching metadata
            if m.get('role') == 'system' and isinstance(m.get('content'), list):
                # Extract the text from [{"type": "text", "text": "...", "cache_control": ...}]
                m['content'] = m['content'][0]['text'] if m['content'] else ""
                if m['content'] and isinstance(m['content'][0], dict) and m['content'][0].get('text'):
                    m['content'] = m['content'][0]['text']
                else:
                    m['content'] = ""

            # Remove any explicit cache_control key anywhere
            if 'cache_control' in m: del m['cache_control']
@@ -321,43 +361,72 @@ class ai:
|
||||
pre_sanitized.append(m)
|
||||
|
||||
sanitized = []
|
||||
last_role = None
|
||||
|
||||
i = 0
|
||||
while i < len(pre_sanitized):
|
||||
msg = pre_sanitized[i]
|
||||
role = msg.get('role', '')
|
||||
|
||||
if role == 'assistant' and msg.get('tool_calls'):
|
||||
# Collect all expected tool_call_ids
|
||||
expected_ids = set()
|
||||
for tc in msg['tool_calls']:
|
||||
tc_id = tc.get('id') if isinstance(tc, dict) else getattr(tc, 'id', None)
|
||||
if tc_id:
|
||||
expected_ids.add(tc_id)
|
||||
if role == 'system':
|
||||
sanitized.append(msg)
|
||||
last_role = 'system'
|
||||
i += 1
|
||||
|
||||
# Look ahead for matching tool responses
|
||||
tool_responses = []
|
||||
j = i + 1
|
||||
while j < len(pre_sanitized):
|
||||
next_msg = pre_sanitized[j]
|
||||
if next_msg.get('role') == 'tool':
|
||||
tool_responses.append(next_msg)
|
||||
j += 1
|
||||
else:
|
||||
break
|
||||
|
||||
# Only include this assistant+tools block if we have responses
|
||||
if tool_responses:
|
||||
sanitized.append(msg)
|
||||
sanitized.extend(tool_responses)
|
||||
i = j
|
||||
elif role == 'user':
|
||||
if last_role == 'user' and sanitized:
|
||||
# Combine consecutive user messages
|
||||
sanitized[-1]['content'] = str(sanitized[-1].get('content', '') or '') + '\n' + str(msg.get('content', '') or '')
|
||||
else:
|
||||
# Orphaned tool_calls with no responses - skip the assistant message
|
||||
sanitized.append(msg)
|
||||
last_role = 'user'
|
||||
i += 1
|
||||
|
||||
elif role == 'assistant':
|
||||
has_tools = bool(msg.get('tool_calls'))
|
||||
|
||||
# Gemini strict sequence: Assistant MUST be preceded by user or tool.
|
||||
# If preceded by system, assistant, or if it's the very first message...
|
||||
if last_role not in ('user', 'tool'):
|
||||
sanitized.append({"role": "user", "content": "[System sequence separator: History Truncated/Merged]"})
|
||||
last_role = 'user'
|
||||
|
||||
if has_tools:
|
||||
# Look ahead for matching tool responses
|
||||
tool_responses = []
|
||||
j = i + 1
|
||||
while j < len(pre_sanitized):
|
||||
next_msg = pre_sanitized[j]
|
||||
if next_msg.get('role') == 'tool':
|
||||
tool_responses.append(next_msg)
|
||||
j += 1
|
||||
else:
|
||||
break
|
||||
|
||||
if tool_responses:
|
||||
sanitized.append(msg)
|
||||
sanitized.extend(tool_responses)
|
||||
last_role = 'tool'
|
||||
i = j
|
||||
else:
|
||||
# Orphaned tool_calls with no responses - skip the assistant message
|
||||
# If we just added a dummy user message for this assistant, remove it too
|
||||
if sanitized and sanitized[-1].get('content') == "[System sequence separator: History Truncated/Merged]":
|
||||
sanitized.pop()
|
||||
last_role = sanitized[-1].get('role', '') if sanitized else None
|
||||
i += 1
|
||||
else:
|
||||
sanitized.append(msg)
|
||||
last_role = 'assistant'
|
||||
i += 1
|
||||
|
||||
elif role == 'tool':
|
||||
# Orphaned tool response (no preceding assistant with tool_calls) - skip
|
||||
i += 1
|
||||
|
||||
else:
|
||||
sanitized.append(msg)
|
||||
last_role = role
|
||||
i += 1
|
||||
|
||||
return sanitized
|
||||
@@ -414,7 +483,7 @@ class ai:

    def _is_safe_command(self, cmd):
        """Check if a command matches safe patterns."""
        return any(re.match(pattern, cmd.strip(), re.IGNORECASE) for pattern in self.SAFE_COMMANDS)
        return any(re.match(pattern, cmd.strip(), re.IGNORECASE) for pattern in self.safe_commands)
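The safe-command check can be illustrated with a standalone sketch (patterns taken from the list above; this is not the class method itself):

```python
import re

# Subset of the SAFE_COMMANDS patterns shown in the diff above.
SAFE_COMMANDS = [r'^show\s+', r'^ls\s*', r'^pwd$', r'^ping\s+']

def is_safe_command(cmd):
    # A command is "safe" if it matches any read-only pattern.
    return any(re.match(p, cmd.strip(), re.IGNORECASE) for p in SAFE_COMMANDS)

print(is_safe_command("show ip route"))       # read-only, matches ^show\s+
print(is_safe_command("configure terminal"))  # config command, no match
```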

    def run_commands_tool(self, nodes_filter, commands, status=None):
        """Execute commands on nodes matching the filter. Native interactive confirmation for unsafe commands."""
@@ -445,35 +514,36 @@ class ai:
            formatted_cmds = []
            for cmd in commands:
                if cmd in unsafe_commands:
                    formatted_cmds.append(f" • [yellow]{cmd}[/yellow]")
                    formatted_cmds.append(f" • [warning]{cmd}[/warning]")
                else:
                    formatted_cmds.append(f" • {cmd}")

            panel_content = f"Target: {nodes_filter}\nCommands:\n" + "\n".join(formatted_cmds)
            console.print(Panel(panel_content, title="[bold yellow]⚠️ UNSAFE COMMANDS DETECTED[/bold yellow]", border_style="yellow"))
            # Use print_important if available (for remote bridges), falling back to standard print
            print_fn = getattr(self.console, "print_important", self.console.print)
            print_fn(Panel(panel_content, title="[bold warning]⚠️ UNSAFE COMMANDS DETECTED[/bold warning]", border_style="warning"))

            try:
                from rich.prompt import Prompt
                user_resp = Prompt.ask("[bold yellow]Execute? (y: yes / n: no / a: allow all this session / <text>: feedback)[/bold yellow]", default="n")
                user_resp = self.confirm_handler("[bold warning]Execute? (y: yes / n: no / a: allow all this session / <text>: feedback)[/bold warning]", default="n")
            except KeyboardInterrupt:
                if status: status.update("[bold blue]Engineer: Resuming...")
                console.print("[bold red]✗ Aborted by user (Ctrl+C).[/bold red]")
                return "Error: User cancelled execution (Ctrl+C)."
                if status: status.update("[ai_status]Engineer: Resuming...")
                self.console.print("[fail]✗ Aborted by user (Ctrl+C).[/fail]")
                raise

            # Resume the spinner
            if status: status.update("[bold blue]Engineer: Processing user response...")
            if status: status.update("[ai_status]Engineer: Processing user response...")

            user_resp_lower = user_resp.strip().lower()
            if user_resp_lower in ['a', 'allow']:
                self.trusted_session = True
                console.print("[bold green]✓ Trust Mode Enabled. All future commands in this session will execute without confirmation.[/bold green]")
                self.console.print("[pass]✓ Trust Mode Enabled. All future commands in this session will execute without confirmation.[/pass]")
            elif user_resp_lower in ['y', 'yes']:
                console.print("[bold green]✓ Executing...[/bold green]")
                self.console.print("[pass]✓ Executing...[/pass]")
            elif user_resp_lower in ['n', 'no', '']:
                console.print("[bold red]✗ Execution rejected by user.[/bold red]")
                self.console.print("[fail]✗ Execution rejected by user.[/fail]")
                return "Error: User rejected execution."
            else:
                console.print(f"[bold cyan]User feedback: [/bold cyan]{user_resp}")
                self.console.print(f"[user_prompt]User feedback: [/user_prompt]{user_resp}")
                return f"User requested changes: {user_resp}. Please adjust the commands based on this feedback and try again."
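The y / n / a / feedback confirmation protocol above can be sketched as a standalone handler (hypothetical names, not the actual implementation in `run_commands_tool`):

```python
def handle_confirmation(user_resp, session):
    """Interpret a confirmation reply: yes / no / allow-all / free-text feedback."""
    resp = user_resp.strip().lower()
    if resp in ('a', 'allow'):
        session['trusted'] = True  # skip future confirmations this session
        return 'execute'
    if resp in ('y', 'yes'):
        return 'execute'
    if resp in ('n', 'no', ''):
        return 'rejected'
    # Anything else is treated as feedback for the agent to adjust the commands.
    return f'feedback:{user_resp}'


session = {'trusted': False}
print(handle_confirmation('a', session))
print(handle_confirmation('', session))
```

Returning a structured result instead of printing keeps the decision reusable by both the local CLI prompt and a remote (gRPC) confirm handler.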

        try:
@@ -517,22 +587,31 @@ class ai:
        soft_limit_warned = False

        try:
            # Set up remote interrupt callback if bridge is provided
            if status and hasattr(status, "on_interrupt"):
                status.on_interrupt = lambda: setattr(self, "interrupted", True)

            while iteration < self.hard_limit_iterations:
                iteration += 1

                # Check for interruption
                if self.interrupted:
                    raise KeyboardInterrupt

                # Soft limit warning
                if iteration == self.soft_limit_iterations and not soft_limit_warned:
                    console.print(f"[yellow]⚠ Engineer has performed {iteration} steps. This is taking longer than expected.[/yellow]")
                    console.print(f"[yellow] You can press Ctrl+C to interrupt and get a summary.[/yellow]")
                    self.console.print(f"[warning]⚠ Engineer has performed {iteration} steps. This is taking longer than expected.[/warning]")
                    self.console.print(f"[warning] You can press Ctrl+C to interrupt and get a summary.[/warning]")
                    soft_limit_warned = True

                if status: status.update(f"[bold blue]Engineer: Analyzing mission... (step {iteration})")
                if status: status.update(f"[ai_status]Engineer: Analyzing mission... (step {iteration})")

                try:
                    safe_messages = self._sanitize_messages(messages)
                    response = completion(model=self.engineer_model, messages=safe_messages, tools=tools, api_key=self.engineer_key)
                except Exception as e:
                    return f"Engineer failed to connect: {str(e)}", usage
                    if status: status.stop()
                    raise ValueError(f"Engineer failed to connect: {str(e)}")

                if hasattr(response, "usage") and response.usage:
                    usage["input"] += getattr(response.usage, "prompt_tokens", 0)
@@ -550,15 +629,15 @@ class ai:

                # Real-time notification of the technical task
                if status:
                    if fn == "list_nodes": status.update(f"[bold blue]Engineer: [SEARCH] {args.get('filter_pattern','.*')}")
                    if fn == "list_nodes": status.update(f"[ai_status]Engineer: [SEARCH] {args.get('filter_pattern','.*')}")
                    elif fn == "run_commands":
                        cmds = args.get('commands', [])
                        cmd_str = cmds[0] if cmds else ""
                        status.update(f"[bold blue]Engineer: [CMD] {cmd_str}")
                    elif fn == "get_node_info": status.update(f"[bold blue]Engineer: [INSPECT] {args.get('node_name','')}")
                        status.update(f"[ai_status]Engineer: [CMD] {cmd_str}")
                    elif fn == "get_node_info": status.update(f"[ai_status]Engineer: [INSPECT] {args.get('node_name','')}")
                    elif fn in self.tool_status_formatters: status.update(self.tool_status_formatters[fn](args))

                if debug: console.print(Panel(Text(json.dumps(args, indent=2)), title=f"[bold blue]Engineer Tool: {fn}[/bold blue]", border_style="blue"))
                if debug: self.console.print(Panel(Text(json.dumps(args, indent=2)), title=f"[bold engineer]Engineer Tool: {fn}[/bold engineer]", border_style="engineer"))

                if fn == "list_nodes": obs = self.list_nodes_tool(**args)
                elif fn == "run_commands": obs = self.run_commands_tool(**args, status=status)
@@ -566,14 +645,14 @@ class ai:
                elif fn in self.external_tool_handlers: obs = self.external_tool_handlers[fn](self, **args)
                else: obs = f"Error: Unknown tool '{fn}'."

                if debug: console.print(Panel(Text(str(obs)), title=f"[bold green]Engineer Observation: {fn}[/bold green]", border_style="green"))
                if debug: self.console.print(Panel(Text(str(obs)), title=f"[bold pass]Engineer Observation: {fn}[/bold pass]", border_style="success"))
                messages.append({"tool_call_id": tc.id, "role": "tool", "name": fn, "content": obs})

            if iteration >= self.hard_limit_iterations:
                console.print(f"[red]⛔ Engineer reached hard limit ({self.hard_limit_iterations} steps). Forcing stop.[/red]")
                self.console.print(f"[error]⛔ Engineer reached hard limit ({self.hard_limit_iterations} steps). Forcing stop.[/error]")

            if debug and resp_msg.content:
                console.print(Panel(Text(resp_msg.content), title="[bold blue]Engineer Final Report to Architect[/bold blue]", border_style="blue"))
                self.console.print(Panel(Text(resp_msg.content), title="[bold engineer]Engineer Final Report to Architect[/bold engineer]", border_style="engineer"))

            return resp_msg.content, usage
        except Exception as e:
@@ -584,10 +663,15 @@ class ai:
        tools = [
            {"type": "function", "function": {"name": "list_nodes", "description": "Lists available nodes in the inventory.", "parameters": {"type": "object", "properties": {"filter_pattern": {"type": "string", "description": "Regex to filter nodes (e.g. '.*', 'border.*')."}}}}},
            {"type": "function", "function": {"name": "run_commands", "description": "Runs one or more commands on matched nodes. MANDATORY: You MUST call 'list_nodes' first to verify the target list.", "parameters": {"type": "object", "properties": {"nodes_filter": {"type": "string", "description": "Exact node name or verified filter pattern."}, "commands": {"type": "array", "items": {"type": "string"}, "description": "List of commands (e.g. ['show ip route', 'show int desc'])."}}, "required": ["nodes_filter", "commands"]}}},
            {"type": "function", "function": {"name": "get_node_info", "description": "Gets full metadata for a specific node.", "parameters": {"type": "object", "properties": {"node_name": {"type": "string"}}, "required": ["node_name"]}}},
            {"type": "function", "function": {"name": "consult_architect", "description": "Ask the Strategic Reasoning Engine for advice on complex design, architecture, or troubleshooting decisions. You remain in control and will present the response to the user. Use this for: configuration planning, design validation, complex troubleshooting.", "parameters": {"type": "object", "properties": {"question": {"type": "string", "description": "Strategic question or decision needed."}, "technical_summary": {"type": "string", "description": "Technical findings and context gathered so far."}}, "required": ["question", "technical_summary"]}}},
            {"type": "function", "function": {"name": "escalate_to_architect", "description": "Transfer full control to the Strategic Reasoning Engine. Use ONLY when the user explicitly requests the Architect or when the problem requires strategic oversight beyond consultation. After escalation, the Architect takes over the conversation.", "parameters": {"type": "object", "properties": {"reason": {"type": "string", "description": "Why you're escalating (e.g. 'User requested Architect', 'Complex multi-site design needed')."}, "context": {"type": "string", "description": "Full context and findings to hand over."}}, "required": ["reason", "context"]}}}
            {"type": "function", "function": {"name": "get_node_info", "description": "Gets full metadata for a specific node.", "parameters": {"type": "object", "properties": {"node_name": {"type": "string"}}, "required": ["node_name"]}}}
        ]

        if self.architect_key:
            tools.extend([
                {"type": "function", "function": {"name": "consult_architect", "description": "Ask the Strategic Reasoning Engine for advice on complex design, architecture, or troubleshooting decisions. You remain in control and will present the response to the user. Use this for: configuration planning, design validation, complex troubleshooting.", "parameters": {"type": "object", "properties": {"question": {"type": "string", "description": "Strategic question or decision needed."}, "technical_summary": {"type": "string", "description": "Technical findings and context gathered so far."}}, "required": ["question", "technical_summary"]}}},
                {"type": "function", "function": {"name": "escalate_to_architect", "description": "Transfer full control to the Strategic Reasoning Engine. Use ONLY when the user explicitly requests the Architect or when the problem requires strategic oversight beyond consultation. After escalation, the Architect takes over the conversation.", "parameters": {"type": "object", "properties": {"reason": {"type": "string", "description": "Why you're escalating (e.g. 'User requested Architect', 'Complex multi-site design needed')."}, "context": {"type": "string", "description": "Full context and findings to hand over."}}, "required": ["reason", "context"]}}}
            ])

        tools.extend(self.external_engineer_tools)
        return tools

@@ -709,7 +793,10 @@ class ai:
            printer.error(f"Failed to save session: {e}")

    @MethodHook
    def ask(self, user_input, dryrun=False, chat_history=None, status=None, debug=False, stream=True, session_id=None):
    def ask(self, user_input, dryrun=False, chat_history=None, status=None, debug=False, stream=True, session_id=None, chunk_callback=None):
        if not self.engineer_key:
            raise ValueError("Engineer API key not configured. Use 'connpy config --engineer-api-key <key>' to set it.")

        if chat_history is None: chat_history = []

        # Load session if provided and history is empty
@@ -781,20 +868,25 @@ class ai:
        # 3. Execution loop
        iteration = 0
        soft_limit_warned = False
        streamed_response = False

        try:
            # Set up remote interrupt callback if bridge is provided
            if status and hasattr(status, "on_interrupt"):
                status.on_interrupt = lambda: setattr(self, "interrupted", True)

            while iteration < self.hard_limit_iterations:
                iteration += 1

                # Check for interruption
                if self.interrupted:
                    raise KeyboardInterrupt

                # Soft limit warning
                if iteration == self.soft_limit_iterations and not soft_limit_warned:
                    console.print(f"[yellow]⚠ Agent has performed {iteration} steps. This is taking longer than expected.[/yellow]")
                    console.print(f"[yellow] You can press Ctrl+C to interrupt and get a summary of progress.[/yellow]")
                    self.console.print(f"[warning]⚠ Agent has performed {iteration} steps. This is taking longer than expected.[/warning]")
                    self.console.print(f"[warning] You can press Ctrl+C to interrupt and get a summary of progress.[/warning]")
                    soft_limit_warned = True

                label = "[bold medium_purple]Architect" if current_brain == "architect" else "[bold blue]Engineer"
                label = "[architect][bold]Architect[/bold][/architect]" if current_brain == "architect" else "[engineer][bold]Engineer[/bold][/engineer]"
                if status: status.update(f"{label} is thinking... (step {iteration})")

                streamed_response = False
@@ -803,13 +895,14 @@ class ai:
                if stream and not debug:
                    response, streamed_response = self._stream_completion(
                        model=model, messages=safe_messages, tools=tools, api_key=key,
                        status=status, label=label, debug=debug, num_retries=3
                        status=status, label=label, debug=debug, num_retries=3,
                        chunk_callback=chunk_callback
                    )
                else:
                    response = completion(model=model, messages=safe_messages, tools=tools, api_key=key, num_retries=3)
                except Exception as e:
                    if current_brain == "architect":
                        if status: status.update("[bold orange3]Architect unavailable! Falling back to Engineer...")
                        if status: status.update("[unavailable]Architect unavailable! Falling back to Engineer...")
                        # Preserve context when falling back - use clean_input directly
                        current_brain = "engineer"
                        model = self.engineer_model
@@ -839,7 +932,7 @@ class ai:
                messages.append(msg_dict)

                if debug and resp_msg.content:
                    console.print(Panel(Markdown(resp_msg.content), title=f"{label} Reasoning", border_style="medium_purple" if current_brain == "architect" else "blue"))
                    self.console.print(Panel(Markdown(resp_msg.content), title=f"{label} Reasoning", border_style="architect" if current_brain == "architect" else "engineer"))

                if not resp_msg.tool_calls: break

@@ -856,16 +949,16 @@ class ai:
                    continue

                if status:
                    if fn == "delegate_to_engineer": status.update(f"[bold medium_purple]Architect: [DELEGATING MISSION] {args.get('task','')[:40]}...")
                    elif fn == "manage_memory_tool": status.update(f"[bold medium_purple]Architect: [UPDATING MEMORY]")
                    if fn == "delegate_to_engineer": status.update(f"[architect]Architect: [DELEGATING MISSION] {args.get('task','')[:40]}...")
                    elif fn == "manage_memory_tool": status.update(f"[architect]Architect: [UPDATING MEMORY]")

                if debug: console.print(Panel(Text(json.dumps(args, indent=2)), title=f"{label} Decision: {fn}", border_style="white"))
                if debug: self.console.print(Panel(Text(json.dumps(args, indent=2)), title=f"{label} Decision: {fn}", border_style="debug"))
|
||||
|
||||
if fn == "delegate_to_engineer":
|
||||
obs, eng_usage = self._engineer_loop(args["task"], status=status, debug=debug, chat_history=messages[:-1])
|
||||
usage["input"] += eng_usage["input"]; usage["output"] += eng_usage["output"]; usage["total"] += eng_usage["total"]
|
||||
elif fn == "consult_architect":
|
||||
if status: status.update("[bold medium_purple]Engineer consulting Architect...")
|
||||
if status: status.update("[architect]Engineer consulting Architect...")
|
||||
try:
|
||||
# Consultation only - Engineer stays in control
|
||||
claude_resp = completion(
|
||||
@@ -878,13 +971,13 @@ class ai:
|
||||
num_retries=3
|
||||
)
|
||||
obs = claude_resp.choices[0].message.content
|
||||
if debug: console.print(Panel(Markdown(obs), title="[bold medium_purple]Architect Consultation[/bold medium_purple]", border_style="medium_purple"))
|
||||
if debug: self.console.print(Panel(Markdown(obs), title="[architect]Architect Consultation[/architect]", border_style="architect"))
|
||||
except Exception as e:
|
||||
if status: status.update("[bold orange3]Architect unavailable! Engineer continuing alone...")
|
||||
if status: status.update("[unavailable]Architect unavailable! Engineer continuing alone...")
|
||||
obs = f"Architect unavailable ({str(e)}). Proceeding with your best technical judgment."
|
||||
|
||||
elif fn == "escalate_to_architect":
|
||||
if status: status.update("[bold medium_purple]Transferring control to Architect...")
|
||||
if status: status.update("[architect]Transferring control to Architect...")
|
||||
# Full escalation - Architect takes over
|
||||
current_brain = "architect"
|
||||
model = self.architect_model
|
||||
@@ -895,10 +988,10 @@ class ai:
|
||||
handover_msg = f"HANDOVER FROM EXECUTION ENGINE\n\nReason: {args['reason']}\n\nContext: {args['context']}\n\nYou are now in control of this conversation."
|
||||
pending_user_message = handover_msg
|
||||
obs = "Control transferred to Architect. Handover context will be provided."
|
||||
if debug: console.print(Panel(Text(handover_msg), title="[bold medium_purple]Escalation to Architect[/bold medium_purple]", border_style="medium_purple"))
|
||||
if debug: self.console.print(Panel(Text(handover_msg), title="[architect]Escalation to Architect[/architect]", border_style="architect"))
|
||||
|
||||
elif fn == "return_to_engineer":
|
||||
if status: status.update("[bold blue]Transferring control back to Engineer...")
|
||||
if status: status.update("[engineer]Transferring control back to Engineer...")
|
||||
# Architect returns control to Engineer
|
||||
current_brain = "engineer"
|
||||
model = self.engineer_model
|
||||
@@ -909,7 +1002,7 @@ class ai:
|
||||
handover_msg = f"HANDOVER FROM ARCHITECT\n\nSummary: {args['summary']}\n\nYou are now back in control. Continue handling the user's requests."
|
||||
pending_user_message = handover_msg
|
||||
obs = "Control returned to Engineer. Handover summary will be provided."
|
||||
if debug: console.print(Panel(Text(handover_msg), title="[bold blue]Return to Engineer[/bold blue]", border_style="blue"))
|
||||
if debug: self.console.print(Panel(Text(handover_msg), title="[engineer]Return to Engineer[/engineer]", border_style="engineer"))
|
||||
|
||||
elif fn == "list_nodes": obs = self.list_nodes_tool(**args)
|
||||
elif fn == "run_commands": obs = self.run_commands_tool(**args, status=status)
|
||||
@@ -925,7 +1018,7 @@ class ai:
|
||||
messages.append({"role": "user", "content": pending_user_message})
|
||||
|
||||
if iteration >= self.hard_limit_iterations:
|
||||
console.print(f"[red]⛔ Agent reached hard limit ({self.hard_limit_iterations} steps). Forcing stop to prevent infinite loop.[/red]")
|
||||
self.console.print(f"[error]⛔ Agent reached hard limit ({self.hard_limit_iterations} steps). Forcing stop to prevent infinite loop.[/error]")
|
||||
# Only inject user message if we're not in the middle of tool calls
|
||||
last_msg = messages[-1] if messages else {}
|
||||
if last_msg.get("role") != "assistant" or not last_msg.get("tool_calls"):
|
||||
@@ -937,10 +1030,10 @@ class ai:
|
||||
messages.append(resp_msg.model_dump(exclude_none=True))
|
||||
except Exception as e:
|
||||
if status:
|
||||
status.update(f"[bold red]Error fetching summary: {e}[/bold red]")
|
||||
status.update(f"[error]Error fetching summary: {e}[/error]")
|
||||
printer.warning(f"Failed to fetch final summary from LLM: {e}")
|
||||
except KeyboardInterrupt:
|
||||
if status: status.update("[bold red]Interrupted! Closing pending tasks...")
|
||||
if status: status.update("[error]Interrupted! Closing pending tasks...")
|
||||
last_msg = messages[-1]
|
||||
if last_msg.get("tool_calls"):
|
||||
for tc in last_msg["tool_calls"]:
|
||||
@@ -948,7 +1041,8 @@ class ai:
|
||||
messages.append({"role": "user", "content": "USER INTERRUPTED. Briefly summarize what you were doing and stop."})
|
||||
try:
|
||||
safe_messages = self._sanitize_messages(messages)
|
||||
response = completion(model=model, messages=safe_messages, tools=tools, api_key=key)
|
||||
# Use tools=None to force a text summary during interruption
|
||||
response = completion(model=model, messages=safe_messages, tools=None, api_key=key)
|
||||
resp_msg = response.choices[0].message
|
||||
messages.append(resp_msg.model_dump(exclude_none=True))
|
||||
except Exception: pass
|
||||
|
||||
@@ -1,150 +1,42 @@
from flask import Flask, request, jsonify
from flask_cors import CORS
from connpy import configfile, node, nodes, hooks, printer
from connpy.ai import ai as myai
from waitress import serve
import os
import signal
import time

app = Flask(__name__)
CORS(app)
# conf = configfile() # REMOVED: Item #1 in Roadmap -> Don't instantiate globally
# Suppress harmless but noisy gRPC fork() warnings from pexpect child processes
os.environ["GRPC_VERBOSITY"] = "NONE"
os.environ["GRPC_ENABLE_FORK_SUPPORT"] = "0"

from connpy import hooks, printer
from connpy.configfile import configfile

PID_FILE1 = "/run/connpy.pid"
PID_FILE2 = "/tmp/connpy.pid"


@app.route("/")
def root():
    return jsonify({
        'message': 'Welcome to Connpy api',
        'version': '1.0',
        'documentation': 'https://fluzzi.github.io/connpy/'
    })

@app.route("/list_nodes", methods=["POST"])
def list_nodes():
    conf = app.custom_config
    case = conf.config["case"]
def _wait_for_termination():
    try:
        data = request.get_json()
        filter = data["filter"]
        if not case:
            if isinstance(filter, list):
                filter = [item.lower() for item in filter]
            else:
                filter = filter.lower()
        output = conf._getallnodes(filter)
    except Exception:
        output = conf._getallnodes()
    return jsonify(output)

@app.route("/get_nodes", methods=["POST"])
def get_nodes():
    conf = app.custom_config
    case = conf.config["case"]
    try:
        data = request.get_json()
        filter = data["filter"]
        if not case:
            if isinstance(filter, list):
                filter = [item.lower() for item in filter]
            else:
                filter = filter.lower()
        output = conf._getallnodesfull(filter)
    except Exception:
        output = conf._getallnodesfull()
    return jsonify(output)

@app.route("/ask_ai", methods=["POST"])
def ask_ai():
    conf = app.custom_config
    data = request.get_json()
    input = data["input"]
    if "dryrun" in data:
        dryrun = data["dryrun"]
    else:
        dryrun = False
    if "chat_history" in data:
        chat_history = data["chat_history"]
    else:
        chat_history = None
    ai = myai(conf)
    return ai.ask(input, dryrun, chat_history)

@app.route("/confirm", methods=["POST"])
def confirm():
    conf = app.custom_config
    data = request.get_json()
    input = data["input"]
    ai = myai(conf)
    return str(ai.confirm(input))

@app.route("/run_commands", methods=["POST"])
def run_commands():
    conf = app.custom_config
    data = request.get_json()
    case = conf.config["case"]
    mynodes = {}
    args = {}
    try:
        action = data["action"]
        nodelist = data["nodes"]
        args["commands"] = data["commands"]
        if action == "test":
            args["expected"] = data["expected"]
    except KeyError as e:
        error = "'{}' is mandatory".format(e.args[0])
        return({"DataError": error})
    if isinstance(nodelist, list):
        mynodes = conf.getitems(nodelist)
    else:
        if not case:
            nodelist = nodelist.lower()
        if nodelist.startswith("@"):
            mynodes = conf.getitem(nodelist)
        else:
            mynodes[nodelist] = conf.getitem(nodelist)

    mynodes = nodes(mynodes, config=conf)
    try:
        args["vars"] = data["vars"]
    except Exception:
        while True:
            time.sleep(86400)
    except KeyboardInterrupt:
        pass
    try:
        options = data["options"]
        thisoptions = {k: v for k, v in options.items() if k in ["prompt", "parallel", "timeout"]}
        args.update(thisoptions)
    except Exception:
        options = None
    if action == "run":
        output = mynodes.run(**args)
    elif action == "test":
        output = {}
        output["result"] = mynodes.test(**args)
        output["output"] = mynodes.output
    else:
        error = "Wrong action '{}'".format(action)
        return({"DataError": error})
    return output

@hooks.MethodHook
def stop_api():
    # Read the process ID (pid) from the file
    try:
        with open(PID_FILE1, "r") as f:
            pid = int(f.readline().strip())
            port = int(f.readline().strip())
            PID_FILE=PID_FILE1
            port_line = f.readline().strip()
            port = int(port_line) if port_line else None
            PID_FILE = PID_FILE1
    except (FileNotFoundError, ValueError, OSError):
        try:
            with open(PID_FILE2, "r") as f:
                pid = int(f.readline().strip())
                port = int(f.readline().strip())
                PID_FILE=PID_FILE2
                port_line = f.readline().strip()
                port = int(port_line) if port_line else None
                PID_FILE = PID_FILE2
        except (FileNotFoundError, ValueError, OSError):
            printer.warning("Connpy API server is not running.")
            return
            return None
    # Send a SIGTERM signal to the process
    try:
        os.kill(pid, signal.SIGTERM)
@@ -155,21 +47,34 @@ def stop_api():
    printer.info(f"Server with process ID {pid} stopped.")
    return port

@hooks.MethodHook
def debug_api(port=8048, config=None):
    app.custom_config = config or configfile()
    app.run(debug=True, port=port)
    from .grpc.server import serve
    conf = config or configfile()
    server = serve(conf, port=port, debug=True)
    printer.info(f"gRPC Server running in debug mode on port {port}...")
    _wait_for_termination()
    server.stop(0)

@hooks.MethodHook
def start_server(port=8048, config=None):
    app.custom_config = config or configfile()
    serve(app, host='0.0.0.0', port=port)
    from .grpc.server import serve
    conf = config or configfile()
    server = serve(conf, port=port, debug=False)
    _wait_for_termination()

@hooks.MethodHook
def start_api(port=8048, config=None):
    if os.path.exists(PID_FILE1) or os.path.exists(PID_FILE2):
        printer.warning("Connpy server is already running.")
        return
    # Check if already running via PID file verification
    for pid_file in [PID_FILE1, PID_FILE2]:
        if os.path.exists(pid_file):
            try:
                with open(pid_file, "r") as f:
                    pid = int(f.readline().strip())
                os.kill(pid, 0)
                # If we get here, process exists
                return
            except (ValueError, OSError, ProcessLookupError):
                # Stale PID file, ignore here, start_api will overwrite
                pass

    pid = os.fork()
    if pid == 0:
        start_server(port, config=config)
@@ -184,5 +89,4 @@ def start_api(port=8048, config=None):
        except OSError:
            printer.error("Couldn't create PID file.")
            exit(1)
    printer.start(f"Server is running with process ID {pid} on port {port}")

    printer.start(f"gRPC Server is running with process ID {pid} on port {port}")

@@ -0,0 +1,10 @@
from .node_handler import NodeHandler
from .profile_handler import ProfileHandler
from .config_handler import ConfigHandler
from .run_handler import RunHandler
from .ai_handler import AIHandler
from .api_handler import APIHandler
from .plugin_handler import PluginHandler
from .import_export_handler import ImportExportHandler
from .context_handler import ContextHandler

@@ -0,0 +1,137 @@
import sys
from rich.panel import Panel
from rich.markdown import Markdown
from rich.rule import Rule
from rich.prompt import Prompt

from .. import printer

console = printer.console
mdprint = console.print

class AIHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        if args.list_sessions:
            sessions = self.app.services.ai.list_sessions()
            if not sessions:
                printer.info("No saved AI sessions found.")
                return
            columns = ["ID", "Title", "Created At", "Model"]
            rows = [[s["id"], s["title"], s["created_at"], s["model"]] for s in sessions]
            printer.table("AI Persisted Sessions", columns, rows)
            return

        if args.delete_session:
            try:
                self.app.services.ai.delete_session(args.delete_session[0])
                printer.success(f"Session {args.delete_session[0]} deleted.")
            except Exception as e:
                printer.error(str(e))
            return

        # Determine which session_id to resume
        session_id = None
        if args.resume:
            sessions = self.app.services.ai.list_sessions()
            session_id = sessions[0]["id"] if sessions else None
            if not session_id:
                printer.warning("No previous session found to resume.")
        elif args.session:
            session_id = args.session[0]

        # Configure additional arguments for the AI service
        # Priority: CLI args > local configuration
        settings = self.app.services.config_svc.get_settings().get("ai", {})
        arguments = {}

        for key in ["engineer_model", "engineer_api_key", "architect_model", "architect_api_key"]:
            cli_val = getattr(args, key, None)
            if cli_val:
                arguments[key] = cli_val[0]
            elif settings.get(key):
                arguments[key] = settings.get(key)

        # Check keys only if running in local mode (not remote)
        if getattr(self.app.services, "mode", "local") == "local":
            if not arguments.get("engineer_api_key"):
                printer.error("Engineer API key not configured. The chat cannot start.")
                printer.info("Use 'connpy config --engineer-api-key <key>' to set it.")
                sys.exit(1)
            if not arguments.get("architect_api_key"):
                printer.warning("Architect API key not configured. Architect will be unavailable.")
                printer.info("Use 'connpy config --architect-api-key <key>' to enable it.")

        # The CLI hands the rest of the interaction to the underlying agent
        self.app.myai = self.app.services.ai
        self.ai_overrides = arguments

        if args.ask:
            self.single_question(args, session_id)
        else:
            self.interactive_chat(args, session_id)

    def single_question(self, args, session_id):
        query = " ".join(args.ask)
        with console.status("[ai_status]Agent is thinking and analyzing...") as status:
            result = self.app.myai.ask(query, status=status, debug=args.debug, session_id=session_id, trust=args.trust, **self.ai_overrides)

        responder = result.get("responder", "engineer")
        border = "architect" if responder == "architect" else "engineer"
        title = "[architect][bold]Network Architect[/bold][/architect]" if responder == "architect" else "[engineer][bold]Network Engineer[/bold][/engineer]"

        if not result.get("streamed"):
            mdprint(Panel(Markdown(result["response"]), title=title, border_style=border, expand=False))

        if "usage" in result:
            u = result["usage"]
            console.print(f"[debug]Tokens: {u['total']} (Input: {u['input']}, Output: {u['output']})[/debug]")
        console.print()

    def interactive_chat(self, args, session_id):
        history = None
        if session_id:
            session_data = self.app.myai.load_session_data(session_id)
            if session_data:
                history = session_data.get("history", [])
                mdprint(Rule(title=f"[header] Resuming Session: {session_data.get('title')} [/header]", style="border"))
                if history:
                    mdprint(f"[debug]Analyzing {len(history)} previous messages...[/debug]\n")
            else:
                printer.error(f"Could not load session {session_id}. Starting clean.")

        if not history:
            mdprint(Rule(style="engineer"))
            mdprint(Markdown("**Networking Expert Agent**: Hi! I'm your assistant. I can help you diagnose issues, run commands, and manage your nodes.\nType 'exit' to quit.\n"))
            mdprint(Rule(style="engineer"))

        while True:
            try:
                user_query = Prompt.ask("[user_prompt]User[/user_prompt]")
                if not user_query.strip(): continue
                if user_query.lower() in ['exit', 'quit', 'bye']: break

                with console.status("[ai_status]Agent is thinking...") as status:
                    result = self.app.myai.ask(user_query, chat_history=history, status=status, debug=args.debug, trust=args.trust, **self.ai_overrides)

                new_history = result.get("chat_history")
                if new_history is not None:
                    history = new_history

                responder = result.get("responder", "engineer")
                border = "architect" if responder == "architect" else "engineer"
                title = "[architect][bold]Network Architect[/bold][/architect]" if responder == "architect" else "[engineer][bold]Network Engineer[/bold][/engineer]"

                if not result.get("streamed"):
                    response_text = result.get("response", "")
                    if response_text:
                        mdprint(Panel(Markdown(response_text), title=title, border_style=border, expand=False))

                if "usage" in result:
                    u = result["usage"]
                    console.print(f"[debug]Tokens: {u['total']} (Input: {u['input']}, Output: {u['output']})[/debug]")
                console.print()
            except KeyboardInterrupt:
                break

@@ -0,0 +1,53 @@
import sys
from .. import printer
from ..services.exceptions import ConnpyError

class APIHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        try:
            status = self.app.services.system.get_api_status()

            if args.command == "stop":
                if not status["running"]:
                    printer.warning("API does not seem to be running.")
                else:
                    stopped = self.app.services.system.stop_api()
                    if stopped:
                        printer.success("API stopped successfully.")

            elif args.command == "restart":
                port = args.data if args.data and isinstance(args.data, int) else None
                if status["running"]:
                    printer.info(f"Stopping server with process ID {status['pid']}...")

                # Service handles port preservation if port is None
                self.app.services.system.restart_api(port=port)

                if status["running"]:
                    printer.info(f"Server with process ID {status['pid']} stopped.")

                # Re-fetch status to show the actual port used
                new_status = self.app.services.system.get_api_status()
                printer.success(f"API restarted on port {new_status.get('port', 'unknown')}.")

            elif args.command == "start":
                if status["running"]:
                    msg = f"Connpy server is already running (PID: {status['pid']}"
                    if status.get("port"):
                        msg += f", Port: {status['port']}"
                    msg += ")."
                    printer.warning(msg)
                else:
                    port = args.data if args.data and isinstance(args.data, int) else 8048
                    self.app.services.system.start_api(port=port)
                    printer.success(f"API started on port {port}.")

            elif args.command == "debug":
                port = args.data if args.data and isinstance(args.data, int) else 8048
                self.app.services.system.debug_api(port=port)
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)
@@ -0,0 +1,135 @@
import sys
import yaml
from .. import printer
from ..services.exceptions import ConnpyError, InvalidConfigurationError
from .help_text import get_instructions

class ConfigHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        actions = {
            "completion": self.show_completion,
            "fzf_wrapper": self.show_fzf_wrapper,
            "case": self.set_case,
            "fzf": self.set_fzf,
            "idletime": self.set_idletime,
            "configfolder": self.set_configfolder,
            "theme": self.set_theme,
            "engineer_model": self.set_ai_config,
            "engineer_api_key": self.set_ai_config,
            "architect_model": self.set_ai_config,
            "architect_api_key": self.set_ai_config,
            "trusted_commands": self.set_ai_config,
            "service_mode": self.set_service_mode,
            "remote_host": self.set_remote_host,
            "sync_remote": self.set_sync_remote
        }
        handler = actions.get(getattr(args, "command", None))
        if handler:
            return handler(args)

        # If no specific command was triggered, show current configuration
        return self.show_config(args)

    def show_config(self, args):
        settings = self.app.services.config_svc.get_settings()
        yaml_str = yaml.dump(settings, sort_keys=False, default_flow_style=False)
        printer.data("Current Configuration", yaml_str)

    def set_service_mode(self, args):
        new_mode = args.data[0]
        if new_mode == "remote":
            settings = self.app.services.config_svc.get_settings()
            if not settings.get("remote_host"):
                printer.error("Remote host must be configured before switching to remote mode")
                return

        self.app.services.config_svc.update_setting("service_mode", new_mode)

        # Immediate sync of fzf/text cache files for the new mode
        try:
            # 1. Clear old cache files to avoid discrepancies if fetch fails
            self.app.config._generate_nodes_cache(nodes=[], folders=[], profiles=[])

            # 2. Re-initialize services for the new mode
            from ..services.provider import ServiceProvider
            settings = self.app.services.config_svc.get_settings()
            new_services = ServiceProvider(self.app.config, mode=new_mode, remote_host=settings.get("remote_host"))

            # 3. Fetch data from new mode and generate cache
            nodes = new_services.nodes.list_nodes()
            folders = new_services.nodes.list_folders()
            profiles = new_services.profiles.list_profiles()
            new_services.nodes.generate_cache(nodes=nodes, folders=folders, profiles=profiles)

            printer.success("Config saved")
        except Exception as e:
            printer.success("Config saved")
            printer.warning(f"Note: Could not synchronize fzf cache: {e}")


    def set_remote_host(self, args):
        self.app.services.config_svc.update_setting("remote_host", args.data[0])
        printer.success("Config saved")

    def set_theme(self, args):
        try:
            valid_styles = self.app.services.config_svc.apply_theme_from_file(args.data[0])
            # Apply immediately to current session
            printer.apply_theme(valid_styles)
            printer.success(f"Theme '{args.data[0]}' applied and saved")
        except (ConnpyError, InvalidConfigurationError) as e:
            printer.error(str(e))

    def show_fzf_wrapper(self, args):
        print(get_instructions("fzf_wrapper_" + args.data[0]))

    def show_completion(self, args):
        print(get_instructions(args.data[0] + "completion"))

    def set_case(self, args):
        val = (args.data[0].lower() == "true")
        self.app.services.config_svc.update_setting("case", val)
        self.app.case = val
        printer.success("Config saved")

    def set_fzf(self, args):
        val = (args.data[0].lower() == "true")
        self.app.services.config_svc.update_setting("fzf", val)
        self.app.fzf = val
        printer.success("Config saved")

    def set_idletime(self, args):
        try:
            val = max(0, int(args.data[0]))
            self.app.services.config_svc.update_setting("idletime", val)
            printer.success("Config saved")
        except ValueError:
            printer.error("Keepalive must be an integer.")

    def set_configfolder(self, args):
        try:
            self.app.services.config_svc.set_config_folder(args.data[0])
            printer.success("Config saved")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def set_sync_remote(self, args):
        val = (args.data[0].lower() == "true")
        self.app.services.config_svc.update_setting("sync_remote", val)
        self.app.services.sync.sync_remote = val
        printer.success("Config saved")

    def set_ai_config(self, args):
        try:
            settings = self.app.services.config_svc.get_settings()
            aiconfig = settings.get("ai", {})
            aiconfig[args.command] = args.data[0]
            self.app.services.config_svc.update_setting("ai", aiconfig)
            printer.success("Config saved")
        except ConnpyError as e:
            printer.error(str(e))

@@ -0,0 +1,77 @@
import sys
import yaml
from .. import printer
from ..services.exceptions import ConnpyError

class ContextHandler:
    def __init__(self, app):
        self.app = app
        self.service = self.app.services.context

    def dispatch(self, args):
        try:
            if args.add:
                if len(args.add) < 2:
                    printer.error("--add requires name and at least one regex")
                    return
                self.service.add_context(args.add[0], args.add[1:])
                printer.success(f"Context '{args.add[0]}' added successfully.")

            elif args.rm:
                if not args.context_name:
                    printer.error("--rm requires a context name")
                    return
                self.service.delete_context(args.context_name)
                printer.success(f"Context '{args.context_name}' deleted successfully.")

            elif args.ls:
                contexts = self.service.list_contexts()
                for ctx in contexts:
                    if ctx["active"]:
                        printer.success(f"{ctx['name']} (active)")
                    else:
                        printer.custom(" ", ctx["name"])

            elif args.set:
                if not args.context_name:
                    printer.error("--set requires a context name")
                    return
                self.service.set_active_context(args.context_name)
                printer.success(f"Context set to: {args.context_name}")

            elif args.show:
                if not args.context_name:
                    printer.error("--show requires a context name")
                    return
                contexts = self.service.contexts
                if args.context_name not in contexts:
                    printer.error(f"Context '{args.context_name}' does not exist")
                    return
                yaml_output = yaml.dump(contexts[args.context_name], sort_keys=False, default_flow_style=False)
                printer.custom(args.context_name, "")
                print(yaml_output)

            elif args.edit:
                if len(args.edit) < 2:
                    printer.error("--edit requires name and at least one regex")
                    return
                self.service.update_context(args.edit[0], args.edit[1:])
                printer.success(f"Context '{args.edit[0]}' modified successfully.")

            else:
                # Default behavior if no flags: show list
                self.dispatch_ls(args)

        except ValueError as e:
            printer.error(str(e))
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def dispatch_ls(self, args):
        contexts = self.service.list_contexts()
        for ctx in contexts:
            if ctx["active"]:
                printer.success(f"{ctx['name']} (active)")
            else:
                printer.custom(" ", ctx["name"])
@@ -0,0 +1,199 @@
|
||||
import ast
|
||||
import inquirer
|
||||
from .validators import Validators
|
||||
|
||||
class Forms:
|
||||
def __init__(self, app):
|
||||
self.app = app
|
||||
self.validators = Validators(app)
|
||||
|
||||
def questions_edit(self):
|
||||
questions = []
|
||||
questions.append(inquirer.Confirm("host", message="Edit Hostname/IP?"))
|
||||
questions.append(inquirer.Confirm("protocol", message="Edit Protocol/app?"))
|
||||
questions.append(inquirer.Confirm("port", message="Edit Port?"))
|
||||
questions.append(inquirer.Confirm("options", message="Edit Options?"))
|
||||
questions.append(inquirer.Confirm("logs", message="Edit logging path/file?"))
|
||||
questions.append(inquirer.Confirm("tags", message="Edit tags?"))
|
||||
questions.append(inquirer.Confirm("jumphost", message="Edit jumphost?"))
|
||||
questions.append(inquirer.Confirm("user", message="Edit User?"))
|
||||
questions.append(inquirer.Confirm("password", message="Edit password?"))
|
||||
return inquirer.prompt(questions)
|
||||
|
||||
def questions_nodes(self, unique, uniques=None, edit=None):
|
||||
try:
|
||||
defaults = self.app.services.nodes.get_node_details(unique)
|
||||
if "tags" not in defaults:
|
||||
defaults["tags"] = ""
|
||||
if "jumphost" not in defaults:
|
||||
defaults["jumphost"] = ""
|
||||
except Exception:
|
||||
defaults = {"host": "", "protocol": "", "port": "", "user": "", "options": "", "logs": "", "tags": "", "password": "", "jumphost": ""}
|
||||
node = {}
|
||||
if edit is None:
|
||||
edit = {"host": True, "protocol": True, "port": True, "user": True, "password": True, "options": True, "logs": True, "tags": True, "jumphost": True}
|
||||
questions = []
|
||||
if edit["host"]:
|
||||
questions.append(inquirer.Text("host", message="Add Hostname or IP", validate=self.validators.host_validation, default=defaults["host"]))
|
||||
else:
|
||||
node["host"] = defaults["host"]
|
||||
if edit["protocol"]:
|
||||
questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.protocol_validation, default=defaults["protocol"]))
|
||||
else:
|
||||
node["protocol"] = defaults["protocol"]
|
||||
if edit["port"]:
|
||||
questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.port_validation, default=defaults["port"]))
|
||||
else:
|
||||
node["port"] = defaults["port"]
|
||||
if edit["options"]:
|
||||
questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", validate=self.validators.default_validation, default=defaults["options"]))
|
||||
else:
|
||||
node["options"] = defaults["options"]
|
||||
if edit["logs"]:
|
||||
questions.append(inquirer.Text("logs", message="Pick logging path/file ", validate=self.validators.default_validation, default=defaults["logs"].replace("{", "{{").replace("}", "}}")))
|
||||
else:
|
||||
node["logs"] = defaults["logs"]
|
||||
if edit["tags"]:
|
||||
questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.tags_validation, default=str(defaults["tags"]).replace("{", "{{").replace("}", "}}")))
|
||||
else:
|
||||
node["tags"] = defaults["tags"]
|
||||
if edit["jumphost"]:
|
||||
questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.jumphost_validation, default=str(defaults["jumphost"]).replace("{", "{{").replace("}", "}}")))
|
||||
else:
|
||||
node["jumphost"] = defaults["jumphost"]
|
||||
if edit["user"]:
|
||||
questions.append(inquirer.Text("user", message="Pick username", validate=self.validators.default_validation, default=defaults["user"]))
|
||||
else:
|
||||
node["user"] = defaults["user"]
|
||||
if edit["password"]:
|
||||
questions.append(inquirer.List("password", message="Password: Use a local password, no password or a list of profiles to reference?", choices=["Local Password", "Profiles", "No Password"]))
|
||||
else:
|
||||
node["password"] = defaults["password"]
|
||||
|
||||
answer = inquirer.prompt(questions)
|
||||
if answer is None:
|
||||
return False
|
||||
|
||||
if "password" in answer:
|
||||
if answer["password"] == "Local Password":
|
||||
passq = [inquirer.Password("password", message="Set Password")]
|
||||
passa = inquirer.prompt(passq)
|
||||
if passa is None:
|
||||
return False
|
||||
answer["password"] = self.app.services.config_svc.encrypt_password(passa["password"])
|
||||
elif answer["password"] == "Profiles":
|
||||
passq = [(inquirer.Text("password", message="Set a @profile or a comma separated list of @profiles", validate=self.validators.pass_validation))]
|
||||
passa = inquirer.prompt(passq)
|
||||
if passa is None:
|
||||
return False
|
||||
answer["password"] = passa["password"].split(",")
|
||||
elif answer["password"] == "No Password":
|
||||
answer["password"] = ""
|
||||
|
||||
if "tags" in answer and not answer["tags"].startswith("@") and answer["tags"]:
|
||||
answer["tags"] = ast.literal_eval(answer["tags"])
|
||||
|
||||
result = {**uniques, **answer, **node}
|
||||
result["type"] = "connection"
|
||||
return result
|
||||
|
||||
def questions_profiles(self, unique, edit=None):
|
||||
try:
|
||||
defaults = self.app.services.profiles.get_profile(unique, resolve=False)
|
||||
if "tags" not in defaults:
|
||||
defaults["tags"] = ""
|
||||
if "jumphost" not in defaults:
|
||||
defaults["jumphost"] = ""
|
||||
except Exception:
|
||||
defaults = {"host": "", "protocol": "", "port": "", "user": "", "options": "", "logs": "", "tags": "", "jumphost": ""}
|
||||
profile = {}
|
||||
if edit is None:
|
||||
edit = {"host": True, "protocol": True, "port": True, "user": True, "password": True, "options": True, "logs": True, "tags": True, "jumphost": True}
|
||||
questions = []
|
||||
if edit["host"]:
|
||||
questions.append(inquirer.Text("host", message="Add Hostname or IP", default=defaults["host"]))
|
||||
else:
|
||||
profile["host"] = defaults["host"]
|
||||
if edit["protocol"]:
|
||||
questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.profile_protocol_validation, default=defaults["protocol"]))
|
||||
else:
|
||||
profile["protocol"] = defaults["protocol"]
|
||||
if edit["port"]:
|
||||
questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.profile_port_validation, default=defaults["port"]))
|
||||
else:
|
||||
profile["port"] = defaults["port"]
|
||||
if edit["options"]:
|
||||
questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", default=defaults["options"]))
|
||||
else:
|
||||
profile["options"] = defaults["options"]
|
||||
if edit["logs"]:
|
||||
questions.append(inquirer.Text("logs", message="Pick logging path/file ", default=defaults["logs"].replace("{", "{{").replace("}", "}}")))
|
||||
else:
|
||||
profile["logs"] = defaults["logs"]
|
||||
if edit["tags"]:
|
||||
questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.profile_tags_validation, default=str(defaults["tags"]).replace("{", "{{").replace("}", "}}")))
|
||||
else:
|
||||
profile["tags"] = defaults["tags"]
|
||||
if edit["jumphost"]:
|
||||
questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.profile_jumphost_validation, default=str(defaults["jumphost"]).replace("{", "{{").replace("}", "}}")))
|
||||
else:
|
||||
profile["jumphost"] = defaults["jumphost"]
|
||||
if edit["user"]:
|
||||
questions.append(inquirer.Text("user", message="Pick username", default=defaults["user"]))
|
||||
else:
|
||||
profile["user"] = defaults["user"]
|
||||
if edit["password"]:
|
||||
questions.append(inquirer.Password("password", message="Set Password"))
|
||||
else:
|
||||
profile["password"] = defaults["password"]
|
||||
|
||||
answer = inquirer.prompt(questions)
|
||||
if answer is None:
|
||||
return False
|
||||
|
||||
if "password" in answer:
|
||||
if answer["password"] != "":
|
||||
answer["password"] = self.app.services.config_svc.encrypt_password(answer["password"])
|
||||
|
||||
if "tags" in answer and answer["tags"]:
|
||||
answer["tags"] = ast.literal_eval(answer["tags"])
|
||||
|
||||
result = {**answer, **profile}
|
||||
result["id"] = unique
|
||||
return result
|
||||
|
||||
def questions_bulk(self, nodes="", hosts=""):
|
||||
questions = []
|
||||
questions.append(inquirer.Text("ids", message="add a comma separated list of nodes to add", default=nodes, validate=self.validators.bulk_node_validation))
|
||||
questions.append(inquirer.Text("location", message="Add a @folder, @subfolder@folder or leave empty", validate=self.validators.bulk_folder_validation))
|
||||
questions.append(inquirer.Text("host", message="Add comma separated list of Hostnames or IPs", default=hosts, validate=self.validators.bulk_host_validation))
|
||||
questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.protocol_validation))
|
||||
questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.port_validation))
|
||||
questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", validate=self.validators.default_validation))
|
||||
questions.append(inquirer.Text("logs", message="Pick logging path/file ", validate=self.validators.default_validation))
|
||||
questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.tags_validation))
|
||||
questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.jumphost_validation))
|
||||
questions.append(inquirer.Text("user", message="Pick username", validate=self.validators.default_validation))
|
||||
questions.append(inquirer.List("password", message="Password: Use a local password, no password or a list of profiles to reference?", choices=["Local Password", "Profiles", "No Password"]))
|
||||
|
||||
answer = inquirer.prompt(questions)
|
||||
if answer is None:
|
||||
return False
|
||||
|
||||
if "password" in answer:
|
||||
if answer["password"] == "Local Password":
|
||||
passq = [inquirer.Password("password", message="Set Password")]
|
||||
passa = inquirer.prompt(passq)
|
||||
answer["password"] = self.app.services.config_svc.encrypt_password(passa["password"])
|
||||
elif answer["password"] == "Profiles":
|
||||
passq = [(inquirer.Text("password", message="Set a @profile or a comma separated list of @profiles", validate=self.validators.pass_validation))]
|
||||
passa = inquirer.prompt(passq)
|
||||
answer["password"] = passa["password"].split(",")
|
||||
elif answer["password"] == "No Password":
|
||||
answer["password"] = ""
|
||||
|
||||
answer["type"] = "connection"
|
||||
if "tags" in answer and not answer["tags"].startswith("@") and answer["tags"]:
|
||||
answer["tags"] = ast.literal_eval(answer["tags"])
|
||||
|
||||
return answer
|
||||
@@ -0,0 +1,215 @@
import os


def get_help(type, parsers=None):
    if type == "export":
        return "Export /path/to/file.yml \[@subfolder1]\[@folder1] \[@subfolderN]\[@folderN]"
    if type == "import":
        return "Import /path/to/file.yml"
    if type == "node":
        return "node\[@subfolder]\[@folder]\nConnect to specific node or show all matching nodes\n\[@subfolder]\[@folder]\nShow all available connections globally or in specified path"
    if type == "usage":
        commands = []
        for subcommand, subparser in parsers.choices.items():
            if subparser.description is not None:
                commands.append(subcommand)
        commands = ",".join(commands)
        usage_help = f"connpy [-h] [--add | --del | --mod | --show | --debug] [node|folder] [--sftp]\n connpy {{{commands}}} ..."
        return usage_help
    return get_instructions(type)


def get_instructions(type="add"):
    if type == "add":
        return """
Welcome to Connpy node Addition Wizard!

Here are some important instructions and tips for configuring your new node:

1. **Profiles**:
   - You can use the configured settings in a profile using `@profilename`.

2. **Available Protocols and Apps**:
   - ssh
   - telnet
   - kubectl (`kubectl exec`)
   - docker (`docker exec`)

3. **Optional Values**:
   - You can leave any value empty except for the hostname/IP.

4. **Passwords**:
   - You can pass one or more passwords using comma-separated `@profiles`.

5. **Logging**:
   - You can use the following variables in the logging file name:
     - `${id}`
     - `${unique}`
     - `${host}`
     - `${port}`
     - `${user}`
     - `${protocol}`

6. **Well-Known Tags**:
   - `os`: Identified by AI to generate commands based on the operating system.
   - `screen_length_command`: Used by automation to avoid pagination on different devices (e.g., `terminal length 0` for Cisco devices).
   - `prompt`: Replaces default app prompt to identify the end of output or where the user can start inputting commands.
   - `kube_command`: Replaces the default command (`/bin/bash`) for `kubectl exec`.
   - `docker_command`: Replaces the default command for `docker exec`.
"""
    if type == "bashcompletion":
        return '''
# Bash completion for connpy
# Run: eval "$(connpy config --completion bash)"
# Or add it to your .bashrc

_connpy_autocomplete()
{
    local strings
    strings=$(python3 -m connpy.completion bash ${#COMP_WORDS[@]} "${COMP_WORDS[@]}")

    local IFS=$'\\t'
    COMPREPLY=( $(compgen -W "$strings" -- "${COMP_WORDS[$COMP_CWORD]}") )
}
complete -o nosort -F _connpy_autocomplete conn
complete -o nosort -F _connpy_autocomplete connpy
'''
    if type == "zshcompletion":
        return '''
# Zsh completion for connpy
# Run: eval "$(connpy config --completion zsh)"
# Or add it to your .zshrc
# Make sure compinit is loaded

autoload -U compinit && compinit
_connpy_autocomplete()
{
    local COMP_WORDS num strings
    COMP_WORDS=( $words )
    num=${#COMP_WORDS[@]}
    if [[ $words =~ '.* $' ]]; then
        num=$(($num + 1))
    fi
    strings=$(python3 -m connpy.completion zsh ${num} ${COMP_WORDS[@]})

    local IFS=$'\\t'
    compadd "$@" -- ${=strings}
}
compdef _connpy_autocomplete conn
compdef _connpy_autocomplete connpy
'''
    if type == "fzf_wrapper_bash":
        return '''\n#Here starts bash 0ms fzf wrapper for connpy
connpy() {
    if [ $# -eq 0 ]; then
        local selected
        local configdir=$(cat ~/.config/conn/.folder 2>/dev/null || echo ~/.config/conn)
        if [ -s "$configdir/.fzf_nodes_cache.txt" ]; then
            selected=$(cat "$configdir/.fzf_nodes_cache.txt" | fzf-tmux -i -d 25%)
        else
            command connpy
            return
        fi
        if [ -n "$selected" ]; then
            command connpy "$selected"
        fi
    else
        command connpy "$@"
    fi
}
alias c="connpy"
#Here ends bash 0ms fzf wrapper for connpy
'''
    if type == "fzf_wrapper_zsh":
        return '''\n#Here starts zsh 0ms fzf wrapper for connpy
connpy() {
    if [ $# -eq 0 ]; then
        local selected
        local configdir=$(cat ~/.config/conn/.folder 2>/dev/null || echo ~/.config/conn)
        if [ -s "$configdir/.fzf_nodes_cache.txt" ]; then
            selected=$(cat "$configdir/.fzf_nodes_cache.txt" | fzf-tmux -i -d 25%)
        else
            command connpy
            return
        fi
        if [ -n "$selected" ]; then
            command connpy "$selected"
        fi
    else
        command connpy "$@"
    fi
}
alias c="connpy"
#Here ends zsh 0ms fzf wrapper for connpy
'''
    if type == "run":
        return "node[@subfolder][@folder] command to run\nRun the specific command on the node and print output\n/path/to/file.yaml\nUse a yaml file to run an automation script"
    if type == "generate":
        return r'''---
tasks:
- name: "Config"

  action: 'run' #Action can be test or run. Mandatory

  nodes: #List of nodes to work on. Mandatory
  - 'router1@office' #You can add specific nodes
  - '@aws' #entire folders or subfolders
  - '@office': #or filter inside a folder or subfolder
    - 'router2'
    - 'router7'

  commands: #List of commands to send, use {name} to pass variables
  - 'term len 0'
  - 'conf t'
  - 'interface {if}'
  - 'ip address 10.100.100.{id} 255.255.255.255'
  - '{commit}'
  - 'end'

  variables: #Variables to use on commands and expected. Optional
    __global__: #Global variables to use on all nodes, fallback if missing in the node.
      commit: ''
      if: 'loopback100'
    router1@office:
      id: 1
    router2@office:
      id: 2
      commit: 'commit'
    router3@office:
      id: 3
    vrouter1@aws:
      id: 4
    vrouterN@aws:
      id: 5

  output: /home/user/logs #Type of output, if null you only get Connection and test result. Choices are: null,stdout,/path/to/folder. Folder path only works on 'run' action.

  options:
    prompt: r'>$|#$|\$$|>.$|#.$|\$.$' #Optional prompt to check on your devices, default should work on most devices.
    parallel: 10 #Optional number of nodes to run commands on parallel. Default 10.
    timeout: 20 #Optional time to wait in seconds for prompt, expected or EOF. Default 20.

- name: "TestConfig"
  action: 'test'
  nodes:
  - 'router1@office'
  - '@aws'
  - '@office':
    - 'router2'
    - 'router7'
  commands:
  - 'ping 10.100.100.{id}'
  expected: '!' #Expected text to find when running test action. Mandatory for 'test'
  variables:
    router1@office:
      id: 1
    router2@office:
      id: 2
      commit: 'commit'
    router3@office:
      id: 3
    vrouter1@aws:
      id: 4
    vrouterN@aws:
      id: 5
  output: null
...'''
    return ""
@@ -0,0 +1,80 @@
import os
import inquirer
try:
    from pyfzf.pyfzf import FzfPrompt
except ImportError:
    FzfPrompt = None


def get_config_dir():
    home = os.path.expanduser("~")
    defaultdir = os.path.join(home, '.config/conn')
    pathfile = os.path.join(defaultdir, '.folder')
    try:
        with open(pathfile, "r") as f:
            return f.read().strip()
    except Exception:
        return defaultdir


def nodes_completer(prefix, parsed_args, **kwargs):
    configdir = get_config_dir()
    cache_file = os.path.join(configdir, '.fzf_nodes_cache.txt')
    if os.path.exists(cache_file):
        with open(cache_file, "r") as f:
            return [line.strip() for line in f if line.startswith(prefix)]
    return []


def folders_completer(prefix, parsed_args, **kwargs):
    configdir = get_config_dir()
    cache_file = os.path.join(configdir, '.folders_cache.txt')
    if os.path.exists(cache_file):
        with open(cache_file, "r") as f:
            return [line.strip() for line in f if line.startswith(prefix)]
    return []


def profiles_completer(prefix, parsed_args, **kwargs):
    configdir = get_config_dir()
    cache_file = os.path.join(configdir, '.profiles_cache.txt')
    if os.path.exists(cache_file):
        with open(cache_file, "r") as f:
            return [line.strip() for line in f if line.startswith(prefix)]
    return []


def choose(app, list_, name, action):
    # Generates an inquirer list to pick
    # Safeguard: Never prompt if running in autocomplete shell
    if os.environ.get("_ARGCOMPLETE") or os.environ.get("COMP_LINE"):
        return None

    if FzfPrompt and app.fzf:
        fzf_prompt = FzfPrompt(executable_path="fzf-tmux")
        if not app.case:
            fzf_prompt = FzfPrompt(executable_path="fzf-tmux -i")
        answer = fzf_prompt.prompt(list_, fzf_options="-d 25%")
        if len(answer) == 0:
            return None
        else:
            return answer[0]
    else:
        questions = [inquirer.List(name, message="Pick {} to {}:".format(name, action), choices=list_, carousel=True)]
        answer = inquirer.prompt(questions)
        if answer is None:
            return None
        else:
            return answer[name]


def toplevel_completer(prefix, parsed_args, **kwargs):
    commands = ["node", "profile", "move", "mv", "copy", "cp", "list", "ls", "bulk", "export", "import", "ai", "run", "api", "context", "plugin", "config", "sync"]

    configdir = get_config_dir()
    cache_file = os.path.join(configdir, '.fzf_nodes_cache.txt')
    nodes = []
    if os.path.exists(cache_file):
        with open(cache_file, "r") as f:
            nodes = [line.strip() for line in f if line.startswith(prefix)]

    cache_folders = os.path.join(configdir, '.folders_cache.txt')
    if os.path.exists(cache_folders):
        with open(cache_folders, "r") as f:
            nodes += [line.strip() for line in f if line.startswith(prefix)]

    return [c for c in commands + nodes if c.startswith(prefix)]
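The completer functions above all follow the same callable protocol that argcomplete expects: `(prefix, parsed_args, **kwargs) -> list[str]`, attached to an argparse action via its `completer` attribute. A minimal self-contained sketch of that protocol, using an in-memory candidate list instead of connpy's cache files (the node names and parser layout here are illustrative, not connpy's real definitions):

```python
import argparse

# argcomplete-style completer: a callable taking (prefix, parsed_args, **kwargs)
# and returning candidate strings. Stands in for the cache-file completers above.
def demo_completer(prefix, parsed_args, **kwargs):
    candidates = ["router1@office", "router2@office", "vrouter1@aws"]
    return [c for c in candidates if c.startswith(prefix)]

parser = argparse.ArgumentParser(prog="connpy")
node_arg = parser.add_argument("node", nargs="?")
# argcomplete reads this attribute when the shell asks for completions;
# without argcomplete installed it is simply an inert attribute.
node_arg.completer = demo_completer

print(demo_completer("router", parser.parse_args([])))
```

In the real CLI, `argcomplete.autocomplete(parser)` would be called after the arguments are defined; it is a no-op unless the `_ARGCOMPLETE` environment variable is set by the shell hook, which is why `choose()` above guards against prompting in that case.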
|
||||
@@ -0,0 +1,85 @@
|
||||
import os
|
||||
import sys
|
||||
import inquirer
|
||||
from .. import printer
|
||||
from ..services.exceptions import ConnpyError
|
||||
from .forms import Forms
|
||||
|
||||
class ImportExportHandler:
|
||||
def __init__(self, app):
|
||||
self.app = app
|
||||
self.forms = Forms(app)
|
||||
|
||||
def dispatch_import(self, args):
|
||||
file_path = args.data[0]
|
||||
try:
|
||||
printer.warning("This could overwrite your current configuration!")
|
||||
question = [inquirer.Confirm("import", message=f"Are you sure you want to import {file_path}?")]
|
||||
confirm = inquirer.prompt(question)
|
||||
if confirm == None or not confirm["import"]:
|
||||
sys.exit(7)
|
||||
|
||||
self.app.services.import_export.import_from_file(file_path)
|
||||
printer.success(f"File {file_path} imported successfully.")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def dispatch_export(self, args):
|
||||
file_path = args.data[0]
|
||||
folders = args.data[1:] if len(args.data) > 1 else None
|
||||
try:
|
||||
self.app.services.import_export.export_to_file(file_path, folders=folders)
|
||||
printer.success(f"File {file_path} generated successfully")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
sys.exit()
|
||||
|
||||
def bulk(self, args):
|
||||
if args.file and os.path.isfile(args.file[0]):
|
||||
with open(args.file[0], 'r') as f:
|
||||
lines = f.readlines()
|
||||
|
||||
# Expecting exactly 2 lines
|
||||
if len(lines) < 2:
|
||||
printer.error("The file must contain at least two lines: one for nodes, one for hosts.")
|
||||
sys.exit(11)
|
||||
|
||||
nodes = lines[0].strip()
|
||||
hosts = lines[1].strip()
|
||||
newnodes = self.forms.questions_bulk(nodes, hosts)
|
||||
else:
|
||||
newnodes = self.forms.questions_bulk()
|
||||
|
||||
if newnodes == False:
|
||||
sys.exit(7)
|
||||
|
||||
if not self.app.case:
|
||||
newnodes["location"] = newnodes["location"].lower()
|
||||
newnodes["ids"] = newnodes["ids"].lower()
|
||||
|
||||
# Handle the case where location might be a file reference (e.g. from a prompt)
|
||||
location = newnodes["location"]
|
||||
if location.startswith("@") and "/" in location:
|
||||
# Extract the actual @folder part (e.g. @testall from @testall/.folders_cache.txt)
|
||||
location = location.split("/")[0]
|
||||
newnodes["location"] = location
|
||||
|
||||
ids = newnodes["ids"].split(",")
|
||||
# Append location to each id for proper folder assignment
|
||||
location = newnodes["location"]
|
||||
if location:
|
||||
ids = [f"{i}{location}" for i in ids]
|
||||
|
||||
hosts = newnodes["host"].split(",")
|
||||
|
||||
try:
|
||||
count = self.app.services.nodes.bulk_add(ids, hosts, newnodes)
|
||||
if count > 0:
|
||||
printer.success(f"Successfully added {count} nodes.")
|
||||
else:
|
||||
printer.info("0 nodes added")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
@@ -0,0 +1,230 @@
|
||||
import sys
|
||||
import yaml
|
||||
import inquirer
|
||||
from rich.markdown import Markdown
|
||||
|
||||
from .. import printer
|
||||
from ..services.exceptions import ConnpyError, InvalidConfigurationError
|
||||
from .helpers import choose
|
||||
from .forms import Forms
|
||||
from .help_text import get_instructions
|
||||
|
||||
class NodeHandler:
|
||||
def __init__(self, app):
|
||||
self.app = app
|
||||
self.forms = Forms(app)
|
||||
|
||||
def dispatch(self, args):
|
||||
if not self.app.case and args.data != None:
|
||||
args.data = args.data.lower()
|
||||
actions = {"version": self.version, "connect": self.connect, "add": self.add, "del": self.delete, "mod": self.modify, "show": self.show}
|
||||
return actions.get(args.action)(args)
|
||||
|
||||
def version(self, args):
|
||||
from .._version import __version__
|
||||
printer.info(f"Connpy {__version__}")
|
||||
|
||||
def connect(self, args):
|
||||
if args.data == None:
|
||||
try:
|
||||
matches = self.app.services.nodes.list_nodes()
|
||||
except Exception as e:
|
||||
printer.error(f"Failed to list nodes: {e}")
|
||||
sys.exit(1)
|
||||
|
||||
if len(matches) == 0:
|
||||
printer.warning("There are no nodes created")
|
||||
printer.info("try: connpy --help")
|
||||
sys.exit(9)
|
||||
else:
|
||||
try:
|
||||
matches = self.app.services.nodes.list_nodes(args.data)
|
||||
except Exception:
|
||||
matches = []
|
||||
|
||||
if len(matches) == 0:
|
||||
printer.error(f"{args.data} not found")
|
||||
sys.exit(2)
|
||||
elif len(matches) > 1:
|
||||
matches[0] = choose(self.app, matches, "node", "connect")
|
||||
|
||||
if matches[0] == None:
|
||||
sys.exit(7)
|
||||
|
||||
try:
|
||||
self.app.services.nodes.connect_node(
|
||||
matches[0],
|
||||
sftp=args.sftp,
|
||||
debug=args.debug,
|
||||
logger=self.app._service_logger
|
||||
)
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def delete(self, args):
|
||||
if args.data == None:
|
||||
printer.error("Missing argument node")
|
||||
sys.exit(3)
|
||||
|
||||
is_folder = args.data.startswith("@")
|
||||
try:
|
||||
if is_folder:
|
||||
matches = self.app.services.nodes.list_folders(args.data)
|
||||
else:
|
||||
matches = self.app.services.nodes.list_nodes(args.data)
|
||||
except Exception:
|
||||
matches = []
|
||||
|
||||
if len(matches) == 0:
|
||||
printer.error(f"{args.data} not found")
|
||||
sys.exit(2)
|
||||
|
||||
printer.info(f"Removing: {matches}")
|
||||
question = [inquirer.Confirm("delete", message="Are you sure you want to continue?")]
|
||||
confirm = inquirer.prompt(question)
|
||||
if confirm == None or not confirm["delete"]:
|
||||
sys.exit(7)
|
||||
|
||||
try:
|
||||
for item in matches:
|
||||
self.app.services.nodes.delete_node(item, is_folder=is_folder)
|
||||
|
||||
if len(matches) == 1:
|
||||
printer.success(f"{matches[0]} deleted successfully")
|
||||
else:
|
||||
printer.success(f"{len(matches)} items deleted successfully")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def add(self, args):
|
||||
try:
|
||||
args.data = self.app._type_node(args.data)
|
||||
except ValueError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(3)
|
||||
|
||||
if args.data == None:
|
||||
printer.error("Missing argument node")
|
||||
sys.exit(3)
|
||||
|
||||
is_folder = args.data.startswith("@")
|
||||
try:
|
||||
if is_folder:
|
||||
uniques = self.app.services.nodes.explode_unique(args.data)
|
||||
if not uniques:
|
||||
raise InvalidConfigurationError(f"Invalid folder {args.data}")
|
||||
self.app.services.nodes.add_node(args.data, {}, is_folder=True)
|
||||
printer.success(f"{args.data} added successfully")
|
||||
else:
|
||||
if args.data in self.app.nodes_list:
|
||||
printer.error(f"Node '{args.data}' already exists.")
|
||||
sys.exit(1)
|
||||
uniques = self.app.services.nodes.explode_unique(args.data)
|
||||
printer.console.print(Markdown(get_instructions()))
|
||||
|
||||
new_node_data = self.forms.questions_nodes(args.data, uniques)
|
||||
if not new_node_data:
|
||||
sys.exit(7)
|
||||
self.app.services.nodes.add_node(args.data, new_node_data)
|
||||
printer.success(f"{args.data} added successfully")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def show(self, args):
|
||||
if args.data == None:
|
||||
printer.error("Missing argument node")
|
||||
sys.exit(3)
|
||||
|
||||
try:
|
||||
matches = self.app.services.nodes.list_nodes(args.data)
|
||||
except Exception:
|
||||
matches = []
|
||||
|
||||
if len(matches) == 0:
|
||||
printer.error(f"{args.data} not found")
|
||||
sys.exit(2)
|
||||
elif len(matches) > 1:
|
||||
matches[0] = choose(self.app, matches, "node", "show")
|
||||
|
||||
if matches[0] == None:
|
||||
sys.exit(7)
|
||||
|
||||
try:
|
||||
node = self.app.services.nodes.get_node_details(matches[0])
|
||||
yaml_output = yaml.dump(node, sort_keys=False, default_flow_style=False)
|
||||
printer.data(matches[0], yaml_output)
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def modify(self, args):
|
||||
if args.data == None:
|
||||
printer.error("Missing argument node")
|
||||
sys.exit(3)
|
||||
|
||||
try:
|
||||
matches = self.app.services.nodes.list_nodes(args.data)
|
||||
except Exception:
|
||||
matches = []
|
||||
|
||||
if len(matches) == 0:
|
||||
printer.error(f"No connection found with filter: {args.data}")
|
||||
sys.exit(2)
|
||||
|
||||
unique = matches[0] if len(matches) == 1 else None
|
||||
uniques = self.app.services.nodes.explode_unique(unique) if unique else {"id": None, "folder": None}
|
||||
|
||||
printer.info(f"Editing: {matches}")
|
||||
node_details = {}
|
||||
for i in matches:
|
||||
node_details[i] = self.app.services.nodes.get_node_details(i)
|
||||
|
||||
edits = self.forms.questions_edit()
|
||||
if edits == None:
|
||||
sys.exit(7)
|
||||
|
||||
# Use first match as base for defaults if multiple matches exist
|
||||
base_unique = matches[0]
|
||||
base_uniques = self.app.services.nodes.explode_unique(base_unique)
|
||||
updatenode = self.forms.questions_nodes(base_unique, base_uniques, edit=edits)
|
||||
if not updatenode:
|
||||
sys.exit(7)
|
||||
|
||||
try:
|
||||
if len(matches) == 1:
|
||||
# Comparison for "Nothing to do"
|
||||
current = node_details[matches[0]].copy()
|
||||
current.update(uniques)
|
||||
current["type"] = "connection"
|
||||
if sorted(updatenode.items()) == sorted(current.items()):
|
||||
printer.info("Nothing to do here")
|
||||
return
|
||||
self.app.services.nodes.update_node(matches[0], updatenode)
|
||||
printer.success(f"{args.data} edited successfully")
|
||||
else:
|
||||
editcount = 0
|
||||
for k in matches:
|
||||
updated_item = self.app.services.nodes.explode_unique(k)
|
||||
updated_item["type"] = "connection"
|
||||
updated_item.update(node_details[k])
|
||||
|
||||
this_item_changed = False
|
||||
for key, should_edit in edits.items():
|
||||
if should_edit:
|
||||
this_item_changed = True
|
||||
updated_item[key] = updatenode[key]
|
||||
|
||||
if this_item_changed:
|
||||
editcount += 1
|
||||
self.app.services.nodes.update_node(k, updated_item)
|
||||
|
||||
if editcount == 0:
|
||||
printer.info("Nothing to do here")
|
||||
else:
|
||||
printer.success(f"{matches} edited successfully")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
@@ -0,0 +1,150 @@
import sys
import yaml
from .. import printer
from ..services.exceptions import ConnpyError


class PluginHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        try:
            # Route through app.services.plugins, which already dispatches to
            # the local PluginService or the gRPC PluginStub depending on the
            # configured service mode.
            is_remote = getattr(args, "remote", False)
            if is_remote and self.app.services.mode != "remote":
                printer.error("Cannot use --remote flag when not running in remote mode.")
                return

            if args.add:
                self.app.services.plugins.add_plugin(args.add[0], args.add[1])
                printer.success(f"Plugin {args.add[0]} added successfully{' remotely' if is_remote else ''}.")
            elif args.update:
                self.app.services.plugins.add_plugin(args.update[0], args.update[1], update=True)
                printer.success(f"Plugin {args.update[0]} updated successfully{' remotely' if is_remote else ''}.")
            elif args.delete:
                self.app.services.plugins.delete_plugin(args.delete[0])
                printer.success(f"Plugin {args.delete[0]} deleted successfully{' remotely' if is_remote else ''}.")
            elif args.enable:
                name = args.enable[0]
                if is_remote:
                    self.app.plugins.preferences[name] = "remote"
                else:
                    if name in self.app.plugins.preferences:
                        del self.app.plugins.preferences[name]

                self.app.plugins._save_preferences(self.app.services.config_svc.get_default_dir())

                # Always try to enable it locally (remove .bkp) if it exists,
                # regardless of mode, to keep files consistent with the
                # "enabled" state.
                try:
                    # Use a local service instance to ensure we touch local files
                    from ..services.plugin_service import PluginService
                    local_svc = PluginService(self.app.services.config)
                    local_svc.enable_plugin(name)
                except Exception:
                    pass  # Ignore if not found locally or already enabled

                if is_remote and self.app.services.mode == "remote":
                    self.app.services.plugins.enable_plugin(name)

                printer.success(f"Plugin {name} enabled successfully{' remotely' if is_remote else ' locally'}.")
            elif args.disable:
                name = args.disable[0]
                success = False
                if is_remote:
                    if self.app.services.mode == "remote":
                        self.app.services.plugins.disable_plugin(name)
                        success = True
                else:
                    # Disable locally
                    from ..services.plugin_service import PluginService
                    local_svc = PluginService(self.app.services.config)
                    try:
                        if local_svc.disable_plugin(name):
                            success = True
                    except Exception as e:
                        printer.warning(f"Could not disable local plugin: {e}")

                if success:
                    printer.success(f"Plugin {name} disabled successfully{' remotely' if is_remote else ' locally'}.")

            # If a remote operation was performed, trigger a sync so the local
            # cache reflects the change immediately.
            if is_remote and self.app.services.mode == "remote":
                try:
                    import os
                    cache_dir = os.path.join(self.app.services.config_svc.get_default_dir(), "remote_plugins")
                    self.app.plugins._import_remote_plugins_to_argparse(
                        self.app.services.plugins,
                        self.app.subparsers,
                        cache_dir,
                        force_sync=True
                    )
                except Exception:
                    pass

            elif getattr(args, "sync", False):
                # The actual sync logic runs in connapp.py during init when the
                # --sync flag is detected in sys.argv.
                printer.success("Remote plugins synchronized successfully.")
            elif args.list:
                # Fetch both local and remote plugin lists when in remote mode
                local_plugins = {}
                remote_plugins = {}

                if self.app.services.mode == "remote":
                    # For local plugins, instantiate a local plugin service
                    # directly, bypassing the gRPC stub.
                    from ..services.plugin_service import PluginService
                    local_svc = PluginService(self.app.services.config)
                    local_plugins = local_svc.list_plugins()
                    remote_plugins = self.app.services.plugins.list_plugins()
                else:
                    local_plugins = self.app.services.plugins.list_plugins()

                from rich.table import Table

                table = Table(title="Available Plugins", show_header=True, header_style="bold cyan")
                table.add_column("Plugin", style="cyan")
                table.add_column("State", style="bold")
                table.add_column("Origin", style="magenta")

                # Populate local plugins
                for name, details in local_plugins.items():
                    state = "Disabled" if not details.get("enabled", True) else "Active"
                    color = "red" if state == "Disabled" else "green"

                    if self.app.services.mode == "remote" and state == "Active":
                        if self.app.plugins.preferences.get(name) == "remote":
                            state = "Shadowed (Override by Remote)"
                            color = "yellow"

                    table.add_row(name, f"[{color}]{state}[/{color}]", "Local")

                # Populate remote plugins
                if self.app.services.mode == "remote":
                    for name, details in remote_plugins.items():
                        state = "Disabled" if not details.get("enabled", True) else "Active"
                        color = "red" if state == "Disabled" else "green"

                        if state == "Active":
                            pref = self.app.plugins.preferences.get(name, "local")
                            # If the preference isn't remote and the plugin exists locally, local takes priority
                            if pref != "remote" and name in local_plugins:
                                state = "Shadowed (Override by Local)"
                                color = "yellow"

                        table.add_row(name, f"[{color}]{state}[/{color}]", "Remote")

                if not local_plugins and not remote_plugins:
                    printer.console.print(" No plugins found.")
                else:
                    printer.console.print(table)

        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)
@@ -0,0 +1,96 @@
import sys
import yaml
import inquirer

from .. import printer
from ..services.exceptions import ConnpyError, ProfileNotFoundError
from .forms import Forms


class ProfileHandler:
    def __init__(self, app):
        self.app = app
        self.forms = Forms(app)

    def dispatch(self, args):
        if not self.app.case:
            args.data[0] = args.data[0].lower()
        actions = {"add": self.add, "del": self.delete, "mod": self.modify, "show": self.show}
        return actions.get(args.action)(args)

    def delete(self, args):
        name = args.data[0]
        try:
            self.app.services.profiles.get_profile(name)
        except ProfileNotFoundError:
            printer.error(f"{name} not found")
            sys.exit(2)

        if name == "default":
            printer.error("Can't delete the default profile")
            sys.exit(6)

        question = [inquirer.Confirm("delete", message=f"Are you sure you want to delete {name}?")]
        confirm = inquirer.prompt(question)
        if confirm is None or not confirm["delete"]:
            sys.exit(7)

        try:
            self.app.services.profiles.delete_profile(name)
            printer.success(f"{name} deleted successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(8)

    def show(self, args):
        try:
            profile = self.app.services.profiles.get_profile(args.data[0])
            yaml_output = yaml.dump(profile, sort_keys=False, default_flow_style=False)
            printer.data(args.data[0], yaml_output)
        except ProfileNotFoundError:
            printer.error(f"{args.data[0]} not found")
            sys.exit(2)

    def add(self, args):
        name = args.data[0]
        if name in self.app.services.profiles.list_profiles():
            printer.error(f"Profile '{name}' already exists.")
            sys.exit(4)

        new_profile_data = self.forms.questions_profiles(name)
        if not new_profile_data:
            sys.exit(7)

        try:
            self.app.services.profiles.add_profile(name, new_profile_data)
            printer.success(f"{name} added successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def modify(self, args):
        name = args.data[0]
        try:
            profile = self.app.services.profiles.get_profile(name, resolve=False)
        except ProfileNotFoundError:
            printer.error(f"Profile '{name}' not found")
            sys.exit(2)

        old_profile = {"id": name, **profile}
        edits = self.forms.questions_edit()
        if edits is None:
            sys.exit(7)

        update_profile_data = self.forms.questions_profiles(name, edit=edits)
        if not update_profile_data:
            sys.exit(7)

        if sorted(update_profile_data.items()) == sorted(old_profile.items()):
            printer.info("Nothing to do here")
            return

        try:
            self.app.services.profiles.update_profile(name, update_profile_data)
            printer.success(f"{name} edited successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)
@@ -0,0 +1,120 @@
import os
import sys
import yaml
from rich.rule import Rule
from .. import printer
from ..services.exceptions import ConnpyError
from .help_text import get_instructions


class RunHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        if len(args.data) > 1:
            args.action = "noderun"
        actions = {"noderun": self.node_run, "generate": self.yaml_generate, "run": self.yaml_run}
        return actions.get(args.action)(args)

    def node_run(self, args):
        nodes_filter = args.data[0]
        commands = [" ".join(args.data[1:])]

        try:
            header_printed = False

            # Inline execution with streaming results
            def _on_node_complete(unique, node_output, node_status):
                nonlocal header_printed
                if not header_printed:
                    printer.console.print(Rule("OUTPUT", style="header"))
                    header_printed = True
                printer.node_panel(unique, node_output, node_status)

            self.app.services.execution.run_commands(
                nodes_filter=nodes_filter,
                commands=commands,
                on_node_complete=_on_node_complete
            )

        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def yaml_generate(self, args):
        if os.path.exists(args.data[0]):
            printer.error(f"File '{args.data[0]}' already exists.")
            sys.exit(14)
        else:
            with open(args.data[0], "w") as file:
                file.write(get_instructions("generate"))
            printer.success(f"File {args.data[0]} generated successfully")
            sys.exit()

    def yaml_run(self, args):
        path = args.data[0]
        try:
            with open(path, "r") as f:
                playbook = yaml.load(f, Loader=yaml.FullLoader)

            for task in playbook.get("tasks", []):
                self.cli_run(task)

        except Exception as e:
            printer.error(f"Failed to run playbook {path}: {e}")
            sys.exit(10)

    def cli_run(self, script):
        try:
            action = script["action"]
            nodelist = script["nodes"]
            commands = script["commands"]
            variables = script.get("variables")
            output_cfg = script["output"]
            name = script.get("name", "Task")
            options = script.get("options", {})
        except KeyError as e:
            printer.error(f"'{e.args[0]}' is mandatory in script")
            sys.exit(11)

        stdout = (output_cfg == "stdout")
        folder = output_cfg if output_cfg not in [None, "stdout"] else None
        prompt = options.get("prompt")
        printer.header(name.upper())

        try:
            if action == "run":
                # If output is stdout, stream results as they arrive
                on_complete = printer.node_panel if stdout else None
                results = self.app.services.execution.run_commands(
                    nodes_filter=nodelist,
                    commands=commands,
                    variables=variables,
                    parallel=options.get("parallel", 10),
                    timeout=options.get("timeout", 10),
                    folder=folder,
                    prompt=prompt,
                    on_node_complete=on_complete
                )
                # When not streaming, print each node's output once everything finishes
                if not stdout:
                    for unique, output in results.items():
                        printer.node_panel(unique, output, 0)

            elif action == "test":
                expected = script.get("expected", [])
                on_complete = printer.test_panel if stdout else None
                results = self.app.services.execution.test_commands(
                    nodes_filter=nodelist,
                    commands=commands,
                    expected=expected,
                    variables=variables,
                    parallel=options.get("parallel", 10),
                    timeout=options.get("timeout", 10),
                    prompt=prompt,
                    on_node_complete=on_complete
                )
                if not stdout:
                    printer.test_summary(results)

        except ConnpyError as e:
            printer.error(str(e))
@@ -0,0 +1,126 @@
import sys
import yaml
from .. import printer


class SyncHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        action = getattr(args, "action", None)
        actions = {
            "login": self.login,
            "logout": self.logout,
            "status": self.status,
            "list": self.list_backups,
            "once": self.once,
            "restore": self.restore,
            "start": self.start,
            "stop": self.stop
        }
        handler = actions.get(action)
        if handler:
            return handler(args)

        return self.status(args)

    def login(self, args):
        self.app.services.sync.login()

    def logout(self, args):
        self.app.services.sync.logout()

    def status(self, args):
        status = self.app.services.sync.check_login_status()
        enabled = self.app.services.sync.sync_enabled
        remote = self.app.services.sync.sync_remote

        printer.info(f"Login Status: {status}")
        printer.info(f"Auto-Sync: {'Enabled' if enabled else 'Disabled'}")
        printer.info(f"Sync Remote Nodes: {'Yes' if remote else 'No'}")

    def list_backups(self, args):
        backups = self.app.services.sync.list_backups()
        if backups:
            yaml_output = yaml.dump(backups, sort_keys=False, default_flow_style=False)
            printer.custom("backups", "")
            print(yaml_output)
        else:
            printer.info("No backups found or not logged in.")

    def once(self, args):
        # Manual backup; include remote nodes when configured to do so
        remote_data = None
        if self.app.services.sync.sync_remote and self.app.services.mode == "remote":
            inventory = self.app.services.nodes.get_inventory()
            # Merge with local settings
            local_settings = self.app.services.config_svc.get_settings()
            local_settings.pop("configfolder", None)

            # Maintain the proper config structure: {config: {}, connections: {}, profiles: {}}
            remote_data = {
                "config": local_settings,
                "connections": inventory.get("connections", {}),
                "profiles": inventory.get("profiles", {})
            }

        if self.app.services.sync.compress_and_upload(remote_data):
            printer.success("Manual backup completed.")

    def restore(self, args):
        import inquirer
        file_id = getattr(args, "id", None)

        # Segmented flags
        restore_config = getattr(args, "restore_config", False)
        restore_nodes = getattr(args, "restore_nodes", False)

        # If neither is specified, restore everything (backwards compatibility)
        if not restore_config and not restore_nodes:
            restore_config = True
            restore_nodes = True

        # 1. Analyze what we are about to restore
        info = self.app.services.sync.analyze_backup_content(file_id)
        if not info:
            printer.error("Could not analyze backup content.")
            return

        # 2. Show detailed info
        printer.info("Restoration Details:")
        if restore_config:
            print("  - Local Settings: Yes")
            print(f"  - RSA Key (.osk): {'Yes' if info['has_key'] else 'No'}")
        if restore_nodes:
            target = "REMOTE" if self.app.services.mode == "remote" else "LOCAL"
            print(f"  - Nodes: {info['nodes']}")
            print(f"  - Folders: {info['folders']}")
            print(f"  - Profiles: {info['profiles']}")
            print(f"  - Destination: {target}")
        print("")

        questions = [inquirer.Confirm("confirm", message="Do you want to proceed with the restoration?", default=False)]
        answers = inquirer.prompt(questions)

        if not answers or not answers["confirm"]:
            printer.info("Restore cancelled.")
            return

        # 3. Perform the actual restore
        if self.app.services.sync.restore_backup(
            file_id=file_id,
            restore_config=restore_config,
            restore_nodes=restore_nodes,
            app_instance=self.app
        ):
            printer.success("Restore completed successfully.")

    def start(self, args):
        self.app.services.config_svc.update_setting("sync", True)
        self.app.services.sync.sync_enabled = True
        printer.success("Auto-sync enabled.")

    def stop(self, args):
        self.app.services.config_svc.update_setting("sync", False)
        self.app.services.sync.sync_enabled = False
        printer.success("Auto-sync disabled.")
@@ -0,0 +1,139 @@
import re
import ast
import inquirer


class Validators:
    def __init__(self, app):
        self.app = app

    def host_validation(self, answers, current, regex="^.+$"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def profile_protocol_validation(self, answers, current, regex="(^ssh$|^telnet$|^kubectl$|^docker$|^$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick between ssh, telnet, kubectl, docker or leave empty")
        return True

    def protocol_validation(self, answers, current, regex="(^ssh$|^telnet$|^kubectl$|^docker$|^$|^@.+$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick between ssh, telnet, kubectl, docker, @profile or leave empty")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def profile_port_validation(self, answers, current, regex="(^[0-9]*$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535 or leave empty")
        try:
            port = int(current)
        except ValueError:
            port = 0
        if current != "" and not 1 <= port <= 65535:
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535 or leave empty")
        return True

    def port_validation(self, answers, current, regex="(^[0-9]*$|^@.+$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535, @profile or leave empty")
        try:
            port = int(current)
        except ValueError:
            port = 0
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        elif current != "" and not 1 <= port <= 65535:
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535, @profile or leave empty")
        return True

    def pass_validation(self, answers, current, regex="(^@.+$)"):
        profiles = current.split(",")
        for i in profiles:
            if not re.match(regex, i) or i[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(i))
        return True

    def tags_validation(self, answers, current):
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        elif current != "":
            isdict = False
            try:
                isdict = ast.literal_eval(current)
            except Exception:
                pass
            if not isinstance(isdict, dict):
                raise inquirer.errors.ValidationError("", reason="Tags should be a python dictionary.")
        return True

    def profile_tags_validation(self, answers, current):
        if current != "":
            isdict = False
            try:
                isdict = ast.literal_eval(current)
            except Exception:
                pass
            if not isinstance(isdict, dict):
                raise inquirer.errors.ValidationError("", reason="Tags should be a python dictionary.")
        return True

    def jumphost_validation(self, answers, current):
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        elif current != "":
            if current not in self.app.nodes_list:
                raise inquirer.errors.ValidationError("", reason="Node {} doesn't exist.".format(current))
        return True

    def profile_jumphost_validation(self, answers, current):
        if current != "":
            if current not in self.app.nodes_list:
                raise inquirer.errors.ValidationError("", reason="Node {} doesn't exist.".format(current))
        return True

    def default_validation(self, answers, current):
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def bulk_node_validation(self, answers, current, regex="^[0-9a-zA-Z_.,$#-]+$"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Node list cannot be empty and only accepts letters, numbers and _.,$#-")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def bulk_folder_validation(self, answers, current):
        if not self.app.case:
            current = current.lower()

        candidate = current
        if "/" in current:
            candidate = current.split("/")[0]

        matches = list(filter(lambda k: k == candidate, self.app.folders))
        if current != "" and len(matches) == 0:
            raise inquirer.errors.ValidationError("", reason="Location {} doesn't exist".format(current))
        return True

    def bulk_host_validation(self, answers, current, regex="^.+$"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        hosts = current.split(",")
        nodes = answers["ids"].split(",")
        if len(hosts) > 1 and len(hosts) != len(nodes):
            raise inquirer.errors.ValidationError("", reason="Hosts list should be the same length as the nodes list")
        return True
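The tags validators above accept any string that `ast.literal_eval` parses into a Python dictionary. A minimal, self-contained sketch of that rule (the helper name `is_valid_tags` is hypothetical, not part of connpy):

```python
import ast

def is_valid_tags(raw: str) -> bool:
    """Return True when raw is empty or parses into a Python dict."""
    if raw == "":
        return True  # empty input is allowed
    try:
        value = ast.literal_eval(raw)
    except Exception:
        return False
    return isinstance(value, dict)

print(is_valid_tags("{'role': 'edge', 'site': 'nyc'}"))  # True
print(is_valid_tags("[1, 2, 3]"))                        # False: a list, not a dict
```

Using `ast.literal_eval` rather than `eval` keeps the check safe: it only evaluates Python literals, never arbitrary expressions.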
+278 -92
@@ -8,12 +8,16 @@ def load_txt_cache(filepath):
    except FileNotFoundError:
        return []

def get_cwd(words, option=None, folderonly=False):
    import glob
    # Expand tilde to home directory if present
    if words[-1].startswith("~"):
        words[-1] = os.path.expanduser(words[-1])

    # If option is not provided, try to infer it from the first word
    if option is None and words:
        option = words[0]

    if words[-1] == option:
        path = './*'
    else:
@@ -31,6 +35,21 @@ def _getcwd(words, option, folderonly=False):
def _get_plugins(which, defaultdir):
    # Path to core_plugins relative to this script
    core_path = os.path.dirname(os.path.realpath(__file__)) + "/core_plugins"
    remote_path = os.path.join(defaultdir, "remote_plugins")

    # Load plugin preferences (plugin name -> preferred origin). For
    # completion we don't need the service mode itself: the preferences
    # tell us which plugins resolve to the remote cache.
    import json
    pref_path = os.path.join(defaultdir, "plugin_preferences.json")
    try:
        with open(pref_path) as f:
            preferences = json.load(f)
    except Exception:
        preferences = {}

    def get_plugins_from_directory(directory):
        enabled_files = []
@@ -41,21 +60,38 @@ def _get_plugins(which, defaultdir):
        for file in os.listdir(directory):
            # Check if the file is a Python file
            if file.endswith('.py'):
                name = os.path.splitext(file)[0]
                enabled_files.append(name)
                all_plugins[name] = os.path.join(directory, file)
            # Check if the file is a Python backup file
            elif file.endswith('.py.bkp'):
                name = os.path.splitext(os.path.splitext(file)[0])[0]
                disabled_files.append(name)
        return enabled_files, disabled_files, all_plugins

    # Get plugins from all directories
    user_enabled, user_disabled, user_all_plugins = get_plugins_from_directory(defaultdir + "/plugins")
    core_enabled, core_disabled, core_all_plugins = get_plugins_from_directory(core_path)
    remote_enabled, remote_disabled, remote_all_plugins = get_plugins_from_directory(remote_path)

    # Calculate final paths respecting priorities and preferences.
    # Priority: User Local > Core Local > Remote (unless preferred)

    # Start with core
    final_all_plugins = core_all_plugins.copy()
    # Override with user local
    final_all_plugins.update(user_all_plugins)

    # Remote plugins are only used if:
    # 1. They don't exist locally, OR
    # 2. The preference is explicitly 'remote'
    for name, path in remote_all_plugins.items():
        if name not in final_all_plugins or preferences.get(name) == "remote":
            final_all_plugins[name] = path

    # Combine enabled/disabled for the helper commands
    enabled_files = list(set(user_enabled + core_enabled + [k for k, v in remote_all_plugins.items() if preferences.get(k) == "remote"]))
    disabled_files = list(set(user_disabled + core_disabled))

    # Return based on the command
    if which == "--disable":
@@ -66,7 +102,195 @@ def _get_plugins(which, defaultdir):
|
||||
all_files = enabled_files + disabled_files
|
||||
return all_files
|
||||
elif which == "all":
|
||||
return all_plugins
|
||||
return final_all_plugins
|
||||
|
||||
|
||||
def _build_tree(nodes, folders, profiles, plugins, configdir):
|
||||
"""Build the declarative CLI navigation tree.
|
||||
|
||||
Structure:
|
||||
- dict: keys are completions + subnavigation.
|
||||
"__extra__" adds dynamic data.
|
||||
"__exclude_used__" filters already-typed words.
|
||||
"*" absorbs unknown positional words and loops to a specific node.
|
||||
- list: static choice completions.
|
||||
- callable: dynamic completions (called with `words`, returns list).
|
||||
- None: no further completions.
|
||||
"""
|
||||
_nodes = lambda w=None: list(nodes)
|
||||
_folders = lambda w=None: list(folders)
|
||||
_profiles = lambda w=None: list(profiles)
|
||||
_nodes_folders = lambda w=None: list(nodes) + list(folders)
|
||||
|
||||
_profile_values = {"__extra__": _profiles}
|
||||
|
||||
# --- Stateful/Looping Nodes ---
|
||||
|
||||
# list nodes
|
||||
list_nodes = {"__exclude_used__": True}
|
||||
list_nodes.update({
|
||||
"--format": {"*": list_nodes},
|
||||
"--filter": {"*": list_nodes},
|
||||
"*": list_nodes
|
||||
})
|
||||
|
||||
# export / import / run loops
|
||||
export_dict = {"--help": None, "-h": None}
|
||||
export_dict.update({
|
||||
"*": export_dict,
|
||||
"__extra__": lambda w: get_cwd(w, "export", True) + [f for f in folders if not any(x in f for x in w[1:-1])]
|
||||
})
|
||||
|
||||
import_dict = {"--help": None, "-h": None}
|
||||
import_dict.update({
|
||||
"*": import_dict,
|
||||
"__extra__": lambda w: get_cwd(w, "import")
|
||||
})
|
||||
|
||||
run_dict = {"--generate": None, "--help": None, "-g": None, "-h": None}
|
||||
run_dict.update({
|
||||
"*": run_dict,
|
||||
"__extra__": lambda w: get_cwd(w, "run") + list(nodes)
|
||||
})
|
||||
|
||||
# State Machine Definitions
|
||||
ai_dict = {"__exclude_used__": True, "--help": None, "-h": None}
|
||||
for opt in ["--engineer-model", "--engineer-api-key", "--architect-model", "--architect-api-key"]:
|
||||
ai_dict[opt] = {"*": ai_dict} # takes value, loops back
|
||||
for opt in ["--debug", "--trust", "--list", "--list-sessions", "--session", "--resume", "--delete", "--delete-session", "-y"]:
|
||||
ai_dict[opt] = ai_dict # takes no value, loops back
|
||||
ai_dict["*"] = ai_dict
|
||||
|
||||
mv_state = {"__extra__": _nodes, "--help": None, "-h": None}
|
||||
cp_state = {"__extra__": _nodes, "--help": None, "-h": None}
|
||||
ls_state = {
|
||||
"profiles": None,
|
||||
"nodes": list_nodes,
|
||||
"folders": None,
|
||||
}
|
||||
|
||||
# --- Main Tree ---
|
||||
return {
|
||||
"__extra__": lambda w: list(nodes) + list(folders) + (list(plugins.keys()) if plugins else []),
|
||||
|
||||
"--add": {"profile": _profile_values},
|
||||
"--del": {"profile": _profile_values, "__extra__": _nodes_folders},
|
||||
"--rm": {"profile": _profile_values, "__extra__": _nodes_folders},
|
||||
"--edit": {"profile": _profile_values, "__extra__": _nodes},
|
||||
"--mod": {"profile": _profile_values, "__extra__": _nodes},
|
||||
"--show": {"profile": _profile_values, "__extra__": _nodes},
|
        "--help": None,

        "-a": {"profile": _profile_values},
        "-r": {"profile": _profile_values, "__extra__": _nodes_folders},
        "-e": {"profile": _profile_values, "__extra__": _nodes},
        "-s": {"profile": _profile_values, "__extra__": _nodes},

        "profile": {
            "--add": None, "--rm": _profiles, "--del": _profiles,
            "--edit": _profiles, "--mod": _profiles, "--show": _profiles,
            "--help": None,
            "-a": None, "-r": _profiles, "-e": _profiles, "-s": _profiles, "-h": None,
        },
        "move": mv_state,
        "mv": mv_state,
        "copy": cp_state,
        "cp": cp_state,

        "list": ls_state,
        "ls": ls_state,

        "bulk": {"--file": None, "--help": None, "-f": None, "-h": None},
        "run": run_dict,
        "export": export_dict,
        "import": import_dict,
        "ai": ai_dict,

        "api": {
            "--start": None, "--restart": None, "--stop": None, "--debug": None,
            "--help": None,
            "-s": None, "-r": None, "-x": None, "-d": None, "-h": None,
        },
        "context": {
            "--add": None, "--rm": None, "--del": None,
            "--ls": None, "--set": None,
            "--show": None, "--edit": None, "--mod": None,
            "--help": None,
            "-a": None, "-r": None, "-s": None, "-e": None, "-h": None,
        },
        "plugin": {
            "--add": lambda w: get_cwd(w, "--add"),
            "--update": lambda w: get_cwd(w, "--update"),
            "--del": lambda w: _get_plugins("--del", configdir),
            "--enable": lambda w: _get_plugins("--enable", configdir),
            "--disable": lambda w: _get_plugins("--disable", configdir),
            "--list": None, "--help": None,
            "-h": None,
        },
        "config": {
            "--allow-uppercase": ["true", "false"],
            "--fzf": ["true", "false"],
            "--keepalive": None,
            "--completion": ["bash", "zsh"],
            "--fzf-wrapper": ["bash", "zsh"],
            "--configfolder": lambda w: get_cwd(w, "--configfolder", True),
            "--engineer-model": None, "--engineer-api-key": None,
            "--architect-model": None, "--architect-api-key": None,
            "--theme": None,
            "--service-mode": ["local", "remote"],
            "--remote": None,
            "--sync-remote": ["true", "false"],
            "--trusted-commands": None,
            "--help": None, "-h": None,
        },
        "sync": {
            "--login": None, "--logout": None,
            "--status": None, "--list": None,
            "--once": None, "--restore": None,
            "--start": None, "--stop": None,
            "--id": None, "--nodes": None, "--config": None,
            "--help": None, "-h": None,
        },
    }

def resolve_completion(words, tree):
    """Navigate the tree following typed words, properly handling dynamic state loops."""
    current = tree
    for word in words[:-1]:
        if isinstance(current, dict):
            if word in current:
                current = current[word]
            elif "*" in current:
                current = current["*"]
            else:
                return []
        else:
            return []

    results = []
    if isinstance(current, dict):
        results = [k for k in current
                   if not k.startswith("__")
                   and not k.startswith("*")
                   and not (len(k) == 2 and k in ["mv", "cp", "ls"])
                   and not (len(k) == 2 and k[0] == "-" and k[1] != "-")]

        if current.get("__exclude_used__"):
            results = [r for r in results if r not in words[:-1]]

        extra = current.get("__extra__")
        if callable(extra):
            results.extend(extra(words))
        elif isinstance(extra, list):
            results.extend(extra)
    elif isinstance(current, list):
        results = list(current)
    elif callable(current):
        results = list(current(words))

    return results
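As a quick sanity check, the resolver can be exercised against a toy tree. The tree shape below is illustrative, not the real `_build_tree` output, and the copy of the function is simplified (the `mv`/`cp`/`ls` and `__exclude_used__` special cases are dropped):

```python
# Simplified copy of resolve_completion, for demonstration only.
def resolve_completion(words, tree):
    """Navigate the tree following typed words; the last word is the one being completed."""
    current = tree
    for word in words[:-1]:
        if isinstance(current, dict):
            if word in current:
                current = current[word]
            elif "*" in current:
                current = current["*"]
            else:
                return []
        else:
            return []
    results = []
    if isinstance(current, dict):
        # Hide meta keys ("__extra__"), wildcards and short flags from the listing.
        results = [k for k in current
                   if not k.startswith("__")
                   and not k.startswith("*")
                   and not (len(k) == 2 and k[0] == "-" and k[1] != "-")]
        extra = current.get("__extra__")
        if callable(extra):
            results.extend(extra(words))
        elif isinstance(extra, list):
            results.extend(extra)
    elif isinstance(current, list):
        results = list(current)
    elif callable(current):
        results = list(current(words))
    return results

# Illustrative tree: values are None (terminal), dicts (subcommands),
# lists (static choices) or callables (dynamic choices).
tree = {
    "profile": {"--add": None, "--rm": ["prod", "lab"], "-h": None},
    "run": {"--help": None, "__extra__": lambda w: ["node1", "node2"]},
}

print(resolve_completion(["profile", ""], tree))          # ['--add', '--rm']
print(resolve_completion(["profile", "--rm", ""], tree))  # ['prod', 'lab']
print(resolve_completion(["run", ""], tree))              # ['--help', 'node1', 'node2']
```

Note how `-h` is suppressed (short flag), `__extra__` is hidden as a key but its values are appended, and a list node short-circuits to its own items.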


def main():
    home = os.path.expanduser("~")
@@ -82,7 +306,7 @@ def main():
    nodes = load_txt_cache(configdir + '/.fzf_nodes_cache.txt')
    folders = load_txt_cache(configdir + '/.folders_cache.txt')
    profiles = load_txt_cache(configdir + '/.profiles_cache.txt')
    plugins = _get_plugins("all", defaultdir)
    plugins = _get_plugins("all", configdir)

    info = {}
    info["config"] = None
@@ -97,100 +321,62 @@ def main():
    positions = [1,3]
    wordsnumber = int(sys.argv[positions[0]])
    words = sys.argv[positions[1]:]
    if wordsnumber == 2:
        strings=["--add", "--del", "--rm", "--edit", "--mod", "--show", "mv", "move", "ls", "list", "cp", "copy", "profile", "run", "bulk", "config", "api", "ai", "export", "import", "--help", "plugin"]
        if plugins:
            strings.extend(plugins.keys())
        strings.extend(nodes)
        strings.extend(folders)

    elif wordsnumber >=3 and words[0] in plugins.keys():
        import json
    # --- Plugin completion ---
    # Try new tree API first: _connpy_tree integrates into the main tree.
    # Fall back to legacy _connpy_completion for older plugins.
    if wordsnumber >= 3 and plugins and words[0] in plugins:
        import importlib.util
        plugin_path = plugins[words[0]]
        try:
            with open(cachefile, "r") as jsonconf:
                info["config"] = json.load(jsonconf)
        except Exception:
            try:
                import yaml
                with open(configdir + '/config.yaml', "r") as yamlconf:
                    info["config"] = yaml.safe_load(yamlconf)
            except Exception:
                info["config"] = {}

        try:
            spec = importlib.util.spec_from_file_location("module.name", plugins[words[0]])
            spec = importlib.util.spec_from_file_location("module.name", plugin_path)
            module = importlib.util.module_from_spec(spec)
            spec.loader.exec_module(module)
            plugin_completion = getattr(module, "_connpy_completion")
            strings = plugin_completion(wordsnumber, words, info)
            module.get_cwd = get_cwd
        except Exception:
            exit()
    elif wordsnumber >= 3 and words[0] == "ai":
        if wordsnumber == 3:
            strings = ["--help", "--engineer-model", "--engineer-api-key", "--architect-model", "--architect-api-key", "--debug"]

        # New API: _connpy_tree → integrate into main tree and use resolver
        if hasattr(module, "_connpy_tree"):
            plugin_node = module._connpy_tree(info)
            tree = _build_tree(nodes, folders, profiles, plugins, configdir)
            tree[words[0]] = plugin_node
            strings = resolve_completion(words, tree)

        # Legacy API: _connpy_completion → delegate entirely
        elif hasattr(module, "_connpy_completion"):
            import json
            try:
                with open(cachefile, "r") as jsonconf:
                    info["config"] = json.load(jsonconf)
            except Exception:
                try:
                    import yaml
                    with open(configdir + '/config.yaml', "r") as yamlconf:
                        info["config"] = yaml.safe_load(yamlconf)
                except Exception:
                    info["config"] = {}
            try:
                plugin_completion = getattr(module, "_connpy_completion")
                strings = plugin_completion(wordsnumber, words, info)
            except Exception:
                exit()
        else:
            strings = ["--engineer-model", "--engineer-api-key", "--architect-model", "--architect-api-key", "--debug"]
    elif wordsnumber == 3:
        strings=[]
        if words[0] == "profile":
            strings=["--add", "--rm", "--del", "--edit", "--mod", "--show", "--help"]
        if words[0] == "config":
            strings=["--allow-uppercase", "--keepalive", "--completion", "--fzf", "--configfolder", "--engineer-model", "--engineer-api-key", "--architect-model", "--architect-api-key", "--help"]
        if words[0] == "api":
            strings=["--start", "--stop", "--restart", "--debug", "--help"]
        if words[0] in ["--mod", "--edit", "-e", "--show", "-s", "--add", "-a", "--rm", "--del", "-r"]:
            strings=["profile"]
        if words[0] in ["list", "ls"]:
            strings=["profiles", "nodes", "folders"]
        if words[0] in ["bulk", "mv", "cp", "copy"]:
            strings=["--help"]
        if words[0] in ["--rm", "--del", "-r"]:
            strings.extend(folders)
        if words[0] in ["--rm", "--del", "-r", "--mod", "--edit", "-e", "--show", "-s", "mv", "move", "cp", "copy"]:
            strings.extend(nodes)
        if words[0] == "plugin":
            strings = ["--help", "--add", "--update", "--del", "--enable", "--disable", "--list"]
        if words[0] in ["run", "import", "export"]:
            strings = ["--help"]
            if words[0] == "export":
                pathstrings = _getcwd(words, words[0], True)
            else:
                pathstrings = _getcwd(words, words[0])
            strings.extend(pathstrings)
            if words[0] == "run":
                strings.extend(nodes)
        exit()

    elif wordsnumber >= 4 and words[0] == "export" and words[1] != "--help":
        strings = [item for item in folders if not any(word in item for word in words[:-1])]

    elif wordsnumber >= 4 and words[0] in ["list", "ls"] and words[1] == "nodes":
        options = ["--format", "--filter"]
        strings = [item for item in options if not any(word in item for word in words[:-1])]

    elif wordsnumber == 4:
        strings=[]
        if words[0] == "profile" and words[1] in ["--rm", "--del", "-r", "--mod", "--edit", "-e", "--show", "-s"]:
            strings.extend(profiles)
        if words[1] == "profile" and words[0] in ["--rm", "--del", "-r", "--mod", "--edit", "-e", "--show", "-s"]:
            strings.extend(profiles)
        if words[0] == "config" and words[1] == "--completion":
            strings=["bash", "zsh"]
        if words[0] == "config" and words[1] in ["--fzf", "--allow-uppercase"]:
            strings=["true", "false"]
        if words[0] == "config" and words[1] in ["--configfolder"]:
            strings=_getcwd(words,words[1],True)
        if words[0] == "plugin" and words[1] in ["--update", "--del", "--enable", "--disable"]:
            strings=_get_plugins(words[1], defaultdir)

    elif wordsnumber == 5 and words[0] == "plugin" and words[1] in ["--add", "--update"]:
        strings=_getcwd(words, words[2])
    # --- Tree-based completion ---
    else:
        exit()
        tree = _build_tree(nodes, folders, profiles, plugins, configdir)
        strings = resolve_completion(words, tree)

    current_word = words[-1] if len(words) > 0 else ""
    matches = [s for s in strings if s.startswith(current_word)]

    if app == "bash":
        strings = [s if s.endswith('/') else f"'{s} '" for s in strings]
        strings = [s if s.endswith('/') else f"'{s} '" for s in matches]
    else:
        strings = matches

    print('\t'.join(strings))

if __name__ == '__main__':

+20 -13
@@ -3,6 +3,7 @@
import json
import os
import re
import sys
import yaml
import shutil
from Crypto.PublicKey import RSA
@@ -12,9 +13,9 @@ from copy import deepcopy
from .hooks import MethodHook, ClassHook
from . import printer


#functions and classes
class NoAliasDumper(yaml.SafeDumper):
    def ignore_aliases(self, data):
        return True

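`NoAliasDumper` exists because PyYAML emits anchors and aliases (`&id001` / `*id001`) whenever the same object appears more than once in the dumped tree, which makes the config file harder to read and to hand-edit. A minimal illustration of what overriding `ignore_aliases` changes:

```python
import yaml

class NoAliasDumper(yaml.SafeDumper):
    def ignore_aliases(self, data):
        return True

shared = {"host": "10.0.0.1"}
config = {"node_a": shared, "node_b": shared}  # the same dict referenced twice

with_aliases = yaml.dump(config, default_flow_style=False)
without_aliases = yaml.dump(config, Dumper=NoAliasDumper, default_flow_style=False)

print(with_aliases)     # node_b becomes an alias: "node_b: *id001"
print(without_aliases)  # both nodes are written out in full
```

Both outputs round-trip to the same data through `yaml.safe_load`; only the on-disk representation differs.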
@ClassHook
class configfile:
@@ -95,7 +96,7 @@ class configfile:
            printer.warning(f"Legacy config {legacy_file} has invalid structure, skipping migration.")
        else:
            with open(self.file, 'w') as f:
                yaml.dump(old_data, f, default_flow_style=False, sort_keys=False)
                yaml.dump(old_data, f, Dumper=NoAliasDumper, default_flow_style=False, sort_keys=False)
            # Verify the written YAML can be read back correctly
            with open(self.file, 'r') as f:
                verify = yaml.safe_load(f)
@@ -173,7 +174,7 @@ class configfile:
            if self._validate_config(data):
                # Re-write the YAML from good cache
                with open(conf, 'w') as f:
                    yaml.dump(data, f, default_flow_style=False, sort_keys=False)
                    yaml.dump(data, f, Dumper=NoAliasDumper, default_flow_style=False, sort_keys=False)
                return data
        # Both broken or no cache - create fresh
        printer.error("Config file is corrupt and no valid cache exists. Creating default config.")
@@ -202,7 +203,7 @@ class configfile:
        #Create config file (always writes defaults, safe for recovery)
        defaultconfig = {'config': {'case': False, 'idletime': 30, 'fzf': False}, 'connections': {}, 'profiles': { "default": { "host":"", "protocol":"ssh", "port":"", "user":"", "password":"", "options":"", "logs":"", "tags": "", "jumphost":""}}}
        with open(conf, "w") as f:
            yaml.dump(defaultconfig, f, default_flow_style=False, sort_keys=False)
            yaml.dump(defaultconfig, f, Dumper=NoAliasDumper, default_flow_style=False, sort_keys=False)
        os.chmod(conf, 0o600)
        try:
            with open(self.cachefile, 'w') as f:
@@ -221,7 +222,7 @@ class configfile:
        tmpfile = conf + '.tmp'
        try:
            with open(tmpfile, "w") as f:
                yaml.dump(newconfig, f, default_flow_style=False, sort_keys=False)
                yaml.dump(newconfig, f, Dumper=NoAliasDumper, default_flow_style=False, sort_keys=False)
            # Atomic replace: only overwrite original if write succeeded
            shutil.move(tmpfile, conf)
            with open(self.cachefile, "w") as f:
@@ -238,11 +239,14 @@ class configfile:
                return 1
        return 0

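The save path above (dump to `conf + '.tmp'`, then `shutil.move`) is the standard atomic-save pattern: the real file is only replaced after the full dump succeeds, so a crash mid-write cannot leave a truncated config behind. A minimal standalone sketch of the same idea, with JSON standing in for the YAML payload:

```python
import json
import os
import shutil
import tempfile

def atomic_save(path, data):
    """Write data to path via a temp file so a failed dump never corrupts the original."""
    tmpfile = path + ".tmp"
    with open(tmpfile, "w") as f:
        json.dump(data, f)
    # Atomic replace: only overwrite the original if the write succeeded.
    shutil.move(tmpfile, path)

workdir = tempfile.mkdtemp()
conf = os.path.join(workdir, "config.json")
atomic_save(conf, {"config": {"fzf": False}})
with open(conf) as f:
    loaded = json.load(f)
print(loaded)  # {'config': {'fzf': False}}
```

If `json.dump` raises, the original file at `path` is untouched and only the `.tmp` file is left behind.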
    def _generate_nodes_cache(self):
    def _generate_nodes_cache(self, nodes=None, folders=None, profiles=None):
        try:
            nodes = self._getallnodes()
            folders = self._getallfolders()
            profiles = list(self.profiles.keys())
            if nodes is None:
                nodes = self._getallnodes()
            if folders is None:
                folders = self._getallfolders()
            if profiles is None:
                profiles = list(self.profiles.keys())

            with open(self.fzf_cachefile, "w") as f:
                f.write("\n".join(nodes))
@@ -253,6 +257,7 @@ class configfile:
        except Exception:
            pass


    def _createkey(self, keyfile):
        #Create key file
        key = RSA.generate(2048)
@@ -487,7 +492,8 @@ class configfile:
        elif isinstance(filter, list):
            nodes = [item for item in nodes if any(re.search(pattern, item) for pattern in filter)]
        else:
            raise ValueError("filter must be a string or a list of strings")
            printer.error("Invalid filter: must be a string or a list of strings.")
            sys.exit(1)
        return nodes

    @MethodHook
@@ -512,7 +518,8 @@ class configfile:
            filter = ["^(?!.*@).+$" if item == "@" else item for item in filter]
            nodes = {k: v for k, v in nodes.items() if any(re.search(pattern, k) for pattern in filter)}
        else:
            raise ValueError("filter must be a string or a list of strings")
            printer.error("Invalid filter: must be a string or a list of strings.")
            sys.exit(1)
        if extract:
            for node, keys in nodes.items():
                for key, value in keys.items():

+370 -1613
File diff suppressed because it is too large
+76 -22
@@ -13,8 +13,9 @@ import threading
from pathlib import Path
from copy import deepcopy
from .hooks import ClassHook, MethodHook
from . import printer
import io
from . import printer


#functions and classes
@ClassHook
@@ -99,6 +100,8 @@ class node:
                profile = re.search("^@(.*)", password[i])
                if profile and config != '':
                    self.password.append(config.profiles[profile.group(1)]["password"])
                else:
                    self.password.append(password[i])
        else:
            self.password = [password]
        if self.jumphost != "" and config != '':
@@ -121,6 +124,8 @@ class node:
                    profile = re.search("^@(.*)", self.jumphost["password"][i])
                    if profile:
                        jumphost_password.append(config.profiles[profile.group(1)]["password"])
                    else:
                        jumphost_password.append(self.jumphost["password"][i])
                self.jumphost["password"] = jumphost_password
            else:
                self.jumphost["password"] = [self.jumphost["password"]]
@@ -159,7 +164,9 @@ class node:
                decrypted = decryptor.decrypt(ast.literal_eval(passwd)).decode("utf-8")
                dpass.append(decrypted)
            except Exception:
                raise ValueError("Missing or corrupted key")
                printer.error("Decryption failed: Missing or corrupted key.")
                printer.info("Verify your RSA key and configuration settings.")
                sys.exit(1)
        return dpass

@@ -242,7 +249,7 @@ class node:

    @MethodHook
    def interact(self, debug = False):
    def interact(self, debug = False, logger = None):
        '''
        Allow user to interact with the node directly, mostly used by connection manager.

@@ -250,12 +257,15 @@ class node:

        - debug (bool): If True, display all the connecting information
                        before interact. Default False.
        - logger (callable): Optional callback for status reporting.
        '''
        connect = self._connect(debug = debug)
        connect = self._connect(debug = debug, logger = logger)
        if connect == True:
            size = re.search('columns=([0-9]+).*lines=([0-9]+)',str(os.get_terminal_size()))
            self.child.setwinsize(int(size.group(2)),int(size.group(1)))
            printer.success("Connected to " + self.unique + " at " + self.host + (":" if self.port != '' else '') + self.port + " via: " + self.protocol)
            if logger:
                logger("success", "Connected to " + self.unique + " at " + self.host + (":" if self.port != '' else '') + self.port + " via: " + self.protocol)

            if 'logfile' in dir(self):
                # Initialize self.mylog
                if not 'mylog' in dir(self):
@@ -280,14 +290,19 @@ class node:
                    f.write(self._logclean(self.mylog.getvalue().decode(), True))

        else:
            printer.error(connect)
            exit(1)
            if logger:
                logger("error", str(connect))
            else:
                printer.error(f"Connection failed: {str(connect)}")
            sys.exit(1)

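The new `logger` parameter threads a `(level, message)` callback through `interact`, `run`, `test` and `_connect`, so a remote caller (for instance the new service layer) can receive status events instead of terminal output. A minimal sketch of a collector that could be passed as `logger`; the node call itself is simulated here, it is not the real `_connect`:

```python
class EventCollector:
    """Collects (level, message) events emitted through the logger callback."""
    def __init__(self):
        self.events = []

    def __call__(self, level, message):
        self.events.append((level, message))

def fake_connect(logger=None):
    # Stand-in for node._connect/run: emits events instead of printing.
    if logger:
        logger("debug", "Command:\nssh user@host")
        logger("success", "Connected to node1 at 10.0.0.1 via: ssh")
    return True

log = EventCollector()
fake_connect(logger=log)
print(log.events[-1])  # ('success', 'Connected to node1 at 10.0.0.1 via: ssh')
```

Because `logger` is just any callable, the same node code works unchanged whether the events go to a gRPC stream, a log file, or nowhere (`logger=None`).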
    @MethodHook
    def run(self, commands, vars = None,*, folder = '', prompt = r'>$|#$|\$$|>.$|#.$|\$.$', stdout = False, timeout = 10):
    def run(self, commands, vars = None,*, folder = '', prompt = r'>$|#$|\$$|>.$|#.$|\$.$', stdout = False, timeout = 10, logger = None):
        '''
        Run a command or list of commands on the node and return the output.


        ### Parameters:

        - commands (str/list): Commands to run on the node. Should be
@@ -324,9 +339,12 @@ class node:
            str: Output of the commands you ran on the node.

        '''
        connect = self._connect(timeout = timeout)
        connect = self._connect(timeout = timeout, logger = logger)
        now = datetime.datetime.now().strftime('%Y-%m-%d_%H%M%S')
        if connect == True:
            if logger:
                logger("success", "Connected to " + self.unique + " at " + self.host + (":" if self.port != '' else '') + self.port + " via: " + self.protocol)

            # Attempt to set the terminal size
            try:
                self.child.setwinsize(65535, 65535)
@@ -338,6 +356,7 @@ class node:
            if "prompt" in self.tags:
                prompt = self.tags["prompt"]
            expects = [prompt, pexpect.EOF, pexpect.TIMEOUT]

            output = ''
            status = ''
            if not isinstance(commands, list):
@@ -357,8 +376,8 @@ class node:
            result = self.child.expect(expects, timeout = timeout)
            self.child.close()
            output = self._logclean(self.mylog.getvalue().decode(), True)
            if stdout == True:
                print(output)
            if logger:
                logger("output", output)
            if folder != '':
                with open(folder + "/" + self.unique + "_" + now + ".txt", "w") as f:
                    f.write(output)
@@ -372,19 +391,21 @@ class node:
        else:
            self.output = connect
            self.status = 1
            if stdout == True:
                print(connect)
            if logger:
                logger("error", f"Connection failed: {connect}")
            if folder != '':
                with open(folder + "/" + self.unique + "_" + now + ".txt", "w") as f:
                    f.write(connect)

                f.close()
            return connect

    @MethodHook
    def test(self, commands, expected, vars = None,*, prompt = r'>$|#$|\$$|>.$|#.$|\$.$', timeout = 10):
    def test(self, commands, expected, vars = None,*, prompt = r'>$|#$|\$$|>.$|#.$|\$.$', timeout = 10, logger = None):
        '''
        Run a command or list of commands on the node, then check if expected value appears on the output after the last command.


        ### Parameters:

        - commands (str/list): Commands to run on the node. Should be
@@ -420,8 +441,11 @@ class node:
            false if prompt is found before.

        '''
        connect = self._connect(timeout = timeout)
        connect = self._connect(timeout = timeout, logger = logger)
        if connect == True:
            if logger:
                logger("success", "Connected to " + self.unique + " at " + self.host + (":" if self.port != '' else '') + self.port + " via: " + self.protocol)

            # Attempt to set the terminal size
            try:
                self.child.setwinsize(65535, 65535)
@@ -536,12 +560,14 @@ class node:
        elif self.protocol == "docker":
            return self._generate_docker_cmd()
        else:
            raise ValueError(f"Invalid protocol: {self.protocol}")
            printer.error(f"Invalid protocol: {self.protocol}")
            sys.exit(1)

    @MethodHook
    def _connect(self, debug=False, timeout=10, max_attempts=3):
    def _connect(self, debug=False, timeout=10, max_attempts=3, logger=None):

        cmd = self._get_cmd()
        passwords = self._passtx(self.password) if self.password[0] else []
        passwords = self._passtx(self.password) if self.password and any(self.password) else []
        if self.logs != '':
            self.logfile = self._logfile()
        default_prompt = r'>$|#$|\$$|>.$|#.$|\$.$'
@@ -586,10 +612,12 @@ class node:
        if isinstance(self.tags, dict) and self.tags.get("console"):
            child.sendline()
        if debug:
            printer.debug(f"Command:\n{cmd}")
        if logger:
            logger("debug", f"Command:\n{cmd}")
        self.mylog = io.BytesIO()
        child.logfile_read = self.mylog


        endloop = False
        for i in range(len(passwords) if passwords else 1):
            while True:
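The guard change in `_connect` (from `self.password[0]` to `self.password and any(self.password)`) matters because `password` is normalized to a list that may be empty: indexing `[0]` on an empty list raises `IndexError`, and the new guard also treats a list of empty strings as "no passwords". A minimal sketch of the difference:

```python
def needs_passwords(password):
    # Old check was `password[0]`, which raises IndexError on an empty list.
    # New check: a truthy list with at least one non-empty entry.
    return bool(password and any(password))

print(needs_passwords(["secret"]))  # True
print(needs_passwords([""]))        # False
print(needs_passwords([]))          # False (the old expression would raise IndexError here)
```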
@@ -710,10 +738,11 @@ class nodes:


    @MethodHook
    def run(self, commands, vars = None,*, folder = None, prompt = None, stdout = None, parallel = 10, timeout = None, on_complete = None):
    def run(self, commands, vars = None,*, folder = None, prompt = None, stdout = None, parallel = 10, timeout = None, on_complete = None, logger = None):
        '''
        Run a command or list of commands on all the nodes in nodelist.


        ### Parameters:

        - commands (str/list): Commands to run on the nodes. Should be str or
@@ -792,11 +821,17 @@ class nodes:
                    nodesargs[n.unique]["vars"].update(vars["__global__"])
                if n.unique in vars.keys():
                    nodesargs[n.unique]["vars"].update(vars[n.unique])

            # Pass the logger to the node
            nodesargs[n.unique]["logger"] = logger

            if on_complete:
                tasks.append(threading.Thread(target=_run_node, args=(n, nodesargs[n.unique], on_complete)))
            else:
                tasks.append(threading.Thread(target=n.run, kwargs=nodesargs[n.unique]))

        taskslist = list(self._splitlist(tasks, parallel))

        for t in taskslist:
            for i in t:
                i.start()
@@ -810,10 +845,11 @@ class nodes:
        return output

    @MethodHook
    def test(self, commands, expected, vars = None,*, prompt = None, parallel = 10, timeout = None):
    def test(self, commands, expected, vars = None,*, prompt = None, parallel = 10, timeout = None, on_complete = None, logger = None):
        '''
        Run a command or list of commands on all the nodes in nodelist, then check if expected value appears on the output after the last command.


        ### Parameters:

        - commands (str/list): Commands to run on the node. Should be str or
@@ -848,6 +884,11 @@ class nodes:
        - timeout (int): Time in seconds for expect to wait for prompt/EOF.
                         default 10.

        - on_complete (callable): Optional callback called when each node
                                  finishes. Receives (unique, output, status).
                                  Called from the node's thread so it must
                                  be thread-safe.

        ### Returns:

            dict: Dictionary formed by nodes unique as keys, value is True if
@@ -867,6 +908,13 @@ class nodes:
        result = {}
        status = {}
        tasks = []

        def _test_node(node_obj, node_args, callback):
            """Wrapper that runs a node test and fires the callback on completion."""
            node_obj.test(**node_args)
            if callback:
                callback(node_obj.unique, node_obj.output, node_obj.status, node_obj.result)

        for n in self.nodelist:
            nodesargs[n.unique] = deepcopy(args)
            if vars != None:
@@ -875,7 +923,13 @@ class nodes:
                    nodesargs[n.unique]["vars"].update(vars["__global__"])
                if n.unique in vars.keys():
                    nodesargs[n.unique]["vars"].update(vars[n.unique])
            tasks.append(threading.Thread(target=n.test, kwargs=nodesargs[n.unique]))
            nodesargs[n.unique]["logger"] = logger

            if on_complete:
                tasks.append(threading.Thread(target=_test_node, args=(n, nodesargs[n.unique], on_complete)))
            else:
                tasks.append(threading.Thread(target=n.test, kwargs=nodesargs[n.unique]))

        taskslist = list(self._splitlist(tasks, parallel))
        for t in taskslist:
            for i in t:
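`nodes.run` and `nodes.test` cap concurrency by splitting the thread list into chunks of `parallel` via `_splitlist` and starting one batch at a time. The helper's body is not shown in this diff, so the chunking below is an assumed equivalent, not the actual `_splitlist` implementation:

```python
def splitlist(lst, size):
    """Yield successive chunks of at most `size` items (assumed equivalent of nodes._splitlist)."""
    for i in range(0, len(lst), size):
        yield lst[i:i + size]

tasks = list(range(7))      # stand-ins for threading.Thread objects
batches = list(splitlist(tasks, 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

With `parallel = 3`, seven node threads would run as batches of 3, 3 and 1, each batch joined before the next one starts.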
+360 -357
@@ -1,338 +1,5 @@
import argparse
import sys
import subprocess
import random
import socket
import time
import threading
from pexpect import TIMEOUT
from connpy import printer

class RemoteCapture:
    def __init__(self, connapp, node_name, interface, namespace=None, use_wireshark=False, tcpdump_filter=None, tcpdump_args=None):
        self.connapp = connapp
        self.node_name = node_name
        self.interface = interface
        self.namespace = namespace
        self.use_wireshark = use_wireshark
        self.tcpdump_filter = tcpdump_filter or []
        self.tcpdump_args = tcpdump_args if isinstance(tcpdump_args, list) else []

        if node_name.startswith("@"): # fuzzy match
            matches = [k for k in connapp.nodes_list if node_name in k]
        else:
            matches = [k for k in connapp.nodes_list if k.startswith(node_name)]

        if not matches:
            printer.error(f"Node '{node_name}' not found.")
            sys.exit(2)
        elif len(matches) > 1:
            matches[0] = connapp._choose(matches, "node", "capture")

        if matches[0] is None:
            sys.exit(7)

        node_data = connapp.config.getitem(matches[0])
        self.node = connapp.node(matches[0], **node_data, config=connapp.config)

        if self.node.protocol != "ssh":
            printer.error(f"Node '{self.node.unique}' must be an SSH connection.")
            sys.exit(2)

        self.wireshark_path = connapp.config.config.get("wireshark_path")

    def _start_local_listener(self, port, ws_proc=None):
        self.fake_connection = False
        self.listener_active = True
        self.listener_conn = None
        self.listener_connected = threading.Event()

        def listen():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
                s.bind(("localhost", port))
                s.listen(1)
                printer.start(f"Listening on localhost:{port}")

                conn, addr = s.accept()
                self.listener_conn = conn
                if not self.fake_connection:
                    printer.start(f"Connection from {addr}")
                self.listener_connected.set()

                try:
                    while self.listener_active:
                        data = conn.recv(4096)
                        if not data:
                            break

                        if self.use_wireshark and ws_proc:
                            try:
                                ws_proc.stdin.write(data)
                                ws_proc.stdin.flush()
                            except BrokenPipeError:
                                printer.info("Wireshark closed the pipe.")
                                break
                        else:
                            sys.stdout.buffer.write(data)
                            sys.stdout.buffer.flush()
                except Exception as e:
                    if isinstance(e, BrokenPipeError):
                        printer.info("Listener closed due to broken pipe.")
                    else:
                        printer.error(f"Listener error: {e}")
                finally:
                    conn.close()
                    self.listener_conn = None

        self.listener_thread = threading.Thread(target=listen)
        self.listener_thread.daemon = True
        self.listener_thread.start()

    def _is_port_in_use(self, port):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            return s.connect_ex(('localhost', port)) == 0

    def _find_free_port(self, start=20000, end=30000):
        for _ in range(10):
            port = random.randint(start, end)
            if not self._is_port_in_use(port):
                return port
        raise RuntimeError("No free port found for SSH tunnel.")

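`_find_free_port` draws random candidates in the 20000-30000 range and probes each one with a TCP connect; a `connect_ex` that fails means nothing is listening there, so the port is usable for the reverse SSH tunnel. A standalone sketch of the same probe, mirroring the logic shown above:

```python
import random
import socket

def is_port_in_use(port):
    # connect_ex returns 0 only if something accepted the connection.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex(("localhost", port)) == 0

def find_free_port(start=20000, end=30000, attempts=10):
    for _ in range(attempts):
        port = random.randint(start, end)
        if not is_port_in_use(port):
            return port
    raise RuntimeError("No free port found for SSH tunnel.")

port = find_free_port()
print(port)
```

Note the probe is inherently racy: another process can claim the port between the check and the eventual bind, which is why the tunnel is set up with `ExitOnForwardFailure=yes` so a collision fails loudly.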
    def _monitor_wireshark(self, ws_proc):
        try:
            while True:
                try:
                    ws_proc.wait(timeout=1)
                    self.listener_active = False
                    if self.listener_conn:
                        printer.info("Wireshark exited, stopping listener.")
                        try:
                            self.listener_conn.shutdown(socket.SHUT_RDWR)
                            self.listener_conn.close()
                        except Exception:
                            pass
                    break
                except subprocess.TimeoutExpired:
                    if not self.listener_active:
                        break
                    time.sleep(0.2)
        except Exception as e:
            printer.warning(f"Error in monitor_wireshark: {e}")

    def _detect_sudo_requirement(self):
        base_cmd = f"tcpdump -i {self.interface} -w - -U -c 1"
        if self.namespace:
            base_cmd = f"ip netns exec {self.namespace} {base_cmd}"

        cmds = [base_cmd, f"sudo {base_cmd}"]

        printer.info("Verifying sudo requirement")
        for cmd in cmds:
            try:
                self.node.child.sendline(cmd)
                start_time = time.time()
                while time.time() - start_time < 3:
                    try:
                        index = self.node.child.expect([
                            r'listening on',
                            r'permission denied',
                            r'cannot',
                            r'No such file or directory',
                        ], timeout=1)

                        if index == 0:
                            self.node.child.send("\x03")
                            return "sudo" in cmd
                        else:
                            break
                    except Exception:
                        continue

                self.node.child.send("\x03")
                time.sleep(0.5)
                try:
                    self.node.child.read_nonblocking(size=1024, timeout=0.5)
                except Exception:
                    pass

            except Exception as e:
                printer.warning(f"Error during sudo detection: {e}")
                continue

        printer.error(f"Failed to run tcpdump on remote node '{self.node.unique}'")
        sys.exit(4)

    def _monitor_capture_output(self):
        try:
            index = self.node.child.expect([
                r'Broken pipe',
                r'packet[s]? captured'
            ], timeout=None)
            if index == 0:
                printer.error("Tcpdump failed: Broken pipe.")
            else:
                printer.success("Tcpdump finished capturing packets.")

            self.listener_active = False
        except Exception:
            pass

    def _sendline_until_connected(self, cmd, retries=5, interval=2):
        for attempt in range(1, retries + 1):
            printer.info(f"Attempt {attempt}/{retries} to connect listener...")
            self.node.child.sendline(cmd)

            try:
                index = self.node.child.expect([
                    r'listening on',
                    TIMEOUT,
                    r'permission',
                    r'not permitted',
                    r'invalid',
                    r'unrecognized',
                    r'Unable',
                    r'No such',
                    r'illegal',
                    r'not found',
                    r'non-ether',
                    r'syntax error'
                ], timeout=5)

                if index == 0:

                    self.monitor_end = threading.Thread(target=self._monitor_capture_output)
                    self.monitor_end.daemon = True
                    self.monitor_end.start()

                    if self.listener_connected.wait(timeout=interval):
                        printer.success("Listener successfully received a connection.")
                        return True
                    else:
                        printer.warning("No connection yet. Retrying...")

                elif index == 1:
                    error = f"tcpdump did not respond within the expected time.\n" \
                            f"Command used:\n{cmd}\n" \
                            f"→ Please verify the command syntax."
                    return f"{error}"
                else:
                    before_last_line = self.node.child.before.decode().splitlines()[-1]
                    error = f"Tcpdump error detected: " \
                            f"{before_last_line}{self.node.child.after.decode()}{self.node.child.readline().decode()}".rstrip()
                    return f"{error}"

            except Exception as e:
                printer.warning(f"Unexpected error during tcpdump startup: {e}")
                return False

        return False

    def _build_tcpdump_command(self):
        base = f"tcpdump -i {self.interface}"
        if self.use_wireshark:
            base += " -w - -U"
        else:
            base += " -l"

        if self.namespace:
            base = f"ip netns exec {self.namespace} {base}"

        if self.requires_sudo:
            base = f"sudo {base}"

        if self.tcpdump_args:
            base += " " + " ".join(self.tcpdump_args)

        if self.tcpdump_filter:
            base += " " + " ".join(self.tcpdump_filter)

        base += f" | nc localhost {self.local_port}"
        return base

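`_build_tcpdump_command` composes the remote pipeline layer by layer: interface, output mode (raw pcap stream for Wireshark vs line-buffered text), optional network namespace, optional sudo, extra args, capture filter, and finally the netcat hop back through the tunnel. A standalone sketch of that assembly as a plain function (pure string building, nothing is executed):

```python
def build_tcpdump_command(interface, local_port, *, use_wireshark=False,
                          namespace=None, requires_sudo=False,
                          tcpdump_args=(), tcpdump_filter=()):
    base = f"tcpdump -i {interface}"
    base += " -w - -U" if use_wireshark else " -l"   # pcap stream vs line-buffered text
    if namespace:
        base = f"ip netns exec {namespace} {base}"
    if requires_sudo:
        base = f"sudo {base}"
    if tcpdump_args:
        base += " " + " ".join(tcpdump_args)
    if tcpdump_filter:
        base += " " + " ".join(tcpdump_filter)
    return base + f" | nc localhost {local_port}"

print(build_tcpdump_command("eth0", 20123, use_wireshark=True,
                            namespace="ns1", requires_sudo=True,
                            tcpdump_filter=("port", "443")))
# sudo ip netns exec ns1 tcpdump -i eth0 -w - -U port 443 | nc localhost 20123
```

The ordering matters: `sudo` must wrap the whole `ip netns exec ... tcpdump` invocation, while the `| nc` stage runs unprivileged on the remote shell.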
    def run(self):
        if self.use_wireshark:
            if not self.wireshark_path:
                printer.error("Wireshark path not set in config.\nUse '--set-wireshark-path /full/path/to/wireshark' to configure it.")
                sys.exit(1)

        self.local_port = self._find_free_port()
        self.node.options += f" -o ExitOnForwardFailure=yes -R {self.local_port}:localhost:{self.local_port}"

        connection = self.node._connect()
        if connection is not True:
            printer.error(f"Could not connect to {self.node.unique}\n{connection}")
            sys.exit(1)

        self.requires_sudo = self._detect_sudo_requirement()
        tcpdump_cmd = self._build_tcpdump_command()

        ws_proc = None
        monitor_thread = None

        if self.use_wireshark:

            printer.info(f"Live capture from {self.node.unique}:{self.interface}, launching Wireshark...")
            try:
                ws_proc = subprocess.Popen(
                    [self.wireshark_path, "-k", "-i", "-"],
                    stdin=subprocess.PIPE,
                    stderr=subprocess.PIPE
                )
            except Exception as e:
                printer.error(f"Failed to launch Wireshark: {e}\nMake sure the path is correct and Wireshark is installed.")
                exit(1)

            monitor_thread = threading.Thread(target=self._monitor_wireshark, args=(ws_proc,))
            monitor_thread.daemon = True
            monitor_thread.start()
        else:
            printer.info(f"Live text capture from {self.node.unique}:{self.interface}")
            printer.info("Press Ctrl+C to stop.\n")

        try:
            self._start_local_listener(self.local_port, ws_proc=ws_proc)
            time.sleep(1) # small delay before retry attempts

            result = self._sendline_until_connected(tcpdump_cmd, retries=5, interval=2)
            if result is not True:
                if isinstance(result, str):
                    printer.error(f"{result}")
                else:
                    printer.error("Listener connection failed after all retries.")
                    printer.debug(f"Command used:\n{tcpdump_cmd}")
                if not self.listener_conn:
                    try:
                        self.fake_connection = True
                        socket.create_connection(("localhost", self.local_port), timeout=1).close()
                    except OSError:
                        pass
                self.listener_active = False
|
||||
return
|
||||
|
||||
while self.listener_active:
|
||||
time.sleep(0.5)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
print("")
|
||||
printer.warning("Capture interrupted by user.")
|
||||
self.listener_active = False
|
||||
finally:
|
||||
if self.listener_conn:
|
||||
try:
|
||||
self.listener_conn.shutdown(socket.SHUT_RDWR)
|
||||
self.listener_conn.close()
|
||||
except OSError:
|
||||
pass
|
||||
if hasattr(self.node, "child"):
|
||||
self.node.child.close(force=True)
|
||||
if self.listener_thread.is_alive():
|
||||
self.listener_thread.join()
|
||||
if monitor_thread and monitor_thread.is_alive():
|
||||
monitor_thread.join()
|
||||
|
||||
|
||||
class Parser:
    def __init__(self):
@@ -359,41 +26,377 @@ class Parser:
        )

class Entrypoint:
    @staticmethod
    def get_remote_capture_class():
        import subprocess
        import random
        import socket
        import time
        import threading
        from pexpect import TIMEOUT
        from connpy import printer

        class RemoteCapture:
            def __init__(self, connapp, node_name, interface, namespace=None, use_wireshark=False, tcpdump_filter=None, tcpdump_args=None):
                self.connapp = connapp
                self.node_name = node_name
                self.interface = interface
                self.namespace = namespace
                self.use_wireshark = use_wireshark
                self.tcpdump_filter = tcpdump_filter or []
                self.tcpdump_args = tcpdump_args if isinstance(tcpdump_args, list) else []

                if node_name.startswith("@"):  # fuzzy match
                    matches = self.connapp.services.nodes.list_nodes(node_name)
                else:
                    matches = self.connapp.services.nodes.list_nodes(f"^{node_name}")

                if not matches:
                    printer.error(f"Node '{node_name}' not found.")
                    sys.exit(2)
                elif len(matches) > 1:
                    from ..cli.helpers import choose
                    matches[0] = choose(self.connapp, matches, "node", "capture")

                if matches[0] is None:
                    sys.exit(7)

                node_data = self.connapp.services.nodes.get_node_details(matches[0])
                self.node = self.connapp.node(matches[0], **node_data, config=self.connapp.config)

                if self.node.protocol != "ssh":
                    printer.error(f"Node '{self.node.unique}' must be an SSH connection.")
                    sys.exit(2)

                settings = self.connapp.services.config_svc.get_settings()
                self.wireshark_path = settings.get("wireshark_path")

            def _start_local_listener(self, port, ws_proc=None):
                self.fake_connection = False
                self.listener_active = True
                self.listener_conn = None
                self.listener_connected = threading.Event()

                def listen():
                    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
                        s.bind(("localhost", port))
                        s.listen(1)
                        printer.start(f"Listening on localhost:{port}")

                        conn, addr = s.accept()
                        self.listener_conn = conn
                        if not self.fake_connection:
                            printer.start(f"Connection from {addr}")
                        self.listener_connected.set()

                        try:
                            while self.listener_active:
                                data = conn.recv(4096)
                                if not data:
                                    break

                                if self.use_wireshark and ws_proc:
                                    try:
                                        ws_proc.stdin.write(data)
                                        ws_proc.stdin.flush()
                                    except BrokenPipeError:
                                        printer.info("Wireshark closed the pipe.")
                                        break
                                else:
                                    sys.stdout.buffer.write(data)
                                    sys.stdout.buffer.flush()
                        except Exception as e:
                            if isinstance(e, BrokenPipeError):
                                printer.info("Listener closed due to broken pipe.")
                            else:
                                printer.error(f"Listener error: {e}")
                        finally:
                            conn.close()
                            self.listener_conn = None

                self.listener_thread = threading.Thread(target=listen)
                self.listener_thread.daemon = True
                self.listener_thread.start()

            def _is_port_in_use(self, port):
                with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                    return s.connect_ex(('localhost', port)) == 0

            def _find_free_port(self, start=20000, end=30000):
                for _ in range(10):
                    port = random.randint(start, end)
                    if not self._is_port_in_use(port):
                        return port
                printer.error("No free port found for SSH tunnel.")
                sys.exit(1)

            def _monitor_wireshark(self, ws_proc):
                try:
                    while True:
                        try:
                            ws_proc.wait(timeout=1)
                            self.listener_active = False
                            if self.listener_conn:
                                printer.info("Wireshark exited, stopping listener.")
                                try:
                                    self.listener_conn.shutdown(socket.SHUT_RDWR)
                                    self.listener_conn.close()
                                except Exception:
                                    pass
                            break
                        except subprocess.TimeoutExpired:
                            if not self.listener_active:
                                break
                            time.sleep(0.2)
                except Exception as e:
                    printer.warning(f"Error in monitor_wireshark: {e}")

            def _detect_sudo_requirement(self):
                base_cmd = f"tcpdump -i {self.interface} -w - -U -c 1"
                if self.namespace:
                    base_cmd = f"ip netns exec {self.namespace} {base_cmd}"

                cmds = [base_cmd, f"sudo {base_cmd}"]

                printer.info("Verifying sudo requirement")
                for cmd in cmds:
                    try:
                        self.node.child.sendline(cmd)
                        start_time = time.time()
                        while time.time() - start_time < 3:
                            try:
                                index = self.node.child.expect([
                                    r'listening on',
                                    r'permission denied',
                                    r'cannot',
                                    r'No such file or directory',
                                ], timeout=1)

                                if index == 0:
                                    self.node.child.send("\x03")
                                    return "sudo" in cmd
                                else:
                                    break
                            except Exception:
                                continue

                        self.node.child.send("\x03")
                        time.sleep(0.5)
                        try:
                            self.node.child.read_nonblocking(size=1024, timeout=0.5)
                        except Exception:
                            pass

                    except Exception as e:
                        printer.warning(f"Error during sudo detection: {e}")
                        continue

                printer.error(f"Failed to run tcpdump on remote node '{self.node.unique}'")
                sys.exit(4)

            def _monitor_capture_output(self):
                try:
                    index = self.node.child.expect([
                        r'Broken pipe',
                        r'packet[s]? captured'
                    ], timeout=None)
                    if index == 0:
                        printer.error("Tcpdump failed: Broken pipe.")
                    else:
                        printer.success("Tcpdump finished capturing packets.")

                    self.listener_active = False
                except Exception:
                    pass

            def _sendline_until_connected(self, cmd, retries=5, interval=2):
                for attempt in range(1, retries + 1):
                    printer.info(f"Attempt {attempt}/{retries} to connect listener...")
                    self.node.child.sendline(cmd)

                    try:
                        index = self.node.child.expect([
                            r'listening on',
                            TIMEOUT,
                            r'permission',
                            r'not permitted',
                            r'invalid',
                            r'unrecognized',
                            r'Unable',
                            r'No such',
                            r'illegal',
                            r'not found',
                            r'non-ether',
                            r'syntax error'
                        ], timeout=5)

                        if index == 0:
                            self.monitor_end = threading.Thread(target=self._monitor_capture_output)
                            self.monitor_end.daemon = True
                            self.monitor_end.start()

                            if self.listener_connected.wait(timeout=interval):
                                printer.success("Listener successfully received a connection.")
                                return True
                            else:
                                printer.warning("No connection yet. Retrying...")

                        elif index == 1:
                            error = f"tcpdump did not respond within the expected time.\nCommand used:\n{cmd}\n→ Please verify the command syntax."
                            return f"{error}"
                        else:
                            before_last_line = self.node.child.before.decode().splitlines()[-1]
                            error = f"Tcpdump error detected: {before_last_line}{self.node.child.after.decode()}{self.node.child.readline().decode()}".rstrip()
                            return f"{error}"

                    except Exception as e:
                        printer.warning(f"Unexpected error during tcpdump startup: {e}")
                        return False

                return False

            def _build_tcpdump_command(self):
                base = f"tcpdump -i {self.interface}"
                if self.use_wireshark:
                    base += " -w - -U"
                else:
                    base += " -l"

                if self.namespace:
                    base = f"ip netns exec {self.namespace} {base}"

                if self.requires_sudo:
                    base = f"sudo {base}"

                if self.tcpdump_args:
                    base += " " + " ".join(self.tcpdump_args)

                if self.tcpdump_filter:
                    base += " " + " ".join(self.tcpdump_filter)

                base += f" | nc localhost {self.local_port}"
                return base

            def run(self):
                if self.use_wireshark:
                    if not self.wireshark_path:
                        printer.error("Wireshark path not set in config.\nUse '--set-wireshark-path /full/path/to/wireshark' to configure it.")
                        sys.exit(1)

                self.local_port = self._find_free_port()
                self.node.options += f" -o ExitOnForwardFailure=yes -R {self.local_port}:localhost:{self.local_port}"

                connection = self.node._connect()
                if connection is not True:
                    printer.error(f"Could not connect to {self.node.unique}\n{connection}")
                    sys.exit(1)

                self.requires_sudo = self._detect_sudo_requirement()
                tcpdump_cmd = self._build_tcpdump_command()

                ws_proc = None
                monitor_thread = None

                if self.use_wireshark:
                    printer.info(f"Live capture from {self.node.unique}:{self.interface}, launching Wireshark...")
                    try:
                        ws_proc = subprocess.Popen([self.wireshark_path, "-k", "-i", "-"], stdin=subprocess.PIPE, stderr=subprocess.PIPE)
                    except Exception as e:
                        printer.error(f"Failed to launch Wireshark: {e}\nMake sure the path is correct and Wireshark is installed.")
                        exit(1)

                    monitor_thread = threading.Thread(target=self._monitor_wireshark, args=(ws_proc,))
                    monitor_thread.daemon = True
                    monitor_thread.start()
                else:
                    printer.info(f"Live text capture from {self.node.unique}:{self.interface}")
                    printer.info("Press Ctrl+C to stop.\n")

                try:
                    self._start_local_listener(self.local_port, ws_proc=ws_proc)
                    time.sleep(1)

                    result = self._sendline_until_connected(tcpdump_cmd, retries=5, interval=2)
                    if result is not True:
                        if isinstance(result, str):
                            printer.error(f"{result}")
                        else:
                            printer.error("Listener connection failed after all retries.")
                        self.listener_active = False
                        return

                    while self.listener_active:
                        time.sleep(0.5)

                except KeyboardInterrupt:
                    print("")
                    printer.warning("Capture interrupted by user.")
                    self.listener_active = False
                finally:
                    if self.listener_conn:
                        try:
                            self.listener_conn.shutdown(socket.SHUT_RDWR)
                            self.listener_conn.close()
                        except OSError:
                            pass
                    if hasattr(self.node, "child"):
                        self.node.child.close(force=True)

        return RemoteCapture

    def __init__(self, args, parser, connapp):
        from connpy import printer
        if "--" in args.unknown_args:
            args.unknown_args.remove("--")
        if args.set_wireshark_path:
            connapp.services.config_svc.update_setting("wireshark_path", args.set_wireshark_path)
            printer.success(f"Wireshark path updated to: {args.set_wireshark_path}")
            return

        if not args.node or not args.interface:
            parser.error("node and interface are required unless --set-wireshark-path is used")

        RemoteCapture = self.get_remote_capture_class()
        capture = RemoteCapture(
            connapp=connapp,
            node_name=args.node,
            interface=args.interface,
            namespace=args.namespace,
            use_wireshark=args.wireshark,
            tcpdump_filter=args.tcpdump_filter,
            tcpdump_args=args.unknown_args
        )
        capture.run()

def _connpy_completion(wordsnumber, words, info = None):
    if wordsnumber == 3:
        result = ["--help", "--set-wireshark-path"]
        result.extend(info["nodes"])
    elif wordsnumber == 5 and words[1] in info["nodes"]:
        result = ['--wireshark', '--namespace', '--filter', '--help']
    elif wordsnumber == 6 and words[3] in ["-w", "--wireshark"]:
        result = ['--namespace', '--filter', '--help']
    elif wordsnumber == 7 and words[3] in ["-n", "--namespace"]:
        result = ['--wireshark', '--filter', '--help']
    elif wordsnumber == 8:
        if any(w in words for w in ["-w", "--wireshark"]) and any(w in words for w in ["-n", "--namespace"]):
            result = ['--filter', '--help']
        else:
            result = []
    else:
        result = []

    return result

def _connpy_tree(info=None):
    """Declarative completion tree for the capture plugin following completion.py patterns."""
    nodes = info.get("nodes", []) if info else []

    # State 2: Main capture loop (No setup flag here)
    capture_main = {"__exclude_used__": True}

    # Inline logic to suggest nodes only if no positional has been provided yet
    get_nodes = lambda w: nodes if not [x for x in w[:-1] if not x.startswith("-") and x != "capture"] else []
    capture_main["__extra__"] = get_nodes
    capture_main["*"] = capture_main

    for f in ["--wireshark", "-w", "--help", "-h"]:
        capture_main[f] = capture_main
    for f in ["--namespace", "--filter", "-f"]:
        capture_main[f] = {"*": capture_main}

    # State 1: Start (Highly discoverable configuration)
    capture_start = {
        "__exclude_used__": True,
        "__extra__": get_nodes,
        "--set-wireshark-path": {"__extra__": lambda w: get_cwd(w, "--set-wireshark-path")}
    }

    # Transitions from start to main
    for f in ["--wireshark", "-w", "--help", "-h"]:
        capture_start[f] = capture_main
    for f in ["--namespace", "--filter", "-f"]:
        capture_start[f] = {"*": capture_main}

    capture_start["*"] = capture_main

    return capture_start

@@ -1,199 +0,0 @@
import argparse
import yaml
import re
from connpy import printer


class context_manager:

    def __init__(self, connapp):
        self.connapp = connapp
        self.config = connapp.config

    @property
    def contexts(self):
        return self.config.config.get("contexts", {})

    @property
    def current_context(self):
        return self.config.config.get("current_context", "all")

    @property
    def regex(self):
        try:
            return [re.compile(regex) for regex in self.contexts[self.current_context]]
        except KeyError:
            return [re.compile(".*")]

    def add_context(self, context, regex):
        if not context.isalnum():
            printer.error("Context name has to be alphanumeric.")
            exit(1)
        elif context in self.contexts:
            printer.error(f"Context {context} already exists.")
            exit(2)
        else:
            contexts = self.contexts
            contexts[context] = regex
            self.connapp._change_settings("contexts", contexts)

    def modify_context(self, context, regex):
        if context == "all":
            printer.error("Can't modify default context: all")
            exit(3)
        elif context not in self.contexts:
            printer.error(f"Context {context} doesn't exist.")
            exit(4)
        else:
            contexts = self.contexts
            contexts[context] = regex
            self.connapp._change_settings("contexts", contexts)

    def delete_context(self, context):
        if context == "all":
            printer.error("Can't delete default context: all")
            exit(3)
        elif context not in self.contexts:
            printer.error(f"Context {context} doesn't exist.")
            exit(4)
        if context == self.current_context:
            printer.error(f"Can't delete current context: {self.current_context}")
            exit(5)
        else:
            contexts = self.contexts
            contexts.pop(context)
            self.connapp._change_settings("contexts", contexts)

    def list_contexts(self):
        for key in self.contexts.keys():
            if key == self.current_context:
                printer.success(f"{key} (active)")
            else:
                printer.custom(" ", key)

    def set_context(self, context):
        if context not in self.contexts:
            printer.error(f"Context {context} doesn't exist.")
            exit(4)
        elif context == self.current_context:
            printer.info(f"Context {context} already set")
            exit(0)
        else:
            self.connapp._change_settings("current_context", context)

    def show_context(self, context):
        if context not in self.contexts:
            printer.error(f"Context {context} doesn't exist.")
            exit(4)
        else:
            yaml_output = yaml.dump(self.contexts[context], sort_keys=False, default_flow_style=False)
            printer.custom(context, "")
            print(yaml_output)

    @staticmethod
    def add_default_context(config):
        config_modified = False
        if "contexts" not in config.config:
            config.config["contexts"] = {}
            config.config["contexts"]["all"] = [".*"]
            config_modified = True
        if "current_context" not in config.config:
            config.config["current_context"] = "all"
            config_modified = True
        if config_modified:
            config._saveconfig(config.file)

    def match_any_regex(self, node, regex_list):
        return any(regex.match(node) for regex in regex_list)

    def modify_node_list(self, *args, **kwargs):
        filtered_nodes = [node for node in kwargs["result"] if self.match_any_regex(node, self.regex)]
        return filtered_nodes

    def modify_node_dict(self, *args, **kwargs):
        filtered_nodes = {key: value for key, value in kwargs["result"].items() if self.match_any_regex(key, self.regex)}
        return filtered_nodes

class Preload:
    def __init__(self, connapp):
        cm = context_manager(connapp)
        # Register hooks first so that any save triggers a filtered cache generation
        connapp.config._getallnodes.register_post_hook(cm.modify_node_list)
        connapp.config._getallfolders.register_post_hook(cm.modify_node_list)
        connapp.config._getallnodesfull.register_post_hook(cm.modify_node_dict)

        # Define contexts if they don't exist (triggers save/cache generation)
        connapp.config.modify(context_manager.add_default_context)

        # Filter in-memory nodes using current context
        connapp.nodes_list = [node for node in connapp.nodes_list if cm.match_any_regex(node, cm.regex)]
        connapp.folders = [node for node in connapp.folders if cm.match_any_regex(node, cm.regex)]

class Parser:
    def __init__(self):
        self.parser = argparse.ArgumentParser(description="Manage contexts with regex matching", formatter_class=argparse.RawTextHelpFormatter)

        # Define the context name as a positional argument
        self.parser.add_argument("context_name", help="Name of the context", nargs='?')

        group = self.parser.add_mutually_exclusive_group(required=True)
        group.add_argument("-a", "--add", nargs='+', help='Add a new context with regex values.\nUsage: context -a name "regex1" "regex2"')
        group.add_argument("-r", "--rm", "--del", action='store_true', help="Delete a context.\nUsage: context --rm name")
        group.add_argument("--ls", action='store_true', help="List all contexts.\nUsage: context --ls")
        group.add_argument("--set", action='store_true', help="Set the used context.\nUsage: context --set name")
        group.add_argument("-s", "--show", action='store_true', help="Show the defined regex of a context.\nUsage: context --show name")
        group.add_argument("-e", "--edit", "--mod", nargs='+', help='Modify an existing context.\nUsage: context --mod name "regex1" "regex2"')

class Entrypoint:
    def __init__(self, args, parser, connapp):
        if args.add and len(args.add) < 2:
            parser.error("--add requires at least 2 arguments: name and at least one regex")
        if args.edit and len(args.edit) < 2:
            parser.error("--edit requires at least 2 arguments: name and at least one regex")
        if args.ls and args.context_name is not None:
            parser.error("--ls does not take a context name")
        if args.rm and not args.context_name:
            parser.error("--rm requires a context name")
        if args.set and not args.context_name:
            parser.error("--set requires a context name")
        if args.show and not args.context_name:
            parser.error("--show requires a context name")

        cm = context_manager(connapp)

        if args.add:
            cm.add_context(args.add[0], args.add[1:])
        elif args.rm:
            cm.delete_context(args.context_name)
        elif args.ls:
            cm.list_contexts()
        elif args.edit:
            cm.modify_context(args.edit[0], args.edit[1:])
        elif args.set:
            cm.set_context(args.context_name)
        elif args.show:
            cm.show_context(args.context_name)

def _connpy_completion(wordsnumber, words, info=None):
    if wordsnumber == 3:
        result = ["--help", "--add", "--del", "--rm", "--ls", "--set", "--show", "--edit", "--mod"]
    elif wordsnumber == 4 and words[1] in ["--del", "-r", "--rm", "--set", "--edit", "--mod", "-e", "--show", "-s"]:
        contexts = info["config"]["config"]["contexts"].keys()
        current_context = info["config"]["config"]["current_context"]
        default_context = "all"

        if words[1] in ["--del", "-r", "--rm"]:
            # Filter out default context and current context
            result = [context for context in contexts if context not in [default_context, current_context]]
        elif words[1] == "--set":
            # Filter out current context
            result = [context for context in contexts if context != current_context]
        elif words[1] in ["--edit", "--mod", "-e"]:
            # Filter out default context
            result = [context for context in contexts if context != default_context]
        elif words[1] in ["--show", "-s"]:
            # No filter for show
            result = list(contexts)

    return result
@@ -1,405 +0,0 @@
#!/usr/bin/python3
import argparse
import os
import time
import zipfile
import tempfile
import io
import yaml
import threading
from connpy import printer
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from googleapiclient.discovery import build
from google.auth.exceptions import RefreshError
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload
from googleapiclient.errors import HttpError
from datetime import datetime

class sync:

    def __init__(self, connapp):
        self.scopes = ['https://www.googleapis.com/auth/drive.appdata']
        self.token_file = f"{connapp.config.defaultdir}/gtoken.json"
        self.file = connapp.config.file
        self.key = connapp.config.key
        # Embedded OAuth config to bypass GitHub Secret Scanning for desktop apps
        self.client_config = {
            "installed": {
                "client_id": "559598250648-cr189kfrga2il1a6d6nkaspq0a9pn5vv.apps.googleusercontent.com",
                "project_id": "celtic-surface-420323",
                "auth_uri": "https://accounts.google.com/o/oauth2/auth",
                "token_uri": "https://oauth2.googleapis.com/token",
                "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
                "client_secret": "GOCSPX-" + "VVfOSrJLPU90Pl0g7aAXM9GK2xPE",
                "redirect_uris": ["http://localhost"]
            }
        }
        self.connapp = connapp
        try:
            self.sync = self.connapp.config.config["sync"]
        except KeyError:
            self.sync = False

    def login(self):
        creds = None
        # The file token.json stores the user's access and refresh tokens.
        if os.path.exists(self.token_file):
            creds = Credentials.from_authorized_user_file(self.token_file, self.scopes)

        try:
            # If there are no valid credentials available, let the user log in.
            if not creds or not creds.valid:
                if creds and creds.expired and creds.refresh_token:
                    creds.refresh(Request())
                else:
                    flow = InstalledAppFlow.from_client_config(
                        self.client_config, self.scopes)
                    creds = flow.run_local_server(port=0, access_type='offline')

                # Save the credentials for the next run
                with open(self.token_file, 'w') as token:
                    token.write(creds.to_json())

            printer.success("Logged in successfully.")

        except RefreshError:
            # If refresh fails, delete the invalid token file and start a new login flow
            if os.path.exists(self.token_file):
                os.remove(self.token_file)
            printer.warning("Existing token was invalid and has been removed. Please log in again.")
            flow = InstalledAppFlow.from_client_config(
                self.client_config, self.scopes)
            creds = flow.run_local_server(port=0, access_type='offline')
            with open(self.token_file, 'w') as token:
                token.write(creds.to_json())
            printer.success("Logged in successfully after re-authentication.")

    def logout(self):
        if os.path.exists(self.token_file):
            os.remove(self.token_file)
            printer.success("Logged out successfully.")
        else:
            printer.info("No credentials file found. Already logged out.")

    def get_credentials(self):
        # Load credentials from token.json
        if os.path.exists(self.token_file):
            creds = Credentials.from_authorized_user_file(self.token_file, self.scopes)
        else:
            printer.error("Credentials file not found.")
            return 0

        # If there are no valid credentials available, ask the user to log in again
        if not creds or not creds.valid:
            if creds and creds.expired and creds.refresh_token:
                try:
                    creds.refresh(Request())
                except RefreshError:
                    printer.warning("Could not refresh access token. Please log in again.")
                    return 0
            else:
                printer.warning("Credentials are missing or invalid. Please log in.")
                return 0
        return creds

    def check_login_status(self):
        # Check if the credentials file exists
        if os.path.exists(self.token_file):
            # Load credentials from token.json
            creds = Credentials.from_authorized_user_file(self.token_file)

            # If credentials are expired, refresh them
            if creds and creds.expired and creds.refresh_token:
                try:
                    creds.refresh(Request())
                except RefreshError:
                    pass

            # Check if the credentials are valid after refresh
            if creds.valid:
                return True
            else:
                return "Invalid"
        else:
            return False

    def status(self):
        printer.info(f"Login: {self.check_login_status()}")
        printer.info(f"Sync: {self.sync}")

    def get_appdata_files(self):
        creds = self.get_credentials()
        if not creds:
            return 0

        try:
            # Create the Google Drive service
            service = build("drive", "v3", credentials=creds)

            # List files in the appDataFolder
            response = (
                service.files()
                .list(
                    spaces="appDataFolder",
                    fields="files(id, name, appProperties)",
                    pageSize=10,
                )
                .execute()
            )

            files_info = []
            for file in response.get("files", []):
                # Extract file information
                file_id = file.get("id")
                file_name = file.get("name")
                timestamp = file.get("appProperties", {}).get("timestamp")
                human_readable_date = file.get("appProperties", {}).get("date")
                files_info.append({"name": file_name, "id": file_id, "date": human_readable_date, "timestamp": timestamp})

            return files_info

        except HttpError as error:
            printer.error(f"An error occurred: {error}")
            return 0

    def dump_appdata_files_yaml(self):
        files_info = self.get_appdata_files()
        if not files_info:
            printer.error("Failed to retrieve files or no files found.")
            return
        # Pretty print as YAML
        yaml_output = yaml.dump(files_info, sort_keys=False, default_flow_style=False)
        printer.custom("backups", "")
        print(yaml_output)

    def backup_file_to_drive(self, file_path, timestamp):
        creds = self.get_credentials()
        if not creds:
            return 1

        # Create the Google Drive service
        service = build('drive', 'v3', credentials=creds)

        # Convert timestamp to a human-readable date
        human_readable_date = datetime.fromtimestamp(timestamp/1000).strftime('%Y-%m-%d %H:%M:%S')

        # Upload the file to Google Drive with timestamp metadata
        file_metadata = {
            'name': os.path.basename(file_path),
            'parents': ["appDataFolder"],
            'appProperties': {
                'timestamp': str(timestamp),
                'date': human_readable_date  # Add human-readable date attribute
            }
        }
        media = MediaFileUpload(file_path)

        try:
            file = service.files().create(body=file_metadata, media_body=media, fields='id').execute()
            return 0
        except Exception as e:
            return f"An error occurred: {e}"

    def delete_file_by_id(self, file_id):
        creds = self.get_credentials()
        if not creds:
            return 1

        try:
            # Create the Google Drive service
            service = build("drive", "v3", credentials=creds)

            # Delete the file
            service.files().delete(fileId=file_id).execute()
            return 0
        except Exception as e:
            return f"An error occurred: {e}"

    def compress_specific_files(self, zip_path):
        with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
            zipf.write(self.file, os.path.basename(self.file))
            zipf.write(self.key, ".osk")

    def compress_and_upload(self):
        # Read the file content to get the folder path
        timestamp = int(time.time() * 1000)
        # Create a temporary directory for storing the zip file
        with tempfile.TemporaryDirectory() as tmp_dir:
            # Compress specific files from the folder path to a zip file in the temporary directory
            zip_path = os.path.join(tmp_dir, f"connpy-backup-{timestamp}.zip")
            self.compress_specific_files(zip_path)

            # Get the files in the app data folder
            app_data_files = self.get_appdata_files()
            if app_data_files == 0:
                return 1

            # If there are 10 or more files, remove the oldest one based on timestamp
            if len(app_data_files) >= 10:
                oldest_file = min(app_data_files, key=lambda x: x['timestamp'])
                delete_old = self.delete_file_by_id(oldest_file['id'])
                if delete_old:
                    printer.error(delete_old)
                    return 1

            # Upload the new file
            upload_new = self.backup_file_to_drive(zip_path, timestamp)
            if upload_new:
                printer.error(upload_new)
                return 1

            printer.success("Backup to google uploaded successfully.")
            return 0

    def decompress_zip(self, zip_path):
        try:
            with zipfile.ZipFile(zip_path, 'r') as zipf:
                # Extract the specific file to the specified destination
                names = zipf.namelist()
                if "config.yaml" in names:
                    zipf.extract("config.yaml", os.path.dirname(self.file))
                elif "config.json" in names:
                    zipf.extract("config.json", os.path.dirname(self.file))

                if ".osk" in names:
                    zipf.extract(".osk", os.path.dirname(self.key))

            # Delete caches to force auto-regeneration on next run
            try:
                if os.path.exists(self.connapp.config.cachefile):
                    os.remove(self.connapp.config.cachefile)
                if os.path.exists(self.connapp.config.fzf_cachefile):
                    os.remove(self.connapp.config.fzf_cachefile)
            except Exception:
                pass
            return 0
        except Exception as e:
            printer.error(f"An error occurred: {e}")
            return 1

    def download_file_by_id(self, file_id, destination_path):
        creds = self.get_credentials()
        if not creds:
            return 1

        try:
            # Create the Google Drive service
            service = build('drive', 'v3', credentials=creds)

            # Download the file
            request = service.files().get_media(fileId=file_id)
            fh = io.FileIO(destination_path, mode='wb')
            downloader = MediaIoBaseDownload(fh, request)
            done = False
            while done is False:
                status, done = downloader.next_chunk()

            return 0
        except Exception as e:
            return f"An error occurred: {e}"

    def restore_last_config(self, file_id=None):
        # Get the files in the app data folder
        app_data_files = self.get_appdata_files()
        if not app_data_files:
            printer.error("No files found in app data folder.")
            return 1

        # Check if a specific file_id was provided and if it exists in the list
        if file_id:
            selected_file = next((f for f in app_data_files if f['id'] == file_id), None)
            if not selected_file:
                printer.error(f"No file found with ID: {file_id}")
                return 1
        else:
            # Find the latest file based on timestamp
            selected_file = max(app_data_files, key=lambda x: x['timestamp'])

        # Download the selected file to a temporary location
        temp_download_path = os.path.join(tempfile.gettempdir(), 'connpy-backup.zip')
        if self.download_file_by_id(selected_file['id'], temp_download_path):
            return 1

        # Unzip the downloaded file to the destination folder
        if self.decompress_zip(temp_download_path):
            printer.error("Failed to decompress the file.")
            return 1

        printer.success(f"Backup from Google Drive restored successfully: {selected_file['name']}")
        return 0

    def config_listener_post(self, args, kwargs):
        if self.sync:
            if self.check_login_status() == True:
                if not kwargs["result"]:
                    self.compress_and_upload()
            else:
                printer.warning("Sync cannot be performed. Please check your login status.")
        return kwargs["result"]

    def config_listener_pre(self, *args, **kwargs):
        try:
            self.sync = self.connapp.config.config["sync"]
        except KeyError:
            self.sync = False
        return args, kwargs

    def start_post_thread(self, *args, **kwargs):
        post_thread = threading.Thread(target=self.config_listener_post, args=(args, kwargs))
        post_thread.start()

class Preload:
    def __init__(self, connapp):
        syncapp = sync(connapp)
        connapp.config._saveconfig.register_post_hook(syncapp.start_post_thread)
        connapp.config._saveconfig.register_pre_hook(syncapp.config_listener_pre)

class Parser:
    def __init__(self):
        self.parser = argparse.ArgumentParser(description="Sync config with Google")
        subparsers = self.parser.add_subparsers(title="Commands", dest='command', metavar="")
        login_parser = subparsers.add_parser("login", help="Login to Google to enable synchronization")
        logout_parser = subparsers.add_parser("logout", help="Logout from Google")
        start_parser = subparsers.add_parser("start", help="Start synchronizing with Google")
        stop_parser = subparsers.add_parser("stop", help="Stop any ongoing synchronization")
        restore_parser = subparsers.add_parser("restore", help="Restore data from Google")
        backup_parser = subparsers.add_parser("once", help="Backup current configuration to Google once")
        restore_parser.add_argument("--id", type=str, help="Optional file ID to restore a specific backup", required=False)
        status_parser = subparsers.add_parser("status", help="Check the current status of synchronization")
        list_parser = subparsers.add_parser("list", help="List all backups stored on Google")

class Entrypoint:
    def __init__(self, args, parser, connapp):
        syncapp = sync(connapp)
        if args.command == 'login':
            syncapp.login()
        elif args.command == "status":
            syncapp.status()
        elif args.command == "start":
            connapp._change_settings("sync", True)
        elif args.command == "stop":
            connapp._change_settings("sync", False)
        elif args.command == "list":
            syncapp.dump_appdata_files_yaml()
        elif args.command == "once":
            syncapp.compress_and_upload()
        elif args.command == "restore":
            syncapp.restore_last_config(args.id)
        elif args.command == "logout":
            syncapp.logout()

def _connpy_completion(wordsnumber, words, info=None):
    # Initialize so the function cannot raise NameError when neither branch matches
    result = []
    if wordsnumber == 3:
        result = ["--help", "login", "status", "start", "stop", "list", "once", "restore", "logout"]
    #NETMASK_completion
    if wordsnumber == 4 and words[1] == "restore":
        result = ["--help", "--id"]
    return result
File diff suppressed because one or more lines are too long
File diff suppressed because it is too large
@@ -0,0 +1,25 @@
syntax = "proto3";
package connpy_remote;

message IdRequest {
  string id = 1;
}

message StringResponse {
  string value = 1;
}

message PluginInvokeRequest {
  string name = 1;
  string args_json = 2;
}

message OutputChunk {
  string text = 1;
  bool is_error = 2;
}

service RemotePluginService {
  rpc get_plugin_source(IdRequest) returns (StringResponse);
  rpc invoke_plugin(PluginInvokeRequest) returns (stream OutputChunk);
}
@@ -0,0 +1,44 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler.  DO NOT EDIT!
# NO CHECKED-IN PROTOBUF GENCODE
# source: remote_plugin.proto
# Protobuf Python Version: 6.31.1
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import runtime_version as _runtime_version
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
_runtime_version.ValidateProtobufRuntimeVersion(
    _runtime_version.Domain.PUBLIC,
    6,
    31,
    1,
    '',
    'remote_plugin.proto'
)
# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x13remote_plugin.proto\x12\rconnpy_remote\"\x17\n\tIdRequest\x12\n\n\x02id\x18\x01 \x01(\t\"\x1f\n\x0eStringResponse\x12\r\n\x05value\x18\x01 \x01(\t\"6\n\x13PluginInvokeRequest\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x11\n\targs_json\x18\x02 \x01(\t\"-\n\x0bOutputChunk\x12\x0c\n\x04text\x18\x01 \x01(\t\x12\x10\n\x08is_error\x18\x02 \x01(\x08\x32\xb6\x01\n\x13RemotePluginService\x12L\n\x11get_plugin_source\x12\x18.connpy_remote.IdRequest\x1a\x1d.connpy_remote.StringResponse\x12Q\n\rinvoke_plugin\x12\".connpy_remote.PluginInvokeRequest\x1a\x1a.connpy_remote.OutputChunk0\x01\x62\x06proto3')

_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'remote_plugin_pb2', _globals)
if not _descriptor._USE_C_DESCRIPTORS:
  DESCRIPTOR._loaded_options = None
  _globals['_IDREQUEST']._serialized_start=38
  _globals['_IDREQUEST']._serialized_end=61
  _globals['_STRINGRESPONSE']._serialized_start=63
  _globals['_STRINGRESPONSE']._serialized_end=94
  _globals['_PLUGININVOKEREQUEST']._serialized_start=96
  _globals['_PLUGININVOKEREQUEST']._serialized_end=150
  _globals['_OUTPUTCHUNK']._serialized_start=152
  _globals['_OUTPUTCHUNK']._serialized_end=197
  _globals['_REMOTEPLUGINSERVICE']._serialized_start=200
  _globals['_REMOTEPLUGINSERVICE']._serialized_end=382
# @@protoc_insertion_point(module_scope)
@@ -0,0 +1,140 @@
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
import warnings

from . import remote_plugin_pb2 as remote__plugin__pb2

GRPC_GENERATED_VERSION = '1.80.0'
GRPC_VERSION = grpc.__version__
_version_not_supported = False

try:
    from grpc._utilities import first_version_is_lower
    _version_not_supported = first_version_is_lower(GRPC_VERSION, GRPC_GENERATED_VERSION)
except ImportError:
    _version_not_supported = True

if _version_not_supported:
    raise RuntimeError(
        f'The grpc package installed is at version {GRPC_VERSION},'
        + ' but the generated code in remote_plugin_pb2_grpc.py depends on'
        + f' grpcio>={GRPC_GENERATED_VERSION}.'
        + f' Please upgrade your grpc module to grpcio>={GRPC_GENERATED_VERSION}'
        + f' or downgrade your generated code using grpcio-tools<={GRPC_VERSION}.'
    )


class RemotePluginServiceStub(object):
    """Missing associated documentation comment in .proto file."""

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.get_plugin_source = channel.unary_unary(
                '/connpy_remote.RemotePluginService/get_plugin_source',
                request_serializer=remote__plugin__pb2.IdRequest.SerializeToString,
                response_deserializer=remote__plugin__pb2.StringResponse.FromString,
                _registered_method=True)
        self.invoke_plugin = channel.unary_stream(
                '/connpy_remote.RemotePluginService/invoke_plugin',
                request_serializer=remote__plugin__pb2.PluginInvokeRequest.SerializeToString,
                response_deserializer=remote__plugin__pb2.OutputChunk.FromString,
                _registered_method=True)


class RemotePluginServiceServicer(object):
    """Missing associated documentation comment in .proto file."""

    def get_plugin_source(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def invoke_plugin(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')


def add_RemotePluginServiceServicer_to_server(servicer, server):
    rpc_method_handlers = {
            'get_plugin_source': grpc.unary_unary_rpc_method_handler(
                    servicer.get_plugin_source,
                    request_deserializer=remote__plugin__pb2.IdRequest.FromString,
                    response_serializer=remote__plugin__pb2.StringResponse.SerializeToString,
            ),
            'invoke_plugin': grpc.unary_stream_rpc_method_handler(
                    servicer.invoke_plugin,
                    request_deserializer=remote__plugin__pb2.PluginInvokeRequest.FromString,
                    response_serializer=remote__plugin__pb2.OutputChunk.SerializeToString,
            ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
            'connpy_remote.RemotePluginService', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))
    server.add_registered_method_handlers('connpy_remote.RemotePluginService', rpc_method_handlers)


# This class is part of an EXPERIMENTAL API.
class RemotePluginService(object):
    """Missing associated documentation comment in .proto file."""

    @staticmethod
    def get_plugin_source(request,
            target,
            options=(),
            channel_credentials=None,
            call_credentials=None,
            insecure=False,
            compression=None,
            wait_for_ready=None,
            timeout=None,
            metadata=None):
        return grpc.experimental.unary_unary(
            request,
            target,
            '/connpy_remote.RemotePluginService/get_plugin_source',
            remote__plugin__pb2.IdRequest.SerializeToString,
            remote__plugin__pb2.StringResponse.FromString,
            options,
            channel_credentials,
            insecure,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
            _registered_method=True)

    @staticmethod
    def invoke_plugin(request,
            target,
            options=(),
            channel_credentials=None,
            call_credentials=None,
            insecure=False,
            compression=None,
            wait_for_ready=None,
            timeout=None,
            metadata=None):
        return grpc.experimental.unary_stream(
            request,
            target,
            '/connpy_remote.RemotePluginService/invoke_plugin',
            remote__plugin__pb2.PluginInvokeRequest.SerializeToString,
            remote__plugin__pb2.OutputChunk.FromString,
            options,
            channel_credentials,
            insecure,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
            _registered_method=True)
@@ -0,0 +1,703 @@
import grpc
from concurrent import futures
from google.protobuf.empty_pb2 import Empty
import os
import ctypes
import threading

# Suppress harmless but noisy gRPC fork() warnings from pexpect child processes
os.environ["GRPC_VERBOSITY"] = "NONE"
os.environ["GRPC_ENABLE_FORK_SUPPORT"] = "0"
from . import connpy_pb2, connpy_pb2_grpc, remote_plugin_pb2, remote_plugin_pb2_grpc
import json
from .utils import to_value, from_value, to_struct, from_struct
from ..services.exceptions import ConnpyError

# Import local services
from ..services.node_service import NodeService
from ..services.profile_service import ProfileService
from ..services.config_service import ConfigService
from ..services.plugin_service import PluginService
from ..services.ai_service import AIService
from ..services.system_service import SystemService
from ..services.execution_service import ExecutionService
from ..services.import_export_service import ImportExportService

def handle_errors(func):
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except ConnpyError as e:
            context = kwargs.get("context") or args[-1]
            context.abort(grpc.StatusCode.INTERNAL, str(e))
        except Exception as e:
            context = kwargs.get("context") or args[-1]
            context.abort(grpc.StatusCode.UNKNOWN, str(e))
    return wrapper

class NodeServicer(connpy_pb2_grpc.NodeServiceServicer):
    def __init__(self, config):
        self.service = NodeService(config)

    @handle_errors
    def interact_node(self, request_iterator, context):
        import sys
        import select
        import os
        from connpy.core import node
        from ..services.profile_service import ProfileService

        # Fetch first setup packet
        try:
            first_req = next(request_iterator)
        except StopIteration:
            context.abort(grpc.StatusCode.INVALID_ARGUMENT, "No setup request received")

        unique_id = first_req.id
        sftp = first_req.sftp
        debug = first_req.debug

        node_data = self.service.config.getitem(unique_id, extract=False)
        profile_service = ProfileService(self.service.config)
        resolved_data = profile_service.resolve_node_data(node_data)

        n = node(unique_id, **resolved_data, config=self.service.config)
        if sftp:
            n.protocol = "sftp"

        connect = n._connect(debug=debug)
        if connect != True:
            context.abort(grpc.StatusCode.INTERNAL, "Failed to connect to node")

        import threading
        import queue

        stdin_queue = queue.Queue()
        running = True

        def read_requests():
            try:
                for req in request_iterator:
                    if not running:
                        break
                    if req.cols > 0 and req.rows > 0:
                        try:
                            n.child.setwinsize(req.rows, req.cols)
                        except Exception:
                            pass
                    if req.stdin_data:
                        stdin_queue.put(req.stdin_data)
            except grpc.RpcError:
                pass

        t = threading.Thread(target=read_requests, daemon=True)
        t.start()

        # Set initial window size if provided
        if first_req.cols > 0 and first_req.rows > 0:
            try:
                n.child.setwinsize(first_req.rows, first_req.cols)
            except Exception:
                pass

        try:
            while n.child.isalive() and running:
                r, _, _ = select.select([n.child.child_fd], [], [], 0.05)
                if r:
                    try:
                        data = os.read(n.child.child_fd, 4096)
                        if not data:
                            break
                        yield connpy_pb2.InteractResponse(stdout_data=data)
                    except OSError:
                        break

                while not stdin_queue.empty():
                    data = stdin_queue.get_nowait()
                    try:
                        os.write(n.child.child_fd, data)
                    except OSError:
                        running = False
                        break
        finally:
            running = False
            try:
                n.child.terminate(force=True)
            except Exception:
                pass

    @handle_errors
    def list_nodes(self, request, context):
        f = request.filter_str if request.filter_str else None
        fmt = request.format_str if request.format_str else None
        return connpy_pb2.ValueResponse(data=to_value(self.service.list_nodes(f, fmt)))

    @handle_errors
    def list_folders(self, request, context):
        f = request.filter_str if request.filter_str else None
        return connpy_pb2.ValueResponse(data=to_value(self.service.list_folders(f)))

    @handle_errors
    def get_node_details(self, request, context):
        return connpy_pb2.StructResponse(data=to_struct(self.service.get_node_details(request.id)))

    @handle_errors
    def explode_unique(self, request, context):
        return connpy_pb2.ValueResponse(data=to_value(self.service.explode_unique(request.id)))

    @handle_errors
    def generate_cache(self, request, context):
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def add_node(self, request, context):
        self.service.add_node(request.id, from_struct(request.data), request.is_folder)
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def update_node(self, request, context):
        self.service.update_node(request.id, from_struct(request.data))
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def delete_node(self, request, context):
        self.service.delete_node(request.id, request.is_folder)
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def move_node(self, request, context):
        self.service.move_node(request.src_id, request.dst_id, request.copy)
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def bulk_add(self, request, context):
        self.service.bulk_add(list(request.ids), list(request.hosts), from_struct(request.common_data))
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def set_reserved_names(self, request, context):
        self.service.set_reserved_names(list(request.items))
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def full_replace(self, request, context):
        connections = from_struct(request.connections)
        profiles = from_struct(request.profiles)
        self.service.full_replace(connections, profiles)
        self.service.generate_cache()
        return Empty()

    @handle_errors
    def get_inventory(self, request, context):
        data = self.service.get_inventory()
        return connpy_pb2.FullReplaceRequest(
            connections=to_struct(data["connections"]),
            profiles=to_struct(data["profiles"])
        )

class ProfileServicer(connpy_pb2_grpc.ProfileServiceServicer):
    def __init__(self, config):
        self.service = ProfileService(config)
        self.node_service = NodeService(config)

    @handle_errors
    def list_profiles(self, request, context):
        f = request.filter_str if request.filter_str else None
        return connpy_pb2.ValueResponse(data=to_value(self.service.list_profiles(f)))

    @handle_errors
    def get_profile(self, request, context):
        return connpy_pb2.StructResponse(data=to_struct(self.service.get_profile(request.name, request.resolve)))

    @handle_errors
    def add_profile(self, request, context):
        self.service.add_profile(request.id, from_struct(request.data))
        self.node_service.generate_cache()
        return Empty()

    @handle_errors
    def resolve_node_data(self, request, context):
        return connpy_pb2.StructResponse(data=to_struct(self.service.resolve_node_data(from_struct(request.data))))

    @handle_errors
    def delete_profile(self, request, context):
        self.service.delete_profile(request.id)
        self.node_service.generate_cache()
        return Empty()

    @handle_errors
    def update_profile(self, request, context):
        self.service.update_profile(request.id, from_struct(request.data))
        self.node_service.generate_cache()
        return Empty()

class ConfigServicer(connpy_pb2_grpc.ConfigServiceServicer):
    def __init__(self, config):
        self.service = ConfigService(config)

    @handle_errors
    def get_settings(self, request, context):
        return connpy_pb2.StructResponse(data=to_struct(self.service.get_settings()))

    @handle_errors
    def get_default_dir(self, request, context):
        return connpy_pb2.StringResponse(value=self.service.get_default_dir())

    @handle_errors
    def set_config_folder(self, request, context):
        self.service.set_config_folder(request.value)
        return Empty()

    @handle_errors
    def update_setting(self, request, context):
        self.service.update_setting(request.key, from_value(request.value))
        return Empty()

    @handle_errors
    def encrypt_password(self, request, context):
        return connpy_pb2.StringResponse(value=self.service.encrypt_password(request.value))

    @handle_errors
    def apply_theme_from_file(self, request, context):
        return connpy_pb2.StructResponse(data=to_struct(self.service.apply_theme_from_file(request.value)))

class PluginServicer(connpy_pb2_grpc.PluginServiceServicer, remote_plugin_pb2_grpc.RemotePluginServiceServicer):
    def __init__(self, config):
        self.service = PluginService(config)

    @handle_errors
    def list_plugins(self, request, context):
        return connpy_pb2.ValueResponse(data=to_value(self.service.list_plugins()))

    @handle_errors
    def add_plugin(self, request, context):
        if request.source_file.startswith("---CONTENT---\n"):
            content = request.source_file[len("---CONTENT---\n"):].encode()
            self.service.add_plugin_from_bytes(request.name, content, request.update)
        else:
            self.service.add_plugin(request.name, request.source_file, request.update)
        return Empty()

    @handle_errors
    def delete_plugin(self, request, context):
        self.service.delete_plugin(request.id)
        return Empty()

    @handle_errors
    def enable_plugin(self, request, context):
        self.service.enable_plugin(request.id)
        return Empty()

    @handle_errors
    def disable_plugin(self, request, context):
        self.service.disable_plugin(request.id)
        return Empty()

    @handle_errors
    def get_plugin_source(self, request, context):
        source = self.service.get_plugin_source(request.id)
        return remote_plugin_pb2.StringResponse(value=source)

    @handle_errors
    def invoke_plugin(self, request, context):
        args_dict = json.loads(request.args_json)
        for chunk in self.service.invoke_plugin(request.name, args_dict):
            yield remote_plugin_pb2.OutputChunk(text=chunk)

class ExecutionServicer(connpy_pb2_grpc.ExecutionServiceServicer):
    def __init__(self, config):
        self.service = ExecutionService(config)

    @handle_errors
    def run_commands(self, request, context):
        import queue
        import threading

        nodes_filter = request.nodes[0] if len(request.nodes) == 1 else list(request.nodes)

        q = queue.Queue()

        def _on_complete(unique, output, status):
            q.put({"unique_id": unique, "output": output, "status": status})

        def _worker():
            try:
                self.service.run_commands(
                    nodes_filter=nodes_filter,
                    commands=list(request.commands),
                    folder=request.folder if request.folder else None,
                    prompt=request.prompt if request.prompt else None,
                    parallel=request.parallel,
                    variables=from_struct(request.vars) if request.HasField("vars") else None,
                    on_node_complete=_on_complete
                )
            except Exception as e:
                # Optionally pass error to stream; the handle_errors decorator covers top-level,
                # but thread exceptions won't reach context.abort directly.
                q.put(e)
            finally:
                q.put(None)

        threading.Thread(target=_worker, daemon=True).start()

        while True:
            item = q.get()
            if item is None:
                break
            if isinstance(item, Exception):
                raise item

            yield connpy_pb2.NodeRunResult(
                unique_id=item["unique_id"],
                output=item["output"],
                status=item["status"]
            )

    @handle_errors
    def test_commands(self, request, context):
        import queue
        import threading

        nodes_filter = request.nodes[0] if len(request.nodes) == 1 else list(request.nodes)

        q = queue.Queue()

        def _on_complete(unique, output, status, result):
            q.put({"unique_id": unique, "output": output, "status": status, "result": result})

        def _worker():
            try:
                self.service.test_commands(
                    nodes_filter=nodes_filter,
                    commands=list(request.commands),
                    expected=request.expected,
                    folder=request.folder if request.folder else None,
                    prompt=request.prompt if request.prompt else None,
                    parallel=request.parallel,
                    variables=from_struct(request.vars) if request.HasField("vars") else None,
                    on_node_complete=_on_complete
                )
            except Exception as e:
                q.put(e)
            finally:
                q.put(None)

        threading.Thread(target=_worker, daemon=True).start()

        while True:
            item = q.get()
            if item is None:
                break
            if isinstance(item, Exception):
                raise item

            res = connpy_pb2.NodeRunResult(
                unique_id=item["unique_id"],
                output=item["output"],
                status=item["status"]
            )
            if item["result"] is not None:
                res.test_result.CopyFrom(to_struct(item["result"]))
            yield res

    @handle_errors
    def run_cli_script(self, request, context):
        res = self.service.run_cli_script(request.param1, request.param2, request.parallel)
        return connpy_pb2.StructResponse(data=to_struct(res))

    @handle_errors
    def run_yaml_playbook(self, request, context):
        res = self.service.run_yaml_playbook(request.param1, request.parallel)
        return connpy_pb2.StructResponse(data=to_struct(res))

class ImportExportServicer(connpy_pb2_grpc.ImportExportServiceServicer):
    def __init__(self, config):
        self.service = ImportExportService(config)
        self.node_service = NodeService(config)

    @handle_errors
    def export_to_file(self, request, context):
        self.service.export_to_file(request.file_path, list(request.folders) if request.folders else None)
        return Empty()

    @handle_errors
    def import_from_file(self, request, context):
        if request.value.startswith("---YAML---\n"):
            import yaml
            content = request.value[len("---YAML---\n"):]
            data = yaml.load(content, Loader=yaml.FullLoader)
            self.service.import_from_dict(data)
        else:
            self.service.import_from_file(request.value)
        self.node_service.generate_cache()
        return Empty()

    @handle_errors
    def set_reserved_names(self, request, context):
        self.service.set_reserved_names(list(request.items))
        self.node_service.generate_cache()
        return Empty()

class StatusBridge:
|
||||
def __init__(self, q, request_queue=None):
|
||||
self.q = q
|
||||
self.request_queue = request_queue
|
||||
self.on_interrupt = self._force_interrupt
|
||||
self.thread = None
|
||||
|
||||
def _force_interrupt(self):
|
||||
"""Forcefully raise KeyboardInterrupt in the target thread."""
|
||||
if self.thread and self.thread.ident:
|
||||
# Standard Python trick to raise an exception in a specific thread
|
||||
ctypes.pythonapi.PyThreadState_SetAsyncExc(
|
||||
ctypes.c_long(self.thread.ident),
|
||||
ctypes.py_object(KeyboardInterrupt)
|
||||
)
|
||||
|
||||
def update(self, msg):
|
||||
self.q.put(("status", msg))
|
||||
|
||||
def stop(self):
|
||||
pass
|
||||
|
||||
def print(self, *args, **kwargs):
|
||||
# Capture Rich output and send as debug message
|
||||
self._print_to_queue("debug", *args, **kwargs)
|
||||
|
||||
def print_important(self, *args, **kwargs):
|
||||
# Capture Rich output and send as important message (always show)
|
||||
self._print_to_queue("important", *args, **kwargs)
|
||||
|
||||
def _print_to_queue(self, msg_type, *args, **kwargs):
|
||||
from rich.console import Console
|
||||
from io import StringIO
|
||||
from ..printer import connpy_theme
|
||||
buf = StringIO()
|
||||
# Use a high-quality console for rendering with the app's theme
|
||||
c = Console(file=buf, force_terminal=True, width=100, theme=connpy_theme)
|
||||
c.print(*args, **kwargs)
|
||||
self.q.put((msg_type, buf.getvalue()))
|
||||
|
||||
def confirm(self, prompt, default="n"):
|
||||
"""Bridge confirmation to the gRPC client."""
|
||||
if not self.request_queue:
|
||||
return default
|
||||
|
||||
# Render markup to ANSI for the client
|
||||
from rich.console import Console
|
||||
from io import StringIO
|
||||
from ..printer import connpy_theme
|
||||
buf = StringIO()
|
||||
c = Console(file=buf, force_terminal=True, theme=connpy_theme)
|
||||
c.print(prompt, end="")
|
||||
ansi_prompt = buf.getvalue()
|
||||
|
||||
# Send confirmation request to client
|
||||
self.q.put(("confirm", ansi_prompt))
|
||||
|
||||
# Wait for the client to send back the answer via the request stream
|
||||
try:
|
||||
# Block until we get the next request from the client
|
||||
req = self.request_queue.get()
|
||||
if req and req.confirmation_answer:
|
||||
return req.confirmation_answer
|
||||
except Exception:
|
||||
pass
|
||||
return default
|
||||
|
||||
class AIServicer(connpy_pb2_grpc.AIServiceServicer):
|
||||
def __init__(self, config):
|
||||
self.service = AIService(config)
|
||||
|
||||
@handle_errors
|
||||
def ask(self, request_iterator, context):
|
||||
import queue
|
||||
import threading
|
||||
|
||||
# In bidirectional mode, the first request contains the query
|
||||
try:
|
||||
first_request = next(request_iterator)
|
||||
except StopIteration:
|
||||
return
|
||||
|
||||
history = from_value(first_request.chat_history)
|
||||
|
||||
overrides = {}
|
||||
if first_request.engineer_model: overrides["engineer_model"] = first_request.engineer_model
|
||||
if first_request.engineer_api_key: overrides["engineer_api_key"] = first_request.engineer_api_key
|
||||
if first_request.architect_model: overrides["architect_model"] = first_request.architect_model
|
||||
if first_request.architect_api_key: overrides["architect_api_key"] = first_request.architect_api_key
|
||||
|
||||
chunk_queue = queue.Queue()
|
||||
request_queue = queue.Queue()
|
||||
bridge = StatusBridge(chunk_queue, request_queue=request_queue)
|
||||
|
||||
# Start a thread to pull subsequent requests from the client (confirmations)
|
||||
def pull_requests():
|
||||
try:
|
||||
for req in request_iterator:
|
||||
if req.interrupt and bridge.on_interrupt:
|
||||
bridge.on_interrupt()
|
||||
request_queue.put(req)
|
||||
except Exception:
|
||||
pass
|
||||
finally:
|
||||
request_queue.put(None)
|
||||
|
||||
threading.Thread(target=pull_requests, daemon=True).start()
|
||||
|
||||
def callback(chunk):
|
||||
chunk_queue.put(("text", chunk))
|
||||
|
||||
result_container = {}
|
||||
|
||||
def run_ai():
|
||||
try:
|
||||
res = self.service.ask(
|
||||
first_request.input_text,
|
||||
dryrun=first_request.dryrun,
|
||||
chat_history=history if history else None,
|
||||
session_id=first_request.session_id if first_request.session_id else None,
|
||||
debug=first_request.debug,
|
||||
status=bridge,
|
||||
console=bridge,
|
||||
confirm_handler=bridge.confirm,
|
||||
chunk_callback=callback,
|
||||
trust=first_request.trust,
|
||||
**overrides
|
||||
)
|
||||
result_container["res"] = res
|
||||
except Exception as e:
|
||||
chunk_queue.put(("status", f"[bold fail]Error: {str(e)}[/bold fail]"))
|
||||
result_container["error"] = e
|
||||
finally:
|
||||
chunk_queue.put(None) # Sentinel
|
||||
|
||||
t = threading.Thread(target=run_ai, daemon=True)
|
||||
bridge.thread = t
|
||||
t.start()
|
||||
|
||||
while True:
|
||||
item = chunk_queue.get()
|
||||
if item is None:
|
||||
break
|
||||
|
||||
msg_type, val = item
|
||||
if msg_type == "text":
|
||||
yield connpy_pb2.AIResponse(text_chunk=val, is_final=False)
|
||||
elif msg_type == "status":
|
||||
yield connpy_pb2.AIResponse(status_update=val, is_final=False)
|
||||
elif msg_type == "debug":
|
||||
yield connpy_pb2.AIResponse(debug_message=val, is_final=False)
|
||||
elif msg_type == "important":
|
||||
yield connpy_pb2.AIResponse(important_message=val, is_final=False)
|
||||
elif msg_type == "confirm":
|
||||
yield connpy_pb2.AIResponse(status_update=val, requires_confirmation=True, is_final=False)
|
||||
|
||||
if "error" in result_container:
|
||||
raise result_container["error"]
|
||||
|
||||
yield connpy_pb2.AIResponse(
|
||||
is_final=True,
|
||||
full_result=to_struct(result_container.get("res", {}))
|
||||
)
|
||||
|
||||
@handle_errors
|
||||
def confirm(self, request, context):
|
||||
res = self.service.confirm(request.value)
|
||||
return connpy_pb2.BoolResponse(value=res)
|
||||
|
||||
@handle_errors
|
||||
def list_sessions(self, request, context):
|
||||
return connpy_pb2.ValueResponse(data=to_value(self.service.list_sessions()))
|
||||
|
||||
@handle_errors
|
||||
def delete_session(self, request, context):
|
||||
self.service.delete_session(request.value)
|
||||
return Empty()
|
||||
|
||||
@handle_errors
|
||||
def configure_provider(self, request, context):
|
||||
self.service.configure_provider(request.provider, request.model, request.api_key)
|
||||
return Empty()
|
||||
|
||||
@handle_errors
|
||||
def load_session_data(self, request, context):
|
||||
return connpy_pb2.StructResponse(data=to_struct(self.service.load_session_data(request.value)))
|
||||
|
||||
class SystemServicer(connpy_pb2_grpc.SystemServiceServicer):
|
||||
def __init__(self, config):
|
||||
self.service = SystemService(config)
|
||||
|
||||
@handle_errors
|
||||
def start_api(self, request, context):
|
||||
self.service.start_api(request.value)
|
||||
return Empty()
|
||||
|
||||
@handle_errors
|
||||
def debug_api(self, request, context):
|
||||
self.service.debug_api(request.value)
|
||||
return Empty()
|
||||
|
||||
@handle_errors
|
||||
def stop_api(self, request, context):
|
||||
self.service.stop_api()
|
||||
return Empty()
|
||||
|
||||
@handle_errors
|
||||
def restart_api(self, request, context):
|
||||
self.service.restart_api(request.value)
|
||||
return Empty()
|
||||
|
||||
@handle_errors
|
||||
def get_api_status(self, request, context):
|
||||
return connpy_pb2.BoolResponse(value=self.service.get_api_status())
|
||||
|
||||
class LoggingInterceptor(grpc.ServerInterceptor):
|
||||
def __init__(self):
|
||||
from rich.console import Console
|
||||
from ..printer import connpy_theme
|
||||
self.console = Console(theme=connpy_theme)
|
||||
|
||||
def intercept_service(self, continuation, handler_call_details):
|
||||
import time
|
||||
method = handler_call_details.method
|
||||
self.console.print(f"[debug][DEBUG][/debug] gRPC Incoming Request: [bold cyan]{method}[/bold cyan]")
|
||||
|
||||
start_time = time.time()
|
||||
try:
|
||||
result = continuation(handler_call_details)
|
||||
except Exception as e:
|
||||
self.console.print(f"[debug][DEBUG][/debug] [bold red]ERROR[/bold red] in {method}: {e}")
|
||||
raise e
|
||||
finally:
|
||||
duration = (time.time() - start_time) * 1000
|
||||
self.console.print(f"[debug][DEBUG][/debug] Completed [bold cyan]{method}[/bold cyan] in {duration:.2f}ms")
|
||||
|
||||
return result
|
||||
|
||||
def serve(config, port=8048, debug=False):
|
||||
interceptors = [LoggingInterceptor()] if debug else []
|
||||
server = grpc.server(futures.ThreadPoolExecutor(max_workers=10), interceptors=interceptors)
|
||||
|
||||
connpy_pb2_grpc.add_NodeServiceServicer_to_server(NodeServicer(config), server)
|
||||
connpy_pb2_grpc.add_ProfileServiceServicer_to_server(ProfileServicer(config), server)
|
||||
connpy_pb2_grpc.add_ConfigServiceServicer_to_server(ConfigServicer(config), server)
|
||||
plugin_servicer = PluginServicer(config)
|
||||
connpy_pb2_grpc.add_PluginServiceServicer_to_server(plugin_servicer, server)
|
||||
remote_plugin_pb2_grpc.add_RemotePluginServiceServicer_to_server(plugin_servicer, server)
|
||||
connpy_pb2_grpc.add_ExecutionServiceServicer_to_server(ExecutionServicer(config), server)
|
||||
connpy_pb2_grpc.add_ImportExportServiceServicer_to_server(ImportExportServicer(config), server)
|
||||
connpy_pb2_grpc.add_AIServiceServicer_to_server(AIServicer(config), server)
|
||||
connpy_pb2_grpc.add_SystemServiceServicer_to_server(SystemServicer(config), server)
|
||||
|
||||
server.add_insecure_port(f'[::]:{port}')
|
||||
server.start()
|
||||
return server
|
||||
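The servicers above repeatedly use the same pattern: a worker thread produces results into a `queue.Queue`, exceptions travel through the queue as regular items, and `None` marks end-of-stream so the gRPC handler can yield responses as they arrive. A minimal, standalone sketch of that pattern (the names `stream_results` and `produce` are hypothetical, chosen for illustration):

```python
import queue
import threading

def stream_results(produce):
    """Run `produce` in a worker thread and yield its items as they arrive.
    Exceptions cross the thread boundary via the queue; None is the sentinel."""
    q = queue.Queue()

    def _worker():
        try:
            for item in produce():
                q.put(item)
        except Exception as e:
            q.put(e)          # ship the exception to the consumer
        finally:
            q.put(None)       # sentinel: no more items

    threading.Thread(target=_worker, daemon=True).start()
    while True:
        item = q.get()
        if item is None:
            break
        if isinstance(item, Exception):
            raise item
        yield item

def produce():
    for i in range(3):
        yield i * i

print(list(stream_results(produce)))  # → [0, 1, 4]
```

The same shape lets the `run_commands`/`ask` handlers stay blocked on `q.get()` while work (and client-side interrupts) happen on other threads.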
@@ -0,0 +1,568 @@
import grpc
import queue
import threading
from functools import wraps
from google.protobuf.empty_pb2 import Empty

from . import connpy_pb2, connpy_pb2_grpc, remote_plugin_pb2, remote_plugin_pb2_grpc
from .utils import to_value, from_value, to_struct, from_struct
from ..services.exceptions import ConnpyError
from ..hooks import MethodHook
from .. import printer

def handle_errors(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except grpc.RpcError as e:
            # Re-raise gRPC errors as native ConnpyError to keep CLI handlers agnostic
            details = e.details()

            # Identify the host if available on the instance
            instance = args[0] if args else None
            host = getattr(instance, "remote_host", "remote host")

            # Make common gRPC errors more readable
            if "failed to connect to all addresses" in details:
                simplified = f"Failed to connect to remote host at {host} (Connection refused)"
            elif "Method not found" in details:
                simplified = f"Remote server at {host} is using an incompatible version"
            elif "Deadline Exceeded" in details:
                simplified = f"Request to {host} timed out"
            else:
                simplified = details

            raise ConnpyError(simplified)
    return wrapper

class NodeStub:
    def __init__(self, channel, remote_host, config=None):
        self.stub = connpy_pb2_grpc.NodeServiceStub(channel)
        self.remote_host = remote_host
        self.config = config

    @handle_errors
    def connect_node(self, unique_id, sftp=False, debug=False, logger=None):
        import sys
        import select
        import tty
        import termios
        import os
        import threading

        def request_generator():
            cols, rows = 80, 24
            try:
                size = os.get_terminal_size()
                cols, rows = size.columns, size.lines
            except OSError:
                pass

            yield connpy_pb2.InteractRequest(
                id=unique_id, sftp=sftp, debug=debug, cols=cols, rows=rows
            )

            while True:
                r, _, _ = select.select([sys.stdin.fileno()], [], [])
                if r:
                    try:
                        data = os.read(sys.stdin.fileno(), 1024)
                        if not data:
                            break
                        yield connpy_pb2.InteractRequest(stdin_data=data)
                    except OSError:
                        break

        old_tty = termios.tcgetattr(sys.stdin)
        try:
            tty.setraw(sys.stdin.fileno())
            response_iterator = self.stub.interact_node(request_generator())

            for res in response_iterator:
                if res.stdout_data:
                    os.write(sys.stdout.fileno(), res.stdout_data)
        finally:
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, old_tty)

    @MethodHook
    @handle_errors
    def list_nodes(self, filter_str=None, format_str=None):
        req = connpy_pb2.FilterRequest(filter_str=filter_str or "", format_str=format_str or "")
        return from_value(self.stub.list_nodes(req).data) or []

    @MethodHook
    @handle_errors
    def list_folders(self, filter_str=None):
        req = connpy_pb2.FilterRequest(filter_str=filter_str or "")
        return from_value(self.stub.list_folders(req).data) or []

    @handle_errors
    def get_node_details(self, unique_id):
        return from_struct(self.stub.get_node_details(connpy_pb2.IdRequest(id=unique_id)).data)

    @handle_errors
    def explode_unique(self, unique_id):
        return from_value(self.stub.explode_unique(connpy_pb2.IdRequest(id=unique_id)).data)

    @handle_errors
    def generate_cache(self, nodes=None, folders=None, profiles=None):
        # 1. Update remote cache on server
        self.stub.generate_cache(Empty())

        # 2. Update local fzf/text cache files
        # If no data provided, we fetch it all from remote to sync local files
        if nodes is None and folders is None and profiles is None:
            nodes = self.list_nodes()
            folders = self.list_folders()
            # We don't have direct access to ProfileStub here, but usually
            # node cache is what matters for fzf. We'll fetch profiles if we can.
            # For now, let's sync what we have.

        if nodes is not None or folders is not None or profiles is not None:
            self.config._generate_nodes_cache(nodes=nodes, folders=folders, profiles=profiles)

    def _trigger_local_cache_sync(self):
        """Helper to fetch remote data and update local fzf cache files after a change."""
        try:
            nodes = self.list_nodes()
            folders = self.list_folders()
            self.generate_cache(nodes=nodes, folders=folders)
        except Exception:
            # Failure to sync cache shouldn't break the main operation's success feedback
            pass

    @handle_errors
    def add_node(self, unique_id, data, is_folder=False):
        req = connpy_pb2.NodeRequest(id=unique_id, data=to_struct(data), is_folder=is_folder)
        self.stub.add_node(req)
        self._trigger_local_cache_sync()

    @handle_errors
    def update_node(self, unique_id, data):
        req = connpy_pb2.NodeRequest(id=unique_id, data=to_struct(data), is_folder=False)
        self.stub.update_node(req)
        self._trigger_local_cache_sync()

    @handle_errors
    def delete_node(self, unique_id, is_folder=False):
        req = connpy_pb2.DeleteRequest(id=unique_id, is_folder=is_folder)
        self.stub.delete_node(req)
        self._trigger_local_cache_sync()

    @handle_errors
    def move_node(self, src_id, dst_id, copy=False):
        req = connpy_pb2.MoveRequest(src_id=src_id, dst_id=dst_id, copy=copy)
        self.stub.move_node(req)
        self._trigger_local_cache_sync()

    @handle_errors
    def bulk_add(self, ids, hosts, common_data):
        req = connpy_pb2.BulkRequest(ids=ids, hosts=hosts, common_data=to_struct(common_data))
        self.stub.bulk_add(req)
        self._trigger_local_cache_sync()

    @handle_errors
    def set_reserved_names(self, names):
        self.stub.set_reserved_names(connpy_pb2.ListRequest(items=names))
        self._trigger_local_cache_sync()

    @handle_errors
    def full_replace(self, connections, profiles):
        req = connpy_pb2.FullReplaceRequest(
            connections=to_struct(connections),
            profiles=to_struct(profiles)
        )
        self.stub.full_replace(req)
        self._trigger_local_cache_sync()

    @handle_errors
    def get_inventory(self):
        resp = self.stub.get_inventory(Empty())
        return {
            "connections": from_struct(resp.connections),
            "profiles": from_struct(resp.profiles)
        }


class ProfileStub:
    def __init__(self, channel, remote_host, node_stub=None):
        self.stub = connpy_pb2_grpc.ProfileServiceStub(channel)
        self.remote_host = remote_host
        self.node_stub = node_stub

    @handle_errors
    def list_profiles(self, filter_str=None):
        req = connpy_pb2.FilterRequest(filter_str=filter_str or "")
        return from_value(self.stub.list_profiles(req).data) or []

    @handle_errors
    def get_profile(self, name, resolve=True):
        req = connpy_pb2.ProfileRequest(name=name, resolve=resolve)
        return from_struct(self.stub.get_profile(req).data)

    @handle_errors
    def add_profile(self, name, data):
        req = connpy_pb2.NodeRequest(id=name, data=to_struct(data))
        self.stub.add_profile(req)
        if self.node_stub:
            self.node_stub._trigger_local_cache_sync()

    @handle_errors
    def resolve_node_data(self, node_data):
        req = connpy_pb2.StructRequest(data=to_struct(node_data))
        return from_struct(self.stub.resolve_node_data(req).data)

    @handle_errors
    def delete_profile(self, name):
        req = connpy_pb2.IdRequest(id=name)
        self.stub.delete_profile(req)
        if self.node_stub:
            self.node_stub._trigger_local_cache_sync()

    @handle_errors
    def update_profile(self, name, data):
        req = connpy_pb2.NodeRequest(id=name, data=to_struct(data))
        self.stub.update_profile(req)
        if self.node_stub:
            self.node_stub._trigger_local_cache_sync()


class PluginStub:
    def __init__(self, channel, remote_host):
        self.stub = connpy_pb2_grpc.PluginServiceStub(channel)
        self.remote_stub = remote_plugin_pb2_grpc.RemotePluginServiceStub(channel)
        self.remote_host = remote_host

    @handle_errors
    def list_plugins(self):
        return from_value(self.stub.list_plugins(Empty()).data)

    @handle_errors
    def add_plugin(self, name, source_file, update=False):
        # Read the local file content to send it to the server
        with open(source_file, "r") as f:
            content = f.read()

        # Use source_file as a marker for "content-inside"
        marker_content = f"---CONTENT---\n{content}"
        req = connpy_pb2.PluginRequest(name=name, source_file=marker_content, update=update)
        self.stub.add_plugin(req)

    @handle_errors
    def delete_plugin(self, name):
        self.stub.delete_plugin(connpy_pb2.IdRequest(id=name))

    @handle_errors
    def enable_plugin(self, name):
        self.stub.enable_plugin(connpy_pb2.IdRequest(id=name))

    @handle_errors
    def disable_plugin(self, name):
        self.stub.disable_plugin(connpy_pb2.IdRequest(id=name))

    @handle_errors
    def get_plugin_source(self, name):
        resp = self.remote_stub.get_plugin_source(remote_plugin_pb2.IdRequest(id=name))
        return resp.value

    @handle_errors
    def invoke_plugin(self, name, args_namespace):
        import json
        args_dict = {k: v for k, v in vars(args_namespace).items()
                     if isinstance(v, (str, int, float, bool, list, type(None)))}
        if hasattr(args_namespace, "func") and hasattr(args_namespace.func, "__name__"):
            args_dict["__func_name__"] = args_namespace.func.__name__

        req = remote_plugin_pb2.PluginInvokeRequest(name=name, args_json=json.dumps(args_dict))
        for chunk in self.remote_stub.invoke_plugin(req):
            yield chunk.text

class ExecutionStub:
    def __init__(self, channel, remote_host):
        self.stub = connpy_pb2_grpc.ExecutionServiceStub(channel)
        self.remote_host = remote_host

    @handle_errors
    def run_commands(self, nodes_filter, commands, variables=None, parallel=10, timeout=10, folder=None, prompt=None, **kwargs):
        nodes_list = [nodes_filter] if isinstance(nodes_filter, str) else list(nodes_filter)
        req = connpy_pb2.RunRequest(
            nodes=nodes_list,
            commands=commands,
            folder=folder or "",
            prompt=prompt or "",
            parallel=parallel,
        )
        # Note: 'timeout', 'on_node_complete', and 'logger' are currently not
        # sent over gRPC in the current proto definition.
        if variables is not None:
            req.vars.CopyFrom(to_struct(variables))

        final_results = {}
        on_complete = kwargs.get("on_node_complete")

        for response in self.stub.run_commands(req):
            if on_complete:
                on_complete(response.unique_id, response.output, response.status)
            final_results[response.unique_id] = response.output

        return final_results

    @handle_errors
    def test_commands(self, nodes_filter, commands, expected, variables=None, parallel=10, timeout=10, prompt=None, **kwargs):
        nodes_list = [nodes_filter] if isinstance(nodes_filter, str) else list(nodes_filter)
        req = connpy_pb2.TestRequest(
            nodes=nodes_list,
            commands=commands,
            expected=expected,
            folder=kwargs.get("folder", ""),
            prompt=prompt or "",
            parallel=parallel,
        )
        if variables is not None:
            req.vars.CopyFrom(to_struct(variables))

        final_results = {}
        on_complete = kwargs.get("on_node_complete")

        for response in self.stub.test_commands(req):
            result_dict = from_struct(response.test_result) if response.HasField("test_result") else {}
            if on_complete:
                on_complete(response.unique_id, response.output, response.status, result_dict)
            final_results[response.unique_id] = result_dict

        return final_results

    @handle_errors
    def run_cli_script(self, nodes_filter, script_path, parallel=10):
        req = connpy_pb2.ScriptRequest(param1=nodes_filter, param2=script_path, parallel=parallel)
        return from_struct(self.stub.run_cli_script(req).data)

    @handle_errors
    def run_yaml_playbook(self, playbook_path, parallel=10):
        req = connpy_pb2.ScriptRequest(param1=playbook_path, parallel=parallel)
        return from_struct(self.stub.run_yaml_playbook(req).data)

class ImportExportStub:
    def __init__(self, channel, remote_host):
        self.stub = connpy_pb2_grpc.ImportExportServiceStub(channel)
        self.remote_host = remote_host

    @handle_errors
    def export_to_file(self, file_path, folders=None):
        req = connpy_pb2.ExportRequest(file_path=file_path, folders=folders or [])
        self.stub.export_to_file(req)

    @handle_errors
    def import_from_file(self, file_path):
        with open(file_path, "r") as f:
            content = f.read()
        # Marker to tell the server this is content, not a path
        marker_content = f"---YAML---\n{content}"
        self.stub.import_from_file(connpy_pb2.StringRequest(value=marker_content))

    @handle_errors
    def set_reserved_names(self, names):
        self.stub.set_reserved_names(connpy_pb2.ListRequest(items=names))

class AIStub:
    def __init__(self, channel, remote_host):
        self.stub = connpy_pb2_grpc.AIServiceStub(channel)
        self.remote_host = remote_host

    @handle_errors
    def ask(self, input_text, dryrun=False, chat_history=None, session_id=None, debug=False, status=None, **overrides):
        import queue
        from rich.prompt import Prompt
        from rich.text import Text
        from rich.live import Live
        from rich.panel import Panel
        from rich.markdown import Markdown

        req_queue = queue.Queue()

        initial_req = connpy_pb2.AskRequest(
            input_text=input_text,
            dryrun=dryrun,
            session_id=session_id or "",
            debug=debug,
            engineer_model=overrides.get("engineer_model", ""),
            engineer_api_key=overrides.get("engineer_api_key", ""),
            architect_model=overrides.get("architect_model", ""),
            architect_api_key=overrides.get("architect_api_key", ""),
            trust=overrides.get("trust", False)
        )
        if chat_history is not None:
            initial_req.chat_history.CopyFrom(to_value(chat_history))

        req_queue.put(initial_req)

        def request_generator():
            while True:
                req = req_queue.get()
                if req is None: break
                yield req

        responses = self.stub.ask(request_generator())

        full_content = ""
        live_display = None
        final_result = {"response": "", "chat_history": []}

        # Background thread to pull responses from gRPC into a local queue
        # This prevents KeyboardInterrupt from corrupting the gRPC iterator state
        response_queue = queue.Queue()

        def pull_responses():
            try:
                for response in responses:
                    response_queue.put(("data", response))
            except Exception as e:
                response_queue.put(("error", e))
            finally:
                response_queue.put((None, None))

        threading.Thread(target=pull_responses, daemon=True).start()

        try:
            while True:
                try:
                    # BLOCKING GET from local queue (interruptible by signal)
                    msg_type, response = response_queue.get()
                except KeyboardInterrupt:
                    # Signal interruption to the server
                    if status:
                        status.update("[error]Interrupted! Closing pending tasks...")

                    # Send the interrupt signal to the server
                    req_queue.put(connpy_pb2.AskRequest(interrupt=True))

                    # CONTINUE the loop to receive remaining data and summary from the queue
                    continue

                if msg_type is None:  # Sentinel
                    break

                if msg_type == "error":
                    # Re-raise or handle gRPC error from background thread
                    if isinstance(response, grpc.RpcError):
                        raise response
                    printer.warning(f"Stream interrupted: {response}")
                    break

                if response.status_update:
                    if response.requires_confirmation:
                        if status: status.stop()
                        if live_display: live_display.stop()

                        # Show prompt and wait for answer
                        prompt_text = Text.from_ansi(response.status_update)
                        ans = Prompt.ask(prompt_text)

                        if status:
                            status.update("[ai_status]Agent: Resuming...")
                            status.start()
                        if live_display: live_display.start()

                        req_queue.put(connpy_pb2.AskRequest(confirmation_answer=ans))
                        continue

                    if status:
                        status.update(response.status_update)
                    continue

                if response.debug_message:
                    if debug:
                        printer.console.print(Text.from_ansi(response.debug_message))
                    continue

                if response.important_message:
                    printer.console.print(Text.from_ansi(response.important_message))
                    continue

                if not response.is_final:
                    full_content += response.text_chunk

                    if not live_display and not debug:
                        if status: status.stop()
                        live_display = Live(
                            Panel(Markdown(full_content), title="AI Assistant", expand=False),
                            console=printer.console,
                            refresh_per_second=8,
                            transient=False
                        )
                        live_display.start()
                    elif live_display:
                        live_display.update(Panel(Markdown(full_content), title="AI Assistant", expand=False))
                    continue

                if response.is_final:
                    final_result = from_struct(response.full_result)
                    responder = final_result.get("responder", "engineer")
                    alias = "architect" if responder == "architect" else "engineer"
                    role_label = "Network Architect" if responder == "architect" else "Network Engineer"
                    title = f"[bold {alias}]{role_label}[/bold {alias}]"

                    if live_display:
                        live_display.update(Panel(Markdown(full_content), title=title, border_style=alias, expand=False))
                        live_display.stop()
                    elif full_content:
                        printer.console.print(Panel(Markdown(full_content), title=title, border_style=alias, expand=False))
                    break
        except Exception as e:
            # Check if it was a gRPC error that we should let handle_errors catch
            if isinstance(e, grpc.RpcError):
                raise
            printer.warning(f"Stream interrupted: {e}")
        finally:
            req_queue.put(None)

        if full_content:
            final_result["streamed"] = True

        return final_result

    @handle_errors
    def confirm(self, input_text, console=None):
        return self.stub.confirm(connpy_pb2.StringRequest(value=input_text)).value

    @handle_errors
    def list_sessions(self):
        return from_value(self.stub.list_sessions(Empty()).data)

    @handle_errors
    def delete_session(self, session_id):
        self.stub.delete_session(connpy_pb2.StringRequest(value=session_id))

    @handle_errors
    def configure_provider(self, provider, model=None, api_key=None):
        req = connpy_pb2.ProviderRequest(provider=provider, model=model or "", api_key=api_key or "")
        self.stub.configure_provider(req)

    @handle_errors
    def load_session_data(self, session_id):
        return from_struct(self.stub.load_session_data(connpy_pb2.StringRequest(value=session_id)).data)

class SystemStub:
    def __init__(self, channel, remote_host):
        self.stub = connpy_pb2_grpc.SystemServiceStub(channel)
        self.remote_host = remote_host

    @handle_errors
    def start_api(self, port=None):
        self.stub.start_api(connpy_pb2.IntRequest(value=port or 8048))

    @handle_errors
    def debug_api(self, port=None):
        self.stub.debug_api(connpy_pb2.IntRequest(value=port or 8048))

    @handle_errors
    def stop_api(self):
        self.stub.stop_api(Empty())

    @handle_errors
    def restart_api(self, port=None):
        self.stub.restart_api(connpy_pb2.IntRequest(value=port or 8048))

    @handle_errors
    def get_api_status(self):
        return self.stub.get_api_status(Empty()).value
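The `handle_errors` decorator above translates transport-level `grpc.RpcError`s into the app's native `ConnpyError` so CLI handlers stay transport-agnostic. A runnable sketch of the same technique; `FakeRpcError` and `ConnpyError` here are stand-ins so the example needs no grpcio:

```python
import functools

class ConnpyError(Exception):
    """Stand-in for connpy's domain error type (assumption for this sketch)."""

class FakeRpcError(Exception):
    """Stand-in for grpc.RpcError, which exposes details() on the instance."""
    def __init__(self, details):
        self._details = details
    def details(self):
        return self._details

def handle_errors(func):
    """Translate transport errors into a friendlier domain error."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except FakeRpcError as e:
            details = e.details()
            # Identify the host if available on the instance (first positional arg)
            instance = args[0] if args else None
            host = getattr(instance, "remote_host", "remote host")
            if "failed to connect to all addresses" in details:
                simplified = f"Failed to connect to remote host at {host} (Connection refused)"
            else:
                simplified = details
            raise ConnpyError(simplified) from e
    return wrapper

class Stub:
    remote_host = "10.0.0.1:8048"

    @handle_errors
    def ping(self):
        raise FakeRpcError("failed to connect to all addresses")

try:
    Stub().ping()
except ConnpyError as e:
    print(e)  # Failed to connect to remote host at 10.0.0.1:8048 (Connection refused)
```

Because the decorator inspects `args[0]` for a `remote_host` attribute, the same wrapper works unchanged on every stub class.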
@@ -0,0 +1,30 @@
import json
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Struct, Value

def to_value(obj):
    if obj is None:
        v = Value()
        v.null_value = 0
        return v
    json_str = json.dumps(obj)
    v = Value()
    json_format.Parse(json_str, v)
    return v

def from_value(val):
    if not val.HasField("kind"):
        return None
    return json.loads(json_format.MessageToJson(val))

def to_struct(obj):
    if not obj:
        return Struct()
    s = Struct()
    json_format.ParseDict(obj, s)
    return s

def from_struct(struct):
    if not struct:
        return {}
    return json_format.MessageToDict(struct, preserving_proto_field_name=True)
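These helpers lean on `google.protobuf.json_format` to shuttle arbitrary JSON-compatible Python objects through the `Value` well-known type. Assuming the `protobuf` package is installed, the round-trip contract they rely on can be exercised directly (note that `Value` stores all numbers as doubles, so integers may come back as floats that still compare equal):

```python
import json
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value

def to_value(obj):
    # Encode any JSON-compatible Python object into a protobuf Value
    v = Value()
    if obj is None:
        v.null_value = 0
        return v
    json_format.Parse(json.dumps(obj), v)
    return v

def from_value(val):
    # Decode back to Python; an unset oneof means "no data"
    if not val.HasField("kind"):
        return None
    return json.loads(json_format.MessageToJson(val))

data = {"nodes": ["r1", "r2"], "parallel": 10, "folder": None}
assert from_value(to_value(data)) == data
assert from_value(to_value(None)) is None
```

The same JSON-string detour is what lets the RPC layer pass untyped dicts and lists without defining a message per payload shape.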
+10
-9
@@ -22,16 +22,17 @@ class MethodHook:
            except Exception as e:
                printer.error(f"{self.func.__name__} Pre-hook {hook.__name__} raised an exception: {e}")

        try:
            result = self.func(*args, **kwargs)
        result = self.func(*args, **kwargs)

        finally:
            # Execute post-hooks after the original function
            for hook in self.post_hooks:
                try:
                    result = hook(*args, **kwargs, result=result)  # Pass result to hooks
                except Exception as e:
                    printer.error(f"{self.func.__name__} Post-hook {hook.__name__} raised an exception: {e}")
        # Execute post-hooks after the original function
        if self.post_hooks:
            #printer.info(f"Executing {len(self.post_hooks)} post-hooks for {self.func.__name__}...")
            pass
        for hook in self.post_hooks:
            try:
                result = hook(*args, **kwargs, result=result)  # Pass result to hooks
            except Exception as e:
                printer.error(f"{self.func.__name__} Post-hook {hook.__name__} raised an exception: {e}")

        return result
+119 -2
@@ -11,6 +11,27 @@ class Plugins:
        self.plugins = {}
        self.plugin_parsers = {}
        self.preloads = {}
        self.remote_plugins = {}
        self.preferences = {}

    def _load_preferences(self, config_dir):
        import json
        path = os.path.join(config_dir, "plugin_preferences.json")
        try:
            with open(path) as f:
                self.preferences = json.load(f)
        except (FileNotFoundError, json.JSONDecodeError):
            self.preferences = {}

    def _save_preferences(self, config_dir):
        import json
        path = os.path.join(config_dir, "plugin_preferences.json")
        try:
            with open(path, "w") as f:
                json.dump(self.preferences, f, indent=4)
        except OSError as e:
            printer.error(f"Failed to save plugin preferences: {e}")
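The two methods above persist the local-vs-remote preference per plugin as a flat JSON map. A stand-alone sketch of that round-trip, with the file name and shape taken from the diff (the function names here are simplified stand-ins for the `_load_preferences`/`_save_preferences` methods):

```python
import json
import os
import tempfile


def save_preferences(config_dir, preferences):
    path = os.path.join(config_dir, "plugin_preferences.json")
    with open(path, "w") as f:
        json.dump(preferences, f, indent=4)


def load_preferences(config_dir):
    path = os.path.join(config_dir, "plugin_preferences.json")
    try:
        with open(path) as f:
            return json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        return {}


with tempfile.TemporaryDirectory() as d:
    empty = load_preferences(d)                  # missing file falls back to {}
    save_preferences(d, {"myplugin": "remote"})  # prefer the server copy of "myplugin"
    loaded = load_preferences(d)
```

Swallowing `JSONDecodeError` means a hand-edited, corrupt preferences file silently resets to defaults instead of breaking CLI startup.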

    def verify_script(self, file_path):
        """
@@ -114,7 +135,7 @@ class Plugins:
        spec.loader.exec_module(module)
        return module

    def _import_plugins_to_argparse(self, directory, subparsers, remote_enabled=False):
        if not os.path.exists(directory):
            return
        for filename in os.listdir(directory):
@@ -123,6 +144,11 @@ class Plugins:
            root_filename = os.path.splitext(filename)[0]
            if root_filename in commands:
                continue

            # Check preferences: if remote is preferred AND remote is enabled, skip local loading
            if remote_enabled and self.preferences.get(root_filename) == "remote":
                continue

            # Construct the full path
            filepath = os.path.join(directory, filename)
            check_file = self.verify_script(filepath)
@@ -134,7 +160,98 @@ class Plugins:
            if hasattr(self.plugins[root_filename], "Parser"):
                self.plugin_parsers[root_filename] = self.plugins[root_filename].Parser()
                plugin = self.plugin_parsers[root_filename]
                # Default to RichHelpFormatter if plugin doesn't set one
                try:
                    from rich_argparse import RichHelpFormatter as _RHF
                    fmt = plugin.parser.formatter_class
                    if fmt is argparse.HelpFormatter or fmt is argparse.RawTextHelpFormatter or fmt is argparse.RawDescriptionHelpFormatter:
                        fmt = _RHF
                except ImportError:
                    fmt = plugin.parser.formatter_class
                subparsers.add_parser(root_filename, parents=[self.plugin_parsers[root_filename].parser], add_help=False, help=plugin.parser.description, usage=plugin.parser.usage, description=plugin.parser.description, epilog=plugin.parser.epilog, formatter_class=fmt)
            if hasattr(self.plugins[root_filename], "Preload"):
                self.preloads[root_filename] = self.plugins[root_filename]

    def _import_remote_plugins_to_argparse(self, plugin_stub, subparsers, cache_dir, force_sync=False):
        import hashlib
        os.makedirs(cache_dir, exist_ok=True)

        try:
            remote_plugins_info = plugin_stub.list_plugins()
        except Exception:
            return

        # Pruning: Remove local cached files that are no longer on the server
        for local_file in os.listdir(cache_dir):
            if local_file.endswith(".py"):
                name = local_file[:-3]
                if name not in remote_plugins_info:
                    try:
                        os.remove(os.path.join(cache_dir, local_file))
                    except Exception:
                        pass

        for name, info in remote_plugins_info.items():
            if not info.get("enabled", True):
                continue

            pref = self.preferences.get(name, "local")
            if pref != "remote" and name in self.plugins:
                continue
            if not force_sync and name in subparsers.choices:
                continue

            cache_path = os.path.join(cache_dir, f"{name}.py")

            # Hash comparison
            remote_hash = info.get("hash", "")
            local_hash = ""
            if os.path.exists(cache_path):
                try:
                    with open(cache_path, "rb") as f:
                        local_hash = hashlib.md5(f.read()).hexdigest()
                except Exception:
                    pass

            # Update only if hash differs or force_sync is True
            if force_sync or remote_hash != local_hash or not os.path.exists(cache_path):
                try:
                    source = plugin_stub.get_plugin_source(name)
                    with open(cache_path, "w") as f:
                        f.write(source)
                except Exception as e:
                    printer.warning(f"Failed to sync remote plugin {name}: {e}")
                    continue

            # Verify and load
            check_file = self.verify_script(cache_path)
            if check_file:
                printer.warning(f"Remote plugin {name} failed verification: {check_file}")
                continue

            module = self._import_from_path(cache_path)
            if hasattr(module, "Parser"):
                self.plugin_parsers[name] = module.Parser()
                self.remote_plugins[name] = True
                plugin = self.plugin_parsers[name]
                try:
                    from rich_argparse import RichHelpFormatter as _RHF
                    fmt = plugin.parser.formatter_class
                    if fmt is argparse.HelpFormatter or fmt is argparse.RawTextHelpFormatter or fmt is argparse.RawDescriptionHelpFormatter:
                        fmt = _RHF
                except ImportError:
                    fmt = plugin.parser.formatter_class

                # If force_sync, we might be re-registering, but argparse subparsers.add_parser
                # might fail if it exists. We check if it's already there.
                if name not in subparsers.choices:
                    subparsers.add_parser(
                        name,
                        parents=[plugin.parser],
                        add_help=False,
                        help=f"[remote] {plugin.parser.description}",
                        usage=plugin.parser.usage,
                        description=plugin.parser.description,
                        epilog=plugin.parser.epilog,
                        formatter_class=fmt
                    )
+243 -20
@@ -1,51 +1,274 @@
import sys
# Lazy-loaded printer module to speed up CLI startup
_console = None
_err_console = None
_theme = None

# Centralized design system
STYLES = {
    "info": "cyan",
    "warning": "yellow",
    "error": "red",
    "success": "green",
    "debug": "dim",
    "header": "bold cyan",
    "key": "bold cyan",
    "border": "cyan",
    "pass": "bold green",
    "fail": "bold red",
    "engineer": "blue",
    "architect": "medium_purple",
    "ai_status": "bold green",
    "user_prompt": "bold cyan",
    "unavailable": "orange3",
}

def _get_console():
    global _console, _theme
    if _console is None:
        from rich.console import Console
        from rich.theme import Theme
        if _theme is None:
            _theme = Theme(STYLES)
        _console = Console(theme=_theme)
    return _console

def _get_err_console():
    global _err_console, _theme
    if _err_console is None:
        from rich.console import Console
        from rich.theme import Theme
        if _theme is None:
            _theme = Theme(STYLES)
        _err_console = Console(stderr=True, theme=_theme)
    return _err_console

def apply_theme(user_styles=None):
    """
    Updates the global console themes with user-defined styles.
    If a style is missing in user_styles, it falls back to the default in STYLES.
    """
    global _theme, _console, _err_console
    from rich.theme import Theme

    # Start with a copy of defaults
    active_styles = STYLES.copy()
    if user_styles:
        # Merge user styles (only if they are valid keys)
        for key, value in user_styles.items():
            if key in active_styles:
                active_styles[key] = value

    _theme = Theme(active_styles)
    if _console:
        _console.push_theme(_theme)
    if _err_console:
        _err_console.push_theme(_theme)
    return active_styles


def _format_multiline(tag, message, style=None):
    message = str(message)
    lines = message.splitlines()
    if not lines:
        return f"[{style}]\\[{tag}][/{style}]" if style else f"\\[{tag}]"

    # Apply style to the tag if provided
    styled_tag = f"[{style}]\\[{tag}][/{style}]" if style else f"\\[{tag}]"
    formatted = [f"{styled_tag} {lines[0]}"]

    # Indent subsequent lines
    indent = " " * (len(tag) + 3)
    for line in lines[1:]:
        formatted.append(f"{indent}{line}")
    return "\n".join(formatted)

def info(message):
    _get_console().print(_format_multiline("i", message, style="info"))

def success(message):
    _get_console().print(_format_multiline("✓", message, style="success"))

def start(message):
    _get_console().print(_format_multiline("+", message, style="success"))

def warning(message):
    _get_console().print(_format_multiline("!", message, style="warning"))

def error(message):
    # err_console handles styles better than standard print and outputs to stderr.
    _get_err_console().print(_format_multiline("✗", message, style="error"))

def debug(message):
    _get_console().print(_format_multiline("d", message, style="debug"))

def custom(tag, message):
    _get_console().print(_format_multiline(tag, message, style="header"))

def table(title, columns, rows, header_style="header", box=None):
    from rich.table import Table
    t = Table(title=title, header_style=header_style, box=box)
    for col in columns:
        t.add_column(col)
    for row in rows:
        t.add_row(*[str(item) for item in row])
    _get_console().print(t)

def data(title, content, language="yaml"):
    """Display structured data with syntax highlighting inside a panel."""
    from rich.syntax import Syntax
    from rich.panel import Panel
    syntax = Syntax(content, language, theme="ansi_dark", word_wrap=True, background_color="default")
    panel = Panel(syntax, title=f"[header]{title}[/header]", border_style="border", expand=False)
    _get_console().print(panel)

def node_panel(unique, output, status, title_prefix=""):
    """Display node execution result in a styled panel."""
    from rich.panel import Panel
    from rich.text import Text
    from rich.console import Group
    import os

    try:
        cols, _ = os.get_terminal_size()
    except OSError:
        cols = 80

    if status == 0:
        status_str = "[pass]✓ PASS[/pass]"
        border = "pass"
    else:
        status_str = f"[fail]✗ FAIL({status})[/fail]"
        border = "fail"

    title_line = f"{title_prefix}[bold]{unique}[/bold] — {status_str}"
    stripped = output.strip() if output else ""
    code_block = Text(stripped + "\n") if stripped else Text()

    _get_console().print(Panel(Group(Text(), code_block), title=title_line, width=cols, border_style=border))

def test_panel(unique, output, status, result):
    """Display test execution result in a styled panel."""
    from rich.panel import Panel
    from rich.text import Text
    from rich.console import Group
    import os

    try:
        cols, _ = os.get_terminal_size()
    except OSError:
        cols = 80

    is_pass = (status == 0 and result and all(result.values()))

    if is_pass:
        status_str = "[pass]✓ PASS[/pass]"
        border = "pass"
    else:
        status_str = "[fail]✗ FAIL[/fail]"
        border = "fail"

    title_line = f"[bold]{unique}[/bold] — {status_str}"

    stripped = output.strip() if output else ""
    code_block = Text(stripped + "\n") if stripped else Text()

    test_results = Text()
    test_results.append("\nTEST RESULTS:\n", style="header")
    if result:
        max_key_len = max(len(k) for k in result.keys())
        for k, v in result.items():
            mark = "✓" if v else "✗"
            style = "success" if v else "error"
            test_results.append(f" {k.ljust(max_key_len)} {mark}\n", style=style)
    else:
        test_results.append(" No results (execution failed)\n", style="error")

    _get_console().print(Panel(Group(Text(), code_block, test_results), title=title_line, width=cols, border_style=border))

def test_summary(results):
    """Print an aggregate summary of multiple test results."""
    from rich.panel import Panel
    from rich.text import Text
    from rich.console import Group
    import os

    try:
        cols, _ = os.get_terminal_size()
    except OSError:
        cols = 80

    for node, test_result in results.items():
        status_code = 0 if test_result and all(test_result.values()) else 1
        if status_code == 0:
            status_str = "[pass]✓ PASS[/pass]"
            border = "pass"
        else:
            status_str = "[fail]✗ FAIL[/fail]"
            border = "fail"

        title_line = f"[bold]{node}[/bold] — {status_str}"

        test_output = Text()
        test_output.append("TEST RESULTS:\n", style="header")
        max_key_len = max(len(k) for k in test_result.keys()) if test_result else 0
        for k, v in (test_result.items() if test_result else []):
            mark = "✓" if v else "✗"
            style = "success" if v else "error"
            test_output.append(f" {k.ljust(max_key_len)} {mark}\n", style=style)

        _get_console().print(Panel(Group(Text(), test_output), title=title_line, width=cols, border_style=border))

def header(text):
    """Print a section header."""
    from rich.rule import Rule
    _get_console().print(Rule(text, style="header"))

def kv(key, value):
    """Print an inline key-value pair."""
    _get_console().print(f"[key]{key}[/key]: {value}")

def confirm_action(item, action):
    """Print a confirmation pre-action message."""
    _get_console().print(f"\\[i] [bold]{action}[/bold]: {item}", style="info")

# Compatibility proxies
class _ConsoleProxy:
    def __getattr__(self, name):
        return getattr(_get_console(), name)
    def __call__(self, *args, **kwargs):
        return _get_console()(*args, **kwargs)

class _ErrConsoleProxy:
    def __getattr__(self, name):
        return getattr(_get_err_console(), name)
    def __call__(self, *args, **kwargs):
        return _get_err_console()(*args, **kwargs)

console = _ConsoleProxy()
err_console = _ErrConsoleProxy()

# theme also needs to be lazy
class _ThemeProxy:
    def __getattr__(self, name):
        global _theme
        if _theme is None:
            from rich.theme import Theme
            _theme = Theme(STYLES)
        return getattr(_theme, name)

connpy_theme = _ThemeProxy()
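The `_ConsoleProxy`/`_ThemeProxy` classes above defer `rich` imports and `Console` construction until the first attribute access, which is what makes the module cheap to import. A stand-alone sketch of the same lazy-proxy pattern, using a fake console so it runs without `rich` installed (all names here are illustrative):

```python
class _LazyProxy:
    """Defer construction of a heavyweight object until first attribute access."""

    def __init__(self, factory):
        self._factory = factory
        self._obj = None

    def __getattr__(self, name):
        # __getattr__ only fires for attributes not found on the proxy itself,
        # i.e. for attributes of the wrapped object.
        if self._obj is None:
            self._obj = self._factory()
        return getattr(self._obj, name)


created = []

class FakeConsole:
    def __init__(self):
        created.append(True)  # record that the expensive object was built

    def print(self, *args):
        return None


console = _LazyProxy(FakeConsole)
before = list(created)   # nothing built at "import" time
console.print("hello")   # first use triggers construction
after = list(created)
```

The trade-off is that `isinstance(console, Console)` checks no longer hold, which is why the module keeps the proxies only as compatibility shims while new code calls `_get_console()` directly.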
@@ -0,0 +1,251 @@
syntax = "proto3";

package connpy;

import "google/protobuf/struct.proto";
import "google/protobuf/empty.proto";

service NodeService {
    rpc list_nodes (FilterRequest) returns (ValueResponse) {}
    rpc list_folders (FilterRequest) returns (ValueResponse) {}
    rpc get_node_details (IdRequest) returns (StructResponse) {}
    rpc explode_unique (IdRequest) returns (ValueResponse) {}
    rpc generate_cache (google.protobuf.Empty) returns (google.protobuf.Empty) {}
    rpc add_node (NodeRequest) returns (google.protobuf.Empty) {}
    rpc update_node (NodeRequest) returns (google.protobuf.Empty) {}
    rpc delete_node (DeleteRequest) returns (google.protobuf.Empty) {}
    rpc move_node (MoveRequest) returns (google.protobuf.Empty) {}
    rpc bulk_add (BulkRequest) returns (google.protobuf.Empty) {}
    rpc set_reserved_names (ListRequest) returns (google.protobuf.Empty) {}
    rpc interact_node (stream InteractRequest) returns (stream InteractResponse) {}
    rpc full_replace (FullReplaceRequest) returns (google.protobuf.Empty) {}
    rpc get_inventory (google.protobuf.Empty) returns (FullReplaceRequest) {}
}

service ProfileService {
    rpc list_profiles (FilterRequest) returns (ValueResponse) {}
    rpc get_profile (ProfileRequest) returns (StructResponse) {}
    rpc add_profile (NodeRequest) returns (google.protobuf.Empty) {}
    rpc resolve_node_data (StructRequest) returns (StructResponse) {}
    rpc delete_profile (IdRequest) returns (google.protobuf.Empty) {}
    rpc update_profile (NodeRequest) returns (google.protobuf.Empty) {}
}

service ConfigService {
    rpc get_settings (google.protobuf.Empty) returns (StructResponse) {}
    rpc get_default_dir (google.protobuf.Empty) returns (StringResponse) {}
    rpc set_config_folder (StringRequest) returns (google.protobuf.Empty) {}
    rpc update_setting (UpdateRequest) returns (google.protobuf.Empty) {}
    rpc encrypt_password (StringRequest) returns (StringResponse) {}
    rpc apply_theme_from_file (StringRequest) returns (StructResponse) {}
}

service PluginService {
    rpc list_plugins (google.protobuf.Empty) returns (ValueResponse) {}
    rpc add_plugin (PluginRequest) returns (google.protobuf.Empty) {}
    rpc delete_plugin (IdRequest) returns (google.protobuf.Empty) {}
    rpc enable_plugin (IdRequest) returns (google.protobuf.Empty) {}
    rpc disable_plugin (IdRequest) returns (google.protobuf.Empty) {}
}

service ExecutionService {
    rpc run_commands (RunRequest) returns (stream NodeRunResult) {}
    rpc test_commands (TestRequest) returns (stream NodeRunResult) {}
    rpc run_cli_script (ScriptRequest) returns (StructResponse) {}
    rpc run_yaml_playbook (ScriptRequest) returns (StructResponse) {}
}

service ImportExportService {
    rpc export_to_file (ExportRequest) returns (google.protobuf.Empty) {}
    rpc import_from_file (StringRequest) returns (google.protobuf.Empty) {}
    rpc set_reserved_names (ListRequest) returns (google.protobuf.Empty) {}
}

service AIService {
    rpc ask (stream AskRequest) returns (stream AIResponse) {}
    rpc confirm (StringRequest) returns (BoolResponse) {}
    rpc list_sessions (google.protobuf.Empty) returns (ValueResponse) {}
    rpc delete_session (StringRequest) returns (google.protobuf.Empty) {}
    rpc configure_provider (ProviderRequest) returns (google.protobuf.Empty) {}
    rpc load_session_data (StringRequest) returns (StructResponse) {}
}

service SystemService {
    rpc start_api (IntRequest) returns (google.protobuf.Empty) {}
    rpc debug_api (IntRequest) returns (google.protobuf.Empty) {}
    rpc stop_api (google.protobuf.Empty) returns (google.protobuf.Empty) {}
    rpc restart_api (IntRequest) returns (google.protobuf.Empty) {}
    rpc get_api_status (google.protobuf.Empty) returns (BoolResponse) {}
}

// Request and Response Messages

message InteractRequest {
    string id = 1;
    bool sftp = 2;
    bool debug = 3;
    bytes stdin_data = 4;
    int32 cols = 5;
    int32 rows = 6;
}

message InteractResponse {
    bytes stdout_data = 1;
}

message FilterRequest {
    string filter_str = 1;
    string format_str = 2;
}

message ValueResponse {
    google.protobuf.Value data = 1;
}

message IdRequest {
    string id = 1;
}

message NodeRequest {
    string id = 1;
    google.protobuf.Struct data = 2;
    bool is_folder = 3;
}

message DeleteRequest {
    string id = 1;
    bool is_folder = 2;
}

message MessageValue {
    string value = 1;
}

message MoveRequest {
    string src_id = 1;
    string dst_id = 2;
    bool copy = 3;
}

message BulkRequest {
    repeated string ids = 1;
    repeated string hosts = 2;
    google.protobuf.Struct common_data = 3;
}

message StructResponse {
    google.protobuf.Struct data = 1;
}

message ProfileRequest {
    string name = 1;
    bool resolve = 2;
}

message StructRequest {
    google.protobuf.Struct data = 1;
}

message StringRequest {
    string value = 1;
}

message StringResponse {
    string value = 1;
}

message UpdateRequest {
    string key = 1;
    google.protobuf.Value value = 2;
}

message PluginRequest {
    string name = 1;
    string source_file = 2;
    bool update = 3;
}

message RunRequest {
    repeated string nodes = 1;
    repeated string commands = 2;
    string folder = 3;
    string prompt = 4;
    int32 parallel = 5;
    google.protobuf.Struct vars = 6;
}

message TestRequest {
    repeated string nodes = 1;
    repeated string commands = 2;
    string expected = 3;
    string folder = 4;
    string prompt = 5;
    int32 parallel = 6;
    google.protobuf.Struct vars = 7;
}

message ScriptRequest {
    string param1 = 1; // nodes_filter or playbook_path
    string param2 = 2; // script_path or ""
    int32 parallel = 3;
}

message ExportRequest {
    string file_path = 1;
    repeated string folders = 2;
}

message ListRequest {
    repeated string items = 1;
}

message AskRequest {
    string input_text = 1;
    bool dryrun = 2;
    google.protobuf.Value chat_history = 3;
    string session_id = 4;
    bool debug = 5;
    string engineer_model = 6;
    string engineer_api_key = 7;
    string architect_model = 8;
    string architect_api_key = 9;
    bool trust = 10;
    string confirmation_answer = 11;
    bool interrupt = 12;
}

message AIResponse {
    string text_chunk = 1;
    bool is_final = 2;
    google.protobuf.Struct full_result = 3;
    string status_update = 4;
    string debug_message = 5;
    bool requires_confirmation = 6;
    string important_message = 7;
}

message BoolResponse {
    bool value = 1;
}

message ProviderRequest {
    string provider = 1;
    string model = 2;
    string api_key = 3;
}

message IntRequest {
    int32 value = 1;
}

message NodeRunResult {
    string unique_id = 1;
    string output = 2;
    int32 status = 3;
    google.protobuf.Struct test_result = 4;
}

message FullReplaceRequest {
    google.protobuf.Struct connections = 1;
    google.protobuf.Struct profiles = 2;
}
@@ -0,0 +1,28 @@
from .exceptions import *
from .node_service import NodeService
from .profile_service import ProfileService
from .execution_service import ExecutionService
from .import_export_service import ImportExportService
from .ai_service import AIService
from .plugin_service import PluginService
from .config_service import ConfigService
from .system_service import SystemService

__all__ = [
    'NodeService',
    'ProfileService',
    'ExecutionService',
    'ImportExportService',
    'AIService',
    'PluginService',
    'ConfigService',
    'SystemService',
    'ConnpyError',
    'NodeNotFoundError',
    'NodeAlreadyExistsError',
    'ProfileNotFoundError',
    'ProfileAlreadyExistsError',
    'ExecutionError',
    'InvalidConfigurationError'
]
@@ -0,0 +1,53 @@
from .base import BaseService
from .exceptions import InvalidConfigurationError


class AIService(BaseService):
    """Business logic for interacting with AI agents and LLM configurations."""

    def ask(self, input_text, dryrun=False, chat_history=None, status=None, debug=False, session_id=None, console=None, chunk_callback=None, confirm_handler=None, trust=False, **overrides):
        """Send a prompt to the AI agent."""
        from connpy.ai import ai
        agent = ai(self.config, console=console, confirm_handler=confirm_handler, trust=trust, **overrides)
        return agent.ask(input_text, dryrun, chat_history, status=status, debug=debug, session_id=session_id, chunk_callback=chunk_callback)

    def confirm(self, input_text, console=None):
        """Ask for a safe confirmation of an action."""
        from connpy.ai import ai
        agent = ai(self.config, console=console)
        return agent.confirm(input_text)

    def list_sessions(self):
        """Return a list of all saved AI sessions."""
        from connpy.ai import ai
        agent = ai(self.config)
        return agent._get_sessions()

    def delete_session(self, session_id):
        """Delete an AI session by ID."""
        import os
        sessions_dir = os.path.join(self.config.defaultdir, "ai_sessions")
        path = os.path.join(sessions_dir, f"{session_id}.json")
        if os.path.exists(path):
            os.remove(path)
        else:
            raise InvalidConfigurationError(f"Session '{session_id}' not found.")

    def configure_provider(self, provider, model=None, api_key=None):
        """Update AI provider settings in the configuration."""
        settings = self.config.config.get("ai", {})
        if model:
            settings[f"{provider}_model"] = model
        if api_key:
            settings[f"{provider}_api_key"] = api_key

        self.config.config["ai"] = settings
        self.config._saveconfig(self.config.file)

    def load_session_data(self, session_id):
        """Load a session's raw data by ID."""
        from connpy.ai import ai
        agent = ai(self.config)
        return agent.load_session_data(session_id)
@@ -0,0 +1,33 @@
from connpy.hooks import MethodHook


class BaseService:
    """Base class for all connpy services, providing common configuration access."""

    def __init__(self, config=None):
        """
        Initialize the service.

        Args:
            config: An instance of configfile (or None to instantiate a new one/use global context).
        """
        from connpy import configfile
        self.config = config or configfile()
        self.hooks = MethodHook
        self.reserved_names = []

    def set_reserved_names(self, names):
        """Inject a list of reserved names (e.g. from the CLI)."""
        self.reserved_names = names

    def _validate_node_name(self, unique_id):
        """Check if the node name in unique_id is reserved."""
        from .exceptions import ReservedNameError
        if not self.reserved_names:
            return

        uniques = self.config._explode_unique(unique_id)
        if uniques and "id" in uniques:
            # We only validate the 'id' (the actual node name), folders are prefixed with @
            node_name = uniques["id"]
            if node_name in self.reserved_names:
                raise ReservedNameError(f"Node name '{node_name}' is a reserved command.")
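The reserved-name check above keeps node names from shadowing CLI commands. A stand-alone sketch of the same logic, with a simplified stand-in for `configfile._explode_unique` (the `node@folder@subfolder` shape and the reserved set are assumptions here, not copied from the source):

```python
RESERVED = {"add", "del", "config", "profile"}


def explode_unique(unique_id):
    # Simplified stand-in: "node@folder@subfolder" -> id plus folder chain
    parts = unique_id.split("@")
    return {"id": parts[0], "folders": parts[1:]}


def validate_node_name(unique_id, reserved=RESERVED):
    uniques = explode_unique(unique_id)
    if uniques["id"] in reserved:
        raise ValueError(f"Node name '{uniques['id']}' is a reserved command.")


validate_node_name("router1@office")       # accepted: plain node name
try:
    validate_node_name("config@office")    # rejected: collides with a CLI command
    rejected = False
except ValueError:
    rejected = True
```

Only the leading `id` segment is validated, matching the comment in `_validate_node_name` that folder segments are already disambiguated by their `@` prefix.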
@@ -0,0 +1,82 @@
import os
import shutil
import base64
from typing import Any, Dict
from Crypto.PublicKey import RSA
from Crypto.Cipher import PKCS1_OAEP
from .base import BaseService
from .exceptions import ConnpyError, InvalidConfigurationError, NodeNotFoundError


class ConfigService(BaseService):
    """Business logic for general application settings and state configuration."""

    def get_settings(self) -> Dict[str, Any]:
        """Get the global configuration settings block."""
        settings = self.config.config.copy()
        settings["configfolder"] = self.config.defaultdir
        return settings

    def get_default_dir(self) -> str:
        """Get the default configuration directory."""
        return self.config.defaultdir

    def set_config_folder(self, folder_path: str):
        """Set the default location for config file by writing to ~/.config/conn/.folder"""
        if not os.path.isdir(folder_path):
            raise ConnpyError(f"readable_dir:{folder_path} is not a valid path")

        pathfile = os.path.join(self.config.anchor_path, ".folder")
        folder = os.path.abspath(folder_path).rstrip('/')

        try:
            with open(pathfile, "w") as f:
                f.write(str(folder))
        except Exception as e:
            raise ConnpyError(f"Failed to save config folder: {e}")

    def update_setting(self, key, value):
        """Update a setting in the configuration file."""
        self.config.config[key] = value
        self.config._saveconfig(self.config.file)

    def encrypt_password(self, password):
        """Encrypt a password using the application's configuration encryption key."""
        return self.config.encrypt(password)

    def apply_theme_from_file(self, theme_input):
        """Apply 'dark', 'light' theme or load a YAML theme file and save it to the configuration."""
        import yaml
        from ..printer import STYLES, LIGHT_THEME

        if theme_input == "dark":
            valid_styles = {}
            self.update_setting("theme", valid_styles)
            return valid_styles
        elif theme_input == "light":
            valid_styles = LIGHT_THEME.copy()
            self.update_setting("theme", valid_styles)
            return valid_styles

        if not os.path.exists(theme_input):
            raise InvalidConfigurationError(f"Theme file '{theme_input}' not found.")

        try:
            with open(theme_input, 'r') as f:
                user_styles = yaml.safe_load(f)
        except Exception as e:
            raise InvalidConfigurationError(f"Failed to parse theme file: {e}")

        if not isinstance(user_styles, dict):
            raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

        # Filter for valid styles only (prevent junk in config)
        valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

        if not valid_styles:
            raise InvalidConfigurationError("No valid style keys found in theme file.")

        # Persist and return merged styles
        self.update_setting("theme", valid_styles)
        return valid_styles
@@ -0,0 +1,87 @@
import re
from typing import List, Dict, Any
from .base import BaseService
from ..hooks import MethodHook
from .. import printer


class ContextService(BaseService):
    """Business logic for managing and applying regex-based contexts locally."""

    @property
    def contexts(self) -> Dict[str, List[str]]:
        return self.config.config.get("contexts", {"all": [".*"]})

    @property
    def current_context(self) -> str:
        return self.config.config.get("current_context", "all")

    def list_contexts(self) -> List[Dict[str, Any]]:
        result = []
        for name in self.contexts.keys():
            result.append({
                "name": name,
                "active": (name == self.current_context),
                "regexes": self.contexts[name]
            })
        return result

    def add_context(self, name: str, regexes: List[str]):
        if not name.isalnum():
            raise ValueError("Context name must be alphanumeric")

        ctxs = self.contexts
        if name in ctxs:
            raise ValueError(f"Context '{name}' already exists")

        ctxs[name] = regexes
        self.config.config["contexts"] = ctxs
        self.config._saveconfig(self.config.file)

    def update_context(self, name: str, regexes: List[str]):
        if name == "all":
            raise ValueError("Cannot modify default context 'all'")

        ctxs = self.contexts
        if name not in ctxs:
            raise ValueError(f"Context '{name}' does not exist")

        ctxs[name] = regexes
        self.config.config["contexts"] = ctxs
        self.config._saveconfig(self.config.file)

    def delete_context(self, name: str):
        if name == "all":
            raise ValueError("Cannot delete default context 'all'")
        if name == self.current_context:
            raise ValueError(f"Cannot delete active context '{name}'")

        ctxs = self.contexts
        if name not in ctxs:
            raise ValueError(f"Context '{name}' does not exist")

        del ctxs[name]
        self.config.config["contexts"] = ctxs
        self.config._saveconfig(self.config.file)

    def set_active_context(self, name: str):
        if name not in self.contexts:
            raise ValueError(f"Context '{name}' does not exist")

        self.config.config["current_context"] = name
        self.config._saveconfig(self.config.file)

    def get_active_regexes(self) -> List[re.Pattern]:
        patterns = self.contexts.get(self.current_context, [".*"])
        return [re.compile(p) for p in patterns]

    def _match_any(self, node_name: str, patterns: List[re.Pattern]) -> bool:
        return any(p.match(node_name) for p in patterns)

    # Hook handlers for filtering
    def filter_node_list(self, *args, **kwargs):
        patterns = self.get_active_regexes()
        return [node for node in kwargs["result"] if self._match_any(node, patterns)]

    def filter_node_dict(self, *args, **kwargs):
        patterns = self.get_active_regexes()
        return {k: v for k, v in kwargs["result"].items() if self._match_any(k, patterns)}
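The hook handlers above filter node results through the active context's compiled regexes. A minimal standalone sketch of the same matching logic, using only `re` (the context names and node names below are illustrative):

```python
import re

# Hypothetical contexts store, mirroring the shape ContextService reads from config
contexts = {"all": [".*"], "lab": [r"^lab-", r"^test-"]}
current = "lab"

# Like get_active_regexes(): compile the active context's patterns
patterns = [re.compile(p) for p in contexts[current]]

def match_any(name):
    # re.match anchors at the start of the string, as in ContextService._match_any
    return any(p.match(name) for p in patterns)

nodes = ["lab-router1", "prod-fw1", "test-switch2"]
filtered = [n for n in nodes if match_any(n)]
print(filtered)  # ['lab-router1', 'test-switch2']
```

Note that `re.match` only anchors at the start, so a pattern like `"router"` would not match `"lab-router1"`; patterns that should match anywhere need an explicit `.*` prefix.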
@@ -0,0 +1,31 @@
class ConnpyError(Exception):
    """Base exception for all connpy services."""
    pass


class NodeNotFoundError(ConnpyError):
    """Raised when a connection or folder is not found."""
    pass


class NodeAlreadyExistsError(ConnpyError):
    """Raised when a node or folder already exists."""
    pass


class ProfileNotFoundError(ConnpyError):
    """Raised when a profile is not found."""
    pass


class ProfileAlreadyExistsError(ConnpyError):
    """Raised when a profile with the same name already exists."""
    pass


class ExecutionError(ConnpyError):
    """Raised when an execution fails or returns an error."""
    pass


class InvalidConfigurationError(ConnpyError):
    """Raised when data or configuration input is invalid."""
    pass


class ReservedNameError(ConnpyError):
    """Raised when a node name conflicts with a reserved command."""
    pass
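Because every service error derives from `ConnpyError`, each caller (local CLI, Flask API, future gRPC server) can catch the whole family with one handler while still branching on specific failures. A self-contained sketch of that pattern (the two classes are repeated here so the snippet runs on its own; `get_node` is a hypothetical caller-side helper):

```python
class ConnpyError(Exception):
    """Base exception for all connpy services."""

class NodeNotFoundError(ConnpyError):
    """Raised when a connection or folder is not found."""

def get_node(node_id, inventory):
    # Stand-in for a service call that may raise a specific ConnpyError subclass
    if node_id not in inventory:
        raise NodeNotFoundError(f"Node '{node_id}' not found.")
    return inventory[node_id]

try:
    get_node("missing@lab", {})
except NodeNotFoundError as e:
    status = ("not_found", str(e))   # specific handling, e.g. HTTP 404
except ConnpyError as e:
    status = ("error", str(e))       # generic fallback, e.g. HTTP 500

print(status)  # ('not_found', "Node 'missing@lab' not found.")
```

Ordering matters: the subclass handler must come before the `ConnpyError` handler, or it is never reached.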
@@ -0,0 +1,132 @@
from typing import List, Dict, Any, Callable, Optional
import os
import yaml
from .base import BaseService
from connpy.core import nodes as Nodes
from .exceptions import ConnpyError


class ExecutionService(BaseService):
    """Business logic for executing commands on nodes and running automation scripts."""

    def run_commands(
        self,
        nodes_filter: str,
        commands: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        folder: Optional[str] = None,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
        logger: Optional[Callable] = None
    ) -> Dict[str, str]:
        """Execute commands on a set of nodes."""
        try:
            matched_names = self.config._getallnodes(nodes_filter)
            if not matched_names:
                raise ConnpyError(f"No nodes found matching filter: {nodes_filter}")

            node_data = self.config.getitems(matched_names, extract=True)
            executor = Nodes(node_data, config=self.config)
            self.last_executor = executor

            results = executor.run(
                commands=commands,
                vars=variables,
                parallel=parallel,
                timeout=timeout,
                folder=folder,
                prompt=prompt,
                on_complete=on_node_complete,
                logger=logger
            )

            return results
        except Exception as e:
            raise ConnpyError(f"Execution failed: {e}")

    def test_commands(
        self,
        nodes_filter: str,
        commands: List[str],
        expected: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
        logger: Optional[Callable] = None
    ) -> Dict[str, Dict[str, bool]]:
        """Run commands and verify expected output on a set of nodes."""
        try:
            matched_names = self.config._getallnodes(nodes_filter)
            if not matched_names:
                raise ConnpyError(f"No nodes found matching filter: {nodes_filter}")

            node_data = self.config.getitems(matched_names, extract=True)
            executor = Nodes(node_data, config=self.config)
            self.last_executor = executor

            results = executor.test(
                commands=commands,
                expected=expected,
                vars=variables,
                parallel=parallel,
                timeout=timeout,
                prompt=prompt,
                on_complete=on_node_complete,
                logger=logger
            )
            return results
        except Exception as e:
            raise ConnpyError(f"Testing failed: {e}")

    def run_cli_script(self, nodes_filter: str, script_path: str, parallel: int = 10) -> Dict[str, str]:
        """Run a plain-text script containing one command per line."""
        if not os.path.exists(script_path):
            raise ConnpyError(f"Script file not found: {script_path}")

        try:
            with open(script_path, "r") as f:
                commands = [line.strip() for line in f if line.strip()]
        except Exception as e:
            raise ConnpyError(f"Failed to read script {script_path}: {e}")

        return self.run_commands(nodes_filter, commands, parallel=parallel)

    def run_yaml_playbook(self, playbook_path: str, parallel: int = 10) -> Dict[str, Any]:
        """Run a structured Connpy YAML automation playbook."""
        if not os.path.exists(playbook_path):
            raise ConnpyError(f"Playbook file not found: {playbook_path}")

        try:
            with open(playbook_path, "r") as f:
                playbook = yaml.load(f, Loader=yaml.FullLoader)
        except Exception as e:
            raise ConnpyError(f"Failed to load playbook {playbook_path}: {e}")

        # Basic validation
        if not isinstance(playbook, dict) or "nodes" not in playbook or "commands" not in playbook:
            raise ConnpyError("Invalid playbook format: missing 'nodes' or 'commands' keys.")

        action = playbook.get("action", "run")
        if action == "run":
            return self.run_commands(
                nodes_filter=playbook["nodes"],
                commands=playbook["commands"],
                parallel=parallel,
                timeout=playbook.get("timeout", 10)
            )
        elif action == "test":
            return self.test_commands(
                nodes_filter=playbook["nodes"],
                commands=playbook["commands"],
                expected=playbook.get("expected", []),
                parallel=parallel,
                timeout=playbook.get("timeout", 10)
            )
        else:
            raise ConnpyError(f"Unsupported playbook action: {action}")
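`run_yaml_playbook` expects a mapping with at least `nodes` and `commands`; `action`, `timeout`, and `expected` are optional. A plausible playbook inferred from the keys the validator and dispatcher read (all field values are illustrative, not from the source):

```yaml
# Illustrative playbook; only keys read by run_yaml_playbook are shown
action: test            # "run" (default) or "test"
nodes: "lab"            # node filter passed to config._getallnodes
commands:
  - show version
expected:               # only used when action is "test"
  - "Version 15."
timeout: 20             # seconds; defaults to 10
```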
@@ -0,0 +1,73 @@
from .base import BaseService
import yaml
import os
from .exceptions import InvalidConfigurationError, NodeNotFoundError, ReservedNameError
from ..configfile import NoAliasDumper


class ImportExportService(BaseService):
    """Business logic for YAML/JSON inventory import and export."""

    def export_to_file(self, file_path, folders=None):
        """Export nodes/folders to a YAML file."""
        if os.path.exists(file_path):
            raise InvalidConfigurationError(f"File '{file_path}' already exists.")

        data = self.export_to_dict(folders)
        try:
            with open(file_path, "w") as f:
                yaml.dump(data, f, Dumper=NoAliasDumper, default_flow_style=False)
        except OSError as e:
            raise InvalidConfigurationError(f"Failed to export to '{file_path}': {e}")

    def export_to_dict(self, folders=None):
        """Export nodes/folders to a dictionary."""
        if not folders:
            return self.config._getallnodesfull(extract=False)
        else:
            # Validate that the folders exist
            for f in folders:
                if f != "@" and f not in self.config._getallfolders():
                    raise NodeNotFoundError(f"Folder '{f}' not found.")
            return self.config._getallnodesfull(folders, extract=False)

    def import_from_file(self, file_path):
        """Import nodes/folders from a YAML file."""
        if not os.path.exists(file_path):
            raise InvalidConfigurationError(f"File '{file_path}' does not exist.")

        try:
            with open(file_path, "r") as f:
                data = yaml.load(f, Loader=yaml.FullLoader)
            self.import_from_dict(data)
        except Exception as e:
            raise InvalidConfigurationError(f"Failed to read/parse import file: {e}")

    def import_from_dict(self, data):
        """Import nodes/folders from a dictionary."""
        if not isinstance(data, dict):
            raise InvalidConfigurationError("Invalid import data format: expected a dictionary of nodes.")

        # Process imports
        for k, v in data.items():
            uniques = self.config._explode_unique(k)

            # Ensure folders exist
            if "folder" in uniques:
                folder_name = f"@{uniques['folder']}"
                if folder_name not in self.config._getallfolders():
                    folder_uniques = self.config._explode_unique(folder_name)
                    self.config._folder_add(**folder_uniques)

                if "subfolder" in uniques:
                    sub_name = f"@{uniques['subfolder']}@{uniques['folder']}"
                    if sub_name not in self.config._getallfolders():
                        sub_uniques = self.config._explode_unique(sub_name)
                        self.config._folder_add(**sub_uniques)

            # Add node/connection
            v.update(uniques)
            self._validate_node_name(k)
            self.config._connections_add(**v)

        self.config._saveconfig(self.config.file)
@@ -0,0 +1,255 @@
import re
from .base import BaseService
from .exceptions import (
    NodeNotFoundError, NodeAlreadyExistsError,
    InvalidConfigurationError, ReservedNameError
)


class NodeService(BaseService):
    def __init__(self, config=None):
        super().__init__(config)

    def list_nodes(self, filter_str=None, format_str=None):
        """Return a node list filtered by regex match and formatted if needed."""
        nodes = self.config._getallnodes()
        case_sensitive = self.config.config.get("case", False)

        if filter_str:
            flags = re.IGNORECASE if not case_sensitive else 0
            nodes = [n for n in nodes if re.search(filter_str, n, flags)]

        if not format_str:
            return nodes

        from .profile_service import ProfileService
        profile_service = ProfileService(self.config)

        formatted_nodes = []
        for n_id in nodes:
            # Use ProfileService to resolve profiles for dynamic formatting
            details = self.config.getitem(n_id, extract=False)
            if details:
                details = profile_service.resolve_node_data(details)

            name = n_id.split("@")[0]
            location = n_id.partition("@")[2] or "root"

            # Prepare context for .format() with all details
            context = details.copy()
            context.update({
                "name": name,
                "NAME": name.upper(),
                "location": location,
                "LOCATION": location.upper(),
            })

            # Add exploded uniques (id, folder, subfolder)
            uniques = self.config._explode_unique(n_id)
            if uniques:
                context.update(uniques)

            # Add uppercase versions of all keys for convenience
            for k, v in list(context.items()):
                if isinstance(v, str):
                    context[k.upper()] = v.upper()

            try:
                formatted_nodes.append(format_str.format(**context))
            except (KeyError, IndexError, ValueError):
                # Fall back to the original string if formatting fails
                formatted_nodes.append(n_id)
        return formatted_nodes

    def list_folders(self, filter_str=None):
        """Return all unique folders, optionally filtered by regex."""
        folders = self.config._getallfolders()
        case_sensitive = self.config.config.get("case", False)

        if filter_str:
            flags = re.IGNORECASE if not case_sensitive else 0
            folders = [f for f in folders if re.search(filter_str, f, flags)]
        return folders

    def get_node_details(self, unique_id):
        """Return the full configuration dictionary for a specific node."""
        details = self.config.getitem(unique_id)
        if not details:
            raise NodeNotFoundError(f"Node '{unique_id}' not found.")
        return details

    def explode_unique(self, unique_id):
        """Explode a unique ID into a dictionary of its parts."""
        return self.config._explode_unique(unique_id)

    def generate_cache(self, nodes=None, folders=None, profiles=None):
        """Generate and update the internal nodes cache."""
        self.config._generate_nodes_cache(nodes=nodes, folders=folders, profiles=profiles)

    def add_node(self, unique_id, data, is_folder=False):
        """Add a new node or folder to the configuration."""
        if not is_folder:
            self._validate_node_name(unique_id)

        all_nodes = self.config._getallnodes()
        all_folders = self.config._getallfolders()

        if is_folder:
            if unique_id in all_folders:
                raise NodeAlreadyExistsError(f"Folder '{unique_id}' already exists.")
            uniques = self.config._explode_unique(unique_id)
            if not uniques:
                raise InvalidConfigurationError(f"Invalid folder name '{unique_id}'.")

            # Check that the parent folder exists when creating a subfolder
            if "subfolder" in uniques:
                parent_folder = f"@{uniques['folder']}"
                if parent_folder not in all_folders:
                    raise NodeNotFoundError(f"Folder '{parent_folder}' not found.")

            self.config._folder_add(**uniques)
            self.config._saveconfig(self.config.file)
        else:
            if unique_id in all_nodes:
                raise NodeAlreadyExistsError(f"Node '{unique_id}' already exists.")

            # Check that the parent folder exists when creating a node in a folder
            node_folder = unique_id.partition("@")[2]
            if node_folder:
                parent_folder = f"@{node_folder}"
                if parent_folder not in all_folders:
                    raise NodeNotFoundError(f"Folder '{parent_folder}' not found.")

            # Ensure 'id' is in data for config._connections_add
            if "id" not in data:
                uniques = self.config._explode_unique(unique_id)
                if uniques and "id" in uniques:
                    data["id"] = uniques["id"]

            self.config._connections_add(**data)
            self.config._saveconfig(self.config.file)

    def update_node(self, unique_id, data):
        """Explicitly update an existing node."""
        all_nodes = self.config._getallnodes()
        if unique_id not in all_nodes:
            raise NodeNotFoundError(f"Node '{unique_id}' not found.")

        # Ensure 'id' is in data for config._connections_add
        if "id" not in data:
            uniques = self.config._explode_unique(unique_id)
            if uniques:
                data["id"] = uniques["id"]

        # config._connections_add also handles updates when the ID already exists
        self.config._connections_add(**data)
        self.config._saveconfig(self.config.file)

    def delete_node(self, unique_id, is_folder=False):
        """Delete a node or folder."""
        if is_folder:
            uniques = self.config._explode_unique(unique_id)
            if not uniques:
                raise NodeNotFoundError(f"Folder '{unique_id}' not found or invalid.")
            self.config._folder_del(**uniques)
        else:
            uniques = self.config._explode_unique(unique_id)
            if not uniques:
                raise NodeNotFoundError(f"Node '{unique_id}' not found or invalid.")
            self.config._connections_del(**uniques)

        self.config._saveconfig(self.config.file)

    def connect_node(self, unique_id, sftp=False, debug=False, logger=None):
        """Interact with a node directly."""
        from connpy.core import node
        from .profile_service import ProfileService

        node_data = self.config.getitem(unique_id, extract=False)
        if not node_data:
            raise NodeNotFoundError(f"Node '{unique_id}' not found.")

        # Resolve profiles
        profile_service = ProfileService(self.config)
        resolved_data = profile_service.resolve_node_data(node_data)

        n = node(unique_id, **resolved_data, config=self.config)
        if sftp:
            n.protocol = "sftp"

        n.interact(debug=debug, logger=logger)

    def move_node(self, src_id, dst_id, copy=False):
        """Move or copy a node."""
        self._validate_node_name(dst_id)

        node_data = self.config.getitem(src_id)
        if not node_data:
            raise NodeNotFoundError(f"Source node '{src_id}' not found.")

        if dst_id in self.config._getallnodes():
            raise NodeAlreadyExistsError(f"Destination node '{dst_id}' already exists.")

        new_uniques = self.config._explode_unique(dst_id)
        if not new_uniques:
            raise InvalidConfigurationError(f"Invalid destination format '{dst_id}'.")

        new_node_data = node_data.copy()
        new_node_data.update(new_uniques)

        self.config._connections_add(**new_node_data)

        if not copy:
            src_uniques = self.config._explode_unique(src_id)
            self.config._connections_del(**src_uniques)

        self.config._saveconfig(self.config.file)

    def bulk_add(self, ids, hosts, common_data):
        """Add multiple nodes with shared common configuration."""
        count = 0
        all_nodes = self.config._getallnodes()

        for i, uid in enumerate(ids):
            if uid in all_nodes:
                continue

            try:
                self._validate_node_name(uid)
            except ReservedNameError:
                # For bulk adds we just skip invalid names;
                # the CLI caller can handle this if it wants to be strict.
                continue

            host = hosts[i] if i < len(hosts) else hosts[0]
            uniques = self.config._explode_unique(uid)
            if not uniques:
                continue

            node_data = common_data.copy()
            node_data.pop("ids", None)
            node_data.pop("location", None)
            node_data.update(uniques)
            node_data["host"] = host
            node_data["type"] = "connection"

            self.config._connections_add(**node_data)
            count += 1

        if count > 0:
            self.config._saveconfig(self.config.file)
        return count

    def full_replace(self, connections, profiles):
        """Replace all connections and profiles with new data."""
        self.config.connections = connections
        self.config.profiles = profiles
        self.config._saveconfig(self.config.file)

    def get_inventory(self):
        """Return a full snapshot of connections and profiles."""
        return {
            "connections": self.config.connections,
            "profiles": self.config.profiles
        }
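Node unique IDs follow a `name[@subfolder][@folder]` convention (visible in `n_id.split("@")[0]` above and the `@folder` / `@subfolder@folder` strings in the import logic). A standalone sketch of exploding such an ID, loosely mirroring what `config._explode_unique` appears to return; the exact key names and the rule for invalid IDs are assumptions:

```python
def explode_unique(unique_id):
    # Assumed layout: "name@folder" or "name@subfolder@folder";
    # a leading "@" denotes a folder ID rather than a node ID.
    parts = unique_id.split("@")
    result = {"id": parts[0]} if parts[0] else {}
    rest = [p for p in parts[1:] if p]
    if len(rest) == 1:
        result["folder"] = rest[0]
    elif len(rest) == 2:
        result["subfolder"], result["folder"] = rest[0], rest[1]
    elif len(rest) > 2:
        return None  # too many levels: treat as invalid
    return result

print(explode_unique("router1@dc1"))       # {'id': 'router1', 'folder': 'dc1'}
print(explode_unique("router1@core@dc1"))  # {'id': 'router1', 'subfolder': 'core', 'folder': 'dc1'}
print(explode_unique("@dc1"))              # {'folder': 'dc1'}
```

Returning a falsy value for invalid IDs matches how the services above treat `_explode_unique`'s result (`if not uniques: raise ...`).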
@@ -0,0 +1,250 @@
from .base import BaseService
import yaml
import os
from .exceptions import InvalidConfigurationError, NodeNotFoundError


class PluginService(BaseService):
    """Business logic for enabling, disabling, and listing plugins."""

    def list_plugins(self):
        """List all core and user-defined plugins with their status and hash."""
        import os
        import hashlib

        # Check for the user plugins directory
        plugin_dir = os.path.join(self.config.defaultdir, "plugins")
        # Check for the core plugins directory
        core_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "..", "core_plugins")

        all_plugin_info = {}

        def get_hash(path):
            try:
                with open(path, "rb") as f:
                    return hashlib.md5(f.read()).hexdigest()
            except Exception:
                return ""

        # User plugins
        if os.path.exists(plugin_dir):
            for f in os.listdir(plugin_dir):
                if f.endswith(".py"):
                    name = f[:-3]
                    path = os.path.join(plugin_dir, f)
                    all_plugin_info[name] = {"enabled": True, "hash": get_hash(path)}
                elif f.endswith(".py.bkp"):
                    name = f[:-7]
                    all_plugin_info[name] = {"enabled": False}

        return all_plugin_info

    def add_plugin(self, name, source_file, update=False):
        """Add or update a plugin from a local file."""
        import os
        import shutil
        from connpy.plugins import Plugins

        if not name.isalpha() or not name.islower() or len(name) > 15:
            raise InvalidConfigurationError("Plugin name should be lowercase letters, up to 15 characters.")

        p_manager = Plugins()
        # Check for a bad script
        error = p_manager.verify_script(source_file)
        if error:
            raise InvalidConfigurationError(f"Invalid plugin script: {error}")

        self._save_plugin_file(name, source_file, update, is_path=True)

    def add_plugin_from_bytes(self, name, content, update=False):
        """Add or update a plugin from bytes (gRPC)."""
        import tempfile
        import os

        if not name.isalpha() or not name.islower() or len(name) > 15:
            raise InvalidConfigurationError("Plugin name should be lowercase letters, up to 15 characters.")

        # Write to a temp file to verify the script
        with tempfile.NamedTemporaryFile(suffix=".py", delete=False) as tmp:
            tmp.write(content)
            tmp_path = tmp.name

        try:
            from connpy.plugins import Plugins
            p_manager = Plugins()
            error = p_manager.verify_script(tmp_path)
            if error:
                raise InvalidConfigurationError(f"Invalid plugin script: {error}")

            self._save_plugin_file(name, tmp_path, update, is_path=True)
        finally:
            if os.path.exists(tmp_path):
                os.remove(tmp_path)

    def _save_plugin_file(self, name, source, update=False, is_path=True):
        import os
        import shutil

        plugin_dir = os.path.join(self.config.defaultdir, "plugins")
        os.makedirs(plugin_dir, exist_ok=True)

        target_file = os.path.join(plugin_dir, f"{name}.py")
        backup_file = f"{target_file}.bkp"

        if not update and (os.path.exists(target_file) or os.path.exists(backup_file)):
            raise InvalidConfigurationError(f"Plugin '{name}' already exists.")

        try:
            if is_path:
                shutil.copy2(source, target_file)
            else:
                with open(target_file, "wb") as f:
                    f.write(source)
        except OSError as e:
            raise InvalidConfigurationError(f"Failed to save plugin file: {e}")

    def delete_plugin(self, name):
        """Remove a plugin file permanently."""
        import os
        plugin_file = os.path.join(self.config.defaultdir, "plugins", f"{name}.py")
        disabled_file = f"{plugin_file}.bkp"

        deleted = False
        for f in [plugin_file, disabled_file]:
            if os.path.exists(f):
                try:
                    os.remove(f)
                    deleted = True
                except OSError as e:
                    raise InvalidConfigurationError(f"Failed to delete plugin file '{f}': {e}")

        if not deleted:
            raise InvalidConfigurationError(f"Plugin '{name}' not found.")

    def enable_plugin(self, name):
        """Activate a plugin by renaming its backup file."""
        import os
        plugin_file = os.path.join(self.config.defaultdir, "plugins", f"{name}.py")
        disabled_file = f"{plugin_file}.bkp"

        if os.path.exists(plugin_file):
            return False  # Already enabled

        if not os.path.exists(disabled_file):
            raise InvalidConfigurationError(f"Plugin '{name}' not found.")

        try:
            os.rename(disabled_file, plugin_file)
            return True
        except OSError as e:
            raise InvalidConfigurationError(f"Failed to enable plugin '{name}': {e}")

    def disable_plugin(self, name):
        """Deactivate a plugin by renaming it to a backup file."""
        import os
        plugin_file = os.path.join(self.config.defaultdir, "plugins", f"{name}.py")
        disabled_file = f"{plugin_file}.bkp"

        if os.path.exists(disabled_file):
            return False  # Already disabled

        if not os.path.exists(plugin_file):
            raise InvalidConfigurationError(f"Plugin '{name}' not found or is a core plugin.")

        try:
            os.rename(plugin_file, disabled_file)
            return True
        except OSError as e:
            raise InvalidConfigurationError(f"Failed to disable plugin '{name}': {e}")

    def get_plugin_source(self, name):
        import os
        from ..services.exceptions import InvalidConfigurationError

        plugin_file = os.path.join(self.config.defaultdir, "plugins", f"{name}.py")
        core_path = os.path.dirname(os.path.realpath(__file__)) + f"/../core_plugins/{name}.py"

        if os.path.exists(plugin_file):
            target = plugin_file
        elif os.path.exists(core_path):
            target = core_path
        else:
            raise InvalidConfigurationError(f"Plugin '{name}' not found")

        with open(target, "r") as f:
            return f.read()

    def invoke_plugin(self, name, args_dict):
        import sys, io
        from argparse import Namespace
        from ..services.exceptions import InvalidConfigurationError
        from connpy.plugins import Plugins

        class MockApp:
            def __init__(self, config):
                from ..core import node, nodes
                from ..ai import ai
                from ..services.provider import ServiceProvider

                self.config = config
                self.node = node
                self.nodes = nodes
                self.ai = ai

                self.services = ServiceProvider(config, mode="local")
                try:
                    self.nodes_list = self.services.nodes.list_nodes()
                    self.folders = self.services.nodes.list_folders()
                    self.profiles = self.services.profiles.list_profiles()
                except Exception:
                    self.nodes_list = {}
                    self.folders = {}
                    self.profiles = {}

        args = Namespace(**args_dict)

        p_manager = Plugins()
        import os
        plugin_file = os.path.join(self.config.defaultdir, "plugins", f"{name}.py")
        core_path = os.path.dirname(os.path.realpath(__file__)) + f"/../core_plugins/{name}.py"

        if os.path.exists(plugin_file):
            target = plugin_file
        elif os.path.exists(core_path):
            target = core_path
        else:
            raise InvalidConfigurationError(f"Plugin '{name}' not found")

        module = p_manager._import_from_path(target)
        parser = module.Parser().parser if hasattr(module, "Parser") else None

        if "__func_name__" in args_dict and hasattr(module, args_dict["__func_name__"]):
            args.func = getattr(module, args_dict["__func_name__"])

        app = MockApp(self.config)

        from .. import printer
        from rich.console import Console

        buf = io.StringIO()
        old_console = printer.console
        old_err_console = printer.err_console

        printer.console = Console(file=buf, theme=printer.connpy_theme, force_terminal=True)
        printer.err_console = Console(file=buf, theme=printer.connpy_theme, force_terminal=True)

        old_stdout = sys.stdout
        sys.stdout = buf

        try:
            if hasattr(module, "Entrypoint"):
                module.Entrypoint(args, parser, app)
        except Exception:
            import traceback
            printer.err_console.print(traceback.format_exc())
        finally:
            sys.stdout = old_stdout
            printer.console = old_console
            printer.err_console = old_err_console

        for line in buf.getvalue().splitlines(keepends=True):
            yield line
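The enable/disable mechanism above is simply a rename to and from a `.py.bkp` suffix, so a plugin's state is encoded entirely in its filename. A self-contained sketch of that toggle using a temporary directory (the plugin name is illustrative):

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as plugin_dir:
    plugin_file = os.path.join(plugin_dir, "myplugin.py")
    disabled_file = plugin_file + ".bkp"

    with open(plugin_file, "w") as f:
        f.write("# plugin body\n")

    # Disable: rename the active file to its backup name
    os.rename(plugin_file, disabled_file)
    disabled = os.path.exists(disabled_file) and not os.path.exists(plugin_file)

    # Enable: rename the backup back to the active name
    os.rename(disabled_file, plugin_file)
    enabled = os.path.exists(plugin_file) and not os.path.exists(disabled_file)

print(disabled, enabled)  # True True
```

Because the rename stays within one directory, the toggle is atomic on POSIX filesystems and the plugin body is never copied or rewritten.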
@@ -0,0 +1,134 @@
from .base import BaseService
from .exceptions import ProfileNotFoundError, ProfileAlreadyExistsError, InvalidConfigurationError


class ProfileService(BaseService):
    """Business logic for node profiles management."""

    def list_profiles(self, filter_str=None):
        """List all profile names, optionally filtered."""
        profiles = list(self.config.profiles.keys())
        case_sensitive = self.config.config.get("case", False)

        if filter_str:
            if not case_sensitive:
                f_str = filter_str.lower()
                return [p for p in profiles if f_str in p.lower()]
            else:
                return [p for p in profiles if filter_str in p]
        return profiles

    def get_profile(self, name, resolve=True):
        """Get the profile dictionary, optionally resolved."""
        profile = self.config.profiles.get(name)
        if not profile:
            raise ProfileNotFoundError(f"Profile '{name}' not found.")

        if resolve:
            return self.resolve_node_data(profile)
        return profile

    def add_profile(self, name, data):
        """Add a new profile."""
        if name in self.config.profiles:
            raise ProfileAlreadyExistsError(f"Profile '{name}' already exists.")

        # Filter data to match _profiles_add signature and ensure id is passed
        allowed_keys = {"host", "options", "logs", "password", "port", "protocol", "user", "tags", "jumphost"}
        filtered_data = {k: v for k, v in data.items() if k in allowed_keys}

        self.config._profiles_add(id=name, **filtered_data)
        self.config._saveconfig(self.config.file)

    def resolve_node_data(self, node_data):
        """Resolve profile references (@profile) in node data and handle inheritance."""
        resolved = node_data.copy()

        # 1. Identify all referenced profiles to support inheritance
        referenced_profiles = []
        for value in resolved.values():
            if isinstance(value, str) and value.startswith("@"):
                referenced_profiles.append(value[1:])
            elif isinstance(value, list):
                for item in value:
                    if isinstance(item, str) and item.startswith("@"):
                        referenced_profiles.append(item[1:])

        # 2. Resolve explicit references
        for key, value in resolved.items():
            if isinstance(value, str) and value.startswith("@"):
                profile_name = value[1:]
                try:
                    profile = self.get_profile(profile_name, resolve=True)
                    resolved[key] = profile.get(key, "")
                except ProfileNotFoundError:
                    resolved[key] = ""
            elif isinstance(value, list):
                resolved_list = []
                for item in value:
                    if isinstance(item, str) and item.startswith("@"):
                        profile_name = item[1:]
                        try:
                            profile = self.get_profile(profile_name, resolve=True)
                            if "password" in profile:
                                resolved_list.append(profile["password"])
                        except ProfileNotFoundError:
                            pass
                    else:
                        resolved_list.append(item)
                resolved[key] = resolved_list

        # 3. Inheritance: Fill empty keys from the first referenced profile
        if referenced_profiles:
            base_profile_name = referenced_profiles[0]
            try:
                base_profile = self.get_profile(base_profile_name, resolve=True)
                for key, value in base_profile.items():
                    # Fill if key is missing or empty
                    if key not in resolved or resolved[key] == "" or resolved[key] == [] or resolved[key] is None:
                        resolved[key] = value
            except ProfileNotFoundError:
                pass

        # 4. Handle default protocol
        if resolved.get("protocol") == "" or resolved.get("protocol") is None:
            try:
                default_profile = self.get_profile("default", resolve=True)
                resolved["protocol"] = default_profile.get("protocol", "ssh")
            except ProfileNotFoundError:
                resolved["protocol"] = "ssh"

        return resolved

    def delete_profile(self, name):
        """Delete an existing profile, with safety checks."""
        if name not in self.config.profiles:
            raise ProfileNotFoundError(f"Profile '{name}' not found.")

        if name == "default":
            raise InvalidConfigurationError("Cannot delete the 'default' profile.")

        used_by = self.config._profileused(name)
        if used_by:
            # We return the list of nodes using it so the UI can inform the user
            raise InvalidConfigurationError(f"Profile '{name}' is used by nodes: {', '.join(used_by)}")

        self.config._profiles_del(id=name)
        self.config._saveconfig(self.config.file)

    def update_profile(self, name, data):
        """Update an existing profile."""
        if name not in self.config.profiles:
            raise ProfileNotFoundError(f"Profile '{name}' not found.")

        # Merge with existing data
        existing = self.get_profile(name, resolve=False)
        updated_data = existing.copy()
        updated_data.update(data)

        # Filter data to match _profiles_add signature
        allowed_keys = {"host", "options", "logs", "password", "port", "protocol", "user", "tags", "jumphost"}
        filtered_data = {k: v for k, v in updated_data.items() if k in allowed_keys}

        self.config._profiles_add(id=name, **filtered_data)
        self.config._saveconfig(self.config.file)
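`resolve_node_data` replaces `"@name"` string values with the matching field from the referenced profile, expands `"@name"` entries inside lists to that profile's password, and inherits missing keys from the first referenced profile. A self-contained sketch of just the scalar-reference rule against plain dicts (the profile data is invented for illustration):

```python
profiles = {"lab": {"user": "admin", "password": "secret", "protocol": "ssh"}}

def resolve(node):
    """Replace '@profile' string values with the same key from that profile."""
    resolved = dict(node)
    for key, value in resolved.items():
        if isinstance(value, str) and value.startswith("@"):
            profile = profiles.get(value[1:], {})
            # A missing profile or missing key falls back to "" like the service does
            resolved[key] = profile.get(key, "")
    return resolved

node = {"host": "10.0.0.1", "user": "@lab", "password": "@missing"}
resolved = resolve(node)
```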
@@ -0,0 +1,71 @@
from .exceptions import InvalidConfigurationError


class RemoteStub:
    def __getattr__(self, name):
        raise NotImplementedError(
            "Remote mode (gRPC) is not yet available. "
            "Use local mode or wait for the gRPC implementation."
        )


class ServiceProvider:
    """Dynamic service backend. Transparently provides local or remote services."""

    def __init__(self, config, mode="local", remote_host=None):
        self.mode = mode
        self.config = config
        self.remote_host = remote_host

        if mode == "local":
            self._init_local()
        elif mode == "remote":
            self._init_remote()
        else:
            raise ValueError(f"Unknown service mode: {mode}")

    def _init_local(self):
        from .node_service import NodeService
        from .profile_service import ProfileService
        from .config_service import ConfigService
        from .plugin_service import PluginService
        from .ai_service import AIService
        from .system_service import SystemService
        from .execution_service import ExecutionService
        from .import_export_service import ImportExportService
        from .context_service import ContextService
        from .sync_service import SyncService

        self.nodes = NodeService(self.config)
        self.profiles = ProfileService(self.config)
        self.config_svc = ConfigService(self.config)
        self.plugins = PluginService(self.config)
        self.ai = AIService(self.config)
        self.system = SystemService(self.config)
        self.execution = ExecutionService(self.config)
        self.import_export = ImportExportService(self.config)
        self.context = ContextService(self.config)
        self.sync = SyncService(self.config)

    def _init_remote(self):
        # Allow ConfigService to work locally so the user can revert the mode
        from .config_service import ConfigService
        from .context_service import ContextService
        from .sync_service import SyncService
        self.config_svc = ConfigService(self.config)
        self.context = ContextService(self.config)
        self.sync = SyncService(self.config)

        if not self.remote_host:
            raise InvalidConfigurationError("Remote host must be specified in remote mode")

        import grpc
        from ..grpc.stubs import NodeStub, ProfileStub, PluginStub, AIStub, ExecutionStub, ImportExportStub, SystemStub

        channel = grpc.insecure_channel(self.remote_host)

        self.nodes = NodeStub(channel, remote_host=self.remote_host, config=self.config)
        self.profiles = ProfileStub(channel, remote_host=self.remote_host, node_stub=self.nodes)
        self.plugins = PluginStub(channel, remote_host=self.remote_host)
        self.ai = AIStub(channel, remote_host=self.remote_host)
        self.system = SystemStub(channel, remote_host=self.remote_host)
        self.execution = ExecutionStub(channel, remote_host=self.remote_host)
        self.import_export = ImportExportStub(channel, remote_host=self.remote_host)
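A consumer of `ServiceProvider` constructs it once and uses attribute access; whether `provider.nodes` is a local `NodeService` or a gRPC `NodeStub` is invisible to the caller. A toy sketch of that dispatch pattern outside connpy (the class and method names here are stand-ins, not connpy's API):

```python
class LocalNodes:
    """Stand-in for a local service implementation."""
    def list(self):
        return ["router1"]

class Provider:
    """Pick a backend at construction time; callers only see provider.nodes."""
    def __init__(self, mode="local"):
        self.mode = mode
        if mode == "local":
            self.nodes = LocalNodes()
        else:
            # A remote mode would install RPC stubs with the same interface here
            raise ValueError(f"Unknown service mode: {mode}")

provider = Provider(mode="local")
names = provider.nodes.list()
```

The point of the pattern is that the CLI and API layers never branch on the mode themselves; only the provider's constructor does.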
@@ -0,0 +1,389 @@
import os
import time
import zipfile
import tempfile
import io
import yaml
import threading
from datetime import datetime
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from googleapiclient.discovery import build
from google.auth.exceptions import RefreshError
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload
from googleapiclient.errors import HttpError

from .base import BaseService
from .. import printer


class SyncService(BaseService):
    """Business logic for Google Drive synchronization."""

    def __init__(self, config):
        super().__init__(config)
        self.scopes = ['https://www.googleapis.com/auth/drive.appdata']
        self.token_file = os.path.join(self.config.defaultdir, "gtoken.json")

        # Embedded OAuth config
        self.client_config = {
            "installed": {
                "client_id": "559598250648-cr189kfrga2il1a6d6nkaspq0a9pn5vv." + "apps.googleusercontent.com",
                "project_id": "celtic-surface-420323",
                "auth_uri": "https://accounts.google.com/o/oauth2/auth",
                "token_uri": "https://oauth2.googleapis.com/token",
                "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
                "client_secret": "GOCSPX-" + "VVfOSrJLPU90Pl0g7aAXM9GK2xPE",
                "redirect_uris": ["http://localhost"]
            }
        }

        # Sync status from config
        self.sync_enabled = self.config.config.get("sync", False)
        self.sync_remote = self.config.config.get("sync_remote", False)

    def login(self):
        """Authenticate with Google Drive."""
        creds = None
        if os.path.exists(self.token_file):
            creds = Credentials.from_authorized_user_file(self.token_file, self.scopes)

        try:
            if not creds or not creds.valid:
                if creds and creds.expired and creds.refresh_token:
                    creds.refresh(Request())
                else:
                    flow = InstalledAppFlow.from_client_config(self.client_config, self.scopes)
                    creds = flow.run_local_server(port=0, access_type='offline')

                with open(self.token_file, 'w') as token:
                    token.write(creds.to_json())

            printer.success("Logged in successfully.")
            return True

        except RefreshError:
            if os.path.exists(self.token_file):
                os.remove(self.token_file)
            printer.warning("Existing token was invalid and has been removed. Please log in again.")
            return False
        except Exception as e:
            printer.error(f"Login failed: {e}")
            return False

    def logout(self):
        """Remove Google Drive credentials."""
        if os.path.exists(self.token_file):
            os.remove(self.token_file)
            printer.success("Logged out successfully.")
        else:
            printer.info("No credentials file found. Already logged out.")

    def get_credentials(self):
        """Get valid credentials, refreshing if necessary."""
        if os.path.exists(self.token_file):
            creds = Credentials.from_authorized_user_file(self.token_file, self.scopes)
        else:
            return None

        if not creds or not creds.valid:
            if creds and creds.expired and creds.refresh_token:
                try:
                    creds.refresh(Request())
                except RefreshError:
                    return None
            else:
                return None
        return creds

    def check_login_status(self):
        """Check if logged in to Google Drive."""
        if os.path.exists(self.token_file):
            creds = Credentials.from_authorized_user_file(self.token_file)
            if creds and creds.expired and creds.refresh_token:
                try:
                    creds.refresh(Request())
                except RefreshError:
                    pass
            return True if creds.valid else "Invalid"
        return False

    def list_backups(self):
        """List files in Google Drive appDataFolder."""
        creds = self.get_credentials()
        if not creds:
            printer.error("Not logged in to Google Drive.")
            return []

        try:
            service = build("drive", "v3", credentials=creds)
            response = service.files().list(
                spaces="appDataFolder",
                fields="files(id, name, appProperties)",
                pageSize=10,
            ).execute()

            files_info = []
            for file in response.get("files", []):
                files_info.append({
                    "name": file.get("name"),
                    "id": file.get("id"),
                    "date": file.get("appProperties", {}).get("date"),
                    "timestamp": file.get("appProperties", {}).get("timestamp")
                })
            return files_info
        except HttpError as error:
            printer.error(f"Google Drive API error: {error}")
            return []

    def compress_and_upload(self, remote_data=None):
        """Compress config and upload to Drive."""
        timestamp = int(time.time() * 1000)
        with tempfile.TemporaryDirectory() as tmp_dir:
            zip_path = os.path.join(tmp_dir, f"connpy-backup-{timestamp}.zip")

            with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zipf:
                # If we have remote data, we create a virtual config file
                if remote_data:
                    config_tmp = os.path.join(tmp_dir, "config.yaml")
                    with open(config_tmp, 'w') as f:
                        yaml.dump(remote_data, f, default_flow_style=False)
                    zipf.write(config_tmp, "config.yaml")
                else:
                    # Legacy behavior: use local file
                    zipf.write(self.config.file, os.path.basename(self.config.file))

                # Always include the key if it exists
                if os.path.exists(self.config.key):
                    zipf.write(self.config.key, ".osk")

            # Manage retention (max 10 backups)
            backups = self.list_backups()
            if len(backups) >= 10:
                oldest = min(backups, key=lambda x: x['timestamp'] or '0')
                self.delete_backup(oldest['id'])

            # Upload
            return self.upload_file(zip_path, timestamp)

    def upload_file(self, file_path, timestamp):
        """Internal method to upload to Drive."""
        creds = self.get_credentials()
        if not creds: return False

        service = build('drive', 'v3', credentials=creds)
        date_str = datetime.fromtimestamp(timestamp/1000).strftime('%Y-%m-%d %H:%M:%S')

        file_metadata = {
            'name': os.path.basename(file_path),
            'parents': ["appDataFolder"],
            'appProperties': {
                'timestamp': str(timestamp),
                'date': date_str
            }
        }
        media = MediaFileUpload(file_path)
        try:
            service.files().create(body=file_metadata, media_body=media, fields='id').execute()
            printer.success("Backup uploaded to Google Drive.")
            return True
        except Exception as e:
            printer.error(f"Upload failed: {e}")
            return False

    def delete_backup(self, file_id):
        """Delete a backup from Drive."""
        creds = self.get_credentials()
        if not creds: return False
        try:
            service = build("drive", "v3", credentials=creds)
            service.files().delete(fileId=file_id).execute()
            return True
        except Exception as e:
            printer.error(f"Delete failed: {e}")
            return False

    def restore_backup(self, file_id=None, restore_config=True, restore_nodes=True, app_instance=None):
        """Download and analyze a backup for restoration."""
        backups = self.list_backups()
        if not backups:
            printer.error("No backups found.")
            return None

        if file_id:
            selected = next((f for f in backups if f['id'] == file_id), None)
            if not selected:
                printer.error(f"Backup {file_id} not found.")
                return None
        else:
            selected = max(backups, key=lambda x: x['timestamp'] or '0')

        with tempfile.TemporaryDirectory() as tmp_dir:
            zip_path = os.path.join(tmp_dir, 'restore.zip')
            if self.download_file(selected['id'], zip_path):
                return self.perform_restore(zip_path, restore_config, restore_nodes, app_instance)
            return False

    def download_file(self, file_id, dest):
        """Internal method to download from Drive."""
        creds = self.get_credentials()
        if not creds: return False
        try:
            service = build('drive', 'v3', credentials=creds)
            request = service.files().get_media(fileId=file_id)
            with io.FileIO(dest, mode='wb') as fh:
                downloader = MediaIoBaseDownload(fh, request)
                done = False
                while not done:
                    _, done = downloader.next_chunk()
            return True
        except Exception as e:
            printer.error(f"Download failed: {e}")
            return False

    def perform_restore(self, zip_path, restore_config=True, restore_nodes=True, app_instance=None):
        """Execute the actual restoration of files or remote nodes."""
        try:
            with zipfile.ZipFile(zip_path, 'r') as zipf:
                names = zipf.namelist()
                dest_dir = os.path.dirname(self.config.file)

                # We need to read the config content from zip to decide what to do
                backup_data = {}
                config_filename = "config.yaml" if "config.yaml" in names else ("config.json" if "config.json" in names else None)

                if config_filename:
                    with zipf.open(config_filename) as f:
                        backup_data = yaml.safe_load(f)

                # 1. Restore Key (.osk) - Part of config identity
                if restore_config and ".osk" in names:
                    zipf.extract(".osk", os.path.dirname(self.config.key))

                # 2. Restore Config (Local Settings)
                if restore_config and backup_data:
                    local_config = self.config.config.copy()

                    # Capture current connectivity settings to preserve them
                    current_mode = local_config.get("service_mode", "local")
                    current_remote = local_config.get("remote_host")

                    if "config" in backup_data:
                        local_config.update(backup_data["config"])

                    # Restore connectivity settings - we don't want a restore to
                    # accidentally switch us between local and remote and break connectivity
                    local_config["service_mode"] = current_mode
                    if current_remote:
                        local_config["remote_host"] = current_remote

                    self.config.config = local_config
                    self.config._saveconfig(self.config.file)

                # 3. Restore Nodes and Profiles
                if restore_nodes and backup_data:
                    connections = backup_data.get("connections", {})
                    profiles = backup_data.get("profiles", {})

                    if app_instance and app_instance.services.mode == "remote":
                        # Push to Remote via gRPC
                        app_instance.services.nodes.full_replace(connections, profiles)
                    else:
                        # Restore to Local config file
                        self.config.connections = connections
                        self.config.profiles = profiles
                        self.config._saveconfig(self.config.file)

            # Clear caches
            for f in [self.config.cachefile, self.config.fzf_cachefile]:
                if os.path.exists(f): os.remove(f)

            return True
        except Exception as e:
            printer.error(f"Restoration failed: {e}")
            return False

    def analyze_backup_content(self, file_id=None):
        """Analyze a backup without restoring to provide info for confirmation."""
        backups = self.list_backups()
        if not backups: return None
        selected = next((f for f in backups if f['id'] == file_id), None) if file_id else max(backups, key=lambda x: x['timestamp'] or '0')

        with tempfile.TemporaryDirectory() as tmp_dir:
            zip_path = os.path.join(tmp_dir, 'analyze.zip')
            if self.download_file(selected['id'], zip_path):
                with zipfile.ZipFile(zip_path, 'r') as zipf:
                    names = zipf.namelist()
                    config_filename = "config.yaml" if "config.yaml" in names else ("config.json" if "config.json" in names else None)
                    if config_filename:
                        with zipf.open(config_filename) as f:
                            data = yaml.safe_load(f)
                        connections = data.get("connections", {})

                        # Accurate recursive count
                        nodes_count = 0
                        folders_count = 0

                        # Layer 1
                        for k, v in connections.items():
                            if isinstance(v, dict):
                                if v.get("type") == "connection":
                                    nodes_count += 1
                                elif v.get("type") == "folder":
                                    folders_count += 1
                                    # Layer 2
                                    for k2, v2 in v.items():
                                        if isinstance(v2, dict):
                                            if v2.get("type") == "connection":
                                                nodes_count += 1
                                            elif v2.get("type") == "subfolder":
                                                folders_count += 1
                                                # Layer 3
                                                for k3, v3 in v2.items():
                                                    if isinstance(v3, dict) and v3.get("type") == "connection":
                                                        nodes_count += 1

                        return {
                            "nodes": nodes_count,
                            "folders": folders_count,
                            "profiles": len(data.get("profiles", {})),
                            "has_config": "config" in data,
                            "has_key": ".osk" in names
                        }
        return None

    def perform_sync(self, app_instance):
        """Background sync logic."""
        # Always check current config state
        sync_enabled = self.config.config.get("sync", False)
        sync_remote = self.config.config.get("sync_remote", False)

        if not sync_enabled: return

        printer.info("Triggering auto-sync...")
        if self.check_login_status() != True:
            printer.warning("Auto-sync: Not logged in to Google Drive.")
            return

        remote_data = None
        if sync_remote and app_instance.services.mode == "remote":
            try:
                inventory = app_instance.services.nodes.get_inventory()
                # Merge with local settings
                local_settings = app_instance.services.config_svc.get_settings()
                local_settings.pop("configfolder", None)

                # Maintain proper config structure: {config: {}, connections: {}, profiles: {}}
                remote_data = {
                    "config": local_settings,
                    "connections": inventory.get("connections", {}),
                    "profiles": inventory.get("profiles", {})
                }
            except Exception as e:
                printer.warning(f"Could not fetch remote inventory for sync: {e}")

        # Run in thread to not block CLI
        threading.Thread(
            target=self.compress_and_upload,
            args=(remote_data,)
        ).start()
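The retention logic above keeps at most 10 backups by deleting the one with the smallest `timestamp` appProperty. Timestamps are stored as stringified epoch milliseconds, so the lexicographic `min` is valid while all values have the same digit count, and the `or '0'` fallback makes entries with a missing property sort first. A sketch of that selection with invented data:

```python
backups = [
    {"id": "a", "timestamp": "1700000000000"},
    {"id": "b", "timestamp": "1700000005000"},
    {"id": "c", "timestamp": None},  # missing property sorts first via the '0' fallback
]

# Same key function the service uses: None/empty timestamps fall back to '0'
oldest = min(backups, key=lambda x: x["timestamp"] or "0")
```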
@@ -0,0 +1,88 @@
from .base import BaseService
from .exceptions import ConnpyError


class SystemService(BaseService):
    """Business logic for application lifecycle (API, processes)."""

    def start_api(self, port=None):
        """Start the Connpy REST API."""
        from connpy.api import start_api
        try:
            start_api(port, config=self.config)
        except Exception as e:
            raise ConnpyError(f"Failed to start API: {e}")

    def debug_api(self, port=None):
        """Start the Connpy REST API in debug mode."""
        from connpy.api import debug_api
        try:
            debug_api(port, config=self.config)
        except Exception as e:
            raise ConnpyError(f"Failed to start API in debug mode: {e}")

    def stop_api(self):
        """Stop the Connpy REST API."""
        try:
            import os
            import signal

            pids = ["/run/connpy.pid", "/tmp/connpy.pid"]
            stopped = False
            for pid_file in pids:
                if os.path.exists(pid_file):
                    try:
                        with open(pid_file, "r") as f:
                            # Read only the first line (PID)
                            line = f.readline().strip()
                            if not line:
                                continue
                            pid = int(line)
                        os.kill(pid, signal.SIGTERM)
                        # Remove the PID file after successful kill
                        os.remove(pid_file)
                        stopped = True
                    except (ValueError, OSError, ProcessLookupError):
                        # If process is already dead, just remove the stale PID file
                        try:
                            os.remove(pid_file)
                        except OSError:
                            pass
                        continue
            return stopped
        except Exception as e:
            raise ConnpyError(f"Failed to stop API: {e}")

    def restart_api(self, port=None):
        """Restart the Connpy REST API, maintaining the current port if none provided."""
        if port is None:
            status = self.get_api_status()
            if status["running"] and status.get("port"):
                port = status["port"]

        self.stop_api()
        import time
        time.sleep(1)
        self.start_api(port)

    def get_api_status(self):
        """Check if the API is currently running."""
        import os
        pids = ["/run/connpy.pid", "/tmp/connpy.pid"]
        for pid_file in pids:
            if os.path.exists(pid_file):
                try:
                    with open(pid_file, "r") as f:
                        pid_line = f.readline().strip()
                        port_line = f.readline().strip()
                    if not pid_line:
                        continue
                    pid = int(pid_line)
                    port = int(port_line) if port_line else None
                    # Signal 0 checks for process existence without killing it
                    os.kill(pid, 0)
                    return {"running": True, "pid": pid, "port": port, "pid_file": pid_file}
                except (ValueError, OSError, ProcessLookupError):
                    continue
        return {"running": False}
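`get_api_status` reads a two-line PID file (PID, then port) and probes the process with signal 0, which on POSIX checks existence without delivering anything. A standalone sketch of that check using the current process's own PID (the temp file path and port value are invented for the example):

```python
import os
import tempfile

# Write a two-line PID file: pid then port, the layout the service expects
with tempfile.NamedTemporaryFile("w", suffix=".pid", delete=False) as f:
    f.write(f"{os.getpid()}\n8048\n")
    pid_file = f.name

with open(pid_file) as f:
    pid = int(f.readline().strip())
    port = int(f.readline().strip())

try:
    os.kill(pid, 0)  # raises OSError/ProcessLookupError if no such process
    running = True
except OSError:
    running = False

os.remove(pid_file)
status = {"running": running, "pid": pid, "port": port}
```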
@@ -17,11 +17,13 @@ class TestAIInit:
         assert myai.engineer_model == "test/test-model"
         assert myai.architect_model == "test/test-architect"

-    def test_init_missing_engineer_key(self, config):
-        """Raises ValueError if engineer key is missing."""
+    def test_ask_missing_engineer_key(self, config):
+        """Raises ValueError if engineer key is missing when asking."""
         from connpy.ai import ai
-        with pytest.raises(ValueError, match="Engineer API key"):
-            ai(config)
+        myai = ai(config)
+        with pytest.raises(ValueError) as exc:
+            myai.ask("hello")
+        assert "Engineer API key not configured" in str(exc.value)

     def test_init_missing_architect_key_warns(self, ai_config, capsys, mock_litellm):
         """Warns if architect key is missing but doesn't crash."""
@@ -1,268 +0,0 @@
"""Tests for connpy.api module — Flask routes."""
import json
import pytest
from unittest.mock import patch, MagicMock


@pytest.fixture
def api_client(populated_config):
    """Create a Flask test client with a populated config."""
    from connpy.api import app
    app.custom_config = populated_config
    app.config["TESTING"] = True
    with app.test_client() as client:
        yield client


# =========================================================================
# Root endpoint
# =========================================================================

class TestRootEndpoint:
    def test_root_returns_welcome(self, api_client):
        response = api_client.get("/")
        data = response.get_json()
        assert response.status_code == 200
        assert "Welcome" in data["message"]
        assert "version" in data


# =========================================================================
# /list_nodes endpoint
# =========================================================================

class TestListNodes:
    def test_list_nodes_no_filter(self, api_client):
        response = api_client.post("/list_nodes", json={})
        data = response.get_json()
        assert response.status_code == 200
        assert isinstance(data, list)
        assert "router1" in data

    def test_list_nodes_with_filter(self, api_client):
        response = api_client.post("/list_nodes", json={"filter": "router.*"})
        data = response.get_json()
        assert "router1" in data
        assert all("router" in n or "Router" in n for n in data)

    def test_list_nodes_case_insensitive(self, api_client):
        """Filter is lowercased when case=false."""
        response = api_client.post("/list_nodes", json={"filter": "ROUTER.*"})
        data = response.get_json()
        # Should still match since the filter gets lowercased
        assert isinstance(data, list)

    def test_list_nodes_no_body(self, api_client):
        """No body returns all nodes."""
        response = api_client.post("/list_nodes",
                                   data="",
                                   content_type="application/json")
        data = response.get_json()
        assert isinstance(data, list)


# =========================================================================
# /get_nodes endpoint
# =========================================================================

class TestGetNodes:
    def test_get_nodes_no_filter(self, api_client):
        response = api_client.post("/get_nodes", json={})
        data = response.get_json()
        assert response.status_code == 200
        assert isinstance(data, dict)
        assert "router1" in data

    def test_get_nodes_with_filter(self, api_client):
        response = api_client.post("/get_nodes", json={"filter": "router.*"})
        data = response.get_json()
        assert "router1" in data
        assert "host" in data["router1"]

    def test_get_nodes_has_attributes(self, api_client):
        response = api_client.post("/get_nodes", json={"filter": "router1"})
        data = response.get_json()
        if "router1" in data:
            assert "host" in data["router1"]
            assert "protocol" in data["router1"]


# =========================================================================
# /run_commands endpoint
# =========================================================================

class TestRunCommands:
    def test_missing_action(self, api_client):
        response = api_client.post("/run_commands", json={
            "nodes": "router1",
            "commands": ["show version"]
        })
        data = response.get_json()
        assert "DataError" in data
        assert "action" in data["DataError"]

    def test_missing_nodes(self, api_client):
        response = api_client.post("/run_commands", json={
            "action": "run",
            "commands": ["show version"]
        })
        data = response.get_json()
        assert "DataError" in data
        assert "nodes" in data["DataError"]

    def test_missing_commands(self, api_client):
        response = api_client.post("/run_commands", json={
            "action": "run",
            "nodes": "router1"
        })
        data = response.get_json()
        assert "DataError" in data
        assert "commands" in data["DataError"]

    def test_wrong_action(self, api_client):
        response = api_client.post("/run_commands", json={
            "action": "invalid",
            "nodes": "router1",
            "commands": ["show version"]
        })
        data = response.get_json()
        assert "DataError" in data
        assert "Wrong action" in data["DataError"]

    @patch("connpy.api.nodes")
    def test_run_action(self, mock_nodes_cls, api_client):
        """action=run executes and returns output."""
        mock_instance = MagicMock()
        mock_instance.run.return_value = {"router1": "Router v1.0"}
        mock_nodes_cls.return_value = mock_instance

        response = api_client.post("/run_commands", json={
            "action": "run",
            "nodes": "router1",
            "commands": ["show version"]
        })
        data = response.get_json()
        assert "router1" in data

    @patch("connpy.api.nodes")
    def test_test_action(self, mock_nodes_cls, api_client):
        """action=test returns result + output."""
        mock_instance = MagicMock()
        mock_instance.test.return_value = {"router1": {"expected": True}}
        mock_instance.output = {"router1": "output text"}
        mock_nodes_cls.return_value = mock_instance

        response = api_client.post("/run_commands", json={
            "action": "test",
            "nodes": "router1",
            "commands": ["show version"],
            "expected": "Router"
        })
        data = response.get_json()
        assert "result" in data
        assert "output" in data

    @patch("connpy.api.nodes")
    def test_run_with_options(self, mock_nodes_cls, api_client):
        """Options get passed through."""
        mock_instance = MagicMock()
        mock_instance.run.return_value = {"router1": "ok"}
        mock_nodes_cls.return_value = mock_instance

        response = api_client.post("/run_commands", json={
            "action": "run",
            "nodes": "router1",
            "commands": ["show version"],
            "options": {"timeout": 30, "parallel": 5}
        })
        assert response.status_code == 200

    @patch("connpy.api.nodes")
    def test_run_folder_nodes(self, mock_nodes_cls, api_client):
        """Nodes with @ prefix are resolved as folders."""
        mock_instance = MagicMock()
        mock_instance.run.return_value = {"server1@office": "ok"}
        mock_nodes_cls.return_value = mock_instance

        response = api_client.post("/run_commands", json={
            "action": "run",
            "nodes": "@office",
            "commands": ["ls -la"]
        })
        assert response.status_code == 200

    @patch("connpy.api.nodes")
    def test_run_list_nodes(self, mock_nodes_cls, api_client):
        """List of nodes is resolved correctly."""
        mock_instance = MagicMock()
        mock_instance.run.return_value = {"router1": "ok", "server1@office": "ok"}
        mock_nodes_cls.return_value = mock_instance

        response = api_client.post("/run_commands", json={
            "action": "run",
            "nodes": ["router1", "server1@office"],
            "commands": ["show version"]
        })
        assert response.status_code == 200


# =========================================================================
# /ask_ai endpoint
# =========================================================================

class TestAskAI:
    @patch("connpy.api.myai")
    def test_ask_ai(self, mock_ai_cls, api_client):
        mock_instance = MagicMock()
        mock_instance.ask.return_value = {"response": "AI says hello"}
        mock_ai_cls.return_value = mock_instance

        response = api_client.post("/ask_ai", json={
            "input": "list my routers"
        })
        data = response.get_json()
        assert data is not None

    @patch("connpy.api.myai")
    def test_ask_ai_with_dryrun(self, mock_ai_cls, api_client):
        mock_instance = MagicMock()
        mock_instance.ask.return_value = {"response": "dry run"}
        mock_ai_cls.return_value = mock_instance

        response = api_client.post("/ask_ai", json={
|
||||
"input": "test",
|
||||
"dryrun": True
|
||||
})
|
||||
assert response.status_code == 200
|
||||
|
||||
@patch("connpy.api.myai")
|
||||
def test_ask_ai_with_history(self, mock_ai_cls, api_client):
|
||||
mock_instance = MagicMock()
|
||||
mock_instance.ask.return_value = {"response": "with history"}
|
||||
mock_ai_cls.return_value = mock_instance
|
||||
|
||||
response = api_client.post("/ask_ai", json={
|
||||
"input": "follow up",
|
||||
"chat_history": [
|
||||
{"role": "user", "content": "previous"},
|
||||
{"role": "assistant", "content": "answer"}
|
||||
]
|
||||
})
|
||||
assert response.status_code == 200
|
||||
|
||||
|
||||
# =========================================================================
|
||||
# /confirm endpoint
|
||||
# =========================================================================
|
||||
|
||||
class TestConfirm:
|
||||
@patch("connpy.api.myai")
|
||||
def test_confirm(self, mock_ai_cls, api_client):
|
||||
mock_instance = MagicMock()
|
||||
mock_instance.confirm.return_value = True
|
||||
mock_ai_cls.return_value = mock_instance
|
||||
|
||||
response = api_client.post("/confirm", json={
|
||||
"input": "yes"
|
||||
})
|
||||
assert response.status_code == 200
|
||||
@@ -1,51 +1,56 @@
"""Tests for connpy.core_plugins.capture"""
import pytest
from unittest.mock import MagicMock, patch
from connpy.core_plugins.capture import RemoteCapture
from connpy.core_plugins.capture import Entrypoint


@pytest.fixture
def RemoteCapture():
    return Entrypoint.get_remote_capture_class()


@pytest.fixture
def mock_connapp():
    app = MagicMock()
    app.nodes_list = ["test_node"]
    app.config.getitem.return_value = {"host": "127.0.0.1", "protocol": "ssh"}
    app.services.nodes.list_nodes.return_value = ["test_node"]
    app.services.nodes.get_node_details.return_value = {"host": "127.0.0.1", "protocol": "ssh"}
    app.services.config_svc.get_settings().get.return_value = "/fake/ws"

    mock_node = MagicMock()
    mock_node.protocol = "ssh"
    mock_node.unique = "test_node"
    app.node.return_value = mock_node
    app.config.config = {"wireshark_path": "/fake/ws"}
    return app

class TestRemoteCapture:
    def test_init_node_not_found(self, mock_connapp):
        # Attempt to capture a node not in nodes_list
        mock_connapp.nodes_list = ["other_node"]
    def test_init_node_not_found(self, mock_connapp, RemoteCapture):
        # Attempt to capture a node not in inventory
        mock_connapp.services.nodes.list_nodes.return_value = []
        with pytest.raises(SystemExit) as exc:
            RemoteCapture(mock_connapp, "test_node", "eth0")
        assert exc.value.code == 2

    def test_init_success(self, mock_connapp):
    def test_init_success(self, mock_connapp, RemoteCapture):
        rc = RemoteCapture(mock_connapp, "test_node", "eth0")
        assert rc.node_name == "test_node"
        assert rc.interface == "eth0"
        assert rc.wireshark_path == "/fake/ws"

    @patch("connpy.core_plugins.capture.socket")
    def test_is_port_in_use(self, mock_socket, mock_connapp):
    def test_is_port_in_use(self, mock_connapp, RemoteCapture):
        rc = RemoteCapture(mock_connapp, "test_node", "eth0")
        mock_sock_instance = MagicMock()
        mock_socket.socket.return_value.__enter__.return_value = mock_sock_instance

        mock_sock_instance.connect_ex.return_value = 0
        assert rc._is_port_in_use(8080) is True

        mock_sock_instance.connect_ex.return_value = 1
        assert rc._is_port_in_use(8080) is False
        with patch("socket.socket") as mock_socket:
            mock_sock_instance = MagicMock()
            mock_socket.return_value.__enter__.return_value = mock_sock_instance

            mock_sock_instance.connect_ex.return_value = 0
            assert rc._is_port_in_use(8080) is True

            mock_sock_instance.connect_ex.return_value = 1
            assert rc._is_port_in_use(8080) is False

    @patch.object(RemoteCapture, "_is_port_in_use")
    def test_find_free_port(self, mock_is_in_use, mock_connapp):
    def test_find_free_port(self, mock_connapp, RemoteCapture):
        rc = RemoteCapture(mock_connapp, "test_node", "eth0")
        # First 2 ports in use, 3rd is free
        mock_is_in_use.side_effect = [True, True, False]
        port = rc._find_free_port(20000, 30000)
        assert 20000 <= port <= 30000
        assert mock_is_in_use.call_count == 3
        with patch.object(RemoteCapture, "_is_port_in_use") as mock_is_in_use:
            # First 2 ports in use, 3rd is free
            mock_is_in_use.side_effect = [True, True, False]
            port = rc._find_free_port(20000, 30000)
            assert 20000 <= port <= 30000
            assert mock_is_in_use.call_count == 3

@@ -2,7 +2,7 @@
import os
import json
import pytest
from connpy.completion import load_txt_cache, _getcwd, _get_plugins
from connpy.completion import load_txt_cache, get_cwd


# =========================================================================
@@ -25,7 +25,7 @@ class TestLoadTxtCache:


# =========================================================================
# _getcwd tests
# get_cwd tests
# =========================================================================

class TestGetCwd:
@@ -37,7 +37,7 @@ class TestGetCwd:
        subdir = tmp_path / "subdir"
        subdir.mkdir()

        result = _getcwd(["run", "run"], "run")
        result = get_cwd(["run", "run"])
        # Should list files
        assert any("file1.txt" in r for r in result)
        assert any("subdir/" in r for r in result)
@@ -48,7 +48,7 @@
        (tmp_path / "script.yaml").touch()
        (tmp_path / "script2.yaml").touch()

        result = _getcwd(["run", "script"], "run")
        result = get_cwd(["run", "script"])
        assert any("script" in r for r in result)

    def test_folder_only(self, tmp_path, monkeypatch):
@@ -58,65 +58,11 @@ class TestGetCwd:
        subdir = tmp_path / "mydir"
        subdir.mkdir()

        result = _getcwd(["export", "export"], "export", folderonly=True)
        result = get_cwd(["export", "export"], folderonly=True)
        files_in_result = [r for r in result if "file.txt" in r]
        assert len(files_in_result) == 0
        dirs_in_result = [r for r in result if "mydir" in r]
        assert len(dirs_in_result) > 0


# =========================================================================
# _get_plugins tests
# =========================================================================

class TestGetPlugins:
    def test_get_plugins_disable(self, tmp_path):
        """--disable returns enabled plugins."""
        plugin_dir = tmp_path / "plugins"
        plugin_dir.mkdir()
        (plugin_dir / "active.py").touch()
        (plugin_dir / "disabled.py.bkp").touch()

        result = _get_plugins("--disable", str(tmp_path))
        assert "active" in result
        assert "disabled" not in result

    def test_get_plugins_enable(self, tmp_path):
        """--enable returns disabled plugins."""
        plugin_dir = tmp_path / "plugins"
        plugin_dir.mkdir()
        (plugin_dir / "active.py").touch()
        (plugin_dir / "disabled.py.bkp").touch()

        result = _get_plugins("--enable", str(tmp_path))
        assert "disabled" in result
        assert "active" not in result

    def test_get_plugins_del(self, tmp_path):
        """--del returns all plugins."""
        plugin_dir = tmp_path / "plugins"
        plugin_dir.mkdir()
        (plugin_dir / "active.py").touch()
        (plugin_dir / "disabled.py.bkp").touch()

        result = _get_plugins("--del", str(tmp_path))
        assert "active" in result
        assert "disabled" in result

    def test_get_plugins_all(self, tmp_path):
        """'all' returns dict with paths."""
        plugin_dir = tmp_path / "plugins"
        plugin_dir.mkdir()
        (plugin_dir / "myplugin.py").touch()

        result = _get_plugins("all", str(tmp_path))
        assert isinstance(result, dict)
        assert "myplugin" in result

    def test_get_plugins_empty_dir(self, tmp_path):
        """Empty plugins directory returns empty list."""
        plugin_dir = tmp_path / "plugins"
        plugin_dir.mkdir()

        result = _get_plugins("--disable", str(tmp_path))
        assert result == []

@@ -307,8 +307,9 @@ class TestGetAll:
        assert "server1@office" not in nodes

    def test_getallnodes_filter_invalid_type(self, populated_config):
        with pytest.raises(ValueError):
        with pytest.raises(SystemExit) as exc:
            populated_config._getallnodes(123)
        assert exc.value.code == 1

    def test_getallfolders(self, populated_config):
        folders = populated_config._getallfolders()

@@ -0,0 +1,264 @@
import pytest
from unittest.mock import patch, MagicMock
from connpy.connapp import connapp
import sys
import yaml
import os

@pytest.fixture
def app(populated_config):
    """Returns an instance of connapp initialized with the mock config."""
    return connapp(populated_config)

def test_connapp_init(app, populated_config):
    """Test that connapp initializes correctly with config."""
    assert app.config == populated_config
    assert app.case == populated_config.config.get("case", False)

@patch("connpy.cli.node_handler.NodeHandler.dispatch")
def test_node_default(mock_func_node, app):
    """Test that default 'node' command correctly parses and calls _func_node."""
    app.start(["node", "router1"])
    mock_func_node.assert_called_once()
    args = mock_func_node.call_args[0][0]
    assert args.data == "router1"
    assert args.action == "connect"

@patch("connpy.cli.node_handler.NodeHandler.dispatch")
def test_node_add(mock_func_node, app):
    """Test that 'node -a' command correctly parses."""
    app.start(["node", "-a", "new_router"])
    mock_func_node.assert_called_once()
    args = mock_func_node.call_args[0][0]
    assert args.data == "new_router"
    assert args.action == "add"

@patch("connpy.services.node_service.NodeService.list_nodes")
@patch("connpy.services.node_service.NodeService.delete_node")
@patch("inquirer.prompt")
def test_node_del(mock_prompt, mock_delete_node, mock_list_nodes, app):
    mock_list_nodes.return_value = ["router1"]
    mock_prompt.return_value = {"delete": True}
    app.start(["node", "-r", "router1"])
    mock_delete_node.assert_called_once_with("router1", is_folder=False)

@patch("connpy.services.node_service.NodeService.list_nodes")
@patch("connpy.services.node_service.NodeService.get_node_details")
@patch("connpy.services.node_service.NodeService.update_node")
@patch("connpy.cli.forms.Forms.questions_edit")
@patch("connpy.cli.forms.Forms.questions_nodes")
def test_node_mod(mock_q_nodes, mock_q_edit, mock_update_node, mock_get_details, mock_list_nodes, app):
    mock_list_nodes.return_value = ["router1"]
    mock_get_details.return_value = {"host": "1.1.1.1", "port": 22}
    mock_q_edit.return_value = {"host": True}
    mock_q_nodes.return_value = {"host": "2.2.2.2", "port": 22}

    app.start(["node", "-e", "router1"])
    mock_update_node.assert_called_once()

@patch("connpy.printer.data")
def test_node_show(mock_data, app):
    app.nodes_list = ["router1"]
    app.config.getitem = MagicMock(return_value={"host": "1.1.1.1"})
    app.start(["node", "-s", "router1"])
    mock_data.assert_called()

@patch("connpy.services.profile_service.ProfileService.list_profiles")
@patch("connpy.connapp.printer.console.print")
def test_profile_list(mock_print, mock_list_profiles, app):
    """Test 'profile list' invokes profile service correctly."""
    mock_list_profiles.return_value = ["default", "office-user"]
    app.start(["list", "profiles"])
    assert mock_list_profiles.call_count >= 2

@patch("connpy.services.node_service.NodeService.list_nodes")
def test_node_list(mock_list_nodes, app):
    """Test 'list nodes' invokes node service."""
    mock_list_nodes.return_value = ["router1", "server1"]
    app.start(["list", "nodes"])
    # Should be called during init and during the list command
    assert mock_list_nodes.call_count >= 2

@patch("connpy.services.system_service.SystemService.get_api_status")
def test_api_stop(mock_status, app):
    mock_status.return_value = {"running": True, "pid": "1234"}
    app.services.system.stop_api = MagicMock(return_value=True)
    app.start(["api", "-x"])
    app.services.system.stop_api.assert_called_once()

@patch("connpy.services.profile_service.ProfileService.list_profiles")
@patch("connpy.services.profile_service.ProfileService.add_profile")
@patch("connpy.cli.forms.Forms.questions_profiles")
def test_profile_add(mock_q_profiles, mock_add_profile, mock_list_profiles, app):
    mock_list_profiles.return_value = ["default"]
    mock_q_profiles.return_value = {"host": "test"}
    app.start(["profile", "-a", "new_profile"])
    mock_add_profile.assert_called_once_with("new_profile", {"host": "test"})

@patch("connpy.services.profile_service.ProfileService.get_profile")
@patch("connpy.services.profile_service.ProfileService.delete_profile")
@patch("inquirer.prompt")
def test_profile_del(mock_prompt, mock_delete_profile, mock_get_profile, app):
    mock_get_profile.return_value = {"host": "test"}
    mock_prompt.return_value = {"delete": True}
    app.start(["profile", "-r", "test_profile"])
    mock_delete_profile.assert_called_once_with("test_profile")

@patch("connpy.services.profile_service.ProfileService.get_profile")
@patch("connpy.services.profile_service.ProfileService.update_profile")
@patch("connpy.cli.forms.Forms.questions_edit")
@patch("connpy.cli.forms.Forms.questions_profiles")
def test_profile_mod(mock_q_profiles, mock_q_edit, mock_update_profile, mock_get_profile, app):
    mock_get_profile.return_value = {"host": "test", "port": 22}
    mock_q_edit.return_value = {"host": True}
    mock_q_profiles.return_value = {"id": "test_profile", "host": "new_host", "port": 22}
    app.start(["profile", "-e", "test_profile"])
    mock_update_profile.assert_called_once_with("test_profile", {"id": "test_profile", "host": "new_host", "port": 22})

@patch("connpy.services.profile_service.ProfileService.get_profile")
@patch("connpy.printer.data")
def test_profile_show(mock_data, mock_get_profile, app):
    mock_get_profile.return_value = {"host": "test"}
    app.start(["profile", "-s", "test_profile"])
    mock_data.assert_called()

@patch("connpy.services.node_service.NodeService.move_node")
def test_move(mock_move_node, app):
    app.start(["move", "src_node", "dst_node"])
    mock_move_node.assert_called_once_with("src_node", "dst_node", copy=False)

@patch("connpy.services.node_service.NodeService.move_node")
def test_copy(mock_move_node, app):
    app.start(["copy", "src_node", "dst_node"])
    mock_move_node.assert_called_once_with("src_node", "dst_node", copy=True)

@patch("connpy.cli.forms.Forms.questions_bulk")
@patch("connpy.services.node_service.NodeService.bulk_add")
def test_bulk(mock_bulk_add, mock_q_bulk, app):
    mock_q_bulk.return_value = {"ids": "node1", "host": "host1", "location": ""}
    mock_bulk_add.return_value = 1
    app.start(["bulk"])
    mock_bulk_add.assert_called_once()

@patch("connpy.services.import_export_service.ImportExportService.export_to_file")
def test_export(mock_export, app):
    with pytest.raises(SystemExit):
        app.start(["export", "file.yml", "@folder1"])
    mock_export.assert_called_once_with("file.yml", folders=["@folder1"])

@patch("os.path.exists")
@patch("inquirer.prompt")
@patch("connpy.services.import_export_service.ImportExportService.import_from_file")
def test_import(mock_import, mock_prompt, mock_exists, app):
    mock_exists.return_value = True
    mock_prompt.return_value = {"import": True}
    app.start(["import", "file.yml"])
    mock_import.assert_called_once_with("file.yml")

@patch("connpy.services.ai_service.AIService.ask")
@patch("connpy.connapp.console.status")
def test_ai(mock_status, mock_ask, app):
    mock_ask.return_value = {"response": "AI output", "usage": {"total": 10, "input": 5, "output": 5}}

    app.start(["ai", "--engineer-api-key", "testkey", "how are you"])
    mock_ask.assert_called_once()

@patch("connpy.services.execution_service.ExecutionService.run_commands")
def test_run(mock_run_commands, app):
    app.start(["run", "node1", "command1", "command2"])
    mock_run_commands.assert_called_once()
    assert mock_run_commands.call_args[1]["nodes_filter"] == "node1"
    assert mock_run_commands.call_args[1]["commands"] == ["command1 command2"]

@patch("os.path.exists")
@patch("shutil.copy2")
@patch("connpy.plugins.Plugins.verify_script")
def test_plugin_add(mock_verify, mock_copy, mock_exists, app):
    def mock_exists_side_effect(path):
        if "testplug.py" in path: return False
        if "testplug.py.bkp" in path: return False
        if "file.py" in path: return True
        return True
    mock_exists.side_effect = mock_exists_side_effect
    mock_verify.return_value = None
    app.commands = []
    app.start(["plugin", "--add", "testplug", "file.py"])
    mock_copy.assert_called()

@patch("connpy.services.config_service.ConfigService.update_setting")
def test_config(mock_update_setting, app):
    app.start(["config", "--allow-uppercase", "true"])
    mock_update_setting.assert_called_with("case", True)

@patch("connpy.services.system_service.SystemService.get_api_status")
def test_api_start(mock_status, app):
    mock_status.return_value = {"running": False}
    app.services.system.start_api = MagicMock()
    app.start(["api", "-s", "8080"])
    app.services.system.start_api.assert_called_once_with(port=8080)

@patch("connpy.services.system_service.SystemService.get_api_status")
def test_api_debug(mock_status, app):
    mock_status.return_value = {"running": False}
    app.services.system.debug_api = MagicMock()
    app.start(["api", "-d", "8080"])
    app.services.system.debug_api.assert_called_once_with(port=8080)

@patch("connpy.services.node_service.NodeService.list_folders")
def test_list_folders(mock_list_folders, app):
    mock_list_folders.return_value = ["folder1"]
    app.start(["list", "folders"])
    # Called during init and during the list command
    assert mock_list_folders.call_count >= 2

@patch("connpy.services.config_service.ConfigService.update_setting")
def test_config_various(mock_update_setting, app):
    app.start(["config", "--fzf", "true"])
    mock_update_setting.assert_called_with("fzf", True)
    app.start(["config", "--keepalive", "60"])
    mock_update_setting.assert_called_with("idletime", 60)

@patch("connpy.services.config_service.ConfigService.set_config_folder")
def test_config_folder(mock_set_config_folder, app):
    app.start(["config", "--configfolder", "/new/path"])
    mock_set_config_folder.assert_called_once_with("/new/path")

@patch("connpy.services.plugin_service.PluginService.list_plugins")
def test_plugin_list(mock_list_plugins, app):
    mock_list_plugins.return_value = {"testplug": {"enabled": True}}
    app.start(["plugin", "--list"])
    mock_list_plugins.assert_called_once()

@patch("connpy.services.plugin_service.PluginService.delete_plugin")
def test_plugin_delete(mock_delete, app):
    app.start(["plugin", "--del", "testplug"])
    mock_delete.assert_called_once_with("testplug")

@patch("connpy.services.plugin_service.PluginService.enable_plugin")
def test_plugin_enable(mock_enable, app):
    app.start(["plugin", "--enable", "testplug"])
    mock_enable.assert_called_once_with("testplug")

@patch("connpy.services.plugin_service.PluginService.disable_plugin")
def test_plugin_disable(mock_disable, app):
    app.start(["plugin", "--disable", "testplug"])
    mock_disable.assert_called_once_with("testplug")

@patch("connpy.services.ai_service.AIService.list_sessions")
def test_ai_list(mock_list_sessions, app):
    mock_list_sessions.return_value = [{"id": "1", "title": "t", "created_at": "now", "model": "m"}]
    app.start(["ai", "--list"])
    mock_list_sessions.assert_called_once()

def test_type_node_reserved_word(app):
    app.commands = ["bulk", "ai", "run"]
    with patch("sys.argv", ["connpy", "node", "-a", "bulk"]):
        with pytest.raises(SystemExit) as exc:
            app._type_node("bulk")
        assert exc.value.code == 2

    # In move/copy it also raises because destination cannot be reserved
    with patch("sys.argv", ["connpy", "mv", "test1", "bulk"]):
        with pytest.raises(SystemExit) as exc:
            app._type_node("bulk")
        assert exc.value.code == 2
@@ -1,109 +0,0 @@
"""Tests for connpy.core_plugins.context"""
import pytest
from unittest.mock import MagicMock, patch
from connpy.core_plugins.context import context_manager, Preload, Entrypoint

@pytest.fixture
def mock_connapp():
    connapp = MagicMock()
    connapp.config.config = {
        "contexts": {"all": [".*"]},
        "current_context": "all"
    }
    return connapp

class TestContextManager:
    def test_init(self, mock_connapp):
        cm = context_manager(mock_connapp)
        assert cm.contexts == {"all": [".*"]}
        assert cm.current_context == "all"
        assert len(cm.regex) == 1

    def test_add_context_success(self, mock_connapp):
        cm = context_manager(mock_connapp)
        cm.add_context("prod", ["^prod_.*"])
        assert "prod" in cm.contexts
        mock_connapp._change_settings.assert_called_with("contexts", cm.contexts)

    def test_add_context_invalid_name(self, mock_connapp):
        cm = context_manager(mock_connapp)
        with pytest.raises(SystemExit) as exc:
            cm.add_context("prod-env", ["Regex"])
        assert exc.value.code == 1

    def test_add_context_already_exists(self, mock_connapp):
        cm = context_manager(mock_connapp)
        with pytest.raises(SystemExit) as exc:
            cm.add_context("all", ["Regex"])
        assert exc.value.code == 2

    def test_modify_context_success(self, mock_connapp):
        cm = context_manager(mock_connapp)
        cm.add_context("prod", ["old"])
        cm.modify_context("prod", ["new"])
        assert cm.contexts["prod"] == ["new"]

    def test_modify_context_all(self, mock_connapp):
        cm = context_manager(mock_connapp)
        with pytest.raises(SystemExit) as exc:
            cm.modify_context("all", ["new"])
        assert exc.value.code == 3

    def test_modify_context_not_exists(self, mock_connapp):
        cm = context_manager(mock_connapp)
        with pytest.raises(SystemExit) as exc:
            cm.modify_context("fake", ["new"])
        assert exc.value.code == 4

    def test_delete_context_success(self, mock_connapp):
        cm = context_manager(mock_connapp)
        cm.add_context("prod", ["old"])
        cm.delete_context("prod")
        assert "prod" not in cm.contexts

    def test_delete_context_all(self, mock_connapp):
        cm = context_manager(mock_connapp)
        with pytest.raises(SystemExit) as exc:
            cm.delete_context("all")
        assert exc.value.code == 3

    def test_delete_context_current(self, mock_connapp):
        mock_connapp.config.config["current_context"] = "prod"
        mock_connapp.config.config["contexts"]["prod"] = [".*"]
        cm = context_manager(mock_connapp)
        with pytest.raises(SystemExit) as exc:
            cm.delete_context("prod")
        assert exc.value.code == 5

    def test_set_context_success(self, mock_connapp):
        cm = context_manager(mock_connapp)
        cm.contexts["prod"] = [".*"]
        cm.set_context("prod")
        mock_connapp._change_settings.assert_called_with("current_context", "prod")

    def test_set_context_already_set(self, mock_connapp):
        cm = context_manager(mock_connapp)
        with pytest.raises(SystemExit) as exc:
            cm.set_context("all")
        assert exc.value.code == 0

    def test_match_regexp(self, mock_connapp):
        mock_connapp.config.config["contexts"]["all"] = ["^prod", "^test"]
        cm = context_manager(mock_connapp)
        assert cm.match_any_regex("prod_node", cm.regex) is True
        assert cm.match_any_regex("test_node", cm.regex) is True
        assert cm.match_any_regex("dev_node", cm.regex) is False

    def test_modify_node_list(self, mock_connapp):
        mock_connapp.config.config["contexts"]["all"] = ["^prod"]
        cm = context_manager(mock_connapp)
        nodes = ["prod_1", "dev_1", "prod_2"]
        result = cm.modify_node_list(result=nodes)
        assert result == ["prod_1", "prod_2"]

    def test_modify_node_dict(self, mock_connapp):
        mock_connapp.config.config["contexts"]["all"] = ["^prod"]
        cm = context_manager(mock_connapp)
        nodes = {"prod_1": {}, "dev_1": {}, "prod_2": {}}
        result = cm.modify_node_dict(result=nodes)
        assert set(result.keys()) == {"prod_1", "prod_2"}
@@ -121,8 +121,9 @@ class TestCommandGeneration:

    def test_invalid_protocol_raises(self):
        n = self._make_node(protocol="invalid_proto")
        with pytest.raises(ValueError, match="Invalid protocol"):
        with pytest.raises(SystemExit) as exc:
            n._get_cmd()
        assert exc.value.code == 1

    def test_ssh_cmd_no_user(self):
        n = self._make_node(user="")

@@ -0,0 +1,55 @@
import pytest
from unittest.mock import MagicMock, patch
from connpy.services.execution_service import ExecutionService

def test_run_commands_callback(populated_config):
    """Test that run_commands correctly passes on_node_complete to the executor."""
    service = ExecutionService(populated_config)

    # Mock the Nodes class in connpy.services.execution_service
    with patch("connpy.services.execution_service.Nodes") as MockNodes:
        mock_executor = MockNodes.return_value
        mock_executor.run.return_value = {"router1": "output"}

        callback = MagicMock()

        service.run_commands(
            nodes_filter="router1",
            commands=["show version"],
            on_node_complete=callback
        )

        # Verify executor.run was called with on_complete=callback
        # Note: ExecutionService calls executor.run(..., on_complete=on_node_complete, ...)
        MockNodes.return_value.run.assert_called_once()
        args, kwargs = MockNodes.return_value.run.call_args
        assert kwargs["on_complete"] == callback

def test_test_commands_callback_regression(populated_config):
    """
    Test that test_commands correctly passes on_node_complete to the executor.
    Regression: ExecutionService.test_commands currently ignores on_node_complete.
    """
    service = ExecutionService(populated_config)

    with patch("connpy.services.execution_service.Nodes") as MockNodes:
        mock_executor = MockNodes.return_value
        mock_executor.test.return_value = {"router1": {"PASS": True}}

        callback = MagicMock()

        service.test_commands(
            nodes_filter="router1",
            commands=["show version"],
            expected=["12.4"],
            on_node_complete=callback
        )

        # This is expected to FAIL because ExecutionService.test_commands
        # doesn't pass on_complete to executor.test
        MockNodes.return_value.test.assert_called_once()
        args, kwargs = MockNodes.return_value.test.call_args

        # We expect 'on_complete' to be in kwargs and equal to our callback
        assert "on_complete" in kwargs, "on_complete parameter missing in call to executor.test"
        assert kwargs["on_complete"] == callback
@@ -0,0 +1,66 @@
import pytest
from connpy.services.node_service import NodeService
from connpy.services.exceptions import NodeNotFoundError, NodeAlreadyExistsError

def test_list_nodes_filtering_parity(populated_config):
    """
    Test that list_nodes uses literal 'in' logic instead of re.search.
    Regression: NodeService currently uses re.search in some versions,
    but we want to ensure it uses literal 'in' for parity.
    """
    service = NodeService(populated_config)

    # If it uses 'in' logic, '1' should match all nodes containing '1'
    # router1, server1@office, db1@datacenter@office
    nodes = service.list_nodes(filter_str="1")
    assert len(nodes) == 3
    assert "router1" in nodes
    assert "server1@office" in nodes
    assert "db1@datacenter@office" in nodes

    # Test regex-specific characters.
    # NodeService should use re.search, so '^router' will match 'router1'.
    nodes_regex = service.list_nodes(filter_str="^router")

    assert "router1" in nodes_regex

def test_list_nodes_dynamic_formatting(populated_config):
    """
    Test that list_nodes supports dynamic formatting for any node attribute.
    Regression: NodeService currently has hardcoded support for name, location, host.
    """
    service = NodeService(populated_config)

    # Try to format using 'user' and 'protocol' which are NOT in the hardcoded list
    # (name, location, host)
    format_str = "{name} -> {user}@{host} ({protocol})"

    # router1: host=10.0.0.1, user=admin, protocol=ssh
    # Expected: "router1 -> admin@10.0.0.1 (ssh)"

    formatted = service.list_nodes(filter_str="router1", format_str=format_str)

    assert len(formatted) == 1
    # This will FAIL if it only supports {name}, {location}, {host}
    assert formatted[0] == "router1 -> admin@10.0.0.1 (ssh)"

def test_node_editing_parity(populated_config):
    """
    Test that add_node improperly raises NodeAlreadyExistsError when used for editing.
    Regression: connapp._mod calls add_node instead of update_node.
    """
    service = NodeService(populated_config)

    # router1 already exists in populated_config
    # We confirm that calling add_node with an existing ID raises NodeAlreadyExistsError
    # which is why connapp._mod (which calls add_node) is currently broken for editing.
    with pytest.raises(NodeAlreadyExistsError):
        service.add_node("router1", {"host": "1.1.1.1"})

def test_list_nodes_case_sensitivity(populated_config):
    """Test that filtering respects the case setting in config."""
    service = NodeService(populated_config)

    # Default case is False (case-insensitive)
    nodes = service.list_nodes(filter_str="ROUTER")
    assert "router1" in nodes
@@ -48,3 +48,57 @@ class TestPrinter:
        # Second line should be indented by len("[i] ") = 4 chars
        assert lines[1].startswith("    line2")
        assert lines[2].startswith("    line3")

    def test_data_output(self, capsys):
        printer.data("my title", "key: value")
        captured = capsys.readouterr()
        # Rich output is formatted with ansi escape sequences or box drawing chars
        # Just check that title and content appear in the output stream
        assert "my title" in captured.out
        assert "key" in captured.out

    def test_node_panel_pass(self, capsys):
        printer.node_panel("node1", "output line\n", 0)
        captured = capsys.readouterr()
        assert "node1" in captured.out
        assert "PASS" in captured.out
        assert "output line" in captured.out

    def test_node_panel_fail(self, capsys):
        printer.node_panel("node2", "error line\n", 1)
        captured = capsys.readouterr()
        assert "node2" in captured.out
        assert "FAIL" in captured.out
        assert "error line" in captured.out

    def test_test_panel(self, capsys):
        printer.test_panel("node1", "output", 0, {"check1": True, "check2": False})
        captured = capsys.readouterr()
        assert "node1" in captured.out
        assert "check1" in captured.out
        assert "check2" in captured.out

    def test_test_summary(self, capsys):
        results = {"node1": {"test1": True}, "node2": {"test2": False}}
        printer.test_summary(results)
        captured = capsys.readouterr()
        assert "node1" in captured.out
        assert "node2" in captured.out
        assert "test1" in captured.out
        assert "test2" in captured.out

    def test_header_output(self, capsys):
        printer.header("My Header")
        captured = capsys.readouterr()
        assert "My Header" in captured.out

    def test_kv_output(self, capsys):
        printer.kv("mykeystring", "myvaluestring")
        captured = capsys.readouterr()
        assert "mykeystring" in captured.out
        assert "myvaluestring" in captured.out

    def test_confirm_action(self, capsys):
        printer.confirm_action("router1", "delete")
        captured = capsys.readouterr()
        assert "[i] delete: router1" in captured.out
@@ -0,0 +1,83 @@
import pytest
from connpy.services.profile_service import ProfileService
from connpy.services.exceptions import ProfileNotFoundError, ProfileAlreadyExistsError

def test_profile_crud(populated_config):
    """Test basic CRUD operations for profiles."""
    service = ProfileService(populated_config)

    # List
    profiles = service.list_profiles()
    assert "default" in profiles
    assert "office-user" in profiles

    # Get
    office = service.get_profile("office-user")
    assert office["user"] == "officeadmin"

    # Add
    new_data = {
        "user": "newadmin",
        "password": "newpassword"
    }
    service.add_profile("new-profile", new_data)
    assert "new-profile" in service.list_profiles()
    assert service.get_profile("new-profile")["user"] == "newadmin"

    # Update
    update_data = {
        "user": "updatedadmin"
    }
    service.update_profile("new-profile", update_data)
    assert service.get_profile("new-profile")["user"] == "updatedadmin"

    # Delete
    service.delete_profile("new-profile")
    assert "new-profile" not in service.list_profiles()

def test_profile_inheritance_parity(populated_config):
    """
    Test that profiles can inherit from other profiles.
    Regression: ProfileService currently doesn't resolve inheritance within profiles.
    """
    service = ProfileService(populated_config)

    # Create a profile that inherits from 'office-user'
    # 'office-user' has user='officeadmin', password='officepass'
    inherited_data = {
        "user": "@office-user",
        "options": "-v"
    }
    service.add_profile("inherited-profile", inherited_data)

    # When we get the profile, we expect it to be resolved if inheritance is supported
    # This is a common pattern in connpy for nodes, but should it work for profiles?
    # The task mentions "profile CRUD and inheritance parity".

    profile = service.get_profile("inherited-profile")

    # If inheritance is resolved, user should be 'officeadmin'
    # This is expected to FAIL if ProfileService just returns the raw dict.
    assert profile["user"] == "officeadmin"
    assert profile["password"] == "officepass"
    assert profile["options"] == "-v"

def test_delete_default_profile_fails(populated_config):
    """Test that deleting the 'default' profile is prohibited."""
    service = ProfileService(populated_config)
    from connpy.services.exceptions import InvalidConfigurationError

    with pytest.raises(InvalidConfigurationError, match="Cannot delete the 'default' profile"):
        service.delete_profile("default")

def test_delete_used_profile_fails(populated_config):
    """Test that deleting a profile used by nodes is prohibited."""
    service = ProfileService(populated_config)
    from connpy.services.exceptions import InvalidConfigurationError

    # In populated_config, we need to make sure a node uses a profile
    # Let's add a node that uses 'office-user'
    populated_config._connections_add(id="testnode", host="1.1.1.1", user="@office-user")

    with pytest.raises(InvalidConfigurationError, match="is used by nodes"):
        service.delete_profile("office-user")
@@ -0,0 +1,42 @@
import pytest
from unittest.mock import patch, MagicMock
from connpy.services.provider import ServiceProvider

def test_service_provider_local_mode():
    config_mock = MagicMock()
    with patch("connpy.services.provider.NodeService", create=True) as MockNodeService, \
         patch("connpy.services.provider.ProfileService", create=True), \
         patch("connpy.services.provider.ConfigService", create=True), \
         patch("connpy.services.provider.PluginService", create=True), \
         patch("connpy.services.provider.AIService", create=True), \
         patch("connpy.services.provider.SystemService", create=True), \
         patch("connpy.services.provider.ExecutionService", create=True), \
         patch("connpy.services.provider.ImportExportService", create=True):

        provider = ServiceProvider(config_mock, mode="local")

        assert provider.mode == "local"
        assert provider.config == config_mock
        # Verify that an attribute was created
        assert provider.nodes is not None

def test_service_provider_remote_mode():
    config_mock = MagicMock()
    with patch("connpy.services.provider.ConfigService", create=True) as MockConfigService, \
         patch("grpc.insecure_channel", create=True) as MockChannel:

        provider = ServiceProvider(config_mock, mode="remote", remote_host="localhost:50051")

        # Verify ConfigService is initialized locally
        assert provider.config_svc is not None

        # Verify grpc channel was created
        MockChannel.assert_called_once_with("localhost:50051")

        # Verify a stub was assigned
        assert provider.nodes is not None

def test_service_provider_unknown_mode():
    config_mock = MagicMock()
    with pytest.raises(ValueError, match="Unknown service mode: invalid_mode"):
        ServiceProvider(config_mock, mode="invalid_mode")
+63 −68
@@ -1,82 +1,91 @@
-"""Tests for connpy.core_plugins.sync"""
+"""Tests for connpy.services.sync_service"""
 import pytest
-from unittest.mock import MagicMock, patch, mock_open
-from connpy.core_plugins.sync import sync
+import os
+from unittest.mock import MagicMock, patch
+from connpy.services.sync_service import SyncService

 @pytest.fixture
-def mock_connapp():
-    app = MagicMock()
-    app.config.defaultdir = "/fake/dir"
-    app.config.file = "/fake/dir/config.yaml"
-    app.config.key = "/fake/dir/.osk"
-    app.config.config = {"sync": True}
-    return app
+def mock_config():
+    config = MagicMock()
+    config.defaultdir = "/fake/dir"
+    config.file = "/fake/dir/config.yaml"
+    config.key = "/fake/dir/.osk"
+    config.cachefile = "/fake/dir/.cache"
+    config.fzf_cachefile = "/fake/dir/.fzf_cache"
+    config.config = {"sync": True, "sync_remote": False}
+    return config

-class TestSyncPlugin:
-    def test_init(self, mock_connapp):
-        s = sync(mock_connapp)
-        assert s.sync is True
-        assert s.file == "/fake/dir/config.yaml"
-        assert s.token_file == "/fake/dir/gtoken.json"
+class TestSyncService:
+    def test_init(self, mock_config):
+        s = SyncService(mock_config)
+        assert s.sync_enabled is True
+        assert s.token_file == os.path.join("/fake/dir", "gtoken.json")

-    @patch("connpy.core_plugins.sync.os.path.exists")
-    @patch("connpy.core_plugins.sync.Credentials")
-    def test_get_credentials_success(self, MockCreds, mock_exists, mock_connapp):
+    @patch("connpy.services.sync_service.os.path.exists")
+    @patch("connpy.services.sync_service.Credentials")
+    def test_get_credentials_success(self, MockCreds, mock_exists, mock_config):
         mock_exists.return_value = True
         mock_cred_instance = MagicMock()
         mock_cred_instance.valid = True
         MockCreds.from_authorized_user_file.return_value = mock_cred_instance

-        s = sync(mock_connapp)
+        s = SyncService(mock_config)
         creds = s.get_credentials()
         assert creds == mock_cred_instance

-    @patch("connpy.core_plugins.sync.os.path.exists")
-    def test_get_credentials_not_found(self, mock_exists, mock_connapp):
+    @patch("connpy.services.sync_service.os.path.exists")
+    def test_get_credentials_not_found(self, mock_exists, mock_config):
         mock_exists.return_value = False
-        s = sync(mock_connapp)
-        assert s.get_credentials() == 0
+        s = SyncService(mock_config)
+        assert s.get_credentials() is None

-    @patch("connpy.core_plugins.sync.zipfile.ZipFile")
-    @patch("connpy.core_plugins.sync.os.path.basename")
-    def test_compress_specific_files(self, mock_basename, MockZipFile, mock_connapp):
+    @patch("connpy.services.sync_service.zipfile.ZipFile")
+    @patch("connpy.services.sync_service.os.path.exists")
+    @patch("connpy.services.sync_service.os.path.basename")
+    def test_compress_and_upload_local(self, mock_basename, mock_exists, MockZipFile, mock_config):
         mock_basename.return_value = "config.yaml"
-        s = sync(mock_connapp)
+        mock_exists.return_value = True
+        s = SyncService(mock_config)
+
+        # Mocking list_backups and upload_file to avoid real API calls
+        s.list_backups = MagicMock(return_value=[])
+        s.upload_file = MagicMock(return_value=True)
+
         zip_mock = MagicMock()
         MockZipFile.return_value.__enter__.return_value = zip_mock

-        s.compress_specific_files("/fake/zip.zip")
-        zip_mock.write.assert_any_call(s.file, "config.yaml")
-        zip_mock.write.assert_any_call(s.key, ".osk")
+        s.compress_and_upload()
+        # Verify zip was created with local config and key
+        zip_mock.write.assert_any_call(s.config.file, "config.yaml")
+        zip_mock.write.assert_any_call(s.config.key, ".osk")

-    @patch("connpy.core_plugins.sync.zipfile.ZipFile")
-    @patch("connpy.core_plugins.sync.os.path.dirname")
-    def test_decompress_zip_yaml(self, mock_dirname, MockZipFile, mock_connapp):
+    @patch("connpy.services.sync_service.zipfile.ZipFile")
+    @patch("connpy.services.sync_service.os.path.exists")
+    @patch("connpy.services.sync_service.os.path.dirname")
+    @patch("connpy.services.sync_service.os.remove")
+    def test_perform_restore(self, mock_remove, mock_dirname, mock_exists, MockZipFile, mock_config):
         mock_dirname.return_value = "/fake/dir"
-        s = sync(mock_connapp)
+        # Mock exists to return True for key and zip, but False for caches during the cleanup phase
+        def exists_side_effect(path):
+            if ".cache" in path or ".fzf_cache" in path:
+                return False
+            return True
+        mock_exists.side_effect = exists_side_effect
+
+        s = SyncService(mock_config)
         zip_mock = MagicMock()
         zip_mock.namelist.return_value = ["config.yaml", ".osk"]
         MockZipFile.return_value.__enter__.return_value = zip_mock

-        assert s.decompress_zip("/fake/zip.zip") == 0
-        zip_mock.extract.assert_any_call("config.yaml", "/fake/dir")
+        with patch("connpy.services.sync_service.yaml.safe_load") as mock_load:
+            mock_load.return_value = {"connections": {}, "profiles": {}, "config": {}}
+            assert s.perform_restore("/fake/zip.zip") is True

         zip_mock.extract.assert_any_call(".osk", "/fake/dir")

-    @patch("connpy.core_plugins.sync.zipfile.ZipFile")
-    @patch("connpy.core_plugins.sync.os.path.dirname")
-    def test_decompress_zip_json_fallback(self, mock_dirname, MockZipFile, mock_connapp):
-        mock_dirname.return_value = "/fake/dir"
-        s = sync(mock_connapp)
-        zip_mock = MagicMock()
-        zip_mock.namelist.return_value = ["config.json", ".osk"]
-        MockZipFile.return_value.__enter__.return_value = zip_mock
-
-        assert s.decompress_zip("/fake/old_zip.zip") == 0
-        zip_mock.extract.assert_any_call("config.json", "/fake/dir")
-
-    @patch.object(sync, "get_credentials")
-    @patch("connpy.core_plugins.sync.build")
-    def test_get_appdata_files(self, mock_build, mock_get_credentials, mock_connapp):
+    @patch.object(SyncService, "get_credentials")
+    @patch("connpy.services.sync_service.build")
+    def test_list_backups(self, mock_build, mock_get_credentials, mock_config):
         mock_get_credentials.return_value = MagicMock()
         mock_service = MagicMock()
         mock_build.return_value = mock_service
@@ -87,22 +96,8 @@ class TestSyncPlugin:
             ]
         }

-        s = sync(mock_connapp)
-        files = s.get_appdata_files()
+        s = SyncService(mock_config)
+        files = s.list_backups()
         assert len(files) == 1
         assert files[0]["id"] == "1"
         assert files[0]["timestamp"] == "1000"
-
-    @patch.object(sync, "get_credentials")
-    @patch("connpy.core_plugins.sync.build")
-    @patch("connpy.core_plugins.sync.MediaFileUpload")
-    @patch("connpy.core_plugins.sync.os.path.basename")
-    def test_backup_file_to_drive(self, mock_basename, mock_media, mock_build, mock_get_credentials, mock_connapp):
-        mock_get_credentials.return_value = MagicMock()
-        mock_basename.return_value = "backup.zip"
-        mock_service = MagicMock()
-        mock_build.return_value = mock_service
-
-        s = sync(mock_connapp)
-        assert s.backup_file_to_drive("/fake/backup.zip", 1234567890000) == 0
-        mock_service.files().create.assert_called_once()
@@ -0,0 +1,375 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.ai_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target .name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.cli.ai_handler</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.cli.ai_handler.AIHandler"><code class="flex name class">
|
||||
<span>class <span class="ident">AIHandler</span></span>
|
||||
<span>(</span><span>app)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class AIHandler:
|
||||
def __init__(self, app):
|
||||
self.app = app
|
||||
|
||||
def dispatch(self, args):
|
||||
if args.list_sessions:
|
||||
sessions = self.app.services.ai.list_sessions()
|
||||
if not sessions:
|
||||
printer.info("No saved AI sessions found.")
|
||||
return
|
||||
columns = ["ID", "Title", "Created At", "Model"]
|
||||
rows = [[s["id"], s["title"], s["created_at"], s["model"]] for s in sessions]
|
||||
printer.table("AI Persisted Sessions", columns, rows)
|
||||
return
|
||||
|
||||
if args.delete_session:
|
||||
try:
|
||||
self.app.services.ai.delete_session(args.delete_session[0])
|
||||
printer.success(f"Session {args.delete_session[0]} deleted.")
|
||||
except Exception as e:
|
||||
printer.error(str(e))
|
||||
return
|
||||
|
||||
# Determinar session_id para retomar
|
||||
session_id = None
|
||||
if args.resume:
|
||||
sessions = self.app.services.ai.list_sessions()
|
||||
session_id = sessions[0]["id"] if sessions else None
|
||||
if not session_id:
|
||||
printer.warning("No previous session found to resume.")
|
||||
elif args.session:
|
||||
session_id = args.session[0]
|
||||
|
||||
# Configurar argumentos adicionales para el servicio de AI
|
||||
# Prioridad: CLI Args > Configuración Local
|
||||
settings = self.app.services.config_svc.get_settings().get("ai", {})
|
||||
arguments = {}
|
||||
|
||||
for key in ["engineer_model", "engineer_api_key", "architect_model", "architect_api_key"]:
|
||||
cli_val = getattr(args, key, None)
|
||||
if cli_val:
|
||||
arguments[key] = cli_val[0]
|
||||
elif settings.get(key):
|
||||
arguments[key] = settings.get(key)
|
||||
|
||||
# Check keys only if running in local mode (not remote)
|
||||
if getattr(self.app.services, "mode", "local") == "local":
|
||||
if not arguments.get("engineer_api_key"):
|
||||
printer.error("Engineer API key not configured. The chat cannot start.")
|
||||
printer.info("Use 'connpy config --engineer-api-key <key>' to set it.")
|
||||
sys.exit(1)
|
||||
if not arguments.get("architect_api_key"):
|
||||
printer.warning("Architect API key not configured. Architect will be unavailable.")
|
||||
printer.info("Use 'connpy config --architect-api-key <key>' to enable it.")
|
||||
|
||||
# El resto de la interacción el CLI la maneja con el agente subyacente
|
||||
self.app.myai = self.app.services.ai
|
||||
self.ai_overrides = arguments
|
||||
|
||||
if args.ask:
|
||||
self.single_question(args, session_id)
|
||||
else:
|
||||
self.interactive_chat(args, session_id)
|
||||
|
||||
def single_question(self, args, session_id):
|
||||
query = " ".join(args.ask)
|
||||
with console.status("[ai_status]Agent is thinking and analyzing...") as status:
|
||||
result = self.app.myai.ask(query, status=status, debug=args.debug, session_id=session_id, trust=args.trust, **self.ai_overrides)
|
||||
|
||||
responder = result.get("responder", "engineer")
|
||||
border = "architect" if responder == "architect" else "engineer"
|
||||
title = "[architect][bold]Network Architect[/bold][/architect]" if responder == "architect" else "[engineer][bold]Network Engineer[/bold][/engineer]"
|
||||
|
||||
if not result.get("streamed"):
|
||||
mdprint(Panel(Markdown(result["response"]), title=title, border_style=border, expand=False))
|
||||
|
||||
if "usage" in result:
|
||||
u = result["usage"]
|
||||
console.print(f"[debug]Tokens: {u['total']} (Input: {u['input']}, Output: {u['output']})[/debug]")
|
||||
console.print()
|
||||
|
||||
def interactive_chat(self, args, session_id):
|
||||
history = None
|
||||
if session_id:
|
||||
session_data = self.app.myai.load_session_data(session_id)
|
||||
if session_data:
|
||||
history = session_data.get("history", [])
|
||||
mdprint(Rule(title=f"[header] Resuming Session: {session_data.get('title')} [/header]", style="border"))
|
||||
if history:
|
||||
mdprint(f"[debug]Analyzing {len(history)} previous messages...[/debug]\n")
|
||||
else:
|
||||
printer.error(f"Could not load session {session_id}. Starting clean.")
|
||||
|
||||
if not history:
|
||||
mdprint(Rule(style="engineer"))
|
||||
mdprint(Markdown("**Networking Expert Agent**: Hi! I'm your assistant. I can help you diagnose issues, run commands, and manage your nodes.\nType 'exit' to quit.\n"))
|
||||
mdprint(Rule(style="engineer"))
|
||||
|
||||
while True:
|
||||
try:
|
||||
user_query = Prompt.ask("[user_prompt]User[/user_prompt]")
|
||||
if not user_query.strip(): continue
|
||||
if user_query.lower() in ['exit', 'quit', 'bye']: break
|
||||
|
||||
with console.status("[ai_status]Agent is thinking...") as status:
|
||||
result = self.app.myai.ask(user_query, chat_history=history, status=status, debug=args.debug, trust=args.trust, **self.ai_overrides)
|
||||
|
||||
new_history = result.get("chat_history")
|
||||
if new_history is not None:
|
||||
history = new_history
|
||||
|
||||
responder = result.get("responder", "engineer")
|
||||
border = "architect" if responder == "architect" else "engineer"
|
||||
title = "[architect][bold]Network Architect[/bold][/architect]" if responder == "architect" else "[engineer][bold]Network Engineer[/bold][/engineer]"
|
||||
|
||||
if not result.get("streamed"):
|
||||
response_text = result.get("response", "")
|
||||
if response_text:
|
||||
mdprint(Panel(Markdown(response_text), title=title, border_style=border, expand=False))
|
||||
|
||||
if "usage" in result:
|
||||
u = result["usage"]
|
||||
console.print(f"[debug]Tokens: {u['total']} (Input: {u['input']}, Output: {u['output']})[/debug]")
|
||||
console.print()
|
||||
except KeyboardInterrupt:
|
||||
break</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
<h3>Methods</h3>
|
||||
<dl>
|
||||
<dt id="connpy.cli.ai_handler.AIHandler.dispatch"><code class="name flex">
|
||||
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def dispatch(self, args):
    if args.list_sessions:
        sessions = self.app.services.ai.list_sessions()
        if not sessions:
            printer.info("No saved AI sessions found.")
            return
        columns = ["ID", "Title", "Created At", "Model"]
        rows = [[s["id"], s["title"], s["created_at"], s["model"]] for s in sessions]
        printer.table("AI Persisted Sessions", columns, rows)
        return

    if args.delete_session:
        try:
            self.app.services.ai.delete_session(args.delete_session[0])
            printer.success(f"Session {args.delete_session[0]} deleted.")
        except Exception as e:
            printer.error(str(e))
        return

    # Determine which session_id to resume
    session_id = None
    if args.resume:
        sessions = self.app.services.ai.list_sessions()
        session_id = sessions[0]["id"] if sessions else None
        if not session_id:
            printer.warning("No previous session found to resume.")
    elif args.session:
        session_id = args.session[0]

    # Set up additional arguments for the AI service
    # Precedence: CLI args > local configuration
    settings = self.app.services.config_svc.get_settings().get("ai", {})
    arguments = {}

    for key in ["engineer_model", "engineer_api_key", "architect_model", "architect_api_key"]:
        cli_val = getattr(args, key, None)
        if cli_val:
            arguments[key] = cli_val[0]
        elif settings.get(key):
            arguments[key] = settings.get(key)

    # Check keys only if running in local mode (not remote)
    if getattr(self.app.services, "mode", "local") == "local":
        if not arguments.get("engineer_api_key"):
            printer.error("Engineer API key not configured. The chat cannot start.")
            printer.info("Use 'connpy config --engineer-api-key <key>' to set it.")
            sys.exit(1)
        if not arguments.get("architect_api_key"):
            printer.warning("Architect API key not configured. Architect will be unavailable.")
            printer.info("Use 'connpy config --architect-api-key <key>' to enable it.")

    # The CLI delegates the rest of the interaction to the underlying agent
    self.app.myai = self.app.services.ai
    self.ai_overrides = arguments

    if args.ask:
        self.single_question(args, session_id)
    else:
        self.interactive_chat(args, session_id)</code></pre>
</details>
<div class="desc"></div>
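A minimal sketch of the argument-precedence rule `dispatch` applies when assembling `arguments` (CLI flags win over values stored in the saved "ai" settings). The `Namespace` fields and the `settings` dict below are hypothetical stand-ins for the real `argparse` result and `config_svc` storage:

```python
# Hypothetical sketch of dispatch()'s precedence rule: CLI arguments
# override values stored in the "ai" section of the saved settings.
from argparse import Namespace

# Stand-in for self.app.services.config_svc.get_settings().get("ai", {})
settings = {"engineer_model": "model-from-config", "architect_api_key": "cfg-key"}

# argparse stores these flags as single-element lists in this codebase
args = Namespace(engineer_model=["model-from-cli"], engineer_api_key=None,
                 architect_model=None, architect_api_key=None)

arguments = {}
for key in ["engineer_model", "engineer_api_key", "architect_model", "architect_api_key"]:
    cli_val = getattr(args, key, None)
    if cli_val:
        arguments[key] = cli_val[0]          # CLI value wins
    elif settings.get(key):
        arguments[key] = settings.get(key)   # fall back to saved config

print(arguments)
```

Keys absent from both sources are simply omitted, which lets the AI service apply its own defaults.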
</dd>
<dt id="connpy.cli.ai_handler.AIHandler.interactive_chat"><code class="name flex">
<span>def <span class="ident">interactive_chat</span></span>(<span>self, args, session_id)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def interactive_chat(self, args, session_id):
    history = None
    if session_id:
        session_data = self.app.myai.load_session_data(session_id)
        if session_data:
            history = session_data.get("history", [])
            mdprint(Rule(title=f"[header] Resuming Session: {session_data.get('title')} [/header]", style="border"))
            if history:
                mdprint(f"[debug]Analyzing {len(history)} previous messages...[/debug]\n")
        else:
            printer.error(f"Could not load session {session_id}. Starting clean.")

    if not history:
        mdprint(Rule(style="engineer"))
        mdprint(Markdown("**Networking Expert Agent**: Hi! I'm your assistant. I can help you diagnose issues, run commands, and manage your nodes.\nType 'exit' to quit.\n"))
        mdprint(Rule(style="engineer"))

    while True:
        try:
            user_query = Prompt.ask("[user_prompt]User[/user_prompt]")
            if not user_query.strip(): continue
            if user_query.lower() in ['exit', 'quit', 'bye']: break

            with console.status("[ai_status]Agent is thinking...") as status:
                result = self.app.myai.ask(user_query, chat_history=history, status=status, debug=args.debug, trust=args.trust, **self.ai_overrides)

            new_history = result.get("chat_history")
            if new_history is not None:
                history = new_history

            responder = result.get("responder", "engineer")
            border = "architect" if responder == "architect" else "engineer"
            title = "[architect][bold]Network Architect[/bold][/architect]" if responder == "architect" else "[engineer][bold]Network Engineer[/bold][/engineer]"

            if not result.get("streamed"):
                response_text = result.get("response", "")
                if response_text:
                    mdprint(Panel(Markdown(response_text), title=title, border_style=border, expand=False))

            if "usage" in result:
                u = result["usage"]
                console.print(f"[debug]Tokens: {u['total']} (Input: {u['input']}, Output: {u['output']})[/debug]")
            console.print()
        except KeyboardInterrupt:
            break</code></pre>
</details>
<div class="desc"></div>
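A minimal sketch of how `interactive_chat` seeds `history` from a persisted session before entering its prompt loop. The in-memory `fake_store` and the local `load_session_data` function are stand-ins invented for illustration; the real lookup goes through `self.app.myai.load_session_data`:

```python
# Hypothetical sketch: resuming chat history from a saved session.
# fake_store stands in for the AI service's session storage.
fake_store = {"s1": {"title": "ospf debug",
                     "history": [{"role": "user", "content": "hi"}]}}

def load_session_data(session_id):
    # Returns None for unknown ids, like the real service call
    return fake_store.get(session_id)

history = None
session_id = "s1"
if session_id:
    session_data = load_session_data(session_id)
    if session_data:
        history = session_data.get("history", [])

print(history)
```

When the lookup fails, `history` stays `None` and the method falls through to the fresh-session banner, so a bad session id degrades gracefully instead of aborting.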
</dd>
<dt id="connpy.cli.ai_handler.AIHandler.single_question"><code class="name flex">
<span>def <span class="ident">single_question</span></span>(<span>self, args, session_id)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def single_question(self, args, session_id):
    query = " ".join(args.ask)
    with console.status("[ai_status]Agent is thinking and analyzing...") as status:
        result = self.app.myai.ask(query, status=status, debug=args.debug, session_id=session_id, trust=args.trust, **self.ai_overrides)

    responder = result.get("responder", "engineer")
    border = "architect" if responder == "architect" else "engineer"
    title = "[architect][bold]Network Architect[/bold][/architect]" if responder == "architect" else "[engineer][bold]Network Engineer[/bold][/engineer]"

    if not result.get("streamed"):
        mdprint(Panel(Markdown(result["response"]), title=title, border_style=border, expand=False))

    if "usage" in result:
        u = result["usage"]
        console.print(f"[debug]Tokens: {u['total']} (Input: {u['input']}, Output: {u['output']})[/debug]")
    console.print()</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.ai_handler.AIHandler" href="#connpy.cli.ai_handler.AIHandler">AIHandler</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.ai_handler.AIHandler.dispatch" href="#connpy.cli.ai_handler.AIHandler.dispatch">dispatch</a></code></li>
<li><code><a title="connpy.cli.ai_handler.AIHandler.interactive_chat" href="#connpy.cli.ai_handler.AIHandler.interactive_chat">interactive_chat</a></code></li>
<li><code><a title="connpy.cli.ai_handler.AIHandler.single_question" href="#connpy.cli.ai_handler.AIHandler.single_question">single_question</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,199 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.api_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
    [...document.querySelectorAll('.hljs.language-python > .hljs-string')]
        .filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
        .forEach(el => {
            let d = document.createElement('details');
            d.classList.add('hljs-string');
            d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
            el.replaceWith(d);
        });
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.api_handler</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.api_handler.APIHandler"><code class="flex name class">
<span>class <span class="ident">APIHandler</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class APIHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        try:
            status = self.app.services.system.get_api_status()

            if args.command == "stop":
                if not status["running"]:
                    printer.warning("API does not seem to be running.")
                else:
                    stopped = self.app.services.system.stop_api()
                    if stopped:
                        printer.success("API stopped successfully.")

            elif args.command == "restart":
                port = args.data if args.data and isinstance(args.data, int) else None
                if status["running"]:
                    printer.info(f"Stopping server with process ID {status['pid']}...")

                # Service handles port preservation if port is None
                self.app.services.system.restart_api(port=port)

                if status["running"]:
                    printer.info(f"Server with process ID {status['pid']} stopped.")

                # Re-fetch status to show the actual port used
                new_status = self.app.services.system.get_api_status()
                printer.success(f"API restarted on port {new_status.get('port', 'unknown')}.")

            elif args.command == "start":
                if status["running"]:
                    msg = f"Connpy server is already running (PID: {status['pid']}"
                    if status.get("port"):
                        msg += f", Port: {status['port']}"
                    msg += ")."
                    printer.warning(msg)
                else:
                    port = args.data if args.data and isinstance(args.data, int) else 8048
                    self.app.services.system.start_api(port=port)
                    printer.success(f"API started on port {port}.")

            elif args.command == "debug":
                port = args.data if args.data and isinstance(args.data, int) else 8048
                self.app.services.system.debug_api(port=port)
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.api_handler.APIHandler.dispatch"><code class="name flex">
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch(self, args):
    try:
        status = self.app.services.system.get_api_status()

        if args.command == "stop":
            if not status["running"]:
                printer.warning("API does not seem to be running.")
            else:
                stopped = self.app.services.system.stop_api()
                if stopped:
                    printer.success("API stopped successfully.")

        elif args.command == "restart":
            port = args.data if args.data and isinstance(args.data, int) else None
            if status["running"]:
                printer.info(f"Stopping server with process ID {status['pid']}...")

            # Service handles port preservation if port is None
            self.app.services.system.restart_api(port=port)

            if status["running"]:
                printer.info(f"Server with process ID {status['pid']} stopped.")

            # Re-fetch status to show the actual port used
            new_status = self.app.services.system.get_api_status()
            printer.success(f"API restarted on port {new_status.get('port', 'unknown')}.")

        elif args.command == "start":
            if status["running"]:
                msg = f"Connpy server is already running (PID: {status['pid']}"
                if status.get("port"):
                    msg += f", Port: {status['port']}"
                msg += ")."
                printer.warning(msg)
            else:
                port = args.data if args.data and isinstance(args.data, int) else 8048
                self.app.services.system.start_api(port=port)
                printer.success(f"API started on port {port}.")

        elif args.command == "debug":
            port = args.data if args.data and isinstance(args.data, int) else 8048
            self.app.services.system.debug_api(port=port)
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
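A minimal sketch of the port-selection rule `dispatch` repeats for each subcommand: an integer `args.data` picks the port, anything else falls back to a per-command default (8048 for `start`/`debug`, `None` for `restart`, which the service interprets as "keep the previous port"). The `resolve_port` helper name is invented for illustration; the real code inlines the expression:

```python
# Hypothetical helper naming the inline expression used in dispatch():
# args.data if args.data and isinstance(args.data, int) else default
from argparse import Namespace

def resolve_port(args, default):
    return args.data if args.data and isinstance(args.data, int) else default

explicit = resolve_port(Namespace(data=9000), 8048)   # user passed a port
started = resolve_port(Namespace(data=None), 8048)    # start/debug default
restarted = resolve_port(Namespace(data=None), None)  # restart: preserve old port

print(explicit, started, restarted)
```

Note the `and` guard also rejects falsy values such as `0`, so port 0 can never be selected explicitly.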
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.api_handler.APIHandler" href="#connpy.cli.api_handler.APIHandler">APIHandler</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.api_handler.APIHandler.dispatch" href="#connpy.cli.api_handler.APIHandler.dispatch">dispatch</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,488 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.config_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
    [...document.querySelectorAll('.hljs.language-python > .hljs-string')]
        .filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
        .forEach(el => {
            let d = document.createElement('details');
            d.classList.add('hljs-string');
            d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
            el.replaceWith(d);
        });
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.config_handler</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.config_handler.ConfigHandler"><code class="flex name class">
<span>class <span class="ident">ConfigHandler</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ConfigHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        actions = {
            "completion": self.show_completion,
            "fzf_wrapper": self.show_fzf_wrapper,
            "case": self.set_case,
            "fzf": self.set_fzf,
            "idletime": self.set_idletime,
            "configfolder": self.set_configfolder,
            "theme": self.set_theme,
            "engineer_model": self.set_ai_config,
            "engineer_api_key": self.set_ai_config,
            "architect_model": self.set_ai_config,
            "architect_api_key": self.set_ai_config,
            "trusted_commands": self.set_ai_config,
            "service_mode": self.set_service_mode,
            "remote_host": self.set_remote_host,
            "sync_remote": self.set_sync_remote
        }
        handler = actions.get(getattr(args, "command", None))
        if handler:
            return handler(args)

        # If no specific command was triggered, show current configuration
        return self.show_config(args)

    def show_config(self, args):
        settings = self.app.services.config_svc.get_settings()
        yaml_str = yaml.dump(settings, sort_keys=False, default_flow_style=False)
        printer.data("Current Configuration", yaml_str)

    def set_service_mode(self, args):
        new_mode = args.data[0]
        if new_mode == "remote":
            settings = self.app.services.config_svc.get_settings()
            if not settings.get("remote_host"):
                printer.error("Remote host must be configured before switching to remote mode")
                return

        self.app.services.config_svc.update_setting("service_mode", new_mode)

        # Immediate sync of fzf/text cache files for the new mode
        try:
            # 1. Clear old cache files to avoid discrepancies if fetch fails
            self.app.config._generate_nodes_cache(nodes=[], folders=[], profiles=[])

            # 2. Re-initialize services for the new mode
            from ..services.provider import ServiceProvider
            settings = self.app.services.config_svc.get_settings()
            new_services = ServiceProvider(self.app.config, mode=new_mode, remote_host=settings.get("remote_host"))

            # 3. Fetch data from new mode and generate cache
            nodes = new_services.nodes.list_nodes()
            folders = new_services.nodes.list_folders()
            profiles = new_services.profiles.list_profiles()
            new_services.nodes.generate_cache(nodes=nodes, folders=folders, profiles=profiles)

            printer.success("Config saved")
        except Exception as e:
            printer.success("Config saved")
            printer.warning(f"Note: Could not synchronize fzf cache: {e}")

    def set_remote_host(self, args):
        self.app.services.config_svc.update_setting("remote_host", args.data[0])
        printer.success("Config saved")

    def set_theme(self, args):
        try:
            valid_styles = self.app.services.config_svc.apply_theme_from_file(args.data[0])
            # Apply immediately to current session
            printer.apply_theme(valid_styles)
            printer.success(f"Theme '{args.data[0]}' applied and saved")
        except (ConnpyError, InvalidConfigurationError) as e:
            printer.error(str(e))

    def show_fzf_wrapper(self, args):
        print(get_instructions("fzf_wrapper_" + args.data[0]))

    def show_completion(self, args):
        print(get_instructions(args.data[0] + "completion"))

    def set_case(self, args):
        val = (args.data[0].lower() == "true")
        self.app.services.config_svc.update_setting("case", val)
        self.app.case = val
        printer.success("Config saved")

    def set_fzf(self, args):
        val = (args.data[0].lower() == "true")
        self.app.services.config_svc.update_setting("fzf", val)
        self.app.fzf = val
        printer.success("Config saved")

    def set_idletime(self, args):
        try:
            val = max(0, int(args.data[0]))
            self.app.services.config_svc.update_setting("idletime", val)
            printer.success("Config saved")
        except ValueError:
            printer.error("Keepalive must be an integer.")

    def set_configfolder(self, args):
        try:
            self.app.services.config_svc.set_config_folder(args.data[0])
            printer.success("Config saved")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def set_sync_remote(self, args):
        val = (args.data[0].lower() == "true")
        self.app.services.config_svc.update_setting("sync_remote", val)
        self.app.services.sync.sync_remote = val
        printer.success("Config saved")

    def set_ai_config(self, args):
        try:
            settings = self.app.services.config_svc.get_settings()
            aiconfig = settings.get("ai", {})
            aiconfig[args.command] = args.data[0]
            self.app.services.config_svc.update_setting("ai", aiconfig)
            printer.success("Config saved")
        except ConnpyError as e:
            printer.error(str(e))</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.config_handler.ConfigHandler.dispatch"><code class="name flex">
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch(self, args):
    actions = {
        "completion": self.show_completion,
        "fzf_wrapper": self.show_fzf_wrapper,
        "case": self.set_case,
        "fzf": self.set_fzf,
        "idletime": self.set_idletime,
        "configfolder": self.set_configfolder,
        "theme": self.set_theme,
        "engineer_model": self.set_ai_config,
        "engineer_api_key": self.set_ai_config,
        "architect_model": self.set_ai_config,
        "architect_api_key": self.set_ai_config,
        "trusted_commands": self.set_ai_config,
        "service_mode": self.set_service_mode,
        "remote_host": self.set_remote_host,
        "sync_remote": self.set_sync_remote
    }
    handler = actions.get(getattr(args, "command", None))
    if handler:
        return handler(args)

    # If no specific command was triggered, show current configuration
    return self.show_config(args)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
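The `dispatch` method above uses a table-driven pattern: each subcommand name maps to a bound handler, with a default fallback when no (or an unknown) command is present. A minimal standalone sketch of that pattern, with stub handlers (all names here are hypothetical, not part of connpy):

```python
from types import SimpleNamespace

# Stub handlers standing in for the real ConfigHandler methods.
def show_config(args):
    return "show_config"

def set_case(args):
    return f"set_case:{args.data[0]}"

ACTIONS = {"case": set_case}

def dispatch(args):
    # Look up the handler by command name; getattr guards against a
    # Namespace that has no "command" attribute at all.
    handler = ACTIONS.get(getattr(args, "command", None))
    if handler:
        return handler(args)
    # No specific command: fall back to showing the configuration.
    return show_config(args)

print(dispatch(SimpleNamespace(command="case", data=["true"])))  # set_case:true
print(dispatch(SimpleNamespace(command="unknown", data=[])))     # show_config
```

The dictionary keeps the CLI wiring declarative: adding a subcommand means adding one entry, not another `elif` branch.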
<dt id="connpy.cli.config_handler.ConfigHandler.set_ai_config"><code class="name flex">
<span>def <span class="ident">set_ai_config</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_ai_config(self, args):
    try:
        settings = self.app.services.config_svc.get_settings()
        aiconfig = settings.get("ai", {})
        aiconfig[args.command] = args.data[0]
        self.app.services.config_svc.update_setting("ai", aiconfig)
        printer.success("Config saved")
    except ConnpyError as e:
        printer.error(str(e))</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.set_case"><code class="name flex">
<span>def <span class="ident">set_case</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_case(self, args):
    val = (args.data[0].lower() == "true")
    self.app.services.config_svc.update_setting("case", val)
    self.app.case = val
    printer.success("Config saved")</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.set_configfolder"><code class="name flex">
<span>def <span class="ident">set_configfolder</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_configfolder(self, args):
    try:
        self.app.services.config_svc.set_config_folder(args.data[0])
        printer.success("Config saved")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.set_fzf"><code class="name flex">
<span>def <span class="ident">set_fzf</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_fzf(self, args):
    val = (args.data[0].lower() == "true")
    self.app.services.config_svc.update_setting("fzf", val)
    self.app.fzf = val
    printer.success("Config saved")</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.set_idletime"><code class="name flex">
<span>def <span class="ident">set_idletime</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_idletime(self, args):
    try:
        val = max(0, int(args.data[0]))
        self.app.services.config_svc.update_setting("idletime", val)
        printer.success("Config saved")
    except ValueError:
        printer.error("Keepalive must be an integer.")</code></pre>
</details>
<div class="desc"></div>
</dd>
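`set_idletime` above rejects non-integers and silently clamps negative values to zero. That parse-and-clamp step can be sketched in isolation (`parse_idletime` is a hypothetical helper, not part of connpy):

```python
def parse_idletime(raw):
    """Parse a user-supplied idletime value.

    Mirrors the rule in set_idletime: the value must be an integer,
    and negative values are clamped to 0.
    """
    try:
        return max(0, int(raw))
    except ValueError:
        # Re-raise with the user-facing message instead of Python's default.
        raise ValueError("idletime must be an integer")

print(parse_idletime("30"))  # 30
print(parse_idletime("-5"))  # 0
```

Clamping (rather than erroring) on negatives keeps the CLI forgiving: any negative input simply disables the idle timer.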
<dt id="connpy.cli.config_handler.ConfigHandler.set_remote_host"><code class="name flex">
<span>def <span class="ident">set_remote_host</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_remote_host(self, args):
    self.app.services.config_svc.update_setting("remote_host", args.data[0])
    printer.success("Config saved")</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.set_service_mode"><code class="name flex">
<span>def <span class="ident">set_service_mode</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_service_mode(self, args):
    new_mode = args.data[0]
    if new_mode == "remote":
        settings = self.app.services.config_svc.get_settings()
        if not settings.get("remote_host"):
            printer.error("Remote host must be configured before switching to remote mode")
            return

    self.app.services.config_svc.update_setting("service_mode", new_mode)

    # Immediate sync of fzf/text cache files for the new mode
    try:
        # 1. Clear old cache files to avoid discrepancies if fetch fails
        self.app.config._generate_nodes_cache(nodes=[], folders=[], profiles=[])

        # 2. Re-initialize services for the new mode
        from ..services.provider import ServiceProvider
        settings = self.app.services.config_svc.get_settings()
        new_services = ServiceProvider(self.app.config, mode=new_mode, remote_host=settings.get("remote_host"))

        # 3. Fetch data from new mode and generate cache
        nodes = new_services.nodes.list_nodes()
        folders = new_services.nodes.list_folders()
        profiles = new_services.profiles.list_profiles()
        new_services.nodes.generate_cache(nodes=nodes, folders=folders, profiles=profiles)

        printer.success("Config saved")
    except Exception as e:
        printer.success("Config saved")
        printer.warning(f"Note: Could not synchronize fzf cache: {e}")</code></pre>
</details>
<div class="desc"></div>
</dd>
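Note the ordering in `set_service_mode`: the cache is cleared *before* fetching from the new mode, so a failed fetch leaves an empty cache rather than a stale one from the previous mode. A minimal sketch of that fail-safe ordering — every name here is a hypothetical stub, not the real connpy API:

```python
# In-memory stand-in for the fzf/text cache files.
cache = {"nodes": ["stale-node"]}

def clear_cache():
    cache["nodes"] = []

def fetch_nodes(mode):
    # Simulated backend: the remote mode is unreachable in this sketch.
    if mode == "remote":
        raise ConnectionError("remote host unreachable")
    return ["node1", "node2"]

def switch_mode(mode):
    clear_cache()  # 1. clear first, before anything can fail
    try:
        cache["nodes"] = fetch_nodes(mode)  # 2./3. rebuild from the new mode
    except Exception as e:
        print(f"Note: Could not synchronize cache: {e}")

switch_mode("local")
print(cache["nodes"])   # ['node1', 'node2']
switch_mode("remote")
print(cache["nodes"])   # [] -- empty, never stale
```

Clearing first trades a temporarily empty cache for the guarantee that the cache never silently reflects the wrong mode.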
<dt id="connpy.cli.config_handler.ConfigHandler.set_sync_remote"><code class="name flex">
<span>def <span class="ident">set_sync_remote</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_sync_remote(self, args):
    val = (args.data[0].lower() == "true")
    self.app.services.config_svc.update_setting("sync_remote", val)
    self.app.services.sync.sync_remote = val
    printer.success("Config saved")</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.set_theme"><code class="name flex">
<span>def <span class="ident">set_theme</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_theme(self, args):
    try:
        valid_styles = self.app.services.config_svc.apply_theme_from_file(args.data[0])
        # Apply immediately to current session
        printer.apply_theme(valid_styles)
        printer.success(f"Theme '{args.data[0]}' applied and saved")
    except (ConnpyError, InvalidConfigurationError) as e:
        printer.error(str(e))</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.show_completion"><code class="name flex">
<span>def <span class="ident">show_completion</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def show_completion(self, args):
    print(get_instructions(args.data[0] + "completion"))</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.show_config"><code class="name flex">
<span>def <span class="ident">show_config</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def show_config(self, args):
    settings = self.app.services.config_svc.get_settings()
    yaml_str = yaml.dump(settings, sort_keys=False, default_flow_style=False)
    printer.data("Current Configuration", yaml_str)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.config_handler.ConfigHandler.show_fzf_wrapper"><code class="name flex">
<span>def <span class="ident">show_fzf_wrapper</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def show_fzf_wrapper(self, args):
    print(get_instructions("fzf_wrapper_" + args.data[0]))</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.config_handler.ConfigHandler" href="#connpy.cli.config_handler.ConfigHandler">ConfigHandler</a></code></h4>
<ul class="two-column">
<li><code><a title="connpy.cli.config_handler.ConfigHandler.dispatch" href="#connpy.cli.config_handler.ConfigHandler.dispatch">dispatch</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_ai_config" href="#connpy.cli.config_handler.ConfigHandler.set_ai_config">set_ai_config</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_case" href="#connpy.cli.config_handler.ConfigHandler.set_case">set_case</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_configfolder" href="#connpy.cli.config_handler.ConfigHandler.set_configfolder">set_configfolder</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_fzf" href="#connpy.cli.config_handler.ConfigHandler.set_fzf">set_fzf</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_idletime" href="#connpy.cli.config_handler.ConfigHandler.set_idletime">set_idletime</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_remote_host" href="#connpy.cli.config_handler.ConfigHandler.set_remote_host">set_remote_host</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_service_mode" href="#connpy.cli.config_handler.ConfigHandler.set_service_mode">set_service_mode</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_sync_remote" href="#connpy.cli.config_handler.ConfigHandler.set_sync_remote">set_sync_remote</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.set_theme" href="#connpy.cli.config_handler.ConfigHandler.set_theme">set_theme</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.show_completion" href="#connpy.cli.config_handler.ConfigHandler.show_completion">show_completion</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.show_config" href="#connpy.cli.config_handler.ConfigHandler.show_config">show_config</a></code></li>
<li><code><a title="connpy.cli.config_handler.ConfigHandler.show_fzf_wrapper" href="#connpy.cli.config_handler.ConfigHandler.show_fzf_wrapper">show_fzf_wrapper</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,255 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.context_handler API documentation</title>
<meta name="description" content="">
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.context_handler</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.context_handler.ContextHandler"><code class="flex name class">
<span>class <span class="ident">ContextHandler</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ContextHandler:
    def __init__(self, app):
        self.app = app
        self.service = self.app.services.context

    def dispatch(self, args):
        try:
            if args.add:
                if len(args.add) < 2:
                    printer.error("--add requires name and at least one regex")
                    return
                self.service.add_context(args.add[0], args.add[1:])
                printer.success(f"Context '{args.add[0]}' added successfully.")

            elif args.rm:
                if not args.context_name:
                    printer.error("--rm requires a context name")
                    return
                self.service.delete_context(args.context_name)
                printer.success(f"Context '{args.context_name}' deleted successfully.")

            elif args.ls:
                contexts = self.service.list_contexts()
                for ctx in contexts:
                    if ctx["active"]:
                        printer.success(f"{ctx['name']} (active)")
                    else:
                        printer.custom(" ", ctx["name"])

            elif args.set:
                if not args.context_name:
                    printer.error("--set requires a context name")
                    return
                self.service.set_active_context(args.context_name)
                printer.success(f"Context set to: {args.context_name}")

            elif args.show:
                if not args.context_name:
                    printer.error("--show requires a context name")
                    return
                contexts = self.service.contexts
                if args.context_name not in contexts:
                    printer.error(f"Context '{args.context_name}' does not exist")
                    return
                yaml_output = yaml.dump(contexts[args.context_name], sort_keys=False, default_flow_style=False)
                printer.custom(args.context_name, "")
                print(yaml_output)

            elif args.edit:
                if len(args.edit) < 2:
                    printer.error("--edit requires name and at least one regex")
                    return
                self.service.update_context(args.edit[0], args.edit[1:])
                printer.success(f"Context '{args.edit[0]}' modified successfully.")

            else:
                # Default behavior if no flags: show list
                self.dispatch_ls(args)

        except ValueError as e:
            printer.error(str(e))
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def dispatch_ls(self, args):
        contexts = self.service.list_contexts()
        for ctx in contexts:
            if ctx["active"]:
                printer.success(f"{ctx['name']} (active)")
            else:
                printer.custom(" ", ctx["name"])</code></pre>
</details>
<div class="desc"></div>
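`ContextHandler.dispatch` validates flag arity before delegating to the service layer: `--add` (and `--edit`) need a name plus at least one regex. That check can be exercised in isolation with a bare `SimpleNamespace` standing in for the parsed arguments (`validate_add` is a hypothetical helper, not part of connpy):

```python
from types import SimpleNamespace

def validate_add(args):
    """Return an error message if --add arguments are malformed, else None.

    Mirrors the arity check in ContextHandler.dispatch: --add takes a
    context name followed by at least one regex pattern.
    """
    if args.add and len(args.add) < 2:
        return "--add requires name and at least one regex"
    return None

print(validate_add(SimpleNamespace(add=["office"])))             # error message
print(validate_add(SimpleNamespace(add=["office", "^net-.*"])))  # None
```

Keeping the arity check in the CLI handler (and semantic validation in the service) lets the same service methods back both the CLI and the API without duplicating argument plumbing.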
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.context_handler.ContextHandler.dispatch"><code class="name flex">
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch(self, args):
    try:
        if args.add:
            if len(args.add) < 2:
                printer.error("--add requires name and at least one regex")
                return
            self.service.add_context(args.add[0], args.add[1:])
            printer.success(f"Context '{args.add[0]}' added successfully.")

        elif args.rm:
            if not args.context_name:
                printer.error("--rm requires a context name")
                return
            self.service.delete_context(args.context_name)
            printer.success(f"Context '{args.context_name}' deleted successfully.")

        elif args.ls:
            contexts = self.service.list_contexts()
            for ctx in contexts:
                if ctx["active"]:
                    printer.success(f"{ctx['name']} (active)")
                else:
                    printer.custom(" ", ctx["name"])

        elif args.set:
            if not args.context_name:
                printer.error("--set requires a context name")
                return
            self.service.set_active_context(args.context_name)
            printer.success(f"Context set to: {args.context_name}")

        elif args.show:
            if not args.context_name:
                printer.error("--show requires a context name")
                return
            contexts = self.service.contexts
            if args.context_name not in contexts:
                printer.error(f"Context '{args.context_name}' does not exist")
                return
            yaml_output = yaml.dump(contexts[args.context_name], sort_keys=False, default_flow_style=False)
            printer.custom(args.context_name, "")
            print(yaml_output)

        elif args.edit:
            if len(args.edit) < 2:
                printer.error("--edit requires name and at least one regex")
                return
            self.service.update_context(args.edit[0], args.edit[1:])
            printer.success(f"Context '{args.edit[0]}' modified successfully.")

        else:
            # Default behavior if no flags: show list
            self.dispatch_ls(args)

    except ValueError as e:
        printer.error(str(e))
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.context_handler.ContextHandler.dispatch_ls"><code class="name flex">
<span>def <span class="ident">dispatch_ls</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch_ls(self, args):
    contexts = self.service.list_contexts()
    for ctx in contexts:
        if ctx["active"]:
            printer.success(f"{ctx['name']} (active)")
        else:
            printer.custom(" ", ctx["name"])</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.context_handler.ContextHandler" href="#connpy.cli.context_handler.ContextHandler">ContextHandler</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.context_handler.ContextHandler.dispatch" href="#connpy.cli.context_handler.ContextHandler.dispatch">dispatch</a></code></li>
<li><code><a title="connpy.cli.context_handler.ContextHandler.dispatch_ls" href="#connpy.cli.context_handler.ContextHandler.dispatch_ls">dispatch_ls</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,523 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.forms API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.forms</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.forms.Forms"><code class="flex name class">
<span>class <span class="ident">Forms</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class Forms:
    def __init__(self, app):
        self.app = app
        self.validators = Validators(app)

    def questions_edit(self):
        questions = []
        questions.append(inquirer.Confirm("host", message="Edit Hostname/IP?"))
        questions.append(inquirer.Confirm("protocol", message="Edit Protocol/app?"))
        questions.append(inquirer.Confirm("port", message="Edit Port?"))
        questions.append(inquirer.Confirm("options", message="Edit Options?"))
        questions.append(inquirer.Confirm("logs", message="Edit logging path/file?"))
        questions.append(inquirer.Confirm("tags", message="Edit tags?"))
        questions.append(inquirer.Confirm("jumphost", message="Edit jumphost?"))
        questions.append(inquirer.Confirm("user", message="Edit User?"))
        questions.append(inquirer.Confirm("password", message="Edit password?"))
        return inquirer.prompt(questions)

    def questions_nodes(self, unique, uniques=None, edit=None):
        try:
            defaults = self.app.services.nodes.get_node_details(unique)
            if "tags" not in defaults:
                defaults["tags"] = ""
            if "jumphost" not in defaults:
                defaults["jumphost"] = ""
        except Exception:
            defaults = {"host": "", "protocol": "", "port": "", "user": "", "options": "", "logs": "", "tags": "", "password": "", "jumphost": ""}
        node = {}
        if edit is None:
            edit = {"host": True, "protocol": True, "port": True, "user": True, "password": True, "options": True, "logs": True, "tags": True, "jumphost": True}
        questions = []
        if edit["host"]:
            questions.append(inquirer.Text("host", message="Add Hostname or IP", validate=self.validators.host_validation, default=defaults["host"]))
        else:
            node["host"] = defaults["host"]
        if edit["protocol"]:
            questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.protocol_validation, default=defaults["protocol"]))
        else:
            node["protocol"] = defaults["protocol"]
        if edit["port"]:
            questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.port_validation, default=defaults["port"]))
        else:
            node["port"] = defaults["port"]
        if edit["options"]:
            questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", validate=self.validators.default_validation, default=defaults["options"]))
        else:
            node["options"] = defaults["options"]
        if edit["logs"]:
            questions.append(inquirer.Text("logs", message="Pick logging path/file ", validate=self.validators.default_validation, default=defaults["logs"].replace("{", "{{").replace("}", "}}")))
        else:
            node["logs"] = defaults["logs"]
        if edit["tags"]:
            questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.tags_validation, default=str(defaults["tags"]).replace("{", "{{").replace("}", "}}")))
        else:
            node["tags"] = defaults["tags"]
        if edit["jumphost"]:
            questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.jumphost_validation, default=str(defaults["jumphost"]).replace("{", "{{").replace("}", "}}")))
        else:
            node["jumphost"] = defaults["jumphost"]
        if edit["user"]:
            questions.append(inquirer.Text("user", message="Pick username", validate=self.validators.default_validation, default=defaults["user"]))
        else:
            node["user"] = defaults["user"]
        if edit["password"]:
            questions.append(inquirer.List("password", message="Password: Use a local password, no password or a list of profiles to reference?", choices=["Local Password", "Profiles", "No Password"]))
        else:
            node["password"] = defaults["password"]

        answer = inquirer.prompt(questions)
        if answer is None:
            return False

        if "password" in answer:
            if answer["password"] == "Local Password":
                passq = [inquirer.Password("password", message="Set Password")]
                passa = inquirer.prompt(passq)
                if passa is None:
                    return False
                answer["password"] = self.app.services.config_svc.encrypt_password(passa["password"])
            elif answer["password"] == "Profiles":
                passq = [(inquirer.Text("password", message="Set a @profile or a comma separated list of @profiles", validate=self.validators.pass_validation))]
                passa = inquirer.prompt(passq)
                if passa is None:
                    return False
                answer["password"] = passa["password"].split(",")
            elif answer["password"] == "No Password":
                answer["password"] = ""

        if "tags" in answer and not answer["tags"].startswith("@") and answer["tags"]:
            answer["tags"] = ast.literal_eval(answer["tags"])

        result = {**uniques, **answer, **node}
        result["type"] = "connection"
        return result

    def questions_profiles(self, unique, edit=None):
        try:
            defaults = self.app.services.profiles.get_profile(unique, resolve=False)
            if "tags" not in defaults:
                defaults["tags"] = ""
            if "jumphost" not in defaults:
                defaults["jumphost"] = ""
        except Exception:
            defaults = {"host": "", "protocol": "", "port": "", "user": "", "options": "", "logs": "", "tags": "", "jumphost": ""}
        profile = {}
        if edit is None:
            edit = {"host": True, "protocol": True, "port": True, "user": True, "password": True, "options": True, "logs": True, "tags": True, "jumphost": True}
        questions = []
        if edit["host"]:
            questions.append(inquirer.Text("host", message="Add Hostname or IP", default=defaults["host"]))
        else:
            profile["host"] = defaults["host"]
        if edit["protocol"]:
            questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.profile_protocol_validation, default=defaults["protocol"]))
        else:
            profile["protocol"] = defaults["protocol"]
        if edit["port"]:
            questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.profile_port_validation, default=defaults["port"]))
        else:
            profile["port"] = defaults["port"]
        if edit["options"]:
            questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", default=defaults["options"]))
        else:
            profile["options"] = defaults["options"]
        if edit["logs"]:
            questions.append(inquirer.Text("logs", message="Pick logging path/file ", default=defaults["logs"].replace("{", "{{").replace("}", "}}")))
        else:
            profile["logs"] = defaults["logs"]
        if edit["tags"]:
            questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.profile_tags_validation, default=str(defaults["tags"]).replace("{", "{{").replace("}", "}}")))
        else:
            profile["tags"] = defaults["tags"]
        if edit["jumphost"]:
            questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.profile_jumphost_validation, default=str(defaults["jumphost"]).replace("{", "{{").replace("}", "}}")))
        else:
            profile["jumphost"] = defaults["jumphost"]
        if edit["user"]:
            questions.append(inquirer.Text("user", message="Pick username", default=defaults["user"]))
        else:
            profile["user"] = defaults["user"]
        if edit["password"]:
            questions.append(inquirer.Password("password", message="Set Password"))
        else:
            profile["password"] = defaults["password"]

        answer = inquirer.prompt(questions)
        if answer is None:
            return False

        if "password" in answer:
            if answer["password"] != "":
                answer["password"] = self.app.services.config_svc.encrypt_password(answer["password"])

        if "tags" in answer and answer["tags"]:
            answer["tags"] = ast.literal_eval(answer["tags"])

        result = {**answer, **profile}
        result["id"] = unique
        return result

    def questions_bulk(self, nodes="", hosts=""):
        questions = []
        questions.append(inquirer.Text("ids", message="add a comma separated list of nodes to add", default=nodes, validate=self.validators.bulk_node_validation))
        questions.append(inquirer.Text("location", message="Add a @folder, @subfolder@folder or leave empty", validate=self.validators.bulk_folder_validation))
        questions.append(inquirer.Text("host", message="Add comma separated list of Hostnames or IPs", default=hosts, validate=self.validators.bulk_host_validation))
        questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.protocol_validation))
        questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.port_validation))
        questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", validate=self.validators.default_validation))
        questions.append(inquirer.Text("logs", message="Pick logging path/file ", validate=self.validators.default_validation))
        questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.tags_validation))
        questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.jumphost_validation))
        questions.append(inquirer.Text("user", message="Pick username", validate=self.validators.default_validation))
        questions.append(inquirer.List("password", message="Password: Use a local password, no password or a list of profiles to reference?", choices=["Local Password", "Profiles", "No Password"]))

        answer = inquirer.prompt(questions)
        if answer is None:
            return False

        if "password" in answer:
            if answer["password"] == "Local Password":
                passq = [inquirer.Password("password", message="Set Password")]
                passa = inquirer.prompt(passq)
                answer["password"] = self.app.services.config_svc.encrypt_password(passa["password"])
            elif answer["password"] == "Profiles":
                passq = [(inquirer.Text("password", message="Set a @profile or a comma separated list of @profiles", validate=self.validators.pass_validation))]
                passa = inquirer.prompt(passq)
                answer["password"] = passa["password"].split(",")
            elif answer["password"] == "No Password":
                answer["password"] = ""

        answer["type"] = "connection"
        if "tags" in answer and not answer["tags"].startswith("@") and answer["tags"]:
            answer["tags"] = ast.literal_eval(answer["tags"])

        return answer</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.forms.Forms.questions_bulk"><code class="name flex">
<span>def <span class="ident">questions_bulk</span></span>(<span>self, nodes='', hosts='')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def questions_bulk(self, nodes="", hosts=""):
    questions = []
    questions.append(inquirer.Text("ids", message="add a comma separated list of nodes to add", default=nodes, validate=self.validators.bulk_node_validation))
    questions.append(inquirer.Text("location", message="Add a @folder, @subfolder@folder or leave empty", validate=self.validators.bulk_folder_validation))
    questions.append(inquirer.Text("host", message="Add comma separated list of Hostnames or IPs", default=hosts, validate=self.validators.bulk_host_validation))
    questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.protocol_validation))
    questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.port_validation))
    questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", validate=self.validators.default_validation))
    questions.append(inquirer.Text("logs", message="Pick logging path/file ", validate=self.validators.default_validation))
    questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.tags_validation))
    questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.jumphost_validation))
    questions.append(inquirer.Text("user", message="Pick username", validate=self.validators.default_validation))
    questions.append(inquirer.List("password", message="Password: Use a local password, no password or a list of profiles to reference?", choices=["Local Password", "Profiles", "No Password"]))

    answer = inquirer.prompt(questions)
    if answer is None:
        return False

    if "password" in answer:
        if answer["password"] == "Local Password":
            passq = [inquirer.Password("password", message="Set Password")]
            passa = inquirer.prompt(passq)
            answer["password"] = self.app.services.config_svc.encrypt_password(passa["password"])
        elif answer["password"] == "Profiles":
            passq = [(inquirer.Text("password", message="Set a @profile or a comma separated list of @profiles", validate=self.validators.pass_validation))]
            passa = inquirer.prompt(passq)
            answer["password"] = passa["password"].split(",")
        elif answer["password"] == "No Password":
            answer["password"] = ""

    answer["type"] = "connection"
    if "tags" in answer and not answer["tags"].startswith("@") and answer["tags"]:
        answer["tags"] = ast.literal_eval(answer["tags"])

    return answer</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.forms.Forms.questions_edit"><code class="name flex">
<span>def <span class="ident">questions_edit</span></span>(<span>self)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def questions_edit(self):
    questions = []
    questions.append(inquirer.Confirm("host", message="Edit Hostname/IP?"))
    questions.append(inquirer.Confirm("protocol", message="Edit Protocol/app?"))
    questions.append(inquirer.Confirm("port", message="Edit Port?"))
    questions.append(inquirer.Confirm("options", message="Edit Options?"))
    questions.append(inquirer.Confirm("logs", message="Edit logging path/file?"))
    questions.append(inquirer.Confirm("tags", message="Edit tags?"))
    questions.append(inquirer.Confirm("jumphost", message="Edit jumphost?"))
    questions.append(inquirer.Confirm("user", message="Edit User?"))
    questions.append(inquirer.Confirm("password", message="Edit password?"))
    return inquirer.prompt(questions)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.forms.Forms.questions_nodes"><code class="name flex">
<span>def <span class="ident">questions_nodes</span></span>(<span>self, unique, uniques=None, edit=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def questions_nodes(self, unique, uniques=None, edit=None):
    try:
        defaults = self.app.services.nodes.get_node_details(unique)
        if "tags" not in defaults:
            defaults["tags"] = ""
        if "jumphost" not in defaults:
            defaults["jumphost"] = ""
    except Exception:
        defaults = {"host": "", "protocol": "", "port": "", "user": "", "options": "", "logs": "", "tags": "", "password": "", "jumphost": ""}
    node = {}
    if edit is None:
        edit = {"host": True, "protocol": True, "port": True, "user": True, "password": True, "options": True, "logs": True, "tags": True, "jumphost": True}
    questions = []
    if edit["host"]:
        questions.append(inquirer.Text("host", message="Add Hostname or IP", validate=self.validators.host_validation, default=defaults["host"]))
    else:
        node["host"] = defaults["host"]
    if edit["protocol"]:
        questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.protocol_validation, default=defaults["protocol"]))
    else:
        node["protocol"] = defaults["protocol"]
    if edit["port"]:
        questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.port_validation, default=defaults["port"]))
    else:
        node["port"] = defaults["port"]
    if edit["options"]:
        questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", validate=self.validators.default_validation, default=defaults["options"]))
    else:
        node["options"] = defaults["options"]
    if edit["logs"]:
        questions.append(inquirer.Text("logs", message="Pick logging path/file ", validate=self.validators.default_validation, default=defaults["logs"].replace("{", "{{").replace("}", "}}")))
    else:
        node["logs"] = defaults["logs"]
    if edit["tags"]:
        questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.tags_validation, default=str(defaults["tags"]).replace("{", "{{").replace("}", "}}")))
    else:
        node["tags"] = defaults["tags"]
    if edit["jumphost"]:
        questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.jumphost_validation, default=str(defaults["jumphost"]).replace("{", "{{").replace("}", "}}")))
    else:
        node["jumphost"] = defaults["jumphost"]
    if edit["user"]:
        questions.append(inquirer.Text("user", message="Pick username", validate=self.validators.default_validation, default=defaults["user"]))
    else:
        node["user"] = defaults["user"]
    if edit["password"]:
        questions.append(inquirer.List("password", message="Password: Use a local password, no password or a list of profiles to reference?", choices=["Local Password", "Profiles", "No Password"]))
    else:
        node["password"] = defaults["password"]

    answer = inquirer.prompt(questions)
    if answer is None:
        return False

    if "password" in answer:
        if answer["password"] == "Local Password":
            passq = [inquirer.Password("password", message="Set Password")]
            passa = inquirer.prompt(passq)
            if passa is None:
                return False
            answer["password"] = self.app.services.config_svc.encrypt_password(passa["password"])
        elif answer["password"] == "Profiles":
            passq = [(inquirer.Text("password", message="Set a @profile or a comma separated list of @profiles", validate=self.validators.pass_validation))]
            passa = inquirer.prompt(passq)
            if passa is None:
                return False
            answer["password"] = passa["password"].split(",")
        elif answer["password"] == "No Password":
            answer["password"] = ""

    if "tags" in answer and not answer["tags"].startswith("@") and answer["tags"]:
        answer["tags"] = ast.literal_eval(answer["tags"])

    result = {**uniques, **answer, **node}
    result["type"] = "connection"
    return result</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.forms.Forms.questions_profiles"><code class="name flex">
<span>def <span class="ident">questions_profiles</span></span>(<span>self, unique, edit=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def questions_profiles(self, unique, edit=None):
    try:
        defaults = self.app.services.profiles.get_profile(unique, resolve=False)
        if "tags" not in defaults:
            defaults["tags"] = ""
        if "jumphost" not in defaults:
            defaults["jumphost"] = ""
    except Exception:
        defaults = {"host": "", "protocol": "", "port": "", "user": "", "options": "", "logs": "", "tags": "", "jumphost": ""}
    profile = {}
    if edit is None:
        edit = {"host": True, "protocol": True, "port": True, "user": True, "password": True, "options": True, "logs": True, "tags": True, "jumphost": True}
    questions = []
    if edit["host"]:
        questions.append(inquirer.Text("host", message="Add Hostname or IP", default=defaults["host"]))
    else:
        profile["host"] = defaults["host"]
    if edit["protocol"]:
        questions.append(inquirer.Text("protocol", message="Select Protocol/app", validate=self.validators.profile_protocol_validation, default=defaults["protocol"]))
    else:
        profile["protocol"] = defaults["protocol"]
    if edit["port"]:
        questions.append(inquirer.Text("port", message="Select Port Number", validate=self.validators.profile_port_validation, default=defaults["port"]))
    else:
        profile["port"] = defaults["port"]
    if edit["options"]:
        questions.append(inquirer.Text("options", message="Pass extra options to protocol/app", default=defaults["options"]))
    else:
        profile["options"] = defaults["options"]
    if edit["logs"]:
        questions.append(inquirer.Text("logs", message="Pick logging path/file ", default=defaults["logs"].replace("{", "{{").replace("}", "}}")))
    else:
        profile["logs"] = defaults["logs"]
    if edit["tags"]:
        questions.append(inquirer.Text("tags", message="Add tags dictionary", validate=self.validators.profile_tags_validation, default=str(defaults["tags"]).replace("{", "{{").replace("}", "}}")))
    else:
        profile["tags"] = defaults["tags"]
    if edit["jumphost"]:
        questions.append(inquirer.Text("jumphost", message="Add Jumphost node", validate=self.validators.profile_jumphost_validation, default=str(defaults["jumphost"]).replace("{", "{{").replace("}", "}}")))
    else:
        profile["jumphost"] = defaults["jumphost"]
    if edit["user"]:
        questions.append(inquirer.Text("user", message="Pick username", default=defaults["user"]))
    else:
        profile["user"] = defaults["user"]
    if edit["password"]:
        questions.append(inquirer.Password("password", message="Set Password"))
    else:
        profile["password"] = defaults["password"]

    answer = inquirer.prompt(questions)
    if answer is None:
        return False

    if "password" in answer:
        if answer["password"] != "":
            answer["password"] = self.app.services.config_svc.encrypt_password(answer["password"])

    if "tags" in answer and answer["tags"]:
        answer["tags"] = ast.literal_eval(answer["tags"])

    result = {**answer, **profile}
    result["id"] = unique
    return result</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.forms.Forms" href="#connpy.cli.forms.Forms">Forms</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.forms.Forms.questions_bulk" href="#connpy.cli.forms.Forms.questions_bulk">questions_bulk</a></code></li>
<li><code><a title="connpy.cli.forms.Forms.questions_edit" href="#connpy.cli.forms.Forms.questions_edit">questions_edit</a></code></li>
<li><code><a title="connpy.cli.forms.Forms.questions_nodes" href="#connpy.cli.forms.Forms.questions_nodes">questions_nodes</a></code></li>
<li><code><a title="connpy.cli.forms.Forms.questions_profiles" href="#connpy.cli.forms.Forms.questions_profiles">questions_profiles</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,309 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.help_text API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.cli.help_text</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-functions">Functions</h2>
|
||||
<dl>
|
||||
<dt id="connpy.cli.help_text.get_help"><code class="name flex">
|
||||
<span>def <span class="ident">get_help</span></span>(<span>type, parsers=None)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def get_help(type, parsers=None):
|
||||
if type == "export":
|
||||
        return r"Export /path/to/file.yml \[@subfolder1]\[@folder1] \[@subfolderN]\[@folderN]"
|
||||
if type == "import":
|
||||
return "Import /path/to/file.yml"
|
||||
if type == "node":
|
||||
        return "node\\[@subfolder]\\[@folder]\nConnect to specific node or show all matching nodes\n\\[@subfolder]\\[@folder]\nShow all available connections globally or in specified path"
|
||||
if type == "usage":
|
||||
commands = []
|
||||
for subcommand, subparser in parsers.choices.items():
|
||||
            if subparser.description is not None:
|
||||
commands.append(subcommand)
|
||||
commands = ",".join(commands)
|
||||
usage_help = f"connpy [-h] [--add | --del | --mod | --show | --debug] [node|folder] [--sftp]\n connpy {{{commands}}} ..."
|
||||
return usage_help
|
||||
return get_instructions(type)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
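A minimal, self-contained sketch of how the `"usage"` branch above assembles its subcommand list from `argparse` subparsers: only subparsers that define a `description` are included. The parser names here are illustrative, not connpy's real subcommand set.

```python
import argparse

parser = argparse.ArgumentParser(prog="connpy")
subparsers = parser.add_subparsers()
subparsers.add_parser("node", description="Connect to a node")
subparsers.add_parser("profile", description="Manage profiles")
subparsers.add_parser("internal")  # no description: excluded from usage

# Same filtering logic as get_help(type="usage", parsers=subparsers)
commands = ",".join(
    name for name, sub in subparsers.choices.items() if sub.description is not None
)
usage = f"connpy {{{commands}}} ..."
print(usage)  # connpy {node,profile} ...
```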
|
||||
<dt id="connpy.cli.help_text.get_instructions"><code class="name flex">
|
||||
<span>def <span class="ident">get_instructions</span></span>(<span>type='add')</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def get_instructions(type="add"):
|
||||
if type == "add":
|
||||
return """
|
||||
Welcome to Connpy node Addition Wizard!
|
||||
|
||||
Here are some important instructions and tips for configuring your new node:
|
||||
|
||||
1. **Profiles**:
|
||||
- You can use the configured settings in a profile using `@profilename`.
|
||||
|
||||
2. **Available Protocols and Apps**:
|
||||
- ssh
|
||||
- telnet
|
||||
- kubectl (`kubectl exec`)
|
||||
- docker (`docker exec`)
|
||||
|
||||
3. **Optional Values**:
|
||||
- You can leave any value empty except for the hostname/IP.
|
||||
|
||||
4. **Passwords**:
|
||||
- You can pass one or more passwords using comma-separated `@profiles`.
|
||||
|
||||
5. **Logging**:
|
||||
- You can use the following variables in the logging file name:
|
||||
- `${id}`
|
||||
- `${unique}`
|
||||
- `${host}`
|
||||
- `${port}`
|
||||
- `${user}`
|
||||
- `${protocol}`
|
||||
|
||||
6. **Well-Known Tags**:
|
||||
- `os`: Identified by AI to generate commands based on the operating system.
|
||||
- `screen_length_command`: Used by automation to avoid pagination on different devices (e.g., `terminal length 0` for Cisco devices).
|
||||
- `prompt`: Replaces default app prompt to identify the end of output or where the user can start inputting commands.
|
||||
- `kube_command`: Replaces the default command (`/bin/bash`) for `kubectl exec`.
|
||||
- `docker_command`: Replaces the default command for `docker exec`.
|
||||
"""
|
||||
if type == "bashcompletion":
|
||||
return '''
|
||||
# Bash completion for connpy
|
||||
# Run: eval "$(connpy config --completion bash)"
|
||||
# Or add it to your .bashrc
|
||||
|
||||
_connpy_autocomplete()
|
||||
{
|
||||
local strings
|
||||
strings=$(python3 -m connpy.completion bash ${#COMP_WORDS[@]} "${COMP_WORDS[@]}")
|
||||
|
||||
local IFS=$'\\t'
|
||||
COMPREPLY=( $(compgen -W "$strings" -- "${COMP_WORDS[$COMP_CWORD]}") )
|
||||
}
|
||||
complete -o nosort -F _connpy_autocomplete conn
|
||||
complete -o nosort -F _connpy_autocomplete connpy
|
||||
'''
|
||||
if type == "zshcompletion":
|
||||
return '''
|
||||
# Zsh completion for connpy
|
||||
# Run: eval "$(connpy config --completion zsh)"
|
||||
# Or add it to your .zshrc
|
||||
# Make sure compinit is loaded
|
||||
|
||||
autoload -U compinit && compinit
|
||||
_connpy_autocomplete()
|
||||
{
|
||||
local COMP_WORDS num strings
|
||||
COMP_WORDS=( $words )
|
||||
num=${#COMP_WORDS[@]}
|
||||
if [[ $words =~ '.* $' ]]; then
|
||||
num=$(($num + 1))
|
||||
fi
|
||||
strings=$(python3 -m connpy.completion zsh ${num} ${COMP_WORDS[@]})
|
||||
|
||||
local IFS=$'\\t'
|
||||
compadd "$@" -- ${=strings}
|
||||
}
|
||||
compdef _connpy_autocomplete conn
|
||||
compdef _connpy_autocomplete connpy
|
||||
'''
|
||||
if type == "fzf_wrapper_bash":
|
||||
return '''\n#Here starts bash 0ms fzf wrapper for connpy
|
||||
connpy() {
|
||||
if [ $# -eq 0 ]; then
|
||||
local selected
|
||||
local configdir=$(cat ~/.config/conn/.folder 2>/dev/null || echo ~/.config/conn)
|
||||
if [ -s "$configdir/.fzf_nodes_cache.txt" ]; then
|
||||
selected=$(cat "$configdir/.fzf_nodes_cache.txt" | fzf-tmux -i -d 25%)
|
||||
else
|
||||
command connpy
|
||||
return
|
||||
fi
|
||||
if [ -n "$selected" ]; then
|
||||
command connpy "$selected"
|
||||
fi
|
||||
else
|
||||
command connpy "$@"
|
||||
fi
|
||||
}
|
||||
alias c="connpy"
|
||||
#Here ends bash 0ms fzf wrapper for connpy
|
||||
'''
|
||||
if type == "fzf_wrapper_zsh":
|
||||
return '''\n#Here starts zsh 0ms fzf wrapper for connpy
|
||||
connpy() {
|
||||
if [ $# -eq 0 ]; then
|
||||
local selected
|
||||
local configdir=$(cat ~/.config/conn/.folder 2>/dev/null || echo ~/.config/conn)
|
||||
if [ -s "$configdir/.fzf_nodes_cache.txt" ]; then
|
||||
selected=$(cat "$configdir/.fzf_nodes_cache.txt" | fzf-tmux -i -d 25%)
|
||||
else
|
||||
command connpy
|
||||
return
|
||||
fi
|
||||
if [ -n "$selected" ]; then
|
||||
command connpy "$selected"
|
||||
fi
|
||||
else
|
||||
command connpy "$@"
|
||||
fi
|
||||
}
|
||||
alias c="connpy"
|
||||
#Here ends zsh 0ms fzf wrapper for connpy
|
||||
'''
|
||||
if type == "run":
|
||||
        return "node[@subfolder][@folder] command to run\nRun the specific command on the node and print output\n/path/to/file.yaml\nUse a yaml file to run an automation script"
|
||||
if type == "generate":
|
||||
return r'''---
|
||||
tasks:
|
||||
- name: "Config"
|
||||
|
||||
action: 'run' #Action can be test or run. Mandatory
|
||||
|
||||
nodes: #List of nodes to work on. Mandatory
|
||||
- 'router1@office' #You can add specific nodes
|
||||
- '@aws' #entire folders or subfolders
|
||||
- '@office': #or filter inside a folder or subfolder
|
||||
- 'router2'
|
||||
- 'router7'
|
||||
|
||||
commands: #List of commands to send, use {name} to pass variables
|
||||
- 'term len 0'
|
||||
- 'conf t'
|
||||
- 'interface {if}'
|
||||
- 'ip address 10.100.100.{id} 255.255.255.255'
|
||||
- '{commit}'
|
||||
- 'end'
|
||||
|
||||
variables: #Variables to use on commands and expected. Optional
|
||||
__global__: #Global variables to use on all nodes, fallback if missing in the node.
|
||||
commit: ''
|
||||
if: 'loopback100'
|
||||
router1@office:
|
||||
id: 1
|
||||
router2@office:
|
||||
id: 2
|
||||
commit: 'commit'
|
||||
router3@office:
|
||||
id: 3
|
||||
vrouter1@aws:
|
||||
id: 4
|
||||
vrouterN@aws:
|
||||
id: 5
|
||||
|
||||
output: /home/user/logs #Type of output, if null you only get Connection and test result. Choices are: null,stdout,/path/to/folder. Folder path only works on 'run' action.
|
||||
|
||||
options:
|
||||
prompt: r'>$|#$|\$$|>.$|#.$|\$.$' #Optional prompt to check on your devices, default should work on most devices.
|
||||
parallel: 10 #Optional number of nodes to run commands on parallel. Default 10.
|
||||
timeout: 20 #Optional time to wait in seconds for prompt, expected or EOF. Default 20.
|
||||
|
||||
- name: "TestConfig"
|
||||
action: 'test'
|
||||
nodes:
|
||||
- 'router1@office'
|
||||
- '@aws'
|
||||
- '@office':
|
||||
- 'router2'
|
||||
- 'router7'
|
||||
commands:
|
||||
- 'ping 10.100.100.{id}'
|
||||
expected: '!' #Expected text to find when running test action. Mandatory for 'test'
|
||||
variables:
|
||||
router1@office:
|
||||
id: 1
|
||||
router2@office:
|
||||
id: 2
|
||||
commit: 'commit'
|
||||
router3@office:
|
||||
id: 3
|
||||
vrouter1@aws:
|
||||
id: 4
|
||||
vrouterN@aws:
|
||||
id: 5
|
||||
output: null
|
||||
...'''
|
||||
return ""</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
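The YAML template above says `__global__` variables are "fallback if missing in the node". A hypothetical sketch of that merge-then-format step (helper name `resolve` is invented for illustration, not connpy's actual implementation):

```python
variables = {
    "__global__": {"commit": "", "if": "loopback100"},
    "router2@office": {"id": 2, "commit": "commit"},
}

def resolve(node, variables):
    # Node-specific values override the __global__ fallbacks.
    return {**variables.get("__global__", {}), **variables.get(node, {})}

commands = ["interface {if}", "ip address 10.100.100.{id} 255.255.255.255", "{commit}"]
rendered = [c.format(**resolve("router2@office", variables)) for c in commands]
print(rendered)
# ['interface loopback100', 'ip address 10.100.100.2 255.255.255.255', 'commit']
```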
|
||||
</dl>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-functions">Functions</a></h3>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.cli.help_text.get_help" href="#connpy.cli.help_text.get_help">get_help</a></code></li>
|
||||
<li><code><a title="connpy.cli.help_text.get_instructions" href="#connpy.cli.help_text.get_instructions">get_instructions</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,213 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.helpers API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.cli.helpers</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-functions">Functions</h2>
|
||||
<dl>
|
||||
<dt id="connpy.cli.helpers.choose"><code class="name flex">
|
||||
<span>def <span class="ident">choose</span></span>(<span>app, list_, name, action)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def choose(app, list_, name, action):
|
||||
# Generates an inquirer list to pick
|
||||
# Safeguard: Never prompt if running in autocomplete shell
|
||||
if os.environ.get("_ARGCOMPLETE") or os.environ.get("COMP_LINE"):
|
||||
return None
|
||||
|
||||
if FzfPrompt and app.fzf and os.environ.get("_ARGCOMPLETE") is None and os.environ.get("COMP_LINE") is None:
|
||||
fzf_prompt = FzfPrompt(executable_path="fzf-tmux")
|
||||
if not app.case:
|
||||
fzf_prompt = FzfPrompt(executable_path="fzf-tmux -i")
|
||||
answer = fzf_prompt.prompt(list_, fzf_options="-d 25%")
|
||||
if len(answer) == 0:
|
||||
return None
|
||||
else:
|
||||
return answer[0]
|
||||
else:
|
||||
questions = [inquirer.List(name, message="Pick {} to {}:".format(name,action), choices=list_, carousel=True)]
|
||||
answer = inquirer.prompt(questions)
|
||||
        if answer is None:
|
||||
return None
|
||||
else:
|
||||
return answer[name]</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.helpers.folders_completer"><code class="name flex">
|
||||
<span>def <span class="ident">folders_completer</span></span>(<span>prefix, parsed_args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def folders_completer(prefix, parsed_args, **kwargs):
|
||||
configdir = get_config_dir()
|
||||
cache_file = os.path.join(configdir, '.folders_cache.txt')
|
||||
if os.path.exists(cache_file):
|
||||
with open(cache_file, "r") as f:
|
||||
return [line.strip() for line in f if line.startswith(prefix)]
|
||||
return []</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.helpers.get_config_dir"><code class="name flex">
|
||||
<span>def <span class="ident">get_config_dir</span></span>(<span>)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def get_config_dir():
|
||||
home = os.path.expanduser("~")
|
||||
defaultdir = os.path.join(home, '.config/conn')
|
||||
pathfile = os.path.join(defaultdir, '.folder')
|
||||
try:
|
||||
with open(pathfile, "r") as f:
|
||||
return f.read().strip()
|
||||
    except OSError:
|
||||
return defaultdir</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
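A self-contained sketch of the redirect mechanism `get_config_dir` implements: a `.folder` file inside the default directory may point at an alternate config location. The helper name `resolve_config_dir` is hypothetical, and `except OSError` narrows the bare `except` in the original.

```python
import os
import tempfile

def resolve_config_dir(default_dir):
    # Mirrors get_config_dir: read the ".folder" redirect if present,
    # otherwise fall back to the default directory.
    try:
        with open(os.path.join(default_dir, ".folder")) as f:
            return f.read().strip()
    except OSError:
        return default_dir

# Demo: without a .folder file the default wins; with one, it redirects.
with tempfile.TemporaryDirectory() as d:
    print(resolve_config_dir(d) == d)  # True
    with open(os.path.join(d, ".folder"), "w") as f:
        f.write("/opt/conn\n")
    print(resolve_config_dir(d))  # /opt/conn
```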
|
||||
<dt id="connpy.cli.helpers.nodes_completer"><code class="name flex">
|
||||
<span>def <span class="ident">nodes_completer</span></span>(<span>prefix, parsed_args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def nodes_completer(prefix, parsed_args, **kwargs):
|
||||
configdir = get_config_dir()
|
||||
cache_file = os.path.join(configdir, '.fzf_nodes_cache.txt')
|
||||
if os.path.exists(cache_file):
|
||||
with open(cache_file, "r") as f:
|
||||
return [line.strip() for line in f if line.startswith(prefix)]
|
||||
return []</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
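`nodes_completer`, `folders_completer`, and `profiles_completer` all share one pattern: filter a newline-separated cache file by the typed prefix. A minimal sketch under assumed file contents (the cache file name and entries are illustrative):

```python
import os
import tempfile

def complete_from_cache(cache_file, prefix):
    # Shared completer pattern: return cache entries matching the prefix,
    # or an empty list when the cache has not been generated yet.
    if os.path.exists(cache_file):
        with open(cache_file) as f:
            return [line.strip() for line in f if line.startswith(prefix)]
    return []

with tempfile.TemporaryDirectory() as d:
    cache = os.path.join(d, ".fzf_nodes_cache.txt")
    with open(cache, "w") as f:
        f.write("router1@office\nrouter2@office\nvrouter1@aws\n")
    print(complete_from_cache(cache, "router"))
    # ['router1@office', 'router2@office']
```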
|
||||
<dt id="connpy.cli.helpers.profiles_completer"><code class="name flex">
|
||||
<span>def <span class="ident">profiles_completer</span></span>(<span>prefix, parsed_args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def profiles_completer(prefix, parsed_args, **kwargs):
|
||||
configdir = get_config_dir()
|
||||
cache_file = os.path.join(configdir, '.profiles_cache.txt')
|
||||
if os.path.exists(cache_file):
|
||||
with open(cache_file, "r") as f:
|
||||
return [line.strip() for line in f if line.startswith(prefix)]
|
||||
return []</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.helpers.toplevel_completer"><code class="name flex">
|
||||
<span>def <span class="ident">toplevel_completer</span></span>(<span>prefix, parsed_args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def toplevel_completer(prefix, parsed_args, **kwargs):
|
||||
commands = ["node", "profile", "move", "mv", "copy", "cp", "list", "ls", "bulk", "export", "import", "ai", "run", "api", "context", "plugin", "config", "sync"]
|
||||
|
||||
configdir = get_config_dir()
|
||||
cache_file = os.path.join(configdir, '.fzf_nodes_cache.txt')
|
||||
nodes = []
|
||||
if os.path.exists(cache_file):
|
||||
with open(cache_file, "r") as f:
|
||||
nodes = [line.strip() for line in f if line.startswith(prefix)]
|
||||
|
||||
cache_folders = os.path.join(configdir, '.folders_cache.txt')
|
||||
if os.path.exists(cache_folders):
|
||||
with open(cache_folders, "r") as f:
|
||||
nodes += [line.strip() for line in f if line.startswith(prefix)]
|
||||
|
||||
return [c for c in commands + nodes if c.startswith(prefix)]</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-functions">Functions</a></h3>
|
||||
<ul class="two-column">
|
||||
<li><code><a title="connpy.cli.helpers.choose" href="#connpy.cli.helpers.choose">choose</a></code></li>
|
||||
<li><code><a title="connpy.cli.helpers.folders_completer" href="#connpy.cli.helpers.folders_completer">folders_completer</a></code></li>
|
||||
<li><code><a title="connpy.cli.helpers.get_config_dir" href="#connpy.cli.helpers.get_config_dir">get_config_dir</a></code></li>
|
||||
<li><code><a title="connpy.cli.helpers.nodes_completer" href="#connpy.cli.helpers.nodes_completer">nodes_completer</a></code></li>
|
||||
<li><code><a title="connpy.cli.helpers.profiles_completer" href="#connpy.cli.helpers.profiles_completer">profiles_completer</a></code></li>
|
||||
<li><code><a title="connpy.cli.helpers.toplevel_completer" href="#connpy.cli.helpers.toplevel_completer">toplevel_completer</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,278 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.import_export_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.cli.import_export_handler</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.cli.import_export_handler.ImportExportHandler"><code class="flex name class">
|
||||
<span>class <span class="ident">ImportExportHandler</span></span>
|
||||
<span>(</span><span>app)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class ImportExportHandler:
|
||||
def __init__(self, app):
|
||||
self.app = app
|
||||
self.forms = Forms(app)
|
||||
|
||||
def dispatch_import(self, args):
|
||||
file_path = args.data[0]
|
||||
try:
|
||||
printer.warning("This could overwrite your current configuration!")
|
||||
question = [inquirer.Confirm("import", message=f"Are you sure you want to import {file_path}?")]
|
||||
confirm = inquirer.prompt(question)
|
||||
            if confirm is None or not confirm["import"]:
|
||||
sys.exit(7)
|
||||
|
||||
self.app.services.import_export.import_from_file(file_path)
|
||||
printer.success(f"File {file_path} imported successfully.")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def dispatch_export(self, args):
|
||||
file_path = args.data[0]
|
||||
folders = args.data[1:] if len(args.data) > 1 else None
|
||||
try:
|
||||
self.app.services.import_export.export_to_file(file_path, folders=folders)
|
||||
printer.success(f"File {file_path} generated successfully")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
sys.exit()
|
||||
|
||||
def bulk(self, args):
|
||||
if args.file and os.path.isfile(args.file[0]):
|
||||
with open(args.file[0], 'r') as f:
|
||||
lines = f.readlines()
|
||||
|
||||
# Expecting exactly 2 lines
|
||||
if len(lines) < 2:
|
||||
printer.error("The file must contain at least two lines: one for nodes, one for hosts.")
|
||||
sys.exit(11)
|
||||
|
||||
nodes = lines[0].strip()
|
||||
hosts = lines[1].strip()
|
||||
newnodes = self.forms.questions_bulk(nodes, hosts)
|
||||
else:
|
||||
newnodes = self.forms.questions_bulk()
|
||||
|
||||
if newnodes == False:
|
||||
sys.exit(7)
|
||||
|
||||
if not self.app.case:
|
||||
newnodes["location"] = newnodes["location"].lower()
|
||||
newnodes["ids"] = newnodes["ids"].lower()
|
||||
|
||||
# Handle the case where location might be a file reference (e.g. from a prompt)
|
||||
location = newnodes["location"]
|
||||
if location.startswith("@") and "/" in location:
|
||||
# Extract the actual @folder part (e.g. @testall from @testall/.folders_cache.txt)
|
||||
location = location.split("/")[0]
|
||||
newnodes["location"] = location
|
||||
|
||||
ids = newnodes["ids"].split(",")
|
||||
# Append location to each id for proper folder assignment
|
||||
location = newnodes["location"]
|
||||
if location:
|
||||
ids = [f"{i}{location}" for i in ids]
|
||||
|
||||
hosts = newnodes["host"].split(",")
|
||||
|
||||
try:
|
||||
count = self.app.services.nodes.bulk_add(ids, hosts, newnodes)
|
||||
if count > 0:
|
||||
printer.success(f"Successfully added {count} nodes.")
|
||||
else:
|
||||
printer.info("0 nodes added")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)</code></pre>
|
||||
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.import_export_handler.ImportExportHandler.bulk"><code class="name flex">
<span>def <span class="ident">bulk</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def bulk(self, args):
    if args.file and os.path.isfile(args.file[0]):
        with open(args.file[0], 'r') as f:
            lines = f.readlines()

        # Expecting exactly 2 lines
        if len(lines) < 2:
            printer.error("The file must contain at least two lines: one for nodes, one for hosts.")
            sys.exit(11)

        nodes = lines[0].strip()
        hosts = lines[1].strip()
        newnodes = self.forms.questions_bulk(nodes, hosts)
    else:
        newnodes = self.forms.questions_bulk()

    if newnodes == False:
        sys.exit(7)

    if not self.app.case:
        newnodes["location"] = newnodes["location"].lower()
        newnodes["ids"] = newnodes["ids"].lower()

    # Handle the case where location might be a file reference (e.g. from a prompt)
    location = newnodes["location"]
    if location.startswith("@") and "/" in location:
        # Extract the actual @folder part (e.g. @testall from @testall/.folders_cache.txt)
        location = location.split("/")[0]
        newnodes["location"] = location

    ids = newnodes["ids"].split(",")
    # Append location to each id for proper folder assignment
    location = newnodes["location"]
    if location:
        ids = [f"{i}{location}" for i in ids]

    hosts = newnodes["host"].split(",")

    try:
        count = self.app.services.nodes.bulk_add(ids, hosts, newnodes)
        if count > 0:
            printer.success(f"Successfully added {count} nodes.")
        else:
            printer.info("0 nodes added")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.import_export_handler.ImportExportHandler.dispatch_export"><code class="name flex">
<span>def <span class="ident">dispatch_export</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch_export(self, args):
    file_path = args.data[0]
    folders = args.data[1:] if len(args.data) > 1 else None
    try:
        self.app.services.import_export.export_to_file(file_path, folders=folders)
        printer.success(f"File {file_path} generated successfully")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)
    sys.exit()</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.import_export_handler.ImportExportHandler.dispatch_import"><code class="name flex">
<span>def <span class="ident">dispatch_import</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch_import(self, args):
    file_path = args.data[0]
    try:
        printer.warning("This could overwrite your current configuration!")
        question = [inquirer.Confirm("import", message=f"Are you sure you want to import {file_path}?")]
        confirm = inquirer.prompt(question)
        if confirm == None or not confirm["import"]:
            sys.exit(7)

        self.app.services.import_export.import_from_file(file_path)
        printer.success(f"File {file_path} imported successfully.")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.import_export_handler.ImportExportHandler" href="#connpy.cli.import_export_handler.ImportExportHandler">ImportExportHandler</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.import_export_handler.ImportExportHandler.bulk" href="#connpy.cli.import_export_handler.ImportExportHandler.bulk">bulk</a></code></li>
<li><code><a title="connpy.cli.import_export_handler.ImportExportHandler.dispatch_export" href="#connpy.cli.import_export_handler.ImportExportHandler.dispatch_export">dispatch_export</a></code></li>
<li><code><a title="connpy.cli.import_export_handler.ImportExportHandler.dispatch_import" href="#connpy.cli.import_export_handler.ImportExportHandler.dispatch_import">dispatch_import</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,143 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli</code></h1>
</header>
<section id="section-intro">
</section>
<section>
<h2 class="section-title" id="header-submodules">Sub-modules</h2>
<dl>
<dt><code class="name"><a title="connpy.cli.ai_handler" href="ai_handler.html">connpy.cli.ai_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.api_handler" href="api_handler.html">connpy.cli.api_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.config_handler" href="config_handler.html">connpy.cli.config_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.context_handler" href="context_handler.html">connpy.cli.context_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.forms" href="forms.html">connpy.cli.forms</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.help_text" href="help_text.html">connpy.cli.help_text</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.helpers" href="helpers.html">connpy.cli.helpers</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.import_export_handler" href="import_export_handler.html">connpy.cli.import_export_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.node_handler" href="node_handler.html">connpy.cli.node_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.plugin_handler" href="plugin_handler.html">connpy.cli.plugin_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.profile_handler" href="profile_handler.html">connpy.cli.profile_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.run_handler" href="run_handler.html">connpy.cli.run_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.sync_handler" href="sync_handler.html">connpy.cli.sync_handler</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
<dt><code class="name"><a title="connpy.cli.validators" href="validators.html">connpy.cli.validators</a></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</section>
<section>
</section>
<section>
</section>
<section>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy" href="../index.html">connpy</a></code></li>
</ul>
</li>
<li><h3><a href="#header-submodules">Sub-modules</a></h3>
<ul>
<li><code><a title="connpy.cli.ai_handler" href="ai_handler.html">connpy.cli.ai_handler</a></code></li>
<li><code><a title="connpy.cli.api_handler" href="api_handler.html">connpy.cli.api_handler</a></code></li>
<li><code><a title="connpy.cli.config_handler" href="config_handler.html">connpy.cli.config_handler</a></code></li>
<li><code><a title="connpy.cli.context_handler" href="context_handler.html">connpy.cli.context_handler</a></code></li>
<li><code><a title="connpy.cli.forms" href="forms.html">connpy.cli.forms</a></code></li>
<li><code><a title="connpy.cli.help_text" href="help_text.html">connpy.cli.help_text</a></code></li>
<li><code><a title="connpy.cli.helpers" href="helpers.html">connpy.cli.helpers</a></code></li>
<li><code><a title="connpy.cli.import_export_handler" href="import_export_handler.html">connpy.cli.import_export_handler</a></code></li>
<li><code><a title="connpy.cli.node_handler" href="node_handler.html">connpy.cli.node_handler</a></code></li>
<li><code><a title="connpy.cli.plugin_handler" href="plugin_handler.html">connpy.cli.plugin_handler</a></code></li>
<li><code><a title="connpy.cli.profile_handler" href="profile_handler.html">connpy.cli.profile_handler</a></code></li>
<li><code><a title="connpy.cli.run_handler" href="run_handler.html">connpy.cli.run_handler</a></code></li>
<li><code><a title="connpy.cli.sync_handler" href="sync_handler.html">connpy.cli.sync_handler</a></code></li>
<li><code><a title="connpy.cli.validators" href="validators.html">connpy.cli.validators</a></code></li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,604 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.node_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.node_handler</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.node_handler.NodeHandler"><code class="flex name class">
<span>class <span class="ident">NodeHandler</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class NodeHandler:
    def __init__(self, app):
        self.app = app
        self.forms = Forms(app)

    def dispatch(self, args):
        if not self.app.case and args.data != None:
            args.data = args.data.lower()
        actions = {"version": self.version, "connect": self.connect, "add": self.add, "del": self.delete, "mod": self.modify, "show": self.show}
        return actions.get(args.action)(args)

    def version(self, args):
        from .._version import __version__
        printer.info(f"Connpy {__version__}")

    def connect(self, args):
        if args.data == None:
            try:
                matches = self.app.services.nodes.list_nodes()
            except Exception as e:
                printer.error(f"Failed to list nodes: {e}")
                sys.exit(1)

            if len(matches) == 0:
                printer.warning("There are no nodes created")
                printer.info("try: connpy --help")
                sys.exit(9)
        else:
            try:
                matches = self.app.services.nodes.list_nodes(args.data)
            except Exception:
                matches = []

        if len(matches) == 0:
            printer.error(f"{args.data} not found")
            sys.exit(2)
        elif len(matches) > 1:
            matches[0] = choose(self.app, matches, "node", "connect")

            if matches[0] == None:
                sys.exit(7)

        try:
            self.app.services.nodes.connect_node(
                matches[0],
                sftp=args.sftp,
                debug=args.debug,
                logger=self.app._service_logger
            )
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def delete(self, args):
        if args.data == None:
            printer.error("Missing argument node")
            sys.exit(3)

        is_folder = args.data.startswith("@")
        try:
            if is_folder:
                matches = self.app.services.nodes.list_folders(args.data)
            else:
                matches = self.app.services.nodes.list_nodes(args.data)
        except Exception:
            matches = []

        if len(matches) == 0:
            printer.error(f"{args.data} not found")
            sys.exit(2)

        printer.info(f"Removing: {matches}")
        question = [inquirer.Confirm("delete", message="Are you sure you want to continue?")]
        confirm = inquirer.prompt(question)
        if confirm == None or not confirm["delete"]:
            sys.exit(7)

        try:
            for item in matches:
                self.app.services.nodes.delete_node(item, is_folder=is_folder)

            if len(matches) == 1:
                printer.success(f"{matches[0]} deleted successfully")
            else:
                printer.success(f"{len(matches)} items deleted successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def add(self, args):
        try:
            args.data = self.app._type_node(args.data)
        except ValueError as e:
            printer.error(str(e))
            sys.exit(3)

        if args.data == None:
            printer.error("Missing argument node")
            sys.exit(3)

        is_folder = args.data.startswith("@")
        try:
            if is_folder:
                uniques = self.app.services.nodes.explode_unique(args.data)
                if not uniques:
                    raise InvalidConfigurationError(f"Invalid folder {args.data}")
                self.app.services.nodes.add_node(args.data, {}, is_folder=True)
                printer.success(f"{args.data} added successfully")
            else:
                if args.data in self.app.nodes_list:
                    printer.error(f"Node '{args.data}' already exists.")
                    sys.exit(1)
                uniques = self.app.services.nodes.explode_unique(args.data)
                printer.console.print(Markdown(get_instructions()))

                new_node_data = self.forms.questions_nodes(args.data, uniques)
                if not new_node_data:
                    sys.exit(7)
                self.app.services.nodes.add_node(args.data, new_node_data)
                printer.success(f"{args.data} added successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

def show(self, args):
|
||||
if args.data == None:
|
||||
printer.error("Missing argument node")
|
||||
sys.exit(3)
|
||||
|
||||
try:
|
||||
matches = self.app.services.nodes.list_nodes(args.data)
|
||||
except Exception:
|
||||
matches = []
|
||||
|
||||
if len(matches) == 0:
|
||||
printer.error(f"{args.data} not found")
|
||||
sys.exit(2)
|
||||
elif len(matches) > 1:
|
||||
matches[0] = choose(self.app, matches, "node", "show")
|
||||
|
||||
if matches[0] == None:
|
||||
sys.exit(7)
|
||||
|
||||
try:
|
||||
node = self.app.services.nodes.get_node_details(matches[0])
|
||||
yaml_output = yaml.dump(node, sort_keys=False, default_flow_style=False)
|
||||
printer.data(matches[0], yaml_output)
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def modify(self, args):
|
||||
if args.data == None:
|
||||
printer.error("Missing argument node")
|
||||
sys.exit(3)
|
||||
|
||||
try:
|
||||
matches = self.app.services.nodes.list_nodes(args.data)
|
||||
except Exception:
|
||||
matches = []
|
||||
|
||||
if len(matches) == 0:
|
||||
printer.error(f"No connection found with filter: {args.data}")
|
||||
sys.exit(2)
|
||||
|
||||
unique = matches[0] if len(matches) == 1 else None
|
||||
uniques = self.app.services.nodes.explode_unique(unique) if unique else {"id": None, "folder": None}
|
||||
|
||||
printer.info(f"Editing: {matches}")
|
||||
node_details = {}
|
||||
for i in matches:
|
||||
node_details[i] = self.app.services.nodes.get_node_details(i)
|
||||
|
||||
edits = self.forms.questions_edit()
|
||||
if edits == None:
|
||||
sys.exit(7)
|
||||
|
||||
# Use first match as base for defaults if multiple matches exist
|
||||
base_unique = matches[0]
|
||||
base_uniques = self.app.services.nodes.explode_unique(base_unique)
|
||||
updatenode = self.forms.questions_nodes(base_unique, base_uniques, edit=edits)
|
||||
if not updatenode:
|
||||
sys.exit(7)
|
||||
|
||||
try:
|
||||
if len(matches) == 1:
|
||||
# Comparison for "Nothing to do"
|
||||
current = node_details[matches[0]].copy()
|
||||
current.update(uniques)
|
||||
current["type"] = "connection"
|
||||
if sorted(updatenode.items()) == sorted(current.items()):
|
||||
printer.info("Nothing to do here")
|
||||
return
|
||||
self.app.services.nodes.update_node(matches[0], updatenode)
|
||||
printer.success(f"{args.data} edited successfully")
|
||||
else:
|
||||
editcount = 0
|
||||
for k in matches:
|
||||
updated_item = self.app.services.nodes.explode_unique(k)
|
||||
updated_item["type"] = "connection"
|
||||
updated_item.update(node_details[k])
|
||||
|
||||
this_item_changed = False
|
||||
for key, should_edit in edits.items():
|
||||
if should_edit:
|
||||
this_item_changed = True
|
||||
updated_item[key] = updatenode[key]
|
||||
|
||||
if this_item_changed:
|
||||
editcount += 1
|
||||
self.app.services.nodes.update_node(k, updated_item)
|
||||
|
||||
if editcount == 0:
|
||||
printer.info("Nothing to do here")
|
||||
else:
|
||||
printer.success(f"{matches} edited successfully")
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.node_handler.NodeHandler.add"><code class="name flex">
<span>def <span class="ident">add</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def add(self, args):
    try:
        args.data = self.app._type_node(args.data)
    except ValueError as e:
        printer.error(str(e))
        sys.exit(3)

    if args.data == None:
        printer.error("Missing argument node")
        sys.exit(3)

    is_folder = args.data.startswith("@")
    try:
        if is_folder:
            uniques = self.app.services.nodes.explode_unique(args.data)
            if not uniques:
                raise InvalidConfigurationError(f"Invalid folder {args.data}")
            self.app.services.nodes.add_node(args.data, {}, is_folder=True)
            printer.success(f"{args.data} added successfully")
        else:
            if args.data in self.app.nodes_list:
                printer.error(f"Node '{args.data}' already exists.")
                sys.exit(1)
            uniques = self.app.services.nodes.explode_unique(args.data)
            printer.console.print(Markdown(get_instructions()))

            new_node_data = self.forms.questions_nodes(args.data, uniques)
            if not new_node_data:
                sys.exit(7)
            self.app.services.nodes.add_node(args.data, new_node_data)
            printer.success(f"{args.data} added successfully")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.node_handler.NodeHandler.connect"><code class="name flex">
<span>def <span class="ident">connect</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def connect(self, args):
    if args.data == None:
        try:
            matches = self.app.services.nodes.list_nodes()
        except Exception as e:
            printer.error(f"Failed to list nodes: {e}")
            sys.exit(1)

        if len(matches) == 0:
            printer.warning("There are no nodes created")
            printer.info("try: connpy --help")
            sys.exit(9)
    else:
        try:
            matches = self.app.services.nodes.list_nodes(args.data)
        except Exception:
            matches = []

        if len(matches) == 0:
            printer.error(f"{args.data} not found")
            sys.exit(2)
        elif len(matches) > 1:
            matches[0] = choose(self.app, matches, "node", "connect")

    if matches[0] == None:
        sys.exit(7)

    try:
        self.app.services.nodes.connect_node(
            matches[0],
            sftp=args.sftp,
            debug=args.debug,
            logger=self.app._service_logger
        )
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.node_handler.NodeHandler.delete"><code class="name flex">
<span>def <span class="ident">delete</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def delete(self, args):
    if args.data == None:
        printer.error("Missing argument node")
        sys.exit(3)

    is_folder = args.data.startswith("@")
    try:
        if is_folder:
            matches = self.app.services.nodes.list_folders(args.data)
        else:
            matches = self.app.services.nodes.list_nodes(args.data)
    except Exception:
        matches = []

    if len(matches) == 0:
        printer.error(f"{args.data} not found")
        sys.exit(2)

    printer.info(f"Removing: {matches}")
    question = [inquirer.Confirm("delete", message="Are you sure you want to continue?")]
    confirm = inquirer.prompt(question)
    if confirm == None or not confirm["delete"]:
        sys.exit(7)

    try:
        for item in matches:
            self.app.services.nodes.delete_node(item, is_folder=is_folder)

        if len(matches) == 1:
            printer.success(f"{matches[0]} deleted successfully")
        else:
            printer.success(f"{len(matches)} items deleted successfully")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.node_handler.NodeHandler.dispatch"><code class="name flex">
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch(self, args):
    if not self.app.case and args.data != None:
        args.data = args.data.lower()
    actions = {"version": self.version, "connect": self.connect, "add": self.add, "del": self.delete, "mod": self.modify, "show": self.show}
    return actions.get(args.action)(args)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.node_handler.NodeHandler.modify"><code class="name flex">
<span>def <span class="ident">modify</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def modify(self, args):
    if args.data == None:
        printer.error("Missing argument node")
        sys.exit(3)

    try:
        matches = self.app.services.nodes.list_nodes(args.data)
    except Exception:
        matches = []

    if len(matches) == 0:
        printer.error(f"No connection found with filter: {args.data}")
        sys.exit(2)

    unique = matches[0] if len(matches) == 1 else None
    uniques = self.app.services.nodes.explode_unique(unique) if unique else {"id": None, "folder": None}

    printer.info(f"Editing: {matches}")
    node_details = {}
    for i in matches:
        node_details[i] = self.app.services.nodes.get_node_details(i)

    edits = self.forms.questions_edit()
    if edits == None:
        sys.exit(7)

    # Use first match as base for defaults if multiple matches exist
    base_unique = matches[0]
    base_uniques = self.app.services.nodes.explode_unique(base_unique)
    updatenode = self.forms.questions_nodes(base_unique, base_uniques, edit=edits)
    if not updatenode:
        sys.exit(7)

    try:
        if len(matches) == 1:
            # Comparison for "Nothing to do"
            current = node_details[matches[0]].copy()
            current.update(uniques)
            current["type"] = "connection"
            if sorted(updatenode.items()) == sorted(current.items()):
                printer.info("Nothing to do here")
                return
            self.app.services.nodes.update_node(matches[0], updatenode)
            printer.success(f"{args.data} edited successfully")
        else:
            editcount = 0
            for k in matches:
                updated_item = self.app.services.nodes.explode_unique(k)
                updated_item["type"] = "connection"
                updated_item.update(node_details[k])

                this_item_changed = False
                for key, should_edit in edits.items():
                    if should_edit:
                        this_item_changed = True
                        updated_item[key] = updatenode[key]

                if this_item_changed:
                    editcount += 1
                    self.app.services.nodes.update_node(k, updated_item)

            if editcount == 0:
                printer.info("Nothing to do here")
            else:
                printer.success(f"{matches} edited successfully")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.node_handler.NodeHandler.show"><code class="name flex">
<span>def <span class="ident">show</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def show(self, args):
    if args.data == None:
        printer.error("Missing argument node")
        sys.exit(3)

    try:
        matches = self.app.services.nodes.list_nodes(args.data)
    except Exception:
        matches = []

    if len(matches) == 0:
        printer.error(f"{args.data} not found")
        sys.exit(2)
    elif len(matches) > 1:
        matches[0] = choose(self.app, matches, "node", "show")

    if matches[0] == None:
        sys.exit(7)

    try:
        node = self.app.services.nodes.get_node_details(matches[0])
        yaml_output = yaml.dump(node, sort_keys=False, default_flow_style=False)
        printer.data(matches[0], yaml_output)
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.node_handler.NodeHandler.version"><code class="name flex">
<span>def <span class="ident">version</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def version(self, args):
    from .._version import __version__
    printer.info(f"Connpy {__version__}")</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.node_handler.NodeHandler" href="#connpy.cli.node_handler.NodeHandler">NodeHandler</a></code></h4>
<ul class="two-column">
<li><code><a title="connpy.cli.node_handler.NodeHandler.add" href="#connpy.cli.node_handler.NodeHandler.add">add</a></code></li>
<li><code><a title="connpy.cli.node_handler.NodeHandler.connect" href="#connpy.cli.node_handler.NodeHandler.connect">connect</a></code></li>
<li><code><a title="connpy.cli.node_handler.NodeHandler.delete" href="#connpy.cli.node_handler.NodeHandler.delete">delete</a></code></li>
<li><code><a title="connpy.cli.node_handler.NodeHandler.dispatch" href="#connpy.cli.node_handler.NodeHandler.dispatch">dispatch</a></code></li>
<li><code><a title="connpy.cli.node_handler.NodeHandler.modify" href="#connpy.cli.node_handler.NodeHandler.modify">modify</a></code></li>
<li><code><a title="connpy.cli.node_handler.NodeHandler.show" href="#connpy.cli.node_handler.NodeHandler.show">show</a></code></li>
<li><code><a title="connpy.cli.node_handler.NodeHandler.version" href="#connpy.cli.node_handler.NodeHandler.version">version</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.plugin_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target .name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.plugin_handler</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.plugin_handler.PluginHandler"><code class="flex name class">
<span>class <span class="ident">PluginHandler</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class PluginHandler:
    def __init__(self, app):
        self.app = app

    def dispatch(self, args):
        try:
            # Route plugin operations through app.services.plugins, which is
            # either the local PluginService or the remote gRPC PluginStub,
            # depending on the service layer mode.

            is_remote = getattr(args, "remote", False)
            if is_remote and self.app.services.mode != "remote":
                printer.error("Cannot use --remote flag when not running in remote mode.")
                return

            if args.add:
                self.app.services.plugins.add_plugin(args.add[0], args.add[1])
                printer.success(f"Plugin {args.add[0]} added successfully{' remotely' if is_remote else ''}.")
            elif args.update:
                self.app.services.plugins.add_plugin(args.update[0], args.update[1], update=True)
                printer.success(f"Plugin {args.update[0]} updated successfully{' remotely' if is_remote else ''}.")
            elif args.delete:
                self.app.services.plugins.delete_plugin(args.delete[0])
                printer.success(f"Plugin {args.delete[0]} deleted successfully{' remotely' if is_remote else ''}.")
            elif args.enable:
                name = args.enable[0]
                if is_remote:
                    self.app.plugins.preferences[name] = "remote"
                else:
                    if name in self.app.plugins.preferences:
                        del self.app.plugins.preferences[name]

                self.app.plugins._save_preferences(self.app.services.config_svc.get_default_dir())

                # Always try to enable it locally (remove .bkp) if it exists,
                # regardless of mode, to keep files consistent with the "enabled" state
                try:
                    # We use a local service instance to ensure we touch local files
                    from ..services.plugin_service import PluginService
                    local_svc = PluginService(self.app.services.config)
                    local_svc.enable_plugin(name)
                except Exception:
                    pass  # Ignore if not found locally or already enabled

                if is_remote and self.app.services.mode == "remote":
                    self.app.services.plugins.enable_plugin(name)

                printer.success(f"Plugin {name} enabled successfully{' remotely' if is_remote else ' locally'}.")
            elif args.disable:
                name = args.disable[0]
                success = False
                if is_remote:
                    if self.app.services.mode == "remote":
                        self.app.services.plugins.disable_plugin(name)
                        success = True
                else:
                    # Disable locally
                    from ..services.plugin_service import PluginService
                    local_svc = PluginService(self.app.services.config)
                    try:
                        if local_svc.disable_plugin(name):
                            success = True
                    except Exception as e:
                        printer.warning(f"Could not disable local plugin: {e}")

                if success:
                    printer.success(f"Plugin {name} disabled successfully{' remotely' if is_remote else ' locally'}.")

            # If any remote operation was performed, trigger a sync to update local cache immediately
            if is_remote and self.app.services.mode == "remote":
                try:
                    import os
                    cache_dir = os.path.join(self.app.services.config_svc.get_default_dir(), "remote_plugins")
                    # force_sync=True bypasses the cache hash check so the
                    # remote plugins are re-imported immediately.
                    self.app.plugins._import_remote_plugins_to_argparse(
                        self.app.services.plugins,
                        self.app.subparsers,  # must be populated by CLI init at this point
                        cache_dir,
                        force_sync=True
                    )
                except Exception:
                    pass

            elif getattr(args, "sync", False):
                # The actual sync logic is performed in connapp.py during init
                # if the --sync flag is detected in sys.argv
                printer.success("Remote plugins synchronized successfully.")
            elif args.list:
                # We need to fetch both local and remote if in remote mode
                local_plugins = {}
                remote_plugins = {}

                # Fetch depending on mode
                if self.app.services.mode == "remote":
                    # Instantiate a local PluginService directly, bypassing the gRPC stub
                    from ..services.plugin_service import PluginService
                    local_svc = PluginService(self.app.services.config)
                    local_plugins = local_svc.list_plugins()
                    remote_plugins = self.app.services.plugins.list_plugins()
                else:
                    local_plugins = self.app.services.plugins.list_plugins()

                from rich.table import Table

                table = Table(title="Available Plugins", show_header=True, header_style="bold cyan")
                table.add_column("Plugin", style="cyan")
                table.add_column("State", style="bold")
                table.add_column("Origin", style="magenta")

                # Populate local plugins
                for name, details in local_plugins.items():
                    state = "Disabled" if not details.get("enabled", True) else "Active"
                    color = "red" if state == "Disabled" else "green"

                    if self.app.services.mode == "remote" and state == "Active":
                        if self.app.plugins.preferences.get(name) == "remote":
                            state = "Shadowed (Override by Remote)"
                            color = "yellow"

                    table.add_row(name, f"[{color}]{state}[/{color}]", "Local")

                # Populate remote plugins
                if self.app.services.mode == "remote":
                    for name, details in remote_plugins.items():
                        state = "Disabled" if not details.get("enabled", True) else "Active"
                        color = "red" if state == "Disabled" else "green"

                        if state == "Active":
                            pref = self.app.plugins.preferences.get(name, "local")
                            # If preference isn't remote and the plugin exists locally, local takes priority
                            if pref != "remote" and name in local_plugins:
                                state = "Shadowed (Override by Local)"
                                color = "yellow"

                        table.add_row(name, f"[{color}]{state}[/{color}]", "Remote")

                if not local_plugins and not remote_plugins:
                    printer.console.print(" No plugins found.")
                else:
                    printer.console.print(table)

        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.plugin_handler.PluginHandler.dispatch"><code class="name flex">
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch(self, args):
    try:
        # Route plugin operations through app.services.plugins, which is
        # either the local PluginService or the remote gRPC PluginStub,
        # depending on the service layer mode.

        is_remote = getattr(args, "remote", False)
        if is_remote and self.app.services.mode != "remote":
            printer.error("Cannot use --remote flag when not running in remote mode.")
            return

        if args.add:
            self.app.services.plugins.add_plugin(args.add[0], args.add[1])
            printer.success(f"Plugin {args.add[0]} added successfully{' remotely' if is_remote else ''}.")
        elif args.update:
            self.app.services.plugins.add_plugin(args.update[0], args.update[1], update=True)
            printer.success(f"Plugin {args.update[0]} updated successfully{' remotely' if is_remote else ''}.")
        elif args.delete:
            self.app.services.plugins.delete_plugin(args.delete[0])
            printer.success(f"Plugin {args.delete[0]} deleted successfully{' remotely' if is_remote else ''}.")
        elif args.enable:
            name = args.enable[0]
            if is_remote:
                self.app.plugins.preferences[name] = "remote"
            else:
                if name in self.app.plugins.preferences:
                    del self.app.plugins.preferences[name]

            self.app.plugins._save_preferences(self.app.services.config_svc.get_default_dir())

            # Always try to enable it locally (remove .bkp) if it exists,
            # regardless of mode, to keep files consistent with the "enabled" state
            try:
                # We use a local service instance to ensure we touch local files
                from ..services.plugin_service import PluginService
                local_svc = PluginService(self.app.services.config)
                local_svc.enable_plugin(name)
            except Exception:
                pass  # Ignore if not found locally or already enabled

            if is_remote and self.app.services.mode == "remote":
                self.app.services.plugins.enable_plugin(name)

            printer.success(f"Plugin {name} enabled successfully{' remotely' if is_remote else ' locally'}.")
        elif args.disable:
            name = args.disable[0]
            success = False
            if is_remote:
                if self.app.services.mode == "remote":
                    self.app.services.plugins.disable_plugin(name)
                    success = True
            else:
                # Disable locally
                from ..services.plugin_service import PluginService
                local_svc = PluginService(self.app.services.config)
                try:
                    if local_svc.disable_plugin(name):
                        success = True
                except Exception as e:
                    printer.warning(f"Could not disable local plugin: {e}")

            if success:
                printer.success(f"Plugin {name} disabled successfully{' remotely' if is_remote else ' locally'}.")

        # If any remote operation was performed, trigger a sync to update local cache immediately
        if is_remote and self.app.services.mode == "remote":
            try:
                import os
                cache_dir = os.path.join(self.app.services.config_svc.get_default_dir(), "remote_plugins")
                # force_sync=True bypasses the cache hash check so the
                # remote plugins are re-imported immediately.
                self.app.plugins._import_remote_plugins_to_argparse(
                    self.app.services.plugins,
                    self.app.subparsers,  # must be populated by CLI init at this point
                    cache_dir,
                    force_sync=True
                )
            except Exception:
                pass

        elif getattr(args, "sync", False):
            # The actual sync logic is performed in connapp.py during init
            # if the --sync flag is detected in sys.argv
            printer.success("Remote plugins synchronized successfully.")
        elif args.list:
            # We need to fetch both local and remote if in remote mode
            local_plugins = {}
            remote_plugins = {}

            # Fetch depending on mode
            if self.app.services.mode == "remote":
                # Instantiate a local PluginService directly, bypassing the gRPC stub
                from ..services.plugin_service import PluginService
                local_svc = PluginService(self.app.services.config)
                local_plugins = local_svc.list_plugins()
                remote_plugins = self.app.services.plugins.list_plugins()
            else:
                local_plugins = self.app.services.plugins.list_plugins()

            from rich.table import Table

            table = Table(title="Available Plugins", show_header=True, header_style="bold cyan")
            table.add_column("Plugin", style="cyan")
            table.add_column("State", style="bold")
            table.add_column("Origin", style="magenta")

            # Populate local plugins
            for name, details in local_plugins.items():
                state = "Disabled" if not details.get("enabled", True) else "Active"
                color = "red" if state == "Disabled" else "green"

                if self.app.services.mode == "remote" and state == "Active":
                    if self.app.plugins.preferences.get(name) == "remote":
                        state = "Shadowed (Override by Remote)"
                        color = "yellow"

                table.add_row(name, f"[{color}]{state}[/{color}]", "Local")

            # Populate remote plugins
            if self.app.services.mode == "remote":
                for name, details in remote_plugins.items():
                    state = "Disabled" if not details.get("enabled", True) else "Active"
                    color = "red" if state == "Disabled" else "green"

                    if state == "Active":
                        pref = self.app.plugins.preferences.get(name, "local")
                        # If preference isn't remote and the plugin exists locally, local takes priority
                        if pref != "remote" and name in local_plugins:
                            state = "Shadowed (Override by Local)"
                            color = "yellow"

                    table.add_row(name, f"[{color}]{state}[/{color}]", "Remote")

            if not local_plugins and not remote_plugins:
                printer.console.print(" No plugins found.")
            else:
                printer.console.print(table)

    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-classes">Classes</a></h3>
|
||||
<ul>
|
||||
<li>
|
||||
<h4><code><a title="connpy.cli.plugin_handler.PluginHandler" href="#connpy.cli.plugin_handler.PluginHandler">PluginHandler</a></code></h4>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.cli.plugin_handler.PluginHandler.dispatch" href="#connpy.cli.plugin_handler.PluginHandler.dispatch">dispatch</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,320 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.cli.profile_handler API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target .name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.profile_handler</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.profile_handler.ProfileHandler"><code class="flex name class">
<span>class <span class="ident">ProfileHandler</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ProfileHandler:
    def __init__(self, app):
        self.app = app
        self.forms = Forms(app)

    def dispatch(self, args):
        if not self.app.case:
            args.data[0] = args.data[0].lower()
        actions = {"add": self.add, "del": self.delete, "mod": self.modify, "show": self.show}
        return actions.get(args.action)(args)

    def delete(self, args):
        name = args.data[0]
        try:
            self.app.services.profiles.get_profile(name)
        except ProfileNotFoundError:
            printer.error(f"{name} not found")
            sys.exit(2)

        if name == "default":
            printer.error("Can't delete default profile")
            sys.exit(6)

        question = [inquirer.Confirm("delete", message=f"Are you sure you want to delete {name}?")]
        confirm = inquirer.prompt(question)
        if confirm == None or not confirm["delete"]:
            sys.exit(7)

        try:
            self.app.services.profiles.delete_profile(name)
            printer.success(f"{name} deleted successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(8)

    def show(self, args):
        try:
            profile = self.app.services.profiles.get_profile(args.data[0])
            yaml_output = yaml.dump(profile, sort_keys=False, default_flow_style=False)
            printer.data(args.data[0], yaml_output)
        except ProfileNotFoundError:
            printer.error(f"{args.data[0]} not found")
            sys.exit(2)

    def add(self, args):
        name = args.data[0]
        if name in self.app.services.profiles.list_profiles():
            printer.error(f"Profile '{name}' already exists.")
            sys.exit(4)

        new_profile_data = self.forms.questions_profiles(name)
        if not new_profile_data:
            sys.exit(7)

        try:
            self.app.services.profiles.add_profile(name, new_profile_data)
            printer.success(f"{name} added successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)

    def modify(self, args):
        name = args.data[0]
        try:
            profile = self.app.services.profiles.get_profile(name, resolve=False)
        except ProfileNotFoundError:
            printer.error(f"Profile '{name}' not found")
            sys.exit(2)

        old_profile = {"id": name, **profile}
        edits = self.forms.questions_edit()
        if edits == None:
            sys.exit(7)

        update_profile_data = self.forms.questions_profiles(name, edit=edits)
        if not update_profile_data:
            sys.exit(7)

        if sorted(update_profile_data.items()) == sorted(old_profile.items()):
            printer.info("Nothing to do here")
            return

        try:
            self.app.services.profiles.update_profile(name, update_profile_data)
            printer.success(f"{name} edited successfully")
        except ConnpyError as e:
            printer.error(str(e))
            sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.profile_handler.ProfileHandler.add"><code class="name flex">
<span>def <span class="ident">add</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def add(self, args):
    name = args.data[0]
    if name in self.app.services.profiles.list_profiles():
        printer.error(f"Profile '{name}' already exists.")
        sys.exit(4)

    new_profile_data = self.forms.questions_profiles(name)
    if not new_profile_data:
        sys.exit(7)

    try:
        self.app.services.profiles.add_profile(name, new_profile_data)
        printer.success(f"{name} added successfully")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.profile_handler.ProfileHandler.delete"><code class="name flex">
<span>def <span class="ident">delete</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def delete(self, args):
    name = args.data[0]
    try:
        self.app.services.profiles.get_profile(name)
    except ProfileNotFoundError:
        printer.error(f"{name} not found")
        sys.exit(2)

    if name == "default":
        printer.error("Can't delete default profile")
        sys.exit(6)

    question = [inquirer.Confirm("delete", message=f"Are you sure you want to delete {name}?")]
    confirm = inquirer.prompt(question)
    if confirm == None or not confirm["delete"]:
        sys.exit(7)

    try:
        self.app.services.profiles.delete_profile(name)
        printer.success(f"{name} deleted successfully")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(8)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.profile_handler.ProfileHandler.dispatch"><code class="name flex">
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def dispatch(self, args):
    if not self.app.case:
        args.data[0] = args.data[0].lower()
    actions = {"add": self.add, "del": self.delete, "mod": self.modify, "show": self.show}
    return actions.get(args.action)(args)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.profile_handler.ProfileHandler.modify"><code class="name flex">
<span>def <span class="ident">modify</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def modify(self, args):
    name = args.data[0]
    try:
        profile = self.app.services.profiles.get_profile(name, resolve=False)
    except ProfileNotFoundError:
        printer.error(f"Profile '{name}' not found")
        sys.exit(2)

    old_profile = {"id": name, **profile}
    edits = self.forms.questions_edit()
    if edits == None:
        sys.exit(7)

    update_profile_data = self.forms.questions_profiles(name, edit=edits)
    if not update_profile_data:
        sys.exit(7)

    if sorted(update_profile_data.items()) == sorted(old_profile.items()):
        printer.info("Nothing to do here")
        return

    try:
        self.app.services.profiles.update_profile(name, update_profile_data)
        printer.success(f"{name} edited successfully")
    except ConnpyError as e:
        printer.error(str(e))
        sys.exit(1)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.profile_handler.ProfileHandler.show"><code class="name flex">
<span>def <span class="ident">show</span></span>(<span>self, args)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def show(self, args):
    try:
        profile = self.app.services.profiles.get_profile(args.data[0])
        yaml_output = yaml.dump(profile, sort_keys=False, default_flow_style=False)
        printer.data(args.data[0], yaml_output)
    except ProfileNotFoundError:
        printer.error(f"{args.data[0]} not found")
        sys.exit(2)</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.profile_handler.ProfileHandler" href="#connpy.cli.profile_handler.ProfileHandler">ProfileHandler</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.profile_handler.ProfileHandler.add" href="#connpy.cli.profile_handler.ProfileHandler.add">add</a></code></li>
<li><code><a title="connpy.cli.profile_handler.ProfileHandler.delete" href="#connpy.cli.profile_handler.ProfileHandler.delete">delete</a></code></li>
<li><code><a title="connpy.cli.profile_handler.ProfileHandler.dispatch" href="#connpy.cli.profile_handler.ProfileHandler.dispatch">dispatch</a></code></li>
<li><code><a title="connpy.cli.profile_handler.ProfileHandler.modify" href="#connpy.cli.profile_handler.ProfileHandler.modify">modify</a></code></li>
<li><code><a title="connpy.cli.profile_handler.ProfileHandler.show" href="#connpy.cli.profile_handler.ProfileHandler.show">show</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,369 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.run_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.cli.run_handler</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.cli.run_handler.RunHandler"><code class="flex name class">
|
||||
<span>class <span class="ident">RunHandler</span></span>
|
||||
<span>(</span><span>app)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class RunHandler:
|
||||
def __init__(self, app):
|
||||
self.app = app
|
||||
|
||||
def dispatch(self, args):
|
||||
if len(args.data) > 1:
|
||||
args.action = "noderun"
|
||||
actions = {"noderun": self.node_run, "generate": self.yaml_generate, "run": self.yaml_run}
|
||||
return actions.get(args.action)(args)
|
||||
|
||||
def node_run(self, args):
|
||||
nodes_filter = args.data[0]
|
||||
commands = [" ".join(args.data[1:])]
|
||||
|
||||
try:
|
||||
header_printed = False
|
||||
# Inline execution with streaming results
|
||||
def _on_node_complete(unique, node_output, node_status):
|
||||
nonlocal header_printed
|
||||
if not header_printed:
|
||||
printer.console.print(Rule("OUTPUT", style="header"))
|
||||
header_printed = True
|
||||
printer.node_panel(unique, node_output, node_status)
|
||||
|
||||
self.app.services.execution.run_commands(
|
||||
nodes_filter=nodes_filter,
|
||||
commands=commands,
|
||||
on_node_complete=_on_node_complete
|
||||
)
|
||||
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)
|
||||
|
||||
def yaml_generate(self, args):
|
||||
if os.path.exists(args.data[0]):
|
||||
printer.error(f"File '{args.data[0]}' already exists.")
|
||||
sys.exit(14)
|
||||
else:
|
||||
with open(args.data[0], "w") as file:
|
||||
file.write(get_instructions("generate"))
|
||||
printer.success(f"File {args.data[0]} generated successfully")
|
||||
sys.exit()
|
||||
|
||||
def yaml_run(self, args):
|
||||
path = args.data[0]
|
||||
try:
|
||||
with open(path, "r") as f:
|
||||
playbook = yaml.load(f, Loader=yaml.FullLoader)
|
||||
|
||||
for task in playbook.get("tasks", []):
|
||||
self.cli_run(task)
|
||||
|
||||
except Exception as e:
|
||||
printer.error(f"Failed to run playbook {path}: {e}")
|
||||
sys.exit(10)
|
||||
|
||||
def cli_run(self, script):
|
||||
try:
|
||||
action = script["action"]
|
||||
nodelist = script["nodes"]
|
||||
commands = script["commands"]
|
||||
variables = script.get("variables")
|
||||
output_cfg = script["output"]
|
||||
name = script.get("name", "Task")
|
||||
options = script.get("options", {})
|
||||
except KeyError as e:
|
||||
printer.error(f"'{e.args[0]}' is mandatory in script")
|
||||
sys.exit(11)
|
||||
|
||||
stdout = (output_cfg == "stdout")
|
||||
folder = output_cfg if output_cfg not in [None, "stdout"] else None
|
||||
prompt = options.get("prompt")
|
||||
printer.header(name.upper())
|
||||
|
||||
try:
|
||||
if action == "run":
|
||||
# If stdout is true, we stream results as they arrive
|
||||
on_complete = printer.node_panel if stdout else None
|
||||
results = self.app.services.execution.run_commands(
|
||||
nodes_filter=nodelist,
|
||||
commands=commands,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
folder=folder,
|
||||
prompt=prompt,
|
||||
on_node_complete=on_complete
|
||||
)
|
||||
# If not streaming, we could print a summary table here if needed
|
||||
if not stdout:
|
||||
for unique, output in results.items():
|
||||
printer.node_panel(unique, output, 0)
|
||||
|
||||
elif action == "test":
|
||||
expected = script.get("expected", [])
|
||||
on_complete = printer.test_panel if stdout else None
|
||||
results = self.app.services.execution.test_commands(
|
||||
nodes_filter=nodelist,
|
||||
commands=commands,
|
||||
expected=expected,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
prompt=prompt,
|
||||
on_node_complete=on_complete
|
||||
)
|
||||
if not stdout:
|
||||
printer.test_summary(results)
|
||||
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
<h3>Methods</h3>
|
||||
<dl>
|
||||
<dt id="connpy.cli.run_handler.RunHandler.cli_run"><code class="name flex">
|
||||
<span>def <span class="ident">cli_run</span></span>(<span>self, script)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def cli_run(self, script):
|
||||
try:
|
||||
action = script["action"]
|
||||
nodelist = script["nodes"]
|
||||
commands = script["commands"]
|
||||
variables = script.get("variables")
|
||||
output_cfg = script["output"]
|
||||
name = script.get("name", "Task")
|
||||
options = script.get("options", {})
|
||||
except KeyError as e:
|
||||
printer.error(f"'{e.args[0]}' is mandatory in script")
|
||||
sys.exit(11)
|
||||
|
||||
stdout = (output_cfg == "stdout")
|
||||
folder = output_cfg if output_cfg not in [None, "stdout"] else None
|
||||
prompt = options.get("prompt")
|
||||
printer.header(name.upper())
|
||||
|
||||
try:
|
||||
if action == "run":
|
||||
# If stdout is true, we stream results as they arrive
|
||||
on_complete = printer.node_panel if stdout else None
|
||||
results = self.app.services.execution.run_commands(
|
||||
nodes_filter=nodelist,
|
||||
commands=commands,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
folder=folder,
|
||||
prompt=prompt,
|
||||
on_node_complete=on_complete
|
||||
)
|
||||
# If not streaming, we could print a summary table here if needed
|
||||
if not stdout:
|
||||
for unique, output in results.items():
|
||||
printer.node_panel(unique, output, 0)
|
||||
|
||||
elif action == "test":
|
||||
expected = script.get("expected", [])
|
||||
on_complete = printer.test_panel if stdout else None
|
||||
results = self.app.services.execution.test_commands(
|
||||
nodes_filter=nodelist,
|
||||
commands=commands,
|
||||
expected=expected,
|
||||
variables=variables,
|
||||
parallel=options.get("parallel", 10),
|
||||
timeout=options.get("timeout", 10),
|
||||
prompt=prompt,
|
||||
on_node_complete=on_complete
|
||||
)
|
||||
if not stdout:
|
||||
printer.test_summary(results)
|
||||
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
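A sketch of the task dictionary `cli_run()` consumes, mirroring its own key handling: `action`, `nodes`, `commands` and `output` are mandatory (a missing one triggers the `KeyError` branch), while the rest fall back to defaults via `.get()`. Field values here are illustrative, not taken from a real inventory.

```python
# Hypothetical task for cli_run(); node filter and commands are examples.
task = {
    "action": "run",                       # "run" or "test"
    "nodes": ["@office"],                  # filter passed to the execution service
    "commands": ["show version"],
    "output": "stdout",                    # "stdout" streams panels as nodes finish
    "name": "Version check",
    "options": {"parallel": 10, "timeout": 10},
}

# Mirror cli_run's mandatory-key check and output handling
mandatory = ("action", "nodes", "commands", "output")
missing = [key for key in mandatory if key not in task]

stdout = task["output"] == "stdout"
folder = task["output"] if task["output"] not in (None, "stdout") else None
```

With `output` set to `"stdout"` the results stream as they arrive; any other string is treated as a destination folder.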
|
||||
</dd>
|
||||
<dt id="connpy.cli.run_handler.RunHandler.dispatch"><code class="name flex">
|
||||
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def dispatch(self, args):
|
||||
if len(args.data) > 1:
|
||||
args.action = "noderun"
|
||||
actions = {"noderun": self.node_run, "generate": self.yaml_generate, "run": self.yaml_run}
|
||||
return actions.get(args.action)(args)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.run_handler.RunHandler.node_run"><code class="name flex">
|
||||
<span>def <span class="ident">node_run</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def node_run(self, args):
|
||||
nodes_filter = args.data[0]
|
||||
commands = [" ".join(args.data[1:])]
|
||||
|
||||
try:
|
||||
header_printed = False
|
||||
# Inline execution with streaming results
|
||||
def _on_node_complete(unique, node_output, node_status):
|
||||
nonlocal header_printed
|
||||
if not header_printed:
|
||||
printer.console.print(Rule("OUTPUT", style="header"))
|
||||
header_printed = True
|
||||
printer.node_panel(unique, node_output, node_status)
|
||||
|
||||
self.app.services.execution.run_commands(
|
||||
nodes_filter=nodes_filter,
|
||||
commands=commands,
|
||||
on_node_complete=_on_node_complete
|
||||
)
|
||||
|
||||
except ConnpyError as e:
|
||||
printer.error(str(e))
|
||||
sys.exit(1)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
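`node_run()` splits its positional arguments in a fixed way: the first entry of `args.data` is the node filter, and everything after it is joined into a single command string. A minimal sketch of that parsing, with example arguments:

```python
# Example args.data as produced by the CLI; values are illustrative.
data = ["@routers", "show", "ip", "route"]

# node_run() treats the first element as the node filter and joins the rest
nodes_filter = data[0]
commands = [" ".join(data[1:])]
```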
|
||||
</dd>
|
||||
<dt id="connpy.cli.run_handler.RunHandler.yaml_generate"><code class="name flex">
|
||||
<span>def <span class="ident">yaml_generate</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def yaml_generate(self, args):
|
||||
if os.path.exists(args.data[0]):
|
||||
printer.error(f"File '{args.data[0]}' already exists.")
|
||||
sys.exit(14)
|
||||
else:
|
||||
with open(args.data[0], "w") as file:
|
||||
file.write(get_instructions("generate"))
|
||||
printer.success(f"File {args.data[0]} generated successfully")
|
||||
sys.exit()</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.run_handler.RunHandler.yaml_run"><code class="name flex">
|
||||
<span>def <span class="ident">yaml_run</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def yaml_run(self, args):
|
||||
path = args.data[0]
|
||||
try:
|
||||
with open(path, "r") as f:
|
||||
playbook = yaml.load(f, Loader=yaml.FullLoader)
|
||||
|
||||
for task in playbook.get("tasks", []):
|
||||
self.cli_run(task)
|
||||
|
||||
except Exception as e:
|
||||
printer.error(f"Failed to run playbook {path}: {e}")
|
||||
sys.exit(10)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
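After parsing, `yaml_run()` simply iterates the playbook's top-level `tasks` list and hands each entry to `cli_run()`. The shape of a parsed playbook can be sketched as a plain dictionary (task contents here are examples):

```python
# Parsed playbook shape as yaml_run() iterates it; entries are illustrative.
playbook = {
    "tasks": [
        {"action": "run", "nodes": ["@lab"], "commands": ["show clock"],
         "output": "stdout"},
        {"action": "test", "nodes": ["@lab"], "commands": ["ping 8.8.8.8"],
         "expected": ["!!!"], "output": "stdout"},
    ],
}

# Each task would be passed to cli_run(task) in order
executed = [task["action"] for task in playbook.get("tasks", [])]
```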
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-classes">Classes</a></h3>
|
||||
<ul>
|
||||
<li>
|
||||
<h4><code><a title="connpy.cli.run_handler.RunHandler" href="#connpy.cli.run_handler.RunHandler">RunHandler</a></code></h4>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.cli.run_handler.RunHandler.cli_run" href="#connpy.cli.run_handler.RunHandler.cli_run">cli_run</a></code></li>
|
||||
<li><code><a title="connpy.cli.run_handler.RunHandler.dispatch" href="#connpy.cli.run_handler.RunHandler.dispatch">dispatch</a></code></li>
|
||||
<li><code><a title="connpy.cli.run_handler.RunHandler.node_run" href="#connpy.cli.run_handler.RunHandler.node_run">node_run</a></code></li>
|
||||
<li><code><a title="connpy.cli.run_handler.RunHandler.yaml_generate" href="#connpy.cli.run_handler.RunHandler.yaml_generate">yaml_generate</a></code></li>
|
||||
<li><code><a title="connpy.cli.run_handler.RunHandler.yaml_run" href="#connpy.cli.run_handler.RunHandler.yaml_run">yaml_run</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,433 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.sync_handler API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.cli.sync_handler</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler"><code class="flex name class">
|
||||
<span>class <span class="ident">SyncHandler</span></span>
|
||||
<span>(</span><span>app)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class SyncHandler:
|
||||
def __init__(self, app):
|
||||
self.app = app
|
||||
|
||||
def dispatch(self, args):
|
||||
action = getattr(args, "action", None)
|
||||
actions = {
|
||||
"login": self.login,
|
||||
"logout": self.logout,
|
||||
"status": self.status,
|
||||
"list": self.list_backups,
|
||||
"once": self.once,
|
||||
"restore": self.restore,
|
||||
"start": self.start,
|
||||
"stop": self.stop
|
||||
}
|
||||
handler = actions.get(action)
|
||||
if handler:
|
||||
return handler(args)
|
||||
|
||||
return self.status(args)
|
||||
|
||||
def login(self, args):
|
||||
self.app.services.sync.login()
|
||||
|
||||
def logout(self, args):
|
||||
self.app.services.sync.logout()
|
||||
|
||||
def status(self, args):
|
||||
status = self.app.services.sync.check_login_status()
|
||||
enabled = self.app.services.sync.sync_enabled
|
||||
remote = self.app.services.sync.sync_remote
|
||||
|
||||
printer.info(f"Login Status: {status}")
|
||||
printer.info(f"Auto-Sync: {'Enabled' if enabled else 'Disabled'}")
|
||||
printer.info(f"Sync Remote Nodes: {'Yes' if remote else 'No'}")
|
||||
|
||||
def list_backups(self, args):
|
||||
backups = self.app.services.sync.list_backups()
|
||||
if backups:
|
||||
yaml_output = yaml.dump(backups, sort_keys=False, default_flow_style=False)
|
||||
printer.custom("backups", "")
|
||||
print(yaml_output)
|
||||
else:
|
||||
printer.info("No backups found or not logged in.")
|
||||
|
||||
def once(self, args):
|
||||
# Manual backup. We check if we should include remote nodes
|
||||
remote_data = None
|
||||
if self.app.services.sync.sync_remote and self.app.services.mode == "remote":
|
||||
inventory = self.app.services.nodes.get_inventory()
|
||||
# Merge with local settings
|
||||
local_settings = self.app.services.config_svc.get_settings()
|
||||
local_settings.pop("configfolder", None)
|
||||
|
||||
# Maintain proper config structure: {config: {}, connections: {}, profiles: {}}
|
||||
remote_data = {
|
||||
"config": local_settings,
|
||||
"connections": inventory.get("connections", {}),
|
||||
"profiles": inventory.get("profiles", {})
|
||||
}
|
||||
|
||||
if self.app.services.sync.compress_and_upload(remote_data):
|
||||
printer.success("Manual backup completed.")
|
||||
|
||||
def restore(self, args):
|
||||
import inquirer
|
||||
file_id = getattr(args, "id", None)
|
||||
|
||||
# Segmented flags
|
||||
restore_config = getattr(args, "restore_config", False)
|
||||
restore_nodes = getattr(args, "restore_nodes", False)
|
||||
|
||||
# If neither is specified, we restore ALL (backwards compatibility)
|
||||
if not restore_config and not restore_nodes:
|
||||
restore_config = True
|
||||
restore_nodes = True
|
||||
|
||||
# 1. Analyze what we are about to restore
|
||||
info = self.app.services.sync.analyze_backup_content(file_id)
|
||||
if not info:
|
||||
printer.error("Could not analyze backup content.")
|
||||
return
|
||||
|
||||
# 2. Show detailed info
|
||||
printer.info("Restoration Details:")
|
||||
if restore_config:
|
||||
print(f" - Local Settings: Yes")
|
||||
print(f" - RSA Key (.osk): {'Yes' if info['has_key'] else 'No'}")
|
||||
if restore_nodes:
|
||||
target = "REMOTE" if self.app.services.mode == "remote" else "LOCAL"
|
||||
print(f" - Nodes: {info['nodes']}")
|
||||
print(f" - Folders: {info['folders']}")
|
||||
print(f" - Profiles: {info['profiles']}")
|
||||
print(f" - Destination: {target}")
|
||||
print("")
|
||||
|
||||
questions = [inquirer.Confirm("confirm", message="Do you want to proceed with the restoration?", default=False)]
|
||||
answers = inquirer.prompt(questions)
|
||||
|
||||
if not answers or not answers["confirm"]:
|
||||
printer.info("Restore cancelled.")
|
||||
return
|
||||
|
||||
# 3. Perform the actual restore
|
||||
if self.app.services.sync.restore_backup(
|
||||
file_id=file_id,
|
||||
restore_config=restore_config,
|
||||
restore_nodes=restore_nodes,
|
||||
app_instance=self.app
|
||||
):
|
||||
printer.success("Restore completed successfully.")
|
||||
|
||||
def start(self, args):
|
||||
self.app.services.config_svc.update_setting("sync", True)
|
||||
self.app.services.sync.sync_enabled = True
|
||||
printer.success("Auto-sync enabled.")
|
||||
|
||||
def stop(self, args):
|
||||
self.app.services.config_svc.update_setting("sync", False)
|
||||
self.app.services.sync.sync_enabled = False
|
||||
printer.success("Auto-sync disabled.")</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
<h3>Methods</h3>
|
||||
<dl>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.dispatch"><code class="name flex">
|
||||
<span>def <span class="ident">dispatch</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def dispatch(self, args):
|
||||
action = getattr(args, "action", None)
|
||||
actions = {
|
||||
"login": self.login,
|
||||
"logout": self.logout,
|
||||
"status": self.status,
|
||||
"list": self.list_backups,
|
||||
"once": self.once,
|
||||
"restore": self.restore,
|
||||
"start": self.start,
|
||||
"stop": self.stop
|
||||
}
|
||||
handler = actions.get(action)
|
||||
if handler:
|
||||
return handler(args)
|
||||
|
||||
return self.status(args)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
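The dispatch pattern used here can be reduced to a small sketch: resolve the handler by action name and fall back to a default (the status view) when the action is missing or unknown. The function and action names below are illustrative, not part of the real API.

```python
# Generic sketch of SyncHandler.dispatch(): table lookup with a fallback.
def dispatch(action, actions, default):
    handler = actions.get(action)
    if handler:
        return handler()
    return default()

# Hypothetical action table standing in for the real handler methods
actions = {"login": lambda: "login", "logout": lambda: "logout"}
result_known = dispatch("login", actions, lambda: "status")
result_unknown = dispatch(None, actions, lambda: "status")
```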
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.list_backups"><code class="name flex">
|
||||
<span>def <span class="ident">list_backups</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def list_backups(self, args):
|
||||
backups = self.app.services.sync.list_backups()
|
||||
if backups:
|
||||
yaml_output = yaml.dump(backups, sort_keys=False, default_flow_style=False)
|
||||
printer.custom("backups", "")
|
||||
print(yaml_output)
|
||||
else:
|
||||
printer.info("No backups found or not logged in.")</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.login"><code class="name flex">
|
||||
<span>def <span class="ident">login</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def login(self, args):
|
||||
self.app.services.sync.login()</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.logout"><code class="name flex">
|
||||
<span>def <span class="ident">logout</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def logout(self, args):
|
||||
self.app.services.sync.logout()</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.once"><code class="name flex">
|
||||
<span>def <span class="ident">once</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def once(self, args):
|
||||
# Manual backup. We check if we should include remote nodes
|
||||
remote_data = None
|
||||
if self.app.services.sync.sync_remote and self.app.services.mode == "remote":
|
||||
inventory = self.app.services.nodes.get_inventory()
|
||||
# Merge with local settings
|
||||
local_settings = self.app.services.config_svc.get_settings()
|
||||
local_settings.pop("configfolder", None)
|
||||
|
||||
# Maintain proper config structure: {config: {}, connections: {}, profiles: {}}
|
||||
remote_data = {
|
||||
"config": local_settings,
|
||||
"connections": inventory.get("connections", {}),
|
||||
"profiles": inventory.get("profiles", {})
|
||||
}
|
||||
|
||||
if self.app.services.sync.compress_and_upload(remote_data):
|
||||
printer.success("Manual backup completed.")</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
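When remote nodes are included, `once()` assembles the backup payload from the local settings (minus `configfolder`, which is machine-specific) and the remote inventory, preserving the standard `{config, connections, profiles}` layout. A sketch with illustrative data:

```python
# Example inputs; real values come from config_svc and the node service.
local_settings = {"configfolder": "/home/user/.config/conn", "theme": "dark"}
inventory = {
    "connections": {"router1": {"host": "10.0.0.1"}},
    "profiles": {"default": {}},
}

# Drop the machine-specific path, then build the backup payload
local_settings.pop("configfolder", None)
remote_data = {
    "config": local_settings,
    "connections": inventory.get("connections", {}),
    "profiles": inventory.get("profiles", {}),
}
```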
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.restore"><code class="name flex">
|
||||
<span>def <span class="ident">restore</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def restore(self, args):
|
||||
import inquirer
|
||||
file_id = getattr(args, "id", None)
|
||||
|
||||
# Segmented flags
|
||||
restore_config = getattr(args, "restore_config", False)
|
||||
restore_nodes = getattr(args, "restore_nodes", False)
|
||||
|
||||
# If neither is specified, we restore ALL (backwards compatibility)
|
||||
if not restore_config and not restore_nodes:
|
||||
restore_config = True
|
||||
restore_nodes = True
|
||||
|
||||
# 1. Analyze what we are about to restore
|
||||
info = self.app.services.sync.analyze_backup_content(file_id)
|
||||
if not info:
|
||||
printer.error("Could not analyze backup content.")
|
||||
return
|
||||
|
||||
# 2. Show detailed info
|
||||
printer.info("Restoration Details:")
|
||||
if restore_config:
|
||||
print(f" - Local Settings: Yes")
|
||||
print(f" - RSA Key (.osk): {'Yes' if info['has_key'] else 'No'}")
|
||||
if restore_nodes:
|
||||
target = "REMOTE" if self.app.services.mode == "remote" else "LOCAL"
|
||||
print(f" - Nodes: {info['nodes']}")
|
||||
print(f" - Folders: {info['folders']}")
|
||||
print(f" - Profiles: {info['profiles']}")
|
||||
print(f" - Destination: {target}")
|
||||
print("")
|
||||
|
||||
questions = [inquirer.Confirm("confirm", message="Do you want to proceed with the restoration?", default=False)]
|
||||
answers = inquirer.prompt(questions)
|
||||
|
||||
if not answers or not answers["confirm"]:
|
||||
printer.info("Restore cancelled.")
|
||||
return
|
||||
|
||||
# 3. Perform the actual restore
|
||||
if self.app.services.sync.restore_backup(
|
||||
file_id=file_id,
|
||||
restore_config=restore_config,
|
||||
restore_nodes=restore_nodes,
|
||||
app_instance=self.app
|
||||
):
|
||||
printer.success("Restore completed successfully.")</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
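The flag defaulting at the top of `restore()` keeps the command backwards compatible: if neither segmented flag is passed, everything is restored. That rule can be isolated in a small helper (the function name is hypothetical, for illustration only):

```python
# Hypothetical helper capturing restore()'s flag-defaulting rule.
def resolve_restore_flags(restore_config=False, restore_nodes=False):
    # Neither flag given -> restore both config and nodes (legacy behaviour)
    if not restore_config and not restore_nodes:
        return True, True
    return restore_config, restore_nodes
```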
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.start"><code class="name flex">
|
||||
<span>def <span class="ident">start</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def start(self, args):
|
||||
self.app.services.config_svc.update_setting("sync", True)
|
||||
self.app.services.sync.sync_enabled = True
|
||||
printer.success("Auto-sync enabled.")</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.status"><code class="name flex">
|
||||
<span>def <span class="ident">status</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def status(self, args):
|
||||
status = self.app.services.sync.check_login_status()
|
||||
enabled = self.app.services.sync.sync_enabled
|
||||
remote = self.app.services.sync.sync_remote
|
||||
|
||||
printer.info(f"Login Status: {status}")
|
||||
printer.info(f"Auto-Sync: {'Enabled' if enabled else 'Disabled'}")
|
||||
printer.info(f"Sync Remote Nodes: {'Yes' if remote else 'No'}")</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt id="connpy.cli.sync_handler.SyncHandler.stop"><code class="name flex">
|
||||
<span>def <span class="ident">stop</span></span>(<span>self, args)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def stop(self, args):
|
||||
self.app.services.config_svc.update_setting("sync", False)
|
||||
self.app.services.sync.sync_enabled = False
|
||||
printer.success("Auto-sync disabled.")</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-classes">Classes</a></h3>
|
||||
<ul>
|
||||
<li>
|
||||
<h4><code><a title="connpy.cli.sync_handler.SyncHandler" href="#connpy.cli.sync_handler.SyncHandler">SyncHandler</a></code></h4>
|
||||
<ul class="two-column">
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.dispatch" href="#connpy.cli.sync_handler.SyncHandler.dispatch">dispatch</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.list_backups" href="#connpy.cli.sync_handler.SyncHandler.list_backups">list_backups</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.login" href="#connpy.cli.sync_handler.SyncHandler.login">login</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.logout" href="#connpy.cli.sync_handler.SyncHandler.logout">logout</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.once" href="#connpy.cli.sync_handler.SyncHandler.once">once</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.restore" href="#connpy.cli.sync_handler.SyncHandler.restore">restore</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.start" href="#connpy.cli.sync_handler.SyncHandler.start">start</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.status" href="#connpy.cli.sync_handler.SyncHandler.status">status</a></code></li>
|
||||
<li><code><a title="connpy.cli.sync_handler.SyncHandler.stop" href="#connpy.cli.sync_handler.SyncHandler.stop">stop</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,514 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.cli.validators API documentation</title>
|
||||
<meta name="description" content="">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.cli.validators</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.cli.validators.Validators"><code class="flex name class">
<span>class <span class="ident">Validators</span></span>
<span>(</span><span>app)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class Validators:
    def __init__(self, app):
        self.app = app

    def host_validation(self, answers, current, regex = "^.+$"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def profile_protocol_validation(self, answers, current, regex = "(^ssh$|^telnet$|^kubectl$|^docker$|^$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick between ssh, telnet, kubectl, docker or leave empty")
        return True

    def protocol_validation(self, answers, current, regex = "(^ssh$|^telnet$|^kubectl$|^docker$|^$|^@.+$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick between ssh, telnet, kubectl, docker, @profile or leave empty")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def profile_port_validation(self, answers, current, regex = "(^[0-9]*$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535 or leave empty")
        try:
            port = int(current)
        except ValueError:
            port = 0
        if current != "" and not 1 <= port <= 65535:
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535 or leave empty")
        return True

    def port_validation(self, answers, current, regex = "(^[0-9]*$|^@.+$)"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535, @profile or leave empty")
        try:
            port = int(current)
        except ValueError:
            port = 0
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        elif current != "" and not 1 <= port <= 65535:
            raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535, @profile or leave empty")
        return True

    def pass_validation(self, answers, current, regex = "(^@.+$)"):
        profiles = current.split(",")
        for i in profiles:
            if not re.match(regex, i) or i[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(i))
        return True

    def tags_validation(self, answers, current):
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        elif current != "":
            isdict = False
            try:
                isdict = ast.literal_eval(current)
            except Exception:
                pass
            if not isinstance(isdict, dict):
                raise inquirer.errors.ValidationError("", reason="Tags should be a Python dictionary.")
        return True

    def profile_tags_validation(self, answers, current):
        if current != "":
            isdict = False
            try:
                isdict = ast.literal_eval(current)
            except Exception:
                pass
            if not isinstance(isdict, dict):
                raise inquirer.errors.ValidationError("", reason="Tags should be a Python dictionary.")
        return True

    def jumphost_validation(self, answers, current):
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        elif current != "":
            if current not in self.app.nodes_list:
                raise inquirer.errors.ValidationError("", reason="Node {} doesn't exist.".format(current))
        return True

    def profile_jumphost_validation(self, answers, current):
        if current != "":
            if current not in self.app.nodes_list:
                raise inquirer.errors.ValidationError("", reason="Node {} doesn't exist.".format(current))
        return True

    def default_validation(self, answers, current):
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def bulk_node_validation(self, answers, current, regex = "^[0-9a-zA-Z_.,$#-]+$"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        return True

    def bulk_folder_validation(self, answers, current):
        if not self.app.case:
            current = current.lower()

        candidate = current
        if "/" in current:
            candidate = current.split("/")[0]

        matches = list(filter(lambda k: k == candidate, self.app.folders))
        if current != "" and len(matches) == 0:
            raise inquirer.errors.ValidationError("", reason="Location {} doesn't exist".format(current))
        return True

    def bulk_host_validation(self, answers, current, regex = "^.+$"):
        if not re.match(regex, current):
            raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
        if current.startswith("@"):
            if current[1:] not in self.app.profiles:
                raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
        hosts = current.split(",")
        nodes = answers["ids"].split(",")
        if len(hosts) > 1 and len(hosts) != len(nodes):
            raise inquirer.errors.ValidationError("", reason="Hosts list should be the same length as the nodes list")
        return True</code></pre>
</details>
<div class="desc"></div>
<h3>Methods</h3>
<dl>
<dt id="connpy.cli.validators.Validators.bulk_folder_validation"><code class="name flex">
<span>def <span class="ident">bulk_folder_validation</span></span>(<span>self, answers, current)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def bulk_folder_validation(self, answers, current):
    if not self.app.case:
        current = current.lower()

    candidate = current
    if "/" in current:
        candidate = current.split("/")[0]

    matches = list(filter(lambda k: k == candidate, self.app.folders))
    if current != "" and len(matches) == 0:
        raise inquirer.errors.ValidationError("", reason="Location {} doesn't exist".format(current))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.bulk_host_validation"><code class="name flex">
<span>def <span class="ident">bulk_host_validation</span></span>(<span>self, answers, current, regex='^.+$')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def bulk_host_validation(self, answers, current, regex = "^.+$"):
    if not re.match(regex, current):
        raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    hosts = current.split(",")
    nodes = answers["ids"].split(",")
    if len(hosts) > 1 and len(hosts) != len(nodes):
        raise inquirer.errors.ValidationError("", reason="Hosts list should be the same length as the nodes list")
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.bulk_node_validation"><code class="name flex">
<span>def <span class="ident">bulk_node_validation</span></span>(<span>self, answers, current, regex='^[0-9a-zA-Z_.,$#-]+$')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def bulk_node_validation(self, answers, current, regex = "^[0-9a-zA-Z_.,$#-]+$"):
    if not re.match(regex, current):
        raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.default_validation"><code class="name flex">
<span>def <span class="ident">default_validation</span></span>(<span>self, answers, current)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def default_validation(self, answers, current):
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.host_validation"><code class="name flex">
<span>def <span class="ident">host_validation</span></span>(<span>self, answers, current, regex='^.+$')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def host_validation(self, answers, current, regex = "^.+$"):
    if not re.match(regex, current):
        raise inquirer.errors.ValidationError("", reason="Host cannot be empty")
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.jumphost_validation"><code class="name flex">
<span>def <span class="ident">jumphost_validation</span></span>(<span>self, answers, current)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def jumphost_validation(self, answers, current):
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    elif current != "":
        if current not in self.app.nodes_list:
            raise inquirer.errors.ValidationError("", reason="Node {} doesn't exist.".format(current))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.pass_validation"><code class="name flex">
<span>def <span class="ident">pass_validation</span></span>(<span>self, answers, current, regex='(^@.+$)')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def pass_validation(self, answers, current, regex = "(^@.+$)"):
    profiles = current.split(",")
    for i in profiles:
        if not re.match(regex, i) or i[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(i))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.port_validation"><code class="name flex">
<span>def <span class="ident">port_validation</span></span>(<span>self, answers, current, regex='(^[0-9]*$|^@.+$)')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def port_validation(self, answers, current, regex = "(^[0-9]*$|^@.+$)"):
    if not re.match(regex, current):
        raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535, @profile or leave empty")
    try:
        port = int(current)
    except ValueError:
        port = 0
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    elif current != "" and not 1 <= port <= 65535:
        raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535, @profile or leave empty")
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.profile_jumphost_validation"><code class="name flex">
<span>def <span class="ident">profile_jumphost_validation</span></span>(<span>self, answers, current)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def profile_jumphost_validation(self, answers, current):
    if current != "":
        if current not in self.app.nodes_list:
            raise inquirer.errors.ValidationError("", reason="Node {} doesn't exist.".format(current))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.profile_port_validation"><code class="name flex">
<span>def <span class="ident">profile_port_validation</span></span>(<span>self, answers, current, regex='(^[0-9]*$)')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def profile_port_validation(self, answers, current, regex = "(^[0-9]*$)"):
    if not re.match(regex, current):
        raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535 or leave empty")
    try:
        port = int(current)
    except ValueError:
        port = 0
    if current != "" and not 1 <= port <= 65535:
        raise inquirer.errors.ValidationError("", reason="Pick a port between 1-65535 or leave empty")
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.profile_protocol_validation"><code class="name flex">
<span>def <span class="ident">profile_protocol_validation</span></span>(<span>self, answers, current, regex='(^ssh$|^telnet$|^kubectl$|^docker$|^$)')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def profile_protocol_validation(self, answers, current, regex = "(^ssh$|^telnet$|^kubectl$|^docker$|^$)"):
    if not re.match(regex, current):
        raise inquirer.errors.ValidationError("", reason="Pick between ssh, telnet, kubectl, docker or leave empty")
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.profile_tags_validation"><code class="name flex">
<span>def <span class="ident">profile_tags_validation</span></span>(<span>self, answers, current)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def profile_tags_validation(self, answers, current):
    if current != "":
        isdict = False
        try:
            isdict = ast.literal_eval(current)
        except Exception:
            pass
        if not isinstance(isdict, dict):
            raise inquirer.errors.ValidationError("", reason="Tags should be a Python dictionary.")
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.protocol_validation"><code class="name flex">
<span>def <span class="ident">protocol_validation</span></span>(<span>self, answers, current, regex='(^ssh$|^telnet$|^kubectl$|^docker$|^$|^@.+$)')</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def protocol_validation(self, answers, current, regex = "(^ssh$|^telnet$|^kubectl$|^docker$|^$|^@.+$)"):
    if not re.match(regex, current):
        raise inquirer.errors.ValidationError("", reason="Pick between ssh, telnet, kubectl, docker, @profile or leave empty")
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.cli.validators.Validators.tags_validation"><code class="name flex">
<span>def <span class="ident">tags_validation</span></span>(<span>self, answers, current)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def tags_validation(self, answers, current):
    if current.startswith("@"):
        if current[1:] not in self.app.profiles:
            raise inquirer.errors.ValidationError("", reason="Profile {} doesn't exist".format(current))
    elif current != "":
        isdict = False
        try:
            isdict = ast.literal_eval(current)
        except Exception:
            pass
        if not isinstance(isdict, dict):
            raise inquirer.errors.ValidationError("", reason="Tags should be a Python dictionary.")
    return True</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.cli" href="index.html">connpy.cli</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.cli.validators.Validators" href="#connpy.cli.validators.Validators">Validators</a></code></h4>
<ul class="">
<li><code><a title="connpy.cli.validators.Validators.bulk_folder_validation" href="#connpy.cli.validators.Validators.bulk_folder_validation">bulk_folder_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.bulk_host_validation" href="#connpy.cli.validators.Validators.bulk_host_validation">bulk_host_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.bulk_node_validation" href="#connpy.cli.validators.Validators.bulk_node_validation">bulk_node_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.default_validation" href="#connpy.cli.validators.Validators.default_validation">default_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.host_validation" href="#connpy.cli.validators.Validators.host_validation">host_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.jumphost_validation" href="#connpy.cli.validators.Validators.jumphost_validation">jumphost_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.pass_validation" href="#connpy.cli.validators.Validators.pass_validation">pass_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.port_validation" href="#connpy.cli.validators.Validators.port_validation">port_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.profile_jumphost_validation" href="#connpy.cli.validators.Validators.profile_jumphost_validation">profile_jumphost_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.profile_port_validation" href="#connpy.cli.validators.Validators.profile_port_validation">profile_port_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.profile_protocol_validation" href="#connpy.cli.validators.Validators.profile_protocol_validation">profile_protocol_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.profile_tags_validation" href="#connpy.cli.validators.Validators.profile_tags_validation">profile_tags_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.protocol_validation" href="#connpy.cli.validators.Validators.protocol_validation">protocol_validation</a></code></li>
<li><code><a title="connpy.cli.validators.Validators.tags_validation" href="#connpy.cli.validators.Validators.tags_validation">tags_validation</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,799 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc.connpy_pb2 API documentation</title>
<meta name="description" content="Generated protocol buffer code.">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target .name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.grpc.connpy_pb2</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
<p>Generated protocol buffer code.</p>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.connpy_pb2.AIResponse"><code class="flex name class">
<span>class <span class="ident">AIResponse</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.AIResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.AskRequest"><code class="flex name class">
<span>class <span class="ident">AskRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.AskRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.BoolResponse"><code class="flex name class">
<span>class <span class="ident">BoolResponse</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.BoolResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.BulkRequest"><code class="flex name class">
<span>class <span class="ident">BulkRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.BulkRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.DeleteRequest"><code class="flex name class">
<span>class <span class="ident">DeleteRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.DeleteRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.ExportRequest"><code class="flex name class">
<span>class <span class="ident">ExportRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.ExportRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.FilterRequest"><code class="flex name class">
<span>class <span class="ident">FilterRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.FilterRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.FullReplaceRequest"><code class="flex name class">
<span>class <span class="ident">FullReplaceRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.FullReplaceRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.IdRequest"><code class="flex name class">
<span>class <span class="ident">IdRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.IdRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.IntRequest"><code class="flex name class">
<span>class <span class="ident">IntRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.IntRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.InteractRequest"><code class="flex name class">
<span>class <span class="ident">InteractRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.InteractRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.InteractResponse"><code class="flex name class">
<span>class <span class="ident">InteractResponse</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.InteractResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.ListRequest"><code class="flex name class">
<span>class <span class="ident">ListRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.ListRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.MessageValue"><code class="flex name class">
<span>class <span class="ident">MessageValue</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.MessageValue.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.MoveRequest"><code class="flex name class">
<span>class <span class="ident">MoveRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.MoveRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.NodeRequest"><code class="flex name class">
<span>class <span class="ident">NodeRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.NodeRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.NodeRunResult"><code class="flex name class">
<span>class <span class="ident">NodeRunResult</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.NodeRunResult.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.PluginRequest"><code class="flex name class">
<span>class <span class="ident">PluginRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.PluginRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.ProfileRequest"><code class="flex name class">
<span>class <span class="ident">ProfileRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.ProfileRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.ProviderRequest"><code class="flex name class">
<span>class <span class="ident">ProviderRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.ProviderRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.RunRequest"><code class="flex name class">
<span>class <span class="ident">RunRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.RunRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.ScriptRequest"><code class="flex name class">
<span>class <span class="ident">ScriptRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.ScriptRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.StringRequest"><code class="flex name class">
<span>class <span class="ident">StringRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.StringRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.StringResponse"><code class="flex name class">
<span>class <span class="ident">StringResponse</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.StringResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.StructRequest"><code class="flex name class">
<span>class <span class="ident">StructRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.StructRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.StructResponse"><code class="flex name class">
<span>class <span class="ident">StructResponse</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.StructResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.TestRequest"><code class="flex name class">
<span>class <span class="ident">TestRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.TestRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.UpdateRequest"><code class="flex name class">
<span>class <span class="ident">UpdateRequest</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.UpdateRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.connpy_pb2.ValueResponse"><code class="flex name class">
<span>class <span class="ident">ValueResponse</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<div class="desc"><p>A ProtocolMessage</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>google._upb._message.Message</li>
<li>google.protobuf.message.Message</li>
</ul>
<h3>Class variables</h3>
<dl>
<dt id="connpy.grpc.connpy_pb2.ValueResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
<dd>
<div class="desc"></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.grpc" href="index.html">connpy.grpc</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.AIResponse" href="#connpy.grpc.connpy_pb2.AIResponse">AIResponse</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.AIResponse.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.AIResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.AskRequest" href="#connpy.grpc.connpy_pb2.AskRequest">AskRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.AskRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.AskRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.BoolResponse" href="#connpy.grpc.connpy_pb2.BoolResponse">BoolResponse</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.BoolResponse.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.BoolResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.BulkRequest" href="#connpy.grpc.connpy_pb2.BulkRequest">BulkRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.BulkRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.BulkRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.DeleteRequest" href="#connpy.grpc.connpy_pb2.DeleteRequest">DeleteRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.DeleteRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.DeleteRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.ExportRequest" href="#connpy.grpc.connpy_pb2.ExportRequest">ExportRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.ExportRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.ExportRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.FilterRequest" href="#connpy.grpc.connpy_pb2.FilterRequest">FilterRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.FilterRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.FilterRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.FullReplaceRequest" href="#connpy.grpc.connpy_pb2.FullReplaceRequest">FullReplaceRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.FullReplaceRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.FullReplaceRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.IdRequest" href="#connpy.grpc.connpy_pb2.IdRequest">IdRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.IdRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.IdRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.IntRequest" href="#connpy.grpc.connpy_pb2.IntRequest">IntRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.IntRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.IntRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.InteractRequest" href="#connpy.grpc.connpy_pb2.InteractRequest">InteractRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.InteractRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.InteractRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.InteractResponse" href="#connpy.grpc.connpy_pb2.InteractResponse">InteractResponse</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.InteractResponse.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.InteractResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.ListRequest" href="#connpy.grpc.connpy_pb2.ListRequest">ListRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.ListRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.ListRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.MessageValue" href="#connpy.grpc.connpy_pb2.MessageValue">MessageValue</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.MessageValue.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.MessageValue.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.MoveRequest" href="#connpy.grpc.connpy_pb2.MoveRequest">MoveRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.MoveRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.MoveRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.NodeRequest" href="#connpy.grpc.connpy_pb2.NodeRequest">NodeRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.NodeRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.NodeRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.NodeRunResult" href="#connpy.grpc.connpy_pb2.NodeRunResult">NodeRunResult</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.NodeRunResult.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.NodeRunResult.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.PluginRequest" href="#connpy.grpc.connpy_pb2.PluginRequest">PluginRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.PluginRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.PluginRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.ProfileRequest" href="#connpy.grpc.connpy_pb2.ProfileRequest">ProfileRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.ProfileRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.ProfileRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.ProviderRequest" href="#connpy.grpc.connpy_pb2.ProviderRequest">ProviderRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.ProviderRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.ProviderRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.RunRequest" href="#connpy.grpc.connpy_pb2.RunRequest">RunRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.RunRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.RunRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.ScriptRequest" href="#connpy.grpc.connpy_pb2.ScriptRequest">ScriptRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.ScriptRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.ScriptRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.StringRequest" href="#connpy.grpc.connpy_pb2.StringRequest">StringRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.StringRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.StringRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.StringResponse" href="#connpy.grpc.connpy_pb2.StringResponse">StringResponse</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.StringResponse.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.StringResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.StructRequest" href="#connpy.grpc.connpy_pb2.StructRequest">StructRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.StructRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.StructRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.StructResponse" href="#connpy.grpc.connpy_pb2.StructResponse">StructResponse</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.StructResponse.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.StructResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.TestRequest" href="#connpy.grpc.connpy_pb2.TestRequest">TestRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.TestRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.TestRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.UpdateRequest" href="#connpy.grpc.connpy_pb2.UpdateRequest">UpdateRequest</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.UpdateRequest.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.UpdateRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.connpy_pb2.ValueResponse" href="#connpy.grpc.connpy_pb2.ValueResponse">ValueResponse</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.connpy_pb2.ValueResponse.DESCRIPTOR" href="#connpy.grpc.connpy_pb2.ValueResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
File diff suppressed because it is too large
@@ -0,0 +1,108 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Namespace <code>connpy.grpc</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-submodules">Sub-modules</h2>
|
||||
<dl>
|
||||
<dt><code class="name"><a title="connpy.grpc.connpy_pb2" href="connpy_pb2.html">connpy.grpc.connpy_pb2</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>Generated protocol buffer code.</p></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.grpc.connpy_pb2_grpc" href="connpy_pb2_grpc.html">connpy.grpc.connpy_pb2_grpc</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>Client and server classes corresponding to protobuf-defined services.</p></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.grpc.remote_plugin_pb2" href="remote_plugin_pb2.html">connpy.grpc.remote_plugin_pb2</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>Generated protocol buffer code.</p></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.grpc.remote_plugin_pb2_grpc" href="remote_plugin_pb2_grpc.html">connpy.grpc.remote_plugin_pb2_grpc</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>Client and server classes corresponding to protobuf-defined services.</p></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.grpc.server" href="server.html">connpy.grpc.server</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.grpc.stubs" href="stubs.html">connpy.grpc.stubs</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
<dt><code class="name"><a title="connpy.grpc.utils" href="utils.html">connpy.grpc.utils</a></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy" href="../index.html">connpy</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-submodules">Sub-modules</a></h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.grpc.connpy_pb2" href="connpy_pb2.html">connpy.grpc.connpy_pb2</a></code></li>
|
||||
<li><code><a title="connpy.grpc.connpy_pb2_grpc" href="connpy_pb2_grpc.html">connpy.grpc.connpy_pb2_grpc</a></code></li>
|
||||
<li><code><a title="connpy.grpc.remote_plugin_pb2" href="remote_plugin_pb2.html">connpy.grpc.remote_plugin_pb2</a></code></li>
|
||||
<li><code><a title="connpy.grpc.remote_plugin_pb2_grpc" href="remote_plugin_pb2_grpc.html">connpy.grpc.remote_plugin_pb2_grpc</a></code></li>
|
||||
<li><code><a title="connpy.grpc.server" href="server.html">connpy.grpc.server</a></code></li>
|
||||
<li><code><a title="connpy.grpc.stubs" href="stubs.html">connpy.grpc.stubs</a></code></li>
|
||||
<li><code><a title="connpy.grpc.utils" href="utils.html">connpy.grpc.utils</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,174 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.grpc.remote_plugin_pb2 API documentation</title>
|
||||
<meta name="description" content="Generated protocol buffer code.">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.grpc.remote_plugin_pb2</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
<p>Generated protocol buffer code.</p>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.IdRequest"><code class="flex name class">
|
||||
<span>class <span class="ident">IdRequest</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>A ProtocolMessage</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li>google._upb._message.Message</li>
|
||||
<li>google.protobuf.message.Message</li>
|
||||
</ul>
|
||||
<h3>Class variables</h3>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.IdRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.OutputChunk"><code class="flex name class">
|
||||
<span>class <span class="ident">OutputChunk</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>A ProtocolMessage</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li>google._upb._message.Message</li>
|
||||
<li>google.protobuf.message.Message</li>
|
||||
</ul>
|
||||
<h3>Class variables</h3>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.OutputChunk.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.PluginInvokeRequest"><code class="flex name class">
|
||||
<span>class <span class="ident">PluginInvokeRequest</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>A ProtocolMessage</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li>google._upb._message.Message</li>
|
||||
<li>google.protobuf.message.Message</li>
|
||||
</ul>
|
||||
<h3>Class variables</h3>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.PluginInvokeRequest.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.StringResponse"><code class="flex name class">
|
||||
<span>class <span class="ident">StringResponse</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<div class="desc"><p>A ProtocolMessage</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li>google._upb._message.Message</li>
|
||||
<li>google.protobuf.message.Message</li>
|
||||
</ul>
|
||||
<h3>Class variables</h3>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2.StringResponse.DESCRIPTOR"><code class="name">var <span class="ident">DESCRIPTOR</span></code></dt>
|
||||
<dd>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.grpc" href="index.html">connpy.grpc</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-classes">Classes</a></h3>
|
||||
<ul>
|
||||
<li>
|
||||
<h4><code><a title="connpy.grpc.remote_plugin_pb2.IdRequest" href="#connpy.grpc.remote_plugin_pb2.IdRequest">IdRequest</a></code></h4>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.grpc.remote_plugin_pb2.IdRequest.DESCRIPTOR" href="#connpy.grpc.remote_plugin_pb2.IdRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.grpc.remote_plugin_pb2.OutputChunk" href="#connpy.grpc.remote_plugin_pb2.OutputChunk">OutputChunk</a></code></h4>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.grpc.remote_plugin_pb2.OutputChunk.DESCRIPTOR" href="#connpy.grpc.remote_plugin_pb2.OutputChunk.DESCRIPTOR">DESCRIPTOR</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.grpc.remote_plugin_pb2.PluginInvokeRequest" href="#connpy.grpc.remote_plugin_pb2.PluginInvokeRequest">PluginInvokeRequest</a></code></h4>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.grpc.remote_plugin_pb2.PluginInvokeRequest.DESCRIPTOR" href="#connpy.grpc.remote_plugin_pb2.PluginInvokeRequest.DESCRIPTOR">DESCRIPTOR</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.grpc.remote_plugin_pb2.StringResponse" href="#connpy.grpc.remote_plugin_pb2.StringResponse">StringResponse</a></code></h4>
|
||||
<ul class="">
|
||||
<li><code><a title="connpy.grpc.remote_plugin_pb2.StringResponse.DESCRIPTOR" href="#connpy.grpc.remote_plugin_pb2.StringResponse.DESCRIPTOR">DESCRIPTOR</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,372 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
|
||||
<meta name="generator" content="pdoc3 0.11.5">
|
||||
<title>connpy.grpc.remote_plugin_pb2_grpc API documentation</title>
|
||||
<meta name="description" content="Client and server classes corresponding to protobuf-defined services.">
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
|
||||
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
|
||||
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
|
||||
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
|
||||
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
|
||||
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
|
||||
<script>window.addEventListener('DOMContentLoaded', () => {
|
||||
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
|
||||
hljs.highlightAll();
|
||||
/* Collapse source docstrings */
|
||||
setTimeout(() => {
|
||||
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
|
||||
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
|
||||
.forEach(el => {
|
||||
let d = document.createElement('details');
|
||||
d.classList.add('hljs-string');
|
||||
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
|
||||
el.replaceWith(d);
|
||||
});
|
||||
}, 100);
|
||||
})</script>
|
||||
</head>
|
||||
<body>
|
||||
<main>
|
||||
<article id="content">
|
||||
<header>
|
||||
<h1 class="title">Module <code>connpy.grpc.remote_plugin_pb2_grpc</code></h1>
|
||||
</header>
|
||||
<section id="section-intro">
|
||||
<p>Client and server classes corresponding to protobuf-defined services.</p>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-functions">Functions</h2>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2_grpc.add_RemotePluginServiceServicer_to_server"><code class="name flex">
|
||||
<span>def <span class="ident">add_RemotePluginServiceServicer_to_server</span></span>(<span>servicer, server)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def add_RemotePluginServiceServicer_to_server(servicer, server):
|
||||
rpc_method_handlers = {
|
||||
'get_plugin_source': grpc.unary_unary_rpc_method_handler(
|
||||
servicer.get_plugin_source,
|
||||
request_deserializer=remote__plugin__pb2.IdRequest.FromString,
|
||||
response_serializer=remote__plugin__pb2.StringResponse.SerializeToString,
|
||||
),
|
||||
'invoke_plugin': grpc.unary_stream_rpc_method_handler(
|
||||
servicer.invoke_plugin,
|
||||
request_deserializer=remote__plugin__pb2.PluginInvokeRequest.FromString,
|
||||
response_serializer=remote__plugin__pb2.OutputChunk.SerializeToString,
|
||||
),
|
||||
}
|
||||
generic_handler = grpc.method_handlers_generic_handler(
|
||||
'connpy_remote.RemotePluginService', rpc_method_handlers)
|
||||
server.add_generic_rpc_handlers((generic_handler,))
|
||||
server.add_registered_method_handlers('connpy_remote.RemotePluginService', rpc_method_handlers)</code></pre>
|
||||
</details>
|
||||
<div class="desc"></div>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
<section>
|
||||
<h2 class="section-title" id="header-classes">Classes</h2>
|
||||
<dl>
|
||||
<dt id="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService"><code class="flex name class">
|
||||
<span>class <span class="ident">RemotePluginService</span></span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class RemotePluginService(object):
|
||||
"""Missing associated documentation comment in .proto file."""
|
||||
|
||||
@staticmethod
|
||||
def get_plugin_source(request,
|
||||
target,
|
||||
options=(),
|
||||
channel_credentials=None,
|
||||
call_credentials=None,
|
||||
insecure=False,
|
||||
compression=None,
|
||||
wait_for_ready=None,
|
||||
timeout=None,
|
||||
metadata=None):
|
||||
return grpc.experimental.unary_unary(
|
||||
request,
|
||||
target,
|
||||
'/connpy_remote.RemotePluginService/get_plugin_source',
|
||||
remote__plugin__pb2.IdRequest.SerializeToString,
|
||||
remote__plugin__pb2.StringResponse.FromString,
|
||||
options,
|
||||
channel_credentials,
|
||||
insecure,
|
||||
call_credentials,
|
||||
compression,
|
||||
wait_for_ready,
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)
|
||||
|
||||
@staticmethod
|
||||
def invoke_plugin(request,
|
||||
target,
|
||||
options=(),
|
||||
channel_credentials=None,
|
||||
call_credentials=None,
|
||||
insecure=False,
|
||||
compression=None,
|
||||
wait_for_ready=None,
|
||||
timeout=None,
|
||||
metadata=None):
|
||||
return grpc.experimental.unary_stream(
|
||||
request,
|
||||
target,
|
||||
'/connpy_remote.RemotePluginService/invoke_plugin',
|
||||
remote__plugin__pb2.PluginInvokeRequest.SerializeToString,
|
||||
remote__plugin__pb2.OutputChunk.FromString,
|
||||
options,
|
||||
channel_credentials,
|
||||
insecure,
|
||||
call_credentials,
|
||||
compression,
|
||||
wait_for_ready,
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)</code></pre>
</details>
<div class="desc"><p>Missing associated documentation comment in .proto file.</p></div>
<h3>Static methods</h3>
<dl>
<dt id="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService.get_plugin_source"><code class="name flex">
<span>def <span class="ident">get_plugin_source</span></span>(<span>request,<br>target,<br>options=(),<br>channel_credentials=None,<br>call_credentials=None,<br>insecure=False,<br>compression=None,<br>wait_for_ready=None,<br>timeout=None,<br>metadata=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">@staticmethod
def get_plugin_source(request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        insecure=False,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None):
    return grpc.experimental.unary_unary(
        request,
        target,
        '/connpy_remote.RemotePluginService/get_plugin_source',
        remote__plugin__pb2.IdRequest.SerializeToString,
        remote__plugin__pb2.StringResponse.FromString,
        options,
        channel_credentials,
        insecure,
        call_credentials,
        compression,
        wait_for_ready,
        timeout,
        metadata,
        _registered_method=True)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService.invoke_plugin"><code class="name flex">
<span>def <span class="ident">invoke_plugin</span></span>(<span>request,<br>target,<br>options=(),<br>channel_credentials=None,<br>call_credentials=None,<br>insecure=False,<br>compression=None,<br>wait_for_ready=None,<br>timeout=None,<br>metadata=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">@staticmethod
def invoke_plugin(request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        insecure=False,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None):
    return grpc.experimental.unary_stream(
        request,
        target,
        '/connpy_remote.RemotePluginService/invoke_plugin',
        remote__plugin__pb2.PluginInvokeRequest.SerializeToString,
        remote__plugin__pb2.OutputChunk.FromString,
        options,
        channel_credentials,
        insecure,
        call_credentials,
        compression,
        wait_for_ready,
        timeout,
        metadata,
        _registered_method=True)</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer"><code class="flex name class">
<span>class <span class="ident">RemotePluginServiceServicer</span></span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class RemotePluginServiceServicer(object):
    """Missing associated documentation comment in .proto file."""

    def get_plugin_source(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def invoke_plugin(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')</code></pre>
</details>
<div class="desc"><p>Missing associated documentation comment in .proto file.</p></div>
<h3>Subclasses</h3>
<ul class="hlist">
<li><a title="connpy.grpc.server.PluginServicer" href="server.html#connpy.grpc.server.PluginServicer">PluginServicer</a></li>
</ul>
<h3>Methods</h3>
<dl>
<dt id="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer.get_plugin_source"><code class="name flex">
<span>def <span class="ident">get_plugin_source</span></span>(<span>self, request, context)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def get_plugin_source(self, request, context):
    """Missing associated documentation comment in .proto file."""
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')</code></pre>
</details>
<div class="desc"><p>Missing associated documentation comment in .proto file.</p></div>
</dd>
<dt id="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer.invoke_plugin"><code class="name flex">
<span>def <span class="ident">invoke_plugin</span></span>(<span>self, request, context)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def invoke_plugin(self, request, context):
    """Missing associated documentation comment in .proto file."""
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')</code></pre>
</details>
<div class="desc"><p>Missing associated documentation comment in .proto file.</p></div>
</dd>
</dl>
</dd>
<dt id="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceStub"><code class="flex name class">
<span>class <span class="ident">RemotePluginServiceStub</span></span>
<span>(</span><span>channel)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class RemotePluginServiceStub(object):
    """Missing associated documentation comment in .proto file."""

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.get_plugin_source = channel.unary_unary(
            '/connpy_remote.RemotePluginService/get_plugin_source',
            request_serializer=remote__plugin__pb2.IdRequest.SerializeToString,
            response_deserializer=remote__plugin__pb2.StringResponse.FromString,
            _registered_method=True)
        self.invoke_plugin = channel.unary_stream(
            '/connpy_remote.RemotePluginService/invoke_plugin',
            request_serializer=remote__plugin__pb2.PluginInvokeRequest.SerializeToString,
            response_deserializer=remote__plugin__pb2.OutputChunk.FromString,
            _registered_method=True)</code></pre>
</details>
<div class="desc"><p>Missing associated documentation comment in .proto file.</p>
<p>Constructor.</p>
<h2 id="args">Args</h2>
<dl>
<dt><strong><code>channel</code></strong></dt>
<dd>A grpc.Channel.</dd>
</dl></div>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.grpc" href="index.html">connpy.grpc</a></code></li>
</ul>
</li>
<li><h3><a href="#header-functions">Functions</a></h3>
<ul class="">
<li><code><a title="connpy.grpc.remote_plugin_pb2_grpc.add_RemotePluginServiceServicer_to_server" href="#connpy.grpc.remote_plugin_pb2_grpc.add_RemotePluginServiceServicer_to_server">add_RemotePluginServiceServicer_to_server</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService" href="#connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService">RemotePluginService</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService.get_plugin_source" href="#connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService.get_plugin_source">get_plugin_source</a></code></li>
<li><code><a title="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService.invoke_plugin" href="#connpy.grpc.remote_plugin_pb2_grpc.RemotePluginService.invoke_plugin">invoke_plugin</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer" href="#connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer">RemotePluginServiceServicer</a></code></h4>
<ul class="">
<li><code><a title="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer.get_plugin_source" href="#connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer.get_plugin_source">get_plugin_source</a></code></li>
<li><code><a title="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer.invoke_plugin" href="#connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceServicer.invoke_plugin">invoke_plugin</a></code></li>
</ul>
</li>
<li>
<h4><code><a title="connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceStub" href="#connpy.grpc.remote_plugin_pb2_grpc.RemotePluginServiceStub">RemotePluginServiceStub</a></code></h4>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
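The generated `RemotePluginServiceServicer` above is a base class whose handlers default to `UNIMPLEMENTED`, and a concrete servicer (connpy's `PluginServicer`, per the Subclasses list) overrides them. A minimal stdlib-only sketch of that override pattern — `FakeContext` and the concrete handler body are illustrative assumptions, not connpy's actual implementation:

```python
class FakeContext:
    """Stand-in for grpc.ServicerContext (illustrative only)."""
    def __init__(self):
        self.code = None
        self.details = None

    def set_code(self, code):
        self.code = code

    def set_details(self, details):
        self.details = details


class RemotePluginServiceServicer:
    """Base servicer: every RPC reports UNIMPLEMENTED until overridden."""

    def get_plugin_source(self, request, context):
        context.set_code("UNIMPLEMENTED")
        context.set_details("Method not implemented!")
        raise NotImplementedError("Method not implemented!")


class PluginServicer(RemotePluginServiceServicer):
    """Concrete servicer: overriding the handler replaces the default."""

    def get_plugin_source(self, request, context):
        return f"source of plugin {request!r}"
```

With real grpcio, the concrete servicer would be registered via `add_RemotePluginServiceServicer_to_server`; the base class guarantees an unimplemented RPC fails loudly instead of silently.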
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -0,0 +1,144 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.grpc.utils API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.grpc.utils</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-functions">Functions</h2>
<dl>
<dt id="connpy.grpc.utils.from_struct"><code class="name flex">
<span>def <span class="ident">from_struct</span></span>(<span>struct)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def from_struct(struct):
    if not struct:
        return {}
    return json_format.MessageToDict(struct, preserving_proto_field_name=True)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.grpc.utils.from_value"><code class="name flex">
<span>def <span class="ident">from_value</span></span>(<span>val)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def from_value(val):
    if not val.HasField("kind"):
        return None
    return json.loads(json_format.MessageToJson(val))</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.grpc.utils.to_struct"><code class="name flex">
<span>def <span class="ident">to_struct</span></span>(<span>obj)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def to_struct(obj):
    if not obj:
        return Struct()
    s = Struct()
    json_format.ParseDict(obj, s)
    return s</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.grpc.utils.to_value"><code class="name flex">
<span>def <span class="ident">to_value</span></span>(<span>obj)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def to_value(obj):
    if obj is None:
        v = Value()
        v.null_value = 0
        return v
    json_str = json.dumps(obj)
    v = Value()
    json_format.Parse(json_str, v)
    return v</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
</section>
<section>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.grpc" href="index.html">connpy.grpc</a></code></li>
</ul>
</li>
<li><h3><a href="#header-functions">Functions</a></h3>
<ul class="">
<li><code><a title="connpy.grpc.utils.from_struct" href="#connpy.grpc.utils.from_struct">from_struct</a></code></li>
<li><code><a title="connpy.grpc.utils.from_value" href="#connpy.grpc.utils.from_value">from_value</a></code></li>
<li><code><a title="connpy.grpc.utils.to_struct" href="#connpy.grpc.utils.to_struct">to_struct</a></code></li>
<li><code><a title="connpy.grpc.utils.to_value" href="#connpy.grpc.utils.to_value">to_value</a></code></li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
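The `to_value`/`from_value` helpers above round-trip arbitrary Python values through a protobuf `Value` via JSON (with `None` mapped to a JSON null). A minimal stdlib-only sketch of that round-trip contract, using a plain JSON string in place of `google.protobuf.Value` (the helper names mirror the real ones but are simplified stand-ins):

```python
import json


def to_value(obj):
    # Serialize any JSON-compatible Python object, None included,
    # mirroring how utils.to_value parses it into a protobuf Value.
    return json.dumps(obj)


def from_value(val):
    # Inverse of to_value; None round-trips through JSON null.
    return json.loads(val)


# Every JSON-compatible value should survive the round trip unchanged.
for sample in [None, 42, "hi", {"a": [1, 2, {"b": True}]}]:
    assert from_value(to_value(sample)) == sample
```

The real helpers add one wrinkle the sketch glosses over: `from_value` returns `None` when the `Value` has no `kind` set at all, which is distinct from an explicit `null_value`.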
File diff suppressed because it is too large
@@ -0,0 +1,271 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.ai_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.services.ai_service</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.services.ai_service.AIService"><code class="flex name class">
<span>class <span class="ident">AIService</span></span>
<span>(</span><span>config=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class AIService(BaseService):
    """Business logic for interacting with AI agents and LLM configurations."""

    def ask(self, input_text, dryrun=False, chat_history=None, status=None, debug=False, session_id=None, console=None, chunk_callback=None, confirm_handler=None, trust=False, **overrides):
        """Send a prompt to the AI agent."""
        from connpy.ai import ai
        agent = ai(self.config, console=console, confirm_handler=confirm_handler, trust=trust, **overrides)
        return agent.ask(input_text, dryrun, chat_history, status=status, debug=debug, session_id=session_id, chunk_callback=chunk_callback)

    def confirm(self, input_text, console=None):
        """Ask for a safe confirmation of an action."""
        from connpy.ai import ai
        agent = ai(self.config, console=console)
        return agent.confirm(input_text)

    def list_sessions(self):
        """Return a list of all saved AI sessions."""
        from connpy.ai import ai
        agent = ai(self.config)
        return agent._get_sessions()

    def delete_session(self, session_id):
        """Delete an AI session by ID."""
        import os
        sessions_dir = os.path.join(self.config.defaultdir, "ai_sessions")
        path = os.path.join(sessions_dir, f"{session_id}.json")
        if os.path.exists(path):
            os.remove(path)
        else:
            raise InvalidConfigurationError(f"Session '{session_id}' not found.")

    def configure_provider(self, provider, model=None, api_key=None):
        """Update AI provider settings in the configuration."""
        settings = self.config.config.get("ai", {})
        if model:
            settings[f"{provider}_model"] = model
        if api_key:
            settings[f"{provider}_api_key"] = api_key

        self.config.config["ai"] = settings
        self.config._saveconfig(self.config.file)

    def load_session_data(self, session_id):
        """Load a session's raw data by ID."""
        from connpy.ai import ai
        agent = ai(self.config)
        return agent.load_session_data(session_id)</code></pre>
</details>
<div class="desc"><p>Business logic for interacting with AI agents and LLM configurations.</p>
|
||||
<p>Initialize the service.</p>
|
||||
<h2 id="args">Args</h2>
|
||||
<dl>
|
||||
<dt><strong><code>config</code></strong></dt>
|
||||
<dd>An instance of configfile (or None to instantiate a new one/use global context).</dd>
|
||||
</dl></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></li>
|
||||
</ul>
|
||||
<h3>Methods</h3>
|
||||
<dl>
|
||||
<dt id="connpy.services.ai_service.AIService.ask"><code class="name flex">
|
||||
<span>def <span class="ident">ask</span></span>(<span>self,<br>input_text,<br>dryrun=False,<br>chat_history=None,<br>status=None,<br>debug=False,<br>session_id=None,<br>console=None,<br>chunk_callback=None,<br>confirm_handler=None,<br>trust=False,<br>**overrides)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def ask(self, input_text, dryrun=False, chat_history=None, status=None, debug=False, session_id=None, console=None, chunk_callback=None, confirm_handler=None, trust=False, **overrides):
|
||||
"""Send a prompt to the AI agent."""
|
||||
from connpy.ai import ai
|
||||
agent = ai(self.config, console=console, confirm_handler=confirm_handler, trust=trust, **overrides)
|
||||
return agent.ask(input_text, dryrun, chat_history, status=status, debug=debug, session_id=session_id, chunk_callback=chunk_callback)</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Send a prompt to the AI agent.</p></div>
|
||||
</dd>
|
||||
<dt id="connpy.services.ai_service.AIService.configure_provider"><code class="name flex">
|
||||
<span>def <span class="ident">configure_provider</span></span>(<span>self, provider, model=None, api_key=None)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def configure_provider(self, provider, model=None, api_key=None):
|
||||
"""Update AI provider settings in the configuration."""
|
||||
settings = self.config.config.get("ai", {})
|
||||
if model:
|
||||
settings[f"{provider}_model"] = model
|
||||
if api_key:
|
||||
settings[f"{provider}_api_key"] = api_key
|
||||
|
||||
self.config.config["ai"] = settings
|
||||
self.config._saveconfig(self.config.file)</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Update AI provider settings in the configuration.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.confirm"><code class="name flex">
<span>def <span class="ident">confirm</span></span>(<span>self, input_text, console=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def confirm(self, input_text, console=None):
    """Ask for a safe confirmation of an action."""
    from connpy.ai import ai
    agent = ai(self.config, console=console)
    return agent.confirm(input_text)</code></pre>
</details>
<div class="desc"><p>Ask for a safe confirmation of an action.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.delete_session"><code class="name flex">
<span>def <span class="ident">delete_session</span></span>(<span>self, session_id)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def delete_session(self, session_id):
    """Delete an AI session by ID."""
    import os
    sessions_dir = os.path.join(self.config.defaultdir, "ai_sessions")
    path = os.path.join(sessions_dir, f"{session_id}.json")
    if os.path.exists(path):
        os.remove(path)
    else:
        raise InvalidConfigurationError(f"Session '{session_id}' not found.")</code></pre>
</details>
<div class="desc"><p>Delete an AI session by ID.</p></div>
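<p>Sessions are stored as JSON files under <code>ai_sessions</code> inside the configuration directory; the path that gets removed is built like this (the directory and session ID shown are hypothetical):</p>
<pre><code class="python">import os

defaultdir = "/home/user/.config/conn"  # stand-in for config.defaultdir
session_id = "abc123"
path = os.path.join(defaultdir, "ai_sessions", f"{session_id}.json")
# e.g. "/home/user/.config/conn/ai_sessions/abc123.json" on POSIX systems</code></pre>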
</dd>
<dt id="connpy.services.ai_service.AIService.list_sessions"><code class="name flex">
<span>def <span class="ident">list_sessions</span></span>(<span>self)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def list_sessions(self):
    """Return a list of all saved AI sessions."""
    from connpy.ai import ai
    agent = ai(self.config)
    return agent._get_sessions()</code></pre>
</details>
<div class="desc"><p>Return a list of all saved AI sessions.</p></div>
</dd>
<dt id="connpy.services.ai_service.AIService.load_session_data"><code class="name flex">
<span>def <span class="ident">load_session_data</span></span>(<span>self, session_id)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def load_session_data(self, session_id):
    """Load a session's raw data by ID."""
    from connpy.ai import ai
    agent = ai(self.config)
    return agent.load_session_data(session_id)</code></pre>
</details>
<div class="desc"><p>Load a session's raw data by ID.</p></div>
</dd>
</dl>
<h3>Inherited members</h3>
<ul class="hlist">
<li><code><b><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></b></code>:
<ul class="hlist">
<li><code><a title="connpy.services.base.BaseService.set_reserved_names" href="base.html#connpy.services.base.BaseService.set_reserved_names">set_reserved_names</a></code></li>
</ul>
</li>
</ul>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.services" href="index.html">connpy.services</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.services.ai_service.AIService" href="#connpy.services.ai_service.AIService">AIService</a></code></h4>
<ul class="two-column">
<li><code><a title="connpy.services.ai_service.AIService.ask" href="#connpy.services.ai_service.AIService.ask">ask</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.configure_provider" href="#connpy.services.ai_service.AIService.configure_provider">configure_provider</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.confirm" href="#connpy.services.ai_service.AIService.confirm">confirm</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.delete_session" href="#connpy.services.ai_service.AIService.delete_session">delete_session</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.list_sessions" href="#connpy.services.ai_service.AIService.list_sessions">list_sessions</a></code></li>
<li><code><a title="connpy.services.ai_service.AIService.load_session_data" href="#connpy.services.ai_service.AIService.load_session_data">load_session_data</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,158 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.base API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
    [...document.querySelectorAll('.hljs.language-python > .hljs-string')]
        .filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
        .forEach(el => {
            let d = document.createElement('details');
            d.classList.add('hljs-string');
            d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
            el.replaceWith(d);
        });
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.services.base</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.services.base.BaseService"><code class="flex name class">
<span>class <span class="ident">BaseService</span></span>
<span>(</span><span>config=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class BaseService:
    """Base class for all connpy services, providing common configuration access."""

    def __init__(self, config=None):
        """
        Initialize the service.

        Args:
            config: An instance of configfile (or None to instantiate a new one/use global context).
        """
        from connpy import configfile
        self.config = config or configfile()
        self.hooks = MethodHook
        self.reserved_names = []

    def set_reserved_names(self, names):
        """Inject a list of reserved names (e.g. from the CLI)."""
        self.reserved_names = names

    def _validate_node_name(self, unique_id):
        """Check if the node name in unique_id is reserved."""
        from .exceptions import ReservedNameError
        if not self.reserved_names:
            return

        uniques = self.config._explode_unique(unique_id)
        if uniques and "id" in uniques:
            # We only validate the 'id' (the actual node name), folders are prefixed with @
            node_name = uniques["id"]
            if node_name in self.reserved_names:
                raise ReservedNameError(f"Node name '{node_name}' is a reserved command.")</code></pre>
</details>
<div class="desc"><p>Base class for all connpy services, providing common configuration access.</p>
<p>Initialize the service.</p>
<h2 id="args">Args</h2>
<dl>
<dt><strong><code>config</code></strong></dt>
<dd>An instance of configfile (or None to instantiate a new one/use global context).</dd>
</dl></div>
<h3>Subclasses</h3>
<ul class="hlist">
<li><a title="connpy.services.ai_service.AIService" href="ai_service.html#connpy.services.ai_service.AIService">AIService</a></li>
<li><a title="connpy.services.config_service.ConfigService" href="config_service.html#connpy.services.config_service.ConfigService">ConfigService</a></li>
<li><a title="connpy.services.context_service.ContextService" href="context_service.html#connpy.services.context_service.ContextService">ContextService</a></li>
<li><a title="connpy.services.execution_service.ExecutionService" href="execution_service.html#connpy.services.execution_service.ExecutionService">ExecutionService</a></li>
<li><a title="connpy.services.import_export_service.ImportExportService" href="import_export_service.html#connpy.services.import_export_service.ImportExportService">ImportExportService</a></li>
<li><a title="connpy.services.node_service.NodeService" href="node_service.html#connpy.services.node_service.NodeService">NodeService</a></li>
<li><a title="connpy.services.plugin_service.PluginService" href="plugin_service.html#connpy.services.plugin_service.PluginService">PluginService</a></li>
<li><a title="connpy.services.profile_service.ProfileService" href="profile_service.html#connpy.services.profile_service.ProfileService">ProfileService</a></li>
<li><a title="connpy.services.sync_service.SyncService" href="sync_service.html#connpy.services.sync_service.SyncService">SyncService</a></li>
<li><a title="connpy.services.system_service.SystemService" href="system_service.html#connpy.services.system_service.SystemService">SystemService</a></li>
</ul>
<h3>Methods</h3>
<dl>
<dt id="connpy.services.base.BaseService.set_reserved_names"><code class="name flex">
<span>def <span class="ident">set_reserved_names</span></span>(<span>self, names)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_reserved_names(self, names):
    """Inject a list of reserved names (e.g. from the CLI)."""
    self.reserved_names = names</code></pre>
</details>
<div class="desc"><p>Inject a list of reserved names (e.g. from the CLI).</p></div>
</dd>
</dl>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.services" href="index.html">connpy.services</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.services.base.BaseService" href="#connpy.services.base.BaseService">BaseService</a></code></h4>
<ul class="">
<li><code><a title="connpy.services.base.BaseService.set_reserved_names" href="#connpy.services.base.BaseService.set_reserved_names">set_reserved_names</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,317 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.config_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
    [...document.querySelectorAll('.hljs.language-python > .hljs-string')]
        .filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
        .forEach(el => {
            let d = document.createElement('details');
            d.classList.add('hljs-string');
            d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
            el.replaceWith(d);
        });
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.services.config_service</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.services.config_service.ConfigService"><code class="flex name class">
<span>class <span class="ident">ConfigService</span></span>
<span>(</span><span>config=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ConfigService(BaseService):
    """Business logic for general application settings and state configuration."""

    def get_settings(self) -> Dict[str, Any]:
        """Get the global configuration settings block."""
        settings = self.config.config.copy()
        settings["configfolder"] = self.config.defaultdir
        return settings

    def get_default_dir(self) -> str:
        """Get the default configuration directory."""
        return self.config.defaultdir

    def set_config_folder(self, folder_path: str):
        """Set the default location for config file by writing to ~/.config/conn/.folder"""
        if not os.path.isdir(folder_path):
            raise ConnpyError(f"readable_dir:{folder_path} is not a valid path")

        pathfile = os.path.join(self.config.anchor_path, ".folder")
        folder = os.path.abspath(folder_path).rstrip('/')

        try:
            with open(pathfile, "w") as f:
                f.write(str(folder))
        except Exception as e:
            raise ConnpyError(f"Failed to save config folder: {e}")

    def update_setting(self, key, value):
        """Update a setting in the configuration file."""
        self.config.config[key] = value
        self.config._saveconfig(self.config.file)

    def encrypt_password(self, password):
        """Encrypt a password using the application's configuration encryption key."""
        return self.config.encrypt(password)

    def apply_theme_from_file(self, theme_input):
        """Apply 'dark', 'light' theme or load a YAML theme file and save it to the configuration."""
        import yaml
        from ..printer import STYLES, LIGHT_THEME

        if theme_input == "dark":
            valid_styles = {}
            self.update_setting("theme", valid_styles)
            return valid_styles
        elif theme_input == "light":
            valid_styles = LIGHT_THEME.copy()
            self.update_setting("theme", valid_styles)
            return valid_styles

        if not os.path.exists(theme_input):
            raise InvalidConfigurationError(f"Theme file '{theme_input}' not found.")

        try:
            with open(theme_input, 'r') as f:
                user_styles = yaml.safe_load(f)
        except Exception as e:
            raise InvalidConfigurationError(f"Failed to parse theme file: {e}")

        if not isinstance(user_styles, dict):
            raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

        # Filter for valid styles only (prevent junk in config)
        valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

        if not valid_styles:
            raise InvalidConfigurationError("No valid style keys found in theme file.")

        # Persist and return merged styles
        self.update_setting("theme", valid_styles)
        return valid_styles</code></pre>
</details>
<div class="desc"><p>Business logic for general application settings and state configuration.</p>
<p>Initialize the service.</p>
<h2 id="args">Args</h2>
<dl>
<dt><strong><code>config</code></strong></dt>
<dd>An instance of configfile (or None to instantiate a new one/use global context).</dd>
</dl></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></li>
</ul>
<h3>Methods</h3>
<dl>
<dt id="connpy.services.config_service.ConfigService.apply_theme_from_file"><code class="name flex">
<span>def <span class="ident">apply_theme_from_file</span></span>(<span>self, theme_input)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def apply_theme_from_file(self, theme_input):
    """Apply 'dark', 'light' theme or load a YAML theme file and save it to the configuration."""
    import yaml
    from ..printer import STYLES, LIGHT_THEME

    if theme_input == "dark":
        valid_styles = {}
        self.update_setting("theme", valid_styles)
        return valid_styles
    elif theme_input == "light":
        valid_styles = LIGHT_THEME.copy()
        self.update_setting("theme", valid_styles)
        return valid_styles

    if not os.path.exists(theme_input):
        raise InvalidConfigurationError(f"Theme file '{theme_input}' not found.")

    try:
        with open(theme_input, 'r') as f:
            user_styles = yaml.safe_load(f)
    except Exception as e:
        raise InvalidConfigurationError(f"Failed to parse theme file: {e}")

    if not isinstance(user_styles, dict):
        raise InvalidConfigurationError("Theme file must be a YAML dictionary.")

    # Filter for valid styles only (prevent junk in config)
    valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}

    if not valid_styles:
        raise InvalidConfigurationError("No valid style keys found in theme file.")

    # Persist and return merged styles
    self.update_setting("theme", valid_styles)
    return valid_styles</code></pre>
</details>
<div class="desc"><p>Apply 'dark', 'light' theme or load a YAML theme file and save it to the configuration.</p></div>
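<p>The filtering step keeps only known style keys, so unknown entries in a theme file are dropped rather than persisted. A self-contained sketch with a stand-in <code>STYLES</code> set (the real key set lives in <code>connpy.printer</code>):</p>
<pre><code class="python"># Stand-in for connpy.printer.STYLES; real keys may differ
STYLES = {"title", "error", "warning"}

user_styles = {"title": "bold cyan", "error": "red", "made_up_key": "blink"}
# Same comprehension apply_theme_from_file uses to discard unknown keys
valid_styles = {k: v for k, v in user_styles.items() if k in STYLES}
# valid_styles == {"title": "bold cyan", "error": "red"}</code></pre>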
</dd>
<dt id="connpy.services.config_service.ConfigService.encrypt_password"><code class="name flex">
<span>def <span class="ident">encrypt_password</span></span>(<span>self, password)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def encrypt_password(self, password):
    """Encrypt a password using the application's configuration encryption key."""
    return self.config.encrypt(password)</code></pre>
</details>
<div class="desc"><p>Encrypt a password using the application's configuration encryption key.</p></div>
</dd>
<dt id="connpy.services.config_service.ConfigService.get_default_dir"><code class="name flex">
<span>def <span class="ident">get_default_dir</span></span>(<span>self) ‑> str</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def get_default_dir(self) -> str:
    """Get the default configuration directory."""
    return self.config.defaultdir</code></pre>
</details>
<div class="desc"><p>Get the default configuration directory.</p></div>
</dd>
<dt id="connpy.services.config_service.ConfigService.get_settings"><code class="name flex">
<span>def <span class="ident">get_settings</span></span>(<span>self) ‑> Dict[str, Any]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def get_settings(self) -> Dict[str, Any]:
    """Get the global configuration settings block."""
    settings = self.config.config.copy()
    settings["configfolder"] = self.config.defaultdir
    return settings</code></pre>
</details>
<div class="desc"><p>Get the global configuration settings block.</p></div>
</dd>
|
||||
<dt id="connpy.services.config_service.ConfigService.set_config_folder"><code class="name flex">
|
||||
<span>def <span class="ident">set_config_folder</span></span>(<span>self, folder_path: str)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">def set_config_folder(self, folder_path: str):
|
||||
"""Set the default location for config file by writing to ~/.config/conn/.folder"""
|
||||
if not os.path.isdir(folder_path):
|
||||
raise ConnpyError(f"readable_dir:{folder_path} is not a valid path")
|
||||
|
||||
pathfile = os.path.join(self.config.anchor_path, ".folder")
|
||||
folder = os.path.abspath(folder_path).rstrip('/')
|
||||
|
||||
try:
|
||||
with open(pathfile, "w") as f:
|
||||
f.write(str(folder))
|
||||
except Exception as e:
|
||||
raise ConnpyError(f"Failed to save config folder: {e}")</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Set the default location for config file by writing to ~/.config/conn/.folder</p></div>
|
||||
</dd>
|
||||
<dt id="connpy.services.config_service.ConfigService.update_setting"><code class="name flex">
<span>def <span class="ident">update_setting</span></span>(<span>self, key, value)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def update_setting(self, key, value):
    """Update a setting in the configuration file."""
    self.config.config[key] = value
    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"><p>Update a setting in the configuration file.</p></div>
</dd>
</dl>
<h3>Inherited members</h3>
<ul class="hlist">
<li><code><b><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></b></code>:
<ul class="hlist">
<li><code><a title="connpy.services.base.BaseService.set_reserved_names" href="base.html#connpy.services.base.BaseService.set_reserved_names">set_reserved_names</a></code></li>
</ul>
</li>
</ul>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.services" href="index.html">connpy.services</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.services.config_service.ConfigService" href="#connpy.services.config_service.ConfigService">ConfigService</a></code></h4>
<ul class="">
<li><code><a title="connpy.services.config_service.ConfigService.apply_theme_from_file" href="#connpy.services.config_service.ConfigService.apply_theme_from_file">apply_theme_from_file</a></code></li>
<li><code><a title="connpy.services.config_service.ConfigService.encrypt_password" href="#connpy.services.config_service.ConfigService.encrypt_password">encrypt_password</a></code></li>
<li><code><a title="connpy.services.config_service.ConfigService.get_default_dir" href="#connpy.services.config_service.ConfigService.get_default_dir">get_default_dir</a></code></li>
<li><code><a title="connpy.services.config_service.ConfigService.get_settings" href="#connpy.services.config_service.ConfigService.get_settings">get_settings</a></code></li>
<li><code><a title="connpy.services.config_service.ConfigService.set_config_folder" href="#connpy.services.config_service.ConfigService.set_config_folder">set_config_folder</a></code></li>
<li><code><a title="connpy.services.config_service.ConfigService.update_setting" href="#connpy.services.config_service.ConfigService.update_setting">update_setting</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,376 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.context_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target .name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.services.context_service</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.services.context_service.ContextService"><code class="flex name class">
<span>class <span class="ident">ContextService</span></span>
<span>(</span><span>config=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ContextService(BaseService):
    """Business logic for managing and applying regex-based contexts locally."""

    @property
    def contexts(self) -> Dict[str, List[str]]:
        return self.config.config.get("contexts", {"all": [".*"]})

    @property
    def current_context(self) -> str:
        return self.config.config.get("current_context", "all")

    def list_contexts(self) -> List[Dict[str, Any]]:
        result = []
        for name in self.contexts.keys():
            result.append({
                "name": name,
                "active": (name == self.current_context),
                "regexes": self.contexts[name]
            })
        return result

    def add_context(self, name: str, regexes: List[str]):
        if not name.isalnum():
            raise ValueError("Context name must be alphanumeric")

        ctxs = self.contexts
        if name in ctxs:
            raise ValueError(f"Context '{name}' already exists")

        ctxs[name] = regexes
        self.config.config["contexts"] = ctxs
        self.config._saveconfig(self.config.file)

    def update_context(self, name: str, regexes: List[str]):
        if name == "all":
            raise ValueError("Cannot modify default context 'all'")

        ctxs = self.contexts
        if name not in ctxs:
            raise ValueError(f"Context '{name}' does not exist")

        ctxs[name] = regexes
        self.config.config["contexts"] = ctxs
        self.config._saveconfig(self.config.file)

    def delete_context(self, name: str):
        if name == "all":
            raise ValueError("Cannot delete default context 'all'")
        if name == self.current_context:
            raise ValueError(f"Cannot delete active context '{name}'")

        ctxs = self.contexts
        if name not in ctxs:
            raise ValueError(f"Context '{name}' does not exist")

        del ctxs[name]
        self.config.config["contexts"] = ctxs
        self.config._saveconfig(self.config.file)

    def set_active_context(self, name: str):
        if name not in self.contexts:
            raise ValueError(f"Context '{name}' does not exist")

        self.config.config["current_context"] = name
        self.config._saveconfig(self.config.file)

    def get_active_regexes(self) -> List[re.Pattern]:
        patterns = self.contexts.get(self.current_context, [".*"])
        return [re.compile(p) for p in patterns]

    def _match_any(self, node_name: str, patterns: List[re.Pattern]) -> bool:
        return any(p.match(node_name) for p in patterns)

    # Hook handlers for filtering
    def filter_node_list(self, *args, **kwargs):
        patterns = self.get_active_regexes()
        return [node for node in kwargs["result"] if self._match_any(node, patterns)]

    def filter_node_dict(self, *args, **kwargs):
        patterns = self.get_active_regexes()
        return {k: v for k, v in kwargs["result"].items() if self._match_any(k, patterns)}</code></pre>
</details>
<div class="desc"><p>Business logic for managing and applying regex-based contexts locally.</p>
<p>Initialize the service.</p>
<h2 id="args">Args</h2>
<dl>
<dt><strong><code>config</code></strong></dt>
<dd>An instance of configfile (or None to instantiate a new one/use global context).</dd>
</dl></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></li>
</ul>
<h3>Instance variables</h3>
<dl>
<dt id="connpy.services.context_service.ContextService.contexts"><code class="name">prop <span class="ident">contexts</span> : Dict[str, List[str]]</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">@property
def contexts(self) -> Dict[str, List[str]]:
    return self.config.config.get("contexts", {"all": [".*"]})</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.current_context"><code class="name">prop <span class="ident">current_context</span> : str</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">@property
def current_context(self) -> str:
    return self.config.config.get("current_context", "all")</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
<h3>Methods</h3>
<dl>
<dt id="connpy.services.context_service.ContextService.add_context"><code class="name flex">
<span>def <span class="ident">add_context</span></span>(<span>self, name: str, regexes: List[str])</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def add_context(self, name: str, regexes: List[str]):
    if not name.isalnum():
        raise ValueError("Context name must be alphanumeric")

    ctxs = self.contexts
    if name in ctxs:
        raise ValueError(f"Context '{name}' already exists")

    ctxs[name] = regexes
    self.config.config["contexts"] = ctxs
    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.delete_context"><code class="name flex">
<span>def <span class="ident">delete_context</span></span>(<span>self, name: str)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def delete_context(self, name: str):
    if name == "all":
        raise ValueError("Cannot delete default context 'all'")
    if name == self.current_context:
        raise ValueError(f"Cannot delete active context '{name}'")

    ctxs = self.contexts
    if name not in ctxs:
        raise ValueError(f"Context '{name}' does not exist")

    del ctxs[name]
    self.config.config["contexts"] = ctxs
    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.filter_node_dict"><code class="name flex">
<span>def <span class="ident">filter_node_dict</span></span>(<span>self, *args, **kwargs)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def filter_node_dict(self, *args, **kwargs):
    patterns = self.get_active_regexes()
    return {k: v for k, v in kwargs["result"].items() if self._match_any(k, patterns)}</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.filter_node_list"><code class="name flex">
<span>def <span class="ident">filter_node_list</span></span>(<span>self, *args, **kwargs)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def filter_node_list(self, *args, **kwargs):
    patterns = self.get_active_regexes()
    return [node for node in kwargs["result"] if self._match_any(node, patterns)]</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.get_active_regexes"><code class="name flex">
<span>def <span class="ident">get_active_regexes</span></span>(<span>self) ‑> List[re.Pattern]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def get_active_regexes(self) -> List[re.Pattern]:
    patterns = self.contexts.get(self.current_context, [".*"])
    return [re.compile(p) for p in patterns]</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.list_contexts"><code class="name flex">
<span>def <span class="ident">list_contexts</span></span>(<span>self) ‑> List[Dict[str, Any]]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def list_contexts(self) -> List[Dict[str, Any]]:
    result = []
    for name in self.contexts.keys():
        result.append({
            "name": name,
            "active": (name == self.current_context),
            "regexes": self.contexts[name]
        })
    return result</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.set_active_context"><code class="name flex">
<span>def <span class="ident">set_active_context</span></span>(<span>self, name: str)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def set_active_context(self, name: str):
    if name not in self.contexts:
        raise ValueError(f"Context '{name}' does not exist")

    self.config.config["current_context"] = name
    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"></div>
</dd>
<dt id="connpy.services.context_service.ContextService.update_context"><code class="name flex">
<span>def <span class="ident">update_context</span></span>(<span>self, name: str, regexes: List[str])</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def update_context(self, name: str, regexes: List[str]):
    if name == "all":
        raise ValueError("Cannot modify default context 'all'")

    ctxs = self.contexts
    if name not in ctxs:
        raise ValueError(f"Context '{name}' does not exist")

    ctxs[name] = regexes
    self.config.config["contexts"] = ctxs
    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"></div>
</dd>
</dl>
<h3>Inherited members</h3>
<ul class="hlist">
<li><code><b><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></b></code>:
<ul class="hlist">
<li><code><a title="connpy.services.base.BaseService.set_reserved_names" href="base.html#connpy.services.base.BaseService.set_reserved_names">set_reserved_names</a></code></li>
</ul>
</li>
</ul>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.services" href="index.html">connpy.services</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.services.context_service.ContextService" href="#connpy.services.context_service.ContextService">ContextService</a></code></h4>
<ul class="two-column">
<li><code><a title="connpy.services.context_service.ContextService.add_context" href="#connpy.services.context_service.ContextService.add_context">add_context</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.contexts" href="#connpy.services.context_service.ContextService.contexts">contexts</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.current_context" href="#connpy.services.context_service.ContextService.current_context">current_context</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.delete_context" href="#connpy.services.context_service.ContextService.delete_context">delete_context</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.filter_node_dict" href="#connpy.services.context_service.ContextService.filter_node_dict">filter_node_dict</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.filter_node_list" href="#connpy.services.context_service.ContextService.filter_node_list">filter_node_list</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.get_active_regexes" href="#connpy.services.context_service.ContextService.get_active_regexes">get_active_regexes</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.list_contexts" href="#connpy.services.context_service.ContextService.list_contexts">list_contexts</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.set_active_context" href="#connpy.services.context_service.ContextService.set_active_context">set_active_context</a></code></li>
<li><code><a title="connpy.services.context_service.ContextService.update_context" href="#connpy.services.context_service.ContextService.update_context">update_context</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,274 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.exceptions API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target .name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.services.exceptions</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.services.exceptions.ConnpyError"><code class="flex name class">
<span>class <span class="ident">ConnpyError</span></span>
<span>(</span><span>*args, **kwargs)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ConnpyError(Exception):
    """Base exception for all connpy services."""
    pass</code></pre>
</details>
<div class="desc"><p>Base exception for all connpy services.</p></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li>builtins.Exception</li>
<li>builtins.BaseException</li>
</ul>
<h3>Subclasses</h3>
<ul class="hlist">
<li><a title="connpy.services.exceptions.ExecutionError" href="#connpy.services.exceptions.ExecutionError">ExecutionError</a></li>
<li><a title="connpy.services.exceptions.InvalidConfigurationError" href="#connpy.services.exceptions.InvalidConfigurationError">InvalidConfigurationError</a></li>
<li><a title="connpy.services.exceptions.NodeAlreadyExistsError" href="#connpy.services.exceptions.NodeAlreadyExistsError">NodeAlreadyExistsError</a></li>
<li><a title="connpy.services.exceptions.NodeNotFoundError" href="#connpy.services.exceptions.NodeNotFoundError">NodeNotFoundError</a></li>
<li><a title="connpy.services.exceptions.ProfileAlreadyExistsError" href="#connpy.services.exceptions.ProfileAlreadyExistsError">ProfileAlreadyExistsError</a></li>
<li><a title="connpy.services.exceptions.ProfileNotFoundError" href="#connpy.services.exceptions.ProfileNotFoundError">ProfileNotFoundError</a></li>
<li><a title="connpy.services.exceptions.ReservedNameError" href="#connpy.services.exceptions.ReservedNameError">ReservedNameError</a></li>
|
||||
</ul>
|
||||
</dd>
|
||||
<dt id="connpy.services.exceptions.ExecutionError"><code class="flex name class">
|
||||
<span>class <span class="ident">ExecutionError</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class ExecutionError(ConnpyError):
|
||||
"""Raised when an execution fails or returns error."""
|
||||
pass</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Raised when an execution fails or returns error.</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></li>
|
||||
<li>builtins.Exception</li>
|
||||
<li>builtins.BaseException</li>
|
||||
</ul>
|
||||
</dd>
|
||||
<dt id="connpy.services.exceptions.InvalidConfigurationError"><code class="flex name class">
|
||||
<span>class <span class="ident">InvalidConfigurationError</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class InvalidConfigurationError(ConnpyError):
|
||||
"""Raised when data or configuration input is invalid."""
|
||||
pass</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Raised when data or configuration input is invalid.</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></li>
|
||||
<li>builtins.Exception</li>
|
||||
<li>builtins.BaseException</li>
|
||||
</ul>
|
||||
</dd>
|
||||
<dt id="connpy.services.exceptions.NodeAlreadyExistsError"><code class="flex name class">
|
||||
<span>class <span class="ident">NodeAlreadyExistsError</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class NodeAlreadyExistsError(ConnpyError):
|
||||
"""Raised when a node or folder already exists."""
|
||||
pass</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Raised when a node or folder already exists.</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></li>
|
||||
<li>builtins.Exception</li>
|
||||
<li>builtins.BaseException</li>
|
||||
</ul>
|
||||
</dd>
|
||||
<dt id="connpy.services.exceptions.NodeNotFoundError"><code class="flex name class">
|
||||
<span>class <span class="ident">NodeNotFoundError</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class NodeNotFoundError(ConnpyError):
|
||||
"""Raised when a connection or folder is not found."""
|
||||
pass</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Raised when a connection or folder is not found.</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></li>
|
||||
<li>builtins.Exception</li>
|
||||
<li>builtins.BaseException</li>
|
||||
</ul>
|
||||
</dd>
|
||||
<dt id="connpy.services.exceptions.ProfileAlreadyExistsError"><code class="flex name class">
|
||||
<span>class <span class="ident">ProfileAlreadyExistsError</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class ProfileAlreadyExistsError(ConnpyError):
|
||||
"""Raised when a profile with the same name already exists."""
|
||||
pass</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Raised when a profile with the same name already exists.</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></li>
|
||||
<li>builtins.Exception</li>
|
||||
<li>builtins.BaseException</li>
|
||||
</ul>
|
||||
</dd>
|
||||
<dt id="connpy.services.exceptions.ProfileNotFoundError"><code class="flex name class">
|
||||
<span>class <span class="ident">ProfileNotFoundError</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class ProfileNotFoundError(ConnpyError):
|
||||
"""Raised when a profile is not found."""
|
||||
pass</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Raised when a profile is not found.</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></li>
|
||||
<li>builtins.Exception</li>
|
||||
<li>builtins.BaseException</li>
|
||||
</ul>
|
||||
</dd>
|
||||
<dt id="connpy.services.exceptions.ReservedNameError"><code class="flex name class">
|
||||
<span>class <span class="ident">ReservedNameError</span></span>
|
||||
<span>(</span><span>*args, **kwargs)</span>
|
||||
</code></dt>
|
||||
<dd>
|
||||
<details class="source">
|
||||
<summary>
|
||||
<span>Expand source code</span>
|
||||
</summary>
|
||||
<pre><code class="python">class ReservedNameError(ConnpyError):
|
||||
"""Raised when a node name conflicts with a reserved command."""
|
||||
pass</code></pre>
|
||||
</details>
|
||||
<div class="desc"><p>Raised when a node name conflicts with a reserved command.</p></div>
|
||||
<h3>Ancestors</h3>
|
||||
<ul class="hlist">
|
||||
<li><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></li>
|
||||
<li>builtins.Exception</li>
|
||||
<li>builtins.BaseException</li>
|
||||
</ul>
|
||||
</dd>
|
||||
</dl>
|
||||
</section>
|
||||
</article>
|
||||
<nav id="sidebar">
|
||||
<div class="toc">
|
||||
<ul></ul>
|
||||
</div>
|
||||
<ul id="index">
|
||||
<li><h3>Super-module</h3>
|
||||
<ul>
|
||||
<li><code><a title="connpy.services" href="index.html">connpy.services</a></code></li>
|
||||
</ul>
|
||||
</li>
|
||||
<li><h3><a href="#header-classes">Classes</a></h3>
|
||||
<ul>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.ConnpyError" href="#connpy.services.exceptions.ConnpyError">ConnpyError</a></code></h4>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.ExecutionError" href="#connpy.services.exceptions.ExecutionError">ExecutionError</a></code></h4>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.InvalidConfigurationError" href="#connpy.services.exceptions.InvalidConfigurationError">InvalidConfigurationError</a></code></h4>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.NodeAlreadyExistsError" href="#connpy.services.exceptions.NodeAlreadyExistsError">NodeAlreadyExistsError</a></code></h4>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.NodeNotFoundError" href="#connpy.services.exceptions.NodeNotFoundError">NodeNotFoundError</a></code></h4>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.ProfileAlreadyExistsError" href="#connpy.services.exceptions.ProfileAlreadyExistsError">ProfileAlreadyExistsError</a></code></h4>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.ProfileNotFoundError" href="#connpy.services.exceptions.ProfileNotFoundError">ProfileNotFoundError</a></code></h4>
|
||||
</li>
|
||||
<li>
|
||||
<h4><code><a title="connpy.services.exceptions.ReservedNameError" href="#connpy.services.exceptions.ReservedNameError">ReservedNameError</a></code></h4>
|
||||
</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
</nav>
|
||||
</main>
|
||||
<footer id="footer">
|
||||
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
|
||||
</footer>
|
||||
</body>
|
||||
</html>
|
||||
@@ -0,0 +1,401 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.execution_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
[...document.querySelectorAll('.hljs.language-python > .hljs-string')]
.filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
.forEach(el => {
let d = document.createElement('details');
d.classList.add('hljs-string');
d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
el.replaceWith(d);
});
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.services.execution_service</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.services.execution_service.ExecutionService"><code class="flex name class">
<span>class <span class="ident">ExecutionService</span></span>
<span>(</span><span>config=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ExecutionService(BaseService):
    """Business logic for executing commands on nodes and running automation scripts."""

    def run_commands(
        self,
        nodes_filter: str,
        commands: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        folder: Optional[str] = None,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
        logger: Optional[Callable] = None
    ) -> Dict[str, str]:

        """Execute commands on a set of nodes."""
        try:
            matched_names = self.config._getallnodes(nodes_filter)
            if not matched_names:
                raise ConnpyError(f"No nodes found matching filter: {nodes_filter}")

            node_data = self.config.getitems(matched_names, extract=True)
            executor = Nodes(node_data, config=self.config)
            self.last_executor = executor

            results = executor.run(
                commands=commands,
                vars=variables,
                parallel=parallel,
                timeout=timeout,
                folder=folder,
                prompt=prompt,
                on_complete=on_node_complete,
                logger=logger
            )

            return results
        except Exception as e:
            raise ConnpyError(f"Execution failed: {e}")

    def test_commands(
        self,
        nodes_filter: str,
        commands: List[str],
        expected: List[str],
        variables: Optional[Dict[str, Any]] = None,
        parallel: int = 10,
        timeout: int = 10,
        prompt: Optional[str] = None,
        on_node_complete: Optional[Callable] = None,
        logger: Optional[Callable] = None
    ) -> Dict[str, Dict[str, bool]]:

        """Run commands and verify expected output on a set of nodes."""
        try:
            matched_names = self.config._getallnodes(nodes_filter)
            if not matched_names:
                raise ConnpyError(f"No nodes found matching filter: {nodes_filter}")

            node_data = self.config.getitems(matched_names, extract=True)
            executor = Nodes(node_data, config=self.config)
            self.last_executor = executor

            results = executor.test(
                commands=commands,
                expected=expected,
                vars=variables,
                parallel=parallel,
                timeout=timeout,
                prompt=prompt,
                on_complete=on_node_complete,
                logger=logger
            )
            return results
        except Exception as e:
            raise ConnpyError(f"Testing failed: {e}")

    def run_cli_script(self, nodes_filter: str, script_path: str, parallel: int = 10) -> Dict[str, str]:
        """Run a plain-text script containing one command per line."""
        if not os.path.exists(script_path):
            raise ConnpyError(f"Script file not found: {script_path}")

        try:
            with open(script_path, "r") as f:
                commands = [line.strip() for line in f if line.strip()]
        except Exception as e:
            raise ConnpyError(f"Failed to read script {script_path}: {e}")

        return self.run_commands(nodes_filter, commands, parallel=parallel)

    def run_yaml_playbook(self, playbook_path: str, parallel: int = 10) -> Dict[str, Any]:
        """Run a structured Connpy YAML automation playbook."""
        if not os.path.exists(playbook_path):
            raise ConnpyError(f"Playbook file not found: {playbook_path}")

        try:
            with open(playbook_path, "r") as f:
                playbook = yaml.load(f, Loader=yaml.FullLoader)
        except Exception as e:
            raise ConnpyError(f"Failed to load playbook {playbook_path}: {e}")

        # Basic validation
        if not isinstance(playbook, dict) or "nodes" not in playbook or "commands" not in playbook:
            raise ConnpyError("Invalid playbook format: missing 'nodes' or 'commands' keys.")

        action = playbook.get("action", "run")
        if action == "run":
            return self.run_commands(
                nodes_filter=playbook["nodes"],
                commands=playbook["commands"],
                parallel=parallel,
                timeout=playbook.get("timeout", 10)
            )
        elif action == "test":
            return self.test_commands(
                nodes_filter=playbook["nodes"],
                commands=playbook["commands"],
                expected=playbook.get("expected", []),
                parallel=parallel,
                timeout=playbook.get("timeout", 10)
            )
        else:
            raise ConnpyError(f"Unsupported playbook action: {action}")</code></pre>
</details>
<div class="desc"><p>Business logic for executing commands on nodes and running automation scripts.</p>
<p>Initialize the service.</p>
<h2 id="args">Args</h2>
<dl>
<dt><strong><code>config</code></strong></dt>
<dd>An instance of configfile (or None to instantiate a new one/use global context).</dd>
</dl></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></li>
</ul>
<h3>Methods</h3>
<dl>
<dt id="connpy.services.execution_service.ExecutionService.run_cli_script"><code class="name flex">
<span>def <span class="ident">run_cli_script</span></span>(<span>self, nodes_filter: str, script_path: str, parallel: int = 10) ‑> Dict[str, str]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def run_cli_script(self, nodes_filter: str, script_path: str, parallel: int = 10) -> Dict[str, str]:
    """Run a plain-text script containing one command per line."""
    if not os.path.exists(script_path):
        raise ConnpyError(f"Script file not found: {script_path}")

    try:
        with open(script_path, "r") as f:
            commands = [line.strip() for line in f if line.strip()]
    except Exception as e:
        raise ConnpyError(f"Failed to read script {script_path}: {e}")

    return self.run_commands(nodes_filter, commands, parallel=parallel)</code></pre>
</details>
<div class="desc"><p>Run a plain-text script containing one command per line.</p></div>
</dd>
<dt id="connpy.services.execution_service.ExecutionService.run_commands"><code class="name flex">
<span>def <span class="ident">run_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 10,<br>folder: str | None = None,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None) ‑> Dict[str, str]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def run_commands(
    self,
    nodes_filter: str,
    commands: List[str],
    variables: Optional[Dict[str, Any]] = None,
    parallel: int = 10,
    timeout: int = 10,
    folder: Optional[str] = None,
    prompt: Optional[str] = None,
    on_node_complete: Optional[Callable] = None,
    logger: Optional[Callable] = None
) -> Dict[str, str]:

    """Execute commands on a set of nodes."""
    try:
        matched_names = self.config._getallnodes(nodes_filter)
        if not matched_names:
            raise ConnpyError(f"No nodes found matching filter: {nodes_filter}")

        node_data = self.config.getitems(matched_names, extract=True)
        executor = Nodes(node_data, config=self.config)
        self.last_executor = executor

        results = executor.run(
            commands=commands,
            vars=variables,
            parallel=parallel,
            timeout=timeout,
            folder=folder,
            prompt=prompt,
            on_complete=on_node_complete,
            logger=logger
        )

        return results
    except Exception as e:
        raise ConnpyError(f"Execution failed: {e}")</code></pre>
</details>
<div class="desc"><p>Execute commands on a set of nodes.</p></div>
</dd>
<dt id="connpy.services.execution_service.ExecutionService.run_yaml_playbook"><code class="name flex">
<span>def <span class="ident">run_yaml_playbook</span></span>(<span>self, playbook_path: str, parallel: int = 10) ‑> Dict[str, Any]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def run_yaml_playbook(self, playbook_path: str, parallel: int = 10) -> Dict[str, Any]:
    """Run a structured Connpy YAML automation playbook."""
    if not os.path.exists(playbook_path):
        raise ConnpyError(f"Playbook file not found: {playbook_path}")

    try:
        with open(playbook_path, "r") as f:
            playbook = yaml.load(f, Loader=yaml.FullLoader)
    except Exception as e:
        raise ConnpyError(f"Failed to load playbook {playbook_path}: {e}")

    # Basic validation
    if not isinstance(playbook, dict) or "nodes" not in playbook or "commands" not in playbook:
        raise ConnpyError("Invalid playbook format: missing 'nodes' or 'commands' keys.")

    action = playbook.get("action", "run")
    if action == "run":
        return self.run_commands(
            nodes_filter=playbook["nodes"],
            commands=playbook["commands"],
            parallel=parallel,
            timeout=playbook.get("timeout", 10)
        )
    elif action == "test":
        return self.test_commands(
            nodes_filter=playbook["nodes"],
            commands=playbook["commands"],
            expected=playbook.get("expected", []),
            parallel=parallel,
            timeout=playbook.get("timeout", 10)
        )
    else:
        raise ConnpyError(f"Unsupported playbook action: {action}")</code></pre>
</details>
<div class="desc"><p>Run a structured Connpy YAML automation playbook.</p></div>
</dd>
<dt id="connpy.services.execution_service.ExecutionService.test_commands"><code class="name flex">
<span>def <span class="ident">test_commands</span></span>(<span>self,<br>nodes_filter: str,<br>commands: List[str],<br>expected: List[str],<br>variables: Dict[str, Any] | None = None,<br>parallel: int = 10,<br>timeout: int = 10,<br>prompt: str | None = None,<br>on_node_complete: Callable | None = None,<br>logger: Callable | None = None) ‑> Dict[str, Dict[str, bool]]</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def test_commands(
    self,
    nodes_filter: str,
    commands: List[str],
    expected: List[str],
    variables: Optional[Dict[str, Any]] = None,
    parallel: int = 10,
    timeout: int = 10,
    prompt: Optional[str] = None,
    on_node_complete: Optional[Callable] = None,
    logger: Optional[Callable] = None
) -> Dict[str, Dict[str, bool]]:

    """Run commands and verify expected output on a set of nodes."""
    try:
        matched_names = self.config._getallnodes(nodes_filter)
        if not matched_names:
            raise ConnpyError(f"No nodes found matching filter: {nodes_filter}")

        node_data = self.config.getitems(matched_names, extract=True)
        executor = Nodes(node_data, config=self.config)
        self.last_executor = executor

        results = executor.test(
            commands=commands,
            expected=expected,
            vars=variables,
            parallel=parallel,
            timeout=timeout,
            prompt=prompt,
            on_complete=on_node_complete,
            logger=logger
        )
        return results
    except Exception as e:
        raise ConnpyError(f"Testing failed: {e}")</code></pre>
</details>
<div class="desc"><p>Run commands and verify expected output on a set of nodes.</p></div>
</dd>
</dl>
<h3>Inherited members</h3>
<ul class="hlist">
<li><code><b><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></b></code>:
<ul class="hlist">
<li><code><a title="connpy.services.base.BaseService.set_reserved_names" href="base.html#connpy.services.base.BaseService.set_reserved_names">set_reserved_names</a></code></li>
</ul>
</li>
</ul>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.services" href="index.html">connpy.services</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.services.execution_service.ExecutionService" href="#connpy.services.execution_service.ExecutionService">ExecutionService</a></code></h4>
<ul class="">
<li><code><a title="connpy.services.execution_service.ExecutionService.run_cli_script" href="#connpy.services.execution_service.ExecutionService.run_cli_script">run_cli_script</a></code></li>
<li><code><a title="connpy.services.execution_service.ExecutionService.run_commands" href="#connpy.services.execution_service.ExecutionService.run_commands">run_commands</a></code></li>
<li><code><a title="connpy.services.execution_service.ExecutionService.run_yaml_playbook" href="#connpy.services.execution_service.ExecutionService.run_yaml_playbook">run_yaml_playbook</a></code></li>
<li><code><a title="connpy.services.execution_service.ExecutionService.test_commands" href="#connpy.services.execution_service.ExecutionService.test_commands">test_commands</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
@@ -0,0 +1,285 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1">
<meta name="generator" content="pdoc3 0.11.5">
<title>connpy.services.import_export_service API documentation</title>
<meta name="description" content="">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/sanitize.min.css" integrity="sha512-y1dtMcuvtTMJc1yPgEqF0ZjQbhnc/bFhyvIyVNb9Zk5mIGtqVaAB1Ttl28su8AvFMOY0EwRbAe+HCLqj6W7/KA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/10up-sanitize.css/13.0.0/typography.min.css" integrity="sha512-Y1DYSb995BAfxobCkKepB1BqJJTPrOp3zPL74AWFugHHmmdcvO+C48WLrUOlhGMc0QG7AE3f7gmvvcrmX2fDoA==" crossorigin>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css" crossorigin>
<style>:root{--highlight-color:#fe9}.flex{display:flex !important}body{line-height:1.5em}#content{padding:20px}#sidebar{padding:1.5em;overflow:hidden}#sidebar > *:last-child{margin-bottom:2cm}.http-server-breadcrumbs{font-size:130%;margin:0 0 15px 0}#footer{font-size:.75em;padding:5px 30px;border-top:1px solid #ddd;text-align:right}#footer p{margin:0 0 0 1em;display:inline-block}#footer p:last-child{margin-right:30px}h1,h2,h3,h4,h5{font-weight:300}h1{font-size:2.5em;line-height:1.1em}h2{font-size:1.75em;margin:2em 0 .50em 0}h3{font-size:1.4em;margin:1.6em 0 .7em 0}h4{margin:0;font-size:105%}h1:target,h2:target,h3:target,h4:target,h5:target,h6:target{background:var(--highlight-color);padding:.2em 0}a{color:#058;text-decoration:none;transition:color .2s ease-in-out}a:visited{color:#503}a:hover{color:#b62}.title code{font-weight:bold}h2[id^="header-"]{margin-top:2em}.ident{color:#900;font-weight:bold}pre code{font-size:.8em;line-height:1.4em;padding:1em;display:block}code{background:#f3f3f3;font-family:"DejaVu Sans Mono",monospace;padding:1px 4px;overflow-wrap:break-word}h1 code{background:transparent}pre{border-top:1px solid #ccc;border-bottom:1px solid #ccc;margin:1em 0}#http-server-module-list{display:flex;flex-flow:column}#http-server-module-list div{display:flex}#http-server-module-list dt{min-width:10%}#http-server-module-list p{margin-top:0}.toc ul,#index{list-style-type:none;margin:0;padding:0}#index code{background:transparent}#index h3{border-bottom:1px solid #ddd}#index ul{padding:0}#index h4{margin-top:.6em;font-weight:bold}@media (min-width:200ex){#index .two-column{column-count:2}}@media (min-width:300ex){#index .two-column{column-count:3}}dl{margin-bottom:2em}dl dl:last-child{margin-bottom:4em}dd{margin:0 0 1em 3em}#header-classes + dl > dd{margin-bottom:3em}dd dd{margin-left:2em}dd p{margin:10px 0}.name{background:#eee;font-size:.85em;padding:5px 10px;display:inline-block;min-width:40%}.name:hover{background:#e0e0e0}dt:target 
.name{background:var(--highlight-color)}.name > span:first-child{white-space:nowrap}.name.class > span:nth-child(2){margin-left:.4em}.inherited{color:#999;border-left:5px solid #eee;padding-left:1em}.inheritance em{font-style:normal;font-weight:bold}.desc h2{font-weight:400;font-size:1.25em}.desc h3{font-size:1em}.desc dt code{background:inherit}.source > summary,.git-link-div{color:#666;text-align:right;font-weight:400;font-size:.8em;text-transform:uppercase}.source summary > *{white-space:nowrap;cursor:pointer}.git-link{color:inherit;margin-left:1em}.source pre{max-height:500px;overflow:auto;margin:0}.source pre code{font-size:12px;overflow:visible;min-width:max-content}.hlist{list-style:none}.hlist li{display:inline}.hlist li:after{content:',\2002'}.hlist li:last-child:after{content:none}.hlist .hlist{display:inline;padding-left:1em}img{max-width:100%}td{padding:0 .5em}.admonition{padding:.1em 1em;margin:1em 0}.admonition-title{font-weight:bold}.admonition.note,.admonition.info,.admonition.important{background:#aef}.admonition.todo,.admonition.versionadded,.admonition.tip,.admonition.hint{background:#dfd}.admonition.warning,.admonition.versionchanged,.admonition.deprecated{background:#fd4}.admonition.error,.admonition.danger,.admonition.caution{background:lightpink}</style>
<style media="screen and (min-width: 700px)">@media screen and (min-width:700px){#sidebar{width:30%;height:100vh;overflow:auto;position:sticky;top:0}#content{width:70%;max-width:100ch;padding:3em 4em;border-left:1px solid #ddd}pre code{font-size:1em}.name{font-size:1em}main{display:flex;flex-direction:row-reverse;justify-content:flex-end}.toc ul ul,#index ul ul{padding-left:1em}.toc > ul > li{margin-top:.5em}}</style>
<style media="print">@media print{#sidebar h1{page-break-before:always}.source{display:none}}@media print{*{background:transparent !important;color:#000 !important;box-shadow:none !important;text-shadow:none !important}a[href]:after{content:" (" attr(href) ")";font-size:90%}a[href][title]:after{content:none}abbr[title]:after{content:" (" attr(title) ")"}.ir a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{border:1px solid #999;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100% !important}@page{margin:0.5cm}p,h2,h3{orphans:3;widows:3}h1,h2,h3,h4,h5,h6{page-break-after:avoid}}</style>
<script defer src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js" integrity="sha512-D9gUyxqja7hBtkWpPWGt9wfbfaMGVt9gnyCvYa+jojwwPHLCzUm5i8rpk7vD7wNee9bA35eYIjobYPaQuKS1MQ==" crossorigin></script>
<script>window.addEventListener('DOMContentLoaded', () => {
hljs.configure({languages: ['bash', 'css', 'diff', 'graphql', 'ini', 'javascript', 'json', 'plaintext', 'python', 'python-repl', 'rust', 'shell', 'sql', 'typescript', 'xml', 'yaml']});
hljs.highlightAll();
/* Collapse source docstrings */
setTimeout(() => {
    [...document.querySelectorAll('.hljs.language-python > .hljs-string')]
        .filter(el => el.innerHTML.length > 200 && ['"""', "'''"].includes(el.innerHTML.substring(0, 3)))
        .forEach(el => {
            let d = document.createElement('details');
            d.classList.add('hljs-string');
            d.innerHTML = '<summary>"""</summary>' + el.innerHTML.substring(3);
            el.replaceWith(d);
        });
}, 100);
})</script>
</head>
<body>
<main>
<article id="content">
<header>
<h1 class="title">Module <code>connpy.services.import_export_service</code></h1>
</header>
<section id="section-intro">
</section>
<section>
</section>
<section>
</section>
<section>
</section>
<section>
<h2 class="section-title" id="header-classes">Classes</h2>
<dl>
<dt id="connpy.services.import_export_service.ImportExportService"><code class="flex name class">
<span>class <span class="ident">ImportExportService</span></span>
<span>(</span><span>config=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">class ImportExportService(BaseService):
    """Business logic for YAML/JSON inventory import and export."""

    def export_to_file(self, file_path, folders=None):
        """Export nodes/folders to a YAML file."""
        if os.path.exists(file_path):
            raise InvalidConfigurationError(f"File '{file_path}' already exists.")

        data = self.export_to_dict(folders)
        try:
            with open(file_path, "w") as f:
                yaml.dump(data, f, Dumper=NoAliasDumper, default_flow_style=False)
        except OSError as e:
            raise InvalidConfigurationError(f"Failed to export to '{file_path}': {e}")

    def export_to_dict(self, folders=None):
        """Export nodes/folders to a dictionary."""
        if not folders:
            return self.config._getallnodesfull(extract=False)
        else:
            # Validate folders exist
            for f in folders:
                if f != "@" and f not in self.config._getallfolders():
                    raise NodeNotFoundError(f"Folder '{f}' not found.")
            return self.config._getallnodesfull(folders, extract=False)

    def import_from_file(self, file_path):
        """Import nodes/folders from a YAML file."""
        if not os.path.exists(file_path):
            raise InvalidConfigurationError(f"File '{file_path}' does not exist.")

        try:
            with open(file_path, "r") as f:
                data = yaml.load(f, Loader=yaml.FullLoader)
            self.import_from_dict(data)
        except Exception as e:
            raise InvalidConfigurationError(f"Failed to read/parse import file: {e}")

    def import_from_dict(self, data):
        """Import nodes/folders from a dictionary."""
        if not isinstance(data, dict):
            raise InvalidConfigurationError("Invalid import data format: expected a dictionary of nodes.")

        # Process imports
        for k, v in data.items():
            uniques = self.config._explode_unique(k)

            # Ensure folders exist
            if "folder" in uniques:
                folder_name = f"@{uniques['folder']}"
                if folder_name not in self.config._getallfolders():
                    folder_uniques = self.config._explode_unique(folder_name)
                    self.config._folder_add(**folder_uniques)

            if "subfolder" in uniques:
                sub_name = f"@{uniques['subfolder']}@{uniques['folder']}"
                if sub_name not in self.config._getallfolders():
                    sub_uniques = self.config._explode_unique(sub_name)
                    self.config._folder_add(**sub_uniques)

            # Add node/connection
            v.update(uniques)
            self._validate_node_name(k)
            self.config._connections_add(**v)

        self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"><p>Business logic for YAML/JSON inventory import and export.</p>
<p>Initialize the service.</p>
<h2 id="args">Args</h2>
<dl>
<dt><strong><code>config</code></strong></dt>
<dd>An instance of configfile (or None to instantiate a new one/use global context).</dd>
</dl></div>
<h3>Ancestors</h3>
<ul class="hlist">
<li><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></li>
</ul>
<h3>Methods</h3>
<dl>
<dt id="connpy.services.import_export_service.ImportExportService.export_to_dict"><code class="name flex">
<span>def <span class="ident">export_to_dict</span></span>(<span>self, folders=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def export_to_dict(self, folders=None):
    """Export nodes/folders to a dictionary."""
    if not folders:
        return self.config._getallnodesfull(extract=False)
    else:
        # Validate folders exist
        for f in folders:
            if f != "@" and f not in self.config._getallfolders():
                raise NodeNotFoundError(f"Folder '{f}' not found.")
        return self.config._getallnodesfull(folders, extract=False)</code></pre>
</details>
<div class="desc"><p>Export nodes/folders to a dictionary.</p></div>
</dd>
<dt id="connpy.services.import_export_service.ImportExportService.export_to_file"><code class="name flex">
<span>def <span class="ident">export_to_file</span></span>(<span>self, file_path, folders=None)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def export_to_file(self, file_path, folders=None):
    """Export nodes/folders to a YAML file."""
    if os.path.exists(file_path):
        raise InvalidConfigurationError(f"File '{file_path}' already exists.")

    data = self.export_to_dict(folders)
    try:
        with open(file_path, "w") as f:
            yaml.dump(data, f, Dumper=NoAliasDumper, default_flow_style=False)
    except OSError as e:
        raise InvalidConfigurationError(f"Failed to export to '{file_path}': {e}")</code></pre>
</details>
<div class="desc"><p>Export nodes/folders to a YAML file.</p></div>
</dd>
<dt id="connpy.services.import_export_service.ImportExportService.import_from_dict"><code class="name flex">
<span>def <span class="ident">import_from_dict</span></span>(<span>self, data)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def import_from_dict(self, data):
    """Import nodes/folders from a dictionary."""
    if not isinstance(data, dict):
        raise InvalidConfigurationError("Invalid import data format: expected a dictionary of nodes.")

    # Process imports
    for k, v in data.items():
        uniques = self.config._explode_unique(k)

        # Ensure folders exist
        if "folder" in uniques:
            folder_name = f"@{uniques['folder']}"
            if folder_name not in self.config._getallfolders():
                folder_uniques = self.config._explode_unique(folder_name)
                self.config._folder_add(**folder_uniques)

        if "subfolder" in uniques:
            sub_name = f"@{uniques['subfolder']}@{uniques['folder']}"
            if sub_name not in self.config._getallfolders():
                sub_uniques = self.config._explode_unique(sub_name)
                self.config._folder_add(**sub_uniques)

        # Add node/connection
        v.update(uniques)
        self._validate_node_name(k)
        self.config._connections_add(**v)

    self.config._saveconfig(self.config.file)</code></pre>
</details>
<div class="desc"><p>Import nodes/folders from a dictionary.</p></div>
</dd>
<dt id="connpy.services.import_export_service.ImportExportService.import_from_file"><code class="name flex">
<span>def <span class="ident">import_from_file</span></span>(<span>self, file_path)</span>
</code></dt>
<dd>
<details class="source">
<summary>
<span>Expand source code</span>
</summary>
<pre><code class="python">def import_from_file(self, file_path):
    """Import nodes/folders from a YAML file."""
    if not os.path.exists(file_path):
        raise InvalidConfigurationError(f"File '{file_path}' does not exist.")

    try:
        with open(file_path, "r") as f:
            data = yaml.load(f, Loader=yaml.FullLoader)
        self.import_from_dict(data)
    except Exception as e:
        raise InvalidConfigurationError(f"Failed to read/parse import file: {e}")</code></pre>
</details>
<div class="desc"><p>Import nodes/folders from a YAML file.</p></div>
</dd>
</dl>
<h3>Inherited members</h3>
<ul class="hlist">
<li><code><b><a title="connpy.services.base.BaseService" href="base.html#connpy.services.base.BaseService">BaseService</a></b></code>:
<ul class="hlist">
<li><code><a title="connpy.services.base.BaseService.set_reserved_names" href="base.html#connpy.services.base.BaseService.set_reserved_names">set_reserved_names</a></code></li>
</ul>
</li>
</ul>
</dd>
</dl>
</section>
</article>
<nav id="sidebar">
<div class="toc">
<ul></ul>
</div>
<ul id="index">
<li><h3>Super-module</h3>
<ul>
<li><code><a title="connpy.services" href="index.html">connpy.services</a></code></li>
</ul>
</li>
<li><h3><a href="#header-classes">Classes</a></h3>
<ul>
<li>
<h4><code><a title="connpy.services.import_export_service.ImportExportService" href="#connpy.services.import_export_service.ImportExportService">ImportExportService</a></code></h4>
<ul class="">
<li><code><a title="connpy.services.import_export_service.ImportExportService.export_to_dict" href="#connpy.services.import_export_service.ImportExportService.export_to_dict">export_to_dict</a></code></li>
<li><code><a title="connpy.services.import_export_service.ImportExportService.export_to_file" href="#connpy.services.import_export_service.ImportExportService.export_to_file">export_to_file</a></code></li>
<li><code><a title="connpy.services.import_export_service.ImportExportService.import_from_dict" href="#connpy.services.import_export_service.ImportExportService.import_from_dict">import_from_dict</a></code></li>
<li><code><a title="connpy.services.import_export_service.ImportExportService.import_from_file" href="#connpy.services.import_export_service.ImportExportService.import_from_file">import_from_file</a></code></li>
</ul>
</li>
</ul>
</li>
</ul>
</nav>
</main>
<footer id="footer">
<p>Generated by <a href="https://pdoc3.github.io/pdoc" title="pdoc: Python API documentation generator"><cite>pdoc</cite> 0.11.5</a>.</p>
</footer>
</body>
</html>
Some files were not shown because too many files have changed in this diff