Commit Graph

8 Commits

Author SHA1 Message Date
fluzzi32 a192bd1912 connpy v6.0.0b4: AI Stability, Remote Sync & UI Polish (Clean Commit) 2026-05-01 18:55:25 -03:00
fluzzi32 1c814eb9fd refactor: Major upgrade to v5.1b6 - AWS SSM support & Distributed Architecture
Core & Protocols:
- Native AWS SSM support added (aws ssm start-session).
- Improved Pexpect logic for ssm, kubectl, and docker.
- Cleaned connection success messages (omitting ports for non-IP protocols).

gRPC Layer:
- Migrated gRPC modules to 'connpy/grpc_layer/'.
- Implemented dynamic node naming (e.g. ssm-i-xxxx@aws) for accurate server-side logging.
- Added automatic sys.path resolution for gRPC generated modules.
- Enhanced InteractNode response with initial connection status.
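The dynamic node naming and sys.path resolution could be sketched as follows. Both helper names and the naming scheme are assumptions inferred from the `ssm-i-xxxx@aws` example in the commit message; protoc-generated modules commonly need their own directory on `sys.path` because they import sibling stubs by bare name:

```python
import sys
from pathlib import Path

def grpc_node_name(protocol: str, target: str, env: str) -> str:
    """Compose a dynamic node name like 'ssm-<target>@<env>' for
    server-side logging. Hypothetical helper; the real scheme may differ."""
    return f"{protocol}-{target}@{env}"

def ensure_grpc_path(stubs_dir: str) -> None:
    """Prepend the generated-stubs directory to sys.path so modules
    produced by protoc (which import each other by bare name) resolve."""
    stubs = str(Path(stubs_dir))
    if stubs not in sys.path:
        sys.path.insert(0, stubs)
```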

Printer & Concurrency:
- Implemented ThreadLocalStream for isolated thread-safe output.
- Self-healing Console objects to prevent 'closed file' errors in test/async environments.
- Captured clean plugin output in remote executions.

AI & Services:
- Improved tool registration and debug visualization.
- Restored native dictionary returns for AI tools to fix Web UI rendering.
- Increased backup retention to 100 copies in SyncService.
- Silenced noisy auto-sync CLI messages.
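The retention bump to 100 copies implies a pruning step of roughly this shape. The function name and the oldest-to-newest ordering convention are assumptions, not SyncService's actual API:

```python
def prune_backups(backups: list[str], keep: int = 100) -> list[str]:
    """Given backup names ordered oldest-to-newest, return the ones to
    delete so that only the newest `keep` copies remain."""
    if keep <= 0:
        return list(backups)
    return backups[:-keep] if len(backups) > keep else []
```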

Quality & Docs:
- Total tests: 267 (all passing).
- New test suites for gRPC layer and printer concurrency.
- Updated .gitignore to exclude internal planning docs.
- Full technical documentation regenerated with pdoc.
2026-04-24 19:23:00 -03:00
fluzzi32 cb926c2b85 feat: major architectural refactor to 5.1b1 - Service Layer, gRPC & Agent evolution (fragmented secrets) 2026-04-17 18:42:08 -03:00
fluzzi32 d8f7d4db87 feat: migrate config to YAML, add dual-caching and 0ms fzf wrapper
- Migrated configuration backend from JSON to YAML for better readability.
- Added automatic dual-caching (.config.cache.json) to preserve fast load times with YAML.
- Implemented a new 0ms latency fzf wrapper for bash and zsh (--fzf-wrapper).
- Updated sync plugin to support the new YAML config format and clear caches on extraction.
- Refactored 'completion.py' to gracefully handle fallback config formats.
- Added new test modules (test_capture, test_context, test_sync) covering core plugins.
- Updated existing unit tests to handle YAML config creation and parsing.
- Bumped version to 5.0b3 and regenerated HTML documentation.
2026-04-03 18:47:55 -03:00
fluzzi32 257cb05cc1 feat: complete overhaul of AI subsystem with multi-agent litellm architecture
- Refactored AI module to use litellm, supporting Anthropic, Google, OpenAI, etc.
- Introduced 'Engineer' (execution) and 'Architect' (strategic) AI agents.
- Added real-time streaming responses and interactive chat mode via 'rich'.
- Added CLI arguments for model/key overrides (--engineer-model, --architect-model).
- Replaced 'openai' with 'litellm' in requirements.txt and setup.cfg.
- Updated nodes.run() to support an 'on_complete' callback for live node-status streaming.
- Fixed an undefined variable bug (config.profiles -> self.profiles) in configfile.py.
- Updated README and docstrings with new AI plugin tool registration API.
- Regenerated HTML documentation using pdoc3.
- Bumped version to 5.0b1 for beta release.
2026-04-03 15:11:37 -03:00
fluzzi32 fcf8ed6859 Add features:
- New protocols: Docker and Kubectl
- Add contexts to filter the number of nodes
- Add option to modify the API using plugins
- Minor bug fixes
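Node-filtering by context could work along these lines. Glob-style matching and the function name are assumptions for illustration; Connpy's actual context syntax may differ:

```python
from fnmatch import fnmatch

def filter_nodes(nodes: list[str], context: str) -> list[str]:
    """Return only the node names matching the active context pattern."""
    return [n for n in nodes if fnmatch(n, context)]
```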
2024-07-15 15:38:01 -03:00
fluzzi32 76a73fa427 add hooks and sync to google 2024-04-17 16:27:02 -03:00
fluzzi32 05d72276d8 Initial commit 2022-03-17 19:00:57 -03:00