- Refactored AI module to use litellm, supporting Anthropic, Google, OpenAI, etc.
- Introduced 'Engineer' (execution) and 'Architect' (strategic) AI agents.
- Added real-time streaming responses and interactive chat mode via 'rich'.
- Added CLI arguments for model/key overrides (--engineer-model, --architect-model).
- Replaced 'openai' with 'litellm' in requirements.txt and setup.cfg.
- Updated nodes.run() to support an 'on_complete' callback for live node-status streaming.
- Fixed an undefined variable bug (config.profiles -> self.profiles) in configfile.py.
- Updated README and docstrings with new AI plugin tool registration API.
- Regenerated HTML documentation using pdoc3.
- Bumped version to 5.0b1 for beta release.
- Fix an issue where some regex symbols could not be used when passing
  arguments.
- Fix AI requests timing out when the node list is large.
- Fix an error when forwarding connpy run commands to a file.
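The new 'on_complete' callback on nodes.run() can be pictured along these lines. This is an illustrative sketch only: the callback signature (node name plus output) is an assumption, not connpy's actual API, and a stub runner stands in for the real execution.

```python
# Hypothetical sketch of a per-node completion callback, as described in the
# nodes.run() change above. The signature is an assumption for illustration.
from concurrent.futures import ThreadPoolExecutor

def run_nodes(nodes, command, on_complete=None):
    """Run `command` on each node, invoking on_complete as each one finishes."""
    results = {}

    def run_one(node):
        # Placeholder for the real SSH/Telnet execution.
        output = f"{node}: ran {command!r}"
        if on_complete:
            on_complete(node, output)  # live node-status update
        return node, output

    with ThreadPoolExecutor() as pool:
        for node, output in pool.map(run_one, nodes):
            results[node] = output
    return results

statuses = []
results = run_nodes(["router1", "switch2"], "show version",
                    on_complete=lambda node, out: statuses.append(node))
```

Here the callback fires as each node completes, which is what enables streaming status to the screen instead of waiting for the whole batch.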
Features:
- Improve AI response time by sending a list of OS types instead of the full
  list of devices, reducing the length of the request.
- Update the GPT model to the latest one.
- Add a filtering option to the list command; a format string can also be
  passed to shape the output as needed.
- Allow regular expressions to match nodes in the run command (via a YAML
  file or directly) and in the remove command.
- On connectivity errors, show the error number along with the protocol
  error.
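Regex-based node matching, as used by the run and remove commands above, can be sketched like this. The helper below is an assumption for illustration, not connpy's actual implementation:

```python
# Illustrative sketch: select node names with a regular expression,
# as the run/remove commands now allow.
import re

def match_nodes(pattern, nodes):
    """Return the node names fully matched by the given regular expression."""
    rx = re.compile(pattern)
    return [n for n in nodes if rx.fullmatch(n)]

nodes = ["edge-router-1", "edge-router-2", "core-switch-1"]
match_nodes(r"edge-router-\d+", nodes)  # -> ['edge-router-1', 'edge-router-2']
```

Using fullmatch rather than search keeps a pattern like `edge-router-1` from accidentally selecting `edge-router-10` on a larger inventory.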