Newelle, the AI-powered assistant for Linux systems, has been updated to version 1.2, bringing a host of new features and improvements.
Key Features
- llama.cpp Integration: Added support for llama.cpp, with the option to recompile it against any supported backend.
- Model Library: Implemented a new model library for ollama and llama.cpp.
- Hybrid Search: Improved hybrid search, with a focus on optimizing document reading.
- Command Execution: Added a tool for executing commands directly from the assistant.
- Tool Groups: Introduced the ability to create and manage tool groups.
- MCP Server: Streamlined adding MCP servers, with STDIO support for non-Flatpak environments.
- Semantic Memory: Implemented semantic memory management.
- Chat Import/Export: Added the ability to import and export chats.
- RAG Index: Added the ability to include custom folders in the RAG index.
- Message Information: Improved the message information menu, displaying token count and generation speed.
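On the MCP server feature above: STDIO-based MCP servers are conventionally described by a command plus its arguments. The fragment below uses the generic JSON convention shared by most MCP clients; the exact keys and file location Newelle uses are an assumption here, and the `server-filesystem` package and path are placeholders.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user/Documents"]
    }
  }
}
```

In non-Flatpak installs, STDIO support means Newelle can spawn such a server process directly rather than relying on a network transport.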
Newelle 1.2 is available for download via Flathub.
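If the `flatpak` CLI is configured with the Flathub remote, installation is typically a one-liner. The application ID below is assumed from the project's Flathub listing; verify it on the Flathub page before installing.

```shell
# Install Newelle from the Flathub remote
# (app ID io.github.qwersyk.Newelle is an assumption, not confirmed by this article)
flatpak install flathub io.github.qwersyk.Newelle
```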