# Newelle 1.2: AI assistant for Linux gets an update
Newelle, the AI-powered assistant for Linux systems, has been updated to version 1.2, bringing with it a host of new features and improvements.
## Key Features
* **llama.cpp Integration:** Added support for llama.cpp, with the option to recompile it with any backend.
* **Model Library:** Implemented a new model library for ollama and llama.cpp (see the example after this list).
* **Hybrid Search:** Improved hybrid search, with a focus on optimizing document reading.
* **Command Execution:** Added a tool for executing commands directly from the assistant.
* **Tool Groups:** Introduced the ability to create and manage tool groups.
* **MCP Server:** Streamlined the addition of MCP servers, with STDIO support available outside Flatpak.
* **Semantic Memory:** Implemented semantic memory management.
* **Chat Import/Export:** Added the ability to import and export chats.
* **RAG Index:** Added the ability to include custom folders in the RAG index.
* **Message Information:** Improved the message information menu, displaying token count and generation speed.
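
As a minimal sketch of the ollama path for the model library above: assuming ollama is installed and listening on its default port (11434), pulling a model makes it available for Newelle's ollama backend to select. The model name `llama3.2` is only a placeholder.

```sh
# Fetch a model into the local ollama store (model name is just an example)
ollama pull llama3.2

# Quick sanity check that the local server answers;
# Newelle's ollama backend would then point at http://localhost:11434
ollama run llama3.2 "Say hello"
```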
Newelle 1.2 is available for download via [FlatHub](https://flathub.org/en/apps/io.github.qwersyk.Newelle).
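
For a command-line install, the Flathub app ID from the link above can be used directly (this assumes Flatpak and the Flathub remote are already configured on the system):

```sh
# Install Newelle from Flathub
flatpak install flathub io.github.qwersyk.Newelle

# Launch the app
flatpak run io.github.qwersyk.Newelle
```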