The integration of Model Context Protocol (MCP) support in llama.cpp is currently in testing, bringing new features that improve interaction with language models.
New Features
The new features include:
- System Message support in conversations.
- CORS proxy server integrated into the llama-server backend.
- Server Selector.
- Settings with server cards showing capabilities and instructions.
- Tool Calls and an Agentic Loop.
- Tool-call logic and UI with processing statistics.
- Automatic prompt detection.
- Prompt Picker and Prompt Args Form.
- Resource management via an integrated browser.
- Raw output display.
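The Tool Calls and Agentic Loop feature can be illustrated with a minimal sketch: the model either answers directly or requests a tool call, the result is fed back into the conversation, and the loop repeats until a final answer is produced. Everything below is hypothetical and for illustration only; the function names, message shapes, and stub model are assumptions, not llama.cpp's actual API.

```python
# Hypothetical sketch of an agentic tool-call loop, as a WebUI might run it.
# All names and message shapes here are illustrative assumptions.

def stub_model(messages):
    """Stand-in for the LLM: requests the 'add' tool once, then answers."""
    tool_msgs = [m for m in messages if m["role"] == "tool"]
    if not tool_msgs:
        return {"tool_call": {"name": "add", "args": {"a": 2, "b": 3}}}
    return {"content": f"The sum is {tool_msgs[-1]['content']}."}

# Tools such as an MCP server could expose them (illustrative).
TOOLS = {"add": lambda a, b: a + b}

def agentic_loop(user_prompt, max_steps=5):
    """Run model -> tool -> model until a final answer or step limit."""
    messages = [{"role": "user", "content": user_prompt}]
    for _ in range(max_steps):
        reply = stub_model(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]  # final answer, loop ends
        result = TOOLS[call["name"]](**call["args"])
        messages.append({"role": "tool", "content": str(result)})
    raise RuntimeError("agentic loop hit the step limit")

print(agentic_loop("What is 2 + 3?"))
```

A step limit like `max_steps` is the usual safeguard in such loops, preventing a model that keeps requesting tools from running indefinitely.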
This is a work in progress, so proceed with caution.