Support for MCP (Model Context Protocol) in llama.cpp is currently in testing, introducing new features that improve interaction with language models.
New Features
This update introduces:
- Support for adding a System Message to conversations.
- A CORS proxy server integrated into the llama-server backend.
- A server selector.
- A settings panel with server cards showing each server's capabilities and instructions.
- Tool calls and an agentic loop.
- Logic and UI for processing statistics.
- Automatic prompt detection.
- A prompt picker and a prompt-arguments form.
- Resource management via an integrated browser.
- Raw output display.
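The tool-call and agentic-loop feature above follows a common pattern: the model emits a tool call, the client executes it and feeds the result back as a new message, and the cycle repeats until the model replies in plain text. A minimal sketch of that loop, with a stubbed model standing in for a real chat-completion call to a backend such as llama-server (the tool registry, message shapes, and `stub_model` are illustrative, not the actual implementation):

```python
import json

# Hypothetical tool registry; a real client would expose MCP server tools here.
TOOLS = {
    "add": lambda args: args["a"] + args["b"],
}

def stub_model(messages):
    """Stand-in for a chat-completion request to the backend.

    Returns a tool call on the first turn, then a final text answer
    once a tool result is present in the conversation.
    """
    if any(m["role"] == "tool" for m in messages):
        result = messages[-1]["content"]
        return {"role": "assistant", "content": f"The result is {result}."}
    return {
        "role": "assistant",
        "content": None,
        "tool_call": {"name": "add", "arguments": {"a": 2, "b": 3}},
    }

def agentic_loop(user_prompt, max_steps=5):
    """Run model -> tool -> model until the model replies with plain text."""
    messages = [{"role": "user", "content": user_prompt}]
    for _ in range(max_steps):
        reply = stub_model(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]  # final answer, loop ends
        messages.append(reply)
        result = TOOLS[call["name"]](call["arguments"])
        messages.append({"role": "tool", "content": json.dumps(result)})
    raise RuntimeError("agentic loop did not terminate")

print(agentic_loop("What is 2 + 3?"))  # → The result is 5.
```

The `max_steps` cap is the important design detail: without it, a model that keeps requesting tools would loop forever.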
This is a work in progress, so proceed with caution.
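Features such as the prompt picker and the integrated resource browser map onto standard MCP operations: MCP clients and servers exchange JSON-RPC 2.0 messages with methods like `tools/list`, `prompts/list`, and `resources/read`. A hedged sketch of what those client-side request messages look like (the helper function and the example URI are illustrative; only the method names come from the MCP specification):

```python
import itertools
import json

_ids = itertools.count(1)

def jsonrpc_request(method, params=None):
    """Build a JSON-RPC 2.0 request of the kind an MCP client sends."""
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Requests a client would issue to populate the UI features above.
list_tools = jsonrpc_request("tools/list")        # backs tool calls
list_prompts = jsonrpc_request("prompts/list")    # backs the prompt picker
read_resource = jsonrpc_request(                  # backs the resource browser
    "resources/read", {"uri": "file:///example.txt"}  # illustrative URI
)

print(json.dumps(list_tools))
```

Each request carries a unique `id` so responses can be matched to requests over whatever transport the client and server use.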