Add LLM configuration and MCP server management UI and backend functionality

- Implemented a new SPA page for LLM Bridge and MCP Server settings in `llm-config.js`.
- Added functionality for managing LLM and MCP server configurations, including toggling servers, saving settings, and testing connections.
- Created HTTP endpoints in `llm_utils.py` for handling LLM chat, status checks, and MCP server configuration.
- Integrated model fetching from LaRuche and Ollama backends.
- Enhanced error handling and logging for better debugging and user feedback.
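
The configuration-management flow described above (toggle an MCP server, persist the settings) could look roughly like the following. This is a hedged sketch only: the helper names `toggle_mcp_server` and `save_config`, and the config shape, are assumptions for illustration, not the actual API in `llm_utils.py`.

```python
import json

def toggle_mcp_server(config: dict, name: str, enabled: bool) -> dict:
    """Enable or disable a named MCP server entry, creating it if missing.
    (Hypothetical helper; the real config schema may differ.)"""
    servers = config.setdefault("mcp_servers", {})
    entry = servers.setdefault(name, {"command": "", "args": []})
    entry["enabled"] = enabled
    return config

def save_config(config: dict, path: str) -> None:
    """Persist the merged configuration as JSON (assumed storage format)."""
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(config, fh, indent=2)

cfg = {"llm": {"backend": "ollama", "enabled": True}}
toggle_mcp_server(cfg, "filesystem", True)
print(cfg["mcp_servers"]["filesystem"]["enabled"])  # True
```

A save endpoint would then validate the incoming payload, call a helper like `toggle_mcp_server`, and write the result back to disk before reporting status to the SPA.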
This commit is contained in:
infinition
2026-03-16 20:33:22 +01:00
parent aac77a3e76
commit b759ab6d4b
41 changed files with 9991 additions and 397 deletions


@@ -364,3 +364,32 @@
align-items: flex-start;
}
}
/* ── AI Sentinel elements ─────────────────────────────── */
.sentinel-ai-btn {
background: rgba(168, 85, 247, .1) !important;
border-color: rgba(168, 85, 247, .3) !important;
color: #c084fc !important;
}
.sentinel-ai-btn:hover {
background: rgba(168, 85, 247, .2) !important;
}
.sentinel-ai-result {
margin: 4px 0 0;
padding: 6px 8px;
background: rgba(168, 85, 247, .06);
border: 1px solid rgba(168, 85, 247, .15);
border-radius: 4px;
font-size: 0.7rem;
white-space: pre-wrap;
line-height: 1.4;
display: none;
}
.sentinel-ai-result.active { display: block; }
.sentinel-ai-summary {
padding: 8px;
background: rgba(59, 130, 246, .06);
border: 1px solid rgba(59, 130, 246, .15);
border-radius: 6px;
margin-bottom: 8px;
}