feat(chat): replace built-in chat with LibreChat SSO integration
CI (push): Format successful in 26s; Clippy successful in 3m0s; Security Audit, Tests, and Deploy skipped.

Replaces the custom chat page with an external LibreChat instance that
shares Keycloak SSO for seamless auto-login. Removes Tools and Knowledge
Base pages as these are now handled by LibreChat's built-in capabilities.

- Add LibreChat service to docker-compose with Ollama backend config
- Add Keycloak OIDC client (certifai-librechat) with prompt=none for
  silent SSO
- Create librechat.yaml with CERTifAI branding, Ollama endpoint, and
  custom page title/logo
- Change sidebar Chat link to external URL (opens LibreChat in new tab)
- Remove chat page, tools page, knowledge base page and all related
  components (chat_sidebar, chat_bubble, chat_input_bar, etc.)
- Remove tool_card, file_row components and tool/knowledge models
- Remove chat_stream SSE handler (no longer needed)
- Clean up i18n files: remove chat, tools, knowledge sections
- Dashboard article summarization via Ollama remains intact
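
The SSO wiring described above can be sketched as a compose fragment. This is illustrative only: the realm name, secret handling, volume path, and ports are assumptions, while the `certifai-librechat` client ID comes from this commit and the `OPENID_*` names are LibreChat's standard OpenID environment variables. Per the commit, the authorize request carries `prompt=none`, so a browser that already holds a Keycloak session from the main app is logged in silently.

```yaml
# Illustrative docker-compose fragment -- realm name, secret source,
# and ports are assumptions; only OPENID_* variable names follow
# LibreChat's documented .env configuration.
services:
  librechat:
    image: ghcr.io/danny-avila/librechat:latest
    environment:
      - ALLOW_SOCIAL_LOGIN=true
      - OPENID_CLIENT_ID=certifai-librechat
      - OPENID_CLIENT_SECRET=${LIBRECHAT_OIDC_SECRET}
      - OPENID_ISSUER=http://keycloak:8080/realms/certifai
      - OPENID_SCOPE=openid profile email
      - OPENID_CALLBACK_URL=/oauth/openid/callback
    volumes:
      - ./librechat/librechat.yaml:/app/librechat.yaml:ro
    ports:
      - "3080:3080"
```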

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: Sharang Parnerkar
Date: 2026-02-23 13:37:17 +01:00
Parent: d814e22f9d
Commit: 74a225224c
32 changed files with 200 additions and 2118 deletions

librechat/librechat.yaml (new file, +35)

@@ -0,0 +1,35 @@
# CERTifAI LibreChat Configuration
# Ollama backend for self-hosted LLM inference.
version: 1.2.1
cache: true
registration:
  socialLogins:
    - openid
interface:
  privacyPolicy:
    externalUrl: http://localhost:8000/privacy
  termsOfService:
    externalUrl: http://localhost:8000/impressum
  endpointsMenu: true
  modelSelect: true
  parameters: true
endpoints:
  ollama:
    titleModel: "current_model"
    # Use the Docker host network alias when running inside compose.
    # Override OLLAMA_URL in .env for external Ollama instances.
    url: "http://host.docker.internal:11434"
    models:
      fetch: true
    summarize: true
    forcePrompt: false
    dropParams:
      - stop
      - user
      - frequency_penalty
      - presence_penalty
    modelDisplayLabel: "CERTifAI Ollama"
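
For the `OLLAMA_URL` override mentioned in the comment above, the URL field could defer to the environment instead of the hardcoded compose default. This assumes the `${VAR}` interpolation LibreChat applies to endpoint fields in `librechat.yaml`; a minimal sketch:

```yaml
# Illustrative: resolve the endpoint URL from the environment, so an
# external Ollama instance set via .env takes effect without editing
# this file. Falls back to nothing if OLLAMA_URL is unset.
endpoints:
  ollama:
    url: "${OLLAMA_URL}"
```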

librechat/logo.svg (new file, +25)

@@ -0,0 +1,25 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 64 64" fill="none">
<!-- Shield body -->
<path d="M32 4L8 16v16c0 14.4 10.24 27.2 24 32 13.76-4.8 24-17.6 24-32V16L32 4z"
fill="#4B3FE0" fill-opacity="0.12" stroke="#4B3FE0" stroke-width="2"
stroke-linejoin="round"/>
<!-- Inner shield highlight -->
<path d="M32 10L14 19v11c0 11.6 7.68 22 18 26 10.32-4 18-14.4 18-26V19L32 10z"
fill="none" stroke="#4B3FE0" stroke-width="1" stroke-opacity="0.3"
stroke-linejoin="round"/>
<!-- Neural network nodes -->
<circle cx="32" cy="24" r="3.5" fill="#38B2AC"/>
<circle cx="22" cy="36" r="3" fill="#38B2AC"/>
<circle cx="42" cy="36" r="3" fill="#38B2AC"/>
<circle cx="27" cy="48" r="2.5" fill="#38B2AC" fill-opacity="0.7"/>
<circle cx="37" cy="48" r="2.5" fill="#38B2AC" fill-opacity="0.7"/>
<!-- Neural network edges -->
<line x1="32" y1="24" x2="22" y2="36" stroke="#38B2AC" stroke-width="1.2" stroke-opacity="0.6"/>
<line x1="32" y1="24" x2="42" y2="36" stroke="#38B2AC" stroke-width="1.2" stroke-opacity="0.6"/>
<line x1="22" y1="36" x2="27" y2="48" stroke="#38B2AC" stroke-width="1" stroke-opacity="0.4"/>
<line x1="22" y1="36" x2="37" y2="48" stroke="#38B2AC" stroke-width="1" stroke-opacity="0.4"/>
<line x1="42" y1="36" x2="27" y2="48" stroke="#38B2AC" stroke-width="1" stroke-opacity="0.4"/>
<line x1="42" y1="36" x2="37" y2="48" stroke="#38B2AC" stroke-width="1" stroke-opacity="0.4"/>
<!-- Cross edge for connectivity -->
<line x1="22" y1="36" x2="42" y2="36" stroke="#38B2AC" stroke-width="0.8" stroke-opacity="0.3"/>
</svg>

(SVG image, 1.6 KiB)