This repository was archived on 2026-02-15. You can view and clone its files, but you cannot open issues, create pull requests, or push commits.
BreakPilot Dev 19855efacc
feat: BreakPilot PWA - Full codebase (clean push without large binaries)
All services: admin-v2, studio-v2, website, ai-compliance-sdk,
consent-service, klausur-service, voice-service, and infrastructure.
Large PDFs and compiled binaries excluded via .gitignore.
2026-02-11 13:25:58 +01:00


"""
BreakPilot LLM Gateway - Main Application
OpenAI-kompatibles API Gateway für Self-hosted LLMs.
"""
import logging
from contextlib import asynccontextmanager
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from .config import get_config
from .routes import chat_router, playbooks_router, health_router, comparison_router, edu_search_seeds_router, communication_router
from .services.inference import get_inference_service
# Logging Setup
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
@asynccontextmanager
async def lifespan(app: FastAPI):
"""Lifecycle Management für den Gateway."""
logger.info("Starting LLM Gateway...")
config = get_config()
logger.info(f"Debug mode: {config.debug}")
logger.info(f"Backends configured: ollama={bool(config.ollama)}, vllm={bool(config.vllm)}, anthropic={bool(config.anthropic)}")
yield
# Cleanup
logger.info("Shutting down LLM Gateway...")
inference_service = get_inference_service()
await inference_service.close()
def create_app() -> FastAPI:
"""Factory Function für die FastAPI App."""
config = get_config()
app = FastAPI(
title="BreakPilot LLM Gateway",
description="OpenAI-kompatibles API Gateway für Self-hosted LLMs",
version="0.1.0",
lifespan=lifespan,
docs_url="/docs" if config.debug else None,
redoc_url="/redoc" if config.debug else None,
)
# CORS
app.add_middleware(
CORSMiddleware,
allow_origins=["*"], # In Produktion einschränken
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
# Routes
app.include_router(health_router)
app.include_router(chat_router, prefix="/v1")
app.include_router(playbooks_router)
app.include_router(comparison_router, prefix="/v1")
app.include_router(edu_search_seeds_router, prefix="/v1")
app.include_router(communication_router, prefix="/v1")
return app
# App Instance für uvicorn
app = create_app()
if __name__ == "__main__":
import uvicorn
config = get_config()
uvicorn.run(
"llm_gateway.main:app",
host=config.host,
port=config.port,
reload=config.debug,
)