fix: Restore all files lost during destructive rebase

A previous `git pull --rebase origin main` dropped 177 local commits,
losing 3400+ files across admin-v2, backend, studio-v2, website,
klausur-service, and many other services. The partial restore attempt
(660295e2) only recovered some files.

This commit restores all missing files from pre-rebase ref 98933f5e
while preserving post-rebase additions (night-scheduler, night-mode UI,
NightModeWidget dashboard integration).

Restored features include:
- AI Module Sidebar (FAB), OCR Labeling, OCR Compare
- GPU Dashboard, RAG Pipeline, Magic Help
- Klausur-Korrektur (8 files), Abitur-Archiv (5+ files)
- Companion, Zeugnisse-Crawler, Screen Flow
- Full backend, studio-v2, website, klausur-service
- All compliance SDKs, agent-core, voice-service
- CI/CD configs, documentation, scripts

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: Benjamin Admin
Date: 2026-02-09 09:51:32 +01:00
Parent: f7487ee240
Commit: 21a844cb8a
1986 changed files with 744143 additions and 1731 deletions

geo-service/.env.example (new file)

@@ -0,0 +1,81 @@
# GeoEdu Service Environment Configuration
# Copy this file to .env and adjust values as needed
# ===========================================
# Service Configuration
# ===========================================
PORT=8088
ENVIRONMENT=development
DEBUG=false
# ===========================================
# JWT Authentication
# ===========================================
JWT_SECRET=your-super-secret-jwt-key-change-in-production
JWT_ALGORITHM=HS256
JWT_EXPIRATION_HOURS=24
# ===========================================
# PostgreSQL (PostGIS)
# ===========================================
# Note: Database must have PostGIS extension enabled
DATABASE_URL=postgresql+asyncpg://breakpilot:breakpilot123@postgres:5432/breakpilot_db
# ===========================================
# MinIO Object Storage (S3-compatible)
# ===========================================
# Used for storing AOI bundles and generated assets
MINIO_ENDPOINT=minio:9000
MINIO_ACCESS_KEY=breakpilot
MINIO_SECRET_KEY=breakpilot123
MINIO_BUCKET=breakpilot-geo
MINIO_SECURE=false
# ===========================================
# Ollama LLM (for Learning Node Generation)
# ===========================================
# DSGVO-compliant local LLM for generating learning content
OLLAMA_BASE_URL=http://host.docker.internal:11434
OLLAMA_MODEL=qwen2.5:14b
OLLAMA_TIMEOUT=120
# ===========================================
# Data Directories (Docker Volumes)
# ===========================================
OSM_DATA_DIR=/app/data/osm
DEM_DATA_DIR=/app/data/dem
TILE_CACHE_DIR=/app/cache/tiles
BUNDLE_DIR=/app/bundles
# ===========================================
# Tile Server Configuration
# ===========================================
DEFAULT_PMTILES_FILE=germany.pmtiles
TILE_CACHE_MAX_SIZE_GB=50.0
# ===========================================
# DEM (Digital Elevation Model) Configuration
# ===========================================
# Copernicus DEM GLO-30 (30m resolution)
DEM_RESOLUTION=GLO-30
TERRAIN_TILE_SIZE=256
# ===========================================
# AOI (Area of Interest) Limits
# ===========================================
# DSGVO data minimization: limit area size
MAX_AOI_SIZE_KM2=4.0
MAX_AOI_PER_USER=10
AOI_RETENTION_DAYS=30
# ===========================================
# Learning Nodes Configuration
# ===========================================
MAX_NODES_PER_AOI=20
# Supported themes: topographie, landnutzung, orientierung, geologie, hydrologie, vegetation
# ===========================================
# CORS Configuration
# ===========================================
# Comma-separated list of allowed origins
CORS_ORIGINS=http://localhost:3000,http://localhost:3001,http://localhost:8088
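The `CORS_ORIGINS` value is a single comma-separated string; a minimal sketch of how it might be split into a list at startup. The actual `config.py` uses Pydantic Settings and is not part of this diff, so the parsing below is an assumption:

```python
import os

# Hypothetical parser for the comma-separated CORS_ORIGINS variable;
# the real config.py (Pydantic Settings) is not shown in this commit.
os.environ.setdefault(
    "CORS_ORIGINS",
    "http://localhost:3000,http://localhost:3001,http://localhost:8088",
)

# Split on commas and drop empty entries / stray whitespace
cors_origins = [o.strip() for o in os.environ["CORS_ORIGINS"].split(",") if o.strip()]
print(cors_origins)
```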

geo-service/Dockerfile (new file)

@@ -0,0 +1,69 @@
# GeoEdu Service - Self-Hosted OSM + Terrain
# DSGVO-compliant, no external tile services
FROM python:3.11-slim-bookworm
# Build arguments
ARG TARGETARCH
# Install system dependencies for geo-processing
RUN apt-get update && apt-get install -y --no-install-recommends \
# Build essentials
build-essential \
gcc \
g++ \
# Geo libraries
libgdal-dev \
gdal-bin \
libgeos-dev \
libproj-dev \
# Image processing
libpng-dev \
libjpeg-dev \
# Network tools
curl \
wget \
# Clean up
&& rm -rf /var/lib/apt/lists/*
# Set GDAL environment variables
ENV GDAL_CONFIG=/usr/bin/gdal-config
ENV CPLUS_INCLUDE_PATH=/usr/include/gdal
ENV C_INCLUDE_PATH=/usr/include/gdal
# Create app directory
WORKDIR /app
# Create non-root user for security
RUN groupadd -r geoservice && useradd -r -g geoservice geoservice
# Create data and cache directories
RUN mkdir -p /app/data/osm /app/data/dem /app/cache/tiles /app/bundles \
&& chown -R geoservice:geoservice /app
# Copy requirements first for better caching
COPY requirements.txt .
# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt
# Copy application code
COPY --chown=geoservice:geoservice . .
# Create __init__.py files for Python packages
RUN touch /app/api/__init__.py \
&& touch /app/services/__init__.py \
&& touch /app/models/__init__.py \
&& touch /app/utils/__init__.py
# Switch to non-root user
USER geoservice
# Expose port
EXPOSE 8088
# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=60s --retries=3 \
CMD curl -f http://localhost:8088/health || exit 1
# Start application
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8088"]

geo-service/STATUS.md (new file)

@@ -0,0 +1,196 @@
# GeoEdu Service - Implementation Status
**As of:** 2026-01-24
**Status:** Infrastructure complete, downloads pending
---
## Overview
The GeoEdu Service is a DSGVO-compliant geography learning platform with:
- Self-hosted OpenStreetMap tile server for Germany
- 3D terrain from Copernicus DEM (open data)
- Unity WebGL integration for immersive learning worlds
- Learning Nodes system for didactic content (via Ollama LLM)
**Port:** 8088
---
## Completed Work
### Backend (geo-service/)
| File | Status | Description |
|-------|--------|--------------|
| `Dockerfile` | ✅ | Python 3.11 + GDAL + geo libraries |
| `requirements.txt` | ✅ | FastAPI, asyncpg, shapely, rasterio, pmtiles |
| `config.py` | ✅ | Pydantic settings |
| `main.py` | ✅ | FastAPI app with all routers |
| `api/tiles.py` | ✅ | Vector tile endpoints |
| `api/terrain.py` | ✅ | DEM/heightmap endpoints |
| `api/aoi.py` | ✅ | AOI packaging endpoints |
| `api/learning.py` | ✅ | Learning Nodes API |
| `services/tile_server.py` | ✅ | PMTiles reader |
| `services/dem_service.py` | ✅ | Copernicus DEM handler |
| `services/aoi_packager.py` | ✅ | Unity bundle generator |
| `services/osm_extractor.py` | ✅ | OSM feature extraction |
| `services/learning_generator.py` | ✅ | Ollama LLM integration |
| `models/aoi.py` | ✅ | AOI Pydantic models |
| `models/learning_node.py` | ✅ | Learning node models |
| `models/attribution.py` | ✅ | License attribution |
| `utils/geo_utils.py` | ✅ | Coordinate transformations |
| `utils/minio_client.py` | ✅ | S3 client |
| `utils/license_checker.py` | ✅ | License validation |
| `scripts/download_osm.sh` | ✅ | Geofabrik download (NOT executed) |
| `scripts/download_dem.sh` | ✅ | Copernicus download (NOT executed) |
| `scripts/import_osm.sh` | ✅ | osm2pgsql import |
| `scripts/generate_tiles.sh` | ✅ | PMTiles generation |
| `scripts/init_postgis.sql` | ✅ | PostGIS schema + tables |
| `tests/test_tiles.py` | ✅ | API tests |
| `tests/test_aoi.py` | ✅ | AOI + geo utils tests |
| `tests/test_learning.py` | ✅ | Learning generator tests |
| `.env.example` | ✅ | Environment template |
### Frontend (studio-v2/)
| File | Status | Description |
|-------|--------|--------------|
| `package.json` | ✅ | maplibre-gl dependency added |
| `app/geo-lernwelt/page.tsx` | ✅ | Main page with tabs |
| `app/geo-lernwelt/types.ts` | ✅ | TypeScript interfaces |
| `components/geo-lernwelt/AOISelector.tsx` | ✅ | MapLibre polygon drawing |
| `components/geo-lernwelt/UnityViewer.tsx` | ✅ | Unity WebGL viewer |
| `lib/geo-lernwelt/GeoContext.tsx` | ✅ | React context |
| `lib/geo-lernwelt/mapStyles.ts` | ✅ | MapLibre styles |
| `lib/geo-lernwelt/unityBridge.ts` | ✅ | Unity communication |
### Docker Configuration
| Change | Status | Description |
|----------|--------|--------------|
| `docker-compose.yml` | ✅ | geo-service added |
| PostgreSQL image | ✅ | Changed to `postgis/postgis:16-3.4-alpine` |
| PostGIS init script | ✅ | Auto-setup at container start |
| Volumes | ✅ | geo_osm_data, geo_dem_data, geo_tile_cache, geo_aoi_bundles |
---
## Pending Work
### Downloads (do NOT start automatically!)
| Download | Size | Script | Duration (~100 Mbit/s) |
|----------|-------|--------|---------------------|
| Germany OSM PBF | 4.4 GB | `scripts/download_osm.sh` | ~6 min |
| Copernicus DEM | ~25 GB | `scripts/download_dem.sh` | ~35 min |
### Generation (after downloads)
| Process | Output | Duration |
|---------|--------|-------|
| osm2pgsql import | PostgreSQL DB | ~2-4 hours |
| PMTiles (Z0-14) | ~200-300 GB | ~12-24 hours |
| DEM tiles | ~15 GB | ~2-3 hours |
**Total storage required after setup: ~350 GB**
---
## API Endpoints
```
# Tiles
GET /api/v1/tiles/{z}/{x}/{y}.pbf → Vector Tile
GET /api/v1/tiles/style.json → MapLibre Style
GET /api/v1/tiles/metadata → Tile Metadata
GET /api/v1/tiles/bounds → Germany Bounds
# Terrain
GET /api/v1/terrain/{z}/{x}/{y}.png → Heightmap (Terrain-RGB)
GET /api/v1/terrain/hillshade/{z}/{x}/{y}.png → Hillshade
GET /api/v1/terrain/elevation?lat=&lon= → Point Elevation
GET /api/v1/terrain/metadata → DEM Metadata
# AOI (Area of Interest)
POST /api/v1/aoi → Create AOI
GET /api/v1/aoi/{id} → Get Status
GET /api/v1/aoi/{id}/manifest.json → Unity Manifest
POST /api/v1/aoi/validate → Validate Polygon
GET /api/v1/aoi/templates/mainau → Mainau Demo
# Learning Nodes
POST /api/v1/learning/generate → Generate with LLM
GET /api/v1/learning/templates → Available Themes
GET /api/v1/learning/{aoi_id}/nodes → Get Nodes
GET /api/v1/learning/statistics → Stats
```
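The AOI endpoints above are asynchronous: `POST /api/v1/aoi` returns immediately and the client polls the status URL until the bundle is ready. A client-side sketch of that loop; the helper name, the injected `fetch_status` callable, and the lowercase status strings are illustrative assumptions, not part of the service:

```python
import time

def wait_for_aoi(fetch_status, timeout_s=600, poll_s=5, sleep=time.sleep):
    """Poll an AOI status callable until it leaves queued/processing.

    fetch_status() is expected to return a dict shaped like AOIResponse,
    e.g. {"status": "completed", "download_url": "..."}.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        data = fetch_status()
        if data["status"] not in ("queued", "processing"):
            return data
        sleep(poll_s)
    raise TimeoutError(f"AOI not ready after {timeout_s}s")

# Simulated status sequence instead of real HTTP calls:
states = iter([
    {"status": "queued"},
    {"status": "processing"},
    {"status": "completed", "download_url": "/api/v1/aoi/x/bundle.zip"},
])
result = wait_for_aoi(lambda: next(states), sleep=lambda s: None)
print(result["status"])
```

In a real client, `fetch_status` would wrap an HTTP GET against `/api/v1/aoi/{id}`.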
---
## Next Steps
1. **Start the container (without downloads):**
```bash
docker compose up -d geo-service
```
2. **Test the health check:**
```bash
curl http://localhost:8088/health
```
3. **Test the frontend:**
```
http://localhost:3001/geo-lernwelt
```
4. **Start the downloads (when ready):**
```bash
# OSM download (~6 min)
docker exec breakpilot-pwa-geo-service /app/scripts/download_osm.sh
# DEM download (~35 min)
docker exec breakpilot-pwa-geo-service /app/scripts/download_dem.sh
```
5. **Tile generation (after downloads):**
```bash
# OSM import (~2-4 h)
docker exec breakpilot-pwa-geo-service /app/scripts/import_osm.sh
# PMTiles generation (~12-24 h)
docker exec breakpilot-pwa-geo-service /app/scripts/generate_tiles.sh
```
---
## License Compliance
### Approved Sources
- ✅ OpenStreetMap (ODbL)
- ✅ Copernicus DEM (Copernicus License)
- ✅ OpenAerialMap (CC-BY 4.0)
### FORBIDDEN
- ❌ Google Maps/Earth
- ❌ Bing Maps
- ❌ Apple Maps
- ❌ HERE
---
## Important Files
| File | Path |
|-------|------|
| Main plan | `/Users/benjaminadmin/.claude/plans/stateful-hugging-kahan.md` |
| Backend | `/Users/benjaminadmin/Projekte/breakpilot-pwa/geo-service/` |
| Frontend | `/Users/benjaminadmin/Projekte/breakpilot-pwa/studio-v2/app/geo-lernwelt/` |
| Docker config | `/Users/benjaminadmin/Projekte/breakpilot-pwa/docker-compose.yml` |
---
## Contact for Questions
This service was implemented according to the plan in `/Users/benjaminadmin/.claude/plans/stateful-hugging-kahan.md`.

geo-service/api/__init__.py (new file)

@@ -0,0 +1,9 @@
"""
GeoEdu Service API Endpoints
"""
from .tiles import router as tiles_router
from .terrain import router as terrain_router
from .aoi import router as aoi_router
from .learning import router as learning_router
__all__ = ["tiles_router", "terrain_router", "aoi_router", "learning_router"]

geo-service/api/aoi.py (new file)

@@ -0,0 +1,304 @@
"""
AOI (Area of Interest) API Endpoints
Handles polygon selection, validation, and Unity bundle generation
"""
from fastapi import APIRouter, HTTPException, Path, Query, BackgroundTasks
from fastapi.responses import JSONResponse, FileResponse
from pydantic import BaseModel, Field
from typing import Optional
from datetime import datetime
import uuid
import structlog
from config import settings
from models.aoi import AOIRequest, AOIResponse, AOIStatus, AOIManifest
from services.aoi_packager import AOIPackagerService
from services.osm_extractor import OSMExtractorService
logger = structlog.get_logger(__name__)
router = APIRouter()
# Initialize services
aoi_packager = AOIPackagerService()
osm_extractor = OSMExtractorService()
@router.post("", response_model=AOIResponse)
async def create_aoi(
request: AOIRequest,
background_tasks: BackgroundTasks,
):
"""
Create a new AOI (Area of Interest) for Unity 3D export.
Validates the polygon, checks size limits (max 4 km²), and queues
bundle generation. Returns immediately with a status URL.
The bundle will contain:
- Terrain heightmap
- OSM features (buildings, roads, water, etc.)
- Learning node positions
- Attribution information
"""
# Validate polygon
try:
area_km2 = aoi_packager.calculate_area_km2(request.polygon)
except ValueError as e:
raise HTTPException(status_code=400, detail=f"Invalid polygon: {str(e)}")
# Check size limit
if area_km2 > settings.max_aoi_size_km2:
raise HTTPException(
status_code=400,
detail=f"AOI too large: {area_km2:.2f} km² exceeds maximum of {settings.max_aoi_size_km2} km²",
)
# Check if polygon is within Germany bounds
if not aoi_packager.is_within_germany(request.polygon):
raise HTTPException(
status_code=400,
detail="AOI must be within Germany. Bounds: [5.87°E, 47.27°N] to [15.04°E, 55.06°N]",
)
# Generate AOI ID
aoi_id = str(uuid.uuid4())
# Create AOI record
aoi_data = {
"id": aoi_id,
"polygon": request.polygon,
"theme": request.theme,
"quality": request.quality,
"area_km2": area_km2,
"status": AOIStatus.QUEUED,
"created_at": datetime.utcnow().isoformat(),
}
# Start background processing
background_tasks.add_task(
aoi_packager.process_aoi,
aoi_id=aoi_id,
polygon=request.polygon,
theme=request.theme,
quality=request.quality,
)
logger.info(
"AOI created",
aoi_id=aoi_id,
area_km2=area_km2,
theme=request.theme,
)
return AOIResponse(
aoi_id=aoi_id,
status=AOIStatus.QUEUED,
area_km2=area_km2,
estimated_size_mb=aoi_packager.estimate_bundle_size_mb(area_km2, request.quality),
message="AOI queued for processing",
)
@router.get("/{aoi_id}", response_model=AOIResponse)
async def get_aoi_status(
aoi_id: str = Path(..., description="AOI UUID"),
):
"""
Get the status of an AOI processing job.
Returns current status (queued, processing, completed, failed)
and download URLs when ready.
"""
aoi_data = await aoi_packager.get_aoi_status(aoi_id)
if aoi_data is None:
raise HTTPException(status_code=404, detail="AOI not found")
response = AOIResponse(
aoi_id=aoi_id,
status=aoi_data["status"],
area_km2=aoi_data.get("area_km2", 0),
estimated_size_mb=aoi_data.get("estimated_size_mb", 0),
message=aoi_data.get("message", ""),
)
# Add download URLs if completed
if aoi_data["status"] == AOIStatus.COMPLETED:
response.download_url = f"/api/v1/aoi/{aoi_id}/bundle.zip"
response.manifest_url = f"/api/v1/aoi/{aoi_id}/manifest.json"
response.completed_at = aoi_data.get("completed_at")
return response
@router.get("/{aoi_id}/manifest.json")
async def get_aoi_manifest(
aoi_id: str = Path(..., description="AOI UUID"),
):
"""
Get the Unity bundle manifest for an AOI.
The manifest contains:
- Terrain configuration
- Asset list and paths
- Learning node positions
- Attribution requirements
"""
manifest = await aoi_packager.get_manifest(aoi_id)
if manifest is None:
raise HTTPException(status_code=404, detail="Manifest not found or AOI not ready")
return JSONResponse(content=manifest)
@router.get("/{aoi_id}/bundle.zip")
async def download_aoi_bundle(
aoi_id: str = Path(..., description="AOI UUID"),
):
"""
Download the complete AOI bundle as a ZIP file.
Contains all assets needed for Unity 3D rendering:
- terrain.heightmap (16-bit PNG)
- osm_features.json (buildings, roads, etc.)
- learning_nodes.json (educational content positions)
- attribution.json (required license notices)
"""
bundle_path = await aoi_packager.get_bundle_path(aoi_id)
if bundle_path is None:
raise HTTPException(status_code=404, detail="Bundle not found or AOI not ready")
return FileResponse(
path=bundle_path,
filename=f"geo-lernwelt-{aoi_id[:8]}.zip",
media_type="application/zip",
)
@router.delete("/{aoi_id}")
async def delete_aoi(
aoi_id: str = Path(..., description="AOI UUID"),
):
"""
Delete an AOI and its bundle.
Implements DSGVO data minimization - users can delete their data.
"""
success = await aoi_packager.delete_aoi(aoi_id)
if not success:
raise HTTPException(status_code=404, detail="AOI not found")
logger.info("AOI deleted", aoi_id=aoi_id)
return {"message": "AOI deleted successfully", "aoi_id": aoi_id}
@router.get("/{aoi_id}/preview")
async def get_aoi_preview(
aoi_id: str = Path(..., description="AOI UUID"),
width: int = Query(512, ge=64, le=2048, description="Preview width"),
height: int = Query(512, ge=64, le=2048, description="Preview height"),
):
"""
Get a preview image of the AOI.
Returns a rendered preview showing terrain, OSM features,
and learning node positions.
"""
preview_data = await aoi_packager.generate_preview(aoi_id, width, height)
if preview_data is None:
raise HTTPException(status_code=404, detail="Preview not available")
from fastapi.responses import Response
return Response(
content=preview_data,
media_type="image/png",
)
@router.post("/validate")
async def validate_aoi_polygon(
polygon: dict,
):
"""
Validate an AOI polygon without creating it.
Checks:
- Valid GeoJSON format
- Within Germany bounds
- Within size limits
- Not self-intersecting
"""
try:
# Validate geometry
is_valid, message = aoi_packager.validate_polygon(polygon)
if not is_valid:
return {
"valid": False,
"error": message,
}
# Calculate area
area_km2 = aoi_packager.calculate_area_km2(polygon)
# Check bounds
within_germany = aoi_packager.is_within_germany(polygon)
# Check size
within_size_limit = area_km2 <= settings.max_aoi_size_km2
return {
"valid": is_valid and within_germany and within_size_limit,
"area_km2": round(area_km2, 3),
"within_germany": within_germany,
"within_size_limit": within_size_limit,
"max_size_km2": settings.max_aoi_size_km2,
"estimated_bundle_size_mb": aoi_packager.estimate_bundle_size_mb(area_km2, "medium"),
}
except Exception as e:
return {
"valid": False,
"error": str(e),
}
@router.get("/templates/mainau")
async def get_mainau_template():
"""
Get pre-configured AOI template for Mainau island (demo location).
Mainau is a small island in Lake Constance (Bodensee) -
perfect for educational geography lessons.
"""
return {
"name": "Insel Mainau",
"description": "Blumeninsel im Bodensee - ideal fuer Erdkunde-Unterricht",
"polygon": {
"type": "Polygon",
"coordinates": [
[
[9.1875, 47.7055],
[9.1975, 47.7055],
[9.1975, 47.7115],
[9.1875, 47.7115],
[9.1875, 47.7055],
]
],
},
"center": [9.1925, 47.7085],
"area_km2": 0.45,
"suggested_themes": ["topographie", "vegetation", "landnutzung"],
"features": [
"Schloss und Schlosskirche",
"Botanischer Garten",
"Bodensee-Ufer",
"Waldgebiete",
],
}
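`AOIPackagerService.calculate_area_km2` is not included in this diff; an equirectangular approximation is enough to sanity-check the limits used above. The function below is an illustrative stand-in, not the service's implementation. Applied to the Mainau template's bounding box it yields roughly 0.50 km², in the same ballpark as the template's stated 0.45 km² (the island itself is smaller than its bounding box):

```python
import math

def polygon_area_km2(coords):
    """Approximate area of a closed [lon, lat] ring in km².

    Projects degrees to km around the ring's mean latitude and applies
    the shoelace formula - adequate for AOIs of a few km², such as the
    4 km² limit enforced by create_aoi.
    """
    lat0 = sum(lat for _, lat in coords) / len(coords)
    k = math.cos(math.radians(lat0))  # shrink longitude spans with latitude
    pts = [(lon * 111.320 * k, lat * 110.574) for lon, lat in coords]
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Bounding box from the Mainau template above
mainau = [(9.1875, 47.7055), (9.1975, 47.7055),
          (9.1975, 47.7115), (9.1875, 47.7115), (9.1875, 47.7055)]
print(round(polygon_area_km2(mainau), 2))
```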

geo-service/api/learning.py (new file)

@@ -0,0 +1,289 @@
"""
Learning Nodes API Endpoints
Generates and manages educational content for AOI regions
"""
from fastapi import APIRouter, HTTPException, Path, Query, BackgroundTasks
from fastapi.responses import JSONResponse
from pydantic import BaseModel, Field
from typing import Optional
import structlog
from config import settings
from models.learning_node import LearningNode, LearningNodeRequest, LearningTheme
from services.learning_generator import LearningGeneratorService
logger = structlog.get_logger(__name__)
router = APIRouter()
# Initialize learning generator service
learning_service = LearningGeneratorService()
class GenerateNodesRequest(BaseModel):
"""Request model for generating learning nodes."""
aoi_id: str = Field(..., description="AOI UUID to generate nodes for")
theme: LearningTheme = Field(LearningTheme.TOPOGRAPHIE, description="Learning theme")
difficulty: str = Field("mittel", pattern="^(leicht|mittel|schwer)$", description="Difficulty level")
node_count: int = Field(5, ge=1, le=20, description="Number of nodes to generate")
grade_level: Optional[str] = Field(None, description="Target grade level (e.g., '5-6', '7-8')")
language: str = Field("de", description="Content language")
class GenerateNodesResponse(BaseModel):
"""Response model for generated learning nodes."""
aoi_id: str
theme: str
nodes: list[LearningNode]
total_count: int
generation_model: str
@router.post("/generate", response_model=GenerateNodesResponse)
async def generate_learning_nodes(
request: GenerateNodesRequest,
):
"""
Generate learning nodes for an AOI using LLM.
Uses Ollama with the configured model to generate educational content
based on the AOI's geographic features and selected theme.
Themes:
- topographie: Landforms, elevation, terrain features
- landnutzung: Land use, settlement patterns, agriculture
- orientierung: Navigation, compass, map reading
- geologie: Rock types, geological formations
- hydrologie: Water features, drainage, watersheds
- vegetation: Plant communities, climate zones
"""
try:
nodes = await learning_service.generate_nodes(
aoi_id=request.aoi_id,
theme=request.theme,
difficulty=request.difficulty,
node_count=request.node_count,
grade_level=request.grade_level,
language=request.language,
)
logger.info(
"Learning nodes generated",
aoi_id=request.aoi_id,
theme=request.theme.value,
count=len(nodes),
)
return GenerateNodesResponse(
aoi_id=request.aoi_id,
theme=request.theme.value,
nodes=nodes,
total_count=len(nodes),
generation_model=settings.ollama_model,
)
except FileNotFoundError:
raise HTTPException(
status_code=404,
detail="AOI not found. Please create an AOI first.",
)
except ConnectionError:
raise HTTPException(
status_code=503,
detail="LLM service (Ollama) not available. Please check if Ollama is running.",
)
except Exception as e:
logger.error("Error generating learning nodes", error=str(e))
raise HTTPException(status_code=500, detail="Error generating learning nodes")
@router.get("/templates")
async def get_learning_templates():
"""
Get available learning theme templates.
Returns theme definitions with descriptions, suitable grade levels,
and example questions.
"""
return {
"themes": [
{
"id": "topographie",
"name": "Topographie",
"description": "Landschaftsformen, Höhen und Geländemerkmale",
"icon": "mountain",
"grade_levels": ["5-6", "7-8", "9-10"],
"example_questions": [
"Welche Höhe hat der höchste Punkt in diesem Gebiet?",
"Beschreibe die Hangneigung im nördlichen Bereich.",
"Wo befinden sich Täler und wo Bergrücken?",
],
"keywords": ["Höhe", "Hang", "Tal", "Berg", "Hügel", "Ebene"],
},
{
"id": "landnutzung",
"name": "Landnutzung",
"description": "Siedlungen, Landwirtschaft und Flächennutzung",
"icon": "home",
"grade_levels": ["5-6", "7-8", "9-10", "11-12"],
"example_questions": [
"Welche Arten von Gebäuden sind in diesem Bereich zu finden?",
"Wie viel Prozent der Fläche wird landwirtschaftlich genutzt?",
"Wo verläuft die Grenze zwischen Siedlung und Naturraum?",
],
"keywords": ["Siedlung", "Acker", "Wald", "Industrie", "Straße"],
},
{
"id": "orientierung",
"name": "Orientierung",
"description": "Kartenlesen, Kompass und Navigation",
"icon": "compass",
"grade_levels": ["5-6", "7-8"],
"example_questions": [
"In welcher Himmelsrichtung liegt der See?",
"Wie lang ist der Weg von A nach B?",
"Beschreibe den Weg vom Parkplatz zum Aussichtspunkt.",
],
"keywords": ["Norden", "Süden", "Entfernung", "Maßstab", "Legende"],
},
{
"id": "geologie",
"name": "Geologie",
"description": "Gesteinsarten und geologische Formationen",
"icon": "layers",
"grade_levels": ["7-8", "9-10", "11-12"],
"example_questions": [
"Welches Gestein dominiert in diesem Gebiet?",
"Wie sind die Felsformationen entstanden?",
"Welche Rolle spielt die Geologie für die Landschaft?",
],
"keywords": ["Gestein", "Formation", "Erosion", "Schicht", "Mineral"],
},
{
"id": "hydrologie",
"name": "Hydrologie",
"description": "Gewässer, Einzugsgebiete und Wasserkreislauf",
"icon": "droplet",
"grade_levels": ["5-6", "7-8", "9-10"],
"example_questions": [
"Wohin fließt das Wasser in diesem Gebiet?",
"Welche Gewässerarten sind vorhanden?",
"Wo könnte sich bei Starkregen Wasser sammeln?",
],
"keywords": ["Fluss", "See", "Bach", "Einzugsgebiet", "Quelle"],
},
{
"id": "vegetation",
"name": "Vegetation",
"description": "Pflanzengemeinschaften und Klimazonen",
"icon": "tree",
"grade_levels": ["5-6", "7-8", "9-10", "11-12"],
"example_questions": [
"Welche Waldtypen sind in diesem Gebiet zu finden?",
"Wie beeinflusst die Höhenlage die Vegetation?",
"Welche Pflanzen würdest du hier erwarten?",
],
"keywords": ["Wald", "Wiese", "Laubbaum", "Nadelbaum", "Höhenstufe"],
},
],
"difficulties": [
{"id": "leicht", "name": "Leicht", "description": "Grundlegende Beobachtungen"},
{"id": "mittel", "name": "Mittel", "description": "Verknüpfung von Zusammenhängen"},
{"id": "schwer", "name": "Schwer", "description": "Analyse und Transfer"},
],
"supported_languages": ["de", "en"],
}
@router.get("/{aoi_id}/nodes")
async def get_aoi_learning_nodes(
aoi_id: str = Path(..., description="AOI UUID"),
theme: Optional[LearningTheme] = Query(None, description="Filter by theme"),
):
"""
Get all learning nodes for an AOI.
Returns previously generated nodes, optionally filtered by theme.
"""
nodes = await learning_service.get_nodes_for_aoi(aoi_id, theme)
if nodes is None:
raise HTTPException(status_code=404, detail="AOI not found")
return {
"aoi_id": aoi_id,
"nodes": nodes,
"total_count": len(nodes),
}
@router.put("/{aoi_id}/nodes/{node_id}")
async def update_learning_node(
node_update: LearningNode,
aoi_id: str = Path(..., description="AOI UUID"),
node_id: str = Path(..., description="Node UUID"),
):
"""
Update a learning node (teacher review/edit).
Allows teachers to modify AI-generated content before use.
"""
success = await learning_service.update_node(aoi_id, node_id, node_update)
if not success:
raise HTTPException(status_code=404, detail="Node not found")
logger.info("Learning node updated", aoi_id=aoi_id, node_id=node_id)
return {"message": "Node updated successfully", "node_id": node_id}
@router.delete("/{aoi_id}/nodes/{node_id}")
async def delete_learning_node(
aoi_id: str = Path(..., description="AOI UUID"),
node_id: str = Path(..., description="Node UUID"),
):
"""
Delete a learning node.
"""
success = await learning_service.delete_node(aoi_id, node_id)
if not success:
raise HTTPException(status_code=404, detail="Node not found")
return {"message": "Node deleted successfully", "node_id": node_id}
@router.post("/{aoi_id}/nodes/{node_id}/approve")
async def approve_learning_node(
aoi_id: str = Path(..., description="AOI UUID"),
node_id: str = Path(..., description="Node UUID"),
):
"""
Approve a learning node for student use.
Only approved nodes will be visible to students.
"""
success = await learning_service.approve_node(aoi_id, node_id)
if not success:
raise HTTPException(status_code=404, detail="Node not found")
return {"message": "Node approved", "node_id": node_id}
@router.get("/statistics")
async def get_learning_statistics():
"""
Get statistics about learning node usage.
"""
stats = await learning_service.get_statistics()
return {
"total_nodes_generated": stats.get("total_nodes", 0),
"nodes_by_theme": stats.get("by_theme", {}),
"nodes_by_difficulty": stats.get("by_difficulty", {}),
"average_nodes_per_aoi": stats.get("avg_per_aoi", 0),
"most_popular_theme": stats.get("popular_theme", "topographie"),
}
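`LearningGeneratorService` itself is not in this diff. Ollama's HTTP API (`POST /api/generate` with `"stream": false`) is the documented way to get a single non-streamed completion, so a request for learning content might be assembled as below; the prompt wording and helper name are illustrative assumptions:

```python
import json

# Values from geo-service/.env.example
OLLAMA_BASE_URL = "http://host.docker.internal:11434"
OLLAMA_MODEL = "qwen2.5:14b"

def build_generate_request(theme, difficulty, node_count, language="de"):
    """Assemble the JSON body for Ollama's POST /api/generate endpoint."""
    prompt = (
        f"Erzeuge {node_count} Lernaufgaben zum Thema '{theme}' "
        f"(Schwierigkeit: {difficulty}, Sprache: {language}) als JSON-Liste."
    )
    return {
        "model": OLLAMA_MODEL,
        "prompt": prompt,
        "stream": False,   # one complete response instead of chunked output
        "format": "json",  # ask Ollama to constrain output to valid JSON
    }

body = build_generate_request("topographie", "mittel", 5)
print(json.dumps(body, ensure_ascii=False)[:60], "...")
```

The service would POST this body to `{OLLAMA_BASE_URL}/api/generate` and parse the `"response"` field of the reply.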

geo-service/api/terrain.py (new file)

@@ -0,0 +1,230 @@
"""
Terrain/DEM API Endpoints
Serves heightmap tiles from Copernicus DEM data
"""
from fastapi import APIRouter, HTTPException, Path, Query, Response
from fastapi.responses import JSONResponse
import structlog
from config import settings
from services.dem_service import DEMService
logger = structlog.get_logger(__name__)
router = APIRouter()
# Initialize DEM service
dem_service = DEMService()
@router.get("/{z}/{x}/{y}.png", response_class=Response)
async def get_heightmap_tile(
z: int = Path(..., ge=0, le=14, description="Zoom level"),
x: int = Path(..., ge=0, description="Tile X coordinate"),
y: int = Path(..., ge=0, description="Tile Y coordinate"),
):
"""
Get a heightmap tile as 16-bit PNG (Mapbox Terrain-RGB encoding).
Heightmaps are generated from Copernicus DEM GLO-30 (30m resolution).
The encoding allows for ~0.1m precision: height = -10000 + ((R * 256 * 256 + G * 256 + B) * 0.1)
"""
try:
tile_data = await dem_service.get_heightmap_tile(z, x, y)
if tile_data is None:
return Response(status_code=204) # No terrain data for this tile
return Response(
content=tile_data,
media_type="image/png",
headers={
"Cache-Control": "public, max-age=604800", # 7 days cache
"X-Tile-Source": "copernicus-dem",
},
)
except FileNotFoundError:
logger.warning("DEM data not found", z=z, x=x, y=y)
raise HTTPException(
status_code=503,
detail="DEM data not available. Please download Copernicus DEM first.",
)
except Exception as e:
logger.error("Error serving heightmap tile", z=z, x=x, y=y, error=str(e))
raise HTTPException(status_code=500, detail="Error serving heightmap tile")
@router.get("/hillshade/{z}/{x}/{y}.png", response_class=Response)
async def get_hillshade_tile(
z: int = Path(..., ge=0, le=14, description="Zoom level"),
x: int = Path(..., ge=0, description="Tile X coordinate"),
y: int = Path(..., ge=0, description="Tile Y coordinate"),
azimuth: float = Query(315, ge=0, le=360, description="Light azimuth in degrees"),
altitude: float = Query(45, ge=0, le=90, description="Light altitude in degrees"),
):
"""
Get a hillshade tile for terrain visualization.
Hillshade is rendered from DEM with configurable light direction.
Default light comes from northwest (azimuth=315) at 45° altitude.
"""
try:
tile_data = await dem_service.get_hillshade_tile(z, x, y, azimuth, altitude)
if tile_data is None:
return Response(status_code=204)
return Response(
content=tile_data,
media_type="image/png",
headers={
"Cache-Control": "public, max-age=604800",
"X-Hillshade-Azimuth": str(azimuth),
"X-Hillshade-Altitude": str(altitude),
},
)
except FileNotFoundError:
raise HTTPException(status_code=503, detail="DEM data not available")
except Exception as e:
logger.error("Error serving hillshade tile", z=z, x=x, y=y, error=str(e))
raise HTTPException(status_code=500, detail="Error serving hillshade tile")
@router.get("/contours/{z}/{x}/{y}.pbf", response_class=Response)
async def get_contour_tile(
z: int = Path(..., ge=0, le=14, description="Zoom level"),
x: int = Path(..., ge=0, description="Tile X coordinate"),
y: int = Path(..., ge=0, description="Tile Y coordinate"),
interval: int = Query(20, ge=5, le=100, description="Contour interval in meters"),
):
"""
Get contour lines as vector tile.
Contours are generated from DEM at the specified interval.
Useful for topographic map overlays.
"""
try:
tile_data = await dem_service.get_contour_tile(z, x, y, interval)
if tile_data is None:
return Response(status_code=204)
return Response(
content=tile_data,
media_type="application/x-protobuf",
headers={
"Content-Encoding": "gzip",
"Cache-Control": "public, max-age=604800",
"X-Contour-Interval": str(interval),
},
)
except FileNotFoundError:
raise HTTPException(status_code=503, detail="DEM data not available")
except Exception as e:
logger.error("Error serving contour tile", z=z, x=x, y=y, error=str(e))
raise HTTPException(status_code=500, detail="Error serving contour tile")
@router.get("/elevation")
async def get_elevation_at_point(
lat: float = Query(..., ge=47.27, le=55.06, description="Latitude"),
lon: float = Query(..., ge=5.87, le=15.04, description="Longitude"),
):
"""
Get elevation at a specific point.
Returns elevation in meters from Copernicus DEM.
Only works within Germany bounds.
"""
try:
elevation = await dem_service.get_elevation(lat, lon)
if elevation is None:
raise HTTPException(
status_code=404,
detail="No elevation data available for this location",
)
return {
"latitude": lat,
"longitude": lon,
"elevation_m": round(elevation, 1),
"source": "Copernicus DEM GLO-30",
"resolution_m": 30,
}
except FileNotFoundError:
raise HTTPException(status_code=503, detail="DEM data not available")
except Exception as e:
logger.error("Error getting elevation", lat=lat, lon=lon, error=str(e))
raise HTTPException(status_code=500, detail="Error getting elevation")
@router.post("/elevation/profile")
async def get_elevation_profile(
coordinates: list[list[float]],
samples: int = Query(100, ge=10, le=1000, description="Number of sample points"),
):
"""
Get elevation profile along a path.
Takes a list of [lon, lat] coordinates and returns elevations sampled along the path.
Useful for hiking/cycling route profiles.
"""
if len(coordinates) < 2:
raise HTTPException(status_code=400, detail="At least 2 coordinates required")
if len(coordinates) > 100:
raise HTTPException(status_code=400, detail="Maximum 100 coordinates allowed")
try:
profile = await dem_service.get_elevation_profile(coordinates, samples)
return {
"profile": profile,
"statistics": {
"min_elevation_m": min(p["elevation_m"] for p in profile if p["elevation_m"]),
"max_elevation_m": max(p["elevation_m"] for p in profile if p["elevation_m"]),
"total_ascent_m": sum(
max(0, profile[i + 1]["elevation_m"] - profile[i]["elevation_m"])
for i in range(len(profile) - 1)
if profile[i]["elevation_m"] and profile[i + 1]["elevation_m"]
),
"total_descent_m": sum(
max(0, profile[i]["elevation_m"] - profile[i + 1]["elevation_m"])
for i in range(len(profile) - 1)
if profile[i]["elevation_m"] and profile[i + 1]["elevation_m"]
),
},
"source": "Copernicus DEM GLO-30",
}
except FileNotFoundError:
raise HTTPException(status_code=503, detail="DEM data not available")
except Exception as e:
logger.error("Error getting elevation profile", error=str(e))
raise HTTPException(status_code=500, detail="Error getting elevation profile")
@router.get("/metadata")
async def get_dem_metadata():
"""
Get metadata about available DEM data.
"""
metadata = await dem_service.get_metadata()
return {
"name": "Copernicus DEM GLO-30",
"description": "Global 30m Digital Elevation Model",
"resolution_m": 30,
"coverage": "Germany (Deutschland)",
"bounds": [5.87, 47.27, 15.04, 55.06],
"vertical_datum": "EGM2008",
"horizontal_datum": "WGS84",
"license": "Copernicus Data (free, attribution required)",
"attribution": "© Copernicus Service Information 2024",
"data_available": metadata.get("data_available", False),
"tiles_generated": metadata.get("tiles_generated", 0),
}

221
geo-service/api/tiles.py Normal file

@@ -0,0 +1,221 @@
"""
Tile Server API Endpoints
Serves Vector Tiles from PMTiles or generates on-demand from PostGIS
"""
from fastapi import APIRouter, HTTPException, Path, Query, Response
from fastapi.responses import JSONResponse
import structlog
from config import settings
from services.tile_server import TileServerService
logger = structlog.get_logger(__name__)
router = APIRouter()
# Initialize tile server service
tile_service = TileServerService()
@router.get("/{z}/{x}/{y}.pbf", response_class=Response)
async def get_vector_tile(
z: int = Path(..., ge=0, le=22, description="Zoom level"),
x: int = Path(..., ge=0, description="Tile X coordinate"),
y: int = Path(..., ge=0, description="Tile Y coordinate"),
):
"""
Get a vector tile in Protocol Buffers format.
Returns OSM data as vector tiles suitable for MapLibre GL JS.
Tiles are served from pre-generated PMTiles or cached on-demand.
"""
try:
tile_data = await tile_service.get_tile(z, x, y)
if tile_data is None:
# Return empty tile (204 No Content is standard for empty tiles)
return Response(status_code=204)
return Response(
content=tile_data,
media_type="application/x-protobuf",
headers={
"Content-Encoding": "gzip",
"Cache-Control": "public, max-age=86400", # 24h cache
"X-Tile-Source": "pmtiles",
},
)
except FileNotFoundError:
logger.warning("PMTiles file not found", z=z, x=x, y=y)
raise HTTPException(
status_code=503,
detail="Tile data not available. Please run the data download script first.",
)
except Exception as e:
logger.error("Error serving tile", z=z, x=x, y=y, error=str(e))
raise HTTPException(status_code=500, detail="Error serving tile")
@router.get("/style.json")
async def get_maplibre_style(
base_url: str = Query(None, description="Base URL for tile server"),
):
"""
Get MapLibre GL JS style specification.
Returns a style document configured for the self-hosted tile server.
"""
# Use provided base_url or construct from settings
if base_url is None:
base_url = f"http://localhost:{settings.port}"
style = {
"version": 8,
"name": "GeoEdu Germany",
"metadata": {
"description": "Self-hosted OSM tiles for DSGVO-compliant education",
"attribution": "© OpenStreetMap contributors",
},
"sources": {
"osm": {
"type": "vector",
"tiles": [f"{base_url}/api/v1/tiles/{{z}}/{{x}}/{{y}}.pbf"],
"minzoom": 0,
"maxzoom": 14,
"attribution": "© OpenStreetMap contributors (ODbL)",
},
"terrain": {
"type": "raster-dem",
"tiles": [f"{base_url}/api/v1/terrain/{{z}}/{{x}}/{{y}}.png"],
"tileSize": 256,
"maxzoom": 14,
"attribution": "© Copernicus DEM GLO-30",
},
},
"sprite": "",
"glyphs": "https://fonts.openmaptiles.org/{fontstack}/{range}.pbf",
"layers": [
# Background
{
"id": "background",
"type": "background",
"paint": {"background-color": "#f8f4f0"},
},
# Water
{
"id": "water",
"type": "fill",
"source": "osm",
"source-layer": "water",
"paint": {"fill-color": "#a0c8f0"},
},
# Landuse - Parks
{
"id": "landuse-park",
"type": "fill",
"source": "osm",
"source-layer": "landuse",
"filter": ["==", "class", "park"],
"paint": {"fill-color": "#c8e6c8", "fill-opacity": 0.5},
},
# Landuse - Forest
{
"id": "landuse-forest",
"type": "fill",
"source": "osm",
"source-layer": "landuse",
"filter": ["==", "class", "wood"],
"paint": {"fill-color": "#94d294", "fill-opacity": 0.5},
},
# Buildings
{
"id": "building",
"type": "fill",
"source": "osm",
"source-layer": "building",
"minzoom": 13,
"paint": {"fill-color": "#d9d0c9", "fill-opacity": 0.8},
},
# Roads - Minor
{
"id": "road-minor",
"type": "line",
"source": "osm",
"source-layer": "transportation",
"filter": ["all", ["==", "$type", "LineString"], ["in", "class", "minor", "service"]],
"paint": {"line-color": "#ffffff", "line-width": 1},
},
# Roads - Major
{
"id": "road-major",
"type": "line",
"source": "osm",
"source-layer": "transportation",
"filter": ["all", ["==", "$type", "LineString"], ["in", "class", "primary", "secondary", "tertiary"]],
"paint": {"line-color": "#ffc107", "line-width": 2},
},
# Roads - Highway
{
"id": "road-highway",
"type": "line",
"source": "osm",
"source-layer": "transportation",
"filter": ["all", ["==", "$type", "LineString"], ["==", "class", "motorway"]],
"paint": {"line-color": "#ff6f00", "line-width": 3},
},
# Place labels
{
"id": "place-label",
"type": "symbol",
"source": "osm",
"source-layer": "place",
"layout": {
"text-field": "{name}",
"text-font": ["Open Sans Regular"],
"text-size": 12,
},
"paint": {"text-color": "#333333", "text-halo-color": "#ffffff", "text-halo-width": 1},
},
],
"terrain": {"source": "terrain", "exaggeration": 1.5},
}
return JSONResponse(content=style)
@router.get("/metadata")
async def get_tile_metadata():
"""
Get metadata about available tiles.
Returns information about data coverage, zoom levels, and update status.
"""
metadata = await tile_service.get_metadata()
return {
"name": "GeoEdu Germany Tiles",
"description": "Self-hosted OSM vector tiles for Germany",
"format": "pbf",
"scheme": "xyz",
"minzoom": metadata.get("minzoom", 0),
"maxzoom": metadata.get("maxzoom", 14),
"bounds": metadata.get("bounds", [5.87, 47.27, 15.04, 55.06]), # Germany bbox
"center": metadata.get("center", [10.45, 51.16, 6]), # Center of Germany
"attribution": "© OpenStreetMap contributors (ODbL)",
"data_available": metadata.get("data_available", False),
"last_updated": metadata.get("last_updated"),
}
@router.get("/bounds")
async def get_tile_bounds():
"""
Get the geographic bounds of available tile data.
Returns bounding box for Germany in [west, south, east, north] format.
"""
return {
"bounds": [5.87, 47.27, 15.04, 55.06], # Germany bounding box
"center": [10.45, 51.16],
"description": "Germany (Deutschland)",
}
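For reference, the XYZ (slippy-map) tile addressing used by these endpoints can be derived from WGS84 coordinates with the standard Web Mercator formula (a sketch, not code from the service):

```python
import math

def lonlat_to_tile(lon: float, lat: float, z: int) -> tuple[int, int]:
    """Convert WGS84 lon/lat to XYZ tile coordinates at zoom z (Web Mercator)."""
    n = 2 ** z
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# Center of Germany (10.45 E, 51.16 N) at zoom 6
print(lonlat_to_tile(10.45, 51.16, 6))  # → (33, 21)
```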

99
geo-service/config.py Normal file

@@ -0,0 +1,99 @@
"""
GeoEdu Service Configuration
Environment-based configuration with Pydantic Settings
"""
from functools import lru_cache
from typing import Optional
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
"""Application settings loaded from environment variables."""
model_config = SettingsConfigDict(
env_file=".env",
env_file_encoding="utf-8",
case_sensitive=False,
)
# Service Config
port: int = 8088
environment: str = "development"
debug: bool = False
# JWT Authentication
jwt_secret: str = "your-super-secret-jwt-key"
jwt_algorithm: str = "HS256"
jwt_expiration_hours: int = 24
# PostgreSQL (PostGIS)
database_url: str = "postgresql+asyncpg://breakpilot:breakpilot123@postgres:5432/breakpilot_db"
# MinIO (S3-compatible storage)
minio_endpoint: str = "minio:9000"
minio_access_key: str = "breakpilot"
minio_secret_key: str = "breakpilot123"
minio_bucket: str = "breakpilot-geo"
minio_secure: bool = False
# Ollama (LLM for learning nodes)
ollama_base_url: str = "http://host.docker.internal:11434"
ollama_model: str = "qwen2.5:14b"
ollama_timeout: int = 120
# Data Directories
osm_data_dir: str = "/app/data/osm"
dem_data_dir: str = "/app/data/dem"
tile_cache_dir: str = "/app/cache/tiles"
bundle_dir: str = "/app/bundles"
# Tile Server Config
default_pmtiles_file: str = "germany.pmtiles"
tile_cache_max_size_gb: float = 50.0
# DEM Config
dem_resolution: str = "GLO-30" # 30m Copernicus DEM
terrain_tile_size: int = 256
# AOI Limits (DSGVO data minimization)
max_aoi_size_km2: float = 4.0 # Max 4 km² per AOI
max_aoi_per_user: int = 10
aoi_retention_days: int = 30 # Auto-delete after 30 days
# Learning Nodes
max_nodes_per_aoi: int = 20
supported_themes: list[str] = [
"topographie",
"landnutzung",
"orientierung",
"geologie",
"hydrologie",
"vegetation",
]
# CORS (for frontend access)
cors_origins: list[str] = [
"http://localhost:3000",
"http://localhost:3001",
"http://localhost:8088",
]
@property
def pmtiles_path(self) -> str:
"""Full path to PMTiles file."""
return f"{self.osm_data_dir}/{self.default_pmtiles_file}"
@property
def is_development(self) -> bool:
"""Check if running in development mode."""
return self.environment == "development"
@lru_cache
def get_settings() -> Settings:
"""Get cached settings instance."""
return Settings()
# Export settings instance for convenience
settings = get_settings()

192
geo-service/main.py Normal file

@@ -0,0 +1,192 @@
"""
GeoEdu Service - Self-Hosted OSM + Terrain Learning Platform
DSGVO-compliant geography learning platform with self-hosted OpenStreetMap
Main FastAPI Application
"""
import structlog
from contextlib import asynccontextmanager
from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import JSONResponse
import time
from config import settings
# Configure structured logging
structlog.configure(
processors=[
structlog.stdlib.filter_by_level,
structlog.stdlib.add_logger_name,
structlog.stdlib.add_log_level,
structlog.stdlib.PositionalArgumentsFormatter(),
structlog.processors.TimeStamper(fmt="iso"),
structlog.processors.StackInfoRenderer(),
structlog.processors.format_exc_info,
structlog.processors.UnicodeDecoder(),
structlog.processors.JSONRenderer() if not settings.is_development else structlog.dev.ConsoleRenderer(),
],
wrapper_class=structlog.stdlib.BoundLogger,
context_class=dict,
logger_factory=structlog.stdlib.LoggerFactory(),
cache_logger_on_first_use=True,
)
logger = structlog.get_logger(__name__)
@asynccontextmanager
async def lifespan(app: FastAPI):
"""Application lifespan manager."""
# Startup
logger.info(
"Starting GeoEdu Service",
environment=settings.environment,
port=settings.port,
)
# Check data directories
import os
for dir_name, dir_path in [
("OSM Data", settings.osm_data_dir),
("DEM Data", settings.dem_data_dir),
("Tile Cache", settings.tile_cache_dir),
("Bundles", settings.bundle_dir),
]:
if os.path.exists(dir_path):
logger.info(f"{dir_name} directory exists", path=dir_path)
else:
logger.warning(f"{dir_name} directory missing", path=dir_path)
os.makedirs(dir_path, exist_ok=True)
yield
# Shutdown
logger.info("Shutting down GeoEdu Service")
# Create FastAPI app
app = FastAPI(
title="GeoEdu Service",
description="DSGVO-konforme Erdkunde-Lernplattform mit selbst gehostetem OpenStreetMap",
version="1.0.0",
docs_url="/docs" if settings.is_development else None,
redoc_url="/redoc" if settings.is_development else None,
lifespan=lifespan,
)
# CORS middleware
app.add_middleware(
CORSMiddleware,
allow_origins=settings.cors_origins,
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
# Request timing middleware
@app.middleware("http")
async def add_timing_header(request: Request, call_next):
"""Add X-Process-Time header to all responses."""
start_time = time.time()
response = await call_next(request)
process_time = time.time() - start_time
response.headers["X-Process-Time"] = str(process_time)
return response
# Import and register routers
from api.tiles import router as tiles_router
from api.terrain import router as terrain_router
from api.aoi import router as aoi_router
from api.learning import router as learning_router
app.include_router(tiles_router, prefix="/api/v1/tiles", tags=["Tiles"])
app.include_router(terrain_router, prefix="/api/v1/terrain", tags=["Terrain"])
app.include_router(aoi_router, prefix="/api/v1/aoi", tags=["AOI"])
app.include_router(learning_router, prefix="/api/v1/learning", tags=["Learning"])
# Health check endpoint
@app.get("/health", tags=["System"])
async def health_check():
"""
Health check endpoint for Docker/Kubernetes probes.
Returns service status and basic metrics.
"""
import os
# Check data availability
pmtiles_exists = os.path.exists(settings.pmtiles_path)
dem_exists = os.path.exists(settings.dem_data_dir) and len(os.listdir(settings.dem_data_dir)) > 0
return {
"status": "healthy",
"service": "geo-service",
"version": "1.0.0",
"environment": settings.environment,
"data_status": {
"pmtiles_available": pmtiles_exists,
"dem_available": dem_exists,
"tile_cache_dir": os.path.exists(settings.tile_cache_dir),
"bundle_dir": os.path.exists(settings.bundle_dir),
},
"config": {
"max_aoi_size_km2": settings.max_aoi_size_km2,
"supported_themes": settings.supported_themes,
},
}
# Root endpoint
@app.get("/", tags=["System"])
async def root():
"""Root endpoint with service information."""
return {
"service": "GeoEdu Service",
"description": "DSGVO-konforme Erdkunde-Lernplattform",
"version": "1.0.0",
"docs": "/docs" if settings.is_development else "disabled",
"endpoints": {
"tiles": "/api/v1/tiles",
"terrain": "/api/v1/terrain",
"aoi": "/api/v1/aoi",
"learning": "/api/v1/learning",
},
"attribution": {
"osm": "© OpenStreetMap contributors (ODbL)",
"dem": "© Copernicus Service (free, attribution required)",
},
}
# Error handlers
@app.exception_handler(404)
async def not_found_handler(request: Request, exc):
"""Handle 404 errors."""
return JSONResponse(
status_code=404,
content={"error": "Not found", "path": str(request.url.path)},
)
@app.exception_handler(500)
async def internal_error_handler(request: Request, exc):
"""Handle 500 errors."""
logger.error("Internal server error", path=str(request.url.path), error=str(exc))
return JSONResponse(
status_code=500,
content={"error": "Internal server error"},
)
if __name__ == "__main__":
import uvicorn
uvicorn.run(
"main:app",
host="0.0.0.0",
port=settings.port,
reload=settings.is_development,
)


@@ -0,0 +1,19 @@
"""
GeoEdu Service - Pydantic Models
"""
from .aoi import AOIRequest, AOIResponse, AOIStatus, AOIManifest
from .learning_node import LearningNode, LearningNodeRequest, LearningTheme, NodeType
from .attribution import Attribution, AttributionSource
__all__ = [
"AOIRequest",
"AOIResponse",
"AOIStatus",
"AOIManifest",
"LearningNode",
"LearningNodeRequest",
"LearningTheme",
"NodeType",
"Attribution",
"AttributionSource",
]

162
geo-service/models/aoi.py Normal file

@@ -0,0 +1,162 @@
"""
AOI (Area of Interest) Models
Pydantic models for AOI requests and responses
"""
from enum import Enum
from typing import Optional
from datetime import datetime
from pydantic import BaseModel, Field
class AOIStatus(str, Enum):
"""Status of an AOI processing job."""
QUEUED = "queued"
PROCESSING = "processing"
COMPLETED = "completed"
FAILED = "failed"
class AOIQuality(str, Enum):
"""Quality level for AOI bundle generation."""
LOW = "low"
MEDIUM = "medium"
HIGH = "high"
class AOITheme(str, Enum):
"""Learning theme for AOI."""
TOPOGRAPHIE = "topographie"
LANDNUTZUNG = "landnutzung"
ORIENTIERUNG = "orientierung"
GEOLOGIE = "geologie"
HYDROLOGIE = "hydrologie"
VEGETATION = "vegetation"
class GeoJSONPolygon(BaseModel):
"""GeoJSON Polygon geometry."""
type: str = Field("Polygon", const=True)
coordinates: list[list[list[float]]] = Field(
...,
description="Polygon coordinates as [[[lon, lat], ...]]",
)
class AOIRequest(BaseModel):
"""Request model for creating an AOI."""
    polygon: dict = Field(
        ...,
        description="GeoJSON Polygon geometry",
        # Pydantic v2 uses `examples` (a list) instead of the v1 `example` kwarg
        examples=[{
            "type": "Polygon",
            "coordinates": [[[9.19, 47.71], [9.20, 47.71], [9.20, 47.70], [9.19, 47.70], [9.19, 47.71]]]
        }],
    )
theme: str = Field(
"topographie",
description="Learning theme for the AOI",
)
quality: str = Field(
"medium",
pattern="^(low|medium|high)$",
description="Bundle quality level",
)
class Config:
json_schema_extra = {
"example": {
"polygon": {
"type": "Polygon",
"coordinates": [[[9.1875, 47.7055], [9.1975, 47.7055], [9.1975, 47.7115], [9.1875, 47.7115], [9.1875, 47.7055]]]
},
"theme": "topographie",
"quality": "medium",
}
}
class AOIResponse(BaseModel):
"""Response model for AOI operations."""
aoi_id: str = Field(..., description="Unique AOI identifier")
status: AOIStatus = Field(..., description="Current processing status")
area_km2: float = Field(0, description="Area in square kilometers")
estimated_size_mb: float = Field(0, description="Estimated bundle size in MB")
message: Optional[str] = Field(None, description="Status message")
download_url: Optional[str] = Field(None, description="Bundle download URL")
manifest_url: Optional[str] = Field(None, description="Manifest URL")
created_at: Optional[str] = Field(None, description="Creation timestamp")
completed_at: Optional[str] = Field(None, description="Completion timestamp")
class Config:
json_schema_extra = {
"example": {
"aoi_id": "550e8400-e29b-41d4-a716-446655440000",
"status": "completed",
"area_km2": 0.45,
"estimated_size_mb": 25.5,
"message": "AOI processing complete",
"download_url": "/api/v1/aoi/550e8400-e29b-41d4-a716-446655440000/bundle.zip",
"manifest_url": "/api/v1/aoi/550e8400-e29b-41d4-a716-446655440000/manifest.json",
}
}
class AOIBounds(BaseModel):
"""Geographic bounding box."""
west: float = Field(..., description="Western longitude")
south: float = Field(..., description="Southern latitude")
east: float = Field(..., description="Eastern longitude")
north: float = Field(..., description="Northern latitude")
class AOICenter(BaseModel):
"""Geographic center point."""
longitude: float
latitude: float
class AOIManifest(BaseModel):
"""Unity bundle manifest for an AOI."""
version: str = Field("1.0.0", description="Manifest version")
aoi_id: str = Field(..., description="AOI identifier")
created_at: str = Field(..., description="Creation timestamp")
bounds: AOIBounds = Field(..., description="Geographic bounds")
center: AOICenter = Field(..., description="Geographic center")
area_km2: float = Field(..., description="Area in km²")
theme: str = Field(..., description="Learning theme")
quality: str = Field(..., description="Quality level")
assets: dict = Field(..., description="Asset file references")
unity: dict = Field(..., description="Unity-specific configuration")
class Config:
json_schema_extra = {
"example": {
"version": "1.0.0",
"aoi_id": "550e8400-e29b-41d4-a716-446655440000",
"created_at": "2024-01-15T12:00:00Z",
"bounds": {
"west": 9.1875,
"south": 47.7055,
"east": 9.1975,
"north": 47.7115,
},
"center": {
"longitude": 9.1925,
"latitude": 47.7085,
},
"area_km2": 0.45,
"theme": "topographie",
"quality": "medium",
"assets": {
"terrain": {"file": "terrain.heightmap.png", "config": "terrain.json"},
"osm_features": {"file": "osm_features.json"},
"learning_positions": {"file": "learning_positions.json"},
"attribution": {"file": "attribution.json"},
},
"unity": {
"coordinate_system": "Unity (Y-up, left-handed)",
"scale": 1.0,
"terrain_resolution": 256,
},
}
}
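The `area_km2` values above (and the `max_aoi_size_km2` limit in the config) can be approximated for small AOIs with a flat-earth shoelace formula; a rough sketch under that assumption (`approx_area_km2` is illustrative, not the service's actual PostGIS-based computation):

```python
import math

def approx_area_km2(ring):
    """Approximate area (km²) of a closed [lon, lat] ring via the shoelace
    formula on an equirectangular projection; adequate for km-scale AOIs."""
    lat0 = math.radians(sum(lat for _, lat in ring) / len(ring))
    # 1° latitude ≈ 111.32 km; longitude distance shrinks by cos(latitude)
    pts = [(lon * 111.32 * math.cos(lat0), lat * 111.32) for lon, lat in ring]
    area2 = sum(x1 * y2 - x2 * y1 for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
    return abs(area2) / 2.0

# The example AOI ring from the request schema above
ring = [[9.1875, 47.7055], [9.1975, 47.7055], [9.1975, 47.7115],
        [9.1875, 47.7115], [9.1875, 47.7055]]
print(round(approx_area_km2(ring), 2))  # ≈ 0.5 km²
```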


@@ -0,0 +1,97 @@
"""
Attribution Models
Models for license and attribution tracking
"""
from typing import Optional
from pydantic import BaseModel, Field
class AttributionSource(BaseModel):
"""
Attribution information for a data source.
All geographic data requires proper attribution per their licenses.
"""
name: str = Field(..., description="Source name")
license: str = Field(..., description="License name")
url: str = Field(..., description="License or source URL")
attribution: str = Field(..., description="Required attribution text")
required: bool = Field(True, description="Whether attribution is legally required")
logo_url: Optional[str] = Field(None, description="Optional logo URL")
class Attribution(BaseModel):
"""
Complete attribution information for an AOI bundle.
Ensures DSGVO/GDPR compliance and proper data source attribution.
"""
sources: list[AttributionSource] = Field(
...,
description="List of data sources requiring attribution",
)
generated_at: str = Field(..., description="Timestamp when attribution was generated")
notice: str = Field(
"This data must be attributed according to the licenses above when used publicly.",
description="General attribution notice",
)
class Config:
json_schema_extra = {
"example": {
"sources": [
{
"name": "OpenStreetMap",
"license": "Open Database License (ODbL) v1.0",
"url": "https://www.openstreetmap.org/copyright",
"attribution": "© OpenStreetMap contributors",
"required": True,
},
{
"name": "Copernicus DEM",
"license": "Copernicus Data License",
"url": "https://spacedata.copernicus.eu/",
"attribution": "© Copernicus Service Information 2024",
"required": True,
},
],
"generated_at": "2024-01-15T12:00:00Z",
"notice": "This data must be attributed according to the licenses above when used publicly.",
}
}
# Predefined attribution sources
OSM_ATTRIBUTION = AttributionSource(
name="OpenStreetMap",
license="Open Database License (ODbL) v1.0",
url="https://www.openstreetmap.org/copyright",
attribution="© OpenStreetMap contributors",
required=True,
)
COPERNICUS_ATTRIBUTION = AttributionSource(
name="Copernicus DEM",
license="Copernicus Data License",
url="https://spacedata.copernicus.eu/",
attribution="© Copernicus Service Information 2024",
required=True,
)
OPENAERIAL_ATTRIBUTION = AttributionSource(
name="OpenAerialMap",
license="CC-BY 4.0",
url="https://openaerialmap.org/",
attribution="© OpenAerialMap contributors",
required=True,
)
def get_default_attribution() -> Attribution:
    """Get default attribution with standard sources."""
    from datetime import datetime, timezone
    return Attribution(
        sources=[OSM_ATTRIBUTION, COPERNICUS_ATTRIBUTION],
        # datetime.utcnow() is deprecated; use an explicit timezone-aware timestamp
        generated_at=datetime.now(timezone.utc).isoformat(),
    )


@@ -0,0 +1,120 @@
"""
Learning Node Models
Pydantic models for educational content nodes
"""
from enum import Enum
from typing import Optional
from pydantic import BaseModel, Field
class LearningTheme(str, Enum):
"""Available learning themes for geographic education."""
TOPOGRAPHIE = "topographie"
LANDNUTZUNG = "landnutzung"
ORIENTIERUNG = "orientierung"
GEOLOGIE = "geologie"
HYDROLOGIE = "hydrologie"
VEGETATION = "vegetation"
class NodeType(str, Enum):
"""Type of learning node interaction."""
QUESTION = "question" # Multiple choice or open question
OBSERVATION = "observation" # Guided observation task
EXPLORATION = "exploration" # Free exploration with hints
class Position(BaseModel):
"""Geographic position for a learning node."""
latitude: float = Field(..., ge=-90, le=90, description="Latitude in degrees")
longitude: float = Field(..., ge=-180, le=180, description="Longitude in degrees")
altitude: Optional[float] = Field(None, description="Altitude in meters")
class LearningNode(BaseModel):
"""
A learning node (station) within a geographic area.
Contains educational content tied to a specific location,
including questions, hints, and explanations.
"""
id: str = Field(..., description="Unique node identifier")
aoi_id: str = Field(..., description="Parent AOI identifier")
title: str = Field(..., min_length=1, max_length=100, description="Node title")
theme: LearningTheme = Field(..., description="Learning theme")
position: dict = Field(..., description="Geographic position")
question: str = Field(..., description="Learning question or task")
hints: list[str] = Field(default_factory=list, description="Progressive hints")
answer: str = Field(..., description="Correct answer or expected observation")
explanation: str = Field(..., description="Didactic explanation")
node_type: NodeType = Field(NodeType.QUESTION, description="Interaction type")
points: int = Field(10, ge=1, le=100, description="Points awarded for completion")
approved: bool = Field(False, description="Teacher-approved for student use")
media: Optional[list[dict]] = Field(None, description="Associated media files")
tags: Optional[list[str]] = Field(None, description="Content tags")
difficulty: Optional[str] = Field(None, description="Difficulty level")
grade_level: Optional[str] = Field(None, description="Target grade level")
class Config:
json_schema_extra = {
"example": {
"id": "node-001",
"aoi_id": "550e8400-e29b-41d4-a716-446655440000",
"title": "Höhenbestimmung",
"theme": "topographie",
"position": {"latitude": 47.7085, "longitude": 9.1925},
"question": "Schätze die Höhe dieses Punktes über dem Meeresspiegel.",
"hints": [
"Schau dir die Vegetation an.",
"Vergleiche mit dem Seespiegel des Bodensees (395m).",
],
"answer": "Ca. 430 Meter über NN",
"explanation": "Die Höhe lässt sich aus der Vegetation und der relativen Position zum Bodensee abschätzen.",
"node_type": "question",
"points": 10,
"approved": True,
}
}
class LearningNodeRequest(BaseModel):
"""Request model for creating a learning node manually."""
title: str = Field(..., min_length=1, max_length=100)
theme: LearningTheme
position: Position
question: str
hints: list[str] = Field(default_factory=list)
answer: str
explanation: str
node_type: NodeType = NodeType.QUESTION
points: int = Field(10, ge=1, le=100)
tags: Optional[list[str]] = None
difficulty: Optional[str] = Field(None, pattern="^(leicht|mittel|schwer)$")
grade_level: Optional[str] = None
class LearningNodeBatch(BaseModel):
"""Batch of learning nodes for bulk operations."""
nodes: list[LearningNode]
total_points: int = 0
def calculate_total_points(self) -> int:
"""Calculate total points from all nodes."""
self.total_points = sum(node.points for node in self.nodes)
return self.total_points
class LearningProgress(BaseModel):
"""Student progress through learning nodes."""
student_id: str
aoi_id: str
completed_nodes: list[str] = Field(default_factory=list)
total_points: int = 0
started_at: Optional[str] = None
completed_at: Optional[str] = None
@property
def completion_percentage(self) -> float:
"""Calculate completion percentage."""
# Would need total node count for accurate calculation
return len(self.completed_nodes) * 10 # Placeholder
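The placeholder above could be replaced by an exact calculation once the AOI's total node count is available; a minimal sketch (assumed signature, not existing service code):

```python
def completion_percentage(completed_nodes: list[str], total_nodes: int) -> float:
    """Share of nodes completed, in percent; guards against empty AOIs."""
    if total_nodes <= 0:
        return 0.0
    return 100.0 * len(completed_nodes) / total_nodes

print(completion_percentage(["node-001", "node-002"], 8))  # → 25.0
```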


@@ -0,0 +1,45 @@
# FastAPI Framework
fastapi==0.115.0
uvicorn[standard]==0.30.6
python-multipart==0.0.9
# Database & GIS
asyncpg==0.29.0
sqlalchemy[asyncio]==2.0.32
geoalchemy2==0.14.7
shapely==2.0.5
pyproj==3.6.1
# Geo Processing
rasterio==1.3.10
numpy==1.26.4
pillow==10.4.0
# PMTiles Support
pmtiles==3.2.0
# MinIO/S3 Client
minio==7.2.7
boto3==1.34.149
# HTTP Client (for Ollama)
httpx==0.27.0
aiohttp==3.10.4
# Validation & Settings
pydantic==2.8.2
pydantic-settings==2.4.0
python-dotenv==1.0.1
# Authentication
python-jose[cryptography]==3.3.0
passlib[bcrypt]==1.7.4
# Utilities
orjson==3.10.6
structlog==24.4.0
# Testing (httpx is already pinned above under "HTTP Client")
pytest==8.3.2
pytest-asyncio==0.23.8


@@ -0,0 +1,198 @@
#!/bin/bash
# ============================================
# Copernicus DEM Download Script for GeoEdu Service
# ============================================
#
# IMPORTANT: This script starts a download of roughly 20-40 GB!
# Run it only after explicit approval!
#
# Source:      Copernicus Data Space Ecosystem
# Data:        GLO-30 DEM (30 m resolution)
# License:     Copernicus Data (free, attribution required)
# Attribution: © Copernicus Service Information
#
# Prerequisites:
#   - Copernicus Data Space account (free)
#   - Credentials in ~/.netrc or as environment variables
#
# Usage:
#   ./download_dem.sh [--dry-run] [--bbox west,south,east,north]
#
# Example (Bavaria only):
#   ./download_dem.sh --bbox 8.97,47.27,13.84,50.56
#
# ============================================
set -e
# Configuration
DATA_DIR="${DEM_DATA_DIR:-/app/data/dem}"
COPERNICUS_URL="https://dataspace.copernicus.eu"
# Germany bounding box (default)
BBOX_WEST="${BBOX_WEST:-5.87}"
BBOX_SOUTH="${BBOX_SOUTH:-47.27}"
BBOX_EAST="${BBOX_EAST:-15.04}"
BBOX_NORTH="${BBOX_NORTH:-55.06}"
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
echo -e "${YELLOW}============================================${NC}"
echo -e "${YELLOW}GeoEdu Service - Copernicus DEM Download${NC}"
echo -e "${YELLOW}============================================${NC}"
echo ""
echo -e "Quelle: ${GREEN}Copernicus Data Space${NC}"
echo -e "Daten: ${GREEN}GLO-30 DEM (30m Aufloesung)${NC}"
echo -e "Groesse: ${YELLOW}~20-40 GB (Deutschland komplett)${NC}"
echo -e "Lizenz: ${GREEN}Copernicus Data (frei)${NC}"
echo -e "Attribution: ${GREEN}© Copernicus Service Information${NC}"
echo ""
# Parse arguments
DRY_RUN=false
while [[ $# -gt 0 ]]; do
case $1 in
--dry-run)
DRY_RUN=true
shift
;;
--bbox)
IFS=',' read -r BBOX_WEST BBOX_SOUTH BBOX_EAST BBOX_NORTH <<< "$2"
shift 2
;;
*)
echo "Unbekannte Option: $1"
exit 1
;;
esac
done
echo "Bounding Box:"
echo " West: $BBOX_WEST"
echo " Sued: $BBOX_SOUTH"
echo " Ost: $BBOX_EAST"
echo " Nord: $BBOX_NORTH"
echo ""
# Calculate required tiles
# Copernicus DEM tiles are 1°x1° cells
calc_tiles() {
local west=$(echo "$BBOX_WEST" | cut -d. -f1)
local south=$(echo "$BBOX_SOUTH" | cut -d. -f1)
local east=$(echo "$BBOX_EAST" | cut -d. -f1)
local north=$(echo "$BBOX_NORTH" | cut -d. -f1)
    # Truncation via `cut -d. -f1` assumes positive (N/E) coordinates, which holds for Germany
local count=0
for lat in $(seq $south $north); do
for lon in $(seq $west $east); do
count=$((count + 1))
done
done
echo $count
}
TILE_COUNT=$(calc_tiles)
ESTIMATED_SIZE=$((TILE_COUNT * 50)) # ~50 MB per tile
echo "Benoetigte Tiles: $TILE_COUNT"
echo "Geschaetzte Groesse: ~${ESTIMATED_SIZE} MB"
echo ""
if [ "$DRY_RUN" = true ]; then
echo -e "${YELLOW}[DRY-RUN] Kein Download wird durchgefuehrt.${NC}"
echo ""
echo "Wuerde folgende Tiles herunterladen:"
echo ""
for lat in $(seq $(echo "$BBOX_SOUTH" | cut -d. -f1) $(echo "$BBOX_NORTH" | cut -d. -f1)); do
for lon in $(seq $(echo "$BBOX_WEST" | cut -d. -f1) $(echo "$BBOX_EAST" | cut -d. -f1)); do
lat_prefix=$([ $lat -ge 0 ] && echo "N" || echo "S")
lon_prefix=$([ $lon -ge 0 ] && echo "E" || echo "W")
printf " %s%02d%s%03d.tif\n" "$lat_prefix" "${lat#-}" "$lon_prefix" "${lon#-}"
done
done
echo ""
exit 0
fi
# Confirm download
echo -e "${RED}ACHTUNG: Download startet ~${ESTIMATED_SIZE} MB Daten!${NC}"
echo ""
read -p "Download starten? (j/N) " -n 1 -r
echo ""
if [[ ! $REPLY =~ ^[JjYy]$ ]]; then
echo "Download abgebrochen."
exit 1
fi
# Create data directory
mkdir -p "$DATA_DIR"
cd "$DATA_DIR"
# Check for AWS CLI (preferred method for Copernicus)
if ! command -v aws &> /dev/null; then
echo -e "${YELLOW}HINWEIS: AWS CLI nicht installiert.${NC}"
echo "Installiere mit: pip install awscli"
echo ""
echo "Alternative: Manueller Download von:"
echo " https://dataspace.copernicus.eu/explore-data/data-collections/copernicus-contributing-missions/collections-description/COP-DEM"
exit 1
fi
# Download tiles
echo ""
echo -e "${GREEN}Starte Download...${NC}"
DOWNLOADED=0
FAILED=0
for lat in $(seq $(echo "$BBOX_SOUTH" | cut -d. -f1) $(echo "$BBOX_NORTH" | cut -d. -f1)); do
for lon in $(seq $(echo "$BBOX_WEST" | cut -d. -f1) $(echo "$BBOX_EAST" | cut -d. -f1)); do
lat_prefix=$([ $lat -ge 0 ] && echo "N" || echo "S")
lon_prefix=$([ $lon -ge 0 ] && echo "E" || echo "W")
lat_abs=${lat#-}
lon_abs=${lon#-}
filename=$(printf "%s%02d%s%03d.tif" "$lat_prefix" "$lat_abs" "$lon_prefix" "$lon_abs")
if [ -f "$filename" ]; then
echo "$filename (bereits vorhanden)"
DOWNLOADED=$((DOWNLOADED + 1))
continue
fi
echo "$filename"
# Copernicus DEM S3 bucket path
# Format: s3://copernicus-dem-30m/Copernicus_DSM_COG_10_N47_00_E008_00_DEM/
s3_path="s3://copernicus-dem-30m/Copernicus_DSM_COG_10_${lat_prefix}${lat_abs}_00_${lon_prefix}${lon_abs}_00_DEM/"
if aws s3 cp "${s3_path}Copernicus_DSM_COG_10_${lat_prefix}${lat_abs}_00_${lon_prefix}${lon_abs}_00_DEM.tif" "$filename" --no-sign-request 2>/dev/null; then
DOWNLOADED=$((DOWNLOADED + 1))
else
echo -e " ${YELLOW}(nicht verfuegbar)${NC}"
FAILED=$((FAILED + 1))
fi
done
done
echo ""
echo -e "${GREEN}============================================${NC}"
echo -e "${GREEN}Download abgeschlossen${NC}"
echo -e "${GREEN}============================================${NC}"
echo ""
echo "Heruntergeladen: $DOWNLOADED Tiles"
echo "Nicht verfuegbar: $FAILED Tiles"
echo "Speicherort: $DATA_DIR"
echo ""
echo -e "${YELLOW}Naechster Schritt:${NC}"
echo " Die Tiles werden automatisch vom geo-service geladen."
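The tile-naming arithmetic in the download loop above (local names like `N47E008.tif`, plus the `Copernicus_DSM_COG_10_...` S3 key) is easy to get wrong for negative coordinates. A minimal Python sketch of the same convention, useful for cross-checking:

```python
def dem_tile_name(lat: int, lon: int) -> str:
    """Local filename for a 1x1 degree Copernicus DEM cell, e.g. N47E008.tif."""
    lat_prefix = "N" if lat >= 0 else "S"
    lon_prefix = "E" if lon >= 0 else "W"
    return f"{lat_prefix}{abs(lat):02d}{lon_prefix}{abs(lon):03d}.tif"

def dem_s3_key(lat: int, lon: int) -> str:
    """Object key inside the public copernicus-dem-30m bucket, as used by the script."""
    lat_prefix = "N" if lat >= 0 else "S"
    lon_prefix = "E" if lon >= 0 else "W"
    stem = (f"Copernicus_DSM_COG_10_{lat_prefix}{abs(lat):02d}_00_"
            f"{lon_prefix}{abs(lon):03d}_00_DEM")
    return f"{stem}/{stem}.tif"
```

Note that `abs()` mirrors the `${lat#-}` stripping in the shell version, so southern/western cells come out as `S03W070.tif` rather than `S-3W-70.tif`.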


@@ -0,0 +1,113 @@
#!/bin/bash
# ============================================
# OSM Data Download Script for GeoEdu Service
# ============================================
#
# IMPORTANT: This script starts a download of roughly 4.4 GB!
# Run it only after explicit approval!
#
# Source: Geofabrik (official OSM mirror)
# License: ODbL (Open Database License)
# Attribution: © OpenStreetMap contributors
#
# Usage:
#   ./download_osm.sh [--dry-run]
#
# ============================================
set -e
# Configuration
DATA_DIR="${OSM_DATA_DIR:-/app/data/osm}"
GEOFABRIK_URL="https://download.geofabrik.de/europe/germany-latest.osm.pbf"
GEOFABRIK_MD5_URL="https://download.geofabrik.de/europe/germany-latest.osm.pbf.md5"
OUTPUT_FILE="germany-latest.osm.pbf"
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
echo -e "${YELLOW}============================================${NC}"
echo -e "${YELLOW}GeoEdu Service - OSM Data Download${NC}"
echo -e "${YELLOW}============================================${NC}"
echo ""
echo -e "Quelle: ${GREEN}Geofabrik${NC}"
echo -e "Datei: ${GREEN}Germany PBF${NC}"
echo -e "Groesse: ${YELLOW}~4.4 GB${NC}"
echo -e "Lizenz: ${GREEN}ODbL (Open Database License)${NC}"
echo -e "Attribution: ${GREEN}© OpenStreetMap contributors${NC}"
echo ""
# Check for dry-run mode
if [[ "$1" == "--dry-run" ]]; then
echo -e "${YELLOW}[DRY-RUN] Kein Download wird durchgefuehrt.${NC}"
echo ""
echo "Wuerde herunterladen von:"
echo " $GEOFABRIK_URL"
echo ""
echo "Nach:"
echo " $DATA_DIR/$OUTPUT_FILE"
echo ""
exit 0
fi
# Confirm download
echo -e "${RED}ACHTUNG: Download startet ~4.4 GB Daten!${NC}"
echo ""
read -p "Download starten? (j/N) " -n 1 -r
echo ""
if [[ ! $REPLY =~ ^[JjYy]$ ]]; then
echo "Download abgebrochen."
exit 1
fi
# Create data directory
mkdir -p "$DATA_DIR"
cd "$DATA_DIR"
# Download MD5 checksum first
echo ""
echo -e "${GREEN}[1/3] Lade MD5-Pruefsumme herunter...${NC}"
curl -L -o "${OUTPUT_FILE}.md5" "$GEOFABRIK_MD5_URL"
# Download OSM data
echo ""
echo -e "${GREEN}[2/3] Lade OSM-Daten herunter (~4.4 GB)...${NC}"
echo " Dies kann je nach Internetverbindung 5-60 Minuten dauern."
echo ""
# Use wget with resume support
if command -v wget &> /dev/null; then
wget -c -O "$OUTPUT_FILE" "$GEOFABRIK_URL"
else
curl -L -C - -o "$OUTPUT_FILE" "$GEOFABRIK_URL"
fi
# Verify checksum
echo ""
echo -e "${GREEN}[3/3] Verifiziere MD5-Pruefsumme...${NC}"
if md5sum -c "${OUTPUT_FILE}.md5"; then
echo ""
echo -e "${GREEN}✓ Download erfolgreich!${NC}"
echo ""
echo "Datei: $DATA_DIR/$OUTPUT_FILE"
echo "Groesse: $(du -h "$OUTPUT_FILE" | cut -f1)"
echo ""
echo -e "${YELLOW}Naechster Schritt:${NC}"
echo " ./import_osm.sh # OSM in PostGIS importieren"
echo " ./generate_tiles.sh # PMTiles generieren"
else
echo ""
echo -e "${RED}✗ MD5-Pruefsumme stimmt nicht ueberein!${NC}"
echo "Bitte Download erneut starten."
exit 1
fi
echo ""
echo -e "${YELLOW}============================================${NC}"
echo -e "${YELLOW}Download abgeschlossen${NC}"
echo -e "${YELLOW}============================================${NC}"
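The verification step ([3/3]) relies on `md5sum -c`, which is not available everywhere (macOS ships `md5` instead). An equivalent check can be sketched in Python, assuming the Geofabrik `.md5` format of `<hexdigest>  <filename>`; the helper names are illustrative:

```python
import hashlib

def md5_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so a 4.4 GB PBF never sits in RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(pbf_path: str, md5_file_path: str) -> bool:
    """Compare against a '<hexdigest>  <filename>' checksum file."""
    expected = open(md5_file_path).read().split()[0]
    return md5_of_file(pbf_path) == expected
```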


@@ -0,0 +1,184 @@
#!/bin/bash
# ============================================
# PMTiles Generation Script for GeoEdu Service
# ============================================
#
# Generates PMTiles vector tiles from PostGIS OSM data
# Uses tippecanoe for tile generation
#
# Prerequisites:
#   - OSM data imported into PostGIS
#   - tippecanoe installed
#   - ogr2ogr (GDAL) installed
#
# Usage:
#   ./generate_tiles.sh [--min-zoom 0] [--max-zoom 14]
#
# ============================================
set -e
# Configuration
DATA_DIR="${OSM_DATA_DIR:-/app/data/osm}"
OUTPUT_FILE="${DATA_DIR}/germany.pmtiles"
DATABASE_URL="${DATABASE_URL:-postgresql://breakpilot:breakpilot123@postgres:5432/breakpilot_db}"
MIN_ZOOM="${MIN_ZOOM:-0}"
MAX_ZOOM="${MAX_ZOOM:-14}"
# Parse DATABASE_URL
DB_USER=$(echo $DATABASE_URL | sed -n 's|.*://\([^:]*\):.*|\1|p')
DB_PASS=$(echo $DATABASE_URL | sed -n 's|.*://[^:]*:\([^@]*\)@.*|\1|p')
DB_HOST=$(echo $DATABASE_URL | sed -n 's|.*@\([^:]*\):.*|\1|p')
DB_PORT=$(echo $DATABASE_URL | sed -n 's|.*:\([0-9]*\)/.*|\1|p')
DB_NAME=$(echo $DATABASE_URL | sed -n 's|.*/\([^?]*\).*|\1|p')
# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m'
echo -e "${YELLOW}============================================${NC}"
echo -e "${YELLOW}GeoEdu Service - PMTiles Generation${NC}"
echo -e "${YELLOW}============================================${NC}"
echo ""
# Parse arguments
while [[ $# -gt 0 ]]; do
case $1 in
--min-zoom)
MIN_ZOOM="$2"
shift 2
;;
--max-zoom)
MAX_ZOOM="$2"
shift 2
;;
*)
echo "Unbekannte Option: $1"
exit 1
;;
esac
done
echo "Zoom Level: $MIN_ZOOM - $MAX_ZOOM"
echo "Output: $OUTPUT_FILE"
echo ""
# Estimate output size
if [ "$MAX_ZOOM" -le 14 ]; then
ESTIMATED_SIZE="200-500 GB"
elif [ "$MAX_ZOOM" -le 16 ]; then
ESTIMATED_SIZE="500-800 GB"
else
ESTIMATED_SIZE="2-4 TB"
fi
echo -e "${YELLOW}Geschaetzte Groesse: $ESTIMATED_SIZE${NC}"
echo -e "${YELLOW}Geschaetzte Dauer: 12-24 Stunden (Zoom 0-14)${NC}"
echo ""
# Check for required tools
for tool in tippecanoe ogr2ogr; do
if ! command -v $tool &> /dev/null; then
echo -e "${RED}Fehler: $tool nicht installiert!${NC}"
echo ""
if [ "$tool" == "tippecanoe" ]; then
echo "Installation:"
echo " git clone https://github.com/felt/tippecanoe.git"
echo " cd tippecanoe && make -j && sudo make install"
else
echo "Installation:"
echo " apt-get install gdal-bin # Debian/Ubuntu"
echo " brew install gdal # macOS"
fi
exit 1
fi
done
echo "tippecanoe: $(tippecanoe --version 2>&1 | head -1)"
echo "ogr2ogr: $(ogr2ogr --version | head -1)"
echo ""
read -p "Tile-Generierung starten? (j/N) " -n 1 -r
echo ""
if [[ ! $REPLY =~ ^[JjYy]$ ]]; then
echo "Abgebrochen."
exit 1
fi
# Create temporary directory for GeoJSON
TEMP_DIR=$(mktemp -d)
trap "rm -rf $TEMP_DIR" EXIT
echo ""
echo -e "${GREEN}[1/5] Exportiere Landuse/Landcover...${NC}"
PGPASSWORD="$DB_PASS" ogr2ogr -f GeoJSONSeq \
"$TEMP_DIR/landuse.geojsonseq" \
"PG:host=$DB_HOST port=$DB_PORT user=$DB_USER dbname=$DB_NAME password=$DB_PASS" \
-sql "SELECT way AS geometry, landuse, \"natural\", name FROM planet_osm_polygon WHERE landuse IS NOT NULL OR \"natural\" IS NOT NULL"
echo -e "${GREEN}[2/5] Exportiere Gebaude...${NC}"
PGPASSWORD="$DB_PASS" ogr2ogr -f GeoJSONSeq \
"$TEMP_DIR/building.geojsonseq" \
"PG:host=$DB_HOST port=$DB_PORT user=$DB_USER dbname=$DB_NAME password=$DB_PASS" \
-sql "SELECT way AS geometry, building, name, \"addr:housenumber\" AS addr_housenumber FROM planet_osm_polygon WHERE building IS NOT NULL"
echo -e "${GREEN}[3/5] Exportiere Strassen...${NC}"
PGPASSWORD="$DB_PASS" ogr2ogr -f GeoJSONSeq \
"$TEMP_DIR/transportation.geojsonseq" \
"PG:host=$DB_HOST port=$DB_PORT user=$DB_USER dbname=$DB_NAME password=$DB_PASS" \
-sql "SELECT way AS geometry, highway, railway, name, ref FROM planet_osm_line WHERE highway IS NOT NULL OR railway IS NOT NULL"
echo -e "${GREEN}[4/5] Exportiere Gewaesser...${NC}"
PGPASSWORD="$DB_PASS" ogr2ogr -f GeoJSONSeq \
"$TEMP_DIR/water.geojsonseq" \
"PG:host=$DB_HOST port=$DB_PORT user=$DB_USER dbname=$DB_NAME password=$DB_PASS" \
-sql "SELECT way AS geometry, waterway, water, name FROM planet_osm_polygon WHERE water IS NOT NULL OR waterway IS NOT NULL
UNION ALL
SELECT way AS geometry, waterway, NULL as water, name FROM planet_osm_line WHERE waterway IS NOT NULL"
echo -e "${GREEN}[5/5] Exportiere Orte (POIs)...${NC}"
PGPASSWORD="$DB_PASS" ogr2ogr -f GeoJSONSeq \
"$TEMP_DIR/place.geojsonseq" \
"PG:host=$DB_HOST port=$DB_PORT user=$DB_USER dbname=$DB_NAME password=$DB_PASS" \
-sql "SELECT way AS geometry, place, name, population FROM planet_osm_point WHERE place IS NOT NULL"
echo ""
echo -e "${GREEN}Generiere PMTiles...${NC}"
echo "Dies kann mehrere Stunden dauern!"
echo ""
# Run tippecanoe
tippecanoe \
--output="$OUTPUT_FILE" \
--force \
--name="GeoEdu Germany" \
--description="Self-hosted OSM tiles for DSGVO-compliant education" \
--attribution="© OpenStreetMap contributors" \
--minimum-zoom="$MIN_ZOOM" \
--maximum-zoom="$MAX_ZOOM" \
--drop-densest-as-needed \
--extend-zooms-if-still-dropping \
--named-layer=landuse:"$TEMP_DIR/landuse.geojsonseq" \
--named-layer=building:"$TEMP_DIR/building.geojsonseq" \
--named-layer=transportation:"$TEMP_DIR/transportation.geojsonseq" \
--named-layer=water:"$TEMP_DIR/water.geojsonseq" \
--named-layer=place:"$TEMP_DIR/place.geojsonseq"
echo ""
echo -e "${GREEN}============================================${NC}"
echo -e "${GREEN}PMTiles Generierung abgeschlossen!${NC}"
echo -e "${GREEN}============================================${NC}"
echo ""
echo "Ausgabe: $OUTPUT_FILE"
echo "Groesse: $(du -h "$OUTPUT_FILE" | cut -f1)"
echo ""
echo "Die Tiles sind jetzt bereit fuer den geo-service."
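The size and duration estimates above grow sharply with `--max-zoom` because the tile pyramid quadruples at every level. A quick sketch of the theoretical tile counts (a worldwide upper bound; a Germany extract covers only a small fraction of each level, and tippecanoe drops features as needed):

```python
def tiles_up_to(max_zoom: int) -> int:
    """Total tiles in a full pyramid for zoom levels 0..max_zoom.
    Level z has 4**z tiles, so the sum is a geometric series."""
    return sum(4 ** z for z in range(max_zoom + 1))
```

Going from zoom 14 to zoom 16 multiplies the leaf-level tile count by 16, which is why the estimates jump so steeply.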

geo-service/scripts/import_osm.sh Executable file

@@ -0,0 +1,236 @@
#!/bin/bash
# ============================================
# OSM PostGIS Import Script for GeoEdu Service
# ============================================
#
# Imports OSM PBF data into PostgreSQL/PostGIS
# Uses osm2pgsql for the import
#
# Prerequisites:
#   - PostgreSQL with the PostGIS extension
#   - osm2pgsql installed
#   - OSM PBF file downloaded
#
# Usage:
#   ./import_osm.sh [--slim] [--drop]
#
# ============================================
set -e
# Configuration
DATA_DIR="${OSM_DATA_DIR:-/app/data/osm}"
PBF_FILE="${DATA_DIR}/germany-latest.osm.pbf"
DATABASE_URL="${DATABASE_URL:-postgresql://breakpilot:breakpilot123@postgres:5432/breakpilot_db}"
# Parse DATABASE_URL
DB_USER=$(echo $DATABASE_URL | sed -n 's|.*://\([^:]*\):.*|\1|p')
DB_PASS=$(echo $DATABASE_URL | sed -n 's|.*://[^:]*:\([^@]*\)@.*|\1|p')
DB_HOST=$(echo $DATABASE_URL | sed -n 's|.*@\([^:]*\):.*|\1|p')
DB_PORT=$(echo $DATABASE_URL | sed -n 's|.*:\([0-9]*\)/.*|\1|p')
DB_NAME=$(echo $DATABASE_URL | sed -n 's|.*/\([^?]*\).*|\1|p')
# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m'
echo -e "${YELLOW}============================================${NC}"
echo -e "${YELLOW}GeoEdu Service - OSM PostGIS Import${NC}"
echo -e "${YELLOW}============================================${NC}"
echo ""
# Check if PBF file exists
if [ ! -f "$PBF_FILE" ]; then
echo -e "${RED}Fehler: OSM PBF Datei nicht gefunden!${NC}"
echo "Erwartet: $PBF_FILE"
echo ""
echo "Bitte zuerst download_osm.sh ausfuehren."
exit 1
fi
echo "PBF Datei: $PBF_FILE"
echo "Groesse: $(du -h "$PBF_FILE" | cut -f1)"
echo ""
echo "Datenbank: $DB_NAME @ $DB_HOST:$DB_PORT"
echo ""
# Check for osm2pgsql
if ! command -v osm2pgsql &> /dev/null; then
echo -e "${RED}Fehler: osm2pgsql nicht installiert!${NC}"
echo ""
echo "Installation:"
echo " apt-get install osm2pgsql # Debian/Ubuntu"
echo " brew install osm2pgsql # macOS"
exit 1
fi
OSM2PGSQL_VERSION=$(osm2pgsql --version 2>&1 | head -1)
echo "osm2pgsql Version: $OSM2PGSQL_VERSION"
echo ""
# Parse arguments
SLIM_MODE=""
DROP_MODE=""
while [[ $# -gt 0 ]]; do
case $1 in
--slim)
SLIM_MODE="--slim"
echo "Modus: Slim (fuer Updates)"
shift
;;
--drop)
DROP_MODE="--drop"
echo "Modus: Drop (bestehende Tabellen loeschen)"
shift
;;
*)
echo "Unbekannte Option: $1"
exit 1
;;
esac
done
# Estimate time
PBF_SIZE=$(stat -f%z "$PBF_FILE" 2>/dev/null || stat -c%s "$PBF_FILE")
ESTIMATED_HOURS=$((PBF_SIZE / 1000000000 / 2)) # Rough estimate: ~2GB per hour
echo ""
echo -e "${YELLOW}ACHTUNG: Import kann 2-4 Stunden dauern!${NC}"
echo "Geschaetzte Dauer: ~${ESTIMATED_HOURS}-$((ESTIMATED_HOURS * 2)) Stunden"
echo ""
read -p "Import starten? (j/N) " -n 1 -r
echo ""
if [[ ! $REPLY =~ ^[JjYy]$ ]]; then
echo "Import abgebrochen."
exit 1
fi
# Create style file
STYLE_FILE="${DATA_DIR}/osm2pgsql.style"
cat > "$STYLE_FILE" << 'EOF'
# osm2pgsql style file for GeoEdu Service
# Optimized for educational geography content
# Common tags
node,way access text linear
node,way addr:housename text linear
node,way addr:housenumber text linear
node,way addr:interpolation text linear
node,way admin_level text linear
node,way aerialway text linear
node,way amenity text polygon
node,way area text linear
node,way barrier text linear
node,way bicycle text linear
node,way boundary text linear
node,way bridge text linear
node,way building text polygon
node,way construction text linear
node,way covered text linear
node,way foot text linear
node,way highway text linear
node,way historic text polygon
node,way junction text linear
node,way landuse text polygon
node,way layer text linear
node,way leisure text polygon
node,way man_made text polygon
node,way military text polygon
node,way name text linear
node,way natural text polygon
node,way oneway text linear
node,way place text polygon
node,way power text polygon
node,way railway text linear
node,way ref text linear
node,way religion text linear
node,way route text linear
node,way service text linear
node,way shop text polygon
node,way sport text polygon
node,way surface text linear
node,way tourism text polygon
node,way tracktype text linear
node,way tunnel text linear
node,way water text polygon
node,way waterway text linear
node,way wetland text polygon
node,way wood text polygon
node,way z_order int4 linear
# Elevation data from OSM
node,way ele text linear
# Name translations
node,way name:de text linear
node,way name:en text linear
# Population for place rendering
node,way population text linear
# Way area for polygon ordering
way way_area real linear
EOF
echo ""
echo -e "${GREEN}[1/3] PostGIS Extension aktivieren...${NC}"
PGPASSWORD="$DB_PASS" psql -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" -c "CREATE EXTENSION IF NOT EXISTS postgis;"
PGPASSWORD="$DB_PASS" psql -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" -c "CREATE EXTENSION IF NOT EXISTS hstore;"
echo ""
echo -e "${GREEN}[2/3] OSM Daten importieren...${NC}"
echo " Dies dauert mehrere Stunden!"
echo ""
# Run osm2pgsql
PGPASSWORD="$DB_PASS" osm2pgsql \
-H "$DB_HOST" \
-P "$DB_PORT" \
-U "$DB_USER" \
-d "$DB_NAME" \
-S "$STYLE_FILE" \
--cache 4000 \
--number-processes 4 \
$SLIM_MODE \
$DROP_MODE \
"$PBF_FILE"
echo ""
echo -e "${GREEN}[3/3] Indizes erstellen...${NC}"
PGPASSWORD="$DB_PASS" psql -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" << 'EOSQL'
-- Spatial indexes (if not created by osm2pgsql)
CREATE INDEX IF NOT EXISTS planet_osm_point_way_idx ON planet_osm_point USING GIST (way);
CREATE INDEX IF NOT EXISTS planet_osm_line_way_idx ON planet_osm_line USING GIST (way);
CREATE INDEX IF NOT EXISTS planet_osm_polygon_way_idx ON planet_osm_polygon USING GIST (way);
CREATE INDEX IF NOT EXISTS planet_osm_roads_way_idx ON planet_osm_roads USING GIST (way);
-- Name indexes for search
CREATE INDEX IF NOT EXISTS planet_osm_point_name_idx ON planet_osm_point (name);
CREATE INDEX IF NOT EXISTS planet_osm_polygon_name_idx ON planet_osm_polygon (name);
-- Vacuum analyze for query optimization
VACUUM ANALYZE planet_osm_point;
VACUUM ANALYZE planet_osm_line;
VACUUM ANALYZE planet_osm_polygon;
VACUUM ANALYZE planet_osm_roads;
EOSQL
echo ""
echo -e "${GREEN}============================================${NC}"
echo -e "${GREEN}Import abgeschlossen!${NC}"
echo -e "${GREEN}============================================${NC}"
echo ""
echo "Tabellen erstellt:"
echo " - planet_osm_point"
echo " - planet_osm_line"
echo " - planet_osm_polygon"
echo " - planet_osm_roads"
echo ""
echo -e "${YELLOW}Naechster Schritt:${NC}"
echo " ./generate_tiles.sh # PMTiles generieren"
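The sed pipeline that splits `DATABASE_URL` in this script (and in `generate_tiles.sh`) can be cross-checked against Python's stdlib URL parser; a sketch assuming the same `postgresql://user:pass@host:port/db` shape:

```python
from urllib.parse import urlparse

def parse_db_url(url: str) -> dict:
    """Split a postgresql:// URL into the fields the shell scripts extract."""
    u = urlparse(url)
    return {
        "user": u.username,
        "password": u.password,
        "host": u.hostname,
        "port": u.port,
        "dbname": u.path.lstrip("/"),
    }
```

Unlike the sed approach, `urlparse` also tolerates query strings (e.g. `?sslmode=require`) without extra patterns.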


@@ -0,0 +1,16 @@
"""
GeoEdu Service - Business Logic Services
"""
from .tile_server import TileServerService
from .dem_service import DEMService
from .aoi_packager import AOIPackagerService
from .osm_extractor import OSMExtractorService
from .learning_generator import LearningGeneratorService
__all__ = [
"TileServerService",
"DEMService",
"AOIPackagerService",
"OSMExtractorService",
"LearningGeneratorService",
]


@@ -0,0 +1,420 @@
"""
AOI Packager Service
Creates Unity-compatible bundles from geographic areas
"""
import os
import json
import zipfile
import uuid
from typing import Optional, Tuple
from datetime import datetime
import math
import structlog
from shapely.geometry import shape, Polygon, mapping
from shapely.ops import transform
import pyproj
from config import settings
logger = structlog.get_logger(__name__)
# Germany bounding box
GERMANY_BOUNDS = Polygon([
(5.87, 47.27),
(15.04, 47.27),
(15.04, 55.06),
(5.87, 55.06),
(5.87, 47.27),
])
# AOI status storage (in production, use database)
_aoi_storage = {}
class AOIPackagerService:
"""
Service for packaging geographic areas for Unity 3D rendering.
Creates bundles containing:
- Terrain heightmap
- OSM features (buildings, roads, water, etc.)
- Learning node positions
- Attribution information
"""
def __init__(self):
self.bundle_dir = settings.bundle_dir
self.max_area_km2 = settings.max_aoi_size_km2
def calculate_area_km2(self, geojson: dict) -> float:
"""
Calculate the area of a GeoJSON polygon in square kilometers.
Uses an equal-area projection for accurate measurement.
"""
try:
geom = shape(geojson)
# Transform to equal-area projection (EPSG:3035 for Europe)
project = pyproj.Transformer.from_crs(
"EPSG:4326", # WGS84
"EPSG:3035", # ETRS89-LAEA
always_xy=True,
).transform
geom_projected = transform(project, geom)
area_m2 = geom_projected.area
area_km2 = area_m2 / 1_000_000
return area_km2
except Exception as e:
logger.error("Error calculating area", error=str(e))
raise ValueError(f"Invalid polygon geometry: {str(e)}")
def is_within_germany(self, geojson: dict) -> bool:
"""Check if a polygon is within Germany's bounds."""
try:
geom = shape(geojson)
return GERMANY_BOUNDS.contains(geom)
except Exception:
return False
def validate_polygon(self, geojson: dict) -> Tuple[bool, str]:
"""
Validate a GeoJSON polygon.
Checks:
- Valid GeoJSON format
- Valid polygon geometry
- Not self-intersecting
"""
try:
# Check type
if geojson.get("type") != "Polygon":
return False, "Geometry must be a Polygon"
# Check coordinates
coords = geojson.get("coordinates")
if not coords or not isinstance(coords, list):
return False, "Missing or invalid coordinates"
# Parse geometry
geom = shape(geojson)
# Check validity
if not geom.is_valid:
return False, "Invalid polygon geometry (possibly self-intersecting)"
# Check ring closure
outer_ring = coords[0]
if outer_ring[0] != outer_ring[-1]:
return False, "Polygon ring must be closed"
return True, "Valid"
except Exception as e:
return False, f"Error validating polygon: {str(e)}"
def estimate_bundle_size_mb(self, area_km2: float, quality: str) -> float:
"""Estimate the bundle size based on area and quality."""
# Base size per km² in MB
base_sizes = {
"low": 10,
"medium": 25,
"high": 50,
}
base = base_sizes.get(quality, 25)
return round(area_km2 * base, 1)
async def process_aoi(
self,
aoi_id: str,
polygon: dict,
theme: str,
quality: str,
):
"""
Process an AOI and create the Unity bundle.
This runs as a background task.
"""
logger.info("Processing AOI", aoi_id=aoi_id, theme=theme, quality=quality)
# Update status
_aoi_storage[aoi_id] = {
"status": "processing",
"polygon": polygon,
"theme": theme,
"quality": quality,
"area_km2": self.calculate_area_km2(polygon),
"created_at": datetime.utcnow().isoformat(),
}
try:
# Create bundle directory
bundle_path = os.path.join(self.bundle_dir, aoi_id)
os.makedirs(bundle_path, exist_ok=True)
# Generate terrain heightmap
await self._generate_terrain(aoi_id, polygon, quality)
# Extract OSM features
await self._extract_osm_features(aoi_id, polygon)
# Generate learning node positions
await self._generate_learning_positions(aoi_id, polygon, theme)
# Create attribution file
await self._create_attribution(aoi_id)
# Create manifest
await self._create_manifest(aoi_id, polygon, theme, quality)
# Create ZIP bundle
await self._create_zip_bundle(aoi_id)
# Update status
_aoi_storage[aoi_id]["status"] = "completed"
_aoi_storage[aoi_id]["completed_at"] = datetime.utcnow().isoformat()
logger.info("AOI processing complete", aoi_id=aoi_id)
except Exception as e:
logger.error("AOI processing failed", aoi_id=aoi_id, error=str(e))
_aoi_storage[aoi_id]["status"] = "failed"
_aoi_storage[aoi_id]["error"] = str(e)
async def _generate_terrain(self, aoi_id: str, polygon: dict, quality: str):
"""Generate terrain heightmap for the AOI."""
from services.dem_service import DEMService
dem_service = DEMService()
bundle_path = os.path.join(self.bundle_dir, aoi_id)
# Get bounding box of polygon
geom = shape(polygon)
bounds = geom.bounds # (minx, miny, maxx, maxy)
# Determine resolution based on quality
resolutions = {"low": 64, "medium": 256, "high": 512}
resolution = resolutions.get(quality, 256)
# Generate heightmap for bounds
# For simplicity, only the bounds metadata is written here; a full
# implementation would render terrain.heightmap.png via DEMService
# Save bounds info
terrain_info = {
"bounds": {
"west": bounds[0],
"south": bounds[1],
"east": bounds[2],
"north": bounds[3],
},
"resolution": resolution,
"heightmap_file": "terrain.heightmap.png",
"encoding": "terrain-rgb",
}
with open(os.path.join(bundle_path, "terrain.json"), "w") as f:
json.dump(terrain_info, f, indent=2)
logger.debug("Terrain generated", aoi_id=aoi_id, resolution=resolution)
async def _extract_osm_features(self, aoi_id: str, polygon: dict):
"""Extract OSM features within the AOI."""
from services.osm_extractor import OSMExtractorService
extractor = OSMExtractorService()
bundle_path = os.path.join(self.bundle_dir, aoi_id)
# Extract features
features = await extractor.extract_features(polygon)
# Save to file
features_path = os.path.join(bundle_path, "osm_features.json")
with open(features_path, "w") as f:
json.dump(features, f, indent=2)
logger.debug("OSM features extracted", aoi_id=aoi_id, count=len(features.get("features", [])))
async def _generate_learning_positions(self, aoi_id: str, polygon: dict, theme: str):
"""Generate suggested positions for learning nodes."""
geom = shape(polygon)
bounds = geom.bounds
centroid = geom.centroid
# Generate positions based on theme
# For now, create a grid of potential positions
positions = []
# Create a 3x3 grid of positions
for i in range(3):
for j in range(3):
lon = bounds[0] + (bounds[2] - bounds[0]) * (i + 0.5) / 3
lat = bounds[1] + (bounds[3] - bounds[1]) * (j + 0.5) / 3
# Check if point is within polygon
from shapely.geometry import Point
if geom.contains(Point(lon, lat)):
positions.append({
"id": str(uuid.uuid4()),
"position": {"longitude": lon, "latitude": lat},
"suggested_theme": theme,
"status": "pending",
})
bundle_path = os.path.join(self.bundle_dir, aoi_id)
positions_path = os.path.join(bundle_path, "learning_positions.json")
with open(positions_path, "w") as f:
json.dump({"positions": positions}, f, indent=2)
logger.debug("Learning positions generated", aoi_id=aoi_id, count=len(positions))
async def _create_attribution(self, aoi_id: str):
"""Create attribution file with required license notices."""
attribution = {
"sources": [
{
"name": "OpenStreetMap",
"license": "Open Database License (ODbL) v1.0",
"url": "https://www.openstreetmap.org/copyright",
"attribution": "© OpenStreetMap contributors",
"required": True,
},
{
"name": "Copernicus DEM",
"license": "Copernicus Data License",
"url": "https://spacedata.copernicus.eu/",
"attribution": "© Copernicus Service Information 2024",
"required": True,
},
],
"generated_at": datetime.utcnow().isoformat(),
"notice": "This data must be attributed according to the licenses above when used publicly.",
}
bundle_path = os.path.join(self.bundle_dir, aoi_id)
attribution_path = os.path.join(bundle_path, "attribution.json")
with open(attribution_path, "w") as f:
json.dump(attribution, f, indent=2)
async def _create_manifest(self, aoi_id: str, polygon: dict, theme: str, quality: str):
"""Create Unity bundle manifest."""
geom = shape(polygon)
bounds = geom.bounds
centroid = geom.centroid
manifest = {
"version": "1.0.0",
"aoi_id": aoi_id,
"created_at": datetime.utcnow().isoformat(),
"bounds": {
"west": bounds[0],
"south": bounds[1],
"east": bounds[2],
"north": bounds[3],
},
"center": {
"longitude": centroid.x,
"latitude": centroid.y,
},
"area_km2": self.calculate_area_km2(polygon),
"theme": theme,
"quality": quality,
"assets": {
"terrain": {
"file": "terrain.heightmap.png",
"config": "terrain.json",
},
"osm_features": {
"file": "osm_features.json",
},
"learning_positions": {
"file": "learning_positions.json",
},
"attribution": {
"file": "attribution.json",
},
},
"unity": {
"coordinate_system": "Unity (Y-up, left-handed)",
"scale": 1.0, # 1 Unity unit = 1 meter
"terrain_resolution": {"low": 64, "medium": 256, "high": 512}[quality],
},
}
bundle_path = os.path.join(self.bundle_dir, aoi_id)
manifest_path = os.path.join(bundle_path, "manifest.json")
with open(manifest_path, "w") as f:
json.dump(manifest, f, indent=2)
async def _create_zip_bundle(self, aoi_id: str):
"""Create ZIP archive of all bundle files."""
bundle_path = os.path.join(self.bundle_dir, aoi_id)
zip_path = os.path.join(bundle_path, "bundle.zip")
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
for filename in os.listdir(bundle_path):
if filename != "bundle.zip":
filepath = os.path.join(bundle_path, filename)
zf.write(filepath, filename)
logger.debug("Bundle ZIP created", aoi_id=aoi_id, path=zip_path)
async def get_aoi_status(self, aoi_id: str) -> Optional[dict]:
"""Get the status of an AOI."""
return _aoi_storage.get(aoi_id)
async def get_manifest(self, aoi_id: str) -> Optional[dict]:
"""Get the manifest for a completed AOI."""
aoi_data = _aoi_storage.get(aoi_id)
if aoi_data is None or aoi_data.get("status") != "completed":
return None
manifest_path = os.path.join(self.bundle_dir, aoi_id, "manifest.json")
if not os.path.exists(manifest_path):
return None
with open(manifest_path) as f:
return json.load(f)
async def get_bundle_path(self, aoi_id: str) -> Optional[str]:
"""Get the path to a completed bundle ZIP."""
aoi_data = _aoi_storage.get(aoi_id)
if aoi_data is None or aoi_data.get("status") != "completed":
return None
zip_path = os.path.join(self.bundle_dir, aoi_id, "bundle.zip")
if not os.path.exists(zip_path):
return None
return zip_path
async def delete_aoi(self, aoi_id: str) -> bool:
"""Delete an AOI and its files."""
if aoi_id not in _aoi_storage:
return False
import shutil
bundle_path = os.path.join(self.bundle_dir, aoi_id)
if os.path.exists(bundle_path):
shutil.rmtree(bundle_path)
del _aoi_storage[aoi_id]
return True
async def generate_preview(self, aoi_id: str, width: int, height: int) -> Optional[bytes]:
"""Generate a preview image of the AOI (stub)."""
# Would generate a preview combining terrain and OSM features
return None
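`_generate_learning_positions` above samples the centers of a 3x3 grid over the bounding box before filtering by polygon containment. The coordinate math in isolation (generalized with an `n` parameter for illustration, without the shapely containment check):

```python
def grid_centers(west, south, east, north, n=3):
    """Center coordinates of an n x n grid over a bounding box, matching
    lon = west + (east - west) * (i + 0.5) / n as used by the packager."""
    return [
        (west + (east - west) * (i + 0.5) / n,
         south + (north - south) * (j + 0.5) / n)
        for i in range(n) for j in range(n)
    ]
```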


@@ -0,0 +1,338 @@
"""
DEM (Digital Elevation Model) Service
Serves terrain data from Copernicus DEM GLO-30
"""
import os
import math
from typing import Optional, Tuple
import struct
import structlog
import numpy as np
from PIL import Image
from io import BytesIO
from config import settings
logger = structlog.get_logger(__name__)
# Germany bounding box
GERMANY_BOUNDS = {
"west": 5.87,
"south": 47.27,
"east": 15.04,
"north": 55.06,
}
def lat_lon_to_tile(lat: float, lon: float, zoom: int) -> Tuple[int, int]:
"""Convert latitude/longitude to tile coordinates."""
n = 2 ** zoom
x = int((lon + 180.0) / 360.0 * n)
lat_rad = math.radians(lat)
y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
return x, y
def tile_to_bounds(z: int, x: int, y: int) -> Tuple[float, float, float, float]:
"""Convert tile coordinates to bounding box (west, south, east, north)."""
n = 2 ** z
west = x / n * 360.0 - 180.0
east = (x + 1) / n * 360.0 - 180.0
north_rad = math.atan(math.sinh(math.pi * (1 - 2 * y / n)))
south_rad = math.atan(math.sinh(math.pi * (1 - 2 * (y + 1) / n)))
north = math.degrees(north_rad)
south = math.degrees(south_rad)
return west, south, east, north
class DEMService:
"""
Service for handling Digital Elevation Model data.
Uses Copernicus DEM GLO-30 (30m resolution) as the data source.
Generates terrain tiles in Mapbox Terrain-RGB format for MapLibre/Unity.
"""
def __init__(self):
self.dem_dir = settings.dem_data_dir
self.tile_size = settings.terrain_tile_size
self._dem_cache = {} # Cache loaded DEM files
def _get_dem_file_path(self, lat: int, lon: int) -> str:
"""
Get the path to a locally stored DEM file for the given coordinates.
The download script saves Copernicus tiles under short names like N47E008.tif
(original Copernicus name: Copernicus_DSM_COG_10_N47_00_E008_00_DEM.tif).
"""
lat_prefix = "N" if lat >= 0 else "S"
lon_prefix = "E" if lon >= 0 else "W"
# Format: N47E008
filename = f"{lat_prefix}{abs(lat):02d}{lon_prefix}{abs(lon):03d}.tif"
return os.path.join(self.dem_dir, filename)
def _load_dem_tile(self, lat: int, lon: int) -> Optional[np.ndarray]:
"""Load a DEM tile from disk."""
cache_key = f"{lat}_{lon}"
if cache_key in self._dem_cache:
return self._dem_cache[cache_key]
filepath = self._get_dem_file_path(lat, lon)
if not os.path.exists(filepath):
logger.debug("DEM file not found", path=filepath)
return None
try:
import rasterio
with rasterio.open(filepath) as src:
data = src.read(1) # Read first band
self._dem_cache[cache_key] = data
return data
except ImportError:
logger.warning("rasterio not available, using fallback")
return self._load_dem_fallback(filepath)
except Exception as e:
logger.error("Error loading DEM", path=filepath, error=str(e))
return None
def _load_dem_fallback(self, filepath: str) -> Optional[np.ndarray]:
"""Fallback DEM loader without rasterio (for development)."""
# In development, return synthetic terrain
return None
async def get_heightmap_tile(self, z: int, x: int, y: int) -> Optional[bytes]:
"""
Generate a heightmap tile in Mapbox Terrain-RGB format.
The encoding stores elevation as: height = -10000 + ((R * 256 * 256 + G * 256 + B) * 0.1)
This covers a range of -10000 m to 1667721.5 m with 0.1 m precision.
"""
# Get tile bounds
west, south, east, north = tile_to_bounds(z, x, y)
# Check if tile is within Germany
if east < GERMANY_BOUNDS["west"] or west > GERMANY_BOUNDS["east"]:
return None
if north < GERMANY_BOUNDS["south"] or south > GERMANY_BOUNDS["north"]:
return None
# Try to load elevation data for this tile
elevations = await self._get_elevations_for_bounds(west, south, east, north)
if elevations is None:
# No DEM data available - return placeholder or None
return self._generate_placeholder_heightmap()
# Convert to Terrain-RGB format
return self._encode_terrain_rgb(elevations)
async def _get_elevations_for_bounds(
self, west: float, south: float, east: float, north: float
) -> Optional[np.ndarray]:
"""Get elevation data for a bounding box."""
# Determine which DEM tiles we need
lat_min = int(math.floor(south))
lat_max = int(math.ceil(north))
lon_min = int(math.floor(west))
lon_max = int(math.ceil(east))
# Load all required DEM tiles
dem_tiles = []
for lat in range(lat_min, lat_max + 1):
for lon in range(lon_min, lon_max + 1):
tile = self._load_dem_tile(lat, lon)
if tile is not None:
dem_tiles.append((lat, lon, tile))
if not dem_tiles:
return None
# Interpolate elevations for our tile grid
elevations = np.zeros((self.tile_size, self.tile_size), dtype=np.float32)
for py in range(self.tile_size):
for px in range(self.tile_size):
# Calculate lat/lon for this pixel
lon = west + (east - west) * px / self.tile_size
lat = north - (north - south) * py / self.tile_size
# Get elevation (simplified - would need proper interpolation)
elevation = self._sample_elevation(lat, lon, dem_tiles)
elevations[py, px] = elevation if elevation is not None else 0
return elevations
def _sample_elevation(
self, lat: float, lon: float, dem_tiles: list
) -> Optional[float]:
"""Sample elevation at a specific point from loaded DEM tiles."""
tile_lat = int(math.floor(lat))
tile_lon = int(math.floor(lon))
for t_lat, t_lon, data in dem_tiles:
if t_lat == tile_lat and t_lon == tile_lon:
# Calculate pixel position within tile
# Assuming 1 degree = 3600 pixels (1 arcsecond for 30m DEM)
rows, cols = data.shape
px = int((lon - tile_lon) * cols)
py = int((t_lat + 1 - lat) * rows)
px = max(0, min(cols - 1, px))
py = max(0, min(rows - 1, py))
return float(data[py, px])
return None
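The comment in `_get_elevations_for_bounds` notes that nearest-neighbour sampling is a simplification and proper interpolation would be needed. A bilinear variant could look like the sketch below; `bilinear_sample` is an illustrative name, not part of the service.

```python
import numpy as np

def bilinear_sample(grid: np.ndarray, row: float, col: float) -> float:
    """Sample a 2-D grid at fractional (row, col) with bilinear weights."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    r1 = min(r0 + 1, grid.shape[0] - 1)
    c1 = min(c0 + 1, grid.shape[1] - 1)
    fr, fc = row - r0, col - c0
    # Interpolate along columns first, then between the two rows
    top = grid[r0, c0] * (1 - fc) + grid[r0, c1] * fc
    bottom = grid[r1, c0] * (1 - fc) + grid[r1, c1] * fc
    return float(top * (1 - fr) + bottom * fr)
```

Swapping this in for the integer-truncating lookup would smooth the stair-stepping visible at high zoom levels.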
def _encode_terrain_rgb(self, elevations: np.ndarray) -> bytes:
"""
Encode elevation data as Mapbox Terrain-RGB.
Format: height = -10000 + ((R * 256 * 256 + G * 256 + B) * 0.1)
"""
        # Convert elevation to RGB values: encoded = (elevation + 10000) / 0.1
        # Clip to the encodable range so values below -10000 m (or nodata) cannot wrap in uint32
        encoded = np.clip((elevations + 10000) / 0.1, 0, 256**3 - 1).astype(np.uint32)
r = (encoded // (256 * 256)) % 256
g = (encoded // 256) % 256
b = encoded % 256
# Create RGB image
rgb = np.stack([r, g, b], axis=-1).astype(np.uint8)
img = Image.fromarray(rgb, mode="RGB")
# Save to bytes
buffer = BytesIO()
img.save(buffer, format="PNG")
return buffer.getvalue()
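The Terrain-RGB packing above can be checked in isolation. This is a minimal round-trip sketch of the same formula (function names are illustrative, not part of the service):

```python
def terrain_rgb_encode(elevation_m: float) -> tuple[int, int, int]:
    # encoded = (elevation + 10000) / 0.1, packed big-endian into (R, G, B)
    encoded = int(round((elevation_m + 10000) / 0.1))
    return (encoded >> 16) & 0xFF, (encoded >> 8) & 0xFF, encoded & 0xFF

def terrain_rgb_decode(r: int, g: int, b: int) -> float:
    # Inverse of the encoding used by _encode_terrain_rgb
    return -10000 + (r * 256 * 256 + g * 256 + b) * 0.1

# Sea level (0 m) encodes to RGB (1, 134, 160) - the placeholder tile colour
sea_level_rgb = terrain_rgb_encode(0.0)
```

Note that sea level is deliberately not (0, 0, 0): the -10000 m offset keeps negative elevations encodable.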
def _generate_placeholder_heightmap(self) -> bytes:
"""Generate a flat placeholder heightmap (sea level)."""
# Sea level = 0m encoded as RGB
# encoded = (0 + 10000) / 0.1 = 100000
# R = 100000 // 65536 = 1
# G = (100000 // 256) % 256 = 134
# B = 100000 % 256 = 160
img = Image.new("RGB", (self.tile_size, self.tile_size), (1, 134, 160))
buffer = BytesIO()
img.save(buffer, format="PNG")
return buffer.getvalue()
async def get_hillshade_tile(
self, z: int, x: int, y: int, azimuth: float = 315, altitude: float = 45
) -> Optional[bytes]:
"""
Generate a hillshade tile for terrain visualization.
Args:
z, x, y: Tile coordinates
azimuth: Light direction in degrees (0=N, 90=E, 180=S, 270=W)
altitude: Light altitude in degrees above horizon
"""
west, south, east, north = tile_to_bounds(z, x, y)
elevations = await self._get_elevations_for_bounds(west, south, east, north)
if elevations is None:
return None
# Calculate hillshade using numpy
hillshade = self._calculate_hillshade(elevations, azimuth, altitude)
# Create grayscale image
img = Image.fromarray((hillshade * 255).astype(np.uint8), mode="L")
buffer = BytesIO()
img.save(buffer, format="PNG")
return buffer.getvalue()
def _calculate_hillshade(
self, dem: np.ndarray, azimuth: float, altitude: float
) -> np.ndarray:
"""Calculate hillshade from DEM array."""
# Convert angles to radians
azimuth_rad = math.radians(360 - azimuth + 90)
altitude_rad = math.radians(altitude)
# Calculate gradient
dy, dx = np.gradient(dem)
# Calculate slope and aspect
slope = np.arctan(np.sqrt(dx**2 + dy**2))
aspect = np.arctan2(-dy, dx)
# Calculate hillshade
hillshade = (
np.sin(altitude_rad) * np.cos(slope)
+ np.cos(altitude_rad) * np.sin(slope) * np.cos(azimuth_rad - aspect)
)
# Normalize to 0-1
hillshade = np.clip(hillshade, 0, 1)
        return hillshade

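The hillshade math above can be sanity-checked on synthetic terrain: a flat DEM must be uniformly lit with intensity sin(altitude). This standalone sketch repeats the same formula outside the class:

```python
import math
import numpy as np

def hillshade(dem: np.ndarray, azimuth: float = 315, altitude: float = 45) -> np.ndarray:
    """Same gradient-based hillshade as _calculate_hillshade, as a free function."""
    azimuth_rad = math.radians(360 - azimuth + 90)
    altitude_rad = math.radians(altitude)
    dy, dx = np.gradient(dem)
    slope = np.arctan(np.sqrt(dx**2 + dy**2))
    aspect = np.arctan2(-dy, dx)
    shade = (np.sin(altitude_rad) * np.cos(slope)
             + np.cos(altitude_rad) * np.sin(slope) * np.cos(azimuth_rad - aspect))
    return np.clip(shade, 0, 1)

# Flat terrain: slope is zero everywhere, so shade == sin(45 degrees) ~= 0.707
flat = hillshade(np.zeros((8, 8)))
```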
async def get_contour_tile(
self, z: int, x: int, y: int, interval: int = 20
) -> Optional[bytes]:
"""Generate contour lines as vector tile (stub - requires more complex implementation)."""
# This would require generating contour lines from DEM and encoding as MVT
# For now, return None
logger.warning("Contour tiles not yet implemented")
return None
async def get_elevation(self, lat: float, lon: float) -> Optional[float]:
"""Get elevation at a specific point."""
tile_lat = int(math.floor(lat))
tile_lon = int(math.floor(lon))
dem_data = self._load_dem_tile(tile_lat, tile_lon)
if dem_data is None:
return None
return self._sample_elevation(lat, lon, [(tile_lat, tile_lon, dem_data)])
async def get_elevation_profile(
self, coordinates: list[list[float]], samples: int = 100
) -> list[dict]:
"""Get elevation profile along a path."""
from shapely.geometry import LineString
from shapely.ops import substring
# Create line from coordinates
line = LineString(coordinates)
total_length = line.length
# Sample points along line
profile = []
for i in range(samples):
            fraction = i / (samples - 1) if samples > 1 else 0.0
point = line.interpolate(fraction, normalized=True)
elevation = await self.get_elevation(point.y, point.x)
profile.append({
"distance_m": fraction * total_length * 111320, # Approximate meters
"longitude": point.x,
"latitude": point.y,
"elevation_m": elevation,
})
return profile
async def get_metadata(self) -> dict:
"""Get metadata about available DEM data."""
dem_files = []
if os.path.exists(self.dem_dir):
dem_files = [f for f in os.listdir(self.dem_dir) if f.endswith(".tif")]
return {
"data_available": len(dem_files) > 0,
"tiles_generated": len(dem_files),
"resolution_m": 30,
"source": "Copernicus DEM GLO-30",
}


@@ -0,0 +1,355 @@
"""
Learning Generator Service
Generates educational content for geographic areas using LLM
"""
import os
import json
import uuid
from typing import Optional
import structlog
import httpx
from config import settings
from models.learning_node import LearningNode, LearningTheme, NodeType
logger = structlog.get_logger(__name__)
# In-memory storage for learning nodes (use database in production)
_learning_nodes = {}
class LearningGeneratorService:
"""
Service for generating educational learning nodes using Ollama LLM.
Generates themed educational content based on geographic features
and didactic principles.
"""
def __init__(self):
self.ollama_url = settings.ollama_base_url
self.model = settings.ollama_model
self.timeout = settings.ollama_timeout
async def generate_nodes(
self,
aoi_id: str,
theme: LearningTheme,
difficulty: str,
node_count: int,
grade_level: Optional[str] = None,
language: str = "de",
) -> list[LearningNode]:
"""
Generate learning nodes for an AOI.
Uses the Ollama LLM to create educational content appropriate
for the theme, difficulty, and grade level.
"""
# Get AOI information
aoi_info = await self._get_aoi_info(aoi_id)
if aoi_info is None:
raise FileNotFoundError(f"AOI {aoi_id} not found")
# Build prompt for LLM
prompt = self._build_generation_prompt(
aoi_info=aoi_info,
theme=theme,
difficulty=difficulty,
node_count=node_count,
grade_level=grade_level,
language=language,
)
# Call Ollama
try:
response = await self._call_ollama(prompt)
nodes = self._parse_llm_response(response, aoi_id, theme)
except ConnectionError:
logger.warning("Ollama not available, using mock data")
nodes = self._generate_mock_nodes(aoi_id, theme, difficulty, node_count)
# Store nodes
if aoi_id not in _learning_nodes:
_learning_nodes[aoi_id] = []
_learning_nodes[aoi_id].extend(nodes)
return nodes
async def _get_aoi_info(self, aoi_id: str) -> Optional[dict]:
"""Get information about an AOI from its manifest."""
manifest_path = os.path.join(settings.bundle_dir, aoi_id, "manifest.json")
if os.path.exists(manifest_path):
with open(manifest_path) as f:
return json.load(f)
# Check in-memory storage
from services.aoi_packager import _aoi_storage
return _aoi_storage.get(aoi_id)
def _build_generation_prompt(
self,
aoi_info: dict,
theme: LearningTheme,
difficulty: str,
node_count: int,
grade_level: Optional[str],
language: str,
) -> str:
"""Build a prompt for the LLM to generate learning nodes."""
theme_descriptions = {
LearningTheme.TOPOGRAPHIE: "Landschaftsformen, Höhen und Geländemerkmale",
LearningTheme.LANDNUTZUNG: "Siedlungen, Landwirtschaft und Flächennutzung",
LearningTheme.ORIENTIERUNG: "Kartenlesen, Kompass und Navigation",
LearningTheme.GEOLOGIE: "Gesteinsarten und geologische Formationen",
LearningTheme.HYDROLOGIE: "Gewässer, Einzugsgebiete und Wasserkreislauf",
LearningTheme.VEGETATION: "Pflanzengemeinschaften und Klimazonen",
}
difficulty_descriptions = {
"leicht": "Grundlegende Beobachtungen und einfache Fakten",
"mittel": "Verknüpfung von Zusammenhängen und Vergleiche",
"schwer": "Analyse, Transfer und kritisches Denken",
}
bounds = aoi_info.get("bounds", {})
center = aoi_info.get("center", {})
prompt = f"""Du bist ein Erdkunde-Didaktiker und erstellst Lernstationen für eine interaktive 3D-Lernwelt.
GEBIET:
- Zentrum: {center.get('latitude', 0):.4f}°N, {center.get('longitude', 0):.4f}°E
- Fläche: ca. {aoi_info.get('area_km2', 0):.2f} km²
- Grenzen: West {bounds.get('west', 0):.4f}°, Süd {bounds.get('south', 0):.4f}°, Ost {bounds.get('east', 0):.4f}°, Nord {bounds.get('north', 0):.4f}°
THEMA: {theme.value} - {theme_descriptions.get(theme, '')}
SCHWIERIGKEITSGRAD: {difficulty} - {difficulty_descriptions.get(difficulty, '')}
ZIELGRUPPE: {grade_level if grade_level else 'Allgemein (Klasse 5-10)'}
AUFGABE:
Erstelle {node_count} Lernstationen im JSON-Format. Jede Station soll:
1. Eine geografische Position innerhalb des Gebiets haben
2. Eine Lernfrage oder Aufgabe enthalten
3. Hinweise zur Lösung bieten
4. Die richtige Antwort mit Erklärung enthalten
FORMAT (JSON-Array):
[
{{
"title": "Titel der Station",
"position": {{"latitude": 0.0, "longitude": 0.0}},
"question": "Die Lernfrage",
"hints": ["Hinweis 1", "Hinweis 2"],
"answer": "Die Antwort",
"explanation": "Didaktische Erklärung",
"node_type": "question|observation|exploration",
"points": 10
}}
]
WICHTIG:
- Positionen müssen innerhalb der Gebietsgrenzen liegen
- Fragen sollen zum Thema {theme.value} passen
- Sprache: {"Deutsch" if language == "de" else "English"}
- Altersgerechte Formulierungen verwenden
Antworte NUR mit dem JSON-Array, ohne weitere Erklärungen."""
return prompt
async def _call_ollama(self, prompt: str) -> str:
"""Call Ollama API to generate content."""
try:
async with httpx.AsyncClient(timeout=self.timeout) as client:
response = await client.post(
f"{self.ollama_url}/api/generate",
json={
"model": self.model,
"prompt": prompt,
"stream": False,
"options": {
"temperature": 0.7,
"top_p": 0.9,
},
},
)
if response.status_code != 200:
raise ConnectionError(f"Ollama returned {response.status_code}")
result = response.json()
return result.get("response", "")
except httpx.ConnectError:
raise ConnectionError("Cannot connect to Ollama")
except Exception as e:
logger.error("Ollama API error", error=str(e))
raise ConnectionError(f"Ollama error: {str(e)}")
def _parse_llm_response(
self, response: str, aoi_id: str, theme: LearningTheme
) -> list[LearningNode]:
"""Parse LLM response into LearningNode objects."""
try:
# Find JSON array in response
start = response.find("[")
end = response.rfind("]") + 1
if start == -1 or end == 0:
raise ValueError("No JSON array found in response")
json_str = response[start:end]
data = json.loads(json_str)
nodes = []
for item in data:
node = LearningNode(
id=str(uuid.uuid4()),
aoi_id=aoi_id,
title=item.get("title", "Unbenannte Station"),
theme=theme,
position={
"latitude": item.get("position", {}).get("latitude", 0),
"longitude": item.get("position", {}).get("longitude", 0),
},
question=item.get("question", ""),
hints=item.get("hints", []),
answer=item.get("answer", ""),
explanation=item.get("explanation", ""),
node_type=NodeType(item.get("node_type", "question")),
points=item.get("points", 10),
approved=False,
)
nodes.append(node)
return nodes
except (json.JSONDecodeError, ValueError) as e:
logger.error("Failed to parse LLM response", error=str(e))
return []
def _generate_mock_nodes(
self,
aoi_id: str,
theme: LearningTheme,
difficulty: str,
node_count: int,
) -> list[LearningNode]:
"""Generate mock learning nodes for development."""
mock_questions = {
LearningTheme.TOPOGRAPHIE: [
("Höhenbestimmung", "Schätze die Höhe dieses Punktes.", "Ca. 500m über NN"),
("Hangneigung", "Beschreibe die Steilheit des Hanges.", "Mäßig steil, ca. 15-20°"),
("Talform", "Welche Form hat dieses Tal?", "V-förmiges Erosionstal"),
],
LearningTheme.LANDNUTZUNG: [
("Gebäudetypen", "Welche Gebäude siehst du hier?", "Wohnhäuser und landwirtschaftliche Gebäude"),
("Flächennutzung", "Wie wird das Land genutzt?", "Landwirtschaft und Siedlung"),
("Infrastruktur", "Welche Verkehrswege erkennst du?", "Straße und Feldweg"),
],
LearningTheme.ORIENTIERUNG: [
("Himmelsrichtung", "In welche Richtung fließt der Bach?", "Nach Nordwesten"),
("Entfernung", "Wie weit ist es bis zum Waldrand?", "Etwa 200 Meter"),
("Wegbeschreibung", "Beschreibe den Weg zum Aussichtspunkt.", "Nordöstlich, bergauf"),
],
}
questions = mock_questions.get(theme, mock_questions[LearningTheme.TOPOGRAPHIE])
nodes = []
for i in range(min(node_count, len(questions))):
title, question, answer = questions[i]
nodes.append(LearningNode(
id=str(uuid.uuid4()),
aoi_id=aoi_id,
title=title,
theme=theme,
position={"latitude": 47.7 + i * 0.001, "longitude": 9.19 + i * 0.001},
question=question,
hints=[f"Hinweis {j + 1}" for j in range(2)],
answer=answer,
explanation=f"Diese Aufgabe trainiert die Beobachtung von {theme.value}.",
node_type=NodeType.QUESTION,
points=10,
approved=False,
))
return nodes
async def get_nodes_for_aoi(
self, aoi_id: str, theme: Optional[LearningTheme] = None
) -> Optional[list[LearningNode]]:
"""Get all learning nodes for an AOI."""
nodes = _learning_nodes.get(aoi_id)
if nodes is None:
return None
if theme is not None:
nodes = [n for n in nodes if n.theme == theme]
return nodes
async def update_node(
self, aoi_id: str, node_id: str, node_update: LearningNode
) -> bool:
"""Update a learning node."""
nodes = _learning_nodes.get(aoi_id)
if nodes is None:
return False
for i, node in enumerate(nodes):
if node.id == node_id:
_learning_nodes[aoi_id][i] = node_update
return True
return False
async def delete_node(self, aoi_id: str, node_id: str) -> bool:
"""Delete a learning node."""
nodes = _learning_nodes.get(aoi_id)
if nodes is None:
return False
for i, node in enumerate(nodes):
if node.id == node_id:
del _learning_nodes[aoi_id][i]
return True
return False
async def approve_node(self, aoi_id: str, node_id: str) -> bool:
"""Approve a learning node for student use."""
nodes = _learning_nodes.get(aoi_id)
if nodes is None:
return False
for node in nodes:
if node.id == node_id:
node.approved = True
return True
return False
async def get_statistics(self) -> dict:
"""Get statistics about learning node usage."""
total = 0
by_theme = {}
        by_difficulty = {}  # not populated yet: difficulty is not stored on LearningNode
for aoi_nodes in _learning_nodes.values():
for node in aoi_nodes:
total += 1
theme = node.theme.value
by_theme[theme] = by_theme.get(theme, 0) + 1
return {
"total_nodes": total,
"by_theme": by_theme,
"by_difficulty": by_difficulty,
"avg_per_aoi": total / len(_learning_nodes) if _learning_nodes else 0,
"popular_theme": max(by_theme, key=by_theme.get) if by_theme else "topographie",
}


@@ -0,0 +1,217 @@
"""
OSM Extractor Service
Extracts OpenStreetMap features from PostGIS or vector tiles
"""
import os
import json
from typing import Optional
import structlog
from shapely.geometry import shape, mapping
from config import settings
logger = structlog.get_logger(__name__)
class OSMExtractorService:
"""
Service for extracting OSM features from a geographic area.
Can extract from:
- PostGIS database (imported OSM data)
- PMTiles archive
"""
def __init__(self):
self.database_url = settings.database_url
async def extract_features(self, polygon: dict) -> dict:
"""
Extract OSM features within a polygon.
Returns a GeoJSON FeatureCollection with categorized features.
"""
geom = shape(polygon)
bounds = geom.bounds # (minx, miny, maxx, maxy)
# Feature collection structure
features = {
"type": "FeatureCollection",
"features": [],
"metadata": {
"source": "OpenStreetMap",
"license": "ODbL",
"bounds": {
"west": bounds[0],
"south": bounds[1],
"east": bounds[2],
"north": bounds[3],
},
},
}
# Try to extract from database
try:
db_features = await self._extract_from_database(geom)
if db_features:
features["features"].extend(db_features)
return features
except Exception as e:
logger.warning("Database extraction failed", error=str(e))
# Fall back to mock data for development
features["features"] = self._generate_mock_features(geom)
features["metadata"]["source"] = "Mock Data (OSM data not imported)"
return features
async def _extract_from_database(self, geom) -> list:
"""Extract features from PostGIS database."""
# This would use asyncpg to query the database
# For now, return empty list to trigger mock data
# Example query structure (not executed):
# SELECT ST_AsGeoJSON(way), name, building, highway, natural, waterway
# FROM planet_osm_polygon
# WHERE ST_Intersects(way, ST_GeomFromGeoJSON($1))
return []
def _generate_mock_features(self, geom) -> list:
"""Generate mock OSM features for development."""
from shapely.geometry import Point, LineString, Polygon as ShapelyPolygon
import random
bounds = geom.bounds
features = []
# Generate some mock buildings
for i in range(5):
x = random.uniform(bounds[0], bounds[2])
y = random.uniform(bounds[1], bounds[3])
# Small building polygon
size = 0.0002 # ~20m
building = ShapelyPolygon([
(x, y),
(x + size, y),
(x + size, y + size),
(x, y + size),
(x, y),
])
if geom.contains(building.centroid):
features.append({
"type": "Feature",
"geometry": mapping(building),
"properties": {
"category": "building",
"building": "yes",
"name": f"Gebäude {i + 1}",
},
})
# Generate some mock roads
for i in range(3):
x1 = random.uniform(bounds[0], bounds[2])
y1 = random.uniform(bounds[1], bounds[3])
x2 = random.uniform(bounds[0], bounds[2])
y2 = random.uniform(bounds[1], bounds[3])
road = LineString([(x1, y1), (x2, y2)])
features.append({
"type": "Feature",
"geometry": mapping(road),
"properties": {
"category": "road",
"highway": random.choice(["primary", "secondary", "residential"]),
"name": f"Straße {i + 1}",
},
})
# Generate mock water feature
cx = (bounds[0] + bounds[2]) / 2
cy = (bounds[1] + bounds[3]) / 2
size = min(bounds[2] - bounds[0], bounds[3] - bounds[1]) * 0.2
water = ShapelyPolygon([
(cx - size, cy - size / 2),
(cx + size, cy - size / 2),
(cx + size, cy + size / 2),
(cx - size, cy + size / 2),
(cx - size, cy - size / 2),
])
if geom.intersects(water):
features.append({
"type": "Feature",
"geometry": mapping(water.intersection(geom)),
"properties": {
"category": "water",
"natural": "water",
"name": "See",
},
})
# Generate mock forest
forest_size = size * 1.5
forest = ShapelyPolygon([
(bounds[0], bounds[1]),
(bounds[0] + forest_size, bounds[1]),
(bounds[0] + forest_size, bounds[1] + forest_size),
(bounds[0], bounds[1] + forest_size),
(bounds[0], bounds[1]),
])
if geom.intersects(forest):
features.append({
"type": "Feature",
"geometry": mapping(forest.intersection(geom)),
"properties": {
"category": "vegetation",
"landuse": "forest",
"name": "Wald",
},
})
return features
async def get_feature_statistics(self, polygon: dict) -> dict:
"""Get statistics about features in an area."""
features = await self.extract_features(polygon)
categories = {}
for feature in features.get("features", []):
category = feature.get("properties", {}).get("category", "other")
categories[category] = categories.get(category, 0) + 1
return {
"total_features": len(features.get("features", [])),
"by_category": categories,
}
async def search_features(
self,
polygon: dict,
category: str,
name_filter: Optional[str] = None,
) -> list:
"""Search for specific features within an area."""
all_features = await self.extract_features(polygon)
filtered = []
for feature in all_features.get("features", []):
props = feature.get("properties", {})
if props.get("category") != category:
continue
if name_filter:
name = props.get("name", "")
if name_filter.lower() not in name.lower():
continue
filtered.append(feature)
return filtered


@@ -0,0 +1,186 @@
"""
Tile Server Service
Serves vector tiles from PMTiles format or generates on-demand from PostGIS
"""
import os
import gzip
from typing import Optional
from datetime import datetime
import structlog
from pmtiles.reader import Reader as PMTilesReader
from pmtiles.tile import TileType
from config import settings
logger = structlog.get_logger(__name__)
class MMapFileReader:
"""Memory-mapped file reader for PMTiles."""
def __init__(self, path: str):
self.path = path
self._file = None
self._size = 0
def __enter__(self):
self._file = open(self.path, "rb")
self._file.seek(0, 2) # Seek to end
self._size = self._file.tell()
self._file.seek(0)
return self
def __exit__(self, *args):
if self._file:
self._file.close()
def read(self, offset: int, length: int) -> bytes:
"""Read bytes from file at offset."""
self._file.seek(offset)
return self._file.read(length)
def size(self) -> int:
"""Get file size."""
return self._size
class TileServerService:
"""
Service for serving vector tiles from PMTiles format.
PMTiles is a cloud-optimized format for tile archives that allows
random access to individual tiles without extracting the entire archive.
"""
def __init__(self):
self.pmtiles_path = settings.pmtiles_path
self.cache_dir = settings.tile_cache_dir
self._reader = None
self._metadata_cache = None
def _get_reader(self) -> Optional[PMTilesReader]:
"""Get or create PMTiles reader."""
if not os.path.exists(self.pmtiles_path):
logger.warning("PMTiles file not found", path=self.pmtiles_path)
return None
if self._reader is None:
try:
                file_reader = MMapFileReader(self.pmtiles_path)
                file_reader.__enter__()
                # pmtiles.reader.Reader expects a get_bytes(offset, length) callable,
                # so pass the bound read method rather than the reader object itself
                self._reader = PMTilesReader(file_reader.read)
logger.info("PMTiles reader initialized", path=self.pmtiles_path)
except Exception as e:
logger.error("Failed to initialize PMTiles reader", error=str(e))
return None
return self._reader
async def get_tile(self, z: int, x: int, y: int) -> Optional[bytes]:
"""
Get a vector tile at the specified coordinates.
Args:
z: Zoom level
x: Tile X coordinate
y: Tile Y coordinate
Returns:
Tile data as gzipped protobuf, or None if not found
"""
# Check cache first
cache_path = os.path.join(self.cache_dir, str(z), str(x), f"{y}.pbf")
if os.path.exists(cache_path):
with open(cache_path, "rb") as f:
return f.read()
# Try to get from PMTiles
reader = self._get_reader()
if reader is None:
raise FileNotFoundError("PMTiles file not available")
try:
            tile_data = reader.get(z, x, y)  # Reader.get returns raw tile bytes, or None if absent
if tile_data is None:
return None
# Cache the tile
await self._cache_tile(z, x, y, tile_data)
return tile_data
except Exception as e:
logger.error("Error reading tile", z=z, x=x, y=y, error=str(e))
return None
async def _cache_tile(self, z: int, x: int, y: int, data: bytes):
"""Cache a tile to disk."""
cache_path = os.path.join(self.cache_dir, str(z), str(x))
os.makedirs(cache_path, exist_ok=True)
tile_path = os.path.join(cache_path, f"{y}.pbf")
with open(tile_path, "wb") as f:
f.write(data)
async def get_metadata(self) -> dict:
"""
Get metadata about the tile archive.
Returns:
Dictionary with metadata including bounds, zoom levels, etc.
"""
if self._metadata_cache is not None:
return self._metadata_cache
reader = self._get_reader()
if reader is None:
return {
"data_available": False,
"minzoom": 0,
"maxzoom": 14,
"bounds": [5.87, 47.27, 15.04, 55.06],
"center": [10.45, 51.16, 6],
}
try:
header = reader.header()
metadata = reader.metadata()
self._metadata_cache = {
"data_available": True,
"minzoom": header.get("minZoom", 0),
"maxzoom": header.get("maxZoom", 14),
"bounds": header.get("bounds", [5.87, 47.27, 15.04, 55.06]),
"center": header.get("center", [10.45, 51.16, 6]),
"tile_type": "mvt", # Mapbox Vector Tiles
"last_updated": datetime.fromtimestamp(
os.path.getmtime(self.pmtiles_path)
).isoformat() if os.path.exists(self.pmtiles_path) else None,
**metadata,
}
return self._metadata_cache
except Exception as e:
logger.error("Error reading metadata", error=str(e))
return {"data_available": False}
def clear_cache(self):
"""Clear the tile cache."""
import shutil
if os.path.exists(self.cache_dir):
shutil.rmtree(self.cache_dir)
os.makedirs(self.cache_dir)
logger.info("Tile cache cleared")
def get_cache_size_mb(self) -> float:
"""Get the current cache size in MB."""
total_size = 0
for dirpath, dirnames, filenames in os.walk(self.cache_dir):
for filename in filenames:
filepath = os.path.join(dirpath, filename)
total_size += os.path.getsize(filepath)
return total_size / (1024 * 1024)


@@ -0,0 +1,3 @@
"""
GeoEdu Service Tests
"""


@@ -0,0 +1,271 @@
"""
Tests for AOI Packager Service
"""
import pytest
from unittest.mock import patch, MagicMock, AsyncMock
import json
import sys
sys.path.insert(0, '/app')
from services.aoi_packager import AOIPackagerService
class TestAOIPackagerService:
"""Tests for AOI Packager Service."""
@pytest.fixture
def service(self):
"""Create service instance."""
return AOIPackagerService()
def test_calculate_area_km2_small_polygon(self, service):
"""Test area calculation for small polygon."""
polygon = {
"type": "Polygon",
"coordinates": [
[
[9.19, 47.70],
[9.20, 47.70],
[9.20, 47.71],
[9.19, 47.71],
[9.19, 47.70],
]
],
}
area = service.calculate_area_km2(polygon)
# Should be approximately 1 km² (rough estimate)
assert 0.5 < area < 2.0
def test_calculate_area_km2_mainau(self, service):
"""Test area calculation for Mainau island polygon."""
polygon = {
"type": "Polygon",
"coordinates": [
[
[9.1875, 47.7055],
[9.1975, 47.7055],
[9.1975, 47.7115],
[9.1875, 47.7115],
[9.1875, 47.7055],
]
],
}
area = service.calculate_area_km2(polygon)
# Mainau template claims ~0.45 km²
assert 0.3 < area < 1.0
def test_is_within_germany_valid(self, service):
"""Test polygon within Germany."""
polygon = {
"type": "Polygon",
"coordinates": [
[
[9.19, 47.70],
[9.20, 47.70],
[9.20, 47.71],
[9.19, 47.71],
[9.19, 47.70],
]
],
}
assert service.is_within_germany(polygon) == True
def test_is_within_germany_outside(self, service):
"""Test polygon outside Germany."""
# Paris
polygon = {
"type": "Polygon",
"coordinates": [
[
[2.3, 48.8],
[2.4, 48.8],
[2.4, 48.9],
[2.3, 48.9],
[2.3, 48.8],
]
],
}
assert service.is_within_germany(polygon) == False
def test_validate_polygon_valid(self, service):
"""Test valid polygon validation."""
polygon = {
"type": "Polygon",
"coordinates": [
[
[9.19, 47.70],
[9.20, 47.70],
[9.20, 47.71],
[9.19, 47.71],
[9.19, 47.70],
]
],
}
is_valid, message = service.validate_polygon(polygon)
assert is_valid == True
assert message == "Valid"
def test_validate_polygon_not_closed(self, service):
"""Test polygon validation with unclosed ring."""
polygon = {
"type": "Polygon",
"coordinates": [
[
[9.19, 47.70],
[9.20, 47.70],
[9.20, 47.71],
[9.19, 47.71],
# Missing closing point
]
],
}
is_valid, message = service.validate_polygon(polygon)
assert is_valid == False
assert "closed" in message.lower()
def test_validate_polygon_wrong_type(self, service):
"""Test polygon validation with wrong geometry type."""
polygon = {
"type": "Point",
"coordinates": [9.19, 47.70],
}
is_valid, message = service.validate_polygon(polygon)
assert is_valid == False
assert "Polygon" in message
def test_estimate_bundle_size_low(self, service):
"""Test bundle size estimation for low quality."""
size = service.estimate_bundle_size_mb(1.0, "low")
assert size == 10.0
def test_estimate_bundle_size_medium(self, service):
"""Test bundle size estimation for medium quality."""
size = service.estimate_bundle_size_mb(1.0, "medium")
assert size == 25.0
def test_estimate_bundle_size_high(self, service):
"""Test bundle size estimation for high quality."""
size = service.estimate_bundle_size_mb(1.0, "high")
assert size == 50.0
def test_estimate_bundle_size_scales_with_area(self, service):
"""Test bundle size scales with area."""
size_1km = service.estimate_bundle_size_mb(1.0, "medium")
size_2km = service.estimate_bundle_size_mb(2.0, "medium")
assert size_2km == size_1km * 2
class TestGeoUtils:
"""Tests for geo utility functions."""
def test_lat_lon_to_tile_berlin(self):
"""Test tile conversion for Berlin."""
from utils.geo_utils import lat_lon_to_tile
# Berlin: 52.52, 13.405
x, y = lat_lon_to_tile(52.52, 13.405, 10)
# At zoom 10, Berlin should be around tile 550, 335
assert 540 < x < 560
assert 325 < y < 345
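The expected Berlin tile in this test follows from the standard Web-Mercator tiling formulas; a self-contained sketch of what `utils.geo_utils.lat_lon_to_tile` presumably implements:

```python
import math

def lat_lon_to_tile(lat: float, lon: float, zoom: int) -> tuple[int, int]:
    """Standard slippy-map tile indices for a WGS84 coordinate."""
    n = 2 ** zoom  # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y
```

With these formulas Berlin (52.52, 13.405) lands on tile (550, 335) at zoom 10, inside the ranges the test asserts.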
def test_tile_to_bounds(self):
"""Test tile to bounds conversion."""
from utils.geo_utils import tile_to_bounds
west, south, east, north = tile_to_bounds(10, 550, 335)
# Should return valid bounds
assert west < east
assert south < north
# Should be somewhere in Germany
assert 5 < west < 20
assert 45 < south < 60
def test_calculate_distance(self):
"""Test distance calculation."""
from utils.geo_utils import calculate_distance
# Berlin to Munich: ~504 km
dist = calculate_distance(52.52, 13.405, 48.1351, 11.582)
assert 450000 < dist < 550000 # meters
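The Berlin-Munich expectation (~504 km) matches a haversine great-circle distance; a standalone sketch for comparison (the project's `calculate_distance` may differ in implementation details):

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres on a spherical Earth (R = 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371000 * math.asin(math.sqrt(a))

berlin_munich = haversine_m(52.52, 13.405, 48.1351, 11.582)
```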
def test_get_germany_bounds(self):
"""Test Germany bounds."""
from utils.geo_utils import get_germany_bounds
west, south, east, north = get_germany_bounds()
assert west == 5.87
assert south == 47.27
assert east == 15.04
assert north == 55.06
class TestLicenseChecker:
"""Tests for license checker utility."""
def test_osm_source_allowed(self):
"""Test OSM source is allowed."""
from utils.license_checker import LicenseChecker, DataSource
assert LicenseChecker.is_source_allowed(DataSource.OPENSTREETMAP) == True
def test_copernicus_source_allowed(self):
"""Test Copernicus source is allowed."""
from utils.license_checker import LicenseChecker, DataSource
assert LicenseChecker.is_source_allowed(DataSource.COPERNICUS_DEM) == True
def test_google_source_forbidden(self):
"""Test Google source is forbidden."""
from utils.license_checker import LicenseChecker, DataSource
assert LicenseChecker.is_source_allowed(DataSource.GOOGLE) == False
def test_validate_osm_url(self):
"""Test OSM URL validation."""
from utils.license_checker import LicenseChecker
is_allowed, message = LicenseChecker.validate_url(
"https://tile.openstreetmap.org/10/550/335.png"
)
assert is_allowed == True
assert "ALLOWED" in message
def test_validate_google_url(self):
"""Test Google URL validation."""
from utils.license_checker import LicenseChecker
is_allowed, message = LicenseChecker.validate_url(
"https://maps.googleapis.com/maps/api/staticmap"
)
assert is_allowed == False
assert "FORBIDDEN" in message
def test_get_attribution_for_sources(self):
"""Test getting attribution for sources."""
from utils.license_checker import LicenseChecker, DataSource
sources = [DataSource.OPENSTREETMAP, DataSource.COPERNICUS_DEM]
attributions = LicenseChecker.get_attribution_for_sources(sources)
assert len(attributions) == 2
assert any("OpenStreetMap" in a["name"] for a in attributions)
assert any("Copernicus" in a["name"] for a in attributions)
def test_check_commercial_use_allowed(self):
"""Test commercial use check for allowed sources."""
from utils.license_checker import LicenseChecker, DataSource
sources = [DataSource.OPENSTREETMAP, DataSource.COPERNICUS_DEM]
allowed, issues = LicenseChecker.check_commercial_use(sources)
assert allowed == True
assert len(issues) == 0
def test_check_commercial_use_forbidden(self):
"""Test commercial use check with forbidden source."""
from utils.license_checker import LicenseChecker, DataSource
sources = [DataSource.OPENSTREETMAP, DataSource.GOOGLE]
allowed, issues = LicenseChecker.check_commercial_use(sources)
        assert allowed is False
assert len(issues) > 0

View File

@@ -0,0 +1,194 @@
"""
Tests for Learning Generator Service
"""
import pytest
from unittest.mock import patch, MagicMock, AsyncMock
import json
import sys
sys.path.insert(0, '/app')
from services.learning_generator import LearningGeneratorService
from models.learning_node import LearningTheme, NodeType
class TestLearningGeneratorService:
"""Tests for Learning Generator Service."""
@pytest.fixture
def service(self):
"""Create service instance."""
return LearningGeneratorService()
def test_build_generation_prompt_topographie(self, service):
"""Test prompt building for topography theme."""
aoi_info = {
"bounds": {"west": 9.18, "south": 47.70, "east": 9.20, "north": 47.72},
"center": {"latitude": 47.71, "longitude": 9.19},
"area_km2": 0.5,
}
prompt = service._build_generation_prompt(
aoi_info=aoi_info,
theme=LearningTheme.TOPOGRAPHIE,
difficulty="mittel",
node_count=5,
grade_level="7-8",
language="de",
)
        assert "topographie" in prompt.lower()
assert "47.71" in prompt # center latitude
assert "9.19" in prompt # center longitude
assert "5" in prompt # node count
assert "mittel" in prompt # difficulty
assert "7-8" in prompt # grade level
assert "JSON" in prompt
def test_build_generation_prompt_landnutzung(self, service):
"""Test prompt building for land use theme."""
aoi_info = {
"bounds": {"west": 9.18, "south": 47.70, "east": 9.20, "north": 47.72},
"center": {"latitude": 47.71, "longitude": 9.19},
"area_km2": 0.5,
}
prompt = service._build_generation_prompt(
aoi_info=aoi_info,
theme=LearningTheme.LANDNUTZUNG,
difficulty="leicht",
node_count=3,
grade_level=None,
language="de",
)
        assert "landnutzung" in prompt.lower()
def test_parse_llm_response_valid(self, service):
"""Test parsing valid LLM response."""
response = """
Here are the learning nodes:
[
{
"title": "Höhenbestimmung",
"position": {"latitude": 47.71, "longitude": 9.19},
"question": "Schätze die Höhe dieses Punktes.",
"hints": ["Schau auf die Vegetation", "Vergleiche mit dem See"],
"answer": "Ca. 500m",
"explanation": "Die Höhe lässt sich...",
"node_type": "question",
"points": 10
}
]
"""
nodes = service._parse_llm_response(
response, "test-aoi-id", LearningTheme.TOPOGRAPHIE
)
assert len(nodes) == 1
assert nodes[0].title == "Höhenbestimmung"
assert nodes[0].aoi_id == "test-aoi-id"
assert nodes[0].theme == LearningTheme.TOPOGRAPHIE
assert nodes[0].points == 10
def test_parse_llm_response_invalid_json(self, service):
"""Test parsing invalid LLM response."""
response = "This is not valid JSON"
nodes = service._parse_llm_response(
response, "test-aoi-id", LearningTheme.TOPOGRAPHIE
)
assert len(nodes) == 0
def test_generate_mock_nodes(self, service):
"""Test mock node generation."""
nodes = service._generate_mock_nodes(
aoi_id="test-aoi-id",
theme=LearningTheme.TOPOGRAPHIE,
difficulty="mittel",
node_count=3,
)
assert len(nodes) == 3
for node in nodes:
assert node.aoi_id == "test-aoi-id"
assert node.theme == LearningTheme.TOPOGRAPHIE
            assert node.approved is False
assert len(node.hints) > 0
def test_generate_mock_nodes_orientierung(self, service):
"""Test mock node generation for orientation theme."""
nodes = service._generate_mock_nodes(
aoi_id="test-aoi-id",
theme=LearningTheme.ORIENTIERUNG,
difficulty="leicht",
node_count=2,
)
assert len(nodes) == 2
for node in nodes:
assert node.theme == LearningTheme.ORIENTIERUNG
@pytest.mark.asyncio
async def test_get_nodes_for_aoi_empty(self, service):
"""Test getting nodes for non-existent AOI."""
nodes = await service.get_nodes_for_aoi("nonexistent-aoi")
assert nodes is None
@pytest.mark.asyncio
async def test_get_statistics_empty(self, service):
"""Test statistics with no data."""
stats = await service.get_statistics()
assert stats["total_nodes"] == 0
assert stats["by_theme"] == {}
class TestLearningNodeModel:
"""Tests for Learning Node model."""
def test_learning_theme_values(self):
"""Test all theme enum values exist."""
themes = [
LearningTheme.TOPOGRAPHIE,
LearningTheme.LANDNUTZUNG,
LearningTheme.ORIENTIERUNG,
LearningTheme.GEOLOGIE,
LearningTheme.HYDROLOGIE,
LearningTheme.VEGETATION,
]
assert len(themes) == 6
def test_node_type_values(self):
"""Test all node type enum values exist."""
types = [
NodeType.QUESTION,
NodeType.OBSERVATION,
NodeType.EXPLORATION,
]
assert len(types) == 3
def test_create_learning_node(self):
"""Test creating a learning node."""
from models.learning_node import LearningNode
node = LearningNode(
id="test-id",
aoi_id="test-aoi",
title="Test Station",
theme=LearningTheme.TOPOGRAPHIE,
position={"latitude": 47.71, "longitude": 9.19},
question="Test question?",
hints=["Hint 1", "Hint 2"],
answer="Test answer",
explanation="Test explanation",
node_type=NodeType.QUESTION,
points=10,
approved=False,
)
assert node.id == "test-id"
assert node.title == "Test Station"
assert node.points == 10
assert len(node.hints) == 2
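
The `test_parse_llm_response_*` cases above hinge on pulling the first JSON array out of free-form LLM output and returning nothing when parsing fails. A minimal stdlib sketch of that extraction step (a simplified stand-in, not the service's actual `_parse_llm_response`; the function name here is illustrative):

```python
import json
import re

def extract_json_array(text: str) -> list:
    """Extract the first top-level JSON array from free-form LLM output.

    Returns an empty list when no parseable array is found, mirroring the
    'invalid JSON -> zero nodes' behaviour exercised above.
    """
    # Greedy match spans the first '[' to the last ']' across newlines
    match = re.search(r"\[.*\]", text, re.DOTALL)
    if not match:
        return []
    try:
        data = json.loads(match.group(0))
    except json.JSONDecodeError:
        return []
    return data if isinstance(data, list) else []

nodes = extract_json_array('Here are the nodes:\n[{"title": "Test", "points": 10}]')
empty = extract_json_array("This is not valid JSON")
```

The greedy regex deliberately tolerates chatty preambles like "Here are the learning nodes:" before the array, which is the shape the valid-response fixture above uses.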

View File

@@ -0,0 +1,283 @@
"""
Tests for Tile Server API
"""
import pytest
from fastapi.testclient import TestClient
from unittest.mock import patch, MagicMock
# Import app after mocking to avoid import errors
import sys
sys.path.insert(0, '/app')
@pytest.fixture
def client():
"""Create test client."""
from main import app
return TestClient(app)
class TestTileEndpoints:
"""Tests for tile server endpoints."""
def test_health_check(self, client):
"""Test health check endpoint returns healthy status."""
response = client.get("/health")
assert response.status_code == 200
data = response.json()
assert data["status"] == "healthy"
assert data["service"] == "geo-service"
assert "data_status" in data
def test_root_endpoint(self, client):
"""Test root endpoint returns service info."""
response = client.get("/")
assert response.status_code == 200
data = response.json()
assert data["service"] == "GeoEdu Service"
assert "endpoints" in data
assert "attribution" in data
def test_tile_metadata(self, client):
"""Test tile metadata endpoint."""
response = client.get("/api/v1/tiles/metadata")
assert response.status_code == 200
data = response.json()
assert "name" in data
assert "minzoom" in data
assert "maxzoom" in data
assert "bounds" in data
assert data["attribution"] == "© OpenStreetMap contributors (ODbL)"
def test_tile_bounds(self, client):
"""Test tile bounds endpoint returns Germany bounds."""
response = client.get("/api/v1/tiles/bounds")
assert response.status_code == 200
data = response.json()
assert "bounds" in data
# Germany approximate bounds
bounds = data["bounds"]
assert bounds[0] < 6 # west
assert bounds[1] > 47 # south
assert bounds[2] > 14 # east
assert bounds[3] < 56 # north
def test_style_json(self, client):
"""Test MapLibre style endpoint."""
response = client.get("/api/v1/tiles/style.json")
assert response.status_code == 200
data = response.json()
assert data["version"] == 8
assert "sources" in data
assert "layers" in data
assert "osm" in data["sources"]
def test_tile_request_without_data(self, client):
"""Test tile request returns 503 when data not available."""
response = client.get("/api/v1/tiles/0/0/0.pbf")
# Should return 503 (data not available) or 204 (empty tile)
assert response.status_code in [204, 503]
def test_tile_request_invalid_zoom(self, client):
"""Test tile request with invalid zoom level."""
response = client.get("/api/v1/tiles/25/0/0.pbf")
assert response.status_code == 422 # Validation error
class TestTerrainEndpoints:
"""Tests for terrain/DEM endpoints."""
def test_terrain_metadata(self, client):
"""Test terrain metadata endpoint."""
response = client.get("/api/v1/terrain/metadata")
assert response.status_code == 200
data = response.json()
assert data["name"] == "Copernicus DEM GLO-30"
assert data["resolution_m"] == 30
assert "bounds" in data
def test_heightmap_request_without_data(self, client):
"""Test heightmap request returns appropriate status when no data."""
response = client.get("/api/v1/terrain/10/500/500.png")
# Should return 204 (no content) or 503 (data not available)
assert response.status_code in [204, 503]
def test_elevation_at_point_within_germany(self, client):
"""Test elevation endpoint with valid Germany coordinates."""
# Berlin coordinates
response = client.get("/api/v1/terrain/elevation?lat=52.52&lon=13.405")
# Should return 404 (no data) or 200 (with elevation) or 503
assert response.status_code in [200, 404, 503]
def test_elevation_at_point_outside_germany(self, client):
"""Test elevation endpoint with coordinates outside Germany."""
# Paris coordinates (outside Germany)
response = client.get("/api/v1/terrain/elevation?lat=48.8566&lon=2.3522")
# Should return 422 (validation error - outside bounds)
assert response.status_code == 422
class TestAOIEndpoints:
"""Tests for AOI (Area of Interest) endpoints."""
def test_validate_polygon_valid(self, client):
"""Test polygon validation with valid polygon."""
polygon = {
"type": "Polygon",
"coordinates": [
[
[9.19, 47.71],
[9.20, 47.71],
[9.20, 47.70],
[9.19, 47.70],
[9.19, 47.71],
]
],
}
response = client.post("/api/v1/aoi/validate", json=polygon)
assert response.status_code == 200
data = response.json()
assert "valid" in data
assert "area_km2" in data
assert "within_germany" in data
def test_validate_polygon_too_large(self, client):
"""Test polygon validation with too large area."""
# Large polygon (> 4 km²)
polygon = {
"type": "Polygon",
"coordinates": [
[
[9.0, 47.5],
[9.5, 47.5],
[9.5, 48.0],
[9.0, 48.0],
[9.0, 47.5],
]
],
}
response = client.post("/api/v1/aoi/validate", json=polygon)
assert response.status_code == 200
data = response.json()
        assert data["within_size_limit"] is False
def test_validate_polygon_outside_germany(self, client):
"""Test polygon validation outside Germany."""
# Paris area
polygon = {
"type": "Polygon",
"coordinates": [
[
[2.3, 48.8],
[2.4, 48.8],
[2.4, 48.9],
[2.3, 48.9],
[2.3, 48.8],
]
],
}
response = client.post("/api/v1/aoi/validate", json=polygon)
assert response.status_code == 200
data = response.json()
        assert data["within_germany"] is False
def test_mainau_template(self, client):
"""Test Mainau demo template endpoint."""
response = client.get("/api/v1/aoi/templates/mainau")
assert response.status_code == 200
data = response.json()
assert data["name"] == "Insel Mainau"
assert "polygon" in data
assert "center" in data
assert "suggested_themes" in data
def test_create_aoi(self, client):
"""Test AOI creation with valid polygon."""
request = {
"polygon": {
"type": "Polygon",
"coordinates": [
[
[9.1875, 47.7055],
[9.1975, 47.7055],
[9.1975, 47.7115],
[9.1875, 47.7115],
[9.1875, 47.7055],
]
],
},
"theme": "topographie",
"quality": "medium",
}
response = client.post("/api/v1/aoi", json=request)
assert response.status_code == 200
data = response.json()
assert "aoi_id" in data
assert data["status"] == "queued"
assert "area_km2" in data
def test_create_aoi_too_large(self, client):
"""Test AOI creation fails with too large area."""
request = {
"polygon": {
"type": "Polygon",
"coordinates": [
[
[9.0, 47.5],
[9.5, 47.5],
[9.5, 48.0],
[9.0, 48.0],
[9.0, 47.5],
]
],
},
"theme": "topographie",
"quality": "medium",
}
response = client.post("/api/v1/aoi", json=request)
assert response.status_code == 400
def test_get_nonexistent_aoi(self, client):
"""Test getting non-existent AOI returns 404."""
response = client.get("/api/v1/aoi/nonexistent-id")
assert response.status_code == 404
class TestLearningEndpoints:
"""Tests for Learning Nodes endpoints."""
def test_learning_templates(self, client):
"""Test learning templates endpoint."""
response = client.get("/api/v1/learning/templates")
assert response.status_code == 200
data = response.json()
assert "themes" in data
assert "difficulties" in data
assert len(data["themes"]) == 6 # 6 themes
# Check theme structure
theme = data["themes"][0]
assert "id" in theme
assert "name" in theme
assert "description" in theme
assert "example_questions" in theme
def test_learning_statistics(self, client):
"""Test learning statistics endpoint."""
response = client.get("/api/v1/learning/statistics")
assert response.status_code == 200
data = response.json()
assert "total_nodes_generated" in data
assert "nodes_by_theme" in data
def test_generate_nodes_without_aoi(self, client):
"""Test node generation fails without valid AOI."""
request = {
"aoi_id": "nonexistent-aoi",
"theme": "topographie",
"difficulty": "mittel",
"node_count": 5,
}
response = client.post("/api/v1/learning/generate", json=request)
# Should return 404 (AOI not found) or 503 (Ollama not available)
assert response.status_code in [404, 503]
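
The AOI tests above assume a hard area cap (the small Mainau-area polygon passes while the half-degree box is rejected as over ~4 km²). A rough stdlib-only sketch of such a size check, using a shoelace area on an equirectangular projection; the 4 km² limit and function name are assumptions for illustration, and the service itself projects to an equal-area CRS instead:

```python
import math

def approx_area_km2(ring):
    """Approximate the area of a closed lon/lat ring via the shoelace formula
    on an equirectangular projection around the ring's mean latitude.
    Adequate for small AOIs; real code should use EPSG:3035 or similar."""
    mean_lat = math.radians(sum(p[1] for p in ring) / len(ring))
    km_per_deg_lat = 111.32
    km_per_deg_lon = 111.32 * math.cos(mean_lat)
    pts = [(p[0] * km_per_deg_lon, p[1] * km_per_deg_lat) for p in ring]
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:]):  # ring is closed, so this covers every edge
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# The two polygons from the tests above
small = [[9.1875, 47.7055], [9.1975, 47.7055], [9.1975, 47.7115],
         [9.1875, 47.7115], [9.1875, 47.7055]]
large = [[9.0, 47.5], [9.5, 47.5], [9.5, 48.0], [9.0, 48.0], [9.0, 47.5]]
```

With these inputs the small polygon comes out around half a square kilometre and the large one in the thousands, matching the accepted/rejected split the tests expect.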

View File

@@ -0,0 +1,20 @@
"""
GeoEdu Service - Utility Functions
"""
from .geo_utils import (
lat_lon_to_tile,
tile_to_bounds,
calculate_distance,
transform_coordinates,
)
from .minio_client import MinioClient
from .license_checker import LicenseChecker
__all__ = [
"lat_lon_to_tile",
"tile_to_bounds",
"calculate_distance",
"transform_coordinates",
"MinioClient",
"LicenseChecker",
]

View File

@@ -0,0 +1,262 @@
"""
Geographic Utility Functions
Coordinate transformations, distance calculations, and tile math
"""
import math
from typing import Tuple
import pyproj
from shapely.geometry import shape
from shapely.ops import transform
# Common coordinate reference systems used throughout this module
WGS84 = pyproj.CRS("EPSG:4326")
WEB_MERCATOR = pyproj.CRS("EPSG:3857")
ETRS89_LAEA = pyproj.CRS("EPSG:3035") # Equal area for Europe
def lat_lon_to_tile(lat: float, lon: float, zoom: int) -> Tuple[int, int]:
"""
Convert latitude/longitude to tile coordinates (XYZ scheme).
Args:
lat: Latitude in degrees (-85.05 to 85.05)
lon: Longitude in degrees (-180 to 180)
zoom: Zoom level (0-22)
Returns:
Tuple of (x, y) tile coordinates
"""
n = 2 ** zoom
x = int((lon + 180.0) / 360.0 * n)
lat_rad = math.radians(lat)
y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
# Clamp to valid range
x = max(0, min(n - 1, x))
y = max(0, min(n - 1, y))
return x, y
def tile_to_bounds(z: int, x: int, y: int) -> Tuple[float, float, float, float]:
"""
Convert tile coordinates to bounding box.
Args:
z: Zoom level
x: Tile X coordinate
y: Tile Y coordinate
Returns:
Tuple of (west, south, east, north) in degrees
"""
n = 2 ** z
west = x / n * 360.0 - 180.0
east = (x + 1) / n * 360.0 - 180.0
north_rad = math.atan(math.sinh(math.pi * (1 - 2 * y / n)))
south_rad = math.atan(math.sinh(math.pi * (1 - 2 * (y + 1) / n)))
north = math.degrees(north_rad)
south = math.degrees(south_rad)
return west, south, east, north
def tile_to_center(z: int, x: int, y: int) -> Tuple[float, float]:
"""
Get the center point of a tile.
Returns:
Tuple of (longitude, latitude) in degrees
"""
west, south, east, north = tile_to_bounds(z, x, y)
return (west + east) / 2, (south + north) / 2
def calculate_distance(
lat1: float, lon1: float, lat2: float, lon2: float
) -> float:
"""
Calculate the distance between two points using the Haversine formula.
Args:
lat1, lon1: First point coordinates in degrees
lat2, lon2: Second point coordinates in degrees
Returns:
Distance in meters
"""
R = 6371000 # Earth's radius in meters
lat1_rad = math.radians(lat1)
lat2_rad = math.radians(lat2)
delta_lat = math.radians(lat2 - lat1)
delta_lon = math.radians(lon2 - lon1)
a = (
math.sin(delta_lat / 2) ** 2
+ math.cos(lat1_rad) * math.cos(lat2_rad) * math.sin(delta_lon / 2) ** 2
)
c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
return R * c
def transform_coordinates(
geometry,
from_crs: str = "EPSG:4326",
to_crs: str = "EPSG:3857",
):
"""
Transform a Shapely geometry between coordinate reference systems.
Args:
geometry: Shapely geometry object
from_crs: Source CRS (default WGS84)
to_crs: Target CRS (default Web Mercator)
Returns:
Transformed geometry
"""
transformer = pyproj.Transformer.from_crs(
from_crs,
to_crs,
always_xy=True,
)
return transform(transformer.transform, geometry)
def calculate_area_km2(geojson: dict) -> float:
"""
Calculate the area of a GeoJSON polygon in square kilometers.
Uses ETRS89-LAEA projection for accurate area calculation in Europe.
Args:
geojson: GeoJSON geometry dict
Returns:
Area in square kilometers
"""
geom = shape(geojson)
geom_projected = transform_coordinates(geom, "EPSG:4326", "EPSG:3035")
return geom_projected.area / 1_000_000
def is_within_bounds(
point: Tuple[float, float],
bounds: Tuple[float, float, float, float],
) -> bool:
"""
Check if a point is within a bounding box.
Args:
point: (longitude, latitude) tuple
bounds: (west, south, east, north) tuple
Returns:
True if point is within bounds
"""
lon, lat = point
west, south, east, north = bounds
return west <= lon <= east and south <= lat <= north
def get_germany_bounds() -> Tuple[float, float, float, float]:
"""Get the bounding box of Germany."""
return (5.87, 47.27, 15.04, 55.06)
def meters_per_pixel(lat: float, zoom: int) -> float:
"""
Calculate the ground resolution at a given latitude and zoom level.
Args:
lat: Latitude in degrees
zoom: Zoom level
Returns:
Meters per pixel at that location and zoom
"""
# Earth's circumference at equator in meters
C = 40075016.686
# Resolution at equator for this zoom level
resolution_equator = C / (256 * (2 ** zoom))
# Adjust for latitude (Mercator projection)
return resolution_equator * math.cos(math.radians(lat))
def simplify_polygon(geojson: dict, tolerance: float = 0.0001) -> dict:
"""
Simplify a polygon geometry to reduce complexity.
Args:
geojson: GeoJSON geometry dict
tolerance: Simplification tolerance in degrees
Returns:
Simplified GeoJSON geometry
"""
from shapely.geometry import mapping
geom = shape(geojson)
simplified = geom.simplify(tolerance, preserve_topology=True)
return mapping(simplified)
def buffer_polygon(geojson: dict, distance_meters: float) -> dict:
"""
Buffer a polygon by a distance in meters.
Args:
geojson: GeoJSON geometry dict
distance_meters: Buffer distance in meters
Returns:
Buffered GeoJSON geometry
"""
from shapely.geometry import mapping
geom = shape(geojson)
# Transform to metric CRS, buffer, transform back
geom_metric = transform_coordinates(geom, "EPSG:4326", "EPSG:3035")
buffered = geom_metric.buffer(distance_meters)
geom_wgs84 = transform_coordinates(buffered, "EPSG:3035", "EPSG:4326")
return mapping(geom_wgs84)
def get_tiles_for_bounds(
bounds: Tuple[float, float, float, float],
zoom: int,
) -> list[Tuple[int, int]]:
"""
Get all tile coordinates that cover a bounding box.
Args:
bounds: (west, south, east, north) in degrees
zoom: Zoom level
Returns:
List of (x, y) tile coordinates
"""
west, south, east, north = bounds
# Get corner tiles
x_min, y_max = lat_lon_to_tile(south, west, zoom)
x_max, y_min = lat_lon_to_tile(north, east, zoom)
# Generate all tiles in range
tiles = []
for x in range(x_min, x_max + 1):
for y in range(y_min, y_max + 1):
tiles.append((x, y))
return tiles
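
As a sanity check on the XYZ tile math above, the same formulas can be restated standalone and verified round-trip: a point must fall inside the bounds of the tile it maps to. The formulas below are copied from `lat_lon_to_tile` and `tile_to_bounds`:

```python
import math

def lat_lon_to_tile(lat, lon, zoom):
    # Same XYZ scheme as above: x from longitude, y from the Mercator latitude term
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return max(0, min(n - 1, x)), max(0, min(n - 1, y))

def tile_to_bounds(z, x, y):
    n = 2 ** z
    west = x / n * 360.0 - 180.0
    east = (x + 1) / n * 360.0 - 180.0
    north = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * y / n))))
    south = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * (y + 1) / n))))
    return west, south, east, north

# Berlin at zoom 10 should land in a tile whose bounds contain it
x, y = lat_lon_to_tile(52.52, 13.405, 10)
west, south, east, north = tile_to_bounds(10, x, y)
```

Note that tile `10/550/335` from the license-checker tests is exactly what the Lake Constance area resolves to at zoom 10, so these formulas agree with the fixtures used elsewhere in this commit.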

View File

@@ -0,0 +1,223 @@
"""
License Checker Utility
Validates data source licenses and generates attribution
"""
from typing import Optional
from enum import Enum
import structlog
logger = structlog.get_logger(__name__)
class LicenseType(Enum):
"""Supported data source license types."""
ODBL = "odbl" # OpenStreetMap
COPERNICUS = "copernicus" # Copernicus DEM
CC_BY = "cc-by" # Creative Commons Attribution
CC_BY_SA = "cc-by-sa" # Creative Commons Attribution-ShareAlike
CC0 = "cc0" # Public Domain
PROPRIETARY = "proprietary" # Not allowed
class DataSource(Enum):
"""Known data sources."""
OPENSTREETMAP = "openstreetmap"
COPERNICUS_DEM = "copernicus_dem"
OPENAERIAL = "openaerial"
WIKIMEDIA = "wikimedia"
GOOGLE = "google" # FORBIDDEN
BING = "bing" # FORBIDDEN
APPLE = "apple" # FORBIDDEN
HERE = "here" # FORBIDDEN
# License information for allowed sources
ALLOWED_SOURCES = {
DataSource.OPENSTREETMAP: {
"license": LicenseType.ODBL,
"attribution": "© OpenStreetMap contributors",
"url": "https://www.openstreetmap.org/copyright",
"commercial": True,
"derivative_allowed": True,
},
DataSource.COPERNICUS_DEM: {
"license": LicenseType.COPERNICUS,
"attribution": "© Copernicus Service Information",
"url": "https://spacedata.copernicus.eu/",
"commercial": True,
"derivative_allowed": True,
},
DataSource.OPENAERIAL: {
"license": LicenseType.CC_BY,
"attribution": "© OpenAerialMap contributors",
"url": "https://openaerialmap.org/",
"commercial": True,
"derivative_allowed": True,
},
DataSource.WIKIMEDIA: {
"license": LicenseType.CC_BY_SA,
"attribution": "Wikimedia Commons",
"url": "https://commons.wikimedia.org/",
"commercial": True,
"derivative_allowed": True,
},
}
# Forbidden sources
FORBIDDEN_SOURCES = {
DataSource.GOOGLE: "Google Maps ToS prohibit derivatives and offline use",
DataSource.BING: "Bing Maps has restrictive licensing",
DataSource.APPLE: "Apple Maps prohibits commercial use",
DataSource.HERE: "HERE requires paid licensing",
}
class LicenseChecker:
"""
Utility for validating data source licenses and generating attributions.
Ensures DSGVO/GDPR compliance and proper licensing for educational use.
"""
@staticmethod
def is_source_allowed(source: DataSource) -> bool:
"""Check if a data source is allowed for use."""
return source in ALLOWED_SOURCES
@staticmethod
def get_forbidden_reason(source: DataSource) -> Optional[str]:
"""Get the reason why a source is forbidden."""
return FORBIDDEN_SOURCES.get(source)
@staticmethod
def validate_url(url: str) -> tuple[bool, str]:
"""
Validate if a URL is from an allowed source.
Returns:
Tuple of (is_allowed, message)
"""
url_lower = url.lower()
# Check for forbidden sources
forbidden_patterns = {
"google": DataSource.GOOGLE,
"googleapis": DataSource.GOOGLE,
"gstatic": DataSource.GOOGLE,
"bing.com": DataSource.BING,
"virtualearth": DataSource.BING,
"apple.com/maps": DataSource.APPLE,
"here.com": DataSource.HERE,
}
for pattern, source in forbidden_patterns.items():
if pattern in url_lower:
reason = FORBIDDEN_SOURCES.get(source, "Not allowed")
return False, f"FORBIDDEN: {source.value} - {reason}"
# Check for allowed sources
allowed_patterns = {
"openstreetmap": DataSource.OPENSTREETMAP,
"tile.osm": DataSource.OPENSTREETMAP,
"copernicus": DataSource.COPERNICUS_DEM,
"openaerialmap": DataSource.OPENAERIAL,
"wikimedia": DataSource.WIKIMEDIA,
}
for pattern, source in allowed_patterns.items():
if pattern in url_lower:
info = ALLOWED_SOURCES[source]
return True, f"ALLOWED: {source.value} ({info['license'].value})"
# Unknown source - warn but allow with custom attribution
return True, "UNKNOWN: Verify license manually"
@staticmethod
def get_attribution_for_sources(
sources: list[DataSource],
) -> list[dict]:
"""
Get attribution information for a list of data sources.
Args:
sources: List of data sources used
Returns:
List of attribution dictionaries
"""
attributions = []
for source in sources:
if source in ALLOWED_SOURCES:
info = ALLOWED_SOURCES[source]
attributions.append({
"name": source.value.replace("_", " ").title(),
"license": info["license"].value.upper(),
"attribution": info["attribution"],
"url": info["url"],
"required": True,
})
return attributions
@staticmethod
def generate_attribution_html(sources: list[DataSource]) -> str:
"""
Generate HTML attribution footer for web display.
Args:
sources: List of data sources used
Returns:
HTML string with attribution
"""
attributions = LicenseChecker.get_attribution_for_sources(sources)
if not attributions:
return ""
parts = []
for attr in attributions:
parts.append(
f'<a href="{attr["url"]}" target="_blank" rel="noopener">'
f'{attr["attribution"]}</a>'
)
return " | ".join(parts)
@staticmethod
def generate_attribution_text(sources: list[DataSource]) -> str:
"""
Generate plain text attribution.
Args:
sources: List of data sources used
Returns:
Plain text attribution string
"""
attributions = LicenseChecker.get_attribution_for_sources(sources)
if not attributions:
return ""
return " | ".join(attr["attribution"] for attr in attributions)
@staticmethod
def check_commercial_use(sources: list[DataSource]) -> tuple[bool, list[str]]:
"""
Check if all sources allow commercial use.
Returns:
Tuple of (all_allowed, list_of_issues)
"""
issues = []
for source in sources:
if source in FORBIDDEN_SOURCES:
issues.append(f"{source.value}: {FORBIDDEN_SOURCES[source]}")
elif source in ALLOWED_SOURCES:
if not ALLOWED_SOURCES[source]["commercial"]:
issues.append(f"{source.value}: Commercial use not allowed")
return len(issues) == 0, issues
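
The URL screening in `validate_url` above is a pattern-table lookup with a permissive fallback for unknown hosts. A minimal self-contained sketch of that logic, with the tables trimmed to two entries each (a simplified reimplementation for illustration, not the service's module):

```python
# Trimmed pattern tables; the real checker covers more hosts
FORBIDDEN = {
    "googleapis": "Google Maps ToS prohibit derivatives and offline use",
    "virtualearth": "Bing Maps has restrictive licensing",
}
ALLOWED = {
    "openstreetmap": "ODbL",
    "copernicus": "Copernicus licence",
}

def screen_url(url: str):
    """Return (is_allowed, message); unknown hosts pass with a warning,
    matching the permissive fallback in validate_url above."""
    url = url.lower()
    for pattern, reason in FORBIDDEN.items():  # forbidden patterns win first
        if pattern in url:
            return False, f"FORBIDDEN: {reason}"
    for pattern, licence in ALLOWED.items():
        if pattern in url:
            return True, f"ALLOWED: {licence}"
    return True, "UNKNOWN: verify licence manually"

osm = screen_url("https://tile.openstreetmap.org/10/550/335.png")
goog = screen_url("https://maps.googleapis.com/maps/api/staticmap")
```

Checking forbidden patterns before allowed ones matters: a URL matching both (e.g. a proxy path embedding a forbidden host) should be rejected rather than waved through.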

View File

@@ -0,0 +1,237 @@
"""
MinIO Client Utility
S3-compatible storage operations for AOI bundles
"""
from typing import Optional
import structlog
from minio import Minio
from minio.error import S3Error
from config import settings
logger = structlog.get_logger(__name__)
class MinioClient:
"""
Client for MinIO S3-compatible storage.
Used for storing generated AOI bundles and assets.
"""
def __init__(self):
self.endpoint = settings.minio_endpoint
self.access_key = settings.minio_access_key
self.secret_key = settings.minio_secret_key
self.bucket = settings.minio_bucket
self.secure = settings.minio_secure
self._client: Optional[Minio] = None
@property
def client(self) -> Minio:
"""Get or create MinIO client instance."""
if self._client is None:
self._client = Minio(
self.endpoint,
access_key=self.access_key,
secret_key=self.secret_key,
secure=self.secure,
)
self._ensure_bucket_exists()
return self._client
def _ensure_bucket_exists(self):
"""Create the bucket if it doesn't exist."""
try:
if not self._client.bucket_exists(self.bucket):
self._client.make_bucket(self.bucket)
logger.info("Created MinIO bucket", bucket=self.bucket)
except S3Error as e:
logger.error("Error creating bucket", error=str(e))
async def upload_file(
self,
local_path: str,
object_name: str,
content_type: str = "application/octet-stream",
) -> Optional[str]:
"""
Upload a file to MinIO.
Args:
local_path: Path to local file
object_name: Name in MinIO (can include path)
content_type: MIME type of the file
Returns:
Object URL or None on failure
"""
try:
self.client.fput_object(
self.bucket,
object_name,
local_path,
content_type=content_type,
)
logger.info("Uploaded file to MinIO", object_name=object_name)
return f"{self.endpoint}/{self.bucket}/{object_name}"
except S3Error as e:
logger.error("Error uploading file", error=str(e))
return None
async def upload_bytes(
self,
data: bytes,
object_name: str,
content_type: str = "application/octet-stream",
) -> Optional[str]:
"""
Upload bytes to MinIO.
Args:
data: Bytes to upload
object_name: Name in MinIO
content_type: MIME type
Returns:
Object URL or None on failure
"""
from io import BytesIO
try:
stream = BytesIO(data)
self.client.put_object(
self.bucket,
object_name,
stream,
length=len(data),
content_type=content_type,
)
logger.info("Uploaded bytes to MinIO", object_name=object_name, size=len(data))
return f"{self.endpoint}/{self.bucket}/{object_name}"
except S3Error as e:
logger.error("Error uploading bytes", error=str(e))
return None
async def download_file(
self,
object_name: str,
local_path: str,
) -> bool:
"""
Download a file from MinIO.
Args:
object_name: Name in MinIO
local_path: Destination path
Returns:
True on success
"""
try:
self.client.fget_object(self.bucket, object_name, local_path)
logger.info("Downloaded file from MinIO", object_name=object_name)
return True
except S3Error as e:
logger.error("Error downloading file", error=str(e))
return False
async def get_bytes(self, object_name: str) -> Optional[bytes]:
"""
Get object content as bytes.
Args:
object_name: Name in MinIO
Returns:
File content or None
"""
try:
response = self.client.get_object(self.bucket, object_name)
data = response.read()
response.close()
response.release_conn()
return data
except S3Error as e:
logger.error("Error getting bytes", error=str(e))
return None
async def delete_object(self, object_name: str) -> bool:
"""
Delete an object from MinIO.
Args:
object_name: Name in MinIO
Returns:
True on success
"""
try:
self.client.remove_object(self.bucket, object_name)
logger.info("Deleted object from MinIO", object_name=object_name)
return True
except S3Error as e:
logger.error("Error deleting object", error=str(e))
return False
async def list_objects(self, prefix: str = "") -> list[str]:
"""
List objects in the bucket.
Args:
prefix: Filter by prefix
Returns:
List of object names
"""
try:
objects = self.client.list_objects(self.bucket, prefix=prefix)
return [obj.object_name for obj in objects]
except S3Error as e:
logger.error("Error listing objects", error=str(e))
return []
async def get_presigned_url(
self,
object_name: str,
expiry_hours: int = 24,
) -> Optional[str]:
"""
Get a presigned URL for downloading an object.
Args:
object_name: Name in MinIO
expiry_hours: URL expiry time in hours
Returns:
Presigned URL or None
"""
from datetime import timedelta
try:
url = self.client.presigned_get_object(
self.bucket,
object_name,
expires=timedelta(hours=expiry_hours),
)
return url
except S3Error as e:
logger.error("Error generating presigned URL", error=str(e))
return None
async def object_exists(self, object_name: str) -> bool:
"""Check if an object exists."""
try:
self.client.stat_object(self.bucket, object_name)
return True
except S3Error:
return False
async def get_object_size(self, object_name: str) -> Optional[int]:
"""Get the size of an object in bytes."""
try:
stat = self.client.stat_object(self.bucket, object_name)
return stat.size
except S3Error:
return None