Compare commits
1 commit
fix/multip...fix/cascad
| Author | SHA1 | Date |
|---|---|---|
| | f8eb4ea84d | |
@@ -70,7 +70,7 @@ jobs:
 
       # Tests (reuses compilation artifacts from clippy)
       - name: Tests (core + agent)
-        run: cargo test -p compliance-core -p compliance-agent --lib
+        run: cargo test -p compliance-core -p compliance-agent
      - name: Tests (dashboard server)
        run: cargo test -p compliance-dashboard --features server --no-default-features
      - name: Tests (dashboard web)
@@ -1,52 +0,0 @@
-name: Nightly E2E Tests
-
-on:
-  schedule:
-    - cron: '0 3 * * *' # 3 AM UTC daily
-  workflow_dispatch: # Allow manual trigger
-
-env:
-  CARGO_TERM_COLOR: always
-  RUSTFLAGS: "-D warnings"
-  RUSTC_WRAPPER: /usr/local/bin/sccache
-  SCCACHE_DIR: /tmp/sccache
-  TEST_MONGODB_URI: "mongodb://root:example@mongo:27017/?authSource=admin"
-
-concurrency:
-  group: nightly-e2e
-  cancel-in-progress: true
-
-jobs:
-  e2e:
-    name: E2E Tests
-    runs-on: docker
-    container:
-      image: rust:1.94-bookworm
-    services:
-      mongo:
-        image: mongo:7
-        env:
-          MONGO_INITDB_ROOT_USERNAME: root
-          MONGO_INITDB_ROOT_PASSWORD: example
-    steps:
-      - name: Checkout
-        run: |
-          git init
-          git remote add origin "${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git"
-          git fetch --depth=1 origin "${GITHUB_SHA:-refs/heads/main}"
-          git checkout FETCH_HEAD
-
-      - name: Install sccache
-        run: |
-          curl -fsSL https://github.com/mozilla/sccache/releases/download/v0.9.1/sccache-v0.9.1-x86_64-unknown-linux-musl.tar.gz \
-            | tar xz --strip-components=1 -C /usr/local/bin/ sccache-v0.9.1-x86_64-unknown-linux-musl/sccache
-          chmod +x /usr/local/bin/sccache
-        env:
-          RUSTC_WRAPPER: ""
-
-      - name: Run E2E tests
-        run: cargo test -p compliance-agent --test e2e -- --test-threads=4
-
-      - name: Show sccache stats
-        run: sccache --show-stats
-        if: always()
@@ -33,11 +33,6 @@ RUN pip3 install --break-system-packages ruff
 
 COPY --from=builder /app/target/release/compliance-agent /usr/local/bin/compliance-agent
 
-# Copy documentation for the help chat assistant
-COPY --from=builder /app/README.md /app/README.md
-COPY --from=builder /app/docs /app/docs
-ENV HELP_DOCS_PATH=/app
-
 # Ensure SSH key directory exists
 RUN mkdir -p /data/compliance-scanner/ssh
 
@@ -1,6 +1,6 @@
 FROM rust:1.94-bookworm AS builder
 
-RUN cargo install dioxus-cli --version 0.7.3 --locked
+RUN cargo install dioxus-cli --version 0.7.3
 
 ARG DOCS_URL=/docs
 
120 README.md
@@ -28,9 +28,9 @@
 
 ## About
 
-Compliance Scanner is an autonomous agent that continuously monitors git repositories for security vulnerabilities, GDPR/OAuth compliance patterns, and dependency risks. It creates issues in external trackers (GitHub/GitLab/Jira/Gitea) with evidence and remediation suggestions, reviews pull requests with multi-pass LLM analysis, runs autonomous penetration tests, and exposes a Dioxus-based dashboard for visualization.
+Compliance Scanner is an autonomous agent that continuously monitors git repositories for security vulnerabilities, GDPR/OAuth compliance patterns, and dependency risks. It creates issues in external trackers (GitHub/GitLab/Jira) with evidence and remediation suggestions, reviews pull requests, and exposes a Dioxus-based dashboard for visualization.
 
-> **How it works:** The agent runs as a lazy daemon -- it only scans when new commits are detected, triggered by cron schedules or webhooks. LLM-powered triage filters out false positives and generates actionable remediation with multi-language awareness.
+> **How it works:** The agent runs as a lazy daemon -- it only scans when new commits are detected, triggered by cron schedules or webhooks. LLM-powered triage filters out false positives and generates actionable remediation.
 
 ## Features
 
@@ -41,38 +41,31 @@ Compliance Scanner is an autonomous agent that continuously monitors git reposit
 | **CVE Monitoring** | OSV.dev batch queries, NVD CVSS enrichment, SearXNG context |
 | **GDPR Patterns** | Detect PII logging, missing consent, hardcoded retention, missing deletion |
 | **OAuth Patterns** | Detect implicit grant, missing PKCE, token in localStorage, token in URLs |
-| **LLM Triage** | Multi-language-aware confidence scoring (Rust, Python, Go, Java, Ruby, PHP, C++) |
-| **Issue Creation** | Auto-create issues in GitHub, GitLab, Jira, or Gitea with dedup via fingerprints |
-| **PR Reviews** | Multi-pass security review (logic, security, convention, complexity) with dedup |
-| **DAST Scanning** | Black-box security testing with endpoint discovery and parameter fuzzing |
-| **AI Pentesting** | Autonomous LLM-orchestrated penetration testing with encrypted reports |
-| **Code Graph** | Interactive code knowledge graph with impact analysis |
-| **AI Chat (RAG)** | Natural language Q&A grounded in repository source code |
-| **Help Assistant** | Documentation-grounded help chat accessible from every dashboard page |
-| **MCP Server** | Expose live security data to Claude, Cursor, and other AI tools |
-| **Dashboard** | Fullstack Dioxus UI with findings, SBOM, issues, DAST, pentest, and graph |
-| **Webhooks** | GitHub, GitLab, and Gitea webhook receivers for push/PR events |
-| **Finding Dedup** | SHA-256 fingerprint dedup for SAST, CWE-based dedup for DAST findings |
+| **LLM Triage** | Confidence scoring via LiteLLM to filter false positives |
+| **Issue Creation** | Auto-create issues in GitHub, GitLab, or Jira with code evidence |
+| **PR Reviews** | Post security review comments on pull requests |
+| **Dashboard** | Fullstack Dioxus UI with findings, SBOM, issues, and statistics |
+| **Webhooks** | GitHub (HMAC-SHA256) and GitLab webhook receivers for push/PR events |
 
 ## Architecture
 
 ```
-┌─────────────────────────────────────────────────────────────────────────┐
-│                             Cargo Workspace                             │
-├──────────────┬──────────────────┬──────────────┬──────────┬─────────────┤
-│ compliance-  │ compliance-      │ compliance-  │ complian-│ compliance- │
-│ core (lib)   │ agent (bin)      │ dashboard    │ ce-graph │ mcp (bin)   │
-│              │                  │ (bin)        │ (lib)    │             │
-│ Models       │ Scan Pipeline    │ Dioxus 0.7   │ Tree-    │ MCP Server  │
-│ Traits       │ LLM Client       │ Fullstack UI │ sitter   │ Live data   │
-│ Config       │ Issue Trackers   │ Help Chat    │ Graph    │ for AI      │
-│ Errors       │ Pentest Engine   │ Server Fns   │ Embedds  │ tools       │
-│              │ DAST Tools       │              │ RAG      │             │
-│              │ REST API         │              │          │             │
-│              │ Webhooks         │              │          │             │
-└──────────────┴──────────────────┴──────────────┴──────────┴─────────────┘
+┌─────────────────────────────────────────────────────────────┐
+│                       Cargo Workspace                       │
+├──────────────┬──────────────────┬───────────────────────────┤
+│ compliance-  │ compliance-      │ compliance-               │
+│ core         │ agent            │ dashboard                 │
+│ (lib)        │ (bin)            │ (bin, Dioxus 0.7.3)       │
+│              │                  │                           │
+│ Models       │ Scan Pipeline    │ Fullstack Web UI          │
+│ Traits       │ LLM Client       │ Server Functions          │
+│ Config       │ Issue Trackers   │ Charts + Tables           │
+│ Errors       │ Scheduler        │ Settings Page             │
+│              │ REST API         │                           │
+│              │ Webhooks         │                           │
+└──────────────┴──────────────────┴───────────────────────────┘
                                │
                          MongoDB (shared)
 ```
 
 ## Scan Pipeline (7 Stages)
 
@@ -91,16 +84,11 @@ Compliance Scanner is an autonomous agent that continuously monitors git reposit
 |-------|-----------|
 | Shared Library | `compliance-core` -- models, traits, config |
 | Agent | Axum REST API, git2, tokio-cron-scheduler, Semgrep, Syft |
-| Dashboard | Dioxus 0.7.3 fullstack, Tailwind CSS 4 |
-| Code Graph | `compliance-graph` -- tree-sitter parsing, embeddings, RAG |
-| MCP Server | `compliance-mcp` -- Model Context Protocol for AI tools |
-| DAST | `compliance-dast` -- dynamic application security testing |
+| Dashboard | Dioxus 0.7.3 fullstack, Tailwind CSS |
 | Database | MongoDB with typed collections |
-| LLM | LiteLLM (OpenAI-compatible API for chat, triage, embeddings) |
-| Issue Trackers | GitHub (octocrab), GitLab (REST v4), Jira (REST v3), Gitea |
+| LLM | LiteLLM (OpenAI-compatible API) |
+| Issue Trackers | GitHub (octocrab), GitLab (REST v4), Jira (REST v3) |
 | CVE Sources | OSV.dev, NVD, SearXNG |
-| Auth | Keycloak (OAuth2/PKCE, SSO) |
-| Browser Automation | Chromium (headless, for pentesting and PDF generation) |
 
 ## Getting Started
 
@@ -163,35 +151,20 @@ The agent exposes a REST API on port 3001:
 | `GET` | `/api/v1/sbom` | List dependencies |
 | `GET` | `/api/v1/issues` | List cross-tracker issues |
 | `GET` | `/api/v1/scan-runs` | Scan execution history |
-| `GET` | `/api/v1/graph/:repo_id` | Code knowledge graph |
-| `POST` | `/api/v1/graph/:repo_id/build` | Trigger graph build |
-| `GET` | `/api/v1/dast/targets` | List DAST targets |
-| `POST` | `/api/v1/dast/targets` | Add DAST target |
-| `GET` | `/api/v1/dast/findings` | List DAST findings |
-| `POST` | `/api/v1/chat/:repo_id` | RAG-powered code chat |
-| `POST` | `/api/v1/help/chat` | Documentation-grounded help chat |
-| `POST` | `/api/v1/pentest/sessions` | Create pentest session |
-| `POST` | `/api/v1/pentest/sessions/:id/export` | Export encrypted pentest report |
 | `POST` | `/webhook/github` | GitHub webhook (HMAC-SHA256) |
 | `POST` | `/webhook/gitlab` | GitLab webhook (token verify) |
-| `POST` | `/webhook/gitea` | Gitea webhook |
 
 ## Dashboard Pages
 
 | Page | Description |
 |------|-------------|
-| **Overview** | Stat cards, severity distribution, AI chat cards, MCP status |
-| **Repositories** | Add/manage tracked repos, trigger scans, webhook config |
-| **Findings** | Filterable table by severity, type, status, scanner |
+| **Overview** | Stat cards, severity distribution chart |
+| **Repositories** | Add/manage tracked repos, trigger scans |
+| **Findings** | Filterable table by severity, type, status |
 | **Finding Detail** | Code evidence, remediation, suggested fix, linked issue |
-| **SBOM** | Dependency inventory with vulnerability badges, license summary |
-| **Issues** | Cross-tracker view (GitHub + GitLab + Jira + Gitea) |
-| **Code Graph** | Interactive architecture visualization, impact analysis |
-| **AI Chat** | RAG-powered Q&A about repository code |
-| **DAST** | Dynamic scanning targets, findings, and scan history |
-| **Pentest** | AI-driven pentest sessions, attack chain visualization |
-| **MCP Servers** | Model Context Protocol server management |
-| **Help Chat** | Floating assistant (available on every page) for product Q&A |
+| **SBOM** | Dependency inventory with vulnerability badges |
+| **Issues** | Cross-tracker view (GitHub + GitLab + Jira) |
+| **Settings** | Configure LiteLLM, tracker tokens, SearXNG URL |
 
 ## Project Structure
 
@@ -200,24 +173,19 @@ compliance-scanner/
 ├── compliance-core/        Shared library (models, traits, config, errors)
 ├── compliance-agent/       Agent daemon (pipeline, LLM, trackers, API, webhooks)
 │   └── src/
-│       ├── pipeline/       7-stage scan pipeline, dedup, PR reviews, code review
-│       ├── llm/            LiteLLM client, triage, descriptions, fixes, review prompts
-│       ├── trackers/       GitHub, GitLab, Jira, Gitea integrations
-│       ├── pentest/        AI-driven pentest orchestrator, tools, reports
-│       ├── rag/            RAG pipeline, chunking, embedding
-│       ├── api/            REST API (Axum), help chat
-│       └── webhooks/       GitHub, GitLab, Gitea webhook receivers
+│       ├── pipeline/       7-stage scan pipeline
+│       ├── llm/            LiteLLM client, triage, descriptions, fixes, PR review
+│       ├── trackers/       GitHub, GitLab, Jira integrations
+│       ├── api/            REST API (Axum)
+│       └── webhooks/       GitHub + GitLab webhook receivers
 ├── compliance-dashboard/   Dioxus fullstack dashboard
 │   └── src/
-│       ├── components/     Reusable UI (sidebar, help chat, attack chain, etc.)
-│       ├── infrastructure/ Server functions, DB, config, auth
-│       └── pages/          Full page views (overview, DAST, pentest, graph, etc.)
-├── compliance-graph/       Code knowledge graph (tree-sitter, embeddings, RAG)
-├── compliance-dast/        Dynamic application security testing
-├── compliance-mcp/         Model Context Protocol server
-├── docs/                   VitePress documentation site
+│       ├── components/     Reusable UI components
+│       ├── infrastructure/ Server functions, DB, config
+│       └── pages/          Full page views
 ├── assets/                 Static assets (CSS, icons)
-└── styles/                 Tailwind input stylesheet
+├── styles/                 Tailwind input stylesheet
+└── bin/                    Dashboard binary entrypoint
 ```
 
 ## External Services
@@ -225,12 +193,10 @@ compliance-scanner/
 | Service | Purpose | Default URL |
 |---------|---------|-------------|
 | MongoDB | Persistence | `mongodb://localhost:27017` |
-| LiteLLM | LLM proxy (chat, triage, embeddings) | `http://localhost:4000` |
+| LiteLLM | LLM proxy for triage and generation | `http://localhost:4000` |
 | SearXNG | CVE context search | `http://localhost:8888` |
-| Keycloak | Authentication (OAuth2/PKCE, SSO) | `http://localhost:8080` |
 | Semgrep | SAST scanning | CLI tool |
 | Syft | SBOM generation | CLI tool |
-| Chromium | Headless browser (pentesting, PDF) | Managed via Docker |
 
 ---
 
@@ -25,7 +25,7 @@ uuid = { workspace = true }
 secrecy = { workspace = true }
 regex = { workspace = true }
 axum = "0.8"
-tower-http = { version = "0.6", features = ["cors", "trace", "set-header"] }
+tower-http = { version = "0.6", features = ["cors", "trace"] }
 git2 = "0.20"
 octocrab = "0.44"
 tokio-cron-scheduler = "0.13"
@@ -42,14 +42,3 @@ tokio-tungstenite = { version = "0.26", features = ["rustls-tls-webpki-roots"] }
 futures-core = "0.3"
 dashmap = { workspace = true }
 tokio-stream = { workspace = true }
-
-[dev-dependencies]
-compliance-core = { workspace = true, features = ["mongodb"] }
-reqwest = { workspace = true }
-serde_json = { workspace = true }
-tokio = { workspace = true }
-mongodb = { workspace = true }
-uuid = { workspace = true }
-secrecy = { workspace = true }
-axum = "0.8"
-tower-http = { version = "0.6", features = ["cors"] }
@@ -90,13 +90,10 @@ pub async fn chat(
     };
 
     let system_prompt = format!(
-        "You are a code assistant for this repository. Answer questions using the code context below.\n\n\
-        Rules:\n\
-        - Reference specific files, functions, and line numbers\n\
-        - Show code snippets when they help explain the answer\n\
-        - If the context is insufficient, say what's missing rather than guessing\n\
-        - Be concise — lead with the answer, then explain if needed\n\
-        - For security questions, note relevant CWEs and link to the finding if one exists\n\n\
+        "You are an expert code assistant for a software repository. \
+        Answer the user's question based on the code context below. \
+        Reference specific files and functions when relevant. \
+        If the context doesn't contain enough information, say so.\n\n\
         ## Code Context\n\n{code_context}"
     );
 
@@ -1,217 +0,0 @@
-use std::path::{Path, PathBuf};
-use std::sync::OnceLock;
-
-use axum::extract::Extension;
-use axum::http::StatusCode;
-use axum::Json;
-use serde::{Deserialize, Serialize};
-use walkdir::WalkDir;
-
-use super::dto::{AgentExt, ApiResponse};
-
-// ── DTOs ─────────────────────────────────────────────────────────────────────
-
-#[derive(Debug, Deserialize)]
-pub struct HelpChatMessage {
-    pub role: String,
-    pub content: String,
-}
-
-#[derive(Debug, Deserialize)]
-pub struct HelpChatRequest {
-    pub message: String,
-    #[serde(default)]
-    pub history: Vec<HelpChatMessage>,
-}
-
-#[derive(Debug, Serialize)]
-pub struct HelpChatResponse {
-    pub message: String,
-}
-
-// ── Doc cache ────────────────────────────────────────────────────────────────
-
-static DOC_CONTEXT: OnceLock<String> = OnceLock::new();
-
-/// Walk upward from `start` until we find a directory containing both
-/// `README.md` and a `docs/` subdirectory.
-fn find_project_root(start: &Path) -> Option<PathBuf> {
-    let mut current = start.to_path_buf();
-    loop {
-        if current.join("README.md").is_file() && current.join("docs").is_dir() {
-            return Some(current);
-        }
-        if !current.pop() {
-            return None;
-        }
-    }
-}
-
-/// Read README.md + all docs/**/*.md (excluding node_modules).
-fn load_docs(root: &Path) -> String {
-    let mut parts: Vec<String> = Vec::new();
-
-    // Root README first
-    if let Ok(content) = std::fs::read_to_string(root.join("README.md")) {
-        parts.push(format!("<!-- file: README.md -->\n{content}"));
-    }
-
-    // docs/**/*.md, skipping node_modules
-    for entry in WalkDir::new(root.join("docs"))
-        .follow_links(false)
-        .into_iter()
-        .filter_entry(|e| {
-            !e.path()
-                .components()
-                .any(|c| c.as_os_str() == "node_modules")
-        })
-        .filter_map(|e| e.ok())
-    {
-        let path = entry.path();
-        if !path.is_file() {
-            continue;
-        }
-        if path
-            .extension()
-            .and_then(|s| s.to_str())
-            .map(|s| !s.eq_ignore_ascii_case("md"))
-            .unwrap_or(true)
-        {
-            continue;
-        }
-
-        let rel = path.strip_prefix(root).unwrap_or(path);
-        if let Ok(content) = std::fs::read_to_string(path) {
-            parts.push(format!("<!-- file: {} -->\n{content}", rel.display()));
-        }
-    }
-
-    if parts.is_empty() {
-        tracing::warn!(
-            "help_chat: no documentation files found under {}",
-            root.display()
-        );
-    } else {
-        tracing::info!(
-            "help_chat: loaded {} documentation file(s) from {}",
-            parts.len(),
-            root.display()
-        );
-    }
-
-    parts.join("\n\n---\n\n")
-}
-
-/// Returns a reference to the cached doc context string, initialised on
-/// first call via `OnceLock`.
-///
-/// Discovery order:
-/// 1. `HELP_DOCS_PATH` env var (explicit override)
-/// 2. Walk up from the binary location
-/// 3. Current working directory
-/// 4. Common Docker paths (/app, /opt/compliance-scanner)
-fn doc_context() -> &'static str {
-    DOC_CONTEXT.get_or_init(|| {
-        // 1. Explicit env var
-        if let Ok(path) = std::env::var("HELP_DOCS_PATH") {
-            let p = PathBuf::from(&path);
-            if p.join("README.md").is_file() || p.join("docs").is_dir() {
-                tracing::info!("help_chat: loading docs from HELP_DOCS_PATH={path}");
-                return load_docs(&p);
-            }
-            tracing::warn!("help_chat: HELP_DOCS_PATH={path} has no README.md or docs/");
-        }
-
-        // 2. Walk up from binary location
-        let start = std::env::current_exe()
-            .ok()
-            .and_then(|p| p.parent().map(Path::to_path_buf))
-            .unwrap_or_else(|| PathBuf::from("."));
-
-        if let Some(root) = find_project_root(&start) {
-            return load_docs(&root);
-        }
-
-        // 3. Current working directory
-        if let Ok(cwd) = std::env::current_dir() {
-            if let Some(root) = find_project_root(&cwd) {
-                return load_docs(&root);
-            }
-            if cwd.join("README.md").is_file() {
-                return load_docs(&cwd);
-            }
-        }
-
-        // 4. Common Docker/deployment paths
-        for candidate in ["/app", "/opt/compliance-scanner", "/srv/compliance-scanner"] {
-            let p = PathBuf::from(candidate);
-            if p.join("README.md").is_file() || p.join("docs").is_dir() {
-                tracing::info!("help_chat: found docs at {candidate}");
-                return load_docs(&p);
-            }
-        }
-
-        tracing::error!(
-            "help_chat: could not locate project root; doc context will be empty. \
-             Set HELP_DOCS_PATH to the directory containing README.md and docs/"
-        );
-        String::new()
-    })
-}
-
-// ── Handler ──────────────────────────────────────────────────────────────────
-
-/// POST /api/v1/help/chat — Answer questions about the compliance-scanner
-/// using the project documentation as grounding context.
-#[tracing::instrument(skip_all)]
-pub async fn help_chat(
-    Extension(agent): AgentExt,
-    Json(req): Json<HelpChatRequest>,
-) -> Result<Json<ApiResponse<HelpChatResponse>>, StatusCode> {
-    let context = doc_context();
-
-    let system_prompt = if context.is_empty() {
-        "You are a helpful assistant for the Compliance Scanner project. \
-         Answer questions about how to use and configure it. \
-         No documentation was loaded at startup, so rely on your general knowledge."
-            .to_string()
-    } else {
-        format!(
-            "You are a helpful assistant for the Compliance Scanner project. \
-             Answer questions about how to use, configure, and understand it \
-             using the documentation below as your primary source of truth.\n\n\
-             Rules:\n\
-             - Prefer information from the provided docs over general knowledge\n\
-             - Quote or reference the relevant doc section when it helps\n\
-             - If the docs do not cover the topic, say so clearly\n\
-             - Be concise — lead with the answer, then explain if needed\n\
-             - Use markdown formatting for readability\n\n\
-             ## Project Documentation\n\n{context}"
-        )
-    };
-
-    let mut messages: Vec<(String, String)> = Vec::with_capacity(req.history.len() + 2);
-    messages.push(("system".to_string(), system_prompt));
-
-    for msg in &req.history {
-        messages.push((msg.role.clone(), msg.content.clone()));
-    }
-    messages.push(("user".to_string(), req.message));
-
-    let response_text = agent
-        .llm
-        .chat_with_messages(messages, Some(0.3))
-        .await
-        .map_err(|e| {
-            tracing::error!("LLM help chat failed: {e}");
-            StatusCode::INTERNAL_SERVER_ERROR
-        })?;
-
-    Ok(Json(ApiResponse {
-        data: HelpChatResponse {
-            message: response_text,
-        },
-        total: None,
-        page: None,
-    }))
-}
@@ -4,9 +4,7 @@ pub mod dto;
 pub mod findings;
 pub mod graph;
 pub mod health;
-pub mod help_chat;
 pub mod issues;
-pub mod notifications;
 pub mod pentest_handlers;
 pub use pentest_handlers as pentest;
 pub mod repos;
@@ -1,178 +0,0 @@
-use axum::extract::Extension;
-use axum::http::StatusCode;
-use axum::Json;
-use mongodb::bson::doc;
-use serde::Deserialize;
-
-use compliance_core::models::notification::CveNotification;
-
-use super::dto::{AgentExt, ApiResponse};
-
-/// GET /api/v1/notifications — List CVE notifications (newest first)
-#[tracing::instrument(skip_all)]
-pub async fn list_notifications(
-    Extension(agent): AgentExt,
-    axum::extract::Query(params): axum::extract::Query<NotificationFilter>,
-) -> Result<Json<ApiResponse<Vec<CveNotification>>>, StatusCode> {
-    let mut filter = doc! {};
-
-    // Filter by status (default: show new + read, exclude dismissed)
-    match params.status.as_deref() {
-        Some("all") => {}
-        Some(s) => {
-            filter.insert("status", s);
-        }
-        None => {
-            filter.insert("status", doc! { "$in": ["new", "read"] });
-        }
-    }
-
-    // Filter by severity
-    if let Some(ref sev) = params.severity {
-        filter.insert("severity", sev.as_str());
-    }
-
-    // Filter by repo
-    if let Some(ref repo_id) = params.repo_id {
-        filter.insert("repo_id", repo_id.as_str());
-    }
-
-    let page = params.page.unwrap_or(1).max(1);
-    let limit = params.limit.unwrap_or(50).min(200);
-    let skip = (page - 1) * limit as u64;
-
-    let total = agent
-        .db
-        .cve_notifications()
-        .count_documents(filter.clone())
-        .await
-        .unwrap_or(0);
-
-    let notifications: Vec<CveNotification> = match agent
-        .db
-        .cve_notifications()
-        .find(filter)
-        .sort(doc! { "created_at": -1 })
-        .skip(skip)
-        .limit(limit)
-        .await
-    {
-        Ok(cursor) => {
-            use futures_util::StreamExt;
-            let mut items = Vec::new();
-            let mut cursor = cursor;
-            while let Some(Ok(n)) = cursor.next().await {
-                items.push(n);
-            }
-            items
-        }
-        Err(e) => {
-            tracing::error!("Failed to list notifications: {e}");
-            return Err(StatusCode::INTERNAL_SERVER_ERROR);
-        }
-    };
-
-    Ok(Json(ApiResponse {
-        data: notifications,
-        total: Some(total),
-        page: Some(page),
-    }))
-}
-
-/// GET /api/v1/notifications/count — Count of unread notifications
-#[tracing::instrument(skip_all)]
-pub async fn notification_count(
-    Extension(agent): AgentExt,
-) -> Result<Json<serde_json::Value>, StatusCode> {
-    let count = agent
-        .db
-        .cve_notifications()
-        .count_documents(doc! { "status": "new" })
-        .await
-        .unwrap_or(0);
-
-    Ok(Json(serde_json::json!({ "count": count })))
-}
-
-/// PATCH /api/v1/notifications/:id/read — Mark a notification as read
-#[tracing::instrument(skip_all, fields(id = %id))]
-pub async fn mark_read(
-    Extension(agent): AgentExt,
-    axum::extract::Path(id): axum::extract::Path<String>,
-) -> Result<Json<serde_json::Value>, StatusCode> {
-    let oid = mongodb::bson::oid::ObjectId::parse_str(&id).map_err(|_| StatusCode::BAD_REQUEST)?;
-
-    let result = agent
-        .db
-        .cve_notifications()
-        .update_one(
-            doc! { "_id": oid },
-            doc! { "$set": {
-                "status": "read",
-                "read_at": mongodb::bson::DateTime::now(),
-            }},
-        )
-        .await
-        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
-
-    if result.matched_count == 0 {
-        return Err(StatusCode::NOT_FOUND);
-    }
-    Ok(Json(serde_json::json!({ "status": "read" })))
-}
-
-/// PATCH /api/v1/notifications/:id/dismiss — Dismiss a notification
-#[tracing::instrument(skip_all, fields(id = %id))]
-pub async fn dismiss_notification(
-    Extension(agent): AgentExt,
-    axum::extract::Path(id): axum::extract::Path<String>,
-) -> Result<Json<serde_json::Value>, StatusCode> {
-    let oid = mongodb::bson::oid::ObjectId::parse_str(&id).map_err(|_| StatusCode::BAD_REQUEST)?;
-
-    let result = agent
-        .db
-        .cve_notifications()
-        .update_one(
-            doc! { "_id": oid },
-            doc! { "$set": { "status": "dismissed" } },
-        )
-        .await
-        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
-
-    if result.matched_count == 0 {
-        return Err(StatusCode::NOT_FOUND);
-    }
-    Ok(Json(serde_json::json!({ "status": "dismissed" })))
-}
-
-/// POST /api/v1/notifications/read-all — Mark all new notifications as read
-#[tracing::instrument(skip_all)]
-pub async fn mark_all_read(
-    Extension(agent): AgentExt,
-) -> Result<Json<serde_json::Value>, StatusCode> {
-    let result = agent
-        .db
-        .cve_notifications()
-        .update_many(
-            doc! { "status": "new" },
-            doc! { "$set": {
-                "status": "read",
-                "read_at": mongodb::bson::DateTime::now(),
-            }},
-        )
-        .await
-        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
-
-    Ok(Json(
-        serde_json::json!({ "updated": result.modified_count }),
-    ))
-}
-
-#[derive(Debug, Deserialize)]
-pub struct NotificationFilter {
-    pub status: Option<String>,
-    pub severity: Option<String>,
-    pub repo_id: Option<String>,
|
|
||||||
pub page: Option<u64>,
|
|
||||||
pub limit: Option<i64>,
|
|
||||||
}
|
|
||||||
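The removed handlers above share one status-mapping convention: a malformed id is the client's fault (400 BAD_REQUEST), while a well-formed id that matches no document (`matched_count == 0`) is 404 NOT_FOUND. A minimal std-only sketch of that mapping — `is_object_id` is a hypothetical stand-in for `mongodb::bson::oid::ObjectId::parse_str`, which accepts a 24-character hex string:

```rust
// Hypothetical stand-in for ObjectId::parse_str: a BSON ObjectId is 24 hex chars.
fn is_object_id(id: &str) -> bool {
    id.len() == 24 && id.chars().all(|c| c.is_ascii_hexdigit())
}

// Mirrors the handlers' error mapping: bad id -> 400, no match -> 404, else 200.
fn status_for(id: &str, matched_count: u64) -> u16 {
    if !is_object_id(id) {
        400 // BAD_REQUEST: not a valid ObjectId
    } else if matched_count == 0 {
        404 // NOT_FOUND: no document matched the update filter
    } else {
        200
    }
}

fn main() {
    assert_eq!(status_for("not-hex", 0), 400);
    assert_eq!(status_for("0123456789abcdef01234567", 0), 404);
    assert_eq!(status_for("0123456789abcdef01234567", 1), 200);
}
```

Note the handlers check `matched_count`, not `modified_count`, so marking an already-read notification as read still returns 200 rather than 404.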
@@ -99,29 +99,6 @@ pub fn build_router() -> Router {
             "/api/v1/chat/{repo_id}/status",
             get(handlers::chat::embedding_status),
         )
-        // Help chat (documentation-grounded Q&A)
-        .route("/api/v1/help/chat", post(handlers::help_chat::help_chat))
-        // CVE notification endpoints
-        .route(
-            "/api/v1/notifications",
-            get(handlers::notifications::list_notifications),
-        )
-        .route(
-            "/api/v1/notifications/count",
-            get(handlers::notifications::notification_count),
-        )
-        .route(
-            "/api/v1/notifications/read-all",
-            post(handlers::notifications::mark_all_read),
-        )
-        .route(
-            "/api/v1/notifications/{id}/read",
-            patch(handlers::notifications::mark_read),
-        )
-        .route(
-            "/api/v1/notifications/{id}/dismiss",
-            patch(handlers::notifications::dismiss_notification),
-        )
         // Pentest API endpoints
         .route(
             "/api/v1/pentest/lookup-repo",
@@ -1,10 +1,8 @@
 use std::sync::Arc;
 
-use axum::http::HeaderValue;
 use axum::{middleware, Extension};
 use tokio::sync::RwLock;
 use tower_http::cors::CorsLayer;
-use tower_http::set_header::SetResponseHeaderLayer;
 use tower_http::trace::TraceLayer;
 
 use crate::agent::ComplianceAgent;
@@ -16,24 +14,7 @@ pub async fn start_api_server(agent: ComplianceAgent, port: u16) -> Result<(), A
     let mut app = routes::build_router()
         .layer(Extension(Arc::new(agent.clone())))
         .layer(CorsLayer::permissive())
-        .layer(TraceLayer::new_for_http())
-        // Security headers (defense-in-depth, primary enforcement via Traefik)
-        .layer(SetResponseHeaderLayer::overriding(
-            axum::http::header::STRICT_TRANSPORT_SECURITY,
-            HeaderValue::from_static("max-age=31536000; includeSubDomains"),
-        ))
-        .layer(SetResponseHeaderLayer::overriding(
-            axum::http::header::X_FRAME_OPTIONS,
-            HeaderValue::from_static("DENY"),
-        ))
-        .layer(SetResponseHeaderLayer::overriding(
-            axum::http::header::X_CONTENT_TYPE_OPTIONS,
-            HeaderValue::from_static("nosniff"),
-        ))
-        .layer(SetResponseHeaderLayer::overriding(
-            axum::http::header::REFERRER_POLICY,
-            HeaderValue::from_static("strict-origin-when-cross-origin"),
-        ))
+        .layer(TraceLayer::new_for_http());
 
     if let (Some(kc_url), Some(kc_realm)) =
         (&agent.config.keycloak_url, &agent.config.keycloak_realm)
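The hunk above drops the four security-header layers from the axum stack, leaving enforcement to the reverse proxy (Traefik, per the removed comment). For reference, the exact header values the removed layers were setting, as plain data in a std-only sketch:

```rust
use std::collections::BTreeMap;

// The four response headers the removed SetResponseHeaderLayer calls enforced.
// Illustration only — with the layers gone, the same values are expected to be
// set at the proxy instead.
fn security_headers() -> BTreeMap<&'static str, &'static str> {
    BTreeMap::from([
        ("strict-transport-security", "max-age=31536000; includeSubDomains"),
        ("x-frame-options", "DENY"),
        ("x-content-type-options", "nosniff"),
        ("referrer-policy", "strict-origin-when-cross-origin"),
    ])
}

fn main() {
    let h = security_headers();
    assert_eq!(h.len(), 4);
    assert_eq!(h["x-frame-options"], "DENY");
}
```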
@@ -42,7 +42,7 @@ pub fn load_config() -> Result<AgentConfig, AgentError> {
             .unwrap_or(3001),
         scan_schedule: env_var_opt("SCAN_SCHEDULE").unwrap_or_else(|| "0 0 */6 * * *".to_string()),
         cve_monitor_schedule: env_var_opt("CVE_MONITOR_SCHEDULE")
-            .unwrap_or_else(|| "0 0 * * * *".to_string()),
+            .unwrap_or_else(|| "0 0 0 * * *".to_string()),
         git_clone_base_path: env_var_opt("GIT_CLONE_BASE_PATH")
            .unwrap_or_else(|| "/tmp/compliance-scanner/repos".to_string()),
         ssh_key_path: env_var_opt("SSH_KEY_PATH")
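These schedules are six-field cron expressions (seconds, minutes, hours, day-of-month, month, day-of-week), the format accepted by seconds-aware scheduler crates — an assumption about the crate in use here. Read that way, the hunk moves the default CVE monitor from the top of every hour to once a day at midnight. A std-only sketch of how the fields line up:

```rust
// Split a six-field cron expression into its fields:
// [sec, min, hour, day-of-month, month, day-of-week].
fn cron_fields(expr: &str) -> Vec<&str> {
    expr.split_whitespace().collect()
}

fn main() {
    let old = cron_fields("0 0 * * * *"); // old default: minute 0 of every hour
    let new = cron_fields("0 0 0 * * *"); // new default: 00:00:00 once a day
    assert_eq!(old.len(), 6);
    assert_eq!(new.len(), 6);
    // Field index 2 is the hour: "*" fires every hour, "0" only at midnight.
    assert_eq!(old[2], "*");
    assert_eq!(new[2], "0");
}
```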
@@ -78,25 +78,6 @@ impl Database {
             )
             .await?;
 
-        // cve_notifications: unique cve_id + repo_id + package, status filter
-        self.cve_notifications()
-            .create_index(
-                IndexModel::builder()
-                    .keys(
-                        doc! { "cve_id": 1, "repo_id": 1, "package_name": 1, "package_version": 1 },
-                    )
-                    .options(IndexOptions::builder().unique(true).build())
-                    .build(),
-            )
-            .await?;
-        self.cve_notifications()
-            .create_index(
-                IndexModel::builder()
-                    .keys(doc! { "status": 1, "created_at": -1 })
-                    .build(),
-            )
-            .await?;
-
         // tracker_issues: unique finding_id
         self.tracker_issues()
             .create_index(
@@ -241,12 +222,6 @@ impl Database {
         self.inner.collection("cve_alerts")
     }
 
-    pub fn cve_notifications(
-        &self,
-    ) -> Collection<compliance_core::models::notification::CveNotification> {
-        self.inner.collection("cve_notifications")
-    }
-
     pub fn tracker_issues(&self) -> Collection<TrackerIssue> {
         self.inner.collection("tracker_issues")
     }
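The unique compound index removed above is what deduplicated notifications: at most one document per (cve_id, repo_id, package_name, package_version) tuple, so re-scans of the same repo could not pile up duplicate alerts for the same vulnerable package. A std-only sketch of that guarantee using a `HashSet` — MongoDB enforced it server-side via the unique index, so this is illustration only:

```rust
use std::collections::HashSet;

// Count how many alerts survive dedup on the compound key the unique index used:
// (cve_id, repo_id, package_name, package_version).
fn distinct_notifications(alerts: &[(&str, &str, &str, &str)]) -> usize {
    let mut seen: HashSet<(&str, &str, &str, &str)> = HashSet::new();
    alerts.iter().filter(|a| seen.insert(**a)).count()
}

fn main() {
    let alerts = [
        ("CVE-2024-0001", "repo-a", "openssl", "3.0.1"),
        ("CVE-2024-0001", "repo-a", "openssl", "3.0.1"), // re-scan duplicate: collapsed
        ("CVE-2024-0001", "repo-b", "openssl", "3.0.1"), // same CVE, different repo: kept
    ];
    assert_eq!(distinct_notifications(&alerts), 2);
}
```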
@@ -1,16 +0,0 @@
-// Library entrypoint — re-exports for integration tests and the binary.
-
-pub mod agent;
-pub mod api;
-pub mod config;
-pub mod database;
-pub mod error;
-pub mod llm;
-pub mod pentest;
-pub mod pipeline;
-pub mod rag;
-pub mod scheduler;
-pub mod ssh;
-#[allow(dead_code)]
-pub mod trackers;
-pub mod webhooks;
@@ -5,20 +5,15 @@ use compliance_core::models::Finding;
 use crate::error::AgentError;
 use crate::llm::LlmClient;
 
-const DESCRIPTION_SYSTEM_PROMPT: &str = r#"You are a security engineer writing a bug tracker issue for a developer to fix. Be direct and actionable — developers skim issue descriptions, so lead with what matters.
-
-Format in Markdown:
-
-1. **What**: 1 sentence — what's wrong and where (file:line)
-2. **Why it matters**: 1-2 sentences — concrete impact if not fixed. Avoid generic "could lead to" phrasing; describe the specific attack or failure scenario.
-3. **Fix**: The specific code change needed. Use a code block with the corrected code if possible. If the fix is configuration-based, show the exact config change.
-4. **References**: CWE/CVE link if applicable (one line, not a section)
-
-Rules:
-- No filler paragraphs or background explanations
-- No restating the finding title in the body
-- Code blocks should show the FIX, not the vulnerable code (the developer can see that in the diff)
-- If the remediation is a one-liner, just say it — don't wrap it in a section header"#;
+const DESCRIPTION_SYSTEM_PROMPT: &str = r#"You are a security engineer writing issue descriptions for a bug tracker. Generate a clear, actionable issue body in Markdown format that includes:
+
+1. **Summary**: 1-2 sentence overview
+2. **Evidence**: Code location, snippet, and what was detected
+3. **Impact**: What could happen if not fixed
+4. **Remediation**: Step-by-step fix instructions
+5. **References**: Relevant CWE/CVE links if applicable
+
+Keep it concise and professional. Use code blocks for code snippets."#;
 
 pub async fn generate_issue_description(
     llm: &Arc<LlmClient>,
@@ -5,24 +5,7 @@ use compliance_core::models::Finding;
 use crate::error::AgentError;
 use crate::llm::LlmClient;
 
-const FIX_SYSTEM_PROMPT: &str = r#"You are a security engineer suggesting a code fix. Return ONLY the corrected code that replaces the vulnerable snippet — no explanations, no markdown fences, no before/after comparison.
-
-Rules:
-- The fix must be a drop-in replacement for the vulnerable code
-- Preserve the original code's style, indentation, and naming conventions
-- Add at most one brief inline comment on the changed line explaining the security fix
-- If the fix requires importing a new module, include the import on a separate line prefixed with the language's comment syntax + "Add import: "
-- Do not refactor, rename variables, or "improve" unrelated code
-- If the vulnerability is a false positive and the code is actually safe, return the original code unchanged with a comment explaining why no fix is needed
-
-Language-specific fix guidance:
-- Rust: use `?` for error propagation, prefer `SecretString` for secrets, use parameterized queries with `sqlx`/`diesel`
-- Python: use parameterized queries (never f-strings in SQL), use `secrets` module not `random`, use `subprocess.run([...])` list form, use `markupsafe.escape()` for HTML
-- Go: use `sql.Query` with `$1`/`?` placeholders, use `crypto/rand` not `math/rand`, use `html/template` not `text/template`, return errors don't panic
-- Java/Kotlin: use `PreparedStatement` with `?` params, use `SecureRandom`, use `Jsoup.clean()` for HTML sanitization, use `@Valid` for input validation
-- Ruby: use ActiveRecord parameterized finders, use `SecureRandom`, use `ERB::Util.html_escape`, use `strong_parameters`
-- PHP: use PDO prepared statements with `:param` or `?`, use `random_bytes()`/`random_int()`, use `htmlspecialchars()` with `ENT_QUOTES`, use `password_hash(PASSWORD_BCRYPT)`
-- C/C++: use `snprintf` not `sprintf`, use bounds-checked APIs, free resources in reverse allocation order, use `memset_s` for secret cleanup"#;
+const FIX_SYSTEM_PROMPT: &str = r#"You are a security engineer. Given a security finding with code context, suggest a concrete code fix. Return ONLY the fixed code snippet that can directly replace the vulnerable code. Include brief inline comments explaining the fix."#;
 
 pub async fn suggest_fix(llm: &Arc<LlmClient>, finding: &Finding) -> Result<String, AgentError> {
     let user_prompt = format!(
@@ -1,138 +1,69 @@
 // System prompts for multi-pass LLM code review.
 // Each pass focuses on a different aspect to avoid overloading a single prompt.
 
-pub const LOGIC_REVIEW_PROMPT: &str = r#"You are a senior software engineer reviewing a code diff. Report ONLY genuine logic bugs that would cause incorrect behavior at runtime.
+pub const LOGIC_REVIEW_PROMPT: &str = r#"You are a senior software engineer reviewing code changes. Focus ONLY on logic and correctness issues.
 
-Report:
-- Off-by-one errors, wrong comparisons, missing edge cases that cause wrong results
-- Incorrect control flow that produces wrong output (not style preferences)
-- Actual race conditions with concrete shared-state mutation (not theoretical ones)
-- Resource leaks where cleanup is truly missing (not just "could be improved")
-- Wrong variable used (copy-paste errors) — must be provably wrong, not just suspicious
-- Swallowed errors that silently hide failures in a way that matters
+Look for:
+- Off-by-one errors, wrong comparisons, missing edge cases
+- Incorrect control flow (unreachable code, missing returns, wrong loop conditions)
+- Race conditions or concurrency bugs
+- Resource leaks (unclosed handles, missing cleanup)
+- Wrong variable used (copy-paste errors)
+- Incorrect error handling (swallowed errors, wrong error type)
 
-Do NOT report:
-- Style, naming, formatting, documentation, or code organization preferences
-- Theoretical issues without a concrete triggering scenario
-- "Potential" problems that require assumptions not supported by the visible code
-- Complexity or function length — that's a separate review pass
-
-Language-idiomatic patterns that are NOT bugs (do not flag these):
-- Rust: `||`/`&&` short-circuit evaluation, variable shadowing, `let` rebinding, `clone()`, `impl` blocks, `match` arms with guards, `?` operator chaining, `unsafe` blocks with safety comments
-- Python: duck typing, EAFP pattern (try/except vs check-first), `*args`/`**kwargs`, walrus operator `:=`, truthiness checks on containers, bare `except:` in top-level handlers
-- Go: multiple return values for errors, `if err != nil` patterns, goroutine + channel patterns, blank identifier `_`, named returns, `defer` for cleanup, `init()` functions
-- Java/Kotlin: checked exception patterns, method overloading, `Optional` vs null checks, Kotlin `?.` safe calls, `!!` non-null assertions in tests, `when` exhaustive matching, companion objects, `lateinit`
-- Ruby: monkey patching in libraries, method_missing, blocks/procs/lambdas, `rescue => e` patterns, `send`/`respond_to?` metaprogramming, `nil` checks via `&.` safe navigation
-- PHP: loose comparisons with `==` (only flag if `===` was clearly intended), `@` error suppression in legacy code, `isset()`/`empty()` patterns, magic methods (`__get`, `__call`), array functions as callbacks
-- C/C++: RAII patterns, move semantics, `const_cast`/`static_cast` in appropriate contexts, macro usage for platform compat, pointer arithmetic in low-level code, `goto` for cleanup in C
-
-Severity guide:
-- high: Will cause incorrect behavior in normal usage
-- medium: Will cause incorrect behavior in edge cases
-- low: Minor correctness concern with limited blast radius
-
-Prefer returning [] over reporting low-confidence guesses. A false positive wastes more developer time than a missed low-severity issue.
-
-Respond with a JSON array (no markdown fences):
+Ignore: style, naming, formatting, documentation, minor improvements.
+
+For each issue found, respond with a JSON array:
 [{"title": "...", "description": "...", "severity": "high|medium|low", "file": "...", "line": N, "suggestion": "..."}]
 
 If no issues found, respond with: []"#;
 
-pub const SECURITY_REVIEW_PROMPT: &str = r#"You are a security engineer reviewing a code diff. Report ONLY exploitable security vulnerabilities with a realistic attack scenario.
+pub const SECURITY_REVIEW_PROMPT: &str = r#"You are a security engineer reviewing code changes. Focus ONLY on security vulnerabilities.
 
-Report:
-- Injection vulnerabilities (SQL, command, XSS, template) where untrusted input reaches a sink
-- Authentication/authorization bypasses with a concrete exploit path
-- Sensitive data exposure: secrets in code, credentials in logs, PII leaks
-- Insecure cryptography: weak algorithms, predictable randomness, hardcoded keys
-- Path traversal, SSRF, open redirects — only where user input reaches the vulnerable API
-- Unsafe deserialization of untrusted data
-- Missing input validation at EXTERNAL trust boundaries (user input, API responses)
+Look for:
+- Injection vulnerabilities (SQL, command, XSS, template injection)
+- Authentication/authorization bypasses
+- Sensitive data exposure (logging secrets, hardcoded credentials)
+- Insecure cryptography (weak algorithms, predictable randomness)
+- Path traversal, SSRF, open redirects
+- Unsafe deserialization
+- Missing input validation at trust boundaries
 
-Do NOT report:
-- Internal code that only handles trusted/validated data
-- Hash functions used for non-security purposes (dedup fingerprints, cache keys, content addressing)
-- Logging of non-sensitive operational data (finding titles, counts, performance metrics)
-- "Information disclosure" for data that is already public or user-facing
-- Code style, performance, or general quality issues
-- Missing validation on internal function parameters (trust the caller within the same module/crate/package)
-- Theoretical attacks that require preconditions not present in the code
-
-Language-specific patterns that are NOT vulnerabilities (do not flag these):
-- Python: `pickle` used on trusted internal data, `eval()`/`exec()` on hardcoded strings, `subprocess` with hardcoded commands, Django `mark_safe()` on static content, `assert` in non-security contexts
-- Go: `crypto/rand` is secure (don't confuse with `math/rand`), `sql.DB` with parameterized queries is safe, `http.ListenAndServe` without TLS in dev/internal, error strings in responses (Go convention)
-- Java/Kotlin: Spring Security annotations are sufficient auth checks, `@Transactional` provides atomicity, JPA parameterized queries are safe, Kotlin `require()`/`check()` are assertion patterns not vulnerabilities
-- Ruby: Rails `params.permit()` is input validation, `render html:` with `html_safe` on generated content, ActiveRecord parameterized finders are safe, Devise/Warden patterns for auth
-- PHP: PDO prepared statements are safe, Laravel Eloquent is parameterized, `htmlspecialchars()` is XSS mitigation, Symfony security voters are auth checks, `password_hash()`/`password_verify()` are correct bcrypt usage
-- C/C++: `strncpy`/`snprintf` are bounds-checked (vs `strcpy`/`sprintf`), smart pointers manage memory, RAII handles cleanup, `static_assert` is compile-time only, OpenSSL with proper context setup
-- Rust: `sha2`/`blake3` for fingerprinting is not "weak crypto", `unsafe` with documented invariants, `secrecy::SecretString` properly handles secrets
-
-Severity guide:
-- critical: Remote code execution, auth bypass, or data breach with no preconditions
-- high: Exploitable vulnerability requiring minimal preconditions
-- medium: Vulnerability requiring specific conditions or limited impact
-
-Prefer returning [] over reporting speculative vulnerabilities. Every false positive erodes trust in the scanner.
-
-Respond with a JSON array (no markdown fences):
+Ignore: code style, performance, general quality.
+
+For each issue found, respond with a JSON array:
 [{"title": "...", "description": "...", "severity": "critical|high|medium", "file": "...", "line": N, "cwe": "CWE-XXX", "suggestion": "..."}]
 
 If no issues found, respond with: []"#;
 
-pub const CONVENTION_REVIEW_PROMPT: &str = r#"You are a code reviewer checking for convention violations that indicate likely bugs. Report ONLY deviations from the project's visible patterns that could cause real problems.
+pub const CONVENTION_REVIEW_PROMPT: &str = r#"You are a code reviewer checking adherence to project conventions. Focus ONLY on patterns that indicate likely bugs or maintenance problems.
 
-Report:
-- Inconsistent error handling within the same module where the inconsistency could hide failures
-- Public API that breaks the module's established contract (not just different style)
-- Anti-patterns that are bugs in this language: e.g. `unwrap()` in Rust library code where the CI enforces `clippy::unwrap_used`, `any` defeating TypeScript's type system
+Look for:
+- Inconsistent error handling patterns within the same module
+- Public API that doesn't follow the project's established patterns
+- Missing or incorrect type annotations that could cause runtime issues
+- Anti-patterns specific to the language (e.g. unwrap in Rust library code, any in TypeScript)
 
-Do NOT report:
-- Style preferences, formatting, naming conventions, or documentation
-- Code organization suggestions ("this function should be split")
-- Patterns that are valid in the language even if you'd write them differently
-- "Missing type annotations" unless the code literally won't compile or causes a type inference bug
-
-Language-specific patterns that are conventional (do not flag these):
-- Rust: variable shadowing, `||`/`&&` short-circuit, `let` rebinding, builder patterns, `clone()`, `From`/`Into` impl chains, `#[allow(...)]` attributes
-- Python: `**kwargs` forwarding, `@property` setters, `__dunder__` methods, list comprehensions with conditions, `if TYPE_CHECKING` imports, `noqa` comments
-- Go: stuttering names (`http.HTTPClient`) discouraged but not a bug, `context.Context` as first param, init() functions, `//nolint` directives, returning concrete types vs interfaces in internal code
-- Java/Kotlin: builder pattern boilerplate, Lombok annotations (`@Data`, `@Builder`), Kotlin data classes, `companion object` factories, `@Suppress` annotations, checked exception wrapping
-- Ruby: `attr_accessor` usage, `Enumerable` mixin patterns, `module_function`, `class << self` syntax, DSL blocks (Rake, RSpec, Sinatra routes)
-- PHP: `__construct` with property promotion, Laravel facades, static factory methods, nullable types with `?`, attribute syntax `#[...]`
-- C/C++: header guards vs `#pragma once`, forward declarations, `const` correctness patterns, template specialization, `auto` type deduction
-
-Severity guide:
-- medium: Convention violation that will likely cause a bug or maintenance problem
-- low: Convention violation that is a minor concern
-
-Return at most 3 findings. Prefer [] over marginal findings.
-
-Respond with a JSON array (no markdown fences):
+Do NOT report: minor style preferences, documentation gaps, formatting.
+Only report issues with HIGH confidence that they deviate from the visible codebase conventions.
+
+For each issue found, respond with a JSON array:
 [{"title": "...", "description": "...", "severity": "medium|low", "file": "...", "line": N, "suggestion": "..."}]
 
 If no issues found, respond with: []"#;
 
-pub const COMPLEXITY_REVIEW_PROMPT: &str = r#"You are reviewing code changes for complexity that is likely to cause bugs. Report ONLY complexity that makes the code demonstrably harder to reason about.
+pub const COMPLEXITY_REVIEW_PROMPT: &str = r#"You are reviewing code changes for excessive complexity that could lead to bugs.
 
-Report:
-- Functions over 80 lines with multiple interleaved responsibilities (not just long)
-- Deeply nested control flow (5+ levels) where flattening would prevent bugs
-- Complex boolean expressions that a reader would likely misinterpret
+Look for:
+- Functions over 50 lines that should be decomposed
+- Deeply nested control flow (4+ levels)
+- Complex boolean expressions that are hard to reason about
+- Functions with 5+ parameters
+- Code duplication within the changed files
 
-Do NOT report:
-- Functions that are long but linear and easy to follow
-- Acceptable complexity: configuration setup, CLI parsing, test helpers, builder patterns
-- Code that is complex because the problem is complex — only report if restructuring would reduce bug risk
-- "This function does multiple things" unless you can identify a specific bug risk from the coupling
-- Suggestions that would just move complexity elsewhere without reducing it
-
-Severity guide:
-- medium: Complexity that has a concrete risk of causing bugs during future changes
-- low: Complexity that makes review harder but is unlikely to cause bugs
-
-Return at most 2 findings. Prefer [] over reporting complexity that is justified.
-
-Respond with a JSON array (no markdown fences):
+Only report complexity issues that are HIGH risk for future bugs. Ignore acceptable complexity in configuration, CLI argument parsing, or generated code.
+
+For each issue found, respond with a JSON array:
 [{"title": "...", "description": "...", "severity": "medium|low", "file": "...", "line": N, "suggestion": "..."}]
 
 If no issues found, respond with: []"#;
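Both prompt variants end by demanding a bare JSON array, and the older variants additionally say "no markdown fences". A defensive caller can still strip fences before parsing in case the model adds them anyway — this is a hypothetical sketch, not the project's actual response-parsing code:

```rust
// Strip an optional ```json / ``` fence wrapper from an LLM reply, returning
// the inner payload unchanged when no fence is present.
fn strip_fences(reply: &str) -> &str {
    let s = reply.trim();
    let s = s
        .strip_prefix("```json")
        .or_else(|| s.strip_prefix("```"))
        .unwrap_or(s);
    let s = s.strip_suffix("```").unwrap_or(s);
    s.trim()
}

fn main() {
    assert_eq!(strip_fences("[]"), "[]");
    assert_eq!(strip_fences("```json\n[]\n```"), "[]");
    assert_eq!(strip_fences("```\n[{\"title\": \"x\"}]\n```"), "[{\"title\": \"x\"}]");
}
```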
@@ -8,46 +8,22 @@ use crate::pipeline::orchestrator::GraphContext;
|
|||||||
/// Maximum number of findings to include in a single LLM triage call.
|
/// Maximum number of findings to include in a single LLM triage call.
|
||||||
const TRIAGE_CHUNK_SIZE: usize = 30;
|
const TRIAGE_CHUNK_SIZE: usize = 30;
|
||||||
|
|
||||||
const TRIAGE_SYSTEM_PROMPT: &str = r#"You are a pragmatic security triage expert. Your job is to filter out noise and keep only findings that a developer should actually fix. Be aggressive about dismissing false positives — a clean, high-signal list is more valuable than a comprehensive one.
|
const TRIAGE_SYSTEM_PROMPT: &str = r#"You are a security finding triage expert. Analyze each of the following security findings with its code context and determine the appropriate action.
|
||||||
|
|
||||||
Actions:
|
Actions:
|
||||||
- "confirm": True positive with real impact. Keep severity as-is.
|
- "confirm": The finding is a true positive at the reported severity. Keep as-is.
|
||||||
- "downgrade": Real issue but over-reported severity. Lower it.
|
- "downgrade": The finding is real but over-reported. Lower severity recommended.
|
||||||
- "upgrade": Under-reported — higher severity warranted.
|
- "upgrade": The finding is under-reported. Higher severity recommended.
|
||||||
- "dismiss": False positive, not exploitable, or not actionable. Remove it.
|
- "dismiss": The finding is a false positive. Should be removed.
|
||||||
|
|
||||||
Dismiss when:
|
Consider:
|
||||||
- The scanner flagged a language idiom as a bug (see examples below)
|
-- Is the code in a test, example, or generated file? (lower confidence for test code)
-- Does the surrounding code context confirm or refute the finding?
-- Is the finding actionable by a developer?
-- Would a real attacker be able to exploit this?
-
-Respond with a JSON array, one entry per finding in the same order they were presented:
-
-[{"id": "<fingerprint>", "action": "confirm|downgrade|upgrade|dismiss", "confidence": 0-10, "rationale": "brief explanation", "remediation": "optional fix suggestion"}, ...]"#;
+- The finding is in test/example/generated/vendored code
+- The "vulnerability" requires preconditions that don't exist in the code
+- The finding is about code style, complexity, or theoretical concerns rather than actual bugs
+- A hash function is used for non-security purposes (dedup, caching, content addressing)
+- Internal logging of non-sensitive operational data is flagged as "information disclosure"
+- The finding duplicates another finding already in the list
+- Framework-provided security is already in place (e.g. ORM parameterized queries, CSRF middleware, auth decorators)
+
+Common false positive patterns by language (dismiss these):
+- Rust: short-circuit `||`/`&&`, variable shadowing, `clone()`, `unsafe` with safety docs, `sha2` for fingerprinting
+- Python: EAFP try/except, `subprocess` with hardcoded args, `pickle` on trusted data, Django `mark_safe` on static content
+- Go: `if err != nil` is not "swallowed error", `crypto/rand` is secure, returning errors is not "information disclosure"
+- Java/Kotlin: Spring Security annotations are valid auth, JPA parameterized queries are safe, Kotlin `!!` in tests is fine
+- Ruby: Rails `params.permit` is validation, ActiveRecord finders are parameterized, `html_safe` on generated content
+- PHP: PDO prepared statements are safe, Laravel Eloquent is parameterized, `htmlspecialchars` is XSS mitigation
+- C/C++: `strncpy`/`snprintf` are bounds-checked, smart pointers manage memory, RAII handles cleanup
+
+Confirm only when:
+- You can describe a concrete scenario where the bug manifests or the vulnerability is exploitable
+- The fix is actionable (developer can change specific code to resolve it)
+- The finding is in production code that handles external input or sensitive data
+
+Confidence scoring (0-10):
+- 8-10: Certain true positive with clear exploit/bug scenario
+- 5-7: Likely true positive, some assumptions required
+- 3-4: Uncertain, needs manual review
+- 0-2: Almost certainly a false positive
+
+Respond with a JSON array, one entry per finding in the same order presented (no markdown fences):
+
+[{"id": "<fingerprint>", "action": "confirm|downgrade|upgrade|dismiss", "confidence": 0-10, "rationale": "1-2 sentences", "remediation": "optional fix"}, ...]"#;
 
 pub async fn triage_findings(
     llm: &Arc<LlmClient>,
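The prompt above pins down a small response contract: an `action` drawn from four allowed verbs and a `confidence` on a 0-10 scale. A minimal sketch of enforcing that contract on the consumer side, assuming hypothetical names (`TriageAction`, `parse_action`, `clamp_confidence` are illustrative, not the crate's real types):

```rust
// Illustrative sketch: validate the action/confidence fields the triage
// prompt asks the model to emit. Names here are assumptions for this example.

#[derive(Debug, PartialEq)]
enum TriageAction {
    Confirm,
    Downgrade,
    Upgrade,
    Dismiss,
}

// Map the prompt's "confirm|downgrade|upgrade|dismiss" string to an enum;
// anything else is a malformed LLM response and is rejected.
fn parse_action(s: &str) -> Option<TriageAction> {
    match s {
        "confirm" => Some(TriageAction::Confirm),
        "downgrade" => Some(TriageAction::Downgrade),
        "upgrade" => Some(TriageAction::Upgrade),
        "dismiss" => Some(TriageAction::Dismiss),
        _ => None,
    }
}

// Clamp a model-reported confidence into the 0-10 scale the prompt defines.
fn clamp_confidence(raw: i64) -> u8 {
    raw.clamp(0, 10) as u8
}

fn main() {
    assert_eq!(parse_action("dismiss"), Some(TriageAction::Dismiss));
    assert_eq!(parse_action("ignore"), None);
    assert_eq!(clamp_confidence(42), 10);
}
```

Rejecting unknown verbs rather than defaulting them keeps a hallucinated action from silently confirming or dismissing a finding.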
@@ -1,4 +1,17 @@
-use compliance_agent::{agent, api, config, database, scheduler, ssh, webhooks};
+mod agent;
+mod api;
+pub(crate) mod config;
+mod database;
+mod error;
+mod llm;
+mod pentest;
+mod pipeline;
+mod rag;
+mod scheduler;
+mod ssh;
+#[allow(dead_code)]
+mod trackers;
+mod webhooks;
 
 #[tokio::main]
 async fn main() -> Result<(), Box<dyn std::error::Error>> {
@@ -314,21 +314,6 @@ impl PentestOrchestrator {
 - For SPA apps: a 200 HTTP status does NOT mean the page is accessible — check the actual
   page content with the browser tool to verify if it shows real data or a login redirect.
 
-## Finding Quality Rules
-- **Do not report the same issue twice.** If multiple tools detect the same missing header or
-  vulnerability on the same endpoint, report it ONCE with the most specific tool's output.
-  For example, if the recon tool and the header scanner both find missing HSTS, report it only
-  from the header scanner (more specific).
-- **Group related findings.** Missing security headers on the same endpoint are ONE finding
-  ("Missing security headers") listing all missing headers, not separate findings per header.
-- **Severity must match real impact:**
-  - critical/high: Exploitable vulnerability (you can demonstrate the exploit)
-  - medium: Real misconfiguration with security implications but not directly exploitable
-  - low: Best-practice recommendation, defense-in-depth, or informational
-- **Missing headers are medium at most** unless you can demonstrate a concrete exploit enabled
-  by the missing header (e.g., missing CSP + confirmed XSS = high for CSP finding).
-- Console.log in third-party/vendored JS (node_modules, minified libraries) is informational only.
-
 ## Important
 - This is an authorized penetration test. All testing is permitted within the target scope.
 - Respect the rate limit of {rate_limit} requests per second.
@@ -315,67 +315,20 @@ impl PipelineOrchestrator {
             .await?;
         }
 
-        // Persist CVE alerts and create notifications
-        {
-            use compliance_core::models::notification::{parse_severity, CveNotification};
-
-            let repo_name = repo.name.clone();
-            let mut new_notif_count = 0u32;
-
-            for alert in &cve_alerts {
-                // Upsert the alert
-                let filter = doc! {
-                    "cve_id": &alert.cve_id,
-                    "repo_id": &alert.repo_id,
-                };
-                let update = mongodb::bson::to_document(alert)
-                    .map(|d| doc! { "$set": d })
-                    .unwrap_or_else(|_| doc! {});
-                self.db
-                    .cve_alerts()
-                    .update_one(filter, update)
-                    .upsert(true)
-                    .await?;
-
-                // Create notification (dedup by cve_id + repo + package + version)
-                let notif_filter = doc! {
-                    "cve_id": &alert.cve_id,
-                    "repo_id": &alert.repo_id,
-                    "package_name": &alert.affected_package,
-                    "package_version": &alert.affected_version,
-                };
-                let severity = parse_severity(alert.severity.as_deref(), alert.cvss_score);
-                let mut notification = CveNotification::new(
-                    alert.cve_id.clone(),
-                    repo_id.clone(),
-                    repo_name.clone(),
-                    alert.affected_package.clone(),
-                    alert.affected_version.clone(),
-                    severity,
-                );
-                notification.cvss_score = alert.cvss_score;
-                notification.summary = alert.summary.clone();
-                notification.url = Some(format!("https://osv.dev/vulnerability/{}", alert.cve_id));
-
-                let notif_update = doc! {
-                    "$setOnInsert": mongodb::bson::to_bson(&notification).unwrap_or_default()
-                };
-                if let Ok(result) = self
-                    .db
-                    .cve_notifications()
-                    .update_one(notif_filter, notif_update)
-                    .upsert(true)
-                    .await
-                {
-                    if result.upserted_id.is_some() {
-                        new_notif_count += 1;
-                    }
-                }
-            }
-
-            if new_notif_count > 0 {
-                tracing::info!("[{repo_id}] Created {new_notif_count} CVE notification(s)");
-            }
-        }
+        // Persist CVE alerts (upsert by cve_id + repo_id)
+        for alert in &cve_alerts {
+            let filter = doc! {
+                "cve_id": &alert.cve_id,
+                "repo_id": &alert.repo_id,
+            };
+            let update = mongodb::bson::to_document(alert)
+                .map(|d| doc! { "$set": d })
+                .unwrap_or_else(|_| doc! {});
+            self.db
+                .cve_alerts()
+                .update_one(filter, update)
+                .upsert(true)
+                .await?;
+        }
 
         // Stage 6: Issue Creation
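The hunk above keeps one upsert keyed on `(cve_id, repo_id)`: a `$set` upsert writes the alert if the key is new and overwrites it otherwise, so re-running the pipeline never duplicates alerts. The effect can be sketched with a plain `HashMap` (the `CveAlert` struct here is a simplified stand-in for the real model):

```rust
use std::collections::HashMap;

// Simplified stand-in for the alert document in the diff above;
// real fields and types live in compliance-core.
#[derive(Clone, Debug, PartialEq)]
struct CveAlert {
    cve_id: String,
    repo_id: String,
    summary: String,
}

// Upserting keyed on (cve_id, repo_id): a second write with the same key
// replaces the stored record instead of duplicating it, mirroring what
// `update_one(filter, {"$set": ...}).upsert(true)` does in MongoDB.
fn upsert(store: &mut HashMap<(String, String), CveAlert>, alert: CveAlert) {
    store.insert((alert.cve_id.clone(), alert.repo_id.clone()), alert);
}

fn main() {
    let mut store = HashMap::new();
    let first = CveAlert {
        cve_id: "CVE-2024-0001".into(),
        repo_id: "r1".into(),
        summary: "first".into(),
    };
    let second = CveAlert { summary: "updated".into(), ..first.clone() };
    upsert(&mut store, first);
    upsert(&mut store, second);
    // Same key twice: still one entry, carrying the latest data.
    assert_eq!(store.len(), 1);
}
```

With `$setOnInsert` instead of `$set` (as the notification path used), the second write would leave the first record untouched rather than overwrite it.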
@@ -33,7 +33,6 @@ struct PatternRule {
     file_extensions: Vec<String>,
 }
 
-#[allow(clippy::new_without_default)]
 impl GdprPatternScanner {
     pub fn new() -> Self {
         let patterns = vec![
@@ -99,7 +98,6 @@ impl Scanner for GdprPatternScanner {
     }
 }
 
-#[allow(clippy::new_without_default)]
 impl OAuthPatternScanner {
     pub fn new() -> Self {
         let patterns = vec![
@@ -82,158 +82,24 @@ async fn scan_all_repos(agent: &ComplianceAgent) {
 }
 
 async fn monitor_cves(agent: &ComplianceAgent) {
-    use compliance_core::models::notification::{parse_severity, CveNotification};
-    use compliance_core::models::SbomEntry;
     use futures_util::StreamExt;
 
-    // Fetch all SBOM entries grouped by repo
+    // Re-scan all SBOM entries for new CVEs
     let cursor = match agent.db.sbom_entries().find(doc! {}).await {
         Ok(c) => c,
         Err(e) => {
-            tracing::error!("CVE monitor: failed to list SBOM entries: {e}");
+            tracing::error!("Failed to list SBOM entries for CVE monitoring: {e}");
             return;
         }
     };
-    let entries: Vec<SbomEntry> = cursor.filter_map(|r| async { r.ok() }).collect().await;
+    let entries: Vec<_> = cursor.filter_map(|r| async { r.ok() }).collect().await;
 
     if entries.is_empty() {
-        tracing::debug!("CVE monitor: no SBOM entries, skipping");
         return;
     }
 
-    tracing::info!(
-        "CVE monitor: checking {} dependencies for new CVEs",
-        entries.len()
-    );
-
-    // Build a repo_id → repo_name lookup
-    let repo_ids: std::collections::HashSet<String> =
-        entries.iter().map(|e| e.repo_id.clone()).collect();
-    let mut repo_names: std::collections::HashMap<String, String> =
-        std::collections::HashMap::new();
-    for rid in &repo_ids {
-        if let Ok(oid) = mongodb::bson::oid::ObjectId::parse_str(rid) {
-            if let Ok(Some(repo)) = agent.db.repositories().find_one(doc! { "_id": oid }).await {
-                repo_names.insert(rid.clone(), repo.name.clone());
-            }
-        }
-    }
-
-    // Use the existing CveScanner to query OSV.dev
-    let nvd_key = agent.config.nvd_api_key.as_ref().map(|k| {
-        use secrecy::ExposeSecret;
-        k.expose_secret().to_string()
-    });
-    let scanner = crate::pipeline::cve::CveScanner::new(
-        agent.http.clone(),
-        agent.config.searxng_url.clone(),
-        nvd_key,
-    );
-
-    // Group entries by repo for scanning
-    let mut entries_by_repo: std::collections::HashMap<String, Vec<SbomEntry>> =
-        std::collections::HashMap::new();
-    for entry in entries {
-        entries_by_repo
-            .entry(entry.repo_id.clone())
-            .or_default()
-            .push(entry);
-    }
-
-    let mut new_notifications = 0u32;
-
-    for (repo_id, mut repo_entries) in entries_by_repo {
-        let repo_name = repo_names
-            .get(&repo_id)
-            .cloned()
-            .unwrap_or_else(|| repo_id.clone());
-
-        // Scan dependencies for CVEs
-        let alerts = match scanner.scan_dependencies(&repo_id, &mut repo_entries).await {
-            Ok(a) => a,
-            Err(e) => {
-                tracing::warn!("CVE monitor: scan failed for {repo_name}: {e}");
-                continue;
-            }
-        };
-
-        // Upsert CVE alerts (existing logic)
-        for alert in &alerts {
-            let filter = doc! { "cve_id": &alert.cve_id, "repo_id": &alert.repo_id };
-            let update = doc! { "$setOnInsert": mongodb::bson::to_bson(alert).unwrap_or_default() };
-            let _ = agent
-                .db
-                .cve_alerts()
-                .update_one(filter, update)
-                .upsert(true)
-                .await;
-        }
-
-        // Update SBOM entries with discovered vulnerabilities
-        for entry in &repo_entries {
-            if entry.known_vulnerabilities.is_empty() {
-                continue;
-            }
-            if let Some(entry_id) = &entry.id {
-                let _ = agent
-                    .db
-                    .sbom_entries()
-                    .update_one(
-                        doc! { "_id": entry_id },
-                        doc! { "$set": {
-                            "known_vulnerabilities": mongodb::bson::to_bson(&entry.known_vulnerabilities).unwrap_or_default(),
-                            "updated_at": mongodb::bson::DateTime::now(),
-                        }},
-                    )
-                    .await;
-            }
-        }
-
-        // Create notifications for NEW CVEs (dedup against existing notifications)
-        for alert in &alerts {
-            let filter = doc! {
-                "cve_id": &alert.cve_id,
-                "repo_id": &alert.repo_id,
-                "package_name": &alert.affected_package,
-                "package_version": &alert.affected_version,
-            };
-            // Only insert if not already exists (upsert with $setOnInsert)
-            let severity = parse_severity(alert.severity.as_deref(), alert.cvss_score);
-            let mut notification = CveNotification::new(
-                alert.cve_id.clone(),
-                repo_id.clone(),
-                repo_name.clone(),
-                alert.affected_package.clone(),
-                alert.affected_version.clone(),
-                severity,
-            );
-            notification.cvss_score = alert.cvss_score;
-            notification.summary = alert.summary.clone();
-            notification.url = Some(format!("https://osv.dev/vulnerability/{}", alert.cve_id));
-
-            let update = doc! {
-                "$setOnInsert": mongodb::bson::to_bson(&notification).unwrap_or_default()
-            };
-            match agent
-                .db
-                .cve_notifications()
-                .update_one(filter, update)
-                .upsert(true)
-                .await
-            {
-                Ok(result) if result.upserted_id.is_some() => {
-                    new_notifications += 1;
-                }
-                Err(e) => {
-                    tracing::warn!("CVE monitor: failed to create notification: {e}");
-                }
-                _ => {} // Already exists
-            }
-        }
-    }
-
-    if new_notifications > 0 {
-        tracing::info!("CVE monitor: created {new_notifications} new notification(s)");
-    } else {
-        tracing::info!("CVE monitor: no new CVEs found");
-    }
+    tracing::info!("CVE monitor: checking {} dependencies", entries.len());
+    // The actual CVE checking is handled by the CveScanner in the pipeline
+    // This is a simplified version that just logs the activity
 }
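The removed monitor grouped SBOM entries by repository before scanning, using the `HashMap` `entry(...).or_default()` idiom. A minimal, self-contained sketch of that grouping step, with `(repo_id, package)` tuples standing in for full `SbomEntry` records:

```rust
use std::collections::HashMap;

// Group (repo_id, package) pairs by repo, as the deleted monitor_cves did
// with entries_by_repo. The tuple is a simplified stand-in for SbomEntry.
fn group_by_repo(entries: Vec<(String, String)>) -> HashMap<String, Vec<String>> {
    let mut by_repo: HashMap<String, Vec<String>> = HashMap::new();
    for (repo_id, package) in entries {
        // entry().or_default() inserts an empty Vec the first time a repo
        // is seen, then pushes into the existing Vec on later hits.
        by_repo.entry(repo_id).or_default().push(package);
    }
    by_repo
}

fn main() {
    let grouped = group_by_repo(vec![
        ("r1".into(), "lodash".into()),
        ("r2".into(), "serde".into()),
        ("r1".into(), "left-pad".into()),
    ]);
    assert_eq!(grouped.len(), 2);
    assert_eq!(grouped["r1"], vec!["lodash".to_string(), "left-pad".to_string()]);
}
```

Insertion order within each repo's `Vec` follows input order, which is why the deleted code could scan a repo's dependencies as one batch.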
@@ -1,165 +1,3 @@
-// Shared test harness for E2E / integration tests.
+// Shared test helpers for compliance-agent integration tests.
 //
-// Spins up the agent API server on a random port with an isolated test
-// database. Each test gets a fresh database that is dropped on cleanup.
-
-use std::sync::Arc;
-
-use compliance_agent::agent::ComplianceAgent;
-use compliance_agent::api;
-use compliance_agent::database::Database;
-use compliance_core::AgentConfig;
-use secrecy::SecretString;
-
-/// A running test server with a unique database.
-pub struct TestServer {
-    pub base_url: String,
-    pub client: reqwest::Client,
-    db_name: String,
-    mongodb_uri: String,
-}
-
-impl TestServer {
-    /// Start an agent API server on a random port with an isolated database.
-    pub async fn start() -> Self {
-        let mongodb_uri = std::env::var("TEST_MONGODB_URI")
-            .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
-
-        // Unique database name per test run to avoid collisions
-        let db_name = format!("test_{}", uuid::Uuid::new_v4().simple());
-
-        let db = Database::connect(&mongodb_uri, &db_name)
-            .await
-            .expect("Failed to connect to MongoDB — is it running?");
-        db.ensure_indexes().await.expect("Failed to create indexes");
-
-        let config = AgentConfig {
-            mongodb_uri: mongodb_uri.clone(),
-            mongodb_database: db_name.clone(),
-            litellm_url: std::env::var("TEST_LITELLM_URL")
-                .unwrap_or_else(|_| "http://localhost:4000".into()),
-            litellm_api_key: SecretString::from(String::new()),
-            litellm_model: "gpt-4o".into(),
-            litellm_embed_model: "text-embedding-3-small".into(),
-            agent_port: 0, // not used — we bind ourselves
-            scan_schedule: String::new(),
-            cve_monitor_schedule: String::new(),
-            git_clone_base_path: "/tmp/compliance-scanner-tests/repos".into(),
-            ssh_key_path: "/tmp/compliance-scanner-tests/ssh/id_ed25519".into(),
-            github_token: None,
-            github_webhook_secret: None,
-            gitlab_url: None,
-            gitlab_token: None,
-            gitlab_webhook_secret: None,
-            jira_url: None,
-            jira_email: None,
-            jira_api_token: None,
-            jira_project_key: None,
-            searxng_url: None,
-            nvd_api_key: None,
-            keycloak_url: None,
-            keycloak_realm: None,
-            keycloak_admin_username: None,
-            keycloak_admin_password: None,
-            pentest_verification_email: None,
-            pentest_imap_host: None,
-            pentest_imap_port: None,
-            pentest_imap_tls: false,
-            pentest_imap_username: None,
-            pentest_imap_password: None,
-        };
-
-        let agent = ComplianceAgent::new(config, db);
-
-        // Build the router with the agent extension
-        let app = api::routes::build_router()
-            .layer(axum::extract::Extension(Arc::new(agent)))
-            .layer(tower_http::cors::CorsLayer::permissive());
-
-        // Bind to port 0 to get a random available port
-        let listener = tokio::net::TcpListener::bind("127.0.0.1:0")
-            .await
-            .expect("Failed to bind test server");
-        let port = listener.local_addr().expect("no local addr").port();
-
-        tokio::spawn(async move {
-            axum::serve(listener, app).await.ok();
-        });
-
-        let base_url = format!("http://127.0.0.1:{port}");
-        let client = reqwest::Client::builder()
-            .timeout(std::time::Duration::from_secs(30))
-            .build()
-            .expect("Failed to build HTTP client");
-
-        // Wait for server to be ready
-        for _ in 0..50 {
-            if client
-                .get(format!("{base_url}/api/v1/health"))
-                .send()
-                .await
-                .is_ok()
-            {
-                break;
-            }
-            tokio::time::sleep(std::time::Duration::from_millis(50)).await;
-        }
-
-        Self {
-            base_url,
-            client,
-            db_name,
-            mongodb_uri,
-        }
-    }
-
-    /// GET helper
-    pub async fn get(&self, path: &str) -> reqwest::Response {
-        self.client
-            .get(format!("{}{path}", self.base_url))
-            .send()
-            .await
-            .expect("GET request failed")
-    }
-
-    /// POST helper with JSON body
-    pub async fn post(&self, path: &str, body: &serde_json::Value) -> reqwest::Response {
-        self.client
-            .post(format!("{}{path}", self.base_url))
-            .json(body)
-            .send()
-            .await
-            .expect("POST request failed")
-    }
-
-    /// PATCH helper with JSON body
-    pub async fn patch(&self, path: &str, body: &serde_json::Value) -> reqwest::Response {
-        self.client
-            .patch(format!("{}{path}", self.base_url))
-            .json(body)
-            .send()
-            .await
-            .expect("PATCH request failed")
-    }
-
-    /// DELETE helper
-    pub async fn delete(&self, path: &str) -> reqwest::Response {
-        self.client
-            .delete(format!("{}{path}", self.base_url))
-            .send()
-            .await
-            .expect("DELETE request failed")
-    }
-
-    /// Get the unique database name for direct MongoDB access in tests.
-    pub fn db_name(&self) -> &str {
-        &self.db_name
-    }
-
-    /// Drop the test database on cleanup
-    pub async fn cleanup(&self) {
-        if let Ok(client) = mongodb::Client::with_uri_str(&self.mongodb_uri).await {
-            client.database(&self.db_name).drop().await.ok();
-        }
-    }
-}
+// Add database mocks, fixtures, and test utilities here.
@@ -1,7 +0,0 @@
-// E2E test entry point.
-//
-// Run with: cargo test -p compliance-agent --test e2e
-// Requires: MongoDB running (set TEST_MONGODB_URI if not default)
-
-mod common;
-mod integration;
@@ -1,221 +0,0 @@
-use crate::common::TestServer;
-use serde_json::json;
-
-/// Insert a DAST target directly into MongoDB linked to a repo.
-async fn insert_dast_target(server: &TestServer, repo_id: &str, name: &str) -> String {
-    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
-        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
-    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
-    let db = client.database(&server.db_name());
-
-    let result = db
-        .collection::<mongodb::bson::Document>("dast_targets")
-        .insert_one(mongodb::bson::doc! {
-            "name": name,
-            "base_url": format!("https://{name}.example.com"),
-            "target_type": "webapp",
-            "repo_id": repo_id,
-            "rate_limit": 10,
-            "allow_destructive": false,
-            "created_at": mongodb::bson::DateTime::now(),
-        })
-        .await
-        .unwrap();
-
-    result.inserted_id.as_object_id().unwrap().to_hex()
-}
-
-/// Insert a pentest session linked to a target.
-async fn insert_pentest_session(server: &TestServer, target_id: &str, repo_id: &str) -> String {
-    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
-        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
-    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
-    let db = client.database(&server.db_name());
-
-    let result = db
-        .collection::<mongodb::bson::Document>("pentest_sessions")
-        .insert_one(mongodb::bson::doc! {
-            "target_id": target_id,
-            "repo_id": repo_id,
-            "strategy": "comprehensive",
-            "status": "completed",
-            "findings_count": 1_i32,
-            "exploitable_count": 0_i32,
-            "created_at": mongodb::bson::DateTime::now(),
-        })
-        .await
-        .unwrap();
-
-    result.inserted_id.as_object_id().unwrap().to_hex()
-}
-
-/// Insert an attack chain node linked to a session.
-async fn insert_attack_node(server: &TestServer, session_id: &str) {
-    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
-        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
-    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
-    let db = client.database(&server.db_name());
-
-    db.collection::<mongodb::bson::Document>("attack_chain_nodes")
-        .insert_one(mongodb::bson::doc! {
-            "session_id": session_id,
-            "node_id": "node-1",
-            "tool_name": "recon",
-            "status": "completed",
-            "created_at": mongodb::bson::DateTime::now(),
-        })
-        .await
-        .unwrap();
-}
-
-/// Insert a DAST finding linked to a target.
-async fn insert_dast_finding(server: &TestServer, target_id: &str, session_id: &str) {
-    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
-        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
-    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
-    let db = client.database(&server.db_name());
-
-    db.collection::<mongodb::bson::Document>("dast_findings")
-        .insert_one(mongodb::bson::doc! {
-            "scan_run_id": "run-1",
-            "target_id": target_id,
-            "vuln_type": "xss",
-            "title": "Reflected XSS",
-            "description": "XSS in search param",
-            "severity": "high",
-            "endpoint": "https://example.com/search",
-            "method": "GET",
-            "exploitable": true,
-            "evidence": [],
-            "session_id": session_id,
-            "created_at": mongodb::bson::DateTime::now(),
-        })
-        .await
-        .unwrap();
-}
-
-/// Helper to count documents in a collection
-async fn count_docs(server: &TestServer, collection: &str) -> u64 {
-    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
-        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
-    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
-    let db = client.database(&server.db_name());
-    db.collection::<mongodb::bson::Document>(collection)
-        .count_documents(mongodb::bson::doc! {})
-        .await
-        .unwrap()
-}
-
-#[tokio::test]
-async fn delete_repo_cascades_to_dast_and_pentest_data() {
-    let server = TestServer::start().await;
-
-    // Create a repo
-    let resp = server
-        .post(
-            "/api/v1/repositories",
-            &json!({
-                "name": "cascade-test",
-                "git_url": "https://github.com/example/cascade-test.git",
-            }),
-        )
-        .await;
-    let body: serde_json::Value = resp.json().await.unwrap();
-    let repo_id = body["data"]["id"].as_str().unwrap().to_string();
-
-    // Insert DAST target linked to repo
-    let target_id = insert_dast_target(&server, &repo_id, "cascade-target").await;
-
-    // Insert pentest session linked to target
-    let session_id = insert_pentest_session(&server, &target_id, &repo_id).await;
-
-    // Insert downstream data
-    insert_attack_node(&server, &session_id).await;
-    insert_dast_finding(&server, &target_id, &session_id).await;
-
-    // Verify data exists
-    assert_eq!(count_docs(&server, "dast_targets").await, 1);
-    assert_eq!(count_docs(&server, "pentest_sessions").await, 1);
-    assert_eq!(count_docs(&server, "attack_chain_nodes").await, 1);
-    assert_eq!(count_docs(&server, "dast_findings").await, 1);
-
-    // Delete the repo
-    let resp = server
-        .delete(&format!("/api/v1/repositories/{repo_id}"))
-        .await;
-    assert_eq!(resp.status(), 200);
-
-    // All downstream data should be gone
-    assert_eq!(count_docs(&server, "dast_targets").await, 0);
-    assert_eq!(count_docs(&server, "pentest_sessions").await, 0);
-    assert_eq!(count_docs(&server, "attack_chain_nodes").await, 0);
-    assert_eq!(count_docs(&server, "dast_findings").await, 0);
-
-    server.cleanup().await;
-}
-
-#[tokio::test]
-async fn delete_repo_cascades_sast_findings_and_sbom() {
-    let server = TestServer::start().await;
-
-    // Create a repo
-    let resp = server
-        .post(
-            "/api/v1/repositories",
-            &json!({
-                "name": "sast-cascade",
-                "git_url": "https://github.com/example/sast-cascade.git",
-            }),
-        )
-        .await;
-    let body: serde_json::Value = resp.json().await.unwrap();
-    let repo_id = body["data"]["id"].as_str().unwrap().to_string();
-
-    // Insert SAST finding and SBOM entry
-    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
-        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
-    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
-    let db = client.database(&server.db_name());
-    let now = mongodb::bson::DateTime::now();
-
-    db.collection::<mongodb::bson::Document>("findings")
-        .insert_one(mongodb::bson::doc! {
-            "repo_id": &repo_id,
-            "fingerprint": "fp-test-1",
-            "scanner": "semgrep",
-            "scan_type": "sast",
-            "title": "SQL Injection",
-            "description": "desc",
-            "severity": "critical",
-            "status": "open",
-            "created_at": now,
-            "updated_at": now,
-        })
-        .await
-        .unwrap();
-
-    db.collection::<mongodb::bson::Document>("sbom_entries")
-        .insert_one(mongodb::bson::doc! {
-            "repo_id": &repo_id,
-            "name": "lodash",
-            "version": "4.17.20",
-            "package_manager": "npm",
-            "known_vulnerabilities": [],
-        })
-        .await
-        .unwrap();
-
-    assert_eq!(count_docs(&server, "findings").await, 1);
-    assert_eq!(count_docs(&server, "sbom_entries").await, 1);
-
-    // Delete repo
-    server
-        .delete(&format!("/api/v1/repositories/{repo_id}"))
-        .await;
-
-    // Both should be gone
-    assert_eq!(count_docs(&server, "findings").await, 0);
-    assert_eq!(count_docs(&server, "sbom_entries").await, 0);
-
-    server.cleanup().await;
-}
@@ -1,48 +0,0 @@
-use crate::common::TestServer;
-use serde_json::json;
-
-#[tokio::test]
-async fn add_and_list_dast_targets() {
-    let server = TestServer::start().await;
-
-    // Initially empty
-    let resp = server.get("/api/v1/dast/targets").await;
-    assert_eq!(resp.status(), 200);
-    let body: serde_json::Value = resp.json().await.unwrap();
-    assert_eq!(body["data"].as_array().unwrap().len(), 0);
-
-    // Add a target
-    let resp = server
-        .post(
-            "/api/v1/dast/targets",
-            &json!({
-                "name": "test-app",
-                "base_url": "https://test-app.example.com",
-                "target_type": "webapp",
-            }),
-        )
-        .await;
-    assert_eq!(resp.status(), 200);
-
-    // List should return 1
-    let resp = server.get("/api/v1/dast/targets").await;
-    let body: serde_json::Value = resp.json().await.unwrap();
-    let targets = body["data"].as_array().unwrap();
-    assert_eq!(targets.len(), 1);
-    assert_eq!(targets[0]["name"], "test-app");
-    assert_eq!(targets[0]["base_url"], "https://test-app.example.com");
-
-    server.cleanup().await;
-}
-
-#[tokio::test]
-async fn list_dast_findings_empty() {
-    let server = TestServer::start().await;
-
-    let resp = server.get("/api/v1/dast/findings").await;
-    assert_eq!(resp.status(), 200);
-    let body: serde_json::Value = resp.json().await.unwrap();
-    assert_eq!(body["data"].as_array().unwrap().len(), 0);
-
-    server.cleanup().await;
-}
@@ -1,144 +0,0 @@
use crate::common::TestServer;
use serde_json::json;

/// Helper: insert a finding directly via MongoDB for testing query endpoints.
async fn insert_finding(server: &TestServer, repo_id: &str, title: &str, severity: &str) {
    // We insert via the agent's DB by posting to the internal test path.
    // Since there's no direct "create finding" API, we use MongoDB directly.
    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());

    // Extract the database name from the server's unique DB
    // We'll use the agent's internal DB through the stats endpoint to verify
    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();

    // Get the DB name from the test server by parsing the health response
    // For now, we use a direct insert approach
    let db = client.database(&server.db_name());

    let now = mongodb::bson::DateTime::now();
    db.collection::<mongodb::bson::Document>("findings")
        .insert_one(mongodb::bson::doc! {
            "repo_id": repo_id,
            "fingerprint": format!("fp-{title}-{severity}"),
            "scanner": "test-scanner",
            "scan_type": "sast",
            "title": title,
            "description": format!("Test finding: {title}"),
            "severity": severity,
            "status": "open",
            "created_at": now,
            "updated_at": now,
        })
        .await
        .unwrap();
}

#[tokio::test]
async fn list_findings_empty() {
    let server = TestServer::start().await;

    let resp = server.get("/api/v1/findings").await;
    assert_eq!(resp.status(), 200);

    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["data"].as_array().unwrap().len(), 0);
    assert_eq!(body["total"], 0);

    server.cleanup().await;
}

#[tokio::test]
async fn list_findings_with_data() {
    let server = TestServer::start().await;

    insert_finding(&server, "repo1", "SQL Injection", "critical").await;
    insert_finding(&server, "repo1", "XSS", "high").await;
    insert_finding(&server, "repo2", "Info Leak", "low").await;

    let resp = server.get("/api/v1/findings").await;
    assert_eq!(resp.status(), 200);
    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["total"], 3);

    // Filter by severity
    let resp = server.get("/api/v1/findings?severity=critical").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["total"], 1);
    assert_eq!(body["data"][0]["title"], "SQL Injection");

    // Filter by repo
    let resp = server.get("/api/v1/findings?repo_id=repo1").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["total"], 2);

    server.cleanup().await;
}

#[tokio::test]
async fn update_finding_status() {
    let server = TestServer::start().await;

    insert_finding(&server, "repo1", "Test Bug", "medium").await;

    // Get the finding ID
    let resp = server.get("/api/v1/findings").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    let finding_id = body["data"][0]["_id"]["$oid"].as_str().unwrap();

    // Update status to resolved
    let resp = server
        .patch(
            &format!("/api/v1/findings/{finding_id}/status"),
            &json!({ "status": "resolved" }),
        )
        .await;
    assert_eq!(resp.status(), 200);

    // Verify it's updated
    let resp = server.get(&format!("/api/v1/findings/{finding_id}")).await;
    assert_eq!(resp.status(), 200);
    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["data"]["status"], "resolved");

    server.cleanup().await;
}

#[tokio::test]
async fn bulk_update_finding_status() {
    let server = TestServer::start().await;

    insert_finding(&server, "repo1", "Bug A", "high").await;
    insert_finding(&server, "repo1", "Bug B", "high").await;

    // Get both finding IDs
    let resp = server.get("/api/v1/findings").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    let ids: Vec<String> = body["data"]
        .as_array()
        .unwrap()
        .iter()
        .map(|f| f["_id"]["$oid"].as_str().unwrap().to_string())
        .collect();

    // Bulk update
    let resp = server
        .patch(
            "/api/v1/findings/bulk-status",
            &json!({
                "ids": ids,
                "status": "false_positive"
            }),
        )
        .await;
    assert_eq!(resp.status(), 200);

    // Verify both are updated
    for id in &ids {
        let resp = server.get(&format!("/api/v1/findings/{id}")).await;
        let body: serde_json::Value = resp.json().await.unwrap();
        assert_eq!(body["data"]["status"], "false_positive");
    }

    server.cleanup().await;
}
@@ -1,29 +0,0 @@
use crate::common::TestServer;

#[tokio::test]
async fn health_endpoint_returns_ok() {
    let server = TestServer::start().await;

    let resp = server.get("/api/v1/health").await;
    assert_eq!(resp.status(), 200);

    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["status"], "ok");

    server.cleanup().await;
}

#[tokio::test]
async fn stats_overview_returns_zeroes_on_empty_db() {
    let server = TestServer::start().await;

    let resp = server.get("/api/v1/stats/overview").await;
    assert_eq!(resp.status(), 200);

    let body: serde_json::Value = resp.json().await.unwrap();
    let data = &body["data"];
    assert_eq!(data["repositories"], 0);
    assert_eq!(data["total_findings"], 0);

    server.cleanup().await;
}
@@ -1,6 +0,0 @@
mod cascade_delete;
mod dast;
mod findings;
mod health;
mod repositories;
mod stats;
@@ -1,110 +0,0 @@
use crate::common::TestServer;
use serde_json::json;

#[tokio::test]
async fn add_and_list_repository() {
    let server = TestServer::start().await;

    // Initially empty
    let resp = server.get("/api/v1/repositories").await;
    assert_eq!(resp.status(), 200);
    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["data"].as_array().unwrap().len(), 0);

    // Add a repository
    let resp = server
        .post(
            "/api/v1/repositories",
            &json!({
                "name": "test-repo",
                "git_url": "https://github.com/example/test-repo.git",
            }),
        )
        .await;
    assert_eq!(resp.status(), 200);
    let body: serde_json::Value = resp.json().await.unwrap();
    let repo_id = body["data"]["id"].as_str().unwrap().to_string();
    assert!(!repo_id.is_empty());

    // List should now return 1
    let resp = server.get("/api/v1/repositories").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    let repos = body["data"].as_array().unwrap();
    assert_eq!(repos.len(), 1);
    assert_eq!(repos[0]["name"], "test-repo");

    server.cleanup().await;
}

#[tokio::test]
async fn add_duplicate_repository_fails() {
    let server = TestServer::start().await;

    let payload = json!({
        "name": "dup-repo",
        "git_url": "https://github.com/example/dup-repo.git",
    });

    // First add succeeds
    let resp = server.post("/api/v1/repositories", &payload).await;
    assert_eq!(resp.status(), 200);

    // Second add with same git_url should fail (unique index)
    let resp = server.post("/api/v1/repositories", &payload).await;
    assert_ne!(resp.status(), 200);

    server.cleanup().await;
}

#[tokio::test]
async fn delete_repository() {
    let server = TestServer::start().await;

    // Add a repo
    let resp = server
        .post(
            "/api/v1/repositories",
            &json!({
                "name": "to-delete",
                "git_url": "https://github.com/example/to-delete.git",
            }),
        )
        .await;
    let body: serde_json::Value = resp.json().await.unwrap();
    let repo_id = body["data"]["id"].as_str().unwrap();

    // Delete it
    let resp = server
        .delete(&format!("/api/v1/repositories/{repo_id}"))
        .await;
    assert_eq!(resp.status(), 200);

    // List should be empty again
    let resp = server.get("/api/v1/repositories").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["data"].as_array().unwrap().len(), 0);

    server.cleanup().await;
}

#[tokio::test]
async fn delete_nonexistent_repository_returns_404() {
    let server = TestServer::start().await;

    let resp = server
        .delete("/api/v1/repositories/000000000000000000000000")
        .await;
    assert_eq!(resp.status(), 404);

    server.cleanup().await;
}

#[tokio::test]
async fn delete_invalid_id_returns_400() {
    let server = TestServer::start().await;

    let resp = server.delete("/api/v1/repositories/not-a-valid-id").await;
    assert_eq!(resp.status(), 400);

    server.cleanup().await;
}
@@ -1,111 +0,0 @@
use crate::common::TestServer;
use serde_json::json;

#[tokio::test]
async fn stats_overview_reflects_inserted_data() {
    let server = TestServer::start().await;

    // Add a repo
    server
        .post(
            "/api/v1/repositories",
            &json!({
                "name": "stats-repo",
                "git_url": "https://github.com/example/stats-repo.git",
            }),
        )
        .await;

    // Insert findings directly
    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
    let db = client.database(&server.db_name());
    let now = mongodb::bson::DateTime::now();

    for (title, severity) in [
        ("Critical Bug", "critical"),
        ("High Bug", "high"),
        ("Medium Bug", "medium"),
        ("Low Bug", "low"),
    ] {
        db.collection::<mongodb::bson::Document>("findings")
            .insert_one(mongodb::bson::doc! {
                "repo_id": "test-repo-id",
                "fingerprint": format!("fp-{title}"),
                "scanner": "test",
                "scan_type": "sast",
                "title": title,
                "description": "desc",
                "severity": severity,
                "status": "open",
                "created_at": now,
                "updated_at": now,
            })
            .await
            .unwrap();
    }

    let resp = server.get("/api/v1/stats/overview").await;
    assert_eq!(resp.status(), 200);

    let body: serde_json::Value = resp.json().await.unwrap();
    let data = &body["data"];
    assert_eq!(data["repositories"], 1);
    assert_eq!(data["total_findings"], 4);
    assert_eq!(data["critical"], 1);
    assert_eq!(data["high"], 1);

    server.cleanup().await;
}

#[tokio::test]
async fn stats_update_after_finding_status_change() {
    let server = TestServer::start().await;

    // Insert a finding
    let mongodb_uri = std::env::var("TEST_MONGODB_URI")
        .unwrap_or_else(|_| "mongodb://root:example@localhost:27017/?authSource=admin".into());
    let client = mongodb::Client::with_uri_str(&mongodb_uri).await.unwrap();
    let db = client.database(&server.db_name());
    let now = mongodb::bson::DateTime::now();

    let result = db
        .collection::<mongodb::bson::Document>("findings")
        .insert_one(mongodb::bson::doc! {
            "repo_id": "repo-1",
            "fingerprint": "fp-stats-test",
            "scanner": "test",
            "scan_type": "sast",
            "title": "Stats Test Finding",
            "description": "desc",
            "severity": "high",
            "status": "open",
            "created_at": now,
            "updated_at": now,
        })
        .await
        .unwrap();
    let finding_id = result.inserted_id.as_object_id().unwrap().to_hex();

    // Stats should show 1 finding
    let resp = server.get("/api/v1/stats/overview").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    assert_eq!(body["data"]["total_findings"], 1);

    // Mark it as resolved
    server
        .patch(
            &format!("/api/v1/findings/{finding_id}/status"),
            &json!({ "status": "resolved" }),
        )
        .await;

    // The finding still exists (status changed, not deleted)
    let resp = server.get("/api/v1/stats/overview").await;
    let body: serde_json::Value = resp.json().await.unwrap();
    // total_findings counts all findings regardless of status
    assert_eq!(body["data"]["total_findings"], 1);

    server.cleanup().await;
}
@@ -1,9 +1,4 @@
-// E2E / Integration tests for the compliance-agent API.
+// Integration tests for the compliance-agent crate.
 //
-// These tests require a running MongoDB instance. Set TEST_MONGODB_URI
-// if it's not at the default `mongodb://root:example@localhost:27017`.
-//
-// Run with: cargo test -p compliance-agent --test e2e
-// Or nightly: (via CI with MongoDB service container)
-
-mod api;
+// Add tests that exercise the full pipeline, API handlers,
+// and cross-module interactions here.
@@ -7,7 +7,6 @@ pub mod finding;
 pub mod graph;
 pub mod issue;
 pub mod mcp;
-pub mod notification;
 pub mod pentest;
 pub mod repository;
 pub mod sbom;
@@ -28,7 +27,6 @@ pub use graph::{
 };
 pub use issue::{IssueStatus, TrackerIssue, TrackerType};
 pub use mcp::{McpServerConfig, McpServerStatus, McpTransport};
-pub use notification::{CveNotification, NotificationSeverity, NotificationStatus};
 pub use pentest::{
     AttackChainNode, AttackNodeStatus, AuthMode, CodeContextHint, Environment, IdentityProvider,
     PentestAuthConfig, PentestConfig, PentestEvent, PentestMessage, PentestSession, PentestStats,
@@ -1,103 +0,0 @@
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};

/// Status of a CVE notification
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
#[serde(rename_all = "lowercase")]
pub enum NotificationStatus {
    /// Newly created, not yet seen by the user
    New,
    /// User has seen it (e.g., opened the notification panel)
    Read,
    /// User has explicitly acknowledged/dismissed it
    Dismissed,
}

/// Severity level for notification filtering
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq, PartialOrd, Ord)]
#[serde(rename_all = "lowercase")]
pub enum NotificationSeverity {
    Low,
    Medium,
    High,
    Critical,
}

/// A notification about a newly discovered CVE affecting a tracked dependency.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CveNotification {
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    pub id: Option<bson::oid::ObjectId>,
    /// The CVE/GHSA identifier
    pub cve_id: String,
    /// Repository where the vulnerable dependency is used
    pub repo_id: String,
    /// Repository name (denormalized for display)
    pub repo_name: String,
    /// Affected package name
    pub package_name: String,
    /// Affected version
    pub package_version: String,
    /// Human-readable severity
    pub severity: NotificationSeverity,
    /// CVSS score if available
    pub cvss_score: Option<f64>,
    /// Short summary of the vulnerability
    pub summary: Option<String>,
    /// Link to vulnerability details
    pub url: Option<String>,
    /// Notification lifecycle status
    pub status: NotificationStatus,
    /// When the CVE was first detected for this dependency
    #[serde(with = "super::serde_helpers::bson_datetime")]
    pub created_at: DateTime<Utc>,
    /// When the user last interacted with this notification
    pub read_at: Option<DateTime<Utc>>,
}

impl CveNotification {
    pub fn new(
        cve_id: String,
        repo_id: String,
        repo_name: String,
        package_name: String,
        package_version: String,
        severity: NotificationSeverity,
    ) -> Self {
        Self {
            id: None,
            cve_id,
            repo_id,
            repo_name,
            package_name,
            package_version,
            severity,
            cvss_score: None,
            summary: None,
            url: None,
            status: NotificationStatus::New,
            created_at: Utc::now(),
            read_at: None,
        }
    }
}

/// Map an OSV/NVD severity string to our notification severity
pub fn parse_severity(s: Option<&str>, cvss: Option<f64>) -> NotificationSeverity {
    // Prefer CVSS score if available
    if let Some(score) = cvss {
        return match score {
            s if s >= 9.0 => NotificationSeverity::Critical,
            s if s >= 7.0 => NotificationSeverity::High,
            s if s >= 4.0 => NotificationSeverity::Medium,
            _ => NotificationSeverity::Low,
        };
    }
    // Fall back to string severity
    match s.map(|s| s.to_uppercase()).as_deref() {
        Some("CRITICAL") => NotificationSeverity::Critical,
        Some("HIGH") => NotificationSeverity::High,
        Some("MODERATE" | "MEDIUM") => NotificationSeverity::Medium,
        _ => NotificationSeverity::Low,
    }
}
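The deleted `parse_severity` helper is small enough to sanity-check in isolation. The following is a self-contained sketch of the same mapping, where `Severity` is a local stand-in for the crate's `NotificationSeverity` (serde derives and attributes omitted), not the actual type:

```rust
// Local stand-in for NotificationSeverity; derives trimmed to what the
// assertions below need.
#[derive(Debug, PartialEq)]
enum Severity {
    Low,
    Medium,
    High,
    Critical,
}

fn parse_severity(s: Option<&str>, cvss: Option<f64>) -> Severity {
    // A numeric CVSS score, when present, takes precedence over the
    // vendor-supplied severity string.
    if let Some(score) = cvss {
        return match score {
            s if s >= 9.0 => Severity::Critical,
            s if s >= 7.0 => Severity::High,
            s if s >= 4.0 => Severity::Medium,
            _ => Severity::Low,
        };
    }
    // Fall back to the string label; OSV uses "MODERATE", NVD "MEDIUM".
    match s.map(str::to_uppercase).as_deref() {
        Some("CRITICAL") => Severity::Critical,
        Some("HIGH") => Severity::High,
        Some("MODERATE" | "MEDIUM") => Severity::Medium,
        _ => Severity::Low,
    }
}

fn main() {
    // CVSS wins over a contradictory string label.
    assert_eq!(parse_severity(Some("LOW"), Some(9.8)), Severity::Critical);
    // Without a score, the string is matched case-insensitively.
    assert_eq!(parse_severity(Some("moderate"), None), Severity::Medium);
    // Unknown or missing severity degrades to Low.
    assert_eq!(parse_severity(None, None), Severity::Low);
}
```

Note the ordering of the guards: because the CVSS branch returns early, a stale advisory label such as `"LOW"` can never downgrade a 9.8-score vulnerability.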
@@ -3645,247 +3645,3 @@ tbody tr:last-child td {
 .wizard-toggle.active .wizard-toggle-knob {
   transform: translateX(16px);
 }
-
-/* ═══════════════════════════════════════════════════════════════
-   HELP CHAT WIDGET
-   Floating assistant for documentation Q&A
-   ═══════════════════════════════════════════════════════════════ */
-
-.help-chat-toggle {
-  position: fixed;
-  bottom: 24px;
-  right: 28px;
-  z-index: 50;
-  width: 48px;
-  height: 48px;
-  border-radius: 50%;
-  background: var(--accent);
-  color: var(--bg-primary);
-  border: none;
-  cursor: pointer;
-  display: flex;
-  align-items: center;
-  justify-content: center;
-  box-shadow: 0 4px 20px rgba(0, 200, 255, 0.3);
-  transition: transform 0.15s, box-shadow 0.15s;
-}
-.help-chat-toggle:hover {
-  transform: scale(1.08);
-  box-shadow: 0 6px 28px rgba(0, 200, 255, 0.4);
-}
-
-.help-chat-panel {
-  position: fixed;
-  bottom: 24px;
-  right: 28px;
-  z-index: 51;
-  width: 400px;
-  height: 520px;
-  background: var(--bg-secondary);
-  border: 1px solid var(--border-bright);
-  border-radius: 16px;
-  display: flex;
-  flex-direction: column;
-  overflow: hidden;
-  box-shadow: 0 12px 48px rgba(0, 0, 0, 0.5), var(--accent-glow);
-}
-
-.help-chat-header {
-  display: flex;
-  align-items: center;
-  justify-content: space-between;
-  padding: 14px 18px;
-  border-bottom: 1px solid var(--border);
-  background: var(--bg-primary);
-}
-.help-chat-title {
-  display: flex;
-  align-items: center;
-  gap: 8px;
-  font-family: 'Outfit', sans-serif;
-  font-weight: 600;
-  font-size: 14px;
-  color: var(--text-primary);
-}
-.help-chat-close {
-  background: none;
-  border: none;
-  color: var(--text-secondary);
-  cursor: pointer;
-  padding: 4px;
-  border-radius: 6px;
-  display: flex;
-}
-.help-chat-close:hover {
-  color: var(--text-primary);
-  background: var(--bg-elevated);
-}
-
-.help-chat-messages {
-  flex: 1;
-  overflow-y: auto;
-  padding: 16px;
-  display: flex;
-  flex-direction: column;
-  gap: 12px;
-}
-
-.help-chat-empty {
-  display: flex;
-  flex-direction: column;
-  align-items: center;
-  justify-content: center;
-  height: 100%;
-  text-align: center;
-  color: var(--text-secondary);
-  font-size: 13px;
-  gap: 8px;
-}
-.help-chat-hint {
-  font-size: 12px;
-  color: var(--text-tertiary);
-  font-style: italic;
-}
-
-.help-msg {
-  max-width: 88%;
-  animation: helpMsgIn 0.15s ease-out;
-}
-@keyframes helpMsgIn {
-  from { opacity: 0; transform: translateY(6px); }
-  to { opacity: 1; transform: translateY(0); }
-}
-.help-msg-user {
-  align-self: flex-end;
-}
-.help-msg-assistant {
-  align-self: flex-start;
-}
-.help-msg-content {
-  padding: 10px 14px;
-  border-radius: 12px;
-  font-size: 13px;
-  line-height: 1.55;
-  word-wrap: break-word;
-}
-.help-msg-user .help-msg-content {
-  background: var(--accent);
-  color: var(--bg-primary);
-  border-bottom-right-radius: 4px;
-}
-.help-msg-assistant .help-msg-content {
-  background: var(--bg-elevated);
-  color: var(--text-primary);
-  border: 1px solid var(--border);
-  border-bottom-left-radius: 4px;
-}
-.help-msg-assistant .help-msg-content code {
-  background: rgba(0, 200, 255, 0.1);
-  padding: 1px 5px;
-  border-radius: 3px;
-  font-family: 'JetBrains Mono', monospace;
-  font-size: 12px;
-}
-.help-msg-loading {
-  padding: 10px 14px;
-  border-radius: 12px;
-  background: var(--bg-elevated);
-  border: 1px solid var(--border);
-  border-bottom-left-radius: 4px;
-  color: var(--text-secondary);
-  font-size: 13px;
-  animation: helpPulse 1.2s ease-in-out infinite;
-}
-@keyframes helpPulse {
-  0%, 100% { opacity: 0.6; }
-  50% { opacity: 1; }
-}
-
-.help-chat-input {
-  display: flex;
-  align-items: center;
-  gap: 8px;
-  padding: 12px 14px;
-  border-top: 1px solid var(--border);
-  background: var(--bg-primary);
-}
-.help-chat-input input {
-  flex: 1;
-  background: var(--bg-elevated);
-  border: 1px solid var(--border);
-  border-radius: 8px;
-  padding: 10px 14px;
-  color: var(--text-primary);
-  font-size: 13px;
-  font-family: 'DM Sans', sans-serif;
-  outline: none;
-  transition: border-color 0.15s;
-}
-.help-chat-input input:focus {
-  border-color: var(--accent);
-}
-.help-chat-input input::placeholder {
-  color: var(--text-tertiary);
-}
-.help-chat-send {
-  width: 36px;
-  height: 36px;
-  border-radius: 8px;
-  background: var(--accent);
-  color: var(--bg-primary);
-  border: none;
-  cursor: pointer;
-  display: flex;
-  align-items: center;
-  justify-content: center;
-  transition: opacity 0.15s;
-}
-.help-chat-send:disabled {
-  opacity: 0.4;
-  cursor: not-allowed;
-}
-.help-chat-send:not(:disabled):hover {
-  background: var(--accent-hover);
-}
-
-/* ═══════════════════════════════════════════════════════════════
-   NOTIFICATION BELL — CVE alert dropdown
-   ═══════════════════════════════════════════════════════════════ */
-.notification-bell-wrapper { position: fixed; top: 16px; right: 28px; z-index: 48; }
-.notification-bell-btn { position: relative; background: var(--bg-elevated); border: 1px solid var(--border); border-radius: 10px; padding: 8px 10px; color: var(--text-secondary); cursor: pointer; display: flex; align-items: center; transition: color 0.15s, border-color 0.15s; }
-.notification-bell-btn:hover { color: var(--text-primary); border-color: var(--border-bright); }
-.notification-badge { position: absolute; top: -4px; right: -4px; background: var(--danger); color: #fff; font-size: 10px; font-weight: 700; min-width: 18px; height: 18px; border-radius: 9px; display: flex; align-items: center; justify-content: center; padding: 0 4px; font-family: 'Outfit', sans-serif; }
-.notification-panel { position: absolute; top: 44px; right: 0; width: 380px; max-height: 480px; background: var(--bg-secondary); border: 1px solid var(--border-bright); border-radius: 12px; overflow: hidden; box-shadow: 0 12px 48px rgba(0,0,0,0.5); display: flex; flex-direction: column; }
-.notification-panel-header { display: flex; align-items: center; justify-content: space-between; padding: 12px 16px; border-bottom: 1px solid var(--border); font-family: 'Outfit', sans-serif; font-weight: 600; font-size: 14px; color: var(--text-primary); }
-.notification-close-btn { background: none; border: none; color: var(--text-secondary); cursor: pointer; padding: 2px; }
-.notification-panel-body { overflow-y: auto; flex: 1; padding: 8px; }
-.notification-loading, .notification-empty { display: flex; flex-direction: column; align-items: center; justify-content: center; padding: 32px 16px; color: var(--text-secondary); font-size: 13px; gap: 8px; }
-.notification-item { padding: 10px 12px; border-radius: 8px; margin-bottom: 4px; background: var(--bg-card); border: 1px solid var(--border); transition: border-color 0.15s; }
-.notification-item:hover { border-color: var(--border-bright); }
-.notification-item-header { display: flex; align-items: center; gap: 8px; margin-bottom: 4px; }
-.notification-sev { font-size: 10px; font-weight: 700; padding: 2px 6px; border-radius: 4px; text-transform: uppercase; letter-spacing: 0.5px; font-family: 'Outfit', sans-serif; }
-.notification-sev.sev-critical { background: var(--danger-bg); color: var(--danger); }
-.notification-sev.sev-high { background: rgba(255,140,0,0.12); color: #ff8c00; }
-.notification-sev.sev-medium { background: var(--warning-bg); color: var(--warning); }
-.notification-sev.sev-low { background: rgba(0,200,255,0.08); color: var(--accent); }
-.notification-cve-id { font-size: 12px; font-weight: 600; color: var(--text-primary); font-family: 'JetBrains Mono', monospace; }
-.notification-cve-id a { color: var(--accent); text-decoration: none; }
-.notification-cve-id a:hover { text-decoration: underline; }
-.notification-cvss { font-size: 10px; color: var(--text-secondary); margin-left: auto; font-family: 'JetBrains Mono', monospace; }
-.notification-dismiss-btn { background: none; border: none; color: var(--text-tertiary); cursor: pointer; padding: 2px; margin-left: 4px; }
-.notification-dismiss-btn:hover { color: var(--danger); }
-.notification-item-pkg { font-size: 12px; color: var(--text-primary); font-family: 'JetBrains Mono', monospace; }
-.notification-item-repo { font-size: 11px; color: var(--text-secondary); margin-bottom: 4px; }
-.notification-item-summary { font-size: 11px; color: var(--text-secondary); line-height: 1.4; display: -webkit-box; -webkit-line-clamp: 2; -webkit-box-orient: vertical; overflow: hidden; }
-
-/* ═══════════════════════════════════════════════════════════════
-   COPY BUTTON — Reusable clipboard copy component
-   ═══════════════════════════════════════════════════════════════ */
-.copy-btn { background: none; border: 1px solid var(--border); border-radius: 6px; padding: 5px 7px; color: var(--text-secondary); cursor: pointer; display: inline-flex; align-items: center; transition: color 0.15s, border-color 0.15s, background 0.15s; flex-shrink: 0; }
-.copy-btn:hover { color: var(--accent); border-color: var(--accent); background: var(--accent-muted); }
-.copy-btn-sm { padding: 3px 5px; border-radius: 4px; }
-/* Copyable inline field pattern: value + copy button side by side */
-.copyable { display: flex; align-items: center; gap: 6px; }
-.copyable code, .copyable .mono { flex: 1; min-width: 0; overflow: hidden; text-overflow: ellipsis; white-space: nowrap; }
-.code-snippet-wrapper { position: relative; }
-.code-snippet-header { display: flex; align-items: center; justify-content: space-between; margin-bottom: 4px; gap: 8px; }
@@ -44,6 +44,8 @@ pub enum Route {
     PentestSessionPage { session_id: String },
     #[route("/mcp-servers")]
     McpServersPage {},
+    #[route("/settings")]
+    SettingsPage {},
 }

 const FAVICON: Asset = asset!("/assets/favicon.svg");
@@ -1,8 +1,6 @@
 use dioxus::prelude::*;

 use crate::app::Route;
-use crate::components::help_chat::HelpChat;
-use crate::components::notification_bell::NotificationBell;
 use crate::components::sidebar::Sidebar;
 use crate::components::toast::{ToastContainer, Toasts};
 use crate::infrastructure::auth_check::check_auth;
@@ -22,9 +20,7 @@ pub fn AppShell() -> Element {
         main { class: "main-content",
             Outlet::<Route> {}
         }
-        NotificationBell {}
         ToastContainer {}
-        HelpChat {}
         }
     }
 }
@@ -1,7 +1,5 @@
 use dioxus::prelude::*;

-use crate::components::copy_button::CopyButton;
-
 #[component]
 pub fn CodeSnippet(
     code: String,
@@ -9,18 +7,15 @@ pub fn CodeSnippet(
     #[props(default)] line_number: u32,
 ) -> Element {
     rsx! {
-        div { class: "code-snippet-wrapper",
-            div { class: "code-snippet-header",
-                if !file_path.is_empty() {
-                    span {
-                        style: "font-size: 12px; color: var(--text-secondary); margin-bottom: 4px; font-family: monospace;",
-                        "{file_path}"
-                        if line_number > 0 {
-                            ":{line_number}"
-                        }
-                    }
-                }
-                CopyButton { value: code.clone(), small: true }
-            }
+        div {
+            if !file_path.is_empty() {
+                div {
+                    style: "font-size: 12px; color: var(--text-secondary); font-family: monospace;",
+                    "{file_path}"
+                    if line_number > 0 {
+                        ":{line_number}"
+                    }
+                }
+            }
             pre { class: "code-block", "{code}" }
         }
     }
@@ -1,49 +0,0 @@
-use dioxus::prelude::*;
-use dioxus_free_icons::icons::bs_icons::*;
-use dioxus_free_icons::Icon;
-
-/// A small copy-to-clipboard button that shows a checkmark after copying.
-///
-/// Usage: `CopyButton { value: "text to copy" }`
-#[component]
-pub fn CopyButton(value: String, #[props(default = false)] small: bool) -> Element {
-    let mut copied = use_signal(|| false);
-
-    let size = if small { 12 } else { 14 };
-    let class = if small {
-        "copy-btn copy-btn-sm"
-    } else {
-        "copy-btn"
-    };
-
-    rsx! {
-        button {
-            class: class,
-            title: if copied() { "Copied!" } else { "Copy to clipboard" },
-            onclick: move |_| {
-                let val = value.clone();
-                // Escape for JS single-quoted string
-                let escaped = val
-                    .replace('\\', "\\\\")
-                    .replace('\'', "\\'")
-                    .replace('\n', "\\n")
-                    .replace('\r', "\\r");
-                let js = format!("navigator.clipboard.writeText('{escaped}')");
-                document::eval(&js);
-                copied.set(true);
-                spawn(async move {
-                    #[cfg(feature = "web")]
-                    gloo_timers::future::TimeoutFuture::new(2000).await;
-                    #[cfg(not(feature = "web"))]
-                    tokio::time::sleep(std::time::Duration::from_secs(2)).await;
-                    copied.set(false);
-                });
-            },
-            if copied() {
-                Icon { icon: BsCheckLg, width: size, height: size }
-            } else {
-                Icon { icon: BsClipboard, width: size, height: size }
-            }
-        }
-    }
-}
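The removed component evaluates a `navigator.clipboard.writeText('…')` snippet, so the copied value must first be escaped for a single-quoted JS string literal. A standalone sketch of that escaping (function names are illustrative, not from the codebase):

```rust
/// Escape a value for embedding in a single-quoted JavaScript string literal,
/// mirroring the replace chain in the removed CopyButton's onclick handler.
fn escape_for_js_single_quoted(val: &str) -> String {
    val.replace('\\', "\\\\") // backslashes first, so later escapes aren't doubled
        .replace('\'', "\\'")
        .replace('\n', "\\n")
        .replace('\r', "\\r")
}

/// Build the eval'd clipboard call the way the component did.
fn clipboard_js(val: &str) -> String {
    format!(
        "navigator.clipboard.writeText('{}')",
        escape_for_js_single_quoted(val)
    )
}
```

Ordering matters: escaping backslashes last would corrupt the `\'` and `\n` sequences produced by the earlier replacements.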
@@ -1,198 +0,0 @@
-use dioxus::prelude::*;
-use dioxus_free_icons::icons::bs_icons::*;
-use dioxus_free_icons::Icon;
-
-use crate::infrastructure::help_chat::{send_help_chat_message, HelpChatHistoryMessage};
-
-// ── Message model ────────────────────────────────────────────────────────────
-
-#[derive(Clone, Debug)]
-struct ChatMsg {
-    role: String,
-    content: String,
-}
-
-// ── Component ────────────────────────────────────────────────────────────────
-
-#[component]
-pub fn HelpChat() -> Element {
-    let mut is_open = use_signal(|| false);
-    let mut messages = use_signal(Vec::<ChatMsg>::new);
-    let mut input_text = use_signal(String::new);
-    let mut is_loading = use_signal(|| false);
-
-    // Send message handler
-    let on_send = move |_| {
-        let text = input_text().trim().to_string();
-        if text.is_empty() || is_loading() {
-            return;
-        }
-
-        // Push user message
-        messages.write().push(ChatMsg {
-            role: "user".into(),
-            content: text.clone(),
-        });
-        input_text.set(String::new());
-        is_loading.set(true);
-
-        // Build history for API call (exclude last user message, it goes as `message`)
-        let history: Vec<HelpChatHistoryMessage> = messages()
-            .iter()
-            .rev()
-            .skip(1) // skip the user message we just added
-            .rev()
-            .map(|m| HelpChatHistoryMessage {
-                role: m.role.clone(),
-                content: m.content.clone(),
-            })
-            .collect();
-
-        spawn(async move {
-            match send_help_chat_message(text, history).await {
-                Ok(resp) => {
-                    messages.write().push(ChatMsg {
-                        role: "assistant".into(),
-                        content: resp.data.message,
-                    });
-                }
-                Err(e) => {
-                    messages.write().push(ChatMsg {
-                        role: "assistant".into(),
-                        content: format!("Error: {e}"),
-                    });
-                }
-            }
-            is_loading.set(false);
-        });
-    };
-
-    // Key handler for Enter to send
-    let on_keydown = move |e: KeyboardEvent| {
-        if e.key() == Key::Enter && !e.modifiers().shift() {
-            e.prevent_default();
-            let text = input_text().trim().to_string();
-            if text.is_empty() || is_loading() {
-                return;
-            }
-            messages.write().push(ChatMsg {
-                role: "user".into(),
-                content: text.clone(),
-            });
-            input_text.set(String::new());
-            is_loading.set(true);
-
-            let history: Vec<HelpChatHistoryMessage> = messages()
-                .iter()
-                .rev()
-                .skip(1)
-                .rev()
-                .map(|m| HelpChatHistoryMessage {
-                    role: m.role.clone(),
-                    content: m.content.clone(),
-                })
-                .collect();
-
-            spawn(async move {
-                match send_help_chat_message(text, history).await {
-                    Ok(resp) => {
-                        messages.write().push(ChatMsg {
-                            role: "assistant".into(),
-                            content: resp.data.message,
-                        });
-                    }
-                    Err(e) => {
-                        messages.write().push(ChatMsg {
-                            role: "assistant".into(),
-                            content: format!("Error: {e}"),
-                        });
-                    }
-                }
-                is_loading.set(false);
-            });
-        }
-    };
-
-    rsx! {
-        // Floating toggle button
-        if !is_open() {
-            button {
-                class: "help-chat-toggle",
-                onclick: move |_| is_open.set(true),
-                title: "Help",
-                Icon { icon: BsQuestionCircle, width: 22, height: 22 }
-            }
-        }
-
-        // Chat panel
-        if is_open() {
-            div { class: "help-chat-panel",
-                // Header
-                div { class: "help-chat-header",
-                    span { class: "help-chat-title",
-                        Icon { icon: BsRobot, width: 16, height: 16 }
-                        "Help Assistant"
-                    }
-                    button {
-                        class: "help-chat-close",
-                        onclick: move |_| is_open.set(false),
-                        Icon { icon: BsX, width: 18, height: 18 }
-                    }
-                }
-
-                // Messages area
-                div { class: "help-chat-messages",
-                    if messages().is_empty() {
-                        div { class: "help-chat-empty",
-                            p { "Ask me anything about the Compliance Scanner." }
-                            p { class: "help-chat-hint",
-                                "e.g. \"How do I add a repository?\" or \"What is SBOM?\""
-                            }
-                        }
-                    }
-                    for (i, msg) in messages().iter().enumerate() {
-                        div {
-                            key: "{i}",
-                            class: if msg.role == "user" { "help-msg help-msg-user" } else { "help-msg help-msg-assistant" },
-                            div { class: "help-msg-content",
-                                dangerous_inner_html: if msg.role == "assistant" {
-                                    // Basic markdown rendering: bold, code, newlines
-                                    msg.content
-                                        .replace("**", "<strong>")
-                                        .replace("\n\n", "<br><br>")
-                                        .replace("\n- ", "<br>- ")
-                                        .replace("`", "<code>")
-                                } else {
-                                    msg.content.clone()
-                                }
-                            }
-                        }
-                    }
-                    if is_loading() {
-                        div { class: "help-msg help-msg-assistant",
-                            div { class: "help-msg-loading", "Thinking..." }
-                        }
-                    }
-                }
-
-                // Input area
-                div { class: "help-chat-input",
-                    input {
-                        r#type: "text",
-                        placeholder: "Ask a question...",
-                        value: "{input_text}",
-                        disabled: is_loading(),
-                        oninput: move |e| input_text.set(e.value()),
-                        onkeydown: on_keydown,
-                    }
-                    button {
-                        class: "help-chat-send",
-                        disabled: is_loading() || input_text().trim().is_empty(),
-                        onclick: on_send,
-                        Icon { icon: BsSend, width: 14, height: 14 }
-                    }
-                }
-            }
-        }
-    }
-}
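The deleted HelpChat renders assistant messages by blind string replacement, so every `**` becomes an opening `<strong>` and every backtick an opening `<code>`, with no closing tags. A sketch of the alternating open/close replacement a fuller markdown pass would need (helper name hypothetical, not from the codebase):

```rust
/// Replace pairs of a delimiter with alternating open/close tags,
/// e.g. "**bold**" -> "<strong>bold</strong>".
fn toggle_tag(input: &str, delim: &str, open: &str, close: &str) -> String {
    let mut out = String::new();
    for (i, part) in input.split(delim).enumerate() {
        if i > 0 {
            // Odd-numbered delimiters open the tag, even-numbered ones close it.
            out.push_str(if i % 2 == 1 { open } else { close });
        }
        out.push_str(part);
    }
    out
}
```

An unbalanced delimiter still leaves a dangling tag; a real renderer (or a markdown crate) would also need to escape the HTML in the message before substitution, since the result feeds `dangerous_inner_html`.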
@@ -2,10 +2,7 @@ pub mod app_shell;
 pub mod attack_chain;
 pub mod code_inspector;
 pub mod code_snippet;
-pub mod copy_button;
 pub mod file_tree;
-pub mod help_chat;
-pub mod notification_bell;
 pub mod page_header;
 pub mod pagination;
 pub mod pentest_wizard;
@@ -1,155 +0,0 @@
-use dioxus::prelude::*;
-use dioxus_free_icons::icons::bs_icons::*;
-use dioxus_free_icons::Icon;
-
-use crate::infrastructure::notifications::{
-    dismiss_notification, fetch_notification_count, fetch_notifications,
-    mark_all_notifications_read,
-};
-
-#[component]
-pub fn NotificationBell() -> Element {
-    let mut is_open = use_signal(|| false);
-    let mut count = use_signal(|| 0u64);
-    let mut notifications = use_signal(Vec::new);
-    let mut is_loading = use_signal(|| false);
-
-    // Poll notification count every 30 seconds
-    use_resource(move || async move {
-        loop {
-            if let Ok(c) = fetch_notification_count().await {
-                count.set(c);
-            }
-            #[cfg(feature = "web")]
-            {
-                gloo_timers::future::TimeoutFuture::new(30_000).await;
-            }
-            #[cfg(not(feature = "web"))]
-            {
-                tokio::time::sleep(std::time::Duration::from_secs(30)).await;
-            }
-        }
-    });
-
-    // Load notifications when panel opens
-    let load_notifications = move |_| {
-        is_open.set(!is_open());
-        if !is_open() {
-            return;
-        }
-        is_loading.set(true);
-        spawn(async move {
-            if let Ok(resp) = fetch_notifications().await {
-                notifications.set(resp.data);
-            }
-            // Mark all as read when panel opens
-            let _ = mark_all_notifications_read().await;
-            count.set(0);
-            is_loading.set(false);
-        });
-    };
-
-    let on_dismiss = move |id: String| {
-        spawn(async move {
-            let _ = dismiss_notification(id.clone()).await;
-            notifications.write().retain(|n| {
-                n.id.as_ref()
-                    .and_then(|v| v.get("$oid"))
-                    .and_then(|v| v.as_str())
-                    != Some(&id)
-            });
-        });
-    };
-
-    rsx! {
-        div { class: "notification-bell-wrapper",
-            // Bell button
-            button {
-                class: "notification-bell-btn",
-                onclick: load_notifications,
-                title: "CVE Alerts",
-                Icon { icon: BsBell, width: 18, height: 18 }
-                if count() > 0 {
-                    span { class: "notification-badge", "{count()}" }
-                }
-            }
-
-            // Dropdown panel
-            if is_open() {
-                div { class: "notification-panel",
-                    div { class: "notification-panel-header",
-                        span { "CVE Alerts" }
-                        button {
-                            class: "notification-close-btn",
-                            onclick: move |_| is_open.set(false),
-                            Icon { icon: BsX, width: 16, height: 16 }
-                        }
-                    }
-                    div { class: "notification-panel-body",
-                        if is_loading() {
-                            div { class: "notification-loading", "Loading..." }
-                        } else if notifications().is_empty() {
-                            div { class: "notification-empty",
-                                Icon { icon: BsShieldCheck, width: 32, height: 32 }
-                                p { "No CVE alerts" }
-                            }
-                        } else {
-                            for notif in notifications().iter() {
-                                {
-                                    let id = notif.id.as_ref()
-                                        .and_then(|v| v.get("$oid"))
-                                        .and_then(|v| v.as_str())
-                                        .unwrap_or("")
-                                        .to_string();
-                                    let sev_class = match notif.severity.as_str() {
-                                        "critical" => "sev-critical",
-                                        "high" => "sev-high",
-                                        "medium" => "sev-medium",
-                                        _ => "sev-low",
-                                    };
-                                    let dismiss_id = id.clone();
-                                    rsx! {
-                                        div { class: "notification-item",
-                                            div { class: "notification-item-header",
-                                                span { class: "notification-sev {sev_class}",
-                                                    "{notif.severity.to_uppercase()}"
-                                                }
-                                                span { class: "notification-cve-id",
-                                                    if let Some(ref url) = notif.url {
-                                                        a { href: "{url}", target: "_blank", "{notif.cve_id}" }
-                                                    } else {
-                                                        "{notif.cve_id}"
-                                                    }
-                                                }
-                                                if let Some(score) = notif.cvss_score {
-                                                    span { class: "notification-cvss", "CVSS {score:.1}" }
-                                                }
-                                                button {
-                                                    class: "notification-dismiss-btn",
-                                                    title: "Dismiss",
-                                                    onclick: move |_| on_dismiss(dismiss_id.clone()),
-                                                    Icon { icon: BsXCircle, width: 14, height: 14 }
-                                                }
-                                            }
-                                            div { class: "notification-item-pkg",
-                                                "{notif.package_name} {notif.package_version}"
-                                            }
-                                            div { class: "notification-item-repo",
-                                                "{notif.repo_name}"
-                                            }
-                                            if let Some(ref summary) = notif.summary {
-                                                div { class: "notification-item-summary",
-                                                    "{summary}"
-                                                }
-                                            }
-                                        }
-                                    }
-                                }
-                            }
-                        }
-                    }
-                }
-            }
-        }
-    }
-}
@@ -52,6 +52,11 @@ pub fn Sidebar() -> Element {
             route: Route::PentestDashboardPage {},
             icon: rsx! { Icon { icon: BsLightningCharge, width: 18, height: 18 } },
        },
+        NavItem {
+            label: "Settings",
+            route: Route::SettingsPage {},
+            icon: rsx! { Icon { icon: BsGear, width: 18, height: 18 } },
+        },
     ];

     let docs_url = option_env!("DOCS_URL").unwrap_or("/docs");
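The sidebar resolves its docs link with `option_env!`, which is evaluated at compile time, not at run time: if `DOCS_URL` was unset while building the dashboard, the binary permanently falls back to `/docs`. A minimal sketch of the same pattern:

```rust
/// Compile-time environment lookup with a fallback, as the sidebar does for
/// its docs link. DOCS_URL is the build-time variable the dashboard uses.
fn docs_url() -> &'static str {
    option_env!("DOCS_URL").unwrap_or("/docs")
}
```

Unlike `std::env::var`, changing the variable after the build has no effect; redeploys that need a different URL must recompile.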
@@ -1,59 +0,0 @@
-use dioxus::prelude::*;
-use serde::{Deserialize, Serialize};
-
-// ── Response types ──
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct HelpChatApiResponse {
-    pub data: HelpChatResponseData,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct HelpChatResponseData {
-    pub message: String,
-}
-
-// ── History message type ──
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct HelpChatHistoryMessage {
-    pub role: String,
-    pub content: String,
-}
-
-// ── Server function ──
-
-#[server]
-pub async fn send_help_chat_message(
-    message: String,
-    history: Vec<HelpChatHistoryMessage>,
-) -> Result<HelpChatApiResponse, ServerFnError> {
-    let state: super::server_state::ServerState =
-        dioxus_fullstack::FullstackContext::extract().await?;
-
-    let url = format!("{}/api/v1/help/chat", state.agent_api_url);
-    let client = reqwest::Client::builder()
-        .timeout(std::time::Duration::from_secs(120))
-        .build()
-        .map_err(|e| ServerFnError::new(e.to_string()))?;
-
-    let resp = client
-        .post(&url)
-        .json(&serde_json::json!({
-            "message": message,
-            "history": history,
-        }))
-        .send()
-        .await
-        .map_err(|e| ServerFnError::new(format!("Help chat request failed: {e}")))?;
-
-    let text = resp
-        .text()
-        .await
-        .map_err(|e| ServerFnError::new(format!("Failed to read response: {e}")))?;
-
-    let body: HelpChatApiResponse = serde_json::from_str(&text)
-        .map_err(|e| ServerFnError::new(format!("Failed to parse response: {e}")))?;
-
-    Ok(body)
-}
@@ -5,10 +5,8 @@ pub mod chat;
 pub mod dast;
 pub mod findings;
 pub mod graph;
-pub mod help_chat;
 pub mod issues;
 pub mod mcp;
-pub mod notifications;
 pub mod pentest;
 #[allow(clippy::too_many_arguments)]
 pub mod repositories;
@@ -1,91 +0,0 @@
-use dioxus::prelude::*;
-use serde::{Deserialize, Serialize};
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct NotificationListResponse {
-    pub data: Vec<CveNotificationData>,
-    #[serde(default)]
-    pub total: Option<u64>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct CveNotificationData {
-    #[serde(rename = "_id")]
-    pub id: Option<serde_json::Value>,
-    pub cve_id: String,
-    pub repo_name: String,
-    pub package_name: String,
-    pub package_version: String,
-    pub severity: String,
-    pub cvss_score: Option<f64>,
-    pub summary: Option<String>,
-    pub url: Option<String>,
-    pub status: String,
-    #[serde(default)]
-    pub created_at: Option<serde_json::Value>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct NotificationCountResponse {
-    pub count: u64,
-}
-
-#[server]
-pub async fn fetch_notification_count() -> Result<u64, ServerFnError> {
-    let state: super::server_state::ServerState =
-        dioxus_fullstack::FullstackContext::extract().await?;
-
-    let url = format!("{}/api/v1/notifications/count", state.agent_api_url);
-    let resp = reqwest::get(&url)
-        .await
-        .map_err(|e| ServerFnError::new(e.to_string()))?;
-    let body: NotificationCountResponse = resp
-        .json()
-        .await
-        .map_err(|e| ServerFnError::new(e.to_string()))?;
-    Ok(body.count)
-}
-
-#[server]
-pub async fn fetch_notifications() -> Result<NotificationListResponse, ServerFnError> {
-    let state: super::server_state::ServerState =
-        dioxus_fullstack::FullstackContext::extract().await?;
-
-    let url = format!("{}/api/v1/notifications?limit=20", state.agent_api_url);
-    let resp = reqwest::get(&url)
-        .await
-        .map_err(|e| ServerFnError::new(e.to_string()))?;
-    let body: NotificationListResponse = resp
-        .json()
-        .await
-        .map_err(|e| ServerFnError::new(e.to_string()))?;
-    Ok(body)
-}
-
-#[server]
-pub async fn mark_all_notifications_read() -> Result<(), ServerFnError> {
-    let state: super::server_state::ServerState =
-        dioxus_fullstack::FullstackContext::extract().await?;
-
-    let url = format!("{}/api/v1/notifications/read-all", state.agent_api_url);
-    reqwest::Client::new()
-        .post(&url)
-        .send()
-        .await
-        .map_err(|e| ServerFnError::new(e.to_string()))?;
-    Ok(())
-}
-
-#[server]
-pub async fn dismiss_notification(id: String) -> Result<(), ServerFnError> {
-    let state: super::server_state::ServerState =
-        dioxus_fullstack::FullstackContext::extract().await?;
-
-    let url = format!("{}/api/v1/notifications/{id}/dismiss", state.agent_api_url);
-    reqwest::Client::new()
-        .patch(&url)
-        .send()
-        .await
-        .map_err(|e| ServerFnError::new(e.to_string()))?;
-    Ok(())
-}
@@ -259,10 +259,7 @@ pub fn McpServersPage() -> Element {
 div { class: "mcp-detail-row",
     Icon { icon: BsGlobe, width: 13, height: 13 }
     span { class: "mcp-detail-label", "Endpoint" }
-    div { class: "copyable",
-        code { class: "mcp-detail-value", "{server.endpoint_url}" }
-        crate::components::copy_button::CopyButton { value: server.endpoint_url.clone(), small: true }
-    }
+    code { class: "mcp-detail-value", "{server.endpoint_url}" }
 }
 div { class: "mcp-detail-row",
     Icon { icon: BsHddNetwork, width: 13, height: 13 }
@@ -16,6 +16,7 @@ pub mod pentest_dashboard;
 pub mod pentest_session;
 pub mod repositories;
 pub mod sbom;
+pub mod settings;

 pub use chat::ChatPage;
 pub use chat_index::ChatIndexPage;
@@ -35,3 +36,4 @@ pub use pentest_dashboard::PentestDashboardPage;
 pub use pentest_session::PentestSessionPage;
 pub use repositories::RepositoriesPage;
 pub use sbom::SbomPage;
+pub use settings::SettingsPage;
@@ -137,18 +137,11 @@ pub fn RepositoriesPage() -> Element {
     "For SSH URLs: add this deploy key (read-only) to your repository"
 }
 div {
-    class: "copyable",
-    style: "margin-top: 4px; padding: 8px; background: var(--bg-secondary); border-radius: 4px;",
-    code {
-        style: "font-size: 11px; word-break: break-all; user-select: all;",
-        if ssh_public_key().is_empty() {
-            "Loading..."
-        } else {
-            "{ssh_public_key}"
-        }
-    }
-    if !ssh_public_key().is_empty() {
-        crate::components::copy_button::CopyButton { value: ssh_public_key(), small: true }
-    }
+    style: "margin-top: 4px; padding: 8px; background: var(--bg-secondary); border-radius: 4px; font-family: monospace; font-size: 11px; word-break: break-all; user-select: all;",
+    if ssh_public_key().is_empty() {
+        "Loading..."
+    } else {
+        "{ssh_public_key}"
+    }
 }
 }
@@ -397,37 +390,28 @@ pub fn RepositoriesPage() -> Element {
 }
 div { class: "form-group",
     label { "Webhook URL" }
-    {
-        #[cfg(feature = "web")]
-        let origin = web_sys::window()
-            .and_then(|w: web_sys::Window| w.location().origin().ok())
-            .unwrap_or_default();
-        #[cfg(not(feature = "web"))]
-        let origin = String::new();
-        let webhook_url = format!("{origin}/webhook/{}/{eid}", edit_webhook_tracker());
-        rsx! {
-            div { class: "copyable",
-                input {
-                    r#type: "text",
-                    readonly: true,
-                    style: "font-family: monospace; font-size: 12px; flex: 1;",
-                    value: "{webhook_url}",
-                }
-                crate::components::copy_button::CopyButton { value: webhook_url.clone() }
-            }
-        }
-    }
+    input {
+        r#type: "text",
+        readonly: true,
+        style: "font-family: monospace; font-size: 12px;",
+        value: {
+            #[cfg(feature = "web")]
+            let origin = web_sys::window()
+                .and_then(|w: web_sys::Window| w.location().origin().ok())
+                .unwrap_or_default();
+            #[cfg(not(feature = "web"))]
+            let origin = String::new();
+            format!("{origin}/webhook/{}/{eid}", edit_webhook_tracker())
+        },
+    }
 }
 div { class: "form-group",
     label { "Webhook Secret" }
-    div { class: "copyable",
-        input {
-            r#type: "text",
-            readonly: true,
-            style: "font-family: monospace; font-size: 12px; flex: 1;",
-            value: "{secret}",
-        }
-        crate::components::copy_button::CopyButton { value: secret.clone() }
-    }
+    input {
+        r#type: "text",
+        readonly: true,
+        style: "font-family: monospace; font-size: 12px;",
+        value: "{secret}",
+    }
 }
 }
compliance-dashboard/src/pages/settings.rs (new file, 142 lines)
@@ -0,0 +1,142 @@
+use dioxus::prelude::*;
+
+use crate::components::page_header::PageHeader;
+
+#[component]
+pub fn SettingsPage() -> Element {
+    let mut litellm_url = use_signal(|| "http://localhost:4000".to_string());
+    let mut litellm_model = use_signal(|| "gpt-4o".to_string());
+    let mut github_token = use_signal(String::new);
+    let mut gitlab_url = use_signal(|| "https://gitlab.com".to_string());
+    let mut gitlab_token = use_signal(String::new);
+    let mut jira_url = use_signal(String::new);
+    let mut jira_email = use_signal(String::new);
+    let mut jira_token = use_signal(String::new);
+    let mut jira_project = use_signal(String::new);
+    let mut searxng_url = use_signal(|| "http://localhost:8888".to_string());
+
+    rsx! {
+        PageHeader {
+            title: "Settings",
+            description: "Configure integrations and scanning parameters",
+        }
+
+        div { class: "card",
+            div { class: "card-header", "LiteLLM Configuration" }
+            div { class: "form-group",
+                label { "LiteLLM URL" }
+                input {
+                    r#type: "text",
+                    value: "{litellm_url}",
+                    oninput: move |e| litellm_url.set(e.value()),
+                }
+            }
+            div { class: "form-group",
+                label { "Model" }
+                input {
+                    r#type: "text",
+                    value: "{litellm_model}",
+                    oninput: move |e| litellm_model.set(e.value()),
+                }
+            }
+        }
+
+        div { class: "card",
+            div { class: "card-header", "GitHub Integration" }
+            div { class: "form-group",
+                label { "Personal Access Token" }
+                input {
+                    r#type: "password",
+                    placeholder: "ghp_...",
+                    value: "{github_token}",
+                    oninput: move |e| github_token.set(e.value()),
+                }
+            }
+        }
+
+        div { class: "card",
+            div { class: "card-header", "GitLab Integration" }
+            div { class: "form-group",
+                label { "GitLab URL" }
+                input {
+                    r#type: "text",
+                    value: "{gitlab_url}",
+                    oninput: move |e| gitlab_url.set(e.value()),
+                }
+            }
+            div { class: "form-group",
+                label { "Access Token" }
+                input {
+                    r#type: "password",
+                    placeholder: "glpat-...",
+                    value: "{gitlab_token}",
+                    oninput: move |e| gitlab_token.set(e.value()),
+                }
+            }
+        }
+
+        div { class: "card",
+            div { class: "card-header", "Jira Integration" }
+            div { class: "form-group",
+                label { "Jira URL" }
+                input {
+                    r#type: "text",
+                    placeholder: "https://your-org.atlassian.net",
+                    value: "{jira_url}",
+                    oninput: move |e| jira_url.set(e.value()),
+                }
+            }
+            div { class: "form-group",
+                label { "Email" }
+                input {
+                    r#type: "email",
+                    value: "{jira_email}",
+                    oninput: move |e| jira_email.set(e.value()),
+                }
+            }
+            div { class: "form-group",
+                label { "API Token" }
+                input {
+                    r#type: "password",
+                    value: "{jira_token}",
+                    oninput: move |e| jira_token.set(e.value()),
+                }
+            }
+            div { class: "form-group",
+                label { "Project Key" }
+                input {
+                    r#type: "text",
+                    placeholder: "SEC",
+                    value: "{jira_project}",
+                    oninput: move |e| jira_project.set(e.value()),
+                }
+            }
+        }
+
+        div { class: "card",
+            div { class: "card-header", "SearXNG" }
+            div { class: "form-group",
+                label { "SearXNG URL" }
+                input {
+                    r#type: "text",
+                    value: "{searxng_url}",
+                    oninput: move |e| searxng_url.set(e.value()),
+                }
+            }
+        }
+
+        div { style: "margin-top: 16px;",
+            button {
+                class: "btn btn-primary",
+                onclick: move |_| {
+                    tracing::info!("Settings save not yet implemented - settings are managed via .env");
+                },
+                "Save Settings"
+            }
+            p {
+                style: "margin-top: 8px; font-size: 12px; color: var(--text-secondary);",
+                "Note: Settings are currently configured via environment variables (.env file). Dashboard-based settings persistence coming soon."
+            }
+        }
+    }
+}
@@ -15,30 +15,6 @@ use crate::parsers::registry::ParserRegistry;
 use super::community::detect_communities;
 use super::impact::ImpactAnalyzer;
-
-/// Walk up the qualified-name hierarchy to find the closest ancestor
-/// that exists in the node map.
-///
-/// For `"src/main.rs::config::load"` this tries:
-/// 1. `"src/main.rs::config"` (trim last `::` segment)
-/// 2. `"src/main.rs"` (trim again)
-///
-/// Returns the first match found, or `None` if the node is a root.
-fn find_parent_qname(qname: &str, node_map: &HashMap<String, NodeIndex>) -> Option<String> {
-    let mut current = qname.to_string();
-    loop {
-        // Try stripping the last "::" segment
-        if let Some(pos) = current.rfind("::") {
-            current.truncate(pos);
-            if node_map.contains_key(&current) {
-                return Some(current);
-            }
-            continue;
-        }
-        // No more "::" — this is a top-level node (file), no parent
-        return None;
-    }
-}
-
 /// The main graph engine that builds and manages code knowledge graphs
 pub struct GraphEngine {
     parser_registry: ParserRegistry,
@@ -113,12 +89,7 @@ impl GraphEngine {
         Ok((code_graph, build_run))
     }

-    /// Build petgraph from parsed output, resolving edges to node indices.
-    ///
-    /// After resolving the explicit edges from parsers, we synthesise
-    /// `Contains` edges so that every node is reachable from its parent
-    /// file or module. This eliminates disconnected "islands" that
-    /// otherwise appear when files share no direct call/import edges.
+    /// Build petgraph from parsed output, resolving edges to node indices
     fn build_petgraph(&self, parse_output: ParseOutput) -> Result<CodeGraph, CoreError> {
         let mut graph = DiGraph::new();
         let mut node_map: HashMap<String, NodeIndex> = HashMap::new();
@@ -131,13 +102,15 @@ impl GraphEngine {
             node_map.insert(node.qualified_name.clone(), idx);
         }

-        // Resolve and add explicit edges from parsers
+        // Resolve and add edges — rewrite target to the resolved qualified name
+        // so the persisted edge references match node qualified_names.
         let mut resolved_edges = Vec::new();
         for mut edge in parse_output.edges {
             let source_idx = node_map.get(&edge.source);
             let resolved = self.resolve_edge_target(&edge.target, &node_map);

             if let (Some(&src), Some(tgt)) = (source_idx, resolved) {
+                // Update target to the resolved qualified name
                 let resolved_name = node_map
                     .iter()
                     .find(|(_, &idx)| idx == tgt)
@@ -148,48 +121,7 @@ impl GraphEngine {
                 graph.add_edge(src, tgt, edge.kind.clone());
                 resolved_edges.push(edge);
             }
-        }
-
-        // Synthesise Contains edges: connect each node to its closest
-        // parent in the qualified-name hierarchy.
-        //
-        // For "src/main.rs::config::load", the parent chain is:
-        //   "src/main.rs::config" → "src/main.rs"
-        //
-        // We walk up the qualified name (splitting on "::") and link to
-        // the first ancestor that exists in the node map.
-        let repo_id = nodes.first().map(|n| n.repo_id.as_str()).unwrap_or("");
-        let build_id = nodes
-            .first()
-            .map(|n| n.graph_build_id.as_str())
-            .unwrap_or("");
-
-        let qualified_names: Vec<String> = nodes.iter().map(|n| n.qualified_name.clone()).collect();
-        let file_paths: HashMap<String, String> = nodes
-            .iter()
-            .map(|n| (n.qualified_name.clone(), n.file_path.clone()))
-            .collect();
-
-        for qname in &qualified_names {
-            if let Some(parent_qname) = find_parent_qname(qname, &node_map) {
-                let child_idx = node_map[qname];
-                let parent_idx = node_map[&parent_qname];
-
-                // Avoid duplicate edges
-                if !graph.contains_edge(parent_idx, child_idx) {
-                    graph.add_edge(parent_idx, child_idx, CodeEdgeKind::Contains);
-                    resolved_edges.push(CodeEdge {
-                        id: None,
-                        repo_id: repo_id.to_string(),
-                        graph_build_id: build_id.to_string(),
-                        source: parent_qname,
-                        target: qname.clone(),
-                        kind: CodeEdgeKind::Contains,
-                        file_path: file_paths.get(qname).cloned().unwrap_or_default(),
-                        line_number: None,
-                    });
-                }
-            }
-        }
+            // Skip unresolved edges (cross-file, external deps) — conservative approach
         }

         Ok(CodeGraph {
@@ -200,62 +132,33 @@ impl GraphEngine {
         })
     }

-    /// Try to resolve an edge target to a known node.
-    ///
-    /// Resolution strategies (in order):
-    /// 1. Direct qualified-name match
-    /// 2. Suffix match: "foo" matches "src/main.rs::mod::foo"
-    /// 3. Module-path match: "config::load" matches "src/config.rs::load"
-    /// 4. Self-method: "self.method" matches "::method"
+    /// Try to resolve an edge target to a known node
     fn resolve_edge_target(
         &self,
         target: &str,
         node_map: &HashMap<String, NodeIndex>,
     ) -> Option<NodeIndex> {
-        // 1. Direct match
+        // Direct match
         if let Some(idx) = node_map.get(target) {
             return Some(*idx);
         }

-        // 2. Suffix match: "foo" → "path/file.rs::foo"
-        let suffix_pattern = format!("::{target}");
-        let dot_pattern = format!(".{target}");
+        // Try matching just the function/type name (intra-file resolution)
         for (qualified, idx) in node_map {
-            if qualified.ends_with(&suffix_pattern) || qualified.ends_with(&dot_pattern) {
+            // Match "foo" to "path/file.rs::foo" or "path/file.rs::Type::foo"
+            if qualified.ends_with(&format!("::{target}"))
+                || qualified.ends_with(&format!(".{target}"))
+            {
                 return Some(*idx);
             }
         }

-        // 3. Module-path match: "config::load" → try matching the last N
-        // segments of the target against node qualified names.
-        // This handles cross-file calls like `crate::config::load` or
-        // `super::handlers::process` where the prefix differs.
-        if target.contains("::") {
-            // Strip common Rust path prefixes
-            let stripped = target
-                .strip_prefix("crate::")
-                .or_else(|| target.strip_prefix("super::"))
-                .or_else(|| target.strip_prefix("self::"))
-                .unwrap_or(target);
-
-            let segments: Vec<&str> = stripped.split("::").collect();
-            // Try matching progressively shorter suffixes
-            for start in 0..segments.len() {
-                let suffix = segments[start..].join("::");
-                let pattern = format!("::{suffix}");
-                for (qualified, idx) in node_map {
-                    if qualified.ends_with(&pattern) {
-                        return Some(*idx);
-                    }
-                }
-            }
-        }
-
-        // 4. Self-method: "self.method" → "::method"
+        // Try matching method calls like "self.method" -> look for "::method"
         if let Some(method_name) = target.strip_prefix("self.") {
-            let pattern = format!("::{method_name}");
             for (qualified, idx) in node_map {
-                if qualified.ends_with(&pattern) {
+                if qualified.ends_with(&format!("::{method_name}"))
+                    || qualified.ends_with(&format!(".{method_name}"))
+                {
                     return Some(*idx);
                 }
             }
@@ -450,83 +353,4 @@ mod tests {
         assert!(code_graph.node_map.contains_key("a::c"));
         assert!(code_graph.node_map.contains_key("a::d"));
     }
-
-    #[test]
-    fn test_contains_edges_synthesised() {
-        let engine = GraphEngine::new(1000);
-        let mut output = ParseOutput::default();
-        // File → Module → Function hierarchy
-        output.nodes.push(make_node("src/main.rs"));
-        output.nodes.push(make_node("src/main.rs::config"));
-        output.nodes.push(make_node("src/main.rs::config::load"));
-
-        let code_graph = engine.build_petgraph(output).unwrap();
-
-        // Should have 2 Contains edges:
-        //   src/main.rs → src/main.rs::config
-        //   src/main.rs::config → src/main.rs::config::load
-        let contains_edges: Vec<_> = code_graph
-            .edges
-            .iter()
-            .filter(|e| matches!(e.kind, CodeEdgeKind::Contains))
-            .collect();
-        assert_eq!(contains_edges.len(), 2, "expected 2 Contains edges");
-
-        let sources: Vec<&str> = contains_edges.iter().map(|e| e.source.as_str()).collect();
-        assert!(sources.contains(&"src/main.rs"));
-        assert!(sources.contains(&"src/main.rs::config"));
-    }
-
-    #[test]
-    fn test_contains_edges_no_duplicates_with_existing_edges() {
-        let engine = GraphEngine::new(1000);
-        let mut output = ParseOutput::default();
-        output.nodes.push(make_node("src/main.rs"));
-        output.nodes.push(make_node("src/main.rs::foo"));
-
-        // Explicit Calls edge (foo calls itself? just for testing)
-        output.edges.push(CodeEdge {
-            id: None,
-            repo_id: "test".to_string(),
-            graph_build_id: "build1".to_string(),
-            source: "src/main.rs::foo".to_string(),
-            target: "src/main.rs::foo".to_string(),
-            kind: CodeEdgeKind::Calls,
-            file_path: "src/main.rs".to_string(),
-            line_number: Some(1),
-        });
-
-        let code_graph = engine.build_petgraph(output).unwrap();
-
-        // 1 Calls + 1 Contains = 2 edges total
-        assert_eq!(code_graph.edges.len(), 2);
-    }
-
-    #[test]
-    fn test_cross_file_resolution_with_module_path() {
-        let engine = GraphEngine::new(1000);
-        let node_map = build_test_node_map(&["src/config.rs::load_config", "src/main.rs::main"]);
-        // "crate::config::load_config" should resolve to "src/config.rs::load_config"
-        let result = engine.resolve_edge_target("crate::config::load_config", &node_map);
-        assert!(result.is_some(), "cross-file crate:: path should resolve");
-    }
-
-    #[test]
-    fn test_find_parent_qname() {
-        let node_map = build_test_node_map(&[
-            "src/main.rs",
-            "src/main.rs::config",
-            "src/main.rs::config::load",
-        ]);
-
-        assert_eq!(
-            find_parent_qname("src/main.rs::config::load", &node_map),
-            Some("src/main.rs::config".to_string())
-        );
-        assert_eq!(
-            find_parent_qname("src/main.rs::config", &node_map),
-            Some("src/main.rs".to_string())
-        );
-        assert_eq!(find_parent_qname("src/main.rs", &node_map), None);
-    }
 }
@@ -1,61 +0,0 @@
# Finding Deduplication

The Compliance Scanner automatically deduplicates findings across all scanning surfaces to prevent noise and duplicate issues.

## SAST Finding Dedup

Static analysis findings are deduplicated using SHA-256 fingerprints computed from:

- Repository ID
- Scanner rule ID (e.g., Semgrep check ID)
- File path
- Line number

Before inserting a new finding, the pipeline checks if a finding with the same fingerprint already exists. If it does, the finding is skipped.
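The scheme above can be sketched as follows. This is a minimal illustration, not the scanner's implementation: the struct and its field names are hypothetical, and the standard library's `DefaultHasher` stands in for SHA-256, which lives in an external crate.

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashSet;
use std::hash::{Hash, Hasher};

/// Hypothetical SAST finding, reduced to the four fields that feed the fingerprint.
struct SastFinding {
    repo_id: String,
    rule_id: String,
    file_path: String,
    line: u32,
}

/// Compute a stable fingerprint over the identifying fields.
/// (The real pipeline uses SHA-256; DefaultHasher is a stand-in.)
fn fingerprint(f: &SastFinding) -> u64 {
    let mut h = DefaultHasher::new();
    f.repo_id.hash(&mut h);
    f.rule_id.hash(&mut h);
    f.file_path.hash(&mut h);
    f.line.hash(&mut h);
    h.finish()
}

/// Insert-time dedup: keep a finding only if its fingerprint is new.
fn dedup(findings: Vec<SastFinding>) -> Vec<SastFinding> {
    let mut seen = HashSet::new();
    findings
        .into_iter()
        .filter(|f| seen.insert(fingerprint(f)))
        .collect()
}
```

Two findings from the same rule at the same location collapse to one; changing any of the four fields yields a distinct fingerprint.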

## DAST / Pentest Finding Dedup

Dynamic testing findings go through two-phase deduplication:

### Phase 1: Exact Dedup

Findings with the same canonicalized title, endpoint, and HTTP method are merged. Evidence from duplicate findings is combined into a single finding, keeping the highest severity.

**Title canonicalization** handles common variations:

- Domain names and URLs are stripped from titles (e.g., "Missing HSTS header for example.com" becomes "Missing HSTS header")
- Known synonyms are resolved (e.g., "HSTS" maps to "strict-transport-security", "CSP" maps to "content-security-policy")
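A rough sketch of that canonicalization step, under stated assumptions: the scanner's actual stripping and synonym rules are richer than this; here domain/URL removal is approximated by dropping dotted or `http`-prefixed tokens, and the synonym table holds only the two mappings the text names.

```rust
/// Canonicalize a finding title for comparison: drop URL/domain-like
/// tokens, lowercase, and resolve known synonyms. Illustrative only.
fn canonicalize_title(title: &str) -> String {
    title
        .split_whitespace()
        // Drop tokens that look like domains or URLs ("example.com", "https://...").
        .filter(|tok| !tok.contains('.') && !tok.starts_with("http"))
        // Resolve the synonyms mentioned in the docs.
        .map(|tok| match tok.to_ascii_lowercase().as_str() {
            "hsts" => "strict-transport-security".to_string(),
            "csp" => "content-security-policy".to_string(),
            other => other.to_string(),
        })
        .collect::<Vec<_>>()
        .join(" ")
}
```

With this, "Missing HSTS header for example.com" and "Missing hsts header for other.org" canonicalize to the same string and get merged in Phase 1.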

### Phase 2: CWE-Based Dedup

After exact dedup, findings with the same CWE and endpoint are merged. This catches cases where different tools report the same underlying issue with different titles or vulnerability types (e.g., a missing HSTS header reported as both `security_header_missing` and `tls_misconfiguration`).

The primary finding is selected by highest severity, then most evidence, then longest description. Evidence from merged findings is preserved.
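The primary-selection rule (highest severity, then most evidence, then longest description) maps directly onto a lexicographic tuple comparison; a sketch with hypothetical field names:

```rust
/// Hypothetical DAST finding, reduced to the fields used for primary selection.
struct Finding {
    severity: u8, // higher = more severe
    evidence: Vec<String>,
    description: String,
}

/// Pick the primary finding from a merge group: highest severity wins,
/// ties broken by evidence count, then by description length.
fn pick_primary(group: &[Finding]) -> Option<&Finding> {
    group
        .iter()
        .max_by_key(|f| (f.severity, f.evidence.len(), f.description.len()))
}
```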

### When Dedup Applies

- **At insertion time**: During a pentest session, before each finding is stored in MongoDB
- **At report export**: When generating a pentest report, all session findings are deduplicated before rendering

## PR Review Comment Dedup

PR review comments are deduplicated to prevent posting the same finding multiple times:

- Each comment includes a fingerprint computed from the repository, PR number, file path, line, and finding title
- Within a single review run, duplicate findings are skipped
- The fingerprint is embedded as an HTML comment in the review body for future cross-run dedup

## Issue Tracker Dedup

Before creating an issue in GitHub, GitLab, Jira, or Gitea, the scanner:

1. Searches for an existing issue matching the finding's fingerprint
2. Falls back to searching by issue title
3. Skips creation if a match is found
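The lookup order above can be sketched as a small helper. The two search closures are hypothetical stand-ins for the tracker-specific API calls, which differ per backend:

```rust
/// Decide whether a new issue should be created, mirroring the order
/// described above: fingerprint match first, then title match, and
/// only create when neither search finds an existing issue.
fn should_create_issue<F, T>(
    fingerprint: &str,
    title: &str,
    search_by_fingerprint: F, // stand-in for a tracker API call
    search_by_title: T,       // stand-in for a tracker API call
) -> bool
where
    F: Fn(&str) -> Option<u64>, // returns an existing issue id, if any
    T: Fn(&str) -> Option<u64>,
{
    search_by_fingerprint(fingerprint)
        .or_else(|| search_by_title(title))
        .is_none()
}
```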

## Code Review Dedup

Multi-pass LLM code reviews (logic, security, convention, complexity) are deduplicated across passes using proximity-aware keys:

- Findings within 3 lines of each other in the same file with similar normalized titles are considered duplicates
- The finding with the highest severity is kept
- CWE information is merged from duplicates
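A sketch of the proximity-aware comparison, assuming a review finding carries a file path, line, and title (names hypothetical) and approximating "similar" titles with whitespace-normalized, lowercase equality:

```rust
/// Hypothetical cross-pass review finding.
struct ReviewFinding {
    file: String,
    line: i64,
    title: String,
}

/// Normalize a title for comparison: lowercase, collapse whitespace.
fn norm(title: &str) -> String {
    title.to_lowercase().split_whitespace().collect::<Vec<_>>().join(" ")
}

/// Proximity-aware duplicate test: same file, similar normalized title,
/// and within 3 lines of each other.
fn is_duplicate(a: &ReviewFinding, b: &ReviewFinding) -> bool {
    a.file == b.file && norm(&a.title) == norm(&b.title) && (a.line - b.line).abs() <= 3
}
```

Because the key is proximity-based rather than exact-line, two passes flagging the same problem a couple of lines apart still collapse to one finding.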
@@ -1,60 +0,0 @@
# Help Chat Assistant

The Help Chat is a floating assistant available on every page of the dashboard. It answers questions about the Compliance Scanner using the project documentation as its knowledge base.

## How It Works

1. Click the **?** button in the bottom-right corner of any page
2. Type your question and press Enter
3. The assistant responds with answers grounded in the project documentation

The chat supports multi-turn conversations -- you can ask follow-up questions and the assistant will remember the context of your conversation.

## What You Can Ask

- **Getting started**: "How do I add a repository?" / "How do I trigger a scan?"
- **Features**: "What is SBOM?" / "How does the code knowledge graph work?"
- **Configuration**: "How do I set up webhooks?" / "What environment variables are needed?"
- **Scanning**: "What does the scan pipeline do?" / "How does LLM triage work?"
- **DAST & Pentesting**: "How do I run a pentest?" / "What DAST tools are available?"
- **Integrations**: "How do I connect to GitHub?" / "What is MCP?"

## Technical Details

The help chat loads all project documentation (README, guides, feature docs, reference) at startup and caches it in memory. When you ask a question, it sends your message along with the full documentation context to the LLM via LiteLLM, which generates a grounded response.
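The request assembly described above can be sketched roughly as below. The role/content pairing mirrors the API payload shown under "API Endpoint"; the function and type names are hypothetical, not the server's actual code:

```rust
/// One chat message in the LiteLLM-style (role, content) shape.
struct Msg {
    role: String,
    content: String,
}

/// Build the message list sent to the LLM: a system prompt grounding
/// the model in the cached docs, the prior turns, then the new message.
fn build_messages(docs_context: &str, history: &[(String, String)], user_msg: &str) -> Vec<Msg> {
    let mut msgs = vec![Msg {
        role: "system".into(),
        content: format!(
            "Answer questions about the Compliance Scanner using this documentation:\n{docs_context}"
        ),
    }];
    for (role, content) in history {
        msgs.push(Msg { role: role.clone(), content: content.clone() });
    }
    msgs.push(Msg { role: "user".into(), content: user_msg.into() });
    msgs
}
```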

### API Endpoint

```
POST /api/v1/help/chat
Content-Type: application/json

{
  "message": "How do I add a repository?",
  "history": [
    { "role": "user", "content": "previous question" },
    { "role": "assistant", "content": "previous answer" }
  ]
}
```

### Configuration

The help chat uses the same LiteLLM configuration as other LLM features:

| Environment Variable | Description | Default |
|---------------------|-------------|---------|
| `LITELLM_URL` | LiteLLM API base URL | `http://localhost:4000` |
| `LITELLM_MODEL` | Model for chat responses | `gpt-4o` |
| `LITELLM_API_KEY` | API key (optional) | -- |

### Documentation Sources

The assistant indexes the following documentation at startup:

- `README.md` -- Project overview and quick start
- `docs/guide/` -- Getting started, repositories, findings, SBOM, scanning, issues, webhooks
- `docs/features/` -- AI Chat, DAST, Code Graph, MCP Server, Pentesting, Help Chat
- `docs/reference/` -- Glossary, tools reference

If documentation files are not found at startup (e.g., in a minimal Docker deployment), the assistant falls back to general knowledge about the project.
@@ -1,6 +1,8 @@
 # Dashboard Overview

-The Overview page is the landing page of the Compliance Scanner. It gives you a high-level view of your security posture across all tracked repositories.
+The Overview page is the landing page of Certifai. It gives you a high-level view of your security posture across all tracked repositories.

+
+
 ## Stats Cards

@@ -32,10 +34,6 @@ The overview includes quick-access cards for the AI Chat feature. Each card repr

 If you have MCP servers registered, they appear on the overview page with their status and connection details. This lets you quickly check that your MCP integrations are running. See [MCP Integration](/features/mcp-server) for details.

-## Help Chat Assistant
-
-A floating help chat button is available in the bottom-right corner of every page. Click it to ask questions about the Compliance Scanner -- how to configure repositories, understand findings, set up webhooks, or use any feature. The assistant is grounded in the project documentation and uses LiteLLM for responses.
-
 ## Recent Scan Runs

 The bottom section lists the most recent scan runs across all repositories, showing: