Compare commits


12 Commits

Author SHA1 Message Date
Sharang Parnerkar 4424db5acb feat(dashboard): add light/dark theme with sidebar toggle
CI / Check (pull_request) Successful in 10m12s
CI / Detect Changes (pull_request) Has been skipped
CI / Deploy Agent (pull_request) Has been skipped
CI / Deploy Dashboard (pull_request) Has been skipped
CI / Deploy Docs (pull_request) Has been skipped
CI / Deploy MCP (pull_request) Has been skipped
Introduces a light theme alongside the existing dark Obsidian Control
look, plus a sun/moon toggle in the sidebar footer.

The dashboard's CSS already drove every surface through custom
properties on :root, so the light theme is added as a second token set
under `:root[data-theme="light"]` and, in parallel, inside a
`@media (prefers-color-scheme: light)` block guarded by
`:not([data-theme="dark"])`. Net effect:
- A user with no stored preference gets their OS theme via the media
  query (no flash, no JS required).
- A user who clicked the toggle gets `data-theme="light|dark"` set on
  `<html>`, which wins over the media query.

The toggle component (`theme_toggle.rs`) reads `localStorage` first
then `prefers-color-scheme` on mount, and writes both the DOM
attribute and `localStorage` on click. All `web_sys` calls are gated
behind `#[cfg(feature = "web")]` so the server build stays clean.
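The precedence the commit describes — an explicit toggle choice beats the OS preference — boils down to a small decision function. A minimal plain-Rust sketch of that logic (names hypothetical; the real component reads these inputs through `web_sys` from `localStorage` and `matchMedia`):

```rust
/// Resolve the effective theme. `stored` mirrors the localStorage value
/// ("light"/"dark" or absent); `os_prefers_light` mirrors the
/// `prefers-color-scheme: light` media query.
fn resolve_theme(stored: Option<&str>, os_prefers_light: bool) -> &'static str {
    match stored {
        // An explicit toggle choice always wins over the media query.
        Some("light") => "light",
        Some("dark") => "dark",
        // No stored preference (or an unrecognized value): follow the OS.
        _ => {
            if os_prefers_light {
                "light"
            } else {
                "dark"
            }
        }
    }
}

fn main() {
    // No stored preference: the OS choice applies.
    assert_eq!(resolve_theme(None, true), "light");
    // A stored "dark" beats an OS preference for light.
    assert_eq!(resolve_theme(Some("dark"), true), "dark");
    assert_eq!(resolve_theme(Some("light"), false), "light");
}
```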

Three CSS rules that hardcoded near-black hex values (the page dot
grid, `.code-block`, and the graph stabilization overlay) get explicit
light-mode overrides so they don't render as dark patches on white.

The `web-sys` feature list is extended with `Storage`, `MediaQueryList`, and
`Element` so the toggle can read the media query and set the attribute.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-13 13:26:37 +02:00
sharang 927fbc8ecb fix: live progress + concurrency for embedding builds (#80)
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 5s
CI / Deploy Agent (push) Successful in 7m59s
CI / Deploy Dashboard (push) Has been skipped
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Has been skipped
2026-05-13 10:01:05 +00:00
sharang e67a13535a fix: add HTTP timeout to reqwest client and CVE stage timeout (#79)
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 5s
CI / Deploy Agent (push) Successful in 8m26s
CI / Deploy Dashboard (push) Has been skipped
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Has been skipped
2026-05-13 07:30:26 +00:00
sharang df0063abc0 fix: scanner timeouts, semgrep memory cap, syft remote lookups, Script error (#78)
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 5s
CI / Deploy Agent (push) Successful in 9m41s
CI / Deploy Dashboard (push) Successful in 15m19s
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Successful in 3m7s
## Summary

- **Scan produces no results in Orca** — semgrep (`--config=auto`, unbounded memory) and syft (remote license network calls) were getting OOM-killed or hanging in resource-constrained Orca containers. Scan would "complete" with 0 findings/SBOMs silently because each scanner failure is caught and logged as a warning.
- **Dashboard Script error spam** — `document::Script` in Dioxus 0.7 needs a single text node child for inline scripts; `dangerous_inner_html` was invalid and spammed the error log on every unauthenticated page load.

## Changes

| File | Change |
|------|--------|
| `semgrep.rs` | Add `--max-memory 500 --jobs 1`; 10-minute timeout |
| `syft.rs` | Remove remote license lookup env vars; 5-minute timeout |
| `gitleaks.rs` | 5-minute timeout |
| `app_shell.rs` | Fix `dangerous_inner_html` → text child in `document::Script` |

## Test plan

- [ ] Trigger a scan on a repo in Orca — findings and SBOM entries should now appear
- [ ] Agent logs should show timeout/error warnings rather than silent empty results when tools are killed
- [ ] Navigate to dashboard unauthenticated — Script error gone from logs
- [ ] Verify scans work end-to-end with `docker compose up`
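The degrade-gracefully behavior described above — a scanner failure or timeout is logged as a warning and the pipeline continues with an empty result — can be sketched without tokio. The nested result of a `timeout(dur, scan())`-style call collapses in a three-arm match; here `Elapsed` is a stand-in unit type, not tokio's real error:

```rust
/// Stand-in for tokio::time::error::Elapsed in this dependency-free sketch.
struct Elapsed;

/// Collapse the nested result of a timeout-wrapped scan: the outer Err is a
/// timeout, the inner Err a scanner failure. Both degrade to an empty list
/// with a logged warning, so the pipeline continues instead of aborting.
fn collect_alerts(res: Result<Result<Vec<String>, String>, Elapsed>) -> Vec<String> {
    match res {
        Ok(Ok(alerts)) => alerts,
        Ok(Err(e)) => {
            eprintln!("CVE scanning failed: {e}");
            Vec::new()
        }
        Err(_) => {
            eprintln!("CVE scanning timed out");
            Vec::new()
        }
    }
}

fn main() {
    assert_eq!(collect_alerts(Ok(Ok(vec!["CVE-1".into()]))).len(), 1);
    assert!(collect_alerts(Ok(Err("oom".into()))).is_empty());
    assert!(collect_alerts(Err(Elapsed)).is_empty());
}
```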

---------

Co-authored-by: Sharang Parnerkar <30073382+mighty840@users.noreply.github.com>
Reviewed-on: #78
2026-05-12 11:27:24 +00:00
Sharang Parnerkar 5cafd13f44 ci: log orca webhook response so deploy steps aren't silent
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 5s
CI / Deploy Agent (push) Has been skipped
CI / Deploy Dashboard (push) Has been skipped
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Has been skipped
Nightly E2E Tests / E2E Tests (push) Failing after 2m59s
2026-04-08 15:09:27 +02:00
Sharang Parnerkar 69209649a5 ci: trigger first orca build for all services
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 4s
CI / Deploy Agent (push) Successful in 7m5s
CI / Deploy Docs (push) Successful in 30s
CI / Deploy MCP (push) Successful in 1m31s
CI / Deploy Dashboard (push) Failing after 21m28s
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 10:10:07 +02:00
Sharang Parnerkar d5439adc0d ci: trigger build of dashboard, docs, mcp images for orca
CI / Check (push) Has been cancelled
CI / Detect Changes (push) Has been cancelled
CI / Deploy Agent (push) Has been cancelled
CI / Deploy Dashboard (push) Has been cancelled
CI / Deploy Docs (push) Has been cancelled
CI / Deploy MCP (push) Has been cancelled
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 10:09:49 +02:00
Sharang Parnerkar bc7cdd35e4 ci: replace coolify webhook with orca deploy
CI / Check (push) Has been cancelled
CI / Detect Changes (push) Has been cancelled
CI / Deploy Agent (push) Has been cancelled
CI / Deploy Dashboard (push) Has been cancelled
CI / Deploy Docs (push) Has been cancelled
CI / Deploy MCP (push) Has been cancelled
Each deploy job now builds the per-service image, pushes to the
private registry as :latest and :sha, then triggers an HMAC-signed
orca redeploy webhook. Coolify webhooks are no longer used.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 10:06:11 +02:00
Sharang Parnerkar c062d834a1 fix: downgrade dotenv missing file from FAILED to info message
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 3s
CI / Deploy Agent (push) Successful in 2s
CI / Deploy Dashboard (push) Has been skipped
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Has been skipped
Nightly E2E Tests / E2E Tests (push) Failing after 2m16s
Non-fatal in Docker where env vars come from container config.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 15:33:24 +02:00
sharang 23cf37b6c3 fix: CVE notifications during scan + help chat doc loading + Dockerfile (#55)
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 3s
CI / Deploy Agent (push) Successful in 2s
CI / Deploy Dashboard (push) Successful in 2s
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Has been skipped
2026-03-30 13:10:56 +00:00
sharang 49d5cd4e0a feat: hourly CVE alerting with notification bell and API (#53)
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 3s
CI / Deploy Agent (push) Successful in 2s
CI / Deploy Dashboard (push) Successful in 2s
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Successful in 2s
2026-03-30 10:39:39 +00:00
sharang 4388e98b5b feat: add E2E test suite with nightly CI, fix dashboard Dockerfile (#52)
CI / Check (push) Has been skipped
CI / Detect Changes (push) Successful in 2s
CI / Deploy Agent (push) Successful in 2s
CI / Deploy Dashboard (push) Successful in 2s
CI / Deploy Docs (push) Has been skipped
CI / Deploy MCP (push) Has been skipped
2026-03-30 10:04:07 +00:00
39 changed files with 1503 additions and 171 deletions
+10
@@ -0,0 +1,10 @@
[advisories]
ignore = [
# hickory-proto 0.25.x pulled in transitively via mongodb → hickory-resolver.
# MongoDB 3.x has not yet released with hickory-resolver 0.26.x, so we cannot
# upgrade past this without a mongodb release. Both are DNS-layer DoS vectors
# requiring a MITM/controlled DNS server against MongoDB's hostname resolution —
# not a realistic attack surface here. Revisit when mongodb bumps hickory.
"RUSTSEC-2026-0118", # NSEC3 loop, no fix available upstream
"RUSTSEC-2026-0119", # O(n²) name compression, fixed in hickory-proto >=0.26.1
]
+48 -20
@@ -145,13 +145,20 @@ jobs:
     needs: [detect-changes]
     if: needs.detect-changes.outputs.agent == 'true'
     container:
-      image: alpine:latest
+      image: docker:27-cli
     steps:
-      - name: Trigger Coolify deploy
+      - name: Build, push and trigger orca redeploy
         run: |
-          apk add --no-cache curl
-          curl -sf "${{ secrets.COOLIFY_WEBHOOK_AGENT }}" \
-            -H "Authorization: Bearer ${{ secrets.COOLIFY_TOKEN }}"
+          apk add --no-cache git curl openssl
+          git init && git remote add origin "${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git"
+          git fetch --depth=1 origin "${GITHUB_SHA}" && git checkout FETCH_HEAD
+          IMAGE=registry.meghsakha.com/compliance-agent
+          echo "${{ secrets.REGISTRY_PASSWORD }}" | docker login registry.meghsakha.com -u "${{ secrets.REGISTRY_USERNAME }}" --password-stdin
+          docker build -f Dockerfile.agent -t "$IMAGE:latest" -t "$IMAGE:${GITHUB_SHA}" .
+          docker push "$IMAGE:latest" && docker push "$IMAGE:${GITHUB_SHA}"
+          PAYLOAD=$(printf '{"ref":"refs/heads/main","repository":{"full_name":"sharang/compliance-scanner-agent"},"head_commit":{"id":"%s","message":"deploy agent"}}' "${GITHUB_SHA}")
+          SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac "${{ secrets.ORCA_WEBHOOK_SECRET }}" | awk '{print $2}')
+          RESP=$(curl -fsS -w "\nHTTP %{http_code}" -X POST "http://46.225.100.82:6880/api/v1/webhooks/github" -H "Content-Type: application/json" -H "X-Hub-Signature-256: sha256=$SIG" -d "$PAYLOAD"); echo "$RESP"

   deploy-dashboard:
     name: Deploy Dashboard
@@ -159,13 +166,20 @@ jobs:
     needs: [detect-changes]
     if: needs.detect-changes.outputs.dashboard == 'true'
     container:
-      image: alpine:latest
+      image: docker:27-cli
     steps:
-      - name: Trigger Coolify deploy
+      - name: Build, push and trigger orca redeploy
         run: |
-          apk add --no-cache curl
-          curl -sf "${{ secrets.COOLIFY_WEBHOOK_DASHBOARD }}" \
-            -H "Authorization: Bearer ${{ secrets.COOLIFY_TOKEN }}"
+          apk add --no-cache git curl openssl
+          git init && git remote add origin "${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git"
+          git fetch --depth=1 origin "${GITHUB_SHA}" && git checkout FETCH_HEAD
+          IMAGE=registry.meghsakha.com/compliance-dashboard
+          echo "${{ secrets.REGISTRY_PASSWORD }}" | docker login registry.meghsakha.com -u "${{ secrets.REGISTRY_USERNAME }}" --password-stdin
+          docker build -f Dockerfile.dashboard -t "$IMAGE:latest" -t "$IMAGE:${GITHUB_SHA}" .
+          docker push "$IMAGE:latest" && docker push "$IMAGE:${GITHUB_SHA}"
+          PAYLOAD=$(printf '{"ref":"refs/heads/main","repository":{"full_name":"sharang/compliance-scanner-agent"},"head_commit":{"id":"%s","message":"deploy dashboard"}}' "${GITHUB_SHA}")
+          SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac "${{ secrets.ORCA_WEBHOOK_SECRET }}" | awk '{print $2}')
+          RESP=$(curl -fsS -w "\nHTTP %{http_code}" -X POST "http://46.225.100.82:6880/api/v1/webhooks/github" -H "Content-Type: application/json" -H "X-Hub-Signature-256: sha256=$SIG" -d "$PAYLOAD"); echo "$RESP"

   deploy-docs:
     name: Deploy Docs
@@ -173,13 +187,20 @@ jobs:
     needs: [detect-changes]
     if: needs.detect-changes.outputs.docs == 'true'
     container:
-      image: alpine:latest
+      image: docker:27-cli
     steps:
-      - name: Trigger Coolify deploy
+      - name: Build, push and trigger orca redeploy
         run: |
-          apk add --no-cache curl
-          curl -sf "${{ secrets.COOLIFY_WEBHOOK_DOCS }}" \
-            -H "Authorization: Bearer ${{ secrets.COOLIFY_TOKEN }}"
+          apk add --no-cache git curl openssl
+          git init && git remote add origin "${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git"
+          git fetch --depth=1 origin "${GITHUB_SHA}" && git checkout FETCH_HEAD
+          IMAGE=registry.meghsakha.com/compliance-docs
+          echo "${{ secrets.REGISTRY_PASSWORD }}" | docker login registry.meghsakha.com -u "${{ secrets.REGISTRY_USERNAME }}" --password-stdin
+          docker build -f Dockerfile.docs -t "$IMAGE:latest" -t "$IMAGE:${GITHUB_SHA}" .
+          docker push "$IMAGE:latest" && docker push "$IMAGE:${GITHUB_SHA}"
+          PAYLOAD=$(printf '{"ref":"refs/heads/main","repository":{"full_name":"sharang/compliance-scanner-agent"},"head_commit":{"id":"%s","message":"deploy docs"}}' "${GITHUB_SHA}")
+          SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac "${{ secrets.ORCA_WEBHOOK_SECRET }}" | awk '{print $2}')
+          RESP=$(curl -fsS -w "\nHTTP %{http_code}" -X POST "http://46.225.100.82:6880/api/v1/webhooks/github" -H "Content-Type: application/json" -H "X-Hub-Signature-256: sha256=$SIG" -d "$PAYLOAD"); echo "$RESP"

   deploy-mcp:
     name: Deploy MCP
@@ -187,10 +208,17 @@ jobs:
     needs: [detect-changes]
     if: needs.detect-changes.outputs.mcp == 'true'
     container:
-      image: alpine:latest
+      image: docker:27-cli
     steps:
-      - name: Trigger Coolify deploy
+      - name: Build, push and trigger orca redeploy
         run: |
-          apk add --no-cache curl
-          curl -sf "${{ secrets.COOLIFY_WEBHOOK_MCP }}" \
-            -H "Authorization: Bearer ${{ secrets.COOLIFY_TOKEN }}"
+          apk add --no-cache git curl openssl
+          git init && git remote add origin "${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git"
+          git fetch --depth=1 origin "${GITHUB_SHA}" && git checkout FETCH_HEAD
+          IMAGE=registry.meghsakha.com/compliance-mcp
+          echo "${{ secrets.REGISTRY_PASSWORD }}" | docker login registry.meghsakha.com -u "${{ secrets.REGISTRY_USERNAME }}" --password-stdin
+          docker build -f Dockerfile.mcp -t "$IMAGE:latest" -t "$IMAGE:${GITHUB_SHA}" .
+          docker push "$IMAGE:latest" && docker push "$IMAGE:${GITHUB_SHA}"
+          PAYLOAD=$(printf '{"ref":"refs/heads/main","repository":{"full_name":"sharang/compliance-scanner-agent"},"head_commit":{"id":"%s","message":"deploy mcp"}}' "${GITHUB_SHA}")
+          SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac "${{ secrets.ORCA_WEBHOOK_SECRET }}" | awk '{print $2}')
+          RESP=$(curl -fsS -w "\nHTTP %{http_code}" -X POST "http://46.225.100.82:6880/api/v1/webhooks/github" -H "Content-Type: application/json" -H "X-Hub-Signature-256: sha256=$SIG" -d "$PAYLOAD"); echo "$RESP"
Generated
+6 -6
@@ -3524,9 +3524,9 @@ checksum = "224484c5d09285a7b8cb0a0c117e847ebd14cb6e4470ecf68cdb89c503b0edb9"
 [[package]]
 name = "mongodb"
-version = "3.5.1"
+version = "3.6.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "803dd859e8afa084c255a8effd8000ff86f7c8076a50cd6d8c99e8f3496f75c2"
+checksum = "1ef2c933617431ad0246fb5b43c425ebdae18c7f7259c87de0726d93b0e7e91b"
 dependencies = [
  "base64",
  "bitflags",

@@ -3570,9 +3570,9 @@ dependencies = [
 [[package]]
 name = "mongodb-internal-macros"
-version = "3.5.1"
+version = "3.6.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a973ef3dd3dbc6f6e65bbdecfd9ec5e781b9e7493b0f369a7c62e35d8e5ae2c8"
+checksum = "9e5758dc828eb2d02ec30563cba365609d56ddd833190b192beaee2b475a7bb3"
 dependencies = [
  "macro_magic",
  "proc-macro2",

@@ -4699,9 +4699,9 @@ dependencies = [
 [[package]]
 name = "rustls-webpki"
-version = "0.103.10"
+version = "0.103.13"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "df33b2b81ac578cabaf06b89b0631153a3f416b0a886e8a7a1707fb51abbd1ef"
+checksum = "61c429a8649f110dddef65e2a5ad240f747e85f7758a6bccc7e5777bd33f756e"
 dependencies = [
  "ring",
  "rustls-pki-types",
+6
@@ -33,9 +33,15 @@ RUN pip3 install --break-system-packages ruff
 COPY --from=builder /app/target/release/compliance-agent /usr/local/bin/compliance-agent
+
+# Copy documentation for the help chat assistant
+COPY --from=builder /app/README.md /app/README.md
+COPY --from=builder /app/docs /app/docs
+ENV HELP_DOCS_PATH=/app
+
 # Ensure SSH key directory exists
 RUN mkdir -p /data/compliance-scanner/ssh
 EXPOSE 3001 3002
 ENTRYPOINT ["compliance-agent"]
+2 -1
@@ -1,6 +1,6 @@
 FROM rust:1.94-bookworm AS builder
-RUN cargo install dioxus-cli --version 0.7.4
+RUN cargo install dioxus-cli --version 0.7.3 --locked
 ARG DOCS_URL=/docs

@@ -20,3 +20,4 @@ ENV IP=0.0.0.0
 EXPOSE 8080
 ENTRYPOINT ["./compliance-dashboard"]
+1
@@ -12,3 +12,4 @@ RUN rm /etc/nginx/conf.d/default.conf
 COPY docs/nginx.conf /etc/nginx/conf.d/default.conf
 COPY --from=builder /app/.vitepress/dist /usr/share/nginx/html
 EXPOSE 80
+1
@@ -14,3 +14,4 @@ EXPOSE 8090
 ENV MCP_PORT=8090
 ENTRYPOINT ["compliance-mcp"]
+1 -1
@@ -25,7 +25,7 @@ uuid = { workspace = true }
 secrecy = { workspace = true }
 regex = { workspace = true }
 axum = "0.8"
-tower-http = { version = "0.6", features = ["cors", "trace"] }
+tower-http = { version = "0.6", features = ["cors", "trace", "set-header"] }
 git2 = "0.20"
 octocrab = "0.44"
 tokio-cron-scheduler = "0.13"
+6 -1
@@ -35,11 +35,16 @@ impl ComplianceAgent {
             config.litellm_model.clone(),
             config.litellm_embed_model.clone(),
         ));
+        let http = reqwest::Client::builder()
+            .timeout(std::time::Duration::from_secs(30))
+            .connect_timeout(std::time::Duration::from_secs(10))
+            .build()
+            .unwrap_or_default();
         Self {
             config,
             db,
             llm,
-            http: reqwest::Client::new(),
+            http,
             session_streams: Arc::new(DashMap::new()),
             session_pause: Arc::new(DashMap::new()),
             session_semaphore: Arc::new(Semaphore::new(DEFAULT_MAX_CONCURRENT_SESSIONS)),
+43 -13
@@ -104,28 +104,58 @@ fn load_docs(root: &Path) -> String {
 /// Returns a reference to the cached doc context string, initialised on
 /// first call via `OnceLock`.
+///
+/// Discovery order:
+/// 1. `HELP_DOCS_PATH` env var (explicit override)
+/// 2. Walk up from the binary location
+/// 3. Current working directory
+/// 4. Common Docker paths (/app, /opt/compliance-scanner)
 fn doc_context() -> &'static str {
     DOC_CONTEXT.get_or_init(|| {
+        // 1. Explicit env var
+        if let Ok(path) = std::env::var("HELP_DOCS_PATH") {
+            let p = PathBuf::from(&path);
+            if p.join("README.md").is_file() || p.join("docs").is_dir() {
+                tracing::info!("help_chat: loading docs from HELP_DOCS_PATH={path}");
+                return load_docs(&p);
+            }
+            tracing::warn!("help_chat: HELP_DOCS_PATH={path} has no README.md or docs/");
+        }
+        // 2. Walk up from binary location
         let start = std::env::current_exe()
             .ok()
             .and_then(|p| p.parent().map(Path::to_path_buf))
             .unwrap_or_else(|| PathBuf::from("."));
-        match find_project_root(&start) {
-            Some(root) => load_docs(&root),
-            None => {
-                // Fallback: try current working directory
-                let cwd = std::env::current_dir().unwrap_or_else(|_| PathBuf::from("."));
-                if cwd.join("README.md").is_file() {
-                    return load_docs(&cwd);
-                }
-                tracing::error!(
-                    "help_chat: could not locate project root from {}; doc context will be empty",
-                    start.display()
-                );
-                String::new()
-            }
-        }
+        if let Some(root) = find_project_root(&start) {
+            return load_docs(&root);
+        }
+        // 3. Current working directory
+        if let Ok(cwd) = std::env::current_dir() {
+            if let Some(root) = find_project_root(&cwd) {
+                return load_docs(&root);
+            }
+            if cwd.join("README.md").is_file() {
+                return load_docs(&cwd);
+            }
+        }
+        // 4. Common Docker/deployment paths
+        for candidate in ["/app", "/opt/compliance-scanner", "/srv/compliance-scanner"] {
+            let p = PathBuf::from(candidate);
+            if p.join("README.md").is_file() || p.join("docs").is_dir() {
+                tracing::info!("help_chat: found docs at {candidate}");
+                return load_docs(&p);
+            }
+        }
+        tracing::error!(
+            "help_chat: could not locate project root; doc context will be empty. \
+             Set HELP_DOCS_PATH to the directory containing README.md and docs/"
+        );
+        String::new()
     })
 }
+1
@@ -6,6 +6,7 @@ pub mod graph;
 pub mod health;
 pub mod help_chat;
 pub mod issues;
+pub mod notifications;
 pub mod pentest_handlers;
 pub use pentest_handlers as pentest;
 pub mod repos;
@@ -0,0 +1,178 @@
use axum::extract::Extension;
use axum::http::StatusCode;
use axum::Json;
use mongodb::bson::doc;
use serde::Deserialize;

use compliance_core::models::notification::CveNotification;

use super::dto::{AgentExt, ApiResponse};

/// GET /api/v1/notifications — List CVE notifications (newest first)
#[tracing::instrument(skip_all)]
pub async fn list_notifications(
    Extension(agent): AgentExt,
    axum::extract::Query(params): axum::extract::Query<NotificationFilter>,
) -> Result<Json<ApiResponse<Vec<CveNotification>>>, StatusCode> {
    let mut filter = doc! {};

    // Filter by status (default: show new + read, exclude dismissed)
    match params.status.as_deref() {
        Some("all") => {}
        Some(s) => {
            filter.insert("status", s);
        }
        None => {
            filter.insert("status", doc! { "$in": ["new", "read"] });
        }
    }

    // Filter by severity
    if let Some(ref sev) = params.severity {
        filter.insert("severity", sev.as_str());
    }

    // Filter by repo
    if let Some(ref repo_id) = params.repo_id {
        filter.insert("repo_id", repo_id.as_str());
    }

    let page = params.page.unwrap_or(1).max(1);
    let limit = params.limit.unwrap_or(50).min(200);
    let skip = (page - 1) * limit as u64;

    let total = agent
        .db
        .cve_notifications()
        .count_documents(filter.clone())
        .await
        .unwrap_or(0);

    let notifications: Vec<CveNotification> = match agent
        .db
        .cve_notifications()
        .find(filter)
        .sort(doc! { "created_at": -1 })
        .skip(skip)
        .limit(limit)
        .await
    {
        Ok(cursor) => {
            use futures_util::StreamExt;
            let mut items = Vec::new();
            let mut cursor = cursor;
            while let Some(Ok(n)) = cursor.next().await {
                items.push(n);
            }
            items
        }
        Err(e) => {
            tracing::error!("Failed to list notifications: {e}");
            return Err(StatusCode::INTERNAL_SERVER_ERROR);
        }
    };

    Ok(Json(ApiResponse {
        data: notifications,
        total: Some(total),
        page: Some(page),
    }))
}

/// GET /api/v1/notifications/count — Count of unread notifications
#[tracing::instrument(skip_all)]
pub async fn notification_count(
    Extension(agent): AgentExt,
) -> Result<Json<serde_json::Value>, StatusCode> {
    let count = agent
        .db
        .cve_notifications()
        .count_documents(doc! { "status": "new" })
        .await
        .unwrap_or(0);
    Ok(Json(serde_json::json!({ "count": count })))
}

/// PATCH /api/v1/notifications/:id/read — Mark a notification as read
#[tracing::instrument(skip_all, fields(id = %id))]
pub async fn mark_read(
    Extension(agent): AgentExt,
    axum::extract::Path(id): axum::extract::Path<String>,
) -> Result<Json<serde_json::Value>, StatusCode> {
    let oid = mongodb::bson::oid::ObjectId::parse_str(&id).map_err(|_| StatusCode::BAD_REQUEST)?;
    let result = agent
        .db
        .cve_notifications()
        .update_one(
            doc! { "_id": oid },
            doc! { "$set": {
                "status": "read",
                "read_at": mongodb::bson::DateTime::now(),
            }},
        )
        .await
        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
    if result.matched_count == 0 {
        return Err(StatusCode::NOT_FOUND);
    }
    Ok(Json(serde_json::json!({ "status": "read" })))
}

/// PATCH /api/v1/notifications/:id/dismiss — Dismiss a notification
#[tracing::instrument(skip_all, fields(id = %id))]
pub async fn dismiss_notification(
    Extension(agent): AgentExt,
    axum::extract::Path(id): axum::extract::Path<String>,
) -> Result<Json<serde_json::Value>, StatusCode> {
    let oid = mongodb::bson::oid::ObjectId::parse_str(&id).map_err(|_| StatusCode::BAD_REQUEST)?;
    let result = agent
        .db
        .cve_notifications()
        .update_one(
            doc! { "_id": oid },
            doc! { "$set": { "status": "dismissed" } },
        )
        .await
        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
    if result.matched_count == 0 {
        return Err(StatusCode::NOT_FOUND);
    }
    Ok(Json(serde_json::json!({ "status": "dismissed" })))
}

/// POST /api/v1/notifications/read-all — Mark all new notifications as read
#[tracing::instrument(skip_all)]
pub async fn mark_all_read(
    Extension(agent): AgentExt,
) -> Result<Json<serde_json::Value>, StatusCode> {
    let result = agent
        .db
        .cve_notifications()
        .update_many(
            doc! { "status": "new" },
            doc! { "$set": {
                "status": "read",
                "read_at": mongodb::bson::DateTime::now(),
            }},
        )
        .await
        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?;
    Ok(Json(
        serde_json::json!({ "updated": result.modified_count }),
    ))
}

#[derive(Debug, Deserialize)]
pub struct NotificationFilter {
    pub status: Option<String>,
    pub severity: Option<String>,
    pub repo_id: Option<String>,
    pub page: Option<u64>,
    pub limit: Option<i64>,
}
+21
@@ -101,6 +101,27 @@ pub fn build_router() -> Router {
         )
         // Help chat (documentation-grounded Q&A)
         .route("/api/v1/help/chat", post(handlers::help_chat::help_chat))
+        // CVE notification endpoints
+        .route(
+            "/api/v1/notifications",
+            get(handlers::notifications::list_notifications),
+        )
+        .route(
+            "/api/v1/notifications/count",
+            get(handlers::notifications::notification_count),
+        )
+        .route(
+            "/api/v1/notifications/read-all",
+            post(handlers::notifications::mark_all_read),
+        )
+        .route(
+            "/api/v1/notifications/{id}/read",
+            patch(handlers::notifications::mark_read),
+        )
+        .route(
+            "/api/v1/notifications/{id}/dismiss",
+            patch(handlers::notifications::dismiss_notification),
+        )
         // Pentest API endpoints
         .route(
             "/api/v1/pentest/lookup-repo",
+20 -1
@@ -1,8 +1,10 @@
 use std::sync::Arc;

+use axum::http::HeaderValue;
 use axum::{middleware, Extension};
 use tokio::sync::RwLock;
 use tower_http::cors::CorsLayer;
+use tower_http::set_header::SetResponseHeaderLayer;
 use tower_http::trace::TraceLayer;

 use crate::agent::ComplianceAgent;

@@ -14,7 +16,24 @@ pub async fn start_api_server(agent: ComplianceAgent, port: u16) -> Result<(), A
     let mut app = routes::build_router()
         .layer(Extension(Arc::new(agent.clone())))
         .layer(CorsLayer::permissive())
-        .layer(TraceLayer::new_for_http());
+        .layer(TraceLayer::new_for_http())
+        // Security headers (defense-in-depth, primary enforcement via Traefik)
+        .layer(SetResponseHeaderLayer::overriding(
+            axum::http::header::STRICT_TRANSPORT_SECURITY,
+            HeaderValue::from_static("max-age=31536000; includeSubDomains"),
+        ))
+        .layer(SetResponseHeaderLayer::overriding(
+            axum::http::header::X_FRAME_OPTIONS,
+            HeaderValue::from_static("DENY"),
+        ))
+        .layer(SetResponseHeaderLayer::overriding(
+            axum::http::header::X_CONTENT_TYPE_OPTIONS,
+            HeaderValue::from_static("nosniff"),
+        ))
+        .layer(SetResponseHeaderLayer::overriding(
+            axum::http::header::REFERRER_POLICY,
+            HeaderValue::from_static("strict-origin-when-cross-origin"),
+        ));
     if let (Some(kc_url), Some(kc_realm)) =
         (&agent.config.keycloak_url, &agent.config.keycloak_realm)
+1 -1
@@ -42,7 +42,7 @@ pub fn load_config() -> Result<AgentConfig, AgentError> {
             .unwrap_or(3001),
         scan_schedule: env_var_opt("SCAN_SCHEDULE").unwrap_or_else(|| "0 0 */6 * * *".to_string()),
         cve_monitor_schedule: env_var_opt("CVE_MONITOR_SCHEDULE")
-            .unwrap_or_else(|| "0 0 0 * * *".to_string()),
+            .unwrap_or_else(|| "0 0 * * * *".to_string()),
         git_clone_base_path: env_var_opt("GIT_CLONE_BASE_PATH")
             .unwrap_or_else(|| "/tmp/compliance-scanner/repos".to_string()),
         ssh_key_path: env_var_opt("SSH_KEY_PATH")
+25
@@ -78,6 +78,25 @@ impl Database {
             )
             .await?;

+        // cve_notifications: unique cve_id + repo_id + package, status filter
+        self.cve_notifications()
+            .create_index(
+                IndexModel::builder()
+                    .keys(
+                        doc! { "cve_id": 1, "repo_id": 1, "package_name": 1, "package_version": 1 },
+                    )
+                    .options(IndexOptions::builder().unique(true).build())
+                    .build(),
+            )
+            .await?;
+        self.cve_notifications()
+            .create_index(
+                IndexModel::builder()
+                    .keys(doc! { "status": 1, "created_at": -1 })
+                    .build(),
+            )
+            .await?;
+
         // tracker_issues: unique finding_id
         self.tracker_issues()
             .create_index(

@@ -222,6 +241,12 @@ impl Database {
         self.inner.collection("cve_alerts")
     }

+    pub fn cve_notifications(
+        &self,
+    ) -> Collection<compliance_core::models::notification::CveNotification> {
+        self.inner.collection("cve_notifications")
+    }
+
     pub fn tracker_issues(&self) -> Collection<TrackerIssue> {
         self.inner.collection("tracker_issues")
     }
+6 -1
@@ -19,12 +19,17 @@ impl LlmClient {
         model: String,
         embed_model: String,
     ) -> Self {
+        let http = reqwest::Client::builder()
+            .timeout(std::time::Duration::from_secs(300))
+            .connect_timeout(std::time::Duration::from_secs(10))
+            .build()
+            .unwrap_or_default();
         Self {
             base_url,
             api_key,
             model,
             embed_model,
-            http: reqwest::Client::new(),
+            http,
         }
     }
+1 -1
@@ -4,7 +4,7 @@ use compliance_agent::{agent, api, config, database, scheduler, ssh, webhooks};
 async fn main() -> Result<(), Box<dyn std::error::Error>> {
     match dotenvy::dotenv() {
         Ok(path) => eprintln!("[dotenv] Loaded from: {}", path.display()),
-        Err(e) => eprintln!("[dotenv] FAILED: {e}"),
+        Err(_) => eprintln!("[dotenv] No .env file found, using environment variables"),
     }
     let _telemetry_guard = compliance_core::telemetry::init_telemetry("compliance-agent");
+27 -20
@@ -19,26 +19,33 @@ impl Scanner for GitleaksScanner {
#[tracing::instrument(skip_all)]
async fn scan(&self, repo_path: &Path, repo_id: &str) -> Result<ScanOutput, CoreError> {
    let output = tokio::time::timeout(
        std::time::Duration::from_secs(300),
        tokio::process::Command::new("gitleaks")
            .args([
                "detect",
                "--source",
                ".",
                "--report-format",
                "json",
                "--report-path",
                "/dev/stdout",
                "--no-banner",
                "--exit-code",
                "0",
            ])
            .current_dir(repo_path)
            .output(),
    )
    .await
    .map_err(|_| CoreError::Scanner {
        scanner: "gitleaks".to_string(),
        source: "timed out after 5 minutes".into(),
    })?
    .map_err(|e| CoreError::Scanner {
        scanner: "gitleaks".to_string(),
        source: Box::new(e),
    })?;

    if output.stdout.is_empty() {
        return Ok(ScanOutput::default());
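Wrapping the scanner command in `tokio::time::timeout` yields a nested `Result` (elapsed-timeout error on the outside, spawn/IO error on the inside), which is why two `map_err` calls chain after `.await`. A minimal sketch of that flattening pattern, using stand-in error types rather than the crate's `CoreError`:

```rust
// Stand-in error type; the real code uses CoreError::Scanner.
#[derive(Debug, PartialEq)]
enum ScanError {
    TimedOut,
    Spawn(String),
}

// Flatten `Result<Result<T, E>, Elapsed>` into one error type,
// mirroring the two chained map_err calls in the scanner.
fn flatten_timeout<T>(outer: Result<Result<T, String>, ()>) -> Result<T, ScanError> {
    outer
        .map_err(|_| ScanError::TimedOut)? // outer Err: the timeout elapsed
        .map_err(ScanError::Spawn) // inner Err: the command itself failed
}

fn main() {
    assert_eq!(flatten_timeout::<u8>(Err(())), Err(ScanError::TimedOut));
    assert_eq!(flatten_timeout(Ok(Ok(7u8))), Ok(7));
    assert_eq!(
        flatten_timeout::<u8>(Ok(Err("gitleaks not found".into()))),
        Err(ScanError::Spawn("gitleaks not found".into()))
    );
}
```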
+76 -22
@@ -174,19 +174,26 @@ impl PipelineOrchestrator {
            k.expose_secret().to_string()
        }),
    );
    let cve_alerts = match tokio::time::timeout(
        std::time::Duration::from_secs(600),
        async {
            cve_scanner
                .scan_dependencies(&repo_id, &mut sbom_entries)
                .await
        }
        .instrument(tracing::info_span!("stage_cve_scanning")),
    )
    .await
    {
        Ok(Ok(alerts)) => alerts,
        Ok(Err(e)) => {
            tracing::warn!("[{repo_id}] CVE scanning failed: {e}");
            Vec::new()
        }
        Err(_) => {
            tracing::warn!("[{repo_id}] CVE scanning timed out after 10 minutes");
            Vec::new()
        }
    };

    // Stage 4: Pattern Scanning (GDPR + OAuth)
@@ -315,20 +322,67 @@ impl PipelineOrchestrator {
            .await?;
    }

    // Persist CVE alerts and create notifications
    {
        use compliance_core::models::notification::{parse_severity, CveNotification};

        let repo_name = repo.name.clone();
        let mut new_notif_count = 0u32;

        for alert in &cve_alerts {
            // Upsert the alert
            let filter = doc! {
                "cve_id": &alert.cve_id,
                "repo_id": &alert.repo_id,
            };
            let update = mongodb::bson::to_document(alert)
                .map(|d| doc! { "$set": d })
                .unwrap_or_else(|_| doc! {});
            self.db
                .cve_alerts()
                .update_one(filter, update)
                .upsert(true)
                .await?;

            // Create notification (dedup by cve_id + repo + package + version)
            let notif_filter = doc! {
                "cve_id": &alert.cve_id,
                "repo_id": &alert.repo_id,
                "package_name": &alert.affected_package,
                "package_version": &alert.affected_version,
            };
            let severity = parse_severity(alert.severity.as_deref(), alert.cvss_score);
            let mut notification = CveNotification::new(
                alert.cve_id.clone(),
                repo_id.clone(),
                repo_name.clone(),
                alert.affected_package.clone(),
                alert.affected_version.clone(),
                severity,
            );
            notification.cvss_score = alert.cvss_score;
            notification.summary = alert.summary.clone();
            notification.url = Some(format!("https://osv.dev/vulnerability/{}", alert.cve_id));

            let notif_update = doc! {
                "$setOnInsert": mongodb::bson::to_bson(&notification).unwrap_or_default()
            };
            if let Ok(result) = self
                .db
                .cve_notifications()
                .update_one(notif_filter, notif_update)
                .upsert(true)
                .await
            {
                if result.upserted_id.is_some() {
                    new_notif_count += 1;
                }
            }
        }

        if new_notif_count > 0 {
            tracing::info!("[{repo_id}] Created {new_notif_count} CVE notification(s)");
        }
    }
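The `$setOnInsert` upsert makes notification creation idempotent per `(cve_id, repo_id, package, version)` key: a re-run matches the existing document and inserts nothing, so only genuinely new CVEs are counted. A rough in-memory model of that dedup behavior (key shape assumed from the filter above, not the actual MongoDB call):

```rust
use std::collections::HashMap;

// (cve_id, repo_id, package_name, package_version)
type NotifKey = (String, String, String, String);

// Returns true only when the key was newly inserted, mirroring
// `result.upserted_id.is_some()` in the pipeline.
fn upsert_notification(
    store: &mut HashMap<NotifKey, String>,
    key: NotifKey,
    summary: String,
) -> bool {
    if store.contains_key(&key) {
        return false; // $setOnInsert leaves existing docs untouched
    }
    store.insert(key, summary);
    true
}

fn main() {
    let mut store = HashMap::new();
    let key = (
        "CVE-2024-0001".to_string(),
        "repo1".to_string(),
        "openssl".to_string(),
        "1.0".to_string(),
    );
    assert!(upsert_notification(&mut store, key.clone(), "first".into()));
    // Second run with the same key: already exists, nothing new counted.
    assert!(!upsert_notification(&mut store, key, "second".into()));
    assert_eq!(store.len(), 1);
}
```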
// Stage 6: Issue Creation
+20 -14
@@ -5,20 +5,26 @@ use compliance_core::CoreError;
#[tracing::instrument(skip_all, fields(repo_id = %repo_id))]
pub(super) async fn run_syft(repo_path: &Path, repo_id: &str) -> Result<Vec<SbomEntry>, CoreError> {
    let output = tokio::time::timeout(
        std::time::Duration::from_secs(300),
        tokio::process::Command::new("syft")
            .arg(repo_path)
            .args(["-o", "cyclonedx-json"])
            // Enable remote license lookups for all ecosystems
            .env("SYFT_GOLANG_SEARCH_REMOTE_LICENSES", "true")
            .env("SYFT_JAVASCRIPT_SEARCH_REMOTE_LICENSES", "true")
            .env("SYFT_PYTHON_SEARCH_REMOTE_LICENSES", "true")
            .env("SYFT_JAVA_USE_NETWORK", "true")
            .output(),
    )
    .await
    .map_err(|_| CoreError::Scanner {
        scanner: "syft".to_string(),
        source: "timed out after 5 minutes".into(),
    })?
    .map_err(|e| CoreError::Scanner {
        scanner: "syft".to_string(),
        source: Box::new(e),
    })?;

    if !output.status.success() {
        let stderr = String::from_utf8_lossy(&output.stderr);
+24 -9
@@ -19,15 +19,30 @@ impl Scanner for SemgrepScanner {
#[tracing::instrument(skip_all)]
async fn scan(&self, repo_path: &Path, repo_id: &str) -> Result<ScanOutput, CoreError> {
    let output = tokio::time::timeout(
        std::time::Duration::from_secs(600),
        tokio::process::Command::new("semgrep")
            .args([
                "--config=auto",
                "--json",
                "--quiet",
                "--max-memory",
                "500",
                "--jobs",
                "1",
            ])
            .arg(repo_path)
            .output(),
    )
    .await
    .map_err(|_| CoreError::Scanner {
        scanner: "semgrep".to_string(),
        source: "timed out after 10 minutes".into(),
    })?
    .map_err(|e| CoreError::Scanner {
        scanner: "semgrep".to_string(),
        source: Box::new(e),
    })?;

    if !output.status.success() && output.stdout.is_empty() {
        let stderr = String::from_utf8_lossy(&output.stderr);
+87 -19
@@ -6,11 +6,16 @@ use compliance_core::models::embedding::{CodeEmbedding, EmbeddingBuildRun, Embed
use compliance_core::models::graph::CodeNode;
use compliance_graph::graph::chunking::extract_chunks;
use compliance_graph::graph::embedding_store::EmbeddingStore;
use futures_util::stream::{FuturesUnordered, StreamExt};
use tracing::{error, info};

use crate::error::AgentError;
use crate::llm::LlmClient;

const EMBED_BATCH_SIZE: usize = 20;
const EMBED_CONCURRENCY: usize = 4;
const EMBED_FLUSH_EVERY: usize = 200;

/// RAG pipeline for building embeddings and performing retrieval
pub struct RagPipeline {
    llm: Arc<LlmClient>,
@@ -77,25 +82,33 @@ impl RagPipeline {
        .await
        .map_err(|e| AgentError::Other(format!("Failed to delete old embeddings: {e}")))?;

    // Step 3: Batch embed with bounded concurrency. Flush to Mongo and
    // update progress periodically so the dashboard can show live status.
    let mut pending = Vec::with_capacity(EMBED_FLUSH_EVERY);
    let mut embedded_count = 0u32;

    // Build the list of batch indices to process.
    let batches: Vec<(usize, usize)> = (0..chunks.len())
        .step_by(EMBED_BATCH_SIZE)
        .map(|start| (start, (start + EMBED_BATCH_SIZE).min(chunks.len())))
        .collect();

    let mut batch_iter = batches.into_iter();
    let mut in_flight = FuturesUnordered::new();

    // Prime up to EMBED_CONCURRENCY batches.
    for _ in 0..EMBED_CONCURRENCY {
        if let Some((start, end)) = batch_iter.next() {
            in_flight.push(self.embed_batch(&chunks[start..end], start, end));
        }
    }

    while let Some(result) = in_flight.next().await {
        match result {
            Ok((start, end, vectors)) => {
                let batch_chunks = &chunks[start..end];
                for (chunk, embedding) in batch_chunks.iter().zip(vectors) {
                    pending.push(CodeEmbedding {
                        id: None,
                        repo_id: repo_id.to_string(),
                        graph_build_id: graph_build_id.to_string(),
@@ -113,9 +126,45 @@ impl RagPipeline {
                    });
                }
                embedded_count += batch_chunks.len() as u32;

                // Flush pending embeddings to Mongo periodically and update progress.
                if pending.len() >= EMBED_FLUSH_EVERY {
                    self.embedding_store
                        .store_embeddings(&pending)
                        .await
                        .map_err(|e| {
                            AgentError::Other(format!("Failed to store embeddings: {e}"))
                        })?;
                    pending.clear();
                }

                // Always update the progress counter on the build doc — even if
                // we haven't flushed embeddings yet — so the UI shows movement.
                if let Err(e) = self
                    .embedding_store
                    .update_build(
                        repo_id,
                        graph_build_id,
                        EmbeddingBuildStatus::Running,
                        embedded_count,
                        None,
                    )
                    .await
                {
                    error!("[{repo_id}] Failed to update build progress: {e}");
                }

                // Queue the next batch to keep concurrency saturated.
                if let Some((s, e)) = batch_iter.next() {
                    in_flight.push(self.embed_batch(&chunks[s..e], s, e));
                }
            }
            Err(e) => {
                error!("[{repo_id}] Embedding batch failed: {e}");
                // Flush whatever we have so partial progress isn't lost.
                if !pending.is_empty() {
                    let _ = self.embedding_store.store_embeddings(&pending).await;
                }
                build.status = EmbeddingBuildStatus::Failed;
                build.error_message = Some(e.to_string());
                build.completed_at = Some(Utc::now());
@@ -134,11 +183,13 @@ impl RagPipeline {
        }
    }

    // Step 4: Flush any remaining embeddings
    if !pending.is_empty() {
        self.embedding_store
            .store_embeddings(&pending)
            .await
            .map_err(|e| AgentError::Other(format!("Failed to store embeddings: {e}")))?;
    }

    // Step 5: Update build status
    build.status = EmbeddingBuildStatus::Completed;
@@ -161,4 +212,21 @@ impl RagPipeline {
    );
    Ok(build)
}
    /// Embed one batch of chunks. Returns the (start, end, vectors) tuple so
    /// out-of-order completion from `FuturesUnordered` can still be reconciled
    /// against the original chunk slice.
    async fn embed_batch(
        &self,
        batch_chunks: &[compliance_graph::graph::chunking::CodeChunk],
        start: usize,
        end: usize,
    ) -> Result<(usize, usize, Vec<Vec<f64>>), AgentError> {
        let texts: Vec<String> = batch_chunks
            .iter()
            .map(|c| format!("{}\n{}", c.context_header, c.content))
            .collect();
        let vectors = self.llm.embed(texts).await?;
        Ok((start, end, vectors))
    }
}
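The batch bookkeeping above is easy to get off-by-one wrong, so here is a standalone check of the `(start, end)` pairs that the `step_by` expression produces — same shape as the pipeline code, with `EMBED_BATCH_SIZE` shrunk for illustration:

```rust
const EMBED_BATCH_SIZE: usize = 3; // shrunk from 20 for the example

// Same expression as in run(): half-open (start, end) index pairs
// covering 0..len in steps of EMBED_BATCH_SIZE.
fn batch_ranges(len: usize) -> Vec<(usize, usize)> {
    (0..len)
        .step_by(EMBED_BATCH_SIZE)
        .map(|start| (start, (start + EMBED_BATCH_SIZE).min(len)))
        .collect()
}

fn main() {
    // 7 chunks with batch size 3: the last batch is partial.
    assert_eq!(batch_ranges(7), vec![(0, 3), (3, 6), (6, 7)]);
    // Empty input produces no batches at all.
    assert!(batch_ranges(0).is_empty());
    // The ranges tile the input exactly once.
    let total: usize = batch_ranges(100).iter().map(|(s, e)| e - s).sum();
    assert_eq!(total, 100);
}
```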
+142 -8
@@ -82,24 +82,158 @@ async fn scan_all_repos(agent: &ComplianceAgent) {
}

async fn monitor_cves(agent: &ComplianceAgent) {
    use compliance_core::models::notification::{parse_severity, CveNotification};
    use compliance_core::models::SbomEntry;
    use futures_util::StreamExt;

    // Fetch all SBOM entries grouped by repo
    let cursor = match agent.db.sbom_entries().find(doc! {}).await {
        Ok(c) => c,
        Err(e) => {
            tracing::error!("CVE monitor: failed to list SBOM entries: {e}");
            return;
        }
    };
    let entries: Vec<SbomEntry> = cursor.filter_map(|r| async { r.ok() }).collect().await;

    if entries.is_empty() {
        tracing::debug!("CVE monitor: no SBOM entries, skipping");
        return;
    }

    tracing::info!(
        "CVE monitor: checking {} dependencies for new CVEs",
        entries.len()
    );
// Build a repo_id → repo_name lookup
let repo_ids: std::collections::HashSet<String> =
entries.iter().map(|e| e.repo_id.clone()).collect();
let mut repo_names: std::collections::HashMap<String, String> =
std::collections::HashMap::new();
for rid in &repo_ids {
if let Ok(oid) = mongodb::bson::oid::ObjectId::parse_str(rid) {
if let Ok(Some(repo)) = agent.db.repositories().find_one(doc! { "_id": oid }).await {
repo_names.insert(rid.clone(), repo.name.clone());
}
}
}
// Use the existing CveScanner to query OSV.dev
let nvd_key = agent.config.nvd_api_key.as_ref().map(|k| {
use secrecy::ExposeSecret;
k.expose_secret().to_string()
});
let scanner = crate::pipeline::cve::CveScanner::new(
agent.http.clone(),
agent.config.searxng_url.clone(),
nvd_key,
);
// Group entries by repo for scanning
let mut entries_by_repo: std::collections::HashMap<String, Vec<SbomEntry>> =
std::collections::HashMap::new();
for entry in entries {
entries_by_repo
.entry(entry.repo_id.clone())
.or_default()
.push(entry);
}
let mut new_notifications = 0u32;
for (repo_id, mut repo_entries) in entries_by_repo {
let repo_name = repo_names
.get(&repo_id)
.cloned()
.unwrap_or_else(|| repo_id.clone());
// Scan dependencies for CVEs
let alerts = match scanner.scan_dependencies(&repo_id, &mut repo_entries).await {
Ok(a) => a,
Err(e) => {
tracing::warn!("CVE monitor: scan failed for {repo_name}: {e}");
continue;
}
};
// Upsert CVE alerts (existing logic)
for alert in &alerts {
let filter = doc! { "cve_id": &alert.cve_id, "repo_id": &alert.repo_id };
let update = doc! { "$setOnInsert": mongodb::bson::to_bson(alert).unwrap_or_default() };
let _ = agent
.db
.cve_alerts()
.update_one(filter, update)
.upsert(true)
.await;
}
// Update SBOM entries with discovered vulnerabilities
for entry in &repo_entries {
if entry.known_vulnerabilities.is_empty() {
continue;
}
if let Some(entry_id) = &entry.id {
let _ = agent
.db
.sbom_entries()
.update_one(
doc! { "_id": entry_id },
doc! { "$set": {
"known_vulnerabilities": mongodb::bson::to_bson(&entry.known_vulnerabilities).unwrap_or_default(),
"updated_at": mongodb::bson::DateTime::now(),
}},
)
.await;
}
}
// Create notifications for NEW CVEs (dedup against existing notifications)
for alert in &alerts {
let filter = doc! {
"cve_id": &alert.cve_id,
"repo_id": &alert.repo_id,
"package_name": &alert.affected_package,
"package_version": &alert.affected_version,
};
// Only insert if not already exists (upsert with $setOnInsert)
let severity = parse_severity(alert.severity.as_deref(), alert.cvss_score);
let mut notification = CveNotification::new(
alert.cve_id.clone(),
repo_id.clone(),
repo_name.clone(),
alert.affected_package.clone(),
alert.affected_version.clone(),
severity,
);
notification.cvss_score = alert.cvss_score;
notification.summary = alert.summary.clone();
notification.url = Some(format!("https://osv.dev/vulnerability/{}", alert.cve_id));
let update = doc! {
"$setOnInsert": mongodb::bson::to_bson(&notification).unwrap_or_default()
};
match agent
.db
.cve_notifications()
.update_one(filter, update)
.upsert(true)
.await
{
Ok(result) if result.upserted_id.is_some() => {
new_notifications += 1;
}
Err(e) => {
tracing::warn!("CVE monitor: failed to create notification: {e}");
}
_ => {} // Already exists
}
}
}
if new_notifications > 0 {
tracing::info!("CVE monitor: created {new_notifications} new notification(s)");
} else {
tracing::info!("CVE monitor: no new CVEs found");
}
}
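The per-repo grouping in `monitor_cves` uses the `HashMap::entry` API; a trimmed, self-contained sketch of the same pattern (field names simplified from the real `SbomEntry`):

```rust
use std::collections::HashMap;

struct SbomEntry {
    repo_id: String,
    package: String,
}

// Group SBOM entries by repo_id, as monitor_cves does before scanning.
fn group_by_repo(entries: Vec<SbomEntry>) -> HashMap<String, Vec<SbomEntry>> {
    let mut by_repo: HashMap<String, Vec<SbomEntry>> = HashMap::new();
    for entry in entries {
        // entry() inserts an empty Vec on first sight of a repo_id.
        by_repo.entry(entry.repo_id.clone()).or_default().push(entry);
    }
    by_repo
}

fn main() {
    let entries = vec![
        SbomEntry { repo_id: "a".into(), package: "serde".into() },
        SbomEntry { repo_id: "b".into(), package: "tokio".into() },
        SbomEntry { repo_id: "a".into(), package: "reqwest".into() },
    ];
    let grouped = group_by_repo(entries);
    assert_eq!(grouped.len(), 2);
    assert_eq!(grouped["a"].len(), 2);
    assert_eq!(grouped["b"][0].package, "tokio");
}
```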
+2
@@ -7,6 +7,7 @@ pub mod finding;
pub mod graph;
pub mod issue;
pub mod mcp;
pub mod notification;
pub mod pentest;
pub mod repository;
pub mod sbom;
@@ -27,6 +28,7 @@ pub use graph::{
};
pub use issue::{IssueStatus, TrackerIssue, TrackerType};
pub use mcp::{McpServerConfig, McpServerStatus, McpTransport};
pub use notification::{CveNotification, NotificationSeverity, NotificationStatus};
pub use pentest::{
    AttackChainNode, AttackNodeStatus, AuthMode, CodeContextHint, Environment, IdentityProvider,
    PentestAuthConfig, PentestConfig, PentestEvent, PentestMessage, PentestSession, PentestStats,
+103
@@ -0,0 +1,103 @@
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
/// Status of a CVE notification
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
#[serde(rename_all = "lowercase")]
pub enum NotificationStatus {
/// Newly created, not yet seen by the user
New,
/// User has seen it (e.g., opened the notification panel)
Read,
/// User has explicitly acknowledged/dismissed it
Dismissed,
}
/// Severity level for notification filtering
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq, PartialOrd, Ord)]
#[serde(rename_all = "lowercase")]
pub enum NotificationSeverity {
Low,
Medium,
High,
Critical,
}
/// A notification about a newly discovered CVE affecting a tracked dependency.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CveNotification {
#[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
pub id: Option<bson::oid::ObjectId>,
/// The CVE/GHSA identifier
pub cve_id: String,
/// Repository where the vulnerable dependency is used
pub repo_id: String,
/// Repository name (denormalized for display)
pub repo_name: String,
/// Affected package name
pub package_name: String,
/// Affected version
pub package_version: String,
/// Human-readable severity
pub severity: NotificationSeverity,
/// CVSS score if available
pub cvss_score: Option<f64>,
/// Short summary of the vulnerability
pub summary: Option<String>,
/// Link to vulnerability details
pub url: Option<String>,
/// Notification lifecycle status
pub status: NotificationStatus,
/// When the CVE was first detected for this dependency
#[serde(with = "super::serde_helpers::bson_datetime")]
pub created_at: DateTime<Utc>,
/// When the user last interacted with this notification
pub read_at: Option<DateTime<Utc>>,
}
impl CveNotification {
pub fn new(
cve_id: String,
repo_id: String,
repo_name: String,
package_name: String,
package_version: String,
severity: NotificationSeverity,
) -> Self {
Self {
id: None,
cve_id,
repo_id,
repo_name,
package_name,
package_version,
severity,
cvss_score: None,
summary: None,
url: None,
status: NotificationStatus::New,
created_at: Utc::now(),
read_at: None,
}
}
}
/// Map an OSV/NVD severity string to our notification severity
pub fn parse_severity(s: Option<&str>, cvss: Option<f64>) -> NotificationSeverity {
// Prefer CVSS score if available
if let Some(score) = cvss {
return match score {
s if s >= 9.0 => NotificationSeverity::Critical,
s if s >= 7.0 => NotificationSeverity::High,
s if s >= 4.0 => NotificationSeverity::Medium,
_ => NotificationSeverity::Low,
};
}
// Fall back to string severity
match s.map(|s| s.to_uppercase()).as_deref() {
Some("CRITICAL") => NotificationSeverity::Critical,
Some("HIGH") => NotificationSeverity::High,
Some("MODERATE" | "MEDIUM") => NotificationSeverity::Medium,
_ => NotificationSeverity::Low,
}
}
+1 -1
@@ -51,7 +51,7 @@ thiserror = { workspace = true }
# Web-only
reqwest = { workspace = true, optional = true }
web-sys = { version = "0.3", optional = true, features = ["Blob", "BlobPropertyBag", "HtmlAnchorElement", "Url", "Document", "Element", "Window", "Storage", "MediaQueryList"] }
js-sys = { version = "0.3", optional = true }
wasm-bindgen = { version = "0.2", optional = true }
gloo-timers = { version = "0.3", features = ["futures"], optional = true }
+181
@@ -61,6 +61,77 @@
--ease-spring: cubic-bezier(0.34, 1.56, 0.64, 1);
}
/* ── Light theme tokens ──
Applied when the user has explicitly chosen light (`data-theme="light"`)
OR when their OS prefers light AND they have made no explicit choice. */
:root[data-theme="light"] {
--bg-primary: #f5f7fb;
--bg-secondary: #ffffff;
--bg-card: rgba(255, 255, 255, 0.85);
--bg-card-solid: #ffffff;
--bg-card-hover: #f1f5fb;
--bg-elevated: #f8fafc;
--text-primary: #0c1426;
--text-secondary: #475569;
--text-tertiary: #8a9bb4;
--accent: #0070d4;
--accent-hover: #0080f0;
--accent-muted: rgba(0, 112, 212, 0.10);
--accent-glow: 0 0 20px rgba(0, 112, 212, 0.10);
--border: #e2e8f0;
--border-bright: #cbd5e1;
--border-accent: rgba(0, 112, 212, 0.30);
--danger: #dc2626;
--danger-bg: rgba(220, 38, 38, 0.08);
--warning: #d97706;
--warning-bg: rgba(217, 119, 6, 0.08);
--success: #16a34a;
--success-bg: rgba(22, 163, 74, 0.08);
--info: #2563eb;
--info-bg: rgba(37, 99, 235, 0.08);
--orange: #ea580c;
--orange-bg: rgba(234, 88, 12, 0.08);
}
@media (prefers-color-scheme: light) {
:root:not([data-theme="dark"]) {
--bg-primary: #f5f7fb;
--bg-secondary: #ffffff;
--bg-card: rgba(255, 255, 255, 0.85);
--bg-card-solid: #ffffff;
--bg-card-hover: #f1f5fb;
--bg-elevated: #f8fafc;
--text-primary: #0c1426;
--text-secondary: #475569;
--text-tertiary: #8a9bb4;
--accent: #0070d4;
--accent-hover: #0080f0;
--accent-muted: rgba(0, 112, 212, 0.10);
--accent-glow: 0 0 20px rgba(0, 112, 212, 0.10);
--border: #e2e8f0;
--border-bright: #cbd5e1;
--border-accent: rgba(0, 112, 212, 0.30);
--danger: #dc2626;
--danger-bg: rgba(220, 38, 38, 0.08);
--warning: #d97706;
--warning-bg: rgba(217, 119, 6, 0.08);
--success: #16a34a;
--success-bg: rgba(22, 163, 74, 0.08);
--info: #2563eb;
--info-bg: rgba(37, 99, 235, 0.08);
--orange: #ea580c;
--orange-bg: rgba(234, 88, 12, 0.08);
}
}
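The precedence encoded in the two token blocks above (an explicit `data-theme` attribute beats the OS preference, which only applies when nothing is stored) can be modeled as a small pure function — a sketch of the decision logic, not the actual `theme_toggle.rs` code:

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum Theme {
    Light,
    Dark,
}

// Explicit user choice (localStorage / data-theme) wins; otherwise fall
// back to the OS preference from prefers-color-scheme.
fn resolve_theme(stored: Option<Theme>, os_prefers_light: bool) -> Theme {
    match stored {
        Some(choice) => choice,
        None if os_prefers_light => Theme::Light,
        None => Theme::Dark,
    }
}

fn main() {
    // No stored preference: the OS media query decides.
    assert_eq!(resolve_theme(None, true), Theme::Light);
    assert_eq!(resolve_theme(None, false), Theme::Dark);
    // A stored preference overrides the media query either way.
    assert_eq!(resolve_theme(Some(Theme::Dark), true), Theme::Dark);
    assert_eq!(resolve_theme(Some(Theme::Light), false), Theme::Light);
}
```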
/* ── Reset & Base ── */
@@ -396,6 +467,44 @@ code {
background: rgba(0, 200, 255, 0.06);
}
.theme-toggle {
background: none;
border: none;
border-top: 1px solid var(--border);
color: var(--text-secondary);
padding: 11px 18px;
cursor: pointer;
display: flex;
align-items: center;
gap: 11px;
font-family: var(--font-body);
font-size: 13.5px;
font-weight: 500;
transition: color 0.2s, background 0.2s;
width: 100%;
text-align: left;
}
.theme-toggle:hover {
color: var(--accent);
background: var(--accent-muted);
}
.theme-toggle svg {
flex-shrink: 0;
opacity: 0.75;
transition: opacity 0.2s;
}
.theme-toggle:hover svg {
opacity: 1;
}
.sidebar.collapsed .theme-toggle {
justify-content: center;
padding: 11px 0;
}
.sidebar.collapsed .sidebar-header {
padding: 22px 0;
justify-content: center;
@@ -3847,3 +3956,75 @@ tbody tr:last-child td {
.help-chat-send:not(:disabled):hover {
background: var(--accent-hover);
}
/* ═══════════════════════════════════════════════════════════════
NOTIFICATION BELL — CVE alert dropdown
═══════════════════════════════════════════════════════════════ */
.notification-bell-wrapper { position: fixed; top: 16px; right: 28px; z-index: 48; }
.notification-bell-btn { position: relative; background: var(--bg-elevated); border: 1px solid var(--border); border-radius: 10px; padding: 8px 10px; color: var(--text-secondary); cursor: pointer; display: flex; align-items: center; transition: color 0.15s, border-color 0.15s; }
.notification-bell-btn:hover { color: var(--text-primary); border-color: var(--border-bright); }
.notification-badge { position: absolute; top: -4px; right: -4px; background: var(--danger); color: #fff; font-size: 10px; font-weight: 700; min-width: 18px; height: 18px; border-radius: 9px; display: flex; align-items: center; justify-content: center; padding: 0 4px; font-family: 'Outfit', sans-serif; }
.notification-panel { position: absolute; top: 44px; right: 0; width: 380px; max-height: 480px; background: var(--bg-secondary); border: 1px solid var(--border-bright); border-radius: 12px; overflow: hidden; box-shadow: 0 12px 48px rgba(0,0,0,0.5); display: flex; flex-direction: column; }
.notification-panel-header { display: flex; align-items: center; justify-content: space-between; padding: 12px 16px; border-bottom: 1px solid var(--border); font-family: 'Outfit', sans-serif; font-weight: 600; font-size: 14px; color: var(--text-primary); }
.notification-close-btn { background: none; border: none; color: var(--text-secondary); cursor: pointer; padding: 2px; }
.notification-panel-body { overflow-y: auto; flex: 1; padding: 8px; }
.notification-loading, .notification-empty { display: flex; flex-direction: column; align-items: center; justify-content: center; padding: 32px 16px; color: var(--text-secondary); font-size: 13px; gap: 8px; }
.notification-item { padding: 10px 12px; border-radius: 8px; margin-bottom: 4px; background: var(--bg-card); border: 1px solid var(--border); transition: border-color 0.15s; }
.notification-item:hover { border-color: var(--border-bright); }
.notification-item-header { display: flex; align-items: center; gap: 8px; margin-bottom: 4px; }
.notification-sev { font-size: 10px; font-weight: 700; padding: 2px 6px; border-radius: 4px; text-transform: uppercase; letter-spacing: 0.5px; font-family: 'Outfit', sans-serif; }
.notification-sev.sev-critical { background: var(--danger-bg); color: var(--danger); }
.notification-sev.sev-high { background: rgba(255,140,0,0.12); color: #ff8c00; }
.notification-sev.sev-medium { background: var(--warning-bg); color: var(--warning); }
.notification-sev.sev-low { background: rgba(0,200,255,0.08); color: var(--accent); }
.notification-cve-id { font-size: 12px; font-weight: 600; color: var(--text-primary); font-family: 'JetBrains Mono', monospace; }
.notification-cve-id a { color: var(--accent); text-decoration: none; }
.notification-cve-id a:hover { text-decoration: underline; }
.notification-cvss { font-size: 10px; color: var(--text-secondary); margin-left: auto; font-family: 'JetBrains Mono', monospace; }
.notification-dismiss-btn { background: none; border: none; color: var(--text-tertiary); cursor: pointer; padding: 2px; margin-left: 4px; }
.notification-dismiss-btn:hover { color: var(--danger); }
.notification-item-pkg { font-size: 12px; color: var(--text-primary); font-family: 'JetBrains Mono', monospace; }
.notification-item-repo { font-size: 11px; color: var(--text-secondary); margin-bottom: 4px; }
.notification-item-summary { font-size: 11px; color: var(--text-secondary); line-height: 1.4; display: -webkit-box; -webkit-line-clamp: 2; -webkit-box-orient: vertical; overflow: hidden; }
/* ═══════════════════════════════════════════════════════════════
COPY BUTTON — Reusable clipboard copy component
═══════════════════════════════════════════════════════════════ */
.copy-btn { background: none; border: 1px solid var(--border); border-radius: 6px; padding: 5px 7px; color: var(--text-secondary); cursor: pointer; display: inline-flex; align-items: center; transition: color 0.15s, border-color 0.15s, background 0.15s; flex-shrink: 0; }
.copy-btn:hover { color: var(--accent); border-color: var(--accent); background: var(--accent-muted); }
.copy-btn-sm { padding: 3px 5px; border-radius: 4px; }
/* Copyable inline field pattern: value + copy button side by side */
.copyable { display: flex; align-items: center; gap: 6px; }
.copyable code, .copyable .mono { flex: 1; min-width: 0; overflow: hidden; text-overflow: ellipsis; white-space: nowrap; }
.code-snippet-wrapper { position: relative; }
.code-snippet-header { display: flex; align-items: center; justify-content: space-between; margin-bottom: 4px; gap: 8px; }
/* ═══════════════════════════════════════════════════════════════
LIGHT THEME — surface overrides for the few hardcoded dark
colors that don't go through CSS custom properties.
═══════════════════════════════════════════════════════════════ */
:root[data-theme="light"] .main-content {
background-image: radial-gradient(circle at 1px 1px, rgba(100, 116, 139, 0.18) 1px, transparent 0);
}
:root[data-theme="light"] .code-block {
background: #f8fafc;
color: #0c1426;
}
:root[data-theme="light"] .graph-stab-overlay {
background: radial-gradient(ellipse at center, rgba(245, 247, 251, 0.92) 0%, rgba(245, 247, 251, 0.98) 100%);
}
@media (prefers-color-scheme: light) {
:root:not([data-theme="dark"]) .main-content {
background-image: radial-gradient(circle at 1px 1px, rgba(100, 116, 139, 0.18) 1px, transparent 0);
}
:root:not([data-theme="dark"]) .code-block {
background: #f8fafc;
color: #0c1426;
}
:root:not([data-theme="dark"]) .graph-stab-overlay {
background: radial-gradient(ellipse at center, rgba(245, 247, 251, 0.92) 0%, rgba(245, 247, 251, 0.98) 100%);
}
}
@@ -2,6 +2,7 @@ use dioxus::prelude::*;
use crate::app::Route;
use crate::components::help_chat::HelpChat;
use crate::components::notification_bell::NotificationBell;
use crate::components::sidebar::Sidebar;
use crate::components::toast::{ToastContainer, Toasts};
use crate::infrastructure::auth_check::check_auth;
@@ -21,6 +22,7 @@ pub fn AppShell() -> Element {
main { class: "main-content",
    Outlet::<Route> {}
}
NotificationBell {}
ToastContainer {}
HelpChat {}
}
@@ -30,7 +32,7 @@ pub fn AppShell() -> Element {
    // Not authenticated — redirect to Keycloak login
    rsx! {
        document::Script {
            "window.location.href = '/auth';"
        }
    }
}
@@ -1,5 +1,7 @@
use dioxus::prelude::*;

use crate::components::copy_button::CopyButton;

#[component]
pub fn CodeSnippet(
    code: String,
@@ -7,15 +9,18 @@ pub fn CodeSnippet(
    #[props(default)] line_number: u32,
) -> Element {
    rsx! {
        div { class: "code-snippet-wrapper",
            div { class: "code-snippet-header",
                if !file_path.is_empty() {
                    span {
                        style: "font-size: 12px; color: var(--text-secondary); font-family: monospace;",
                        "{file_path}"
                        if line_number > 0 {
                            ":{line_number}"
                        }
                    }
                }
                CopyButton { value: code.clone(), small: true }
            }
            pre { class: "code-block", "{code}" }
        }
@@ -0,0 +1,49 @@
use dioxus::prelude::*;
use dioxus_free_icons::icons::bs_icons::*;
use dioxus_free_icons::Icon;
/// A small copy-to-clipboard button that shows a checkmark after copying.
///
/// Usage: `CopyButton { value: "text to copy" }`
#[component]
pub fn CopyButton(value: String, #[props(default = false)] small: bool) -> Element {
let mut copied = use_signal(|| false);
let size = if small { 12 } else { 14 };
let class = if small {
"copy-btn copy-btn-sm"
} else {
"copy-btn"
};
rsx! {
button {
class: class,
title: if copied() { "Copied!" } else { "Copy to clipboard" },
onclick: move |_| {
let val = value.clone();
// Escape for JS single-quoted string
let escaped = val
.replace('\\', "\\\\")
.replace('\'', "\\'")
.replace('\n', "\\n")
.replace('\r', "\\r");
let js = format!("navigator.clipboard.writeText('{escaped}')");
document::eval(&js);
copied.set(true);
spawn(async move {
#[cfg(feature = "web")]
gloo_timers::future::TimeoutFuture::new(2000).await;
#[cfg(not(feature = "web"))]
tokio::time::sleep(std::time::Duration::from_secs(2)).await;
copied.set(false);
});
},
if copied() {
Icon { icon: BsCheckLg, width: size, height: size }
} else {
Icon { icon: BsClipboard, width: size, height: size }
}
}
}
}
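The inline escaping in `CopyButton`'s `onclick` is the kind of logic that is easy to get subtly wrong (backslashes must be escaped before quotes, or earlier escapes get doubled). Pulled out as a free function it becomes unit-testable; the helper name is hypothetical, not part of the diff:

```rust
/// Escape a string for embedding in a JS single-quoted literal,
/// mirroring the replacement chain in CopyButton's onclick handler.
/// Order matters: '\\' must be handled first so the backslashes
/// introduced by the later replacements are not themselves doubled.
fn escape_js_single_quoted(val: &str) -> String {
    val.replace('\\', "\\\\")
        .replace('\'', "\\'")
        .replace('\n', "\\n")
        .replace('\r', "\\r")
}

fn main() {
    // A lone backslash becomes two backslashes in the JS source.
    assert_eq!(escape_js_single_quoted(r"a\b"), r"a\\b");
    // Single quotes are escaped so they can't terminate the literal.
    assert_eq!(escape_js_single_quoted("it's"), r"it\'s");
    // Real newlines become the two-character sequence \n.
    assert_eq!(escape_js_single_quoted("a\nb"), r"a\nb");
    println!("ok");
}
```

Note that this covers the four characters the component handles; code copied through it should not contain other JS-significant sequences like `</script>` inside a `document::eval` context.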
@@ -2,12 +2,15 @@ pub mod app_shell;
 pub mod attack_chain;
 pub mod code_inspector;
 pub mod code_snippet;
+pub mod copy_button;
 pub mod file_tree;
 pub mod help_chat;
+pub mod notification_bell;
 pub mod page_header;
 pub mod pagination;
 pub mod pentest_wizard;
 pub mod severity_badge;
 pub mod sidebar;
 pub mod stat_card;
+pub mod theme_toggle;
 pub mod toast;
@@ -0,0 +1,155 @@
use dioxus::prelude::*;
use dioxus_free_icons::icons::bs_icons::*;
use dioxus_free_icons::Icon;
use crate::infrastructure::notifications::{
dismiss_notification, fetch_notification_count, fetch_notifications,
mark_all_notifications_read,
};
#[component]
pub fn NotificationBell() -> Element {
let mut is_open = use_signal(|| false);
let mut count = use_signal(|| 0u64);
let mut notifications = use_signal(Vec::new);
let mut is_loading = use_signal(|| false);
// Poll notification count every 30 seconds
use_resource(move || async move {
loop {
if let Ok(c) = fetch_notification_count().await {
count.set(c);
}
#[cfg(feature = "web")]
{
gloo_timers::future::TimeoutFuture::new(30_000).await;
}
#[cfg(not(feature = "web"))]
{
tokio::time::sleep(std::time::Duration::from_secs(30)).await;
}
}
});
// Load notifications when panel opens
let load_notifications = move |_| {
is_open.set(!is_open());
if !is_open() {
return;
}
is_loading.set(true);
spawn(async move {
if let Ok(resp) = fetch_notifications().await {
notifications.set(resp.data);
}
// Mark all as read when panel opens
let _ = mark_all_notifications_read().await;
count.set(0);
is_loading.set(false);
});
};
let on_dismiss = move |id: String| {
spawn(async move {
let _ = dismiss_notification(id.clone()).await;
notifications.write().retain(|n| {
n.id.as_ref()
.and_then(|v| v.get("$oid"))
.and_then(|v| v.as_str())
!= Some(&id)
});
});
};
rsx! {
div { class: "notification-bell-wrapper",
// Bell button
button {
class: "notification-bell-btn",
onclick: load_notifications,
title: "CVE Alerts",
Icon { icon: BsBell, width: 18, height: 18 }
if count() > 0 {
span { class: "notification-badge", "{count()}" }
}
}
// Dropdown panel
if is_open() {
div { class: "notification-panel",
div { class: "notification-panel-header",
span { "CVE Alerts" }
button {
class: "notification-close-btn",
onclick: move |_| is_open.set(false),
Icon { icon: BsX, width: 16, height: 16 }
}
}
div { class: "notification-panel-body",
if is_loading() {
div { class: "notification-loading", "Loading..." }
} else if notifications().is_empty() {
div { class: "notification-empty",
Icon { icon: BsShieldCheck, width: 32, height: 32 }
p { "No CVE alerts" }
}
} else {
for notif in notifications().iter() {
{
let id = notif.id.as_ref()
.and_then(|v| v.get("$oid"))
.and_then(|v| v.as_str())
.unwrap_or("")
.to_string();
let sev_class = match notif.severity.as_str() {
"critical" => "sev-critical",
"high" => "sev-high",
"medium" => "sev-medium",
_ => "sev-low",
};
let dismiss_id = id.clone();
rsx! {
div { class: "notification-item",
div { class: "notification-item-header",
span { class: "notification-sev {sev_class}",
"{notif.severity.to_uppercase()}"
}
span { class: "notification-cve-id",
if let Some(ref url) = notif.url {
a { href: "{url}", target: "_blank", "{notif.cve_id}" }
} else {
"{notif.cve_id}"
}
}
if let Some(score) = notif.cvss_score {
span { class: "notification-cvss", "CVSS {score:.1}" }
}
button {
class: "notification-dismiss-btn",
title: "Dismiss",
onclick: move |_| on_dismiss(dismiss_id.clone()),
Icon { icon: BsXCircle, width: 14, height: 14 }
}
}
div { class: "notification-item-pkg",
"{notif.package_name} {notif.package_version}"
}
div { class: "notification-item-repo",
"{notif.repo_name}"
}
if let Some(ref summary) = notif.summary {
div { class: "notification-item-summary",
"{summary}"
}
}
}
}
}
}
}
}
}
}
}
}
}
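The `_id` handling above appears twice (once for dismissal, once for rendering) because the agent API returns MongoDB extended JSON, where the id arrives as `{"$oid": "<hex>"}` rather than a plain string. A self-contained sketch of that extraction chain, using a minimal stand-in enum instead of `serde_json::Value` (the helper and the stand-in are illustrative, not part of the diff):

```rust
/// Minimal stand-in for the serde_json::Value shape the dashboard
/// receives: a Mongo _id serialized as {"$oid": "<hex>"}.
#[derive(Debug, PartialEq)]
enum Value {
    Str(String),
    Obj(Vec<(String, Value)>),
}

impl Value {
    /// Like serde_json::Value::get for object keys.
    fn get(&self, key: &str) -> Option<&Value> {
        match self {
            Value::Obj(pairs) => pairs.iter().find(|(k, _)| k == key).map(|(_, v)| v),
            _ => None,
        }
    }
    /// Like serde_json::Value::as_str.
    fn as_str(&self) -> Option<&str> {
        match self {
            Value::Str(s) => Some(s),
            _ => None,
        }
    }
}

/// Mirrors NotificationBell's chain:
/// id.as_ref().and_then(|v| v.get("$oid")).and_then(|v| v.as_str())
fn oid_str(id: Option<&Value>) -> Option<&str> {
    id.and_then(|v| v.get("$oid")).and_then(|v| v.as_str())
}

fn main() {
    let id = Value::Obj(vec![("$oid".to_string(), Value::Str("665f1c2a".to_string()))]);
    assert_eq!(oid_str(Some(&id)), Some("665f1c2a"));
    // A missing id, or an id without the $oid wrapper, yields None,
    // which the component maps to an empty string via unwrap_or("").
    assert_eq!(oid_str(None), None);
    assert_eq!(oid_str(Some(&Value::Str("plain".to_string()))), None);
    println!("ok");
}
```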
@@ -4,6 +4,7 @@ use dioxus_free_icons::icons::bs_icons::*;
 use dioxus_free_icons::Icon;
 use crate::app::Route;
+use crate::components::theme_toggle::ThemeToggle;

 struct NavItem {
     label: &'static str,
@@ -106,6 +107,7 @@ pub fn Sidebar() -> Element {
         }
         // Spacer pushes footer to the bottom
         div { class: "sidebar-spacer" }
+        ThemeToggle { collapsed: collapsed() }
         button {
             class: "sidebar-toggle",
             onclick: move |_| collapsed.set(!collapsed()),
@@ -0,0 +1,104 @@
use dioxus::prelude::*;
use dioxus_free_icons::icons::bs_icons::{BsMoonStars, BsSun};
use dioxus_free_icons::Icon;
#[cfg(feature = "web")]
const STORAGE_KEY: &str = "compliance-scanner.theme";
/// Sidebar-footer theme toggle. Reads the initial state on mount from
/// localStorage (explicit user choice) or `prefers-color-scheme` (OS default),
/// then writes back to both the `<html data-theme="...">` attribute and
/// localStorage on every click.
#[component]
pub fn ThemeToggle(collapsed: bool) -> Element {
// `None` until the on-mount effect resolves the real value, so SSR doesn't
// render the wrong icon for the user's actual theme.
let mut is_dark = use_signal(|| None::<bool>);
use_effect(move || {
let (dark, from_storage) = initial_theme();
is_dark.set(Some(dark));
// If the user already made an explicit choice (in localStorage), assert it
// on the DOM so an OS-vs-stored mismatch can't briefly show the wrong theme.
if from_storage {
apply_theme(dark);
}
});
let label = if collapsed {
""
} else if is_dark().unwrap_or(true) {
"Light mode"
} else {
"Dark mode"
};
let title = if is_dark().unwrap_or(true) {
"Switch to light mode"
} else {
"Switch to dark mode"
};
rsx! {
button {
class: "theme-toggle",
r#type: "button",
title: "{title}",
"aria-label": "{title}",
onclick: move |_| {
let next_dark = !is_dark().unwrap_or(true);
is_dark.set(Some(next_dark));
apply_theme(next_dark);
},
if is_dark().unwrap_or(true) {
Icon { icon: BsSun, width: 16, height: 16 }
} else {
Icon { icon: BsMoonStars, width: 16, height: 16 }
}
if !collapsed {
span { class: "theme-toggle-label", "{label}" }
}
}
}
}
/// Returns `(is_dark, from_storage)`. `from_storage` is true when an explicit
/// user choice is in localStorage; false when we fell back to OS preference
/// (or to the dark default).
#[cfg(feature = "web")]
fn initial_theme() -> (bool, bool) {
if let Some(window) = web_sys::window() {
if let Ok(Some(storage)) = window.local_storage() {
if let Ok(Some(value)) = storage.get_item(STORAGE_KEY) {
return (value == "dark", true);
}
}
if let Ok(Some(mql)) = window.match_media("(prefers-color-scheme: dark)") {
return (mql.matches(), false);
}
}
(true, false)
}
#[cfg(not(feature = "web"))]
fn initial_theme() -> (bool, bool) {
(true, false)
}
#[cfg(feature = "web")]
fn apply_theme(dark: bool) {
let theme = if dark { "dark" } else { "light" };
if let Some(window) = web_sys::window() {
if let Some(document) = window.document() {
if let Some(root) = document.document_element() {
let _ = root.set_attribute("data-theme", theme);
}
}
if let Ok(Some(storage)) = window.local_storage() {
let _ = storage.set_item(STORAGE_KEY, theme);
}
}
}
#[cfg(not(feature = "web"))]
fn apply_theme(_dark: bool) {}
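The precedence that `initial_theme` implements (stored choice beats OS preference beats dark default) can be modeled as a pure function, which makes the `(is_dark, from_storage)` contract easy to test without a browser. The function name and signature are illustrative; the real code reads `localStorage` and `match_media` directly:

```rust
/// Pure model of ThemeToggle's resolution order. Returns
/// (is_dark, from_storage), matching initial_theme's contract:
/// - an explicit stored choice wins and sets from_storage = true;
/// - otherwise the OS preference applies;
/// - otherwise fall back to the dark default.
fn resolve_theme(stored: Option<&str>, os_prefers_dark: Option<bool>) -> (bool, bool) {
    match stored {
        Some(value) => (value == "dark", true),
        None => (os_prefers_dark.unwrap_or(true), false),
    }
}

fn main() {
    // Stored choice wins even when the OS disagrees.
    assert_eq!(resolve_theme(Some("light"), Some(true)), (false, true));
    assert_eq!(resolve_theme(Some("dark"), Some(false)), (true, true));
    // No stored choice: follow the OS.
    assert_eq!(resolve_theme(None, Some(false)), (false, false));
    // Nothing known (e.g. matchMedia unavailable): dark default.
    assert_eq!(resolve_theme(None, None), (true, false));
    println!("ok");
}
```

The `from_storage` flag is what lets the component skip `apply_theme` on mount when the value came from the OS, so the CSS media query alone keeps handling the no-preference case.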
@@ -8,6 +8,7 @@ pub mod graph;
 pub mod help_chat;
 pub mod issues;
 pub mod mcp;
+pub mod notifications;
 pub mod pentest;
 #[allow(clippy::too_many_arguments)]
 pub mod repositories;
@@ -0,0 +1,91 @@
use dioxus::prelude::*;
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct NotificationListResponse {
pub data: Vec<CveNotificationData>,
#[serde(default)]
pub total: Option<u64>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct CveNotificationData {
#[serde(rename = "_id")]
pub id: Option<serde_json::Value>,
pub cve_id: String,
pub repo_name: String,
pub package_name: String,
pub package_version: String,
pub severity: String,
pub cvss_score: Option<f64>,
pub summary: Option<String>,
pub url: Option<String>,
pub status: String,
#[serde(default)]
pub created_at: Option<serde_json::Value>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct NotificationCountResponse {
pub count: u64,
}
#[server]
pub async fn fetch_notification_count() -> Result<u64, ServerFnError> {
let state: super::server_state::ServerState =
dioxus_fullstack::FullstackContext::extract().await?;
let url = format!("{}/api/v1/notifications/count", state.agent_api_url);
let resp = reqwest::get(&url)
.await
.map_err(|e| ServerFnError::new(e.to_string()))?;
let body: NotificationCountResponse = resp
.json()
.await
.map_err(|e| ServerFnError::new(e.to_string()))?;
Ok(body.count)
}
#[server]
pub async fn fetch_notifications() -> Result<NotificationListResponse, ServerFnError> {
let state: super::server_state::ServerState =
dioxus_fullstack::FullstackContext::extract().await?;
let url = format!("{}/api/v1/notifications?limit=20", state.agent_api_url);
let resp = reqwest::get(&url)
.await
.map_err(|e| ServerFnError::new(e.to_string()))?;
let body: NotificationListResponse = resp
.json()
.await
.map_err(|e| ServerFnError::new(e.to_string()))?;
Ok(body)
}
#[server]
pub async fn mark_all_notifications_read() -> Result<(), ServerFnError> {
let state: super::server_state::ServerState =
dioxus_fullstack::FullstackContext::extract().await?;
let url = format!("{}/api/v1/notifications/read-all", state.agent_api_url);
reqwest::Client::new()
.post(&url)
.send()
.await
.map_err(|e| ServerFnError::new(e.to_string()))?;
Ok(())
}
#[server]
pub async fn dismiss_notification(id: String) -> Result<(), ServerFnError> {
let state: super::server_state::ServerState =
dioxus_fullstack::FullstackContext::extract().await?;
let url = format!("{}/api/v1/notifications/{id}/dismiss", state.agent_api_url);
reqwest::Client::new()
.patch(&url)
.send()
.await
.map_err(|e| ServerFnError::new(e.to_string()))?;
Ok(())
}
@@ -259,7 +259,10 @@ pub fn McpServersPage() -> Element {
                     div { class: "mcp-detail-row",
                         Icon { icon: BsGlobe, width: 13, height: 13 }
                         span { class: "mcp-detail-label", "Endpoint" }
-                        code { class: "mcp-detail-value", "{server.endpoint_url}" }
+                        div { class: "copyable",
+                            code { class: "mcp-detail-value", "{server.endpoint_url}" }
+                            crate::components::copy_button::CopyButton { value: server.endpoint_url.clone(), small: true }
+                        }
                     }
                     div { class: "mcp-detail-row",
                         Icon { icon: BsHddNetwork, width: 13, height: 13 }
@@ -137,11 +137,18 @@ pub fn RepositoriesPage() -> Element {
                         "For SSH URLs: add this deploy key (read-only) to your repository"
                     }
                     div {
-                        style: "margin-top: 4px; padding: 8px; background: var(--bg-secondary); border-radius: 4px; font-family: monospace; font-size: 11px; word-break: break-all; user-select: all;",
-                        if ssh_public_key().is_empty() {
-                            "Loading..."
-                        } else {
-                            "{ssh_public_key}"
+                        class: "copyable",
+                        style: "margin-top: 4px; padding: 8px; background: var(--bg-secondary); border-radius: 4px;",
+                        code {
+                            style: "font-size: 11px; word-break: break-all; user-select: all;",
+                            if ssh_public_key().is_empty() {
+                                "Loading..."
+                            } else {
+                                "{ssh_public_key}"
+                            }
+                        }
+                        if !ssh_public_key().is_empty() {
+                            crate::components::copy_button::CopyButton { value: ssh_public_key(), small: true }
                         }
                     }
                 }
@@ -390,28 +397,37 @@ pub fn RepositoriesPage() -> Element {
                 }
                 div { class: "form-group",
                     label { "Webhook URL" }
-                    input {
-                        r#type: "text",
-                        readonly: true,
-                        style: "font-family: monospace; font-size: 12px;",
-                        value: {
-                            #[cfg(feature = "web")]
-                            let origin = web_sys::window()
-                                .and_then(|w: web_sys::Window| w.location().origin().ok())
-                                .unwrap_or_default();
-                            #[cfg(not(feature = "web"))]
-                            let origin = String::new();
-                            format!("{origin}/webhook/{}/{eid}", edit_webhook_tracker())
-                        },
+                    {
+                        #[cfg(feature = "web")]
+                        let origin = web_sys::window()
+                            .and_then(|w: web_sys::Window| w.location().origin().ok())
+                            .unwrap_or_default();
+                        #[cfg(not(feature = "web"))]
+                        let origin = String::new();
+                        let webhook_url = format!("{origin}/webhook/{}/{eid}", edit_webhook_tracker());
+                        rsx! {
+                            div { class: "copyable",
+                                input {
+                                    r#type: "text",
+                                    readonly: true,
+                                    style: "font-family: monospace; font-size: 12px; flex: 1;",
+                                    value: "{webhook_url}",
+                                }
+                                crate::components::copy_button::CopyButton { value: webhook_url.clone() }
+                            }
+                        }
                     }
                 }
                 div { class: "form-group",
                     label { "Webhook Secret" }
-                    input {
-                        r#type: "text",
-                        readonly: true,
-                        style: "font-family: monospace; font-size: 12px;",
-                        value: "{secret}",
+                    div { class: "copyable",
+                        input {
+                            r#type: "text",
+                            readonly: true,
+                            style: "font-family: monospace; font-size: 12px; flex: 1;",
+                            value: "{secret}",
+                        }
+                        crate::components::copy_button::CopyButton { value: secret.clone() }
                     }
                 }
             }