A project to scan any given website for vulnerabilities
A web-based tool that:
- Allows users to input a target website URL
- Runs DAST scans using OWASP ZAP (and, in future, SAST/SCA tools)
- Tracks each scan run: status, time, results
- Visualizes metrics in a dashboard
- Keeps historical data for comparisons and auditing
[User UI Input]
  ↓
[API Backend]
  ↓
[Queue/Scan Manager]
  ↓
[ZAP Scanner CLI/API]
  ↓
[Report Processor (Parser)]
  ↓
[Database Storage]
  ↓
[Dashboard UI: Current & Historical Metrics]
- Website Table (table: websites) — see the SQLAlchemy sketch after these tables
- id (UUID)
- url (VARCHAR, UNIQUE)
- name (optional alias)
- created_at (timestamp)
- last_scanned_at (timestamp)
- Scan Run Table (table: scan_runs)
- id (UUID)
- website_id (FK)
- started_at (timestamp)
- ended_at (timestamp)
- status (ENUM: queued, running, success, failed)
- report_path (VARCHAR or JSONB if storing inline)
- risk_summary_json (JSON for quick access to metrics)
- Vulnerabilities Table (table: scan_vulnerabilities)
- id (UUID)
- scan_run_id (FK)
- alert_name (VARCHAR)
- risk_level (ENUM: info, low, medium, high)
- confidence (VARCHAR)
- url (TEXT)
- description (TEXT)
- solution (TEXT)
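A minimal SQLAlchemy sketch of the three tables above. Table and column names follow the plan; the enum classes, relationships, and defaults are assumptions about how the schema could be modeled, not a fixed design.

```python
# Sketch of the schema above in SQLAlchemy; enum classes and relationships
# are illustrative assumptions, the table/column names come from the plan.
import enum
import uuid
from datetime import datetime

from sqlalchemy import Column, DateTime, Enum, ForeignKey, String, Text
from sqlalchemy.dialects.postgresql import JSONB, UUID
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class ScanStatus(enum.Enum):
    queued = "queued"
    running = "running"
    success = "success"
    failed = "failed"

class RiskLevel(enum.Enum):
    info = "info"
    low = "low"
    medium = "medium"
    high = "high"

class Website(Base):
    __tablename__ = "websites"
    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    url = Column(String, unique=True, nullable=False)
    name = Column(String)                        # optional alias
    created_at = Column(DateTime, default=datetime.utcnow)
    last_scanned_at = Column(DateTime)
    scan_runs = relationship("ScanRun", back_populates="website")

class ScanRun(Base):
    __tablename__ = "scan_runs"
    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    website_id = Column(UUID(as_uuid=True), ForeignKey("websites.id"))
    started_at = Column(DateTime)
    ended_at = Column(DateTime)
    status = Column(Enum(ScanStatus), default=ScanStatus.queued)
    report_path = Column(String)
    risk_summary_json = Column(JSONB)            # quick access to metrics
    website = relationship("Website", back_populates="scan_runs")

class ScanVulnerability(Base):
    __tablename__ = "scan_vulnerabilities"
    id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    scan_run_id = Column(UUID(as_uuid=True), ForeignKey("scan_runs.id"))
    alert_name = Column(String, nullable=False)
    risk_level = Column(Enum(RiskLevel))
    confidence = Column(String)
    url = Column(Text)
    description = Column(Text)
    solution = Column(Text)
```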
Backend:
- Python (FastAPI) – API & scan orchestration
- ZAP Python API – trigger scans and fetch results (sketched below)
- Celery + Redis – queue and manage background scan jobs
- PostgreSQL – robust DB for analytics and reports
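A minimal sketch of the queue-based scan flow, assuming a Redis broker on localhost and a ZAP daemon on its default port; the endpoint path, task name, and API key are illustrative. The zapv2 calls (urlopen, ascan.scan, ascan.status, core.alerts) are from the official ZAP Python API.

```python
import time

from celery import Celery
from fastapi import FastAPI
from zapv2 import ZAPv2

app = FastAPI()
celery_app = Celery("scans", broker="redis://localhost:6379/0")

@celery_app.task
def run_zap_scan(target_url: str) -> list:
    zap = ZAPv2(apikey="changeme")           # assumes default proxy on 127.0.0.1:8080
    zap.urlopen(target_url)                  # register the target in ZAP's site tree
    scan_id = zap.ascan.scan(target_url)     # start an active scan
    while int(zap.ascan.status(scan_id)) < 100:
        time.sleep(5)                        # poll until the scan finishes
    return zap.core.alerts(baseurl=target_url)

@app.post("/scans")
def trigger_scan(url: str):
    task = run_zap_scan.delay(url)           # enqueue; a Celery worker runs the scan
    return {"task_id": task.id, "status": "queued"}
```

Keeping the scan in a Celery task (rather than the request handler) is what lets the API answer immediately with a task id while long-running scans proceed in the background.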
Frontend:
- React + TailwindCSS, or Streamlit for the MVP
- Chart.js or Plotly – for graphs
Deployment:
- Docker – containerize scanner, API, and frontend
- NGINX – reverse proxy
- Optional: RabbitMQ (for scale), Prometheus + Grafana (for ops monitoring)
Website View
- List of all websites added
- Button to trigger a new scan
- Last scan status and result summary
Scan Run View
- Risk level pie/bar chart
- List of vulnerabilities with filters (risk, confidence, etc.)
- Download report (XML/JSON/CSV)
- Trend graph over time

Vulnerability Insights
- Top recurring issues (see the query sketch below)
- Count of open/high-risk issues per site
- Heatmaps or severity timelines
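A hypothetical query for the "top recurring issues" insight, reusing the ScanVulnerability model from the schema sketch above; the function name and default limit are illustrative.

```python
from sqlalchemy import func, select
from sqlalchemy.orm import Session

def top_recurring_issues(session: Session, limit: int = 10):
    # Count how often each alert name appears across all scan runs,
    # most frequent first.
    stmt = (
        select(
            ScanVulnerability.alert_name,
            func.count(ScanVulnerability.id).label("occurrences"),
        )
        .group_by(ScanVulnerability.alert_name)
        .order_by(func.count(ScanVulnerability.id).desc())
        .limit(limit)
    )
    return session.execute(stmt).all()
```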
🛠️ Development Plan

🔹 Phase 1: MVP
- ZAP integration via CLI/API
- FastAPI backend: endpoints for scan trigger and scan results
- PostgreSQL schema setup
- Streamlit dashboard for simple visualizations (minimal sketch below)
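A minimal Streamlit sketch of what the MVP dashboard could look like; the fetch_risk_counts helper and its placeholder numbers are stand-ins for a real query against scan_vulnerabilities.

```python
import pandas as pd
import plotly.express as px
import streamlit as st

def fetch_risk_counts() -> pd.DataFrame:
    # Placeholder data: the real app would aggregate scan_vulnerabilities
    # by risk_level instead.
    return pd.DataFrame(
        {"risk_level": ["info", "low", "medium", "high"], "count": [12, 7, 4, 1]}
    )

st.title("Vulnerability Dashboard")
df = fetch_risk_counts()
st.plotly_chart(
    px.pie(df, names="risk_level", values="count", title="Findings by risk level")
)
st.dataframe(df)  # raw counts alongside the chart
```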
🔹 Phase 2: Advanced Dashboard
- Move to a full React frontend
- Scan scheduling feature (daily/weekly)
- Authentication (admin + user)
- Webhook/email alerting on high-risk findings (sketched below)
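A hedged sketch of webhook alerting, assuming the alert dicts returned by zap.core.alerts() (which carry "risk" and "alert" keys); the WEBHOOK_URL and payload shape are illustrative assumptions.

```python
import requests

WEBHOOK_URL = "https://example.com/hooks/security-alerts"  # placeholder endpoint

def notify_high_risk(alerts: list) -> None:
    # Filter for ZAP alerts rated "High" and post a summary to the webhook.
    high = [a for a in alerts if a.get("risk") == "High"]
    if not high:
        return
    requests.post(
        WEBHOOK_URL,
        json={
            "message": f"{len(high)} high-risk finding(s) detected",
            "alerts": [a.get("alert") for a in high],
        },
        timeout=10,
    )
```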
🔹 Phase 3: Tool Extensibility
- Integrate SAST tools (e.g., Bandit, Semgrep)
- Integrate SCA tools (e.g., Trivy, Snyk)
- Unified report schema (tool-agnostic)
- Plugin system for adding new tools (interface sketched below)
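One possible shape for the tool-agnostic plugin interface; the Finding dataclass and ScannerPlugin protocol are assumptions about how Phase 3 could normalize output from ZAP, Bandit, Trivy, and other tools into a unified report schema.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Finding:
    tool: str           # e.g. "zap", "bandit", "trivy"
    alert_name: str
    risk_level: str     # normalized to info/low/medium/high
    url: str
    description: str
    solution: str

class ScannerPlugin(Protocol):
    name: str

    def scan(self, target: str) -> list[Finding]:
        """Run the tool against a target and return normalized findings."""
        ...
```

Any tool that can be wrapped to return a list of Finding objects would then plug into the same report processor and database schema.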
- Secure, scalable dashboard app for scanning and reporting
- Historical tracking for all URLs
- Easy extensibility to new tools
- Exportable reports & graphs
- Queue-based scan system