DevSecOps Application with HashiCorp Vault

Table of Contents

  1. Introduction
  2. What is HashiCorp Vault?
  3. Project Overview
  4. Architecture
  5. Prerequisites
  6. Installation
  7. Configuration
  8. Usage
  9. Security Features
  10. Example Use Cases
  11. CI/CD Pipeline
  12. Development Workflow
  13. Testing
  14. Deployment
  15. Troubleshooting
  16. Contributing
  17. License

Introduction

This project demonstrates a production-grade DevSecOps implementation using HashiCorp Vault as the centralized secrets management solution. It showcases modern software engineering practices including infrastructure as code, automated security scanning, continuous integration/continuous deployment (CI/CD), and secrets management best practices.

The application is built with Python Flask and integrates seamlessly with Vault to retrieve sensitive configuration data without exposing credentials in source code, environment variables, or configuration files.


What is HashiCorp Vault?

Overview

HashiCorp Vault is an identity-based secrets and encryption management system. It provides a unified interface to any secret while providing tight access control and recording a detailed audit log. Modern applications require access to secrets such as database credentials, API keys for external services, credentials for service-oriented architecture communication, and more. Vault provides these secrets in a secure, auditable manner.

Key Capabilities

Secrets Management Vault securely stores and tightly controls access to tokens, passwords, certificates, API keys, and other secrets. It provides a centralized workflow for distributing secrets across applications and systems, ensuring that sensitive data never appears in plain text in code repositories or configuration files.

Dynamic Secrets Vault can generate secrets on-demand for some systems, such as AWS, SQL databases, or other services. When an application needs to access a database, it asks Vault for credentials, and Vault generates a unique set of credentials with a specific time-to-live (TTL). Once the TTL expires, the credentials are automatically revoked.

Data Encryption Vault provides encryption as a service with centralized key management. Applications can encrypt and decrypt data without having to manage encryption keys themselves. Vault handles the complexity of key rotation and ensures that encryption keys are never exposed to applications.

Identity-Based Access Vault can authenticate and authorize users and applications based on trusted identities. It supports multiple authentication methods including tokens, username/password, LDAP, Kubernetes, cloud provider IAM, and more. Fine-grained policies control what secrets each identity can access.

Audit Logging Every interaction with Vault is logged to one or more audit devices. Audit logs record all requests and responses, including authentication attempts, secret access, policy changes, and administrative operations. This provides a complete audit trail for compliance and security analysis.
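
As an illustration, an audit device can also be enabled programmatically with the hvac client. The sketch below is hedged: the log file path and device mount path are assumptions, and in this project audit configuration would normally be handled by an operator or an initialization script.

import hvac

# Minimal sketch: enable a file audit device so every request and response is logged.
# The device path and log file location below are illustrative assumptions.
client = hvac.Client(url="http://localhost:8200", token="dev-token")
client.sys.enable_audit_device(
    device_type="file",
    path="file",  # mount path for the audit device
    options={"file_path": "/vault/logs/audit.log"},  # where Vault writes audit entries
)

# Confirm which audit devices are enabled.
print(client.sys.list_enabled_audit_devices())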

Why Use Vault?

Security by Default Traditional methods of storing secrets such as plaintext files, environment variables, or encrypted files in version control are fundamentally insecure. Vault ensures secrets are encrypted at rest and in transit, with fine-grained access controls and comprehensive audit logging.

Dynamic Credentials Static credentials pose significant security risks. If credentials are compromised, they remain valid until manually rotated. Vault's dynamic secrets are short-lived and automatically revoked, significantly reducing the attack surface.

Centralized Management Managing secrets across multiple applications, environments, and teams is complex and error-prone. Vault provides a single source of truth for all secrets, with a unified API and consistent access patterns.

Compliance and Governance Many regulatory frameworks require detailed audit trails of who accessed what data and when. Vault's comprehensive audit logging provides the evidence needed for compliance with standards such as SOC 2, PCI DSS, HIPAA, and GDPR.


Project Overview

Purpose

This project serves as a reference implementation for building secure, production-ready applications using modern DevSecOps practices. It demonstrates how to:

  • Integrate HashiCorp Vault for secrets management
  • Implement automated security scanning in CI/CD pipelines
  • Build containerized applications following security best practices
  • Manage infrastructure as code
  • Implement comprehensive testing and quality gates
  • Deploy applications securely with minimal manual intervention

Technology Stack

Application Layer

  • Python 3.11: Modern, secure Python runtime
  • Flask 3.0: Lightweight web framework for building RESTful APIs
  • HVAC: Official HashiCorp Vault client library for Python
  • Gunicorn: Production-grade WSGI HTTP server

Infrastructure Layer

  • Docker: Container runtime for application isolation
  • Docker Compose: Multi-container orchestration for local development
  • HashiCorp Vault 1.15: Secrets management and encryption service

Security Tools

  • Trivy: Comprehensive vulnerability scanner for containers and dependencies
  • Bandit: Security linter specifically designed for Python code
  • Gitleaks: Tool for detecting hardcoded secrets in source code
  • SonarCloud: Continuous code quality and security analysis platform

Development Tools

  • Pytest: Modern testing framework with extensive plugin ecosystem
  • Pylint: Static code analyzer for Python
  • Black: Opinionated code formatter for Python
  • Make: Build automation tool for common development tasks

Key Features

Secure Secrets Management All sensitive data including database credentials, API keys, and encryption keys are stored in Vault. The application retrieves secrets at runtime using authenticated API calls. No secrets are ever committed to version control or stored in configuration files.

Automated Security Scanning The CI/CD pipeline includes multiple security checks that run automatically on every commit. Trivy scans for known vulnerabilities in dependencies and container images. Bandit analyzes Python code for common security issues. Gitleaks prevents accidental exposure of secrets in source code.

Container Security Hardening Application containers run as non-root users with minimal privileges. Images are based on slim base images to reduce attack surface. Health checks ensure containers are functioning correctly before accepting traffic.

Comprehensive Testing Unit tests validate business logic and integration with Vault. Code coverage tracking ensures adequate test coverage. Tests run automatically in CI/CD pipelines before deployment.

Infrastructure as Code All infrastructure is defined in version-controlled configuration files. Docker Compose defines the application stack for local development. Dockerfiles specify how to build application images. This ensures consistency across development, testing, and production environments.

Developer Experience A comprehensive Makefile provides simple commands for common tasks. Color-coded output improves readability. Documentation is extensive and kept up-to-date. The project structure follows best practices and is easy to navigate.


Architecture

System Architecture

The application follows a microservices-inspired architecture with clear separation of concerns:

Application Container The Flask application runs in its own container, isolated from other services. It communicates with Vault over HTTP to retrieve secrets. The application exposes a RESTful API for client interactions. Health check endpoints allow monitoring systems to verify application status.

Vault Container Vault runs in development mode for local testing, with data stored in memory. In production, Vault would use a persistent storage backend such as Consul or integrated storage. The Vault API is exposed on port 8200 for both the application and administrators.

Network Architecture Containers communicate over a Docker bridge network. The application resolves Vault by service name using Docker's built-in DNS. Only necessary ports are exposed to the host system. In production, network policies would further restrict traffic between services.

Security Architecture

Authentication Flow The application authenticates to Vault using a token during initialization. In production, AppRole or Kubernetes authentication would be used instead of static tokens. Vault verifies the token and returns an authenticated session.
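
For reference, an AppRole login with the hvac client might look like the following sketch. The role and secret IDs are placeholders that would be injected by the platform; this project's development setup uses the static dev-token instead.

import os
import hvac

client = hvac.Client(url=os.getenv("VAULT_ADDR", "http://localhost:8200"))

# AppRole login: exchange the role_id/secret_id pair for a short-lived token.
# Both values are placeholders and would never be hardcoded.
login = client.auth.approle.login(
    role_id=os.environ["VAULT_ROLE_ID"],
    secret_id=os.environ["VAULT_SECRET_ID"],
)

# hvac stores the returned token on the client; subsequent calls are authenticated.
assert client.is_authenticated()
print("Token TTL (seconds):", login["auth"]["lease_duration"])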

Authorization Model Vault policies define what secrets each identity can access. The application policy grants read-only access to secrets under the app/ path. Vault denies access to any secrets not explicitly allowed by policy.

Secret Retrieval Flow When the application needs a secret, it makes an authenticated API call to Vault. Vault verifies the token has permission to access the requested secret. If authorized, Vault returns the secret value. The application uses the secret and never logs or persists it.

Audit Trail Every Vault interaction is logged to audit devices. Logs include timestamp, requesting identity, requested path, and result. Audit logs are immutable and suitable for compliance requirements.

Data Flow

  1. Client sends HTTP request to application API endpoint
  2. Application determines which secrets are needed to fulfill request
  3. Application authenticates to Vault and requests secrets
  4. Vault validates authentication token and checks authorization policies
  5. Vault returns requested secrets if authorized, otherwise denies access
  6. Application uses secrets to perform required operations
  7. Application returns response to client
  8. All Vault interactions are logged to audit devices

Prerequisites

Before installing this project, ensure you have the following software installed on your system:

Required Software

Docker Engine 20.10 or later

Docker Compose 2.0 or later

Python 3.11 or later

Make utility

  • Linux/macOS: Usually pre-installed
  • Windows: Install via Chocolatey (choco install make) or use WSL2
  • Verify installation: make --version

Optional Software

Git for version control

jq for JSON formatting in terminal

HashiCorp Vault CLI for manual Vault operations

System Requirements

Minimum:

  • 2 CPU cores
  • 4 GB RAM
  • 10 GB free disk space

Recommended:

  • 4 CPU cores
  • 8 GB RAM
  • 20 GB free disk space

Network Requirements

The following ports must be available on your system:

  • Port 5000: Flask application
  • Port 8200: Vault API and UI

Installation

Step 1: Clone the Repository

git clone https://github.com/yourusername/devsecops-vault-project.git
cd devsecops-vault-project

Step 2: Initial Setup

Run the setup command to create necessary configuration files:

make setup

This creates a .env file from the template. Review and modify settings as needed.

Step 3: Install Dependencies

Install Python dependencies for local development:

make install

This installs all required Python packages including Flask, HVAC, and development tools.

Step 4: Build Docker Images

Build the application Docker image:

make build

This builds the Docker image according to the Dockerfile specifications.

Step 5: Start Services

Start all services including Vault and the application:

make start

This command performs the following actions:

  • Starts Vault in development mode
  • Waits for Vault to become healthy
  • Initializes Vault with sample secrets
  • Starts the Flask application
  • Verifies all services are running correctly

Step 6: Verify Installation

Check that all services are running:

make health

You should see healthy responses from both the application and Vault.


Configuration

Environment Variables

The application uses environment variables for configuration. Create a .env file based on .env.example:

Vault Configuration

VAULT_ADDR=http://localhost:8200

The URL where Vault API is accessible. In production, this should use HTTPS.

VAULT_TOKEN=dev-token

Authentication token for Vault. In development mode, the root token is dev-token. In production, use AppRole or Kubernetes authentication instead of static tokens.

VAULT_NAMESPACE=

Vault namespace if using Vault Enterprise. Leave empty for Vault OSS.

Application Configuration

DEBUG=False

Enable or disable debug mode. Never enable debug mode in production as it exposes sensitive information.

SECRET_KEY=change-me-in-production

Secret key for Flask session encryption. Generate a random value for production using python -c "import secrets; print(secrets.token_hex(32))".

Vault Configuration

Secrets Storage

Secrets are stored in Vault's KV v2 secrets engine at the following paths:

secret/app/database - Database connection credentials
secret/app/api - External API keys and endpoints
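
Note that the KV v2 engine inserts a data/ segment into API paths, which is why the policy below references secret/data/app/* while the CLI and client libraries use the logical path. A minimal hvac read, for reference:

import hvac

client = hvac.Client(url="http://localhost:8200", token="dev-token")

# Logical path "app/database" on the "secret" mount; the REST API (and policies)
# see this as secret/data/app/database because of the KV v2 data/ prefix.
response = client.secrets.kv.v2.read_secret_version(
    path="app/database",
    mount_point="secret",
)
db_credentials = response["data"]["data"]  # the actual key/value pairs
print(db_credentials["host"], db_credentials["username"])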

Access Policies

The application uses a restrictive policy that grants read-only access to application secrets:

path "secret/data/app/*" {
  capabilities = ["read", "list"]
}

Initial Secrets

Sample secrets are automatically created during make start:

Database credentials:

  • host: postgresql.example.com
  • port: 5432
  • username: app_user
  • password: secure_password_123

API configuration: sample external API key and endpoint values stored under secret/app/api.
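
For reference, the equivalent writes with the hvac client would look roughly like the sketch below; the database values match the samples above, and the API field names are illustrative assumptions.

import hvac

client = hvac.Client(url="http://localhost:8200", token="dev-token")

# Seed the sample database credentials listed above.
client.secrets.kv.v2.create_or_update_secret(
    path="app/database",
    secret={
        "host": "postgresql.example.com",
        "port": "5432",
        "username": "app_user",
        "password": "secure_password_123",
    },
)

# Seed API configuration; these field names are illustrative assumptions.
client.secrets.kv.v2.create_or_update_secret(
    path="app/api",
    secret={"api_key": "sample_api_key", "endpoint": "https://api.example.com"},
)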

Docker Configuration

docker-compose.yml

Defines the application stack including Vault and the Flask application. Modify this file to add additional services or change network configuration.

Dockerfile

Defines how to build the application container image. The image is based on Python 3.11 slim and includes security hardening measures such as running as a non-root user.


Usage

Starting the Application

Start all services:

make start

The application will be available at http://localhost:5000 and Vault UI at http://localhost:8200.

API Endpoints

Health Check

curl http://localhost:5000/health

Returns application health status. Used by monitoring systems and load balancers.

Retrieve Secret Example

curl http://localhost:5000/api/secret

Demonstrates retrieving database credentials from Vault. Returns connection information without exposing the password.
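
A minimal sketch of how such an endpoint can return connection details while redacting the password; the route body below is illustrative, not the project's exact implementation.

from flask import Flask, jsonify
import hvac
import os

app = Flask(__name__)
vault = hvac.Client(
    url=os.getenv("VAULT_ADDR", "http://localhost:8200"),
    token=os.getenv("VAULT_TOKEN", "dev-token"),
)

@app.route("/api/secret")
def get_database_info():
    """Return database connection info from Vault without exposing the password."""
    data = vault.secrets.kv.v2.read_secret_version(path="app/database")["data"]["data"]
    return jsonify({
        "host": data["host"],
        "port": data["port"],
        "username": data["username"],
        "password_set": bool(data.get("password")),  # confirm presence only
    })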

Configuration Example

curl http://localhost:5000/api/config

Demonstrates retrieving API configuration from Vault. Returns whether API credentials are configured.

Using the Vault UI

Access the Vault UI at http://localhost:8200 and log in with the token dev-token.

Navigate to the Secrets section to view and manage secrets. The UI provides a convenient way to:

  • Browse existing secrets
  • Create new secrets
  • Update secret values
  • View secret metadata and version history
  • Manage access policies

Using the Vault CLI

List secrets:

make vault-secrets

Read database secret:

make vault-read-db

Read API secret:

make vault-read-api

Open Vault shell for manual operations:

make vault-shell

Viewing Logs

View all logs:

make logs

View application logs only:

make logs-app

View Vault logs only:

make logs-vault

Stopping Services

Stop all services:

make stop

Stop and remove all containers, networks, and volumes:

make destroy

Security Features

Secrets Management

No Hardcoded Secrets All secrets are stored in Vault and retrieved at runtime. Source code, Docker images, and configuration files contain no sensitive information.

Dynamic Secret Retrieval Secrets are fetched on-demand when needed. The application never caches secrets in memory longer than necessary.

Audit Logging Every secret access is logged in Vault's audit log with timestamp, identity, and requested path.

Vulnerability Scanning

Dependency Scanning Trivy scans Python dependencies for known CVEs. The CI/CD pipeline fails if high or critical vulnerabilities are detected.

Container Image Scanning Trivy scans Docker images for vulnerabilities in base images and installed packages.

Code Scanning Bandit analyzes Python code for security issues such as SQL injection, hardcoded passwords, and insecure functions.

Secret Detection

Pre-commit Protection Gitleaks scans commits for accidentally included secrets such as API keys, passwords, and tokens.

Repository Scanning Gitleaks scans the entire repository history to detect any secrets that may have been committed in the past.

Code Quality

Static Analysis Pylint analyzes code for bugs, code smells, and style violations. The CI/CD pipeline enforces a minimum quality threshold.

Code Coverage Pytest measures test coverage and generates reports. The pipeline requires a minimum coverage percentage before allowing deployment.

Code Formatting Black automatically formats code to ensure consistency. Isort organizes imports according to best practices.

Container Security

Non-Root User The application container runs as a non-privileged user, not as root. This limits the impact of potential container escapes.

Minimal Base Image The container uses a slim Python image to reduce attack surface. Only necessary packages are installed.

Health Checks Docker health checks ensure the application is functioning correctly. Unhealthy containers are automatically restarted.

Resource Limits Production deployments should specify CPU and memory limits to prevent resource exhaustion attacks.


Example Use Cases

This section provides practical, real-world examples demonstrating how to implement the DevSecOps practices showcased in this project. These examples will help you understand how to apply these patterns in your own applications.

Use Case 1: Dynamic Database Credentials

This example demonstrates how a Flask application retrieves database credentials on-demand from Vault, eliminating the need for hardcoded or static credentials.

Scenario

Your application needs to connect to a PostgreSQL database, but you want to avoid storing database credentials in environment variables or configuration files. Instead, credentials are fetched from Vault at runtime, and in production, Vault can generate unique, short-lived credentials for each application instance.

Step 1: Configure Vault Database Secrets Engine (Production)

For production environments, enable the database secrets engine to generate dynamic credentials:

# Enable the database secrets engine
vault secrets enable database

# Configure PostgreSQL connection
vault write database/config/postgresql \
    plugin_name=postgresql-database-plugin \
    allowed_roles="app-role" \
    connection_url="postgresql://{{username}}:{{password}}@postgresql.example.com:5432/appdb" \
    username="vault_admin" \
    password="admin_password"

# Create a role that generates credentials with 1-hour TTL
vault write database/roles/app-role \
    db_name=postgresql \
    creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}'; \
        GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO \"{{name}}\";" \
    default_ttl="1h" \
    max_ttl="24h"

Step 2: Application Code for Dynamic Credentials

Create a database connection manager that retrieves credentials from Vault:

# app/database.py
import hvac
import psycopg2
from contextlib import contextmanager
from functools import wraps
import time
import logging

logger = logging.getLogger(__name__)

class VaultDatabaseManager:
    """Manages database connections with Vault-provided credentials."""
    
    def __init__(self, vault_addr, vault_token, db_role="app-role"):
        self.client = hvac.Client(url=vault_addr, token=vault_token)
        self.db_role = db_role
        self._credentials = None
        self._lease_id = None
        self._lease_expiry = 0
    
    def _get_dynamic_credentials(self):
        """Retrieve dynamic database credentials from Vault."""
        try:
            # Request new credentials from Vault
            response = self.client.secrets.database.generate_credentials(
                name=self.db_role
            )
            
            self._credentials = {
                'username': response['data']['username'],
                'password': response['data']['password']
            }
            self._lease_id = response['lease_id']
            # Set expiry with 5-minute buffer for safety
            self._lease_expiry = time.time() + response['lease_duration'] - 300
            
            logger.info(f"Obtained new database credentials, lease expires in {response['lease_duration']}s")
            return self._credentials
            
        except Exception as e:
            logger.error(f"Failed to retrieve database credentials: {e}")
            raise
    
    def _get_static_credentials(self):
        """Retrieve static credentials from Vault KV store (for development)."""
        try:
            response = self.client.secrets.kv.v2.read_secret_version(
                path='app/database'
            )
            return {
                'host': response['data']['data']['host'],
                'port': response['data']['data']['port'],
                'username': response['data']['data']['username'],
                'password': response['data']['data']['password'],
                'database': response['data']['data'].get('database', 'appdb')
            }
        except Exception as e:
            logger.error(f"Failed to retrieve static credentials: {e}")
            raise
    
    def get_credentials(self, use_dynamic=False):
        """Get database credentials, refreshing if expired."""
        if use_dynamic:
            # Check if credentials are expired or will expire soon
            if self._credentials is None or time.time() >= self._lease_expiry:
                return self._get_dynamic_credentials()
            return self._credentials
        else:
            return self._get_static_credentials()
    
    @contextmanager
    def get_connection(self, use_dynamic=False):
        """Context manager for database connections with automatic credential handling."""
        creds = self.get_credentials(use_dynamic=use_dynamic)
        
        conn = psycopg2.connect(
            host=creds.get('host', 'postgresql.example.com'),
            port=creds.get('port', 5432),
            user=creds['username'],
            password=creds['password'],
            database=creds.get('database', 'appdb')
        )
        
        try:
            yield conn
            conn.commit()
        except Exception:
            conn.rollback()
            raise
        finally:
            conn.close()
    
    def revoke_credentials(self):
        """Revoke current database credentials (call on application shutdown)."""
        if self._lease_id:
            try:
                self.client.sys.revoke_lease(self._lease_id)
                logger.info("Database credentials revoked successfully")
            except Exception as e:
                logger.warning(f"Failed to revoke credentials: {e}")

Step 3: Flask Integration

Integrate the database manager with your Flask application:

# app/main.py
from flask import Flask, jsonify, g
from app.database import VaultDatabaseManager
import os

app = Flask(__name__)

# Initialize Vault database manager
db_manager = VaultDatabaseManager(
    vault_addr=os.getenv('VAULT_ADDR', 'http://localhost:8200'),
    vault_token=os.getenv('VAULT_TOKEN', 'dev-token')
)

@app.route('/api/users')
def get_users():
    """Example endpoint that queries the database with Vault-managed credentials."""
    try:
        with db_manager.get_connection() as conn:
            cursor = conn.cursor()
            cursor.execute("SELECT id, name, email FROM users LIMIT 10")
            users = [
                {'id': row[0], 'name': row[1], 'email': row[2]}
                for row in cursor.fetchall()
            ]
            return jsonify({'users': users})
    except Exception as e:
        return jsonify({'error': str(e)}), 500

@app.route('/api/db-status')
def db_status():
    """Check database connectivity with current credentials."""
    try:
        with db_manager.get_connection() as conn:
            cursor = conn.cursor()
            cursor.execute("SELECT 1")
            return jsonify({
                'status': 'connected',
                'message': 'Database connection successful using Vault credentials'
            })
    except Exception as e:
        return jsonify({
            'status': 'error',
            'message': str(e)
        }), 500

Step 4: Test the Implementation

# Start the application
make start

# Test database status endpoint
curl http://localhost:5000/api/db-status

# Expected response:
# {"status": "connected", "message": "Database connection successful using Vault credentials"}

Benefits of This Approach

  • No credentials in source code or configuration files
  • Credentials can be rotated without application restart
  • Each application instance gets unique credentials (with dynamic secrets)
  • Automatic credential revocation when TTL expires
  • Full audit trail of credential access in Vault

Use Case 2: Encryption as a Service

This example demonstrates how to use Vault's Transit secrets engine to encrypt sensitive data before storing it in a database, without exposing encryption keys to the application.

Scenario

Your application handles sensitive user data (such as social security numbers, payment information, or health records) that must be encrypted at rest. Rather than managing encryption keys in your application, you delegate all cryptographic operations to Vault.

Step 1: Configure Vault Transit Engine

Enable and configure the Transit secrets engine:

# Enable the Transit secrets engine
vault secrets enable transit

# Create an encryption key for user data
vault write -f transit/keys/user-data

# Create a key for payment information with automatic rotation
vault write transit/keys/payment-data \
    type=aes256-gcm96 \
    auto_rotate_period=720h  # Rotate every 30 days

Step 2: Create Encryption Service

Implement an encryption service that uses Vault Transit:

# app/encryption.py
import hvac
import base64
import logging
from typing import Optional

logger = logging.getLogger(__name__)

class VaultEncryptionService:
    """Provides encryption/decryption using Vault Transit secrets engine."""
    
    def __init__(self, vault_addr: str, vault_token: str):
        self.client = hvac.Client(url=vault_addr, token=vault_token)
        
    def encrypt(self, plaintext: str, key_name: str = "user-data") -> Optional[str]:
        """
        Encrypt plaintext using Vault Transit.
        
        Args:
            plaintext: The data to encrypt
            key_name: The Vault Transit key to use
            
        Returns:
            The ciphertext (Vault-prefixed format) or None if encryption fails
        """
        try:
            # Vault expects base64-encoded plaintext
            encoded_plaintext = base64.b64encode(plaintext.encode()).decode()
            
            response = self.client.secrets.transit.encrypt_data(
                name=key_name,
                plaintext=encoded_plaintext
            )
            
            ciphertext = response['data']['ciphertext']
            logger.debug(f"Successfully encrypted data with key '{key_name}'")
            return ciphertext
            
        except Exception as e:
            logger.error(f"Encryption failed: {e}")
            return None
    
    def decrypt(self, ciphertext: str, key_name: str = "user-data") -> Optional[str]:
        """
        Decrypt ciphertext using Vault Transit.
        
        Args:
            ciphertext: The Vault-encrypted ciphertext
            key_name: The Vault Transit key to use
            
        Returns:
            The decrypted plaintext or None if decryption fails
        """
        try:
            response = self.client.secrets.transit.decrypt_data(
                name=key_name,
                ciphertext=ciphertext
            )
            
            # Vault returns base64-encoded plaintext
            encoded_plaintext = response['data']['plaintext']
            plaintext = base64.b64decode(encoded_plaintext).decode()
            
            logger.debug(f"Successfully decrypted data with key '{key_name}'")
            return plaintext
            
        except Exception as e:
            logger.error(f"Decryption failed: {e}")
            return None
    
    def rewrap(self, ciphertext: str, key_name: str = "user-data") -> Optional[str]:
        """
        Re-encrypt data with the latest version of the key.
        Use this after key rotation to update stored ciphertext.
        
        Args:
            ciphertext: The existing ciphertext to rewrap
            key_name: The Vault Transit key to use
            
        Returns:
            New ciphertext encrypted with latest key version
        """
        try:
            response = self.client.secrets.transit.rewrap_data(
                name=key_name,
                ciphertext=ciphertext
            )
            
            new_ciphertext = response['data']['ciphertext']
            logger.info(f"Successfully rewrapped data with latest key version")
            return new_ciphertext
            
        except Exception as e:
            logger.error(f"Rewrap failed: {e}")
            return None
    
    def batch_encrypt(self, items: list, key_name: str = "user-data") -> list:
        """
        Encrypt multiple items in a single Vault request for efficiency.
        
        Args:
            items: List of plaintext strings to encrypt
            key_name: The Vault Transit key to use
            
        Returns:
            List of ciphertext strings in the same order
        """
        try:
            batch_input = [
                {"plaintext": base64.b64encode(item.encode()).decode()}
                for item in items
            ]
            
            response = self.client.secrets.transit.encrypt_data(
                name=key_name,
                batch_input=batch_input
            )
            
            return [item['ciphertext'] for item in response['data']['batch_results']]
            
        except Exception as e:
            logger.error(f"Batch encryption failed: {e}")
            return []

Step 3: User Model with Encrypted Fields

Create a user model that automatically encrypts sensitive fields:

# app/models/user.py
from dataclasses import dataclass
from typing import Optional
from app.encryption import VaultEncryptionService
import os

# Initialize encryption service
encryption = VaultEncryptionService(
    vault_addr=os.getenv('VAULT_ADDR', 'http://localhost:8200'),
    vault_token=os.getenv('VAULT_TOKEN', 'dev-token')
)

@dataclass
class User:
    """User model with encrypted sensitive fields."""
    
    id: int
    email: str
    name: str
    _ssn_encrypted: Optional[str] = None  # Stored encrypted in database
    _payment_token_encrypted: Optional[str] = None
    
    @property
    def ssn(self) -> Optional[str]:
        """Decrypt and return SSN."""
        if self._ssn_encrypted:
            return encryption.decrypt(self._ssn_encrypted, key_name="user-data")
        return None
    
    @ssn.setter
    def ssn(self, value: str):
        """Encrypt and store SSN."""
        if value:
            self._ssn_encrypted = encryption.encrypt(value, key_name="user-data")
    
    @property
    def payment_token(self) -> Optional[str]:
        """Decrypt and return payment token."""
        if self._payment_token_encrypted:
            return encryption.decrypt(self._payment_token_encrypted, key_name="payment-data")
        return None
    
    @payment_token.setter
    def payment_token(self, value: str):
        """Encrypt and store payment token."""
        if value:
            self._payment_token_encrypted = encryption.encrypt(value, key_name="payment-data")
    
    def to_db_dict(self) -> dict:
        """Return dictionary for database storage (with encrypted values)."""
        return {
            'id': self.id,
            'email': self.email,
            'name': self.name,
            'ssn_encrypted': self._ssn_encrypted,
            'payment_token_encrypted': self._payment_token_encrypted
        }
    
    @classmethod
    def from_db_row(cls, row: dict) -> 'User':
        """Create User instance from database row."""
        user = cls(
            id=row['id'],
            email=row['email'],
            name=row['name']
        )
        user._ssn_encrypted = row.get('ssn_encrypted')
        user._payment_token_encrypted = row.get('payment_token_encrypted')
        return user

Step 4: Flask API Endpoints

Create API endpoints that handle encrypted data:

# app/routes/users.py
from flask import Blueprint, request, jsonify
from app.models.user import User
from app.models.user import encryption  # the shared VaultEncryptionService instance

users_bp = Blueprint('users', __name__)

@users_bp.route('/api/users', methods=['POST'])
def create_user():
    """Create a new user with encrypted sensitive data."""
    data = request.get_json()
    
    user = User(
        id=None,  # Will be assigned by database
        email=data['email'],
        name=data['name']
    )
    
    # These will be automatically encrypted before storage
    if 'ssn' in data:
        user.ssn = data['ssn']
    if 'payment_token' in data:
        user.payment_token = data['payment_token']
    
    # In production, save to database here
    # db.session.add(user)
    # db.session.commit()
    
    return jsonify({
        'message': 'User created successfully',
        'user_id': user.id,
        'encrypted_ssn': user._ssn_encrypted[:50] + '...' if user._ssn_encrypted else None
    }), 201

@users_bp.route('/api/users/<int:user_id>/ssn', methods=['GET'])
def get_user_ssn(user_id):
    """Retrieve and decrypt user SSN (requires proper authorization)."""
    # In production, verify authorization here
    
    # Fetch user from database
    # user = User.query.get(user_id)
    
    # For demo, create sample user
    user = User(id=user_id, email="demo@example.com", name="Demo User")
    user._ssn_encrypted = "vault:v1:sample_encrypted_data"
    
    # The property accessor automatically decrypts
    decrypted_ssn = user.ssn
    
    if decrypted_ssn:
        # Mask SSN for display (show last 4 digits only)
        masked_ssn = f"***-**-{decrypted_ssn[-4:]}"
        return jsonify({'ssn_masked': masked_ssn})
    
    return jsonify({'error': 'SSN not found'}), 404

@users_bp.route('/api/encrypt', methods=['POST'])
def encrypt_data():
    """Demonstrate encryption endpoint."""
    data = request.get_json()
    plaintext = data.get('plaintext', '')
    key_name = data.get('key_name', 'user-data')
    
    ciphertext = encryption.encrypt(plaintext, key_name)
    
    if ciphertext:
        return jsonify({
            'ciphertext': ciphertext,
            'key_used': key_name
        })
    
    return jsonify({'error': 'Encryption failed'}), 500

@users_bp.route('/api/decrypt', methods=['POST'])
def decrypt_data():
    """Demonstrate decryption endpoint."""
    data = request.get_json()
    ciphertext = data.get('ciphertext', '')
    key_name = data.get('key_name', 'user-data')
    
    plaintext = encryption.decrypt(ciphertext, key_name)
    
    if plaintext:
        return jsonify({'plaintext': plaintext})
    
    return jsonify({'error': 'Decryption failed'}), 500

Step 5: Test Encryption

# Encrypt some data
curl -X POST http://localhost:5000/api/encrypt \
  -H "Content-Type: application/json" \
  -d '{"plaintext": "123-45-6789", "key_name": "user-data"}'

# Expected response:
# {"ciphertext": "vault:v1:abc123...", "key_used": "user-data"}

# Decrypt the data
curl -X POST http://localhost:5000/api/decrypt \
  -H "Content-Type: application/json" \
  -d '{"ciphertext": "vault:v1:abc123...", "key_name": "user-data"}'

# Expected response:
# {"plaintext": "123-45-6789"}

Benefits of This Approach

  • Encryption keys never leave Vault
  • Application cannot access raw encryption keys, only encrypted/decrypted data
  • Automatic key rotation with no application changes required
  • Centralized key management and audit logging
  • Supports compliance requirements (PCI DSS, HIPAA, GDPR)

Use Case 3: CI/CD Pipeline Security Integration

This example demonstrates how to integrate Vault and security scanning into a GitHub Actions CI/CD pipeline to enhance security during deployments.

Scenario

You want to build a secure CI/CD pipeline that automatically scans for vulnerabilities, retrieves deployment secrets from Vault, and ensures only secure code reaches production.

Step 1: GitHub Actions Workflow

Create a comprehensive CI/CD workflow:

# .github/workflows/secure-cicd.yml
name: Secure CI/CD Pipeline

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]

env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

jobs:
  # ============================================
  # Stage 1: Security Scanning
  # ============================================
  security-scan:
    name: Security Scanning
    runs-on: ubuntu-latest
    permissions:
      security-events: write
      contents: read
    
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0  # Full history for secret scanning

      - name: Run Gitleaks (Secret Detection)
        uses: gitleaks/gitleaks-action@v2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITLEAKS_LICENSE: ${{ secrets.GITLEAKS_LICENSE }}

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          pip install bandit safety

      - name: Run Bandit (Python Security Linter)
        run: |
          bandit -r app/ -f json -o bandit-results.json || true
          bandit -r app/ -f txt
        continue-on-error: true

      - name: Check dependencies for vulnerabilities
        run: |
          pip install -r app/requirements.txt
          safety check --json > safety-results.json || true
          safety check
        continue-on-error: true

      - name: Run Trivy filesystem scan
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: 'fs'
          scan-ref: '.'
          format: 'sarif'
          output: 'trivy-fs-results.sarif'
          severity: 'CRITICAL,HIGH'

      - name: Upload Trivy results to GitHub Security
        uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: 'trivy-fs-results.sarif'

      - name: Upload security artifacts
        uses: actions/upload-artifact@v4
        with:
          name: security-scan-results
          path: |
            bandit-results.json
            safety-results.json
            trivy-fs-results.sarif

  # ============================================
  # Stage 2: Code Quality
  # ============================================
  code-quality:
    name: Code Quality Analysis
    runs-on: ubuntu-latest
    needs: security-scan
    
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          pip install -r app/requirements.txt
          pip install -r requirements-dev.txt

      - name: Run Pylint
        run: |
          pylint app/ --output-format=json > pylint-results.json || true
          pylint app/ --exit-zero

      - name: Run tests with coverage
        run: |
          pytest tests/ --cov=app --cov-report=xml --cov-report=html -v

      - name: Upload coverage report
        uses: actions/upload-artifact@v4
        with:
          name: coverage-report
          path: htmlcov/

      - name: SonarCloud Scan
        uses: SonarSource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

  # ============================================
  # Stage 3: Build and Scan Container
  # ============================================
  build:
    name: Build and Scan Container
    runs-on: ubuntu-latest
    needs: [security-scan, code-quality]
    permissions:
      contents: read
      packages: write
      security-events: write
    
    outputs:
      image-digest: ${{ steps.build.outputs.digest }}
    
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to GitHub Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=sha,prefix=,format=long
            type=ref,event=branch
            type=semver,pattern={{version}}

      - name: Build Docker image
        id: build
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./infrastructure/Dockerfile
          push: false
          load: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Scan image with Trivy
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.sha }}
          format: 'sarif'
          output: 'trivy-image-results.sarif'
          severity: 'CRITICAL,HIGH'
          exit-code: '1'

      - name: Upload image scan results
        uses: github/codeql-action/upload-sarif@v3
        if: always()
        with:
          sarif_file: 'trivy-image-results.sarif'

      - name: Push image if scan passes
        if: success() && github.event_name != 'pull_request'
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./infrastructure/Dockerfile
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

  # ============================================
  # Stage 4: Deploy with Vault Integration
  # ============================================
  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: build
    if: github.ref == 'refs/heads/develop'
    environment: staging
    
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Import secrets from Vault
        uses: hashicorp/vault-action@v2
        with:
          url: ${{ secrets.VAULT_ADDR }}
          method: approle
          roleId: ${{ secrets.VAULT_ROLE_ID }}
          secretId: ${{ secrets.VAULT_SECRET_ID }}
          secrets: |
            secret/data/staging/database host | DB_HOST ;
            secret/data/staging/database username | DB_USER ;
            secret/data/staging/database password | DB_PASSWORD ;
            secret/data/staging/api key | API_KEY

      - name: Deploy to staging
        run: |
          echo "Deploying to staging environment..."
          # Secrets are available as environment variables
          # DB_HOST, DB_USER, DB_PASSWORD, API_KEY
          
          # Example: Update Kubernetes deployment
          # kubectl set image deployment/app app=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.sha }}
          
          echo "Deployment complete"

      - name: Run smoke tests
        run: |
          echo "Running smoke tests..."
          # curl -f https://staging.example.com/health || exit 1
          echo "Smoke tests passed"

  deploy-production:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: build
    if: github.ref == 'refs/heads/main'
    environment: production
    
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Import secrets from Vault
        uses: hashicorp/vault-action@v2
        with:
          url: ${{ secrets.VAULT_ADDR }}
          method: approle
          roleId: ${{ secrets.VAULT_ROLE_ID }}
          secretId: ${{ secrets.VAULT_SECRET_ID }}
          exportEnv: false  # More secure: don't export to env
          secrets: |
            secret/data/production/database host | DB_HOST ;
            secret/data/production/database username | DB_USER ;
            secret/data/production/database password | DB_PASSWORD

      - name: Deploy to production
        run: |
          echo "Deploying to production environment..."
          # Use secrets directly in deployment commands
          echo "Deployment complete"

      - name: Notify on success
        if: success()
        run: |
          echo "Production deployment successful!"
          # Send notification to Slack, Teams, etc.

Step 2: Vault Configuration for CI/CD

Configure Vault AppRole authentication for CI/CD:

# Enable AppRole auth method
vault auth enable approle

# Create policy for CI/CD
vault policy write cicd-policy - <<EOF
# Read staging secrets
path "secret/data/staging/*" {
  capabilities = ["read"]
}

# Read production secrets
path "secret/data/production/*" {
  capabilities = ["read"]
}

# Allow token renewal
path "auth/token/renew-self" {
  capabilities = ["update"]
}
EOF

# Create AppRole for CI/CD
vault write auth/approle/role/cicd-role \
    token_policies="cicd-policy" \
    token_ttl=10m \
    token_max_ttl=30m \
    secret_id_ttl=24h \
    secret_id_num_uses=100

# Get Role ID (store in GitHub Secrets as VAULT_ROLE_ID)
vault read auth/approle/role/cicd-role/role-id

# Generate Secret ID (store in GitHub Secrets as VAULT_SECRET_ID)
vault write -f auth/approle/role/cicd-role/secret-id

Step 3: Environment-Specific Secrets

Store deployment secrets in Vault:

# Staging environment secrets
vault kv put secret/staging/database \
    host="staging-db.example.com" \
    username="staging_app" \
    password="staging_password_123"

vault kv put secret/staging/api \
    key="staging_api_key_xyz"

# Production environment secrets
vault kv put secret/production/database \
    host="prod-db.example.com" \
    username="prod_app" \
    password="production_password_secure"

vault kv put secret/production/api \
    key="production_api_key_abc"

Step 4: Security Gates Configuration

Create a security gate that blocks deployments with critical vulnerabilities:

# .github/workflows/security-gate.yml
name: Security Gate

on:
  workflow_run:
    workflows: ["Secure CI/CD Pipeline"]
    types: [completed]

jobs:
  evaluate-security:
    runs-on: ubuntu-latest
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    
    steps:
      - name: Download security artifacts
        uses: actions/download-artifact@v4
        with:
          name: security-scan-results
          run-id: ${{ github.event.workflow_run.id }}
          github-token: ${{ secrets.GITHUB_TOKEN }}  # required to download artifacts from another workflow run

      - name: Evaluate security results
        run: |
          # Trivy's SARIF report only contains HIGH and CRITICAL findings
          # (per the scan's severity filter), so any result blocks deployment.
          FINDING_COUNT=$(jq '[.runs[]?.results[]?] | length' trivy-fs-results.sarif 2>/dev/null || echo "0")
          
          if [ "$FINDING_COUNT" -gt 0 ]; then
            echo "::error::Found $FINDING_COUNT high/critical findings. Blocking deployment."
            exit 1
          fi
          
          echo "Security gate passed. No high or critical findings."

Step 5: Local CI Simulation Script

Add to Makefile for local testing:

# Makefile additions for CI simulation

.PHONY: ci-local ci-security ci-quality ci-build

ci-local: ci-security ci-quality ci-build
	@echo "$(GREEN)Local CI pipeline completed successfully$(NC)"

ci-security:
	@echo "$(YELLOW)Running security scans...$(NC)"
	@bandit -r app/ -f txt || true
	@safety check || true
	@trivy fs . --severity HIGH,CRITICAL
	@gitleaks detect --source . --verbose

ci-quality:
	@echo "$(YELLOW)Running code quality checks...$(NC)"
	@pylint app/ --exit-zero
	@pytest tests/ --cov=app --cov-fail-under=80

ci-build:
	@echo "$(YELLOW)Building and scanning container...$(NC)"
	@docker build -f infrastructure/Dockerfile -t $(IMAGE_NAME):test .
	@trivy image $(IMAGE_NAME):test --severity HIGH,CRITICAL --exit-code 1

Benefits of This Approach

  • Secrets never stored in CI/CD configuration or logs
  • Multiple layers of security scanning catch vulnerabilities early
  • Automatic blocking of deployments with critical issues
  • Environment-specific secrets with proper access controls
  • Full audit trail of secret access during deployments
  • Consistent security checks across all deployments

Additional Examples and Patterns

Pattern: Secret Rotation Handler

Implement automatic handling of secret rotation:

# app/secret_rotation.py
import logging
import threading
import time
from app.vault_client import VaultClient

class SecretRotationHandler:
    """Handles automatic secret rotation and cache invalidation."""
    
    def __init__(self, vault_client: VaultClient, refresh_interval: int = 300):
        self.vault = vault_client
        self.refresh_interval = refresh_interval
        self._secrets_cache = {}
        self._lock = threading.Lock()
        self._running = False
    
    def start(self):
        """Start background refresh thread."""
        self._running = True
        thread = threading.Thread(target=self._refresh_loop, daemon=True)
        thread.start()
    
    def stop(self):
        """Stop background refresh thread."""
        self._running = False
    
    def _refresh_loop(self):
        """Periodically refresh secrets from Vault."""
        while self._running:
            try:
                self._refresh_all_secrets()
            except Exception as e:
                logging.error(f"Secret refresh failed: {e}")
            time.sleep(self.refresh_interval)
    
    def _refresh_all_secrets(self):
        """Refresh all cached secrets."""
        with self._lock:
            for path in list(self._secrets_cache.keys()):
                try:
                    self._secrets_cache[path] = self.vault.get_secret(path)
                except Exception:
                    del self._secrets_cache[path]
    
    def get_secret(self, path: str) -> dict:
        """Get secret with caching."""
        with self._lock:
            if path not in self._secrets_cache:
                self._secrets_cache[path] = self.vault.get_secret(path)
            return self._secrets_cache[path]
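
Typical wiring at application startup might look like the sketch below; the module name, refresh interval, and secret path are illustrative.

# app/startup.py -- illustrative wiring, not part of the original project layout
import os
from app.vault_client import VaultClient
from app.secret_rotation import SecretRotationHandler

vault = VaultClient(addr=os.getenv("VAULT_ADDR"), token=os.getenv("VAULT_TOKEN"))
rotation = SecretRotationHandler(vault, refresh_interval=300)
rotation.start()  # begin background refresh every five minutes

db_config = rotation.get_secret("app/data/database")  # served from the cache once loaded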

Pattern: Multi-Environment Configuration

Handle multiple environments with environment-scoped Vault secret paths:

# app/multi_env_config.py
import os
from app.vault_client import VaultClient

class MultiEnvironmentConfig:
    """Manage configuration across multiple environments using Vault."""
    
    ENVIRONMENTS = ['development', 'staging', 'production']
    
    def __init__(self):
        self.environment = os.getenv('APP_ENV', 'development')
        self.vault = VaultClient(
            addr=os.getenv('VAULT_ADDR'),
            token=os.getenv('VAULT_TOKEN')
        )
    
    def get_database_config(self) -> dict:
        """Get database configuration for current environment."""
        return self.vault.get_secret(f'app/{self.environment}/database')
    
    def get_api_config(self) -> dict:
        """Get API configuration for current environment."""
        return self.vault.get_secret(f'app/{self.environment}/api')
    
    def get_feature_flags(self) -> dict:
        """Get feature flags for current environment."""
        return self.vault.get_secret(f'app/{self.environment}/features')
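
Usage then reduces to setting APP_ENV per deployment, for example:

# With APP_ENV=staging, secrets are read from the app/staging/* paths.
config = MultiEnvironmentConfig()
db_config = config.get_database_config()
api_config = config.get_api_config()
feature_flags = config.get_feature_flags()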

CI/CD Pipeline

Pipeline Stages

The GitHub Actions workflow includes the following stages:

Security Scanning Stage

Runs multiple security scanners in parallel:

  • Trivy scans filesystem for vulnerabilities
  • Bandit scans Python code for security issues
  • Gitleaks scans for hardcoded secrets
  • Results are uploaded to GitHub Security tab

Code Quality Stage

Analyzes code quality:

  • Pylint performs static code analysis
  • Pytest runs unit tests with coverage measurement
  • SonarCloud performs comprehensive quality analysis
  • Pipeline fails if quality gates are not met

Build and Push Stage

Builds and publishes Docker images:

  • Builds Docker image using BuildKit for efficiency
  • Pushes image to GitHub Container Registry
  • Tags image with both latest and commit SHA
  • Scans built image for vulnerabilities

Deployment Stage

Deploys to target environment:

  • Updates deployment manifests
  • Applies infrastructure changes
  • Performs smoke tests
  • Rolls back on failure

Local CI Simulation

Run the complete CI pipeline locally before pushing:

make ci-local

This runs all checks that will run in CI:

  • Code formatting verification
  • Linting
  • Unit tests
  • Security scans

Fix any issues before pushing to avoid pipeline failures.

Pipeline Configuration

The pipeline is defined in .github/workflows/ci-cd.yml. Key configuration options:

Trigger Events

  • Push to main or develop branches
  • Pull requests to main branch

Secrets Required

  • GITHUB_TOKEN: Automatically provided by GitHub
  • SONAR_TOKEN: SonarCloud authentication token

Artifacts

  • Test coverage reports
  • Security scan results
  • Built Docker images

Development Workflow

Daily Development Workflow

Start your development session:

make start

Make code changes in your editor. The application supports hot reloading during development.

Run tests after making changes:

make test

View application logs to debug issues:

make logs-app

Format code before committing:

make format

Run linting to catch issues:

make lint

Run local CI pipeline before pushing:

make ci-local

Stop services at end of session:

make stop

Adding New Secrets

To add a new secret to Vault:

  1. Open Vault shell:
make vault-shell
  2. Create the secret:
vault kv put secret/app/newsecret key=value
  3. Update the Vault policy if needed to grant access

  4. Update the application code to retrieve the secret:

secret = vault.get_secret('app/data/newsecret')

Modifying the Application

The application code is organized as follows:

  • app/main.py: API endpoints and request handlers
  • app/vault_client.py: Vault integration logic
  • app/config.py: Configuration management
  • tests/: Unit tests

After modifying code, rebuild and restart the application:

make app-rebuild

Working with Vault Policies

Vault policies are defined in HCL files in vault/policies/.

To update a policy:

  1. Edit the policy file
  2. Apply the updated policy:
vault policy write app-policy /vault/policies/app-policy.hcl
  3. Restart the application to use the new policy

Testing

Running Tests

Run all tests:

make test

Run tests with coverage report:

make test-coverage

Run tests in watch mode (automatically re-run on file changes):

make test-watch

Writing Tests

Tests are located in the tests/ directory and use pytest.

Example test structure:

import unittest
from app.main import app

class TestAPI(unittest.TestCase):
    def setUp(self):
        self.client = app.test_client()
    
    def test_endpoint(self):
        response = self.client.get('/api/endpoint')
        self.assertEqual(response.status_code, 200)
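
Unit tests should not require a running Vault instance. One option is to stub the Vault client with pytest's monkeypatch; the sketch below assumes app.main exposes a module-level vault client with a get_secret() method, which may differ from the actual module layout.

# tests/test_secrets.py -- hedged sketch; adjust the patch target to the real layout
import pytest
import app.main as main_module
from app.main import app

class FakeVault:
    """Stand-in for the Vault client so unit tests need no running Vault."""
    def get_secret(self, path):
        return {'host': 'localhost', 'username': 'test', 'password': 'test'}

@pytest.fixture
def client(monkeypatch):
    monkeypatch.setattr(main_module, 'vault', FakeVault(), raising=False)
    return app.test_client()

def test_secret_endpoint(client):
    response = client.get('/api/secret')
    assert response.status_code == 200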

Test Coverage

The project aims for high test coverage. Coverage reports show which lines of code are executed during tests.

View coverage report:

make test-coverage
open htmlcov/index.html

Integration Testing

Integration tests verify the application works correctly with Vault:

from app.vault_client import VaultClient

def test_vault_integration():
    vault = VaultClient(addr='http://localhost:8200', token='dev-token')
    secret = vault.get_secret('app/data/database')
    assert secret is not None
    assert 'host' in secret

Deployment

Production Considerations

Vault Configuration

For production deployments:

  1. Use Vault in server mode, not development mode
  2. Configure a persistent storage backend (Consul, Integrated Storage, etc.)
  3. Enable TLS for all Vault communication
  4. Use AppRole or Kubernetes authentication instead of tokens
  5. Enable audit logging to a persistent location
  6. Configure Vault for high availability with multiple instances
  7. Implement automated secret rotation policies
  8. Set up Vault monitoring and alerting

Application Configuration

For production deployments:

  1. Use environment-specific configuration files
  2. Enable application logging to a centralized logging system
  3. Configure resource limits for containers
  4. Implement horizontal pod autoscaling
  5. Set up health checks and liveness probes
  6. Configure network policies to restrict traffic
  7. Enable application performance monitoring
  8. Implement distributed tracing

Security Hardening

Additional security measures for production:

  1. Scan images for vulnerabilities before deployment
  2. Sign images to ensure authenticity
  3. Use minimal base images
  4. Run containers as non-root users
  5. Enable read-only root filesystems where possible
  6. Configure security contexts and pod security policies
  7. Implement network segmentation
  8. Enable audit logging at all layers

Deployment to Kubernetes

Example Kubernetes deployment:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: devsecops-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: devsecops-app
  template:
    metadata:
      labels:
        app: devsecops-app
    spec:
      serviceAccountName: devsecops-app
      containers:
      - name: app
        image: ghcr.io/yourusername/devsecops-app:latest
        ports:
        - containerPort: 5000
        env:
        - name: VAULT_ADDR
          value: "https://vault.example.com"
        livenessProbe:
          httpGet:
            path: /health
            port: 5000
        readinessProbe:
          httpGet:
            path: /health
            port: 5000

Monitoring and Observability

Implement comprehensive monitoring:

Metrics

  • Application request rate, latency, and error rate
  • Vault request rate and latency
  • Container resource usage (CPU, memory, network)
  • Secret access patterns
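
As an example, request rate, latency, and error rate can be exported with the prometheus_client library (a sketch; the metric names and the /metrics route are assumptions, and prometheus_client would need to be added as a dependency):

# app/metrics.py -- illustrative instrumentation sketch
import time
from flask import Flask, Response, g, request
from prometheus_client import CONTENT_TYPE_LATEST, Counter, Histogram, generate_latest

app = Flask(__name__)

REQUESTS = Counter("app_requests_total", "HTTP requests", ["method", "endpoint", "status"])
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds", ["endpoint"])

@app.before_request
def start_timer():
    g.start_time = time.time()

@app.after_request
def record_metrics(response):
    elapsed = time.time() - getattr(g, "start_time", time.time())
    endpoint = request.endpoint or "unknown"
    LATENCY.labels(endpoint=endpoint).observe(elapsed)
    REQUESTS.labels(request.method, endpoint, str(response.status_code)).inc()
    return response

@app.route("/metrics")
def metrics():
    return Response(generate_latest(), mimetype=CONTENT_TYPE_LATEST)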

Logging

  • Application logs with structured logging
  • Vault audit logs
  • Container logs
  • Security event logs

Alerting

  • High error rates
  • Slow response times
  • Vault authentication failures
  • Security scan failures
  • Unusual secret access patterns

Troubleshooting

Common Issues

Vault Connection Refused

Symptom: Application fails to connect to Vault with "connection refused" error.

Solution:

# Check if Vault is running
make status

# Check Vault health
make vault-status

# View Vault logs
make logs-vault

# Restart Vault if needed
make restart

Vault Authentication Failed

Symptom: Application receives "permission denied" from Vault.

Solution:

# Verify token is correct
echo $VAULT_TOKEN

# Check token validity
vault token lookup

# Verify policy grants necessary permissions
vault policy read app-policy

# Re-initialize Vault if needed
make vault-init

Secret Not Found

Symptom: Application fails to retrieve secret from Vault.

Solution:

# List available secrets
make vault-secrets

# Verify secret exists
make vault-read-db

# Check secret path is correct in application code
# Ensure policy grants access to the path

Container Health Check Failing

Symptom: Container restarts repeatedly with health check failures.

Solution:

# View container logs
make logs-app

# Check application is starting correctly
docker exec devsecops-app ps aux

# Manually test health endpoint
curl http://localhost:5000/health

# Increase health check initial delay if needed

Port Already in Use

Symptom: Cannot start services due to port conflict.

Solution:

# Check what is using the port
lsof -i :5000
lsof -i :8200

# Stop conflicting service or change port in docker-compose.yml

Getting Help

If you encounter issues not covered here:

  1. Check the application logs: make logs
  2. Review Vault documentation: https://www.vaultproject.io/docs
  3. Search existing GitHub issues
  4. Open a new issue with detailed information:
    • Steps to reproduce the problem
    • Error messages and logs
    • Environment details (OS, Docker version, etc.)

Contributing

Contributions are welcome and appreciated. Please follow these guidelines:

Code Style

  • Follow PEP 8 style guide for Python code
  • Use Black for code formatting
  • Use Isort for import sorting
  • Write descriptive commit messages

Pull Request Process

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/your-feature
  3. Make your changes
  4. Run tests: make test
  5. Run security scans: make security-scan
  6. Format code: make format
  7. Commit changes: git commit -m "Add feature"
  8. Push to your fork: git push origin feature/your-feature
  9. Open a pull request with detailed description

Code Review

All pull requests require review before merging. Reviewers will check:

  • Code quality and style
  • Test coverage
  • Security implications
  • Documentation updates
  • Backward compatibility

License

This project is licensed under the MIT License. See the LICENSE file for details.


Acknowledgments

This project uses the following open source software:

  • HashiCorp Vault for secrets management
  • Flask web framework
  • Docker containerization platform
  • Trivy security scanner
  • Bandit Python security linter
  • GitHub Actions for CI/CD

Contact

For questions or support, please open an issue on GitHub or contact the maintainers.


Last Updated: December 2025
