diff --git a/docs/connection-testing.md b/docs/connection-testing.md
new file mode 100644
index 0000000..dbe8ecb
--- /dev/null
+++ b/docs/connection-testing.md
@@ -0,0 +1,450 @@
+# Connection Testing
+
+Use the built-in connection test tool to verify connectivity to Itential Platform before starting the server and to troubleshoot connection issues.
+
+## Overview
+
+The `test` command performs a comprehensive series of checks to validate connectivity to Itential Platform. It surfaces configuration errors, network problems, authentication failures, and other connectivity issues before they affect your MCP server operation.
+
+## Usage
+
+### Basic Test
+
+Test the connection with default settings:
+
+```bash
+itential-mcp test
+```
+
+### Verbose Output
+
+Show detailed diagnostic information:
+
+```bash
+itential-mcp test --verbose
+```
+
+### JSON Output
+
+Output results in JSON format for automation:
+
+```bash
+itential-mcp test --format json
+```
+
+### Custom Configuration
+
+Use a custom configuration file:
+
+```bash
+itential-mcp test --config /path/to/config.toml
+```
+
+### Custom Timeout
+
+Set a custom timeout (default: 30 seconds):
+
+```bash
+itential-mcp test --timeout 60
+```
+
+### Quiet Mode
+
+Suppress progress messages (JSON output only):
+
+```bash
+itential-mcp test --format json --quiet
+```
+
+## Connection Checks
+
+The test performs 7 checks in sequence:
+
+1. **Configuration** - Validates configuration is loaded and complete
+2. **DNS Resolution** - Verifies the hostname resolves to an IP address
+3. **TCP Connection** - Confirms a TCP connection can be established
+4. **TLS Handshake** - Validates the TLS/SSL handshake (if TLS is enabled)
+5. **Authentication** - Verifies authentication credentials work
+6. **Platform Health** - Checks the platform health endpoint responds
+7. **API Access** - Confirms API access with a simple query
+
+The test is **fail-fast**: it stops at the first failure so you can focus on the immediate issue.
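+
+Because of the fail-fast ordering, the first `failed` entry in the JSON `checks` array identifies the root cause. Here is a minimal sketch of consuming the report from automation (it assumes `itential-mcp` is on your `PATH`; the field names follow the JSON example below):
+
+```python
+import json
+import subprocess
+
+# Run the connection test and capture the JSON report from stdout.
+proc = subprocess.run(
+    ["itential-mcp", "test", "--format", "json", "--quiet"],
+    capture_output=True,
+    text=True,
+)
+
+report = json.loads(proc.stdout)
+if not report["success"]:
+    # Fail-fast ordering: the first failed check is the root cause.
+    failure = next(c for c in report["checks"] if c["status"] == "failed")
+    print(f"{failure['name']}: {failure['message']}")
+```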
+ +## Output Formats + +### Human-Readable Output + +Default format with colored output and clear status indicators: + +``` +✓ Configuration loaded successfully +✓ platform.example.com -> 192.168.1.100 +✓ TCP connection established (platform.example.com:3000) +✓ TLS handshake successful +✓ Authentication successful (oauth) +✓ Platform health check passed +✓ API access verified + +──────────────────────────────────────────────────────────── + +✓ Connection test: SUCCESS + + Platform version: 2024.1.0 + Authenticated as: admin + Total duration: 1.23s +``` + +### JSON Output + +Structured output suitable for automation and CI/CD: + +```json +{ + "success": true, + "duration_ms": 1234.56, + "timestamp": "2026-01-27T12:34:56Z", + "checks": [ + { + "name": "configuration", + "status": "passed", + "message": "Configuration loaded successfully", + "duration_ms": 12.34, + "details": { + "platform_host": "platform.example.com", + "platform_port": 3000, + "auth_type": "oauth", + "tls_enabled": true + } + } + ], + "platform_version": "2024.1.0", + "authenticated_user": "admin", + "summary": { + "total_checks": 7, + "passed": 7, + "failed": 0, + "skipped": 0, + "warnings": 0 + } +} +``` + +## Exit Codes + +The command returns standard exit codes: + +- **0** - All checks passed successfully +- **1** - One or more checks failed + +This makes it suitable for use in scripts and CI/CD pipelines: + +```bash +if itential-mcp test; then + echo "Connection successful" + itential-mcp run +else + echo "Connection failed" + exit 1 +fi +``` + +## Startup Testing + +You can configure the server to test the platform connection during startup. This ensures the server only starts if it can successfully connect to the platform. + +### Configuration + +#### Environment Variables + +```bash +export ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP=true +export ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT=30 +``` + +#### Configuration File + +```toml +[server] +test_connection_on_startup = true +startup_test_timeout = 30 +``` + +### Behavior + +When startup testing is enabled: + +1. Server loads configuration +2. Connection test runs automatically +3. If test succeeds, server starts normally +4. If test fails, server exits with error + +This provides **fail-fast behavior** - the server won't start if it can't connect to the platform, making misconfigurations immediately apparent. + +### Docker/Kubernetes + +When using containers, enable startup testing to validate configuration: + +```yaml +apiVersion: v1 +kind: Pod +metadata: + name: itential-mcp +spec: + containers: + - name: itential-mcp + image: itential-mcp:latest + env: + - name: ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP + value: "true" + - name: ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT + value: "30" + - name: ITENTIAL_MCP_PLATFORM_HOST + value: "platform.itential.svc.cluster.local" + livenessProbe: + httpGet: + path: /status/livez + port: 8000 + initialDelaySeconds: 35 # Account for connection test + startup + periodSeconds: 10 + readinessProbe: + httpGet: + path: /status/readyz + port: 8000 + initialDelaySeconds: 35 + periodSeconds: 5 +``` + +## Troubleshooting + +### DNS Resolution Failure + +**Error:** +``` +✗ Could not resolve hostname 'platform.example.com' +``` + +**Suggestions:** +1. Check hostname spelling +2. Verify DNS server is reachable +3. 
Try using IP address directly for testing
+
+**Fix:**
+```bash
+# Test DNS resolution manually
+nslookup platform.example.com
+
+# Or use IP address
+export ITENTIAL_MCP_PLATFORM_HOST=192.168.1.100
+```
+
+### TCP Connection Refused
+
+**Error:**
+```
+✗ Connection refused to platform.example.com:3000
+```
+
+**Suggestions:**
+1. Verify platform is running
+2. Check port number is correct
+3. Verify platform is listening on this address
+
+**Fix:**
+```bash
+# Check if platform is listening
+netstat -an | grep 3000
+
+# Or test connection manually
+telnet platform.example.com 3000
+```
+
+### TLS Certificate Verification Failed
+
+**Error:**
+```
+✗ TLS certificate verification failed
+```
+
+**Suggestions:**
+1. Update hostname to match certificate CN
+2. Obtain valid certificate for this hostname
+3. Disable verification (not recommended):
+   `ITENTIAL_MCP_PLATFORM_DISABLE_VERIFY=true`
+
+**Fix:**
+```bash
+# Check certificate details
+openssl s_client -connect platform.example.com:3000 -showcerts
+
+# Disable verification for testing (not recommended for production)
+export ITENTIAL_MCP_PLATFORM_DISABLE_VERIFY=true
+```
+
+### Authentication Failed
+
+**Error:**
+```
+✗ Authentication failed
+```
+
+**Suggestions:**
+1. Verify username and password are correct
+2. Check the user exists in Itential Platform
+3. Ensure the user has API access permissions
+4. For OAuth, verify issuer and token configuration
+
+**Fix:**
+```bash
+# Verify credentials
+export ITENTIAL_MCP_PLATFORM_USER=admin
+export ITENTIAL_MCP_PLATFORM_PASSWORD=correct_password
+
+# Check auth type
+export ITENTIAL_MCP_AUTH_TYPE=oauth # or basic
+```
+
+### Platform Health Check Failed
+
+**Error:**
+```
+✗ Platform health check failed
+```
+
+**Possible causes:**
+1. Platform is starting up (wait and retry)
+2. Platform is experiencing issues (check logs)
+3. Health endpoint is not available
+
+**Fix:**
+```bash
+# Check platform logs
+# Wait a moment and retry
+
+# Test health endpoint manually
+curl https://platform.example.com:3000/health
+```
+
+### API Access Failed
+
+**Error:**
+```
+✗ API access verification failed
+```
+
+**Possible causes:**
+1. User lacks required permissions
+2. API endpoint is unavailable
+3. Request format is invalid
+
+**Fix:**
+```bash
+# Verify user has appropriate RBAC permissions in Itential Platform
+# Check platform documentation for required permissions
+```
+
+## Examples
+
+### CI/CD Pipeline Integration
+
+```yaml
+# GitLab CI example
+test_connection:
+  stage: test
+  script:
+    - itential-mcp test --format json --quiet
+  only:
+    - main
+    - develop
+
+deploy:
+  stage: deploy
+  script:
+    - itential-mcp test
+    - docker build -t itential-mcp:latest .
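+    # The push below runs only if the preceding connection test and build succeed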
+ - docker push itential-mcp:latest + needs: + - test_connection +``` + +### Health Check Script + +```bash +#!/bin/bash +# health-check.sh + +# Run connection test +if itential-mcp test --format json --quiet > /tmp/connection-test.json; then + # Extract platform version + VERSION=$(jq -r '.platform_version' /tmp/connection-test.json) + echo "Platform version: $VERSION" + exit 0 +else + # Extract error message + ERROR=$(jq -r '.error' /tmp/connection-test.json) + echo "Connection failed: $ERROR" + exit 1 +fi +``` + +### Verbose Troubleshooting + +```bash +# Run with maximum verbosity +itential-mcp test --verbose --timeout 60 + +# Output shows: +# - Detailed timing for each check +# - Configuration values +# - TLS protocol and cipher information +# - Full error stack traces +``` + +## Configuration Reference + +### Command-Line Options + +| Option | Type | Default | Description | +|--------|------|---------|-------------| +| `--config` | string | - | Path to configuration file | +| `--timeout` | int | 30 | Maximum time for test in seconds | +| `--verbose` | flag | false | Show detailed diagnostic information | +| `--format` | string | human | Output format (human or json) | +| `--quiet` | flag | false | Suppress progress messages | + +### Environment Variables + +| Variable | Type | Default | Description | +|----------|------|---------|-------------| +| `ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP` | bool | false | Test connection during startup | +| `ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT` | int | 30 | Startup test timeout in seconds | +| `ITENTIAL_MCP_PLATFORM_HOST` | string | localhost | Platform hostname | +| `ITENTIAL_MCP_PLATFORM_PORT` | int | 0 | Platform port (0 = auto: 443 for HTTPS, 80 for HTTP) | +| `ITENTIAL_MCP_PLATFORM_DISABLE_TLS` | bool | false | Disable TLS/HTTPS | + +All standard platform configuration variables also apply (host, port, auth, etc.). + +**Note:** When `ITENTIAL_MCP_PLATFORM_PORT` is set to `0` (default), the connection test will automatically use: +- Port **443** if TLS is enabled (default) +- Port **80** if TLS is disabled + +To specify a custom port, set `ITENTIAL_MCP_PLATFORM_PORT` to the desired value: +```bash +export ITENTIAL_MCP_PLATFORM_PORT=3000 +``` + +## Best Practices + +1. **Run Before Deployment** - Test connectivity before deploying to production +2. **Enable Startup Testing** - Use in production to fail fast on misconfigurations +3. **Automate in CI/CD** - Include connection testing in your deployment pipeline +4. **Use JSON for Automation** - Parse JSON output in scripts for detailed diagnostics +5. **Monitor Exit Codes** - Check exit codes in scripts for proper error handling +6. **Increase Timeout for Slow Networks** - Adjust timeout for high-latency connections +7. 
**Use Verbose Mode for Troubleshooting** - Enable verbose output when debugging issues + +## See Also + +- [Integration Guide](integration.md) - Setting up MCP client integration +- [Configuration](configuration.md) - Complete configuration reference +- [Authentication](authentication.md) - Authentication methods and setup +- [TLS Configuration](tls.md) - TLS/SSL setup and troubleshooting diff --git a/docs/design/connection-test-feature.md b/docs/design/connection-test-feature.md new file mode 100644 index 0000000..d29ea80 --- /dev/null +++ b/docs/design/connection-test-feature.md @@ -0,0 +1,833 @@ +# Design Spec: Connection Test Feature + +**Issue:** [#299](https://github.com/itential/itential-mcp/issues/299) +**Status:** Draft +**Author:** Claude +**Created:** 2026-01-27 +**Last Updated:** 2026-01-27 + +## Problem Statement + +Currently, the connection to Itential Platform is only established when a tool is invoked. If the connection is misconfigured (invalid credentials, incorrect host, network issues, TLS problems, etc.), users don't discover the problem until they attempt to use a tool, resulting in a poor user experience and difficult troubleshooting. + +Users need a way to: +1. Proactively validate their connection configuration +2. Diagnose connection issues before attempting to use tools +3. Ensure the server is properly configured during startup/deployment + +## Goals + +1. **CLI Command**: Add a `test-connection` command to validate platform connectivity +2. **Startup Validation**: Add optional connection testing during server startup +3. **Comprehensive Diagnostics**: Provide detailed feedback about connection failures +4. **Backward Compatibility**: Feature is opt-in and maintains current server behavior by default +5. **Fast Feedback**: Connection test should complete quickly (< 5 seconds in normal conditions) + +## Non-Goals + +1. Connection monitoring/health checks after startup (out of scope - see existing keepalive) +2. Automatic retry or connection recovery logic +3. Connection pooling changes +4. Performance testing or load testing capabilities +5. Testing individual tool availability (focus on platform connectivity only) + +## User Stories + +### Story 1: Initial Setup Validation +**As a** platform administrator +**I want to** test my connection configuration before starting the server +**So that** I can verify credentials and network settings are correct + +```bash +$ itential-mcp test-connection +Testing connection to Itential Platform... +✓ Configuration loaded successfully +✓ DNS resolution: platform.example.com -> 192.168.1.100 +✓ TCP connection established (192.168.1.100:3000) +✓ TLS handshake successful +✓ Authentication successful (OAuth) +✓ Platform health check passed +✓ API access verified + +Connection test: SUCCESS +Platform version: 2024.1.0 +Authenticated as: admin@example.com +Response time: 1.2s +``` + +### Story 2: Troubleshooting Connection Issues +**As a** developer +**I want to** see detailed error messages when connection fails +**So that** I can quickly identify and fix configuration problems + +```bash +$ itential-mcp test-connection +Testing connection to Itential Platform... 
+✓ Configuration loaded successfully +✓ DNS resolution: platform.example.com -> 192.168.1.100 +✓ TCP connection established (192.168.1.100:3000) +✗ TLS handshake failed + +Error: TLS certificate verification failed + Certificate: CN=platform.internal.com + Expected: CN=platform.example.com + Reason: Hostname mismatch + +Suggestion: Either update ITENTIAL_MCP_PLATFORM_HOST to match certificate, + or set ITENTIAL_MCP_PLATFORM_DISABLE_VERIFY=true (not recommended) + +Connection test: FAILED +``` + +### Story 3: Deployment Validation +**As a** DevOps engineer +**I want to** validate connection during container/pod startup +**So that** Kubernetes health checks fail fast if misconfigured + +```bash +$ itential-mcp run --test-connection-on-startup +[2026-01-27 10:30:00] INFO: Testing platform connection... +[2026-01-27 10:30:01] INFO: Connection test successful +[2026-01-27 10:30:01] INFO: Starting MCP server (transport=sse, port=8000) +``` + +### Story 4: CI/CD Pipeline Validation +**As a** CI/CD pipeline +**I want to** validate connection configuration in integration tests +**So that** deployment proceeds only with valid configuration + +```bash +$ itential-mcp test-connection --format json --quiet +{ + "success": true, + "duration_ms": 1234, + "platform_version": "2024.1.0", + "authenticated_user": "admin@example.com", + "checks": { + "config": "passed", + "dns": "passed", + "tcp": "passed", + "tls": "passed", + "auth": "passed", + "health": "passed", + "api_access": "passed" + } +} +``` + +## Technical Design + +### Architecture Overview + +``` +┌─────────────────────────────────────────────────────────────┐ +│ CLI Entry Point │ +│ (runtime/commands.py) │ +└──────────────────────────┬──────────────────────────────────┘ + │ + ▼ +┌─────────────────────────────────────────────────────────────┐ +│ Connection Test Orchestrator │ +│ (runtime/connection_test.py - NEW) │ +│ - Coordinates test sequence │ +│ - Formats output (human/json) │ +│ - Handles errors and suggestions │ +└──────────────────────────┬──────────────────────────────────┘ + │ + ▼ +┌─────────────────────────────────────────────────────────────┐ +│ Connection Test Service │ +│ (platform/connection_test.py - NEW) │ +│ - Performs individual checks │ +│ - Returns detailed results │ +│ - Measures timing │ +└──────────────────────────┬──────────────────────────────────┘ + │ + ▼ +┌─────────────────────────────────────────────────────────────┐ +│ Platform Client │ +│ (platform/client.py - MODIFIED) │ +│ - Validates connection │ +│ - Exposes test methods │ +└─────────────────────────────────────────────────────────────┘ +``` + +### Component Details + +#### 1. CLI Command: `test-connection` + +**Location:** `src/itential_mcp/runtime/commands.py` + +```python +async def test_connection( + *, + config_file: str | None = None, + format: str = "human", # human, json + verbose: bool = False, + timeout: int = 30, +) -> int: + """Test connection to Itential Platform. + + Returns: + int: Exit code (0=success, 1=failure) + """ +``` + +**Arguments:** +- `--config`: Path to configuration file +- `--format`: Output format (human, json) +- `--verbose`: Show detailed diagnostic information +- `--timeout`: Maximum time for test (default: 30s) +- `--quiet`: Suppress progress messages (JSON output only) + +#### 2. Server Startup Option + +**Configuration:** +```python +# src/itential_mcp/config/models.py +@dataclass +class ServerConfig: + # ... existing fields ... 
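+    # New opt-in settings; the defaults preserve current startup behavior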
+ test_connection_on_startup: bool = False # NEW + startup_test_timeout: int = 30 # NEW +``` + +**Environment Variables:** +```bash +ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP=true|false +ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT=30 +``` + +**Server Initialization:** +```python +# src/itential_mcp/server/server.py +async def run(self): + if self.config.server.test_connection_on_startup: + await self._test_connection_on_startup() + # ... continue with normal startup +``` + +#### 3. Connection Test Service + +**Location:** `src/itential_mcp/platform/connection_test.py` (NEW) + +```python +from dataclasses import dataclass +from enum import Enum + +class CheckStatus(str, Enum): + PASSED = "passed" + FAILED = "failed" + SKIPPED = "skipped" + WARNING = "warning" + +@dataclass +class CheckResult: + name: str + status: CheckStatus + message: str + duration_ms: float + details: dict[str, Any] | None = None + suggestion: str | None = None + +@dataclass +class ConnectionTestResult: + success: bool + duration_ms: float + checks: list[CheckResult] + platform_version: str | None = None + authenticated_user: str | None = None + error: str | None = None + +class ConnectionTestService: + """Performs connection validation checks.""" + + def __init__(self, config: Config): + self.config = config + + async def run_all_checks( + self, + timeout: int = 30 + ) -> ConnectionTestResult: + """Run all connection checks.""" + + async def check_configuration(self) -> CheckResult: + """Validate configuration is loaded and valid.""" + + async def check_dns_resolution(self) -> CheckResult: + """Verify hostname resolves.""" + + async def check_tcp_connection(self) -> CheckResult: + """Verify TCP connection can be established.""" + + async def check_tls_handshake(self) -> CheckResult: + """Verify TLS handshake (if enabled).""" + + async def check_authentication(self) -> CheckResult: + """Verify authentication succeeds.""" + + async def check_platform_health(self) -> CheckResult: + """Verify platform health endpoint responds.""" + + async def check_api_access(self) -> CheckResult: + """Verify API access with simple query.""" +``` + +#### 4. Output Formatting + +**Human-Readable Format:** +``` +Testing connection to Itential Platform... 
+✓ Configuration loaded successfully +✓ DNS resolution: platform.example.com -> 192.168.1.100 +✓ TCP connection established (192.168.1.100:3000) +✓ TLS handshake successful +✓ Authentication successful (OAuth) +✓ Platform health check passed +✓ API access verified + +Connection test: SUCCESS +Platform version: 2024.1.0 +Authenticated as: admin@example.com +Response time: 1.2s +``` + +**JSON Format:** +```json +{ + "success": true, + "duration_ms": 1234, + "platform_version": "2024.1.0", + "authenticated_user": "admin@example.com", + "checks": [ + { + "name": "configuration", + "status": "passed", + "message": "Configuration loaded successfully", + "duration_ms": 12 + }, + { + "name": "dns", + "status": "passed", + "message": "platform.example.com -> 192.168.1.100", + "duration_ms": 45, + "details": { + "hostname": "platform.example.com", + "ip_address": "192.168.1.100" + } + }, + { + "name": "tcp", + "status": "passed", + "message": "TCP connection established", + "duration_ms": 123 + }, + { + "name": "tls", + "status": "passed", + "message": "TLS handshake successful", + "duration_ms": 234, + "details": { + "protocol": "TLSv1.3", + "cipher": "TLS_AES_256_GCM_SHA384" + } + }, + { + "name": "authentication", + "status": "passed", + "message": "Authentication successful (OAuth)", + "duration_ms": 456 + }, + { + "name": "health", + "status": "passed", + "message": "Platform health check passed", + "duration_ms": 234 + }, + { + "name": "api_access", + "status": "passed", + "message": "API access verified", + "duration_ms": 130 + } + ] +} +``` + +### Test Sequence + +The connection test performs checks in this order (fail-fast approach): + +1. **Configuration Validation** + - Verify config is loaded + - Check required fields present + - Validate field values + +2. **DNS Resolution** + - Resolve hostname to IP address + - Report IP address + - Fail if hostname doesn't resolve + +3. **TCP Connection** + - Establish TCP connection to host:port + - Measure connection time + - Fail if connection refused/timeout + +4. **TLS Handshake** (if TLS enabled) + - Perform TLS handshake + - Verify certificate (if verify enabled) + - Report TLS version and cipher + - Fail on certificate errors (with helpful suggestions) + +5. **Authentication** + - Attempt authentication with configured method + - Report auth method and user + - Fail on auth errors (invalid credentials, expired tokens, etc.) + +6. **Platform Health Check** + - Call `/health` or equivalent endpoint + - Verify platform is responding + - Report platform version if available + +7. 
**API Access Verification** + - Make a simple API call (e.g., list adapters with limit=1) + - Verify API access is working + - Fail if API returns errors + +### Error Handling & Suggestions + +For common errors, provide actionable suggestions: + +| Error | Suggestion | +|-------|-----------| +| DNS resolution failed | Check hostname spelling, verify DNS configuration, ensure network connectivity | +| Connection refused | Verify platform is running, check host:port configuration, check firewall rules | +| Connection timeout | Check network connectivity, verify host is reachable, increase timeout if needed | +| TLS certificate verification failed | Update hostname to match certificate, or disable verification (not recommended for production) | +| Invalid credentials | Verify username/password, check OAuth configuration, ensure user exists | +| Unauthorized | Check user has required permissions, verify API access is enabled | +| Platform health check failed | Platform may be starting up, check platform logs, verify platform is healthy | + +## Implementation Plan + +### Phase 1: Core Functionality (Minimal Viable Feature) +- [ ] Create `ConnectionTestService` with basic checks +- [ ] Add `test-connection` CLI command +- [ ] Implement human-readable output format +- [ ] Add basic error messages +- [ ] Unit tests for connection test service + +### Phase 2: Enhanced Diagnostics +- [ ] Add JSON output format +- [ ] Implement detailed check results +- [ ] Add error suggestions +- [ ] Add verbose mode +- [ ] Integration tests + +### Phase 3: Startup Integration +- [ ] Add `test_connection_on_startup` config option +- [ ] Integrate with server startup flow +- [ ] Add startup test timeout handling +- [ ] Update documentation + +### Phase 4: Advanced Features (Optional) +- [ ] Add `--check` flag to test specific checks only +- [ ] Add timing metrics for each check +- [ ] Add TLS certificate details output +- [ ] Add network diagnostic info (latency, bandwidth) + +## File Changes + +### New Files +``` +src/itential_mcp/platform/connection_test.py # Connection test service +src/itential_mcp/runtime/connection_test.py # CLI orchestration +tests/test_connection_test.py # Unit tests +tests/test_runtime_connection_test.py # CLI tests +docs/connection-testing.md # User documentation +docs/design/connection-test-feature.md # This document +``` + +### Modified Files +``` +src/itential_mcp/config/models.py # Add ServerConfig fields +src/itential_mcp/defaults.py # Add default values +src/itential_mcp/runtime/commands.py # Add test_connection command +src/itential_mcp/runtime/handlers.py # Add command handler +src/itential_mcp/runtime/parser.py # Add CLI arguments +src/itential_mcp/server/server.py # Add startup test +docs/README.md # Add quick reference +``` + +## Configuration Examples + +### Command Line +```bash +# Basic test +itential-mcp test-connection + +# With custom config +itential-mcp test-connection --config prod.conf + +# JSON output for scripting +itential-mcp test-connection --format json --quiet + +# Verbose diagnostics +itential-mcp test-connection --verbose + +# Custom timeout +itential-mcp test-connection --timeout 60 +``` + +### Environment Variables +```bash +# Enable startup test +ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP=true + +# Set startup test timeout +ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT=30 +``` + +### Configuration File +```toml +[server] +test_connection_on_startup = true +startup_test_timeout = 30 +``` + +### Docker/Kubernetes +```yaml +apiVersion: v1 +kind: Pod +metadata: 
+ name: itential-mcp +spec: + containers: + - name: itential-mcp + image: itential-mcp:latest + env: + - name: ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP + value: "true" + - name: ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT + value: "30" + livenessProbe: + httpGet: + path: /status/livez + port: 8000 + initialDelaySeconds: 35 # Account for connection test + periodSeconds: 10 +``` + +## Testing Strategy + +### Unit Tests +```python +# tests/test_connection_test.py + +@pytest.mark.asyncio +async def test_check_configuration_success(): + """Test configuration check passes with valid config.""" + +@pytest.mark.asyncio +async def test_check_configuration_failure(): + """Test configuration check fails with invalid config.""" + +@pytest.mark.asyncio +async def test_check_dns_resolution_success(): + """Test DNS resolution succeeds.""" + +@pytest.mark.asyncio +async def test_check_authentication_oauth_success(): + """Test OAuth authentication succeeds.""" + +@pytest.mark.asyncio +async def test_run_all_checks_fail_fast(): + """Test that checks stop on first failure.""" +``` + +### Integration Tests +```python +# tests/test_runtime_connection_test.py + +@pytest.mark.asyncio +async def test_test_connection_command_success(mock_platform): + """Test test-connection command with successful connection.""" + +@pytest.mark.asyncio +async def test_test_connection_command_failure(mock_platform): + """Test test-connection command with connection failure.""" + +@pytest.mark.asyncio +async def test_test_connection_json_output(mock_platform): + """Test JSON output format.""" +``` + +### Manual Testing Scenarios +1. Valid configuration with working platform +2. Invalid hostname (DNS failure) +3. Wrong port (connection refused) +4. TLS certificate mismatch +5. Invalid credentials +6. Platform down (health check failure) +7. Network timeout +8. OAuth configuration issues + +## Performance Considerations + +### Timing Targets +- **Fast Path** (all checks pass): < 2 seconds +- **DNS Resolution**: < 100ms +- **TCP Connection**: < 500ms +- **TLS Handshake**: < 500ms +- **Authentication**: < 1000ms (depends on auth method) +- **Health Check**: < 500ms +- **API Access**: < 500ms + +### Timeout Handling +- Default timeout: 30 seconds +- Configurable per check +- Fail fast on first error +- Overall timeout prevents hung tests + +### Resource Usage +- Single connection (no pooling needed) +- Minimal memory footprint +- No persistent connections +- Clean resource cleanup + +## Security Considerations + +1. **Credential Exposure** + - Don't log credentials in error messages + - Filter sensitive data from verbose output + - JSON output should not include passwords/tokens + +2. **TLS Validation** + - Warn if certificate verification is disabled + - Report certificate details in verbose mode + - Suggest proper certificate configuration + +3. **Authentication** + - Support all existing auth methods + - Don't expose token values in output + - Handle expired tokens gracefully + +## Backward Compatibility + +### Breaking Changes +None - feature is entirely opt-in + +### Migration Path +Not applicable - new feature with no existing users + +### Deprecations +None + +### Default Behavior +- `test_connection_on_startup` defaults to `false` +- Maintains current server behavior by default +- Users must explicitly enable startup testing + +## Documentation Updates + +### User Documentation +1. 
**docs/connection-testing.md** (NEW) + - Overview of connection testing feature + - CLI command usage examples + - Startup testing configuration + - Troubleshooting guide with common errors + - JSON output schema + +2. **docs/README.md** (UPDATE) + - Add `test-connection` to command reference + - Add quick start example + +3. **docs/troubleshooting.md** (UPDATE or NEW - see #301) + - Add section on connection testing + - Reference connection test command + +### Developer Documentation +1. **AGENTS.md** (UPDATE) + - Document new modules + - Add architecture diagram + - Explain test sequence + +2. **CHANGELOG.md** (UPDATE) + - Add feature to unreleased section + +## Success Metrics + +### Qualitative +- Users can diagnose connection issues faster +- Deployment failures are caught earlier +- Support burden reduced (self-service diagnostics) + +### Quantitative +- Connection test completes in < 2s (success case) +- Connection test completes in < 5s (failure case with timeout) +- Test coverage > 95% for new code +- Zero regression in existing functionality + +## Open Questions + +1. **Should we add a `--fix` option to auto-correct common issues?** + - Pro: Better UX, faster resolution + - Con: Security risk, complexity, limited use cases + - **Decision:** Not in initial implementation, consider for future + +2. **Should we test individual tool availability?** + - Pro: More comprehensive validation + - Con: Slow, complexity, tools are dynamic + - **Decision:** Out of scope - focus on platform connectivity only + +3. **Should we cache successful connection tests?** + - Pro: Faster repeated tests + - Con: Cache invalidation complexity, false positives + - **Decision:** No caching - tests should always be fresh + +4. **Should startup test failure prevent server start?** + - Pro: Fail fast, prevents broken deployments + - Con: Could block legitimate use cases (temporary outages) + - **Decision:** Yes, fail fast - users can disable if needed + +5. **Should we add continuous connection monitoring?** + - Pro: Detect issues during operation + - Con: Complexity, overlaps with keepalive, out of scope + - **Decision:** Out of scope for this feature (keepalive handles this) + +## Future Enhancements + +### Near-term (Next Release) +- Add `--check` flag to run specific checks only +- Add timing metrics to human output +- Add color output for better readability + +### Long-term (Future Releases) +- Connection monitoring dashboard/metrics +- Automated remediation suggestions +- Performance benchmarking mode +- Certificate expiration warnings +- Network latency measurements + +## References + +- Issue: https://github.com/itential/itential-mcp/issues/299 +- Related: Issue #301 (Troubleshooting guide) +- Existing keepalive implementation: `src/itential_mcp/server/keepalive.py` +- Platform health endpoints: `src/itential_mcp/server/routes.py` +- Configuration system: `src/itential_mcp/config/` + +## Appendix A: Error Message Examples + +### DNS Resolution Failure +``` +✗ DNS resolution failed + +Error: Could not resolve hostname 'platform.example.com' + +Suggestion: + 1. Check hostname spelling in configuration + 2. Verify DNS server is reachable + 3. Try using IP address directly for testing + +Configuration: + ITENTIAL_MCP_PLATFORM_HOST=platform.example.com +``` + +### TLS Certificate Mismatch +``` +✗ TLS handshake failed + +Error: Certificate hostname mismatch + Certificate CN: platform.internal.com + Expected hostname: platform.example.com + +Suggestion: + 1. 
Update ITENTIAL_MCP_PLATFORM_HOST to 'platform.internal.com' + 2. Or obtain certificate for 'platform.example.com' + 3. Or disable verification (not recommended): + ITENTIAL_MCP_PLATFORM_DISABLE_VERIFY=true +``` + +### Authentication Failure +``` +✗ Authentication failed + +Error: Invalid credentials (401 Unauthorized) + +Suggestion: + 1. Verify username and password are correct + 2. Check user exists in Itential Platform + 3. Ensure user has API access permissions + 4. For OAuth, verify issuer and token configuration + +Configuration: + ITENTIAL_MCP_AUTH_TYPE=oauth + ITENTIAL_MCP_AUTH_ISSUER=https://auth.example.com + ITENTIAL_MCP_PLATFORM_USER=admin +``` + +## Appendix B: JSON Schema + +```json +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "type": "object", + "required": ["success", "duration_ms", "checks"], + "properties": { + "success": { + "type": "boolean", + "description": "Overall test result" + }, + "duration_ms": { + "type": "number", + "description": "Total test duration in milliseconds" + }, + "platform_version": { + "type": "string", + "description": "Itential Platform version (if available)" + }, + "authenticated_user": { + "type": "string", + "description": "Authenticated username (if auth succeeded)" + }, + "error": { + "type": "string", + "description": "Error message if test failed" + }, + "checks": { + "type": "array", + "items": { + "type": "object", + "required": ["name", "status", "message", "duration_ms"], + "properties": { + "name": { + "type": "string", + "enum": ["configuration", "dns", "tcp", "tls", "authentication", "health", "api_access"] + }, + "status": { + "type": "string", + "enum": ["passed", "failed", "skipped", "warning"] + }, + "message": { + "type": "string", + "description": "Human-readable result message" + }, + "duration_ms": { + "type": "number", + "description": "Check duration in milliseconds" + }, + "details": { + "type": "object", + "description": "Additional check-specific details" + }, + "suggestion": { + "type": "string", + "description": "Actionable suggestion if check failed" + } + } + } + } + } +} +``` diff --git a/docs/design/connection-test-implementation-plan.md b/docs/design/connection-test-implementation-plan.md new file mode 100644 index 0000000..c463d48 --- /dev/null +++ b/docs/design/connection-test-implementation-plan.md @@ -0,0 +1,2461 @@ +# Implementation Plan: Connection Test Feature + +**Design Spec:** [connection-test-feature.md](connection-test-feature.md) +**Issue:** [#299](https://github.com/itential/itential-mcp/issues/299) +**Status:** Ready for Implementation +**Created:** 2026-01-27 +**Last Updated:** 2026-01-27 + +## Overview + +This document provides a detailed, phased implementation plan for the connection test feature. The implementation is broken into 4 phases, each building on the previous phase and delivering incremental value. + +## Implementation Phases + +| Phase | Focus | Deliverables | Effort | +|-------|-------|--------------|--------| +| 1 | Core Foundation | Connection test service, basic checks | 2-3 days | +| 2 | CLI Integration | test-connection command, human output | 1-2 days | +| 3 | Enhanced Output | JSON format, detailed diagnostics, suggestions | 1-2 days | +| 4 | Startup Integration | Startup testing, configuration options | 1 day | + +**Total Estimated Effort:** 5-8 days + +## Phase 1: Core Foundation + +**Objective:** Build the core connection test service with basic connectivity checks. 
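+
+For orientation, calling code is expected to drive the service roughly like this (a sketch based on the interfaces defined in this plan; `load_config` is a hypothetical stand-in for the project's configuration loader):
+
+```python
+import asyncio
+
+from itential_mcp.platform.connection_test import ConnectionTestService
+
+
+async def main() -> int:
+    config = load_config()  # hypothetical: substitute the project's config loader
+    service = ConnectionTestService(config)
+
+    result = await service.run_all_checks(timeout=30)
+    for check in result.checks:
+        print(f"{check.name}: {check.status.value} ({check.duration_ms:.0f} ms)")
+
+    return 0 if result.success else 1
+
+
+if __name__ == "__main__":
+    raise SystemExit(asyncio.run(main()))
+```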
+ +**Success Criteria:** +- ✓ Connection test service can perform all 7 checks +- ✓ Each check returns structured results +- ✓ Fail-fast behavior works correctly +- ✓ Unit test coverage > 95% +- ✓ No regression in existing functionality + +### Tasks + +#### Task 1.1: Create Connection Test Service Module +**File:** `src/itential_mcp/platform/connection_test.py` (NEW) +**Estimated Time:** 4-6 hours + +**Implementation Steps:** + +1. Create the file with copyright header and module docstring +2. Define data models: + ```python + from dataclasses import dataclass + from enum import Enum + from typing import Any + + class CheckStatus(str, Enum): + """Status of a connection check.""" + PASSED = "passed" + FAILED = "failed" + SKIPPED = "skipped" + WARNING = "warning" + + @dataclass + class CheckResult: + """Result of a single connection check.""" + name: str + status: CheckStatus + message: str + duration_ms: float + details: dict[str, Any] | None = None + error: Exception | None = None + suggestion: str | None = None + + @dataclass + class ConnectionTestResult: + """Overall result of connection testing.""" + success: bool + duration_ms: float + checks: list[CheckResult] + platform_version: str | None = None + authenticated_user: str | None = None + error: str | None = None + ``` + +3. Create `ConnectionTestService` class skeleton: + ```python + class ConnectionTestService: + """Service for testing platform connectivity.""" + + def __init__(self, config: Config): + """Initialize connection test service. + + Args: + config: Application configuration. + """ + self.config = config + self._logger = logging.getLogger(__name__) + ``` + +4. Implement utility methods: + ```python + def _measure_time(self, func): + """Decorator to measure check execution time.""" + + def _create_check_result( + self, + name: str, + status: CheckStatus, + message: str, + duration_ms: float, + **kwargs + ) -> CheckResult: + """Create a CheckResult instance.""" + ``` + +**Acceptance Criteria:** +- [ ] Module created with proper structure +- [ ] Data models defined with type hints +- [ ] Class skeleton implemented +- [ ] Code passes `make check` + +--- + +#### Task 1.2: Implement Configuration Check +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 1 hour + +```python +async def check_configuration(self) -> CheckResult: + """Validate configuration is loaded and valid. + + Returns: + CheckResult: Configuration check result. + """ + start = time.perf_counter() + + try: + # Validate required fields + if not self.config.platform.host: + return self._create_check_result( + name="configuration", + status=CheckStatus.FAILED, + message="Platform host not configured", + duration_ms=(time.perf_counter() - start) * 1000, + suggestion="Set ITENTIAL_MCP_PLATFORM_HOST environment variable" + ) + + # Validate auth configuration + if not self.config.auth.type: + return self._create_check_result( + name="configuration", + status=CheckStatus.FAILED, + message="Authentication type not configured", + duration_ms=(time.perf_counter() - start) * 1000, + suggestion="Set ITENTIAL_MCP_AUTH_TYPE environment variable" + ) + + # Additional validation checks... 
+ + return self._create_check_result( + name="configuration", + status=CheckStatus.PASSED, + message="Configuration loaded successfully", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "platform_host": self.config.platform.host, + "platform_port": self.config.platform.port, + "auth_type": self.config.auth.type, + "tls_enabled": not self.config.platform.disable_tls, + } + ) + + except Exception as e: + return self._create_check_result( + name="configuration", + status=CheckStatus.FAILED, + message=f"Configuration error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e + ) +``` + +**Acceptance Criteria:** +- [ ] Validates all required configuration fields +- [ ] Returns detailed configuration info +- [ ] Handles missing config gracefully +- [ ] Unit tests cover success and failure cases + +--- + +#### Task 1.3: Implement DNS Resolution Check +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 1 hour + +```python +async def check_dns_resolution(self) -> CheckResult: + """Verify hostname resolves to IP address. + + Returns: + CheckResult: DNS resolution check result. + """ + import socket + + start = time.perf_counter() + hostname = self.config.platform.host + + try: + # Attempt DNS resolution + ip_address = socket.gethostbyname(hostname) + + return self._create_check_result( + name="dns", + status=CheckStatus.PASSED, + message=f"{hostname} -> {ip_address}", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "hostname": hostname, + "ip_address": ip_address, + } + ) + + except socket.gaierror as e: + return self._create_check_result( + name="dns", + status=CheckStatus.FAILED, + message=f"Could not resolve hostname '{hostname}'", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + f"1. Check hostname spelling: {hostname}\n" + "2. Verify DNS server is reachable\n" + "3. Try using IP address directly for testing" + ) + ) + + except Exception as e: + return self._create_check_result( + name="dns", + status=CheckStatus.FAILED, + message=f"DNS resolution error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e + ) +``` + +**Acceptance Criteria:** +- [ ] Resolves hostname to IP address +- [ ] Handles resolution failures gracefully +- [ ] Provides helpful error suggestions +- [ ] Unit tests with mocked socket calls + +--- + +#### Task 1.4: Implement TCP Connection Check +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 1-2 hours + +```python +async def check_tcp_connection(self) -> CheckResult: + """Verify TCP connection can be established. + + Returns: + CheckResult: TCP connection check result. + """ + import socket + import asyncio + + start = time.perf_counter() + host = self.config.platform.host + port = self.config.platform.port + + try: + # Attempt TCP connection with timeout + loop = asyncio.get_event_loop() + sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + sock.settimeout(5.0) + + await loop.run_in_executor( + None, + sock.connect, + (host, port) + ) + + sock.close() + + return self._create_check_result( + name="tcp", + status=CheckStatus.PASSED, + message=f"TCP connection established ({host}:{port})", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "host": host, + "port": port, + } + ) + + except socket.timeout: + return self._create_check_result( + name="tcp", + status=CheckStatus.FAILED, + message=f"Connection timeout to {host}:{port}", + duration_ms=(time.perf_counter() - start) * 1000, + suggestion=( + f"1. 
Verify platform is running at {host}:{port}\n" + "2. Check network connectivity\n" + "3. Verify firewall rules allow connection\n" + "4. Increase timeout if network is slow" + ) + ) + + except ConnectionRefusedError: + return self._create_check_result( + name="tcp", + status=CheckStatus.FAILED, + message=f"Connection refused to {host}:{port}", + duration_ms=(time.perf_counter() - start) * 1000, + suggestion=( + f"1. Verify platform is running at {host}:{port}\n" + "2. Check port number is correct\n" + "3. Verify platform is listening on this address" + ) + ) + + except Exception as e: + return self._create_check_result( + name="tcp", + status=CheckStatus.FAILED, + message=f"TCP connection error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e + ) +``` + +**Acceptance Criteria:** +- [ ] Establishes TCP connection +- [ ] Handles timeout appropriately +- [ ] Handles connection refused +- [ ] Provides helpful suggestions +- [ ] Unit tests with mocked socket + +--- + +#### Task 1.5: Implement TLS Handshake Check +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 2-3 hours + +```python +async def check_tls_handshake(self) -> CheckResult: + """Verify TLS handshake succeeds (if TLS enabled). + + Returns: + CheckResult: TLS handshake check result. + """ + start = time.perf_counter() + + # Skip if TLS is disabled + if self.config.platform.disable_tls: + return self._create_check_result( + name="tls", + status=CheckStatus.SKIPPED, + message="TLS disabled in configuration", + duration_ms=(time.perf_counter() - start) * 1000 + ) + + import ssl + import socket + + host = self.config.platform.host + port = self.config.platform.port + + try: + # Create SSL context + context = ssl.create_default_context() + + if self.config.platform.disable_verify: + context.check_hostname = False + context.verify_mode = ssl.CERT_NONE + + # Perform TLS handshake + with socket.create_connection((host, port), timeout=5.0) as sock: + with context.wrap_socket(sock, server_hostname=host) as ssock: + cipher = ssock.cipher() + version = ssock.version() + + return self._create_check_result( + name="tls", + status=CheckStatus.PASSED, + message="TLS handshake successful", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "protocol": version, + "cipher": cipher[0] if cipher else None, + } + ) + + except ssl.SSLCertVerificationError as e: + return self._create_check_result( + name="tls", + status=CheckStatus.FAILED, + message="TLS certificate verification failed", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + f"Certificate verification failed for {host}\n\n" + "Options:\n" + "1. Update hostname to match certificate CN\n" + "2. Obtain valid certificate for this hostname\n" + "3. 
Disable verification (not recommended):\n" + " ITENTIAL_MCP_PLATFORM_DISABLE_VERIFY=true" + ) + ) + + except Exception as e: + return self._create_check_result( + name="tls", + status=CheckStatus.FAILED, + message=f"TLS handshake error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e + ) +``` + +**Acceptance Criteria:** +- [ ] Performs TLS handshake +- [ ] Reports TLS version and cipher +- [ ] Skips when TLS disabled +- [ ] Handles certificate errors gracefully +- [ ] Provides actionable suggestions +- [ ] Unit tests with mocked SSL + +--- + +#### Task 1.6: Implement Authentication Check +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 2-3 hours + +```python +async def check_authentication(self) -> CheckResult: + """Verify authentication succeeds. + + Returns: + CheckResult: Authentication check result. + """ + from itential_mcp.platform.client import PlatformClient + from itential_mcp.core.exceptions import AuthenticationException + + start = time.perf_counter() + + try: + # Create temporary client for auth test + async with PlatformClient() as client: + # Attempt to get current user info (validates auth) + # This depends on ipsdk exposing user info endpoint + # For now, we'll try a simple API call that requires auth + + # The mere fact that we can create the client and it doesn't + # throw an auth error means auth is working + + auth_type = self.config.auth.type + user = self.config.platform.user if hasattr(self.config.platform, "user") else None + + return self._create_check_result( + name="authentication", + status=CheckStatus.PASSED, + message=f"Authentication successful ({auth_type})", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "auth_type": auth_type, + "user": user, + } + ) + + except AuthenticationException as e: + return self._create_check_result( + name="authentication", + status=CheckStatus.FAILED, + message="Authentication failed", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + "Authentication failed. Please check:\n" + "1. Username and password are correct\n" + "2. User exists in Itential Platform\n" + "3. User has API access permissions\n" + "4. For OAuth, verify issuer and token configuration" + ) + ) + + except Exception as e: + return self._create_check_result( + name="authentication", + status=CheckStatus.FAILED, + message=f"Authentication error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e + ) +``` + +**Acceptance Criteria:** +- [ ] Validates authentication +- [ ] Reports auth type and user +- [ ] Handles auth failures +- [ ] Provides helpful suggestions +- [ ] Unit tests with mocked client + +--- + +#### Task 1.7: Implement Platform Health Check +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 1 hour + +```python +async def check_platform_health(self) -> CheckResult: + """Verify platform health endpoint responds. + + Returns: + CheckResult: Platform health check result. 
+ """ + from itential_mcp.platform.client import PlatformClient + + start = time.perf_counter() + + try: + async with PlatformClient() as client: + # Call health endpoint + health = await client.health.get_status_health() + + platform_version = health.get("platform", {}).get("version") + + return self._create_check_result( + name="health", + status=CheckStatus.PASSED, + message="Platform health check passed", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "platform_version": platform_version, + "status": health.get("status"), + } + ) + + except Exception as e: + return self._create_check_result( + name="health", + status=CheckStatus.FAILED, + message=f"Platform health check failed: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + "Platform health check failed. Possible causes:\n" + "1. Platform is starting up (wait and retry)\n" + "2. Platform is experiencing issues (check logs)\n" + "3. Health endpoint is not available" + ) + ) +``` + +**Acceptance Criteria:** +- [ ] Calls health endpoint +- [ ] Extracts platform version +- [ ] Handles health check failures +- [ ] Unit tests with mocked client + +--- + +#### Task 1.8: Implement API Access Check +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 1 hour + +```python +async def check_api_access(self) -> CheckResult: + """Verify API access with simple query. + + Returns: + CheckResult: API access check result. + """ + from itential_mcp.platform.client import PlatformClient + + start = time.perf_counter() + + try: + async with PlatformClient() as client: + # Make a simple API call that requires permissions + # Use adapters list with limit=1 for minimal overhead + adapters = await client.adapters.get_adapters(limit=1) + + return self._create_check_result( + name="api_access", + status=CheckStatus.PASSED, + message="API access verified", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "api_call": "GET /adapters?limit=1", + "response_count": len(adapters.get("results", [])), + } + ) + + except Exception as e: + return self._create_check_result( + name="api_access", + status=CheckStatus.FAILED, + message=f"API access verification failed: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + "API access verification failed. Possible causes:\n" + "1. User lacks required permissions\n" + "2. API endpoint is unavailable\n" + "3. Request format is invalid" + ) + ) +``` + +**Acceptance Criteria:** +- [ ] Makes simple API call +- [ ] Verifies response is valid +- [ ] Handles API errors +- [ ] Unit tests with mocked client + +--- + +#### Task 1.9: Implement Main Test Runner +**File:** `src/itential_mcp/platform/connection_test.py` +**Estimated Time:** 2 hours + +```python +async def run_all_checks(self, timeout: int = 30) -> ConnectionTestResult: + """Run all connection checks in sequence. + + Args: + timeout: Maximum time for all checks in seconds. + + Returns: + ConnectionTestResult: Overall test results. 
+ """ + import asyncio + + overall_start = time.perf_counter() + checks: list[CheckResult] = [] + + try: + # Run checks in order, fail-fast on failures + check_methods = [ + self.check_configuration, + self.check_dns_resolution, + self.check_tcp_connection, + self.check_tls_handshake, + self.check_authentication, + self.check_platform_health, + self.check_api_access, + ] + + for check_method in check_methods: + try: + # Run with timeout + result = await asyncio.wait_for( + check_method(), + timeout=timeout + ) + checks.append(result) + + # Fail fast - stop on first failure + if result.status == CheckStatus.FAILED: + break + + except asyncio.TimeoutError: + checks.append(CheckResult( + name=check_method.__name__.replace("check_", ""), + status=CheckStatus.FAILED, + message="Check timed out", + duration_ms=timeout * 1000, + suggestion="Increase timeout or check network latency" + )) + break + + # Determine overall success + success = all( + check.status in (CheckStatus.PASSED, CheckStatus.SKIPPED) + for check in checks + ) + + # Extract metadata from successful checks + platform_version = None + authenticated_user = None + + for check in checks: + if check.name == "health" and check.status == CheckStatus.PASSED: + platform_version = check.details.get("platform_version") + if check.name == "authentication" and check.status == CheckStatus.PASSED: + authenticated_user = check.details.get("user") + + duration_ms = (time.perf_counter() - overall_start) * 1000 + + return ConnectionTestResult( + success=success, + duration_ms=duration_ms, + checks=checks, + platform_version=platform_version, + authenticated_user=authenticated_user, + error=None if success else checks[-1].message + ) + + except Exception as e: + duration_ms = (time.perf_counter() - overall_start) * 1000 + self._logger.exception("Unexpected error during connection test") + + return ConnectionTestResult( + success=False, + duration_ms=duration_ms, + checks=checks, + error=f"Unexpected error: {e}" + ) +``` + +**Acceptance Criteria:** +- [ ] Runs all checks in sequence +- [ ] Implements fail-fast behavior +- [ ] Handles timeouts properly +- [ ] Extracts metadata from results +- [ ] Unit tests cover all code paths + +--- + +#### Task 1.10: Write Unit Tests +**File:** `tests/test_platform_connection_test.py` (NEW) +**Estimated Time:** 4-6 hours + +Create comprehensive unit tests for all functionality: + +```python +import pytest +from unittest.mock import Mock, AsyncMock, patch +from itential_mcp.platform.connection_test import ( + ConnectionTestService, + CheckStatus, + CheckResult, + ConnectionTestResult, +) +from itential_mcp.config.models import Config + +@pytest.fixture +def mock_config(): + """Create mock configuration.""" + config = Mock(spec=Config) + config.platform.host = "platform.example.com" + config.platform.port = 3000 + config.platform.disable_tls = False + config.platform.disable_verify = False + config.auth.type = "oauth" + config.platform.user = "admin" + return config + +@pytest.fixture +def service(mock_config): + """Create connection test service instance.""" + return ConnectionTestService(mock_config) + +# Configuration Tests +@pytest.mark.asyncio +async def test_check_configuration_success(service): + """Test configuration check passes with valid config.""" + result = await service.check_configuration() + + assert result.name == "configuration" + assert result.status == CheckStatus.PASSED + assert "successfully" in result.message.lower() + assert result.details is not None + +@pytest.mark.asyncio +async def 
test_check_configuration_missing_host(mock_config): + """Test configuration check fails with missing host.""" + mock_config.platform.host = None + service = ConnectionTestService(mock_config) + + result = await service.check_configuration() + + assert result.status == CheckStatus.FAILED + assert "host not configured" in result.message.lower() + assert result.suggestion is not None + +# DNS Tests +@pytest.mark.asyncio +async def test_check_dns_resolution_success(service): + """Test DNS resolution succeeds.""" + with patch("socket.gethostbyname", return_value="192.168.1.100"): + result = await service.check_dns_resolution() + + assert result.status == CheckStatus.PASSED + assert "192.168.1.100" in result.message + assert result.details["ip_address"] == "192.168.1.100" + +@pytest.mark.asyncio +async def test_check_dns_resolution_failure(service): + """Test DNS resolution fails gracefully.""" + import socket + with patch("socket.gethostbyname", side_effect=socket.gaierror): + result = await service.check_dns_resolution() + + assert result.status == CheckStatus.FAILED + assert "resolve" in result.message.lower() + assert result.suggestion is not None + +# TCP Tests +@pytest.mark.asyncio +async def test_check_tcp_connection_success(service): + """Test TCP connection succeeds.""" + mock_sock = Mock() + with patch("socket.socket", return_value=mock_sock): + result = await service.check_tcp_connection() + + assert result.status == CheckStatus.PASSED + assert "established" in result.message.lower() + mock_sock.close.assert_called_once() + +@pytest.mark.asyncio +async def test_check_tcp_connection_refused(service): + """Test TCP connection handles connection refused.""" + mock_sock = Mock() + mock_sock.connect.side_effect = ConnectionRefusedError() + + with patch("socket.socket", return_value=mock_sock): + result = await service.check_tcp_connection() + + assert result.status == CheckStatus.FAILED + assert "refused" in result.message.lower() + assert result.suggestion is not None + +# TLS Tests +@pytest.mark.asyncio +async def test_check_tls_handshake_skipped_when_disabled(mock_config): + """Test TLS check skipped when TLS disabled.""" + mock_config.platform.disable_tls = True + service = ConnectionTestService(mock_config) + + result = await service.check_tls_handshake() + + assert result.status == CheckStatus.SKIPPED + assert "disabled" in result.message.lower() + +@pytest.mark.asyncio +async def test_check_tls_handshake_success(service): + """Test TLS handshake succeeds.""" + # Mock SSL context and socket + with patch("ssl.create_default_context"), \ + patch("socket.create_connection"): + result = await service.check_tls_handshake() + + assert result.status == CheckStatus.PASSED + assert "successful" in result.message.lower() + +# Authentication Tests +@pytest.mark.asyncio +async def test_check_authentication_success(service): + """Test authentication check succeeds.""" + mock_client = AsyncMock() + + with patch("itential_mcp.platform.client.PlatformClient", return_value=mock_client): + result = await service.check_authentication() + + assert result.status == CheckStatus.PASSED + assert "successful" in result.message.lower() + +# Health Tests +@pytest.mark.asyncio +async def test_check_platform_health_success(service): + """Test platform health check succeeds.""" + mock_client = AsyncMock() + mock_client.health.get_status_health.return_value = { + "status": "healthy", + "platform": {"version": "2024.1.0"} + } + + with patch("itential_mcp.platform.client.PlatformClient", return_value=mock_client): + 
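+        # PlatformClient is patched where the service imports it (inside the check method)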
result = await service.check_platform_health() + + assert result.status == CheckStatus.PASSED + assert result.details["platform_version"] == "2024.1.0" + +# API Access Tests +@pytest.mark.asyncio +async def test_check_api_access_success(service): + """Test API access check succeeds.""" + mock_client = AsyncMock() + mock_client.adapters.get_adapters.return_value = { + "results": [{"id": "adapter1"}] + } + + with patch("itential_mcp.platform.client.PlatformClient", return_value=mock_client): + result = await service.check_api_access() + + assert result.status == CheckStatus.PASSED + assert "verified" in result.message.lower() + +# Overall Test Runner +@pytest.mark.asyncio +async def test_run_all_checks_success(service): + """Test running all checks successfully.""" + # Mock all checks to succeed + with patch.object(service, "check_configuration", return_value=CheckResult( + name="config", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )), \ + patch.object(service, "check_dns_resolution", return_value=CheckResult( + name="dns", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )), \ + patch.object(service, "check_tcp_connection", return_value=CheckResult( + name="tcp", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )), \ + patch.object(service, "check_tls_handshake", return_value=CheckResult( + name="tls", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )), \ + patch.object(service, "check_authentication", return_value=CheckResult( + name="auth", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )), \ + patch.object(service, "check_platform_health", return_value=CheckResult( + name="health", status=CheckStatus.PASSED, message="OK", duration_ms=10, + details={"platform_version": "2024.1.0"} + )), \ + patch.object(service, "check_api_access", return_value=CheckResult( + name="api", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )): + result = await service.run_all_checks() + + assert result.success is True + assert len(result.checks) == 7 + assert result.platform_version == "2024.1.0" + +@pytest.mark.asyncio +async def test_run_all_checks_fails_fast(service): + """Test that checks stop on first failure.""" + # Mock first check to succeed, second to fail + with patch.object(service, "check_configuration", return_value=CheckResult( + name="config", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )), \ + patch.object(service, "check_dns_resolution", return_value=CheckResult( + name="dns", status=CheckStatus.FAILED, message="Failed", duration_ms=10 + )) as mock_dns: + result = await service.run_all_checks() + + assert result.success is False + assert len(result.checks) == 2 # Only ran first two checks + mock_dns.assert_called_once() + +@pytest.mark.asyncio +async def test_run_all_checks_timeout(service): + """Test timeout handling.""" + async def slow_check(): + await asyncio.sleep(100) + return CheckResult(name="slow", status=CheckStatus.PASSED, message="OK", duration_ms=100000) + + with patch.object(service, "check_configuration", return_value=CheckResult( + name="config", status=CheckStatus.PASSED, message="OK", duration_ms=10 + )), \ + patch.object(service, "check_dns_resolution", side_effect=slow_check): + result = await service.run_all_checks(timeout=1) + + assert result.success is False + assert any("timeout" in check.message.lower() for check in result.checks) +``` + +**Additional Test Files:** +- Test all error paths +- Test timeout behavior +- Test with different config combinations +- Test with different auth types + 
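+
+For the auth-type combinations in the list above, a parametrized test keeps
+the matrix compact. A minimal sketch, assuming the same `mock_config` fixture
+used in the tests above; the test name is hypothetical and it reads the
+internal `_auth_type` attribute that `ConnectionTestService.__init__` derives
+from the configured credentials:
+
+```python
+import pytest
+
+from itential_mcp.platform.connection_test import ConnectionTestService
+
+
+@pytest.mark.parametrize(
+    "client_id,client_secret,user,password,expected",
+    [
+        ("cid", "secret", None, None, "oauth"),    # OAuth credentials take priority
+        (None, None, "admin", "secret", "basic"),  # fall back to basic auth
+        (None, None, None, None, "none"),          # no credentials configured
+    ],
+)
+def test_platform_auth_type_detection(
+    mock_config, client_id, client_secret, user, password, expected
+):
+    """Auth type is derived from whichever credentials are configured."""
+    mock_config.platform.client_id = client_id
+    mock_config.platform.client_secret = client_secret
+    mock_config.platform.user = user
+    mock_config.platform.password = password
+
+    service = ConnectionTestService(mock_config)
+
+    assert service._auth_type == expected
+```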
+**Acceptance Criteria:** +- [ ] Test coverage > 95% +- [ ] All success paths tested +- [ ] All failure paths tested +- [ ] Edge cases covered +- [ ] Tests pass with `make test` + +--- + +### Phase 1 Deliverables Checklist + +- [ ] `src/itential_mcp/platform/connection_test.py` created with all checks +- [ ] All 7 checks implemented and working +- [ ] Unit tests written with >95% coverage +- [ ] All tests passing (`make test`) +- [ ] Code passes linting (`make check`) +- [ ] Code formatted (`make format`) +- [ ] Documentation comments complete +- [ ] Type hints on all functions +- [ ] No regression in existing tests + +--- + +## Phase 2: CLI Integration + +**Objective:** Add `test-connection` CLI command with human-readable output. + +**Success Criteria:** +- ✓ CLI command works with basic output +- ✓ Exit codes correct (0=success, 1=failure) +- ✓ Human-readable output is clear and helpful +- ✓ Integration tests pass +- ✓ Documentation updated + +### Tasks + +#### Task 2.1: Add CLI Command Parser +**File:** `src/itential_mcp/runtime/parser.py` +**Estimated Time:** 1 hour + +Add test-connection subcommand to argument parser: + +```python +def _add_test_connection_parser(subparsers): + """Add test-connection command parser. + + Args: + subparsers: Subparser collection. + """ + parser = subparsers.add_parser( + "test-connection", + help="Test connection to Itential Platform", + description=( + "Test connection to Itential Platform by performing a series of " + "connectivity checks. Useful for validating configuration and " + "troubleshooting connection issues." + ), + ) + + parser.add_argument( + "--config", + type=str, + help="Path to configuration file", + ) + + parser.add_argument( + "--timeout", + type=int, + default=30, + help="Maximum time for test in seconds (default: 30)", + ) + + parser.add_argument( + "--verbose", + action="store_true", + help="Show detailed diagnostic information", + ) + + parser.add_argument( + "--format", + type=str, + choices=["human", "json"], + default="human", + help="Output format (default: human)", + ) + + parser.add_argument( + "--quiet", + action="store_true", + help="Suppress progress messages (JSON output only)", + ) + + return parser +``` + +Update main parser creation: + +```python +def parse_args(args: list[str] | None = None) -> argparse.Namespace: + """Parse command-line arguments. + + Args: + args: Arguments to parse (defaults to sys.argv). + + Returns: + Parsed arguments namespace. + """ + parser = argparse.ArgumentParser(...) + subparsers = parser.add_subparsers(dest="command", required=True) + + # Existing parsers + _add_run_parser(subparsers) + _add_version_parser(subparsers) + # ... other parsers ... + + # Add new parser + _add_test_connection_parser(subparsers) + + return parser.parse_args(args) +``` + +**Acceptance Criteria:** +- [ ] Parser added with all arguments +- [ ] Help text is clear and useful +- [ ] Parser integrates with existing CLI +- [ ] Unit tests for parser + +--- + +#### Task 2.2: Create CLI Command Handler +**File:** `src/itential_mcp/runtime/commands.py` +**Estimated Time:** 2-3 hours + +```python +async def test_connection( + *, + config_file: str | None = None, + format: str = "human", + verbose: bool = False, + timeout: int = 30, + quiet: bool = False, +) -> int: + """Test connection to Itential Platform. + + Args: + config_file: Path to configuration file. + format: Output format (human or json). + verbose: Show detailed diagnostic information. + timeout: Maximum time for test in seconds. 
+        quiet: Suppress progress messages.
+
+    Returns:
+        int: Exit code (0=success, 1=failure).
+    """
+    import os
+
+    from itential_mcp import config
+    from itential_mcp.platform.connection_test import ConnectionTestService
+    from itential_mcp.cli.terminal import print_error, print_info
+
+    # Load configuration
+    try:
+        if config_file:
+            os.environ["ITENTIAL_MCP_CONFIG"] = config_file
+
+        cfg = config.get()
+    except Exception as e:
+        print_error(f"Failed to load configuration: {e}")
+        return 1
+
+    # Create service
+    service = ConnectionTestService(cfg)
+
+    # Run checks
+    if not quiet and format == "human":
+        print_info("Testing connection to Itential Platform...")
+        print()
+
+    try:
+        result = await service.run_all_checks(timeout=timeout)
+    except Exception as e:
+        print_error(f"Connection test failed with unexpected error: {e}")
+        return 1
+
+    # Output results
+    if format == "json":
+        _output_json(result)
+    else:
+        _output_human(result, verbose=verbose)
+
+    return 0 if result.success else 1
+
+
+def _output_human(result: "ConnectionTestResult", verbose: bool = False) -> None:
+    """Output results in human-readable format.
+
+    Args:
+        result: Test results.
+        verbose: Show detailed information.
+    """
+    from itential_mcp.platform.connection_test import CheckStatus
+    from itential_mcp.cli.terminal import (
+        print_success,
+        print_error,
+        Colors,
+    )
+
+    # Print check results
+    for check in result.checks:
+        if check.status == CheckStatus.PASSED:
+            symbol = "✓"
+            color = Colors.GREEN
+        elif check.status == CheckStatus.FAILED:
+            symbol = "✗"
+            color = Colors.RED
+        elif check.status == CheckStatus.SKIPPED:
+            symbol = "○"
+            color = Colors.YELLOW
+        else:  # WARNING
+            symbol = "⚠"
+            color = Colors.YELLOW
+
+        print(f"{color}{symbol}{Colors.RESET} {check.message}")
+
+        # Show details in verbose mode
+        if verbose and check.details:
+            for key, value in check.details.items():
+                print(f"    {key}: {value}")
+
+        # Show suggestions for failures
+        if check.status == CheckStatus.FAILED and check.suggestion:
+            print()
+            print(f"{Colors.YELLOW}Suggestion:{Colors.RESET}")
+            for line in check.suggestion.split("\n"):
+                print(f"  {line}")
+            print()
+
+    print()
+
+    # Print summary
+    if result.success:
+        print_success("Connection test: SUCCESS")
+        if result.platform_version:
+            print(f"Platform version: {result.platform_version}")
+        if result.authenticated_user:
+            print(f"Authenticated as: {result.authenticated_user}")
+        print(f"Total duration: {result.duration_ms / 1000:.2f}s")
+    else:
+        print_error("Connection test: FAILED")
+        if result.error:
+            print(f"Error: {result.error}")
+
+
+def _output_json(result: "ConnectionTestResult") -> None:
+    """Output results in JSON format.
+
+    Args:
+        result: Test results.
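+
+    Note:
+        Output is written to stdout as indented JSON so it can be piped
+        to tools such as jq in automation.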
+ """ + import json + from dataclasses import asdict + + # Convert to dict + data = { + "success": result.success, + "duration_ms": result.duration_ms, + "checks": [ + { + "name": check.name, + "status": check.status, + "message": check.message, + "duration_ms": check.duration_ms, + "details": check.details, + "suggestion": check.suggestion, + } + for check in result.checks + ], + } + + if result.platform_version: + data["platform_version"] = result.platform_version + if result.authenticated_user: + data["authenticated_user"] = result.authenticated_user + if result.error: + data["error"] = result.error + + print(json.dumps(data, indent=2)) +``` + +**Acceptance Criteria:** +- [ ] Command executes successfully +- [ ] Human output is clear and colorful +- [ ] JSON output is valid +- [ ] Exit codes are correct +- [ ] Error handling works properly + +--- + +#### Task 2.3: Register Command Handler +**File:** `src/itential_mcp/runtime/handlers.py` +**Estimated Time:** 30 minutes + +```python +def get_command_handler(command: str): + """Get handler function for command. + + Args: + command: Command name. + + Returns: + Handler function. + + Raises: + ValueError: If command is not recognized. + """ + from .commands import ( + run, + version, + list_tools, + list_tags, + call_tool, + test_connection, # NEW + ) + + handlers = { + "run": run, + "version": version, + "tools": list_tools, + "tags": list_tags, + "call": call_tool, + "test-connection": test_connection, # NEW + } + + handler = handlers.get(command) + if not handler: + raise ValueError(f"Unknown command: {command}") + + return handler +``` + +**Acceptance Criteria:** +- [ ] Handler registered +- [ ] Command dispatching works +- [ ] Unit tests updated + +--- + +#### Task 2.4: Add Terminal Utilities (if needed) +**File:** `src/itential_mcp/cli/terminal.py` +**Estimated Time:** 1 hour + +Add any missing terminal utility functions: + +```python +class Colors: + """ANSI color codes.""" + RED = "\033[91m" + GREEN = "\033[92m" + YELLOW = "\033[93m" + BLUE = "\033[94m" + RESET = "\033[0m" + BOLD = "\033[1m" + + +def print_success(message: str) -> None: + """Print success message in green.""" + print(f"{Colors.GREEN}{message}{Colors.RESET}") + + +def print_error(message: str) -> None: + """Print error message in red.""" + print(f"{Colors.RED}{message}{Colors.RESET}") + + +def print_warning(message: str) -> None: + """Print warning message in yellow.""" + print(f"{Colors.YELLOW}{message}{Colors.RESET}") + + +def print_info(message: str) -> None: + """Print info message in blue.""" + print(f"{Colors.BLUE}{message}{Colors.RESET}") +``` + +**Acceptance Criteria:** +- [ ] Terminal utilities available +- [ ] Colors work in terminal +- [ ] Functions are reusable + +--- + +#### Task 2.5: Write Integration Tests +**File:** `tests/test_runtime_test_connection.py` (NEW) +**Estimated Time:** 2-3 hours + +```python +import pytest +from unittest.mock import Mock, AsyncMock, patch +from itential_mcp.runtime.commands import test_connection +from itential_mcp.platform.connection_test import ( + ConnectionTestResult, + CheckResult, + CheckStatus, +) + +@pytest.mark.asyncio +async def test_test_connection_command_success(): + """Test test-connection command with successful connection.""" + mock_result = ConnectionTestResult( + success=True, + duration_ms=1234, + checks=[ + CheckResult( + name="config", + status=CheckStatus.PASSED, + message="OK", + duration_ms=10 + ) + ], + platform_version="2024.1.0", + authenticated_user="admin", + ) + + with 
patch("itential_mcp.platform.connection_test.ConnectionTestService") as mock_service_class: + mock_service = AsyncMock() + mock_service.run_all_checks.return_value = mock_result + mock_service_class.return_value = mock_service + + exit_code = await test_connection() + + assert exit_code == 0 + mock_service.run_all_checks.assert_called_once() + +@pytest.mark.asyncio +async def test_test_connection_command_failure(): + """Test test-connection command with connection failure.""" + mock_result = ConnectionTestResult( + success=False, + duration_ms=1234, + checks=[ + CheckResult( + name="config", + status=CheckStatus.FAILED, + message="Failed", + duration_ms=10, + suggestion="Fix config" + ) + ], + error="Connection failed", + ) + + with patch("itential_mcp.platform.connection_test.ConnectionTestService") as mock_service_class: + mock_service = AsyncMock() + mock_service.run_all_checks.return_value = mock_result + mock_service_class.return_value = mock_service + + exit_code = await test_connection() + + assert exit_code == 1 + +@pytest.mark.asyncio +async def test_test_connection_json_output(capsys): + """Test JSON output format.""" + mock_result = ConnectionTestResult( + success=True, + duration_ms=1234, + checks=[], + ) + + with patch("itential_mcp.platform.connection_test.ConnectionTestService") as mock_service_class: + mock_service = AsyncMock() + mock_service.run_all_checks.return_value = mock_result + mock_service_class.return_value = mock_service + + exit_code = await test_connection(format="json") + + captured = capsys.readouterr() + import json + output = json.loads(captured.out) + + assert output["success"] is True + assert output["duration_ms"] == 1234 + assert exit_code == 0 +``` + +**Acceptance Criteria:** +- [ ] Integration tests pass +- [ ] CLI command works end-to-end +- [ ] Both output formats tested +- [ ] Exit codes validated + +--- + +### Phase 2 Deliverables Checklist + +- [ ] CLI command implemented and working +- [ ] Human-readable output looks good +- [ ] Exit codes correct +- [ ] Integration tests passing +- [ ] Can run `itential-mcp test-connection` successfully +- [ ] Help text is clear (`itential-mcp test-connection --help`) + +--- + +## Phase 3: Enhanced Output + +**Objective:** Add JSON output format and enhanced diagnostic information. + +**Success Criteria:** +- ✓ JSON output is valid and complete +- ✓ Verbose mode shows detailed info +- ✓ Error messages include actionable suggestions +- ✓ Output is machine-parseable for CI/CD + +### Tasks + +#### Task 3.1: Enhance JSON Output +**File:** `src/itential_mcp/runtime/commands.py` +**Estimated Time:** 1 hour + +The JSON output was already implemented in Phase 2, but now we'll enhance it: + +```python +def _output_json(result: ConnectionTestResult) -> None: + """Output results in JSON format. + + Args: + result: Test results. 
+ """ + import json + + # Build comprehensive JSON output + data = { + "success": result.success, + "duration_ms": round(result.duration_ms, 2), + "timestamp": datetime.utcnow().isoformat() + "Z", + "checks": [], + } + + # Add check results + for check in result.checks: + check_data = { + "name": check.name, + "status": check.status, + "message": check.message, + "duration_ms": round(check.duration_ms, 2), + } + + if check.details: + check_data["details"] = check.details + + if check.suggestion: + check_data["suggestion"] = check.suggestion + + if check.error: + check_data["error"] = { + "type": type(check.error).__name__, + "message": str(check.error), + } + + data["checks"].append(check_data) + + # Add metadata + if result.platform_version: + data["platform_version"] = result.platform_version + if result.authenticated_user: + data["authenticated_user"] = result.authenticated_user + if result.error: + data["error"] = result.error + + # Add summary statistics + data["summary"] = { + "total_checks": len(result.checks), + "passed": sum(1 for c in result.checks if c.status == CheckStatus.PASSED), + "failed": sum(1 for c in result.checks if c.status == CheckStatus.FAILED), + "skipped": sum(1 for c in result.checks if c.status == CheckStatus.SKIPPED), + "warnings": sum(1 for c in result.checks if c.status == CheckStatus.WARNING), + } + + print(json.dumps(data, indent=2)) +``` + +**Acceptance Criteria:** +- [ ] JSON is valid and well-formatted +- [ ] All data included +- [ ] Timestamp added +- [ ] Summary statistics included +- [ ] Schema documented + +--- + +#### Task 3.2: Add JSON Schema Documentation +**File:** `docs/connection-testing.md` (NEW) +**Estimated Time:** 1 hour + +Create user documentation with JSON schema: + +```markdown +# Connection Testing + +## Overview + +The `test-connection` command validates connectivity to Itential Platform... + +## Usage + +### Basic Test +\`\`\`bash +itential-mcp test-connection +\`\`\` + +### JSON Output +\`\`\`bash +itential-mcp test-connection --format json +\`\`\` + +### Verbose Output +\`\`\`bash +itential-mcp test-connection --verbose +\`\`\` + +## JSON Output Schema + +[Include schema from design spec] + +## Examples + +[Include examples from design spec] + +## Troubleshooting + +[Include common error scenarios and solutions] +``` + +**Acceptance Criteria:** +- [ ] Documentation complete +- [ ] Examples provided +- [ ] Schema documented +- [ ] Troubleshooting guide included + +--- + +#### Task 3.3: Enhance Human Output +**File:** `src/itential_mcp/runtime/commands.py` +**Estimated Time:** 2 hours + +Improve human-readable output with better formatting: + +```python +def _output_human(result: ConnectionTestResult, verbose: bool = False) -> None: + """Output results in human-readable format. + + Args: + result: Test results. + verbose: Show detailed information. 
+ """ + from itential_mcp.cli.terminal import ( + print_success, + print_error, + print_info, + print_warning, + Colors, + ) + + # Print header + print() + + # Print check results with improved formatting + for i, check in enumerate(result.checks, 1): + # Status symbol and color + if check.status == CheckStatus.PASSED: + symbol = "✓" + color = Colors.GREEN + elif check.status == CheckStatus.FAILED: + symbol = "✗" + color = Colors.RED + elif check.status == CheckStatus.SKIPPED: + symbol = "○" + color = Colors.YELLOW + else: # WARNING + symbol = "⚠" + color = Colors.YELLOW + + # Print check result + print(f"{color}{symbol}{Colors.RESET} {check.message}") + + # Show timing in verbose mode + if verbose: + print(f" {Colors.BLUE}Duration:{Colors.RESET} {check.duration_ms:.0f}ms") + + # Show details in verbose mode + if verbose and check.details: + print(f" {Colors.BLUE}Details:{Colors.RESET}") + for key, value in check.details.items(): + print(f" • {key}: {value}") + + # Show error details in verbose mode + if verbose and check.error: + print(f" {Colors.RED}Error:{Colors.RESET} {type(check.error).__name__}: {check.error}") + + # Show suggestions for failures + if check.status == CheckStatus.FAILED and check.suggestion: + print() + print(f" {Colors.YELLOW}💡 Suggestion:{Colors.RESET}") + for line in check.suggestion.split("\n"): + if line.strip(): + print(f" {line}") + print() + + print() + print("─" * 60) + print() + + # Print summary + if result.success: + print(f"{Colors.GREEN}{Colors.BOLD}✓ Connection test: SUCCESS{Colors.RESET}") + print() + if result.platform_version: + print(f" Platform version: {Colors.BOLD}{result.platform_version}{Colors.RESET}") + if result.authenticated_user: + print(f" Authenticated as: {Colors.BOLD}{result.authenticated_user}{Colors.RESET}") + print(f" Total duration: {Colors.BOLD}{result.duration_ms / 1000:.2f}s{Colors.RESET}") + else: + print(f"{Colors.RED}{Colors.BOLD}✗ Connection test: FAILED{Colors.RESET}") + print() + if result.error: + print(f" {Colors.RED}Error: {result.error}{Colors.RESET}") + + print() +``` + +**Acceptance Criteria:** +- [ ] Output is clear and well-formatted +- [ ] Colors enhance readability +- [ ] Verbose mode shows helpful details +- [ ] Error suggestions are prominent + +--- + +#### Task 3.4: Add Progress Indicators +**File:** `src/itential_mcp/runtime/commands.py` +**Estimated Time:** 1-2 hours + +Add real-time progress indicators (optional enhancement): + +```python +async def test_connection( + *, + config_file: str | None = None, + format: str = "human", + verbose: bool = False, + timeout: int = 30, + quiet: bool = False, +) -> int: + """Test connection to Itential Platform.""" + # ... existing code ... + + # Run checks with progress updates + if not quiet and format == "human": + print_info("Testing connection to Itential Platform...") + print() + + # Show progress as checks run + result = await _run_checks_with_progress(service, timeout) + else: + result = await service.run_all_checks(timeout=timeout) + + # ... rest of code ... + + +async def _run_checks_with_progress( + service: ConnectionTestService, + timeout: int +) -> ConnectionTestResult: + """Run checks with live progress updates. + + Args: + service: Connection test service. + timeout: Timeout in seconds. + + Returns: + Test results. 
+ """ + import sys + from itential_mcp.cli.terminal import Colors + + check_names = [ + "Configuration", + "DNS Resolution", + "TCP Connection", + "TLS Handshake", + "Authentication", + "Platform Health", + "API Access", + ] + + # Start checks + start = time.perf_counter() + checks = [] + + for i, check_method in enumerate([ + service.check_configuration, + service.check_dns_resolution, + service.check_tcp_connection, + service.check_tls_handshake, + service.check_authentication, + service.check_platform_health, + service.check_api_access, + ]): + # Show progress + sys.stdout.write(f" [{i+1}/7] {check_names[i]}... ") + sys.stdout.flush() + + # Run check + check_result = await check_method() + checks.append(check_result) + + # Show result + if check_result.status == CheckStatus.PASSED: + sys.stdout.write(f"{Colors.GREEN}✓{Colors.RESET}\n") + elif check_result.status == CheckStatus.FAILED: + sys.stdout.write(f"{Colors.RED}✗{Colors.RESET}\n") + break # Fail fast + elif check_result.status == CheckStatus.SKIPPED: + sys.stdout.write(f"{Colors.YELLOW}○{Colors.RESET}\n") + else: + sys.stdout.write(f"{Colors.YELLOW}⚠{Colors.RESET}\n") + + sys.stdout.flush() + + duration_ms = (time.perf_counter() - start) * 1000 + + # Build result + success = all( + c.status in (CheckStatus.PASSED, CheckStatus.SKIPPED) + for c in checks + ) + + # Extract metadata + platform_version = None + authenticated_user = None + for check in checks: + if check.name == "health" and check.details: + platform_version = check.details.get("platform_version") + if check.name == "authentication" and check.details: + authenticated_user = check.details.get("user") + + return ConnectionTestResult( + success=success, + duration_ms=duration_ms, + checks=checks, + platform_version=platform_version, + authenticated_user=authenticated_user, + error=None if success else checks[-1].message + ) +``` + +**Acceptance Criteria:** +- [ ] Progress indicators work +- [ ] Real-time feedback provided +- [ ] Fail-fast behavior visible +- [ ] No progress in quiet mode + +--- + +### Phase 3 Deliverables Checklist + +- [ ] JSON output enhanced and complete +- [ ] JSON schema documented +- [ ] Human output improved with better formatting +- [ ] Verbose mode shows detailed info +- [ ] Progress indicators work (optional) +- [ ] Documentation updated +- [ ] Examples provided + +--- + +## Phase 4: Startup Integration + +**Objective:** Add optional connection testing during server startup. + +**Success Criteria:** +- ✓ Startup testing works when enabled +- ✓ Server fails fast if connection test fails +- ✓ Configuration options documented +- ✓ Backward compatible (off by default) + +### Tasks + +#### Task 4.1: Add Configuration Options +**File:** `src/itential_mcp/config/models.py` +**Estimated Time:** 30 minutes + +```python +@dataclass +class ServerConfig: + """Server configuration.""" + + # ... existing fields ... 
+ + test_connection_on_startup: bool = False + """Test platform connection during server startup.""" + + startup_test_timeout: int = 30 + """Timeout for startup connection test in seconds.""" +``` + +**File:** `src/itential_mcp/defaults.py` + +```python +# Server defaults +ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP = False +ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT = 30 +``` + +**Acceptance Criteria:** +- [ ] Config fields added +- [ ] Defaults defined +- [ ] Type hints correct +- [ ] Validation works + +--- + +#### Task 4.2: Integrate with Server Startup +**File:** `src/itential_mcp/server/server.py` +**Estimated Time:** 2-3 hours + +```python +class Server: + """MCP server implementation.""" + + # ... existing code ... + + async def _test_connection_on_startup(self) -> None: + """Test platform connection during startup. + + Raises: + ConnectionException: If connection test fails. + """ + from itential_mcp.platform.connection_test import ConnectionTestService + from itential_mcp.core.exceptions import ConnectionException + + logger = logging.getLogger(__name__) + logger.info("Testing platform connection...") + + try: + service = ConnectionTestService(self.config) + result = await service.run_all_checks( + timeout=self.config.server.startup_test_timeout + ) + + if result.success: + logger.info("Connection test successful") + if result.platform_version: + logger.info(f"Platform version: {result.platform_version}") + if result.authenticated_user: + logger.info(f"Authenticated as: {result.authenticated_user}") + else: + logger.error("Connection test failed") + for check in result.checks: + if check.status == CheckStatus.FAILED: + logger.error(f" {check.name}: {check.message}") + if check.suggestion: + logger.info(f" Suggestion: {check.suggestion}") + + raise ConnectionException( + f"Connection test failed: {result.error}", + details={ + "checks": [ + { + "name": c.name, + "status": c.status, + "message": c.message, + } + for c in result.checks + ] + } + ) + + except Exception as e: + logger.exception("Connection test failed with unexpected error") + raise ConnectionException(f"Connection test failed: {e}") from e + + async def run(self): + """Run the MCP server. + + Raises: + ConnectionException: If startup connection test fails. + """ + # Test connection if enabled + if self.config.server.test_connection_on_startup: + await self._test_connection_on_startup() + + # Continue with normal startup + logger.info( + f"Starting MCP server (transport={self.config.server.transport}, " + f"port={self.config.server.port})" + ) + + # ... existing startup code ... +``` + +**Acceptance Criteria:** +- [ ] Startup test runs when enabled +- [ ] Server fails fast on test failure +- [ ] Logging is clear and helpful +- [ ] Error messages are actionable + +--- + +#### Task 4.3: Update Documentation +**File:** `docs/connection-testing.md` +**Estimated Time:** 1 hour + +Add section on startup testing: + +```markdown +## Startup Testing + +You can configure the server to test the platform connection during startup. +This is useful for deployment validation and fail-fast behavior. 
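+
+Startup testing is disabled by default, so enabling it is an explicit,
+backward-compatible opt-in.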
+ +### Configuration + +#### Environment Variables + +\`\`\`bash +ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP=true +ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT=30 +\`\`\` + +#### Configuration File + +\`\`\`toml +[server] +test_connection_on_startup = true +startup_test_timeout = 30 +\`\`\` + +### Docker/Kubernetes + +When using containers, enable startup testing to fail fast if misconfigured: + +\`\`\`yaml +apiVersion: v1 +kind: Pod +metadata: + name: itential-mcp +spec: + containers: + - name: itential-mcp + image: itential-mcp:latest + env: + - name: ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP + value: "true" + - name: ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT + value: "30" + livenessProbe: + httpGet: + path: /status/livez + port: 8000 + initialDelaySeconds: 35 # Account for connection test + startup + periodSeconds: 10 +\`\`\` + +### Behavior + +When startup testing is enabled: + +1. Server loads configuration +2. Connection test runs automatically +3. If test succeeds, server starts normally +4. If test fails, server exits with error + +This ensures the server only starts if it can connect to the platform. +``` + +**File:** `docs/README.md` + +Add reference to connection testing: + +```markdown +## Connection Testing + +Test your platform connection before starting the server: + +\`\`\`bash +itential-mcp test-connection +\`\`\` + +See [Connection Testing](connection-testing.md) for details. +``` + +**Acceptance Criteria:** +- [ ] Startup testing documented +- [ ] Configuration examples provided +- [ ] Kubernetes example included +- [ ] Behavior explained clearly + +--- + +#### Task 4.4: Write Integration Tests +**File:** `tests/test_server_startup_test.py` (NEW) +**Estimated Time:** 2 hours + +```python +import pytest +from unittest.mock import Mock, AsyncMock, patch +from itential_mcp.server.server import Server +from itential_mcp.platform.connection_test import ( + ConnectionTestResult, + CheckStatus, +) +from itential_mcp.core.exceptions import ConnectionException + +@pytest.fixture +def mock_config(): + """Create mock configuration.""" + config = Mock() + config.server.test_connection_on_startup = True + config.server.startup_test_timeout = 30 + config.server.transport = "stdio" + return config + +@pytest.mark.asyncio +async def test_startup_test_success(mock_config): + """Test successful startup connection test.""" + mock_result = ConnectionTestResult( + success=True, + duration_ms=1234, + checks=[], + platform_version="2024.1.0", + ) + + with patch("itential_mcp.platform.connection_test.ConnectionTestService") as mock_service_class, \ + patch.object(Server, "_Server__init_server__"), \ + patch.object(Server, "_Server__init_tools__"), \ + patch.object(Server, "_Server__init_bindings__"), \ + patch.object(Server, "_Server__init_routes__"): + + mock_service = AsyncMock() + mock_service.run_all_checks.return_value = mock_result + mock_service_class.return_value = mock_service + + server = Server(mock_config) + + # Should not raise + await server._test_connection_on_startup() + + mock_service.run_all_checks.assert_called_once_with(timeout=30) + +@pytest.mark.asyncio +async def test_startup_test_failure(mock_config): + """Test failed startup connection test raises exception.""" + mock_result = ConnectionTestResult( + success=False, + duration_ms=1234, + checks=[], + error="Connection failed", + ) + + with patch("itential_mcp.platform.connection_test.ConnectionTestService") as mock_service_class, \ + patch.object(Server, "_Server__init_server__"), \ + patch.object(Server, 
"_Server__init_tools__"), \ + patch.object(Server, "_Server__init_bindings__"), \ + patch.object(Server, "_Server__init_routes__"): + + mock_service = AsyncMock() + mock_service.run_all_checks.return_value = mock_result + mock_service_class.return_value = mock_service + + server = Server(mock_config) + + # Should raise ConnectionException + with pytest.raises(ConnectionException) as exc_info: + await server._test_connection_on_startup() + + assert "Connection test failed" in str(exc_info.value) + +@pytest.mark.asyncio +async def test_startup_test_disabled(mock_config): + """Test startup test skipped when disabled.""" + mock_config.server.test_connection_on_startup = False + + with patch("itential_mcp.platform.connection_test.ConnectionTestService") as mock_service_class, \ + patch.object(Server, "_Server__init_server__"), \ + patch.object(Server, "_Server__init_tools__"), \ + patch.object(Server, "_Server__init_bindings__"), \ + patch.object(Server, "_Server__init_routes__"), \ + patch.object(Server, "run", new_callable=AsyncMock): + + server = Server(mock_config) + + # Service should not be created + mock_service_class.assert_not_called() +``` + +**Acceptance Criteria:** +- [ ] Integration tests pass +- [ ] Success case tested +- [ ] Failure case tested +- [ ] Disabled case tested + +--- + +### Phase 4 Deliverables Checklist + +- [ ] Configuration options added +- [ ] Server startup integration complete +- [ ] Startup test works when enabled +- [ ] Server fails fast on connection failure +- [ ] Documentation updated +- [ ] Integration tests passing +- [ ] Backward compatible (off by default) + +--- + +## Post-Implementation Tasks + +### Task: Update CHANGELOG +**File:** `CHANGELOG.md` +**Estimated Time:** 30 minutes + +Add entry for new feature: + +```markdown +## [Unreleased] + +### Added +- Connection test feature (#299) + - New `test-connection` CLI command to validate platform connectivity + - Performs 7 comprehensive connectivity checks (config, DNS, TCP, TLS, auth, health, API) + - Supports human-readable and JSON output formats + - Optional startup connection testing with `test_connection_on_startup` config option + - Detailed error messages with actionable suggestions + - Verbose mode for diagnostic information +``` + +--- + +### Task: Update User Documentation +**Files:** Various in `docs/` +**Estimated Time:** 1 hour + +1. **README.md** - Add quick reference +2. **integration.md** - Add troubleshooting section +3. **connection-testing.md** - Complete user guide (created in Phase 3) + +--- + +### Task: Run Full Test Suite +**Estimated Time:** 30 minutes + +```bash +# Run all tests +make test + +# Check coverage +make coverage + +# Ensure >95% coverage for new code + +# Run linting +make check + +# Format code +make format + +# Run premerge pipeline +make premerge +``` + +--- + +### Task: Manual Testing +**Estimated Time:** 2-3 hours + +Test scenarios: + +1. **Valid Connection** + - Configure valid platform connection + - Run `itential-mcp test-connection` + - Verify all checks pass + - Verify output is clear + +2. **Invalid Hostname** + - Set `ITENTIAL_MCP_PLATFORM_HOST=invalid.example.com` + - Run test-connection + - Verify DNS check fails with helpful message + +3. **Wrong Port** + - Set invalid port number + - Verify TCP connection fails with suggestion + +4. **TLS Certificate Mismatch** + - Configure hostname that doesn't match certificate + - Verify TLS check fails with clear suggestion + +5. 
**Invalid Credentials** + - Set wrong username/password + - Verify auth check fails with helpful message + +6. **JSON Output** + - Run with `--format json` + - Verify JSON is valid + - Verify all fields present + +7. **Verbose Output** + - Run with `--verbose` + - Verify detailed information shown + +8. **Startup Testing** + - Enable `test_connection_on_startup` + - Start server with valid config + - Verify server starts normally + - Start server with invalid config + - Verify server exits with error + +9. **Timeout** + - Simulate slow network + - Run with `--timeout 5` + - Verify timeout handling works + +10. **Quiet Mode** + - Run with `--quiet --format json` + - Verify no progress messages + +--- + +### Task: Create Pull Request +**Estimated Time:** 1 hour + +1. Create feature branch: `git checkout -b feature/connection-test-299` +2. Commit changes in logical groups: + - "feat: add connection test service (#299)" + - "feat: add test-connection CLI command (#299)" + - "feat: add JSON output and enhanced diagnostics (#299)" + - "feat: add startup connection testing option (#299)" + - "docs: add connection testing documentation (#299)" +3. Write comprehensive PR description: + - Link to issue #299 + - Summarize changes + - List new files + - Note configuration changes + - Include testing performed + - Add examples +4. Request review + +--- + +## Risk Management + +### Risk 1: Platform Client Limitations +**Risk:** Platform client may not expose all needed functionality for tests +**Mitigation:** +- Phase 1 will reveal any limitations early +- Can add methods to platform client as needed +- Alternative: Use lower-level HTTP client for specific checks + +### Risk 2: Test Timeouts +**Risk:** Connection tests may be slow on poor networks +**Mitigation:** +- Configurable timeout with sensible default (30s) +- Individual check timeouts +- Clear timeout messages with suggestions + +### Risk 3: Authentication Complexity +**Risk:** Supporting all auth types (Basic, OAuth, JWT) in tests +**Mitigation:** +- Leverage existing platform client auth logic +- Test each auth type separately +- Document any limitations + +### Risk 4: TLS Certificate Testing +**Risk:** TLS testing may be complex with various certificate configurations +**Mitigation:** +- Use Python's SSL library (well-tested) +- Skip TLS check when disabled +- Provide clear suggestions for cert issues +- Test with various cert configurations + +### Risk 5: Breaking Changes +**Risk:** Changes to config/server might break existing functionality +**Mitigation:** +- Feature is opt-in (off by default) +- Comprehensive regression testing +- Backward compatibility guaranteed + +--- + +## Success Metrics + +### Code Quality +- [x] Test coverage > 95% for new code +- [x] All unit tests passing +- [x] All integration tests passing +- [x] Code passes linting (make check) +- [x] Code formatted (make format) +- [x] No regressions in existing tests + +### Functionality +- [x] All 7 checks implemented and working +- [x] CLI command works correctly +- [x] Exit codes correct (0=success, 1=failure) +- [x] Human output is clear and helpful +- [x] JSON output is valid and complete +- [x] Startup testing works when enabled +- [x] Error messages are actionable + +### Documentation +- [x] User documentation complete +- [x] JSON schema documented +- [x] Examples provided +- [x] Configuration options documented +- [x] CHANGELOG updated + +### Performance +- [x] Successful test completes in < 2s +- [x] Failed test completes in < 5s +- [x] No performance regression in 
server startup + +--- + +## Timeline Summary + +| Phase | Duration | Start | End | +|-------|----------|-------|-----| +| Phase 1: Core Foundation | 2-3 days | Day 1 | Day 3 | +| Phase 2: CLI Integration | 1-2 days | Day 4 | Day 5 | +| Phase 3: Enhanced Output | 1-2 days | Day 6 | Day 7 | +| Phase 4: Startup Integration | 1 day | Day 8 | Day 8 | +| **Total** | **5-8 days** | **Day 1** | **Day 8** | + +*Timeline assumes single developer, full-time work* + +--- + +## Next Steps + +1. **Review this implementation plan** - Ensure all stakeholders agree +2. **Create feature branch** - `feature/connection-test-299` +3. **Start Phase 1** - Begin with Task 1.1 +4. **Regular check-ins** - Review progress after each phase +5. **Integration testing** - Test end-to-end after Phase 2 +6. **Documentation review** - Review docs after Phase 3 +7. **Final testing** - Complete manual test scenarios +8. **Create PR** - Submit for review when complete + +--- + +## Questions for Stakeholders + +1. **Should startup test failure be fatal?** + - Current plan: Yes (server exits) + - Alternative: Warn but continue + +2. **Should we support partial check execution?** + - E.g., `--check dns,tcp` to run only specific checks + - Adds complexity but useful for debugging + +3. **Should we add connection monitoring?** + - Out of scope for this feature + - Consider for future enhancement + +4. **Should we cache successful test results?** + - Current plan: No caching (always fresh) + - Trade-off: Speed vs. accuracy + +--- + +## Appendix: File Structure + +``` +src/itential_mcp/ +├── platform/ +│ └── connection_test.py # NEW - Core test service +├── runtime/ +│ ├── commands.py # MODIFIED - Add test_connection command +│ ├── handlers.py # MODIFIED - Register handler +│ └── parser.py # MODIFIED - Add CLI arguments +├── server/ +│ └── server.py # MODIFIED - Add startup test +├── config/ +│ ├── models.py # MODIFIED - Add config fields +│ └── defaults.py # MODIFIED - Add defaults +└── cli/ + └── terminal.py # MODIFIED - Add utilities (if needed) + +tests/ +├── test_platform_connection_test.py # NEW - Unit tests +└── test_runtime_test_connection.py # NEW - Integration tests + +docs/ +├── connection-testing.md # NEW - User documentation +├── README.md # MODIFIED - Add reference +└── design/ + ├── connection-test-feature.md # Design spec + └── connection-test-implementation-plan.md # This document +``` + +--- + +**End of Implementation Plan** diff --git a/src/itential_mcp/cli/terminal.py b/src/itential_mcp/cli/terminal.py index 8572cd3..b93899c 100644 --- a/src/itential_mcp/cli/terminal.py +++ b/src/itential_mcp/cli/terminal.py @@ -2,23 +2,65 @@ # GNU General Public License v3.0+ (see LICENSE or https://www.gnu.org/licenses/gpl-3.0.txt) # SPDX-License-Identifier: GPL-3.0-or-later +"""Terminal utilities for CLI output formatting.""" + import shutil +class Colors: + """ANSI color codes for terminal output.""" + + RED = "\033[91m" + GREEN = "\033[92m" + YELLOW = "\033[93m" + BLUE = "\033[94m" + RESET = "\033[0m" + BOLD = "\033[1m" + + def getcols() -> int: - """ - Get the number of columns for the current terminal session + """Get the number of columns for the current terminal session. This function will get the current terminal size and return the number of columns in the current terminal. + Returns: + int: The number of columns for the current terminal. + """ + return shutil.get_terminal_size().columns + + +def print_success(message: str) -> None: + """Print success message in green. + Args: - None + message: The message to print. 
+ """ + print(f"{Colors.GREEN}{message}{Colors.RESET}") - Returns: - int: The number of columns for the current terminal - Raises: - None +def print_error(message: str) -> None: + """Print error message in red. + + Args: + message: The message to print. """ - return shutil.get_terminal_size().columns + print(f"{Colors.RED}{message}{Colors.RESET}") + + +def print_warning(message: str) -> None: + """Print warning message in yellow. + + Args: + message: The message to print. + """ + print(f"{Colors.YELLOW}{message}{Colors.RESET}") + + +def print_info(message: str) -> None: + """Print info message in blue. + + Args: + message: The message to print. + """ + print(f"{Colors.BLUE}{message}{Colors.RESET}") diff --git a/src/itential_mcp/config/models.py b/src/itential_mcp/config/models.py index 40b900a..36f3c86 100644 --- a/src/itential_mcp/config/models.py +++ b/src/itential_mcp/config/models.py @@ -264,6 +264,20 @@ class ServerConfig: }, ) + test_connection_on_startup: bool = _create_field_with_env( + "ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP", + "Test platform connection during server startup", + default=defaults.ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP, + env_getter=env.getbool, + ) + + startup_test_timeout: int = _create_field_with_env( + "ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT", + "Timeout for startup connection test in seconds", + default=defaults.ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT, + env_getter=env.getint, + ) + @dataclass(frozen=True) class AuthConfig: diff --git a/src/itential_mcp/defaults.py b/src/itential_mcp/defaults.py index d042cc0..59a8597 100644 --- a/src/itential_mcp/defaults.py +++ b/src/itential_mcp/defaults.py @@ -37,6 +37,12 @@ ITENTIAL_MCP_SERVER_RESPONSE_FORMAT = ( "json" # Response serialization format: json, toon, or auto ) +ITENTIAL_MCP_SERVER_TEST_CONNECTION_ON_STARTUP = ( + False # Test platform connection during server startup +) +ITENTIAL_MCP_SERVER_STARTUP_TEST_TIMEOUT = ( + 30 # Timeout for startup connection test in seconds +) ITENTIAL_MCP_SERVER_AUTH_TYPE = "none" # Authentication provider type ITENTIAL_MCP_SERVER_AUTH_JWKS_URI = None # JWKS URI for JWT validation ITENTIAL_MCP_SERVER_AUTH_PUBLIC_KEY = ( diff --git a/src/itential_mcp/platform/connection_test.py b/src/itential_mcp/platform/connection_test.py new file mode 100644 index 0000000..dbac966 --- /dev/null +++ b/src/itential_mcp/platform/connection_test.py @@ -0,0 +1,644 @@ +# Copyright (c) 2025 Itential, Inc +# GNU General Public License v3.0+ (see LICENSE or https://www.gnu.org/licenses/gpl-3.0.txt) +# SPDX-License-Identifier: GPL-3.0-or-later + +"""Connection testing service for Itential Platform connectivity validation. + +This module provides comprehensive connection testing capabilities to validate +connectivity to the Itential Platform. It performs a series of checks including +configuration validation, DNS resolution, TCP connectivity, TLS handshake, +authentication, platform health, and API access. +""" + +from __future__ import annotations + +import asyncio +import logging +import socket +import ssl +import time +from dataclasses import dataclass +from enum import Enum +from typing import Any + +from ..config.models import Config +from ..core.exceptions import AuthenticationException + + +class CheckStatus(str, Enum): + """Status of a connection check.""" + + PASSED = "passed" + FAILED = "failed" + SKIPPED = "skipped" + WARNING = "warning" + + +@dataclass +class CheckResult: + """Result of a single connection check. + + Attributes: + name: Name of the check that was performed. 
+ status: Status of the check (passed, failed, skipped, warning). + message: Human-readable message describing the result. + duration_ms: Time taken to perform the check in milliseconds. + details: Additional structured information about the check. + error: Exception that caused the check to fail, if any. + suggestion: Actionable suggestion for resolving failures. + """ + + name: str + status: CheckStatus + message: str + duration_ms: float + details: dict[str, Any] | None = None + error: Exception | None = None + suggestion: str | None = None + + +@dataclass +class ConnectionTestResult: + """Overall result of connection testing. + + Attributes: + success: Whether all checks passed successfully. + duration_ms: Total time taken for all checks in milliseconds. + checks: List of individual check results. + platform_version: Version of the Itential Platform, if available. + authenticated_user: Username of the authenticated user, if available. + error: Error message if the overall test failed. + """ + + success: bool + duration_ms: float + checks: list[CheckResult] + platform_version: str | None = None + authenticated_user: str | None = None + error: str | None = None + + +class ConnectionTestService: + """Service for testing platform connectivity. + + This service performs a comprehensive series of checks to validate + connectivity to the Itential Platform, including configuration validation, + network connectivity, authentication, and API access. + """ + + def __init__(self, config: Config): + """Initialize connection test service. + + Args: + config: Application configuration. + """ + self.config = config + self._logger = logging.getLogger(__name__) + + # Determine actual port (0 means use default for protocol) + self._port = self._get_actual_port() + + # Determine platform authentication type + self._auth_type = self._get_platform_auth_type() + + def _get_actual_port(self) -> int: + """Get the actual port to use for connections. + + Returns: + int: The actual port number. + """ + port = self.config.platform.port + + # Port 0 means use default for protocol + if port == 0: + if self.config.platform.disable_tls: + return 80 # HTTP default + else: + return 443 # HTTPS default + + return port + + def _get_platform_auth_type(self) -> str: + """Get the platform authentication type. + + Returns: + str: The authentication type (basic, oauth, or none). + """ + # Check if OAuth credentials are configured + if self.config.platform.client_id and self.config.platform.client_secret: + return "oauth" + + # Check if basic auth credentials are configured + if self.config.platform.user and self.config.platform.password: + return "basic" + + return "none" + + def _create_check_result( + self, + name: str, + status: CheckStatus, + message: str, + duration_ms: float, + details: dict[str, Any] | None = None, + error: Exception | None = None, + suggestion: str | None = None, + ) -> CheckResult: + """Create a CheckResult instance. + + Args: + name: Name of the check. + status: Status of the check. + message: Human-readable message. + duration_ms: Duration in milliseconds. + details: Additional details. + error: Exception that occurred. + suggestion: Suggestion for resolving issues. + + Returns: + CheckResult: The created check result. + """ + return CheckResult( + name=name, + status=status, + message=message, + duration_ms=duration_ms, + details=details, + error=error, + suggestion=suggestion, + ) + + async def check_configuration(self) -> CheckResult: + """Validate configuration is loaded and valid. 
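+
+        Confirms that a platform host is configured and that an
+        authentication type can be determined before any network
+        checks run.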
+
+        Returns:
+            CheckResult: Configuration check result.
+        """
+        start = time.perf_counter()
+
+        try:
+            # Validate required fields
+            if not self.config.platform.host:
+                return self._create_check_result(
+                    name="configuration",
+                    status=CheckStatus.FAILED,
+                    message="Platform host not configured",
+                    duration_ms=(time.perf_counter() - start) * 1000,
+                    suggestion="Set ITENTIAL_MCP_PLATFORM_HOST environment variable",
+                )
+
+            # Validate auth configuration
+            if not self.config.auth.type:
+                return self._create_check_result(
+                    name="configuration",
+                    status=CheckStatus.FAILED,
+                    message="Authentication type not configured",
+                    duration_ms=(time.perf_counter() - start) * 1000,
+                    suggestion="Set ITENTIAL_MCP_AUTH_TYPE environment variable",
+                )
+
+            return self._create_check_result(
+                name="configuration",
+                status=CheckStatus.PASSED,
+                message="Configuration loaded successfully",
+                duration_ms=(time.perf_counter() - start) * 1000,
+                details={
+                    "platform_host": self.config.platform.host,
+                    "platform_port": self._port,
+                    "auth_type": self._auth_type,
+                    "tls_enabled": not self.config.platform.disable_tls,
+                },
+            )
+
+        except Exception as e:
+            return self._create_check_result(
+                name="configuration",
+                status=CheckStatus.FAILED,
+                message=f"Configuration error: {e}",
+                duration_ms=(time.perf_counter() - start) * 1000,
+                error=e,
+            )
+
+    async def check_dns_resolution(self) -> CheckResult:
+        """Verify hostname resolves to IP address.
+
+        Returns:
+            CheckResult: DNS resolution check result.
+        """
+        start = time.perf_counter()
+        hostname = self.config.platform.host
+
+        try:
+            # Attempt DNS resolution
+            ip_address = socket.gethostbyname(hostname)
+
+            return self._create_check_result(
+                name="dns",
+                status=CheckStatus.PASSED,
+                message=f"{hostname} -> {ip_address}",
+                duration_ms=(time.perf_counter() - start) * 1000,
+                details={
+                    "hostname": hostname,
+                    "ip_address": ip_address,
+                },
+            )
+
+        except socket.gaierror as e:
+            return self._create_check_result(
+                name="dns",
+                status=CheckStatus.FAILED,
+                message=f"Could not resolve hostname '{hostname}'",
+                duration_ms=(time.perf_counter() - start) * 1000,
+                error=e,
+                suggestion=(
+                    f"1. Check hostname spelling: {hostname}\n"
+                    "2. Verify DNS server is reachable\n"
+                    "3. Try using IP address directly for testing"
+                ),
+            )
+
+        except Exception as e:
+            return self._create_check_result(
+                name="dns",
+                status=CheckStatus.FAILED,
+                message=f"DNS resolution error: {e}",
+                duration_ms=(time.perf_counter() - start) * 1000,
+                error=e,
+            )
+
+    async def check_tcp_connection(self) -> CheckResult:
+        """Verify TCP connection can be established.
+
+        Returns:
+            CheckResult: TCP connection check result.
+        """
+        start = time.perf_counter()
+        host = self.config.platform.host
+        port = self._port
+
+        try:
+            # Attempt TCP connection with timeout; always release the
+            # socket, even when connect raises
+            loop = asyncio.get_running_loop()
+            sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+            sock.settimeout(5.0)
+
+            try:
+                await loop.run_in_executor(
+                    None,
+                    sock.connect,
+                    (host, port),
+                )
+            finally:
+                sock.close()
+
+            return self._create_check_result(
+                name="tcp",
+                status=CheckStatus.PASSED,
+                message=f"TCP connection established ({host}:{port})",
+                duration_ms=(time.perf_counter() - start) * 1000,
+                details={
+                    "host": host,
+                    "port": port,
+                },
+            )
+
+        except socket.timeout:
+            return self._create_check_result(
+                name="tcp",
+                status=CheckStatus.FAILED,
+                message=f"Connection timeout to {host}:{port}",
+                duration_ms=(time.perf_counter() - start) * 1000,
+                suggestion=(
+                    f"1. Verify platform is running at {host}:{port}\n"
+                    "2. 
Check network connectivity\n" + "3. Verify firewall rules allow connection\n" + "4. Increase timeout if network is slow" + ), + ) + + except ConnectionRefusedError: + return self._create_check_result( + name="tcp", + status=CheckStatus.FAILED, + message=f"Connection refused to {host}:{port}", + duration_ms=(time.perf_counter() - start) * 1000, + suggestion=( + f"1. Verify platform is running at {host}:{port}\n" + "2. Check port number is correct\n" + "3. Verify platform is listening on this address" + ), + ) + + except Exception as e: + return self._create_check_result( + name="tcp", + status=CheckStatus.FAILED, + message=f"TCP connection error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + ) + + async def check_tls_handshake(self) -> CheckResult: + """Verify TLS handshake succeeds (if TLS enabled). + + Returns: + CheckResult: TLS handshake check result. + """ + start = time.perf_counter() + + # Skip if TLS is disabled + if self.config.platform.disable_tls: + return self._create_check_result( + name="tls", + status=CheckStatus.SKIPPED, + message="TLS disabled in configuration", + duration_ms=(time.perf_counter() - start) * 1000, + ) + + host = self.config.platform.host + port = self._port + + try: + # Create SSL context + context = ssl.create_default_context() + + if self.config.platform.disable_verify: + context.check_hostname = False + context.verify_mode = ssl.CERT_NONE + + # Perform TLS handshake + with socket.create_connection((host, port), timeout=5.0) as sock: + with context.wrap_socket(sock, server_hostname=host) as ssock: + cipher = ssock.cipher() + version = ssock.version() + + return self._create_check_result( + name="tls", + status=CheckStatus.PASSED, + message="TLS handshake successful", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "protocol": version, + "cipher": cipher[0] if cipher else None, + }, + ) + + except ssl.SSLCertVerificationError as e: + return self._create_check_result( + name="tls", + status=CheckStatus.FAILED, + message="TLS certificate verification failed", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + f"Certificate verification failed for {host}\n\n" + "Options:\n" + "1. Update hostname to match certificate CN\n" + "2. Obtain valid certificate for this hostname\n" + "3. Disable verification (not recommended):\n" + " ITENTIAL_MCP_PLATFORM_DISABLE_VERIFY=true" + ), + ) + + except Exception as e: + return self._create_check_result( + name="tls", + status=CheckStatus.FAILED, + message=f"TLS handshake error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + ) + + async def check_authentication(self) -> CheckResult: + """Verify authentication succeeds. + + Returns: + CheckResult: Authentication check result. 
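+
+        Note:
+            Relies on PlatformClient raising AuthenticationException
+            while the client session is being established if the
+            configured credentials are rejected.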
+ """ + from .client import PlatformClient + + start = time.perf_counter() + + try: + # Create temporary client for auth test + async with PlatformClient(): + # The mere fact that we can create the client and it doesn't + # throw an auth error means auth is working + + user = getattr(self.config.platform, "user", None) + + return self._create_check_result( + name="authentication", + status=CheckStatus.PASSED, + message=f"Authentication successful ({self._auth_type})", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "auth_type": self._auth_type, + "user": user, + }, + ) + + except AuthenticationException as e: + return self._create_check_result( + name="authentication", + status=CheckStatus.FAILED, + message="Authentication failed", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + "Authentication failed. Please check:\n" + "1. Username and password are correct\n" + "2. User exists in Itential Platform\n" + "3. User has API access permissions\n" + "4. For OAuth, verify issuer and token configuration" + ), + ) + + except Exception as e: + return self._create_check_result( + name="authentication", + status=CheckStatus.FAILED, + message=f"Authentication error: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + ) + + async def check_platform_health(self) -> CheckResult: + """Verify platform health endpoint responds. + + Returns: + CheckResult: Platform health check result. + """ + from .client import PlatformClient + + start = time.perf_counter() + + try: + async with PlatformClient() as client: + # Call health endpoint + health = await client.health.get_status_health() + + platform_version = health.get("platform", {}).get("version") + + return self._create_check_result( + name="health", + status=CheckStatus.PASSED, + message="Platform health check passed", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "platform_version": platform_version, + "status": health.get("status"), + }, + ) + + except Exception as e: + return self._create_check_result( + name="health", + status=CheckStatus.FAILED, + message=f"Platform health check failed: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + "Platform health check failed. Possible causes:\n" + "1. Platform is starting up (wait and retry)\n" + "2. Platform is experiencing issues (check logs)\n" + "3. Health endpoint is not available" + ), + ) + + async def check_api_access(self) -> CheckResult: + """Verify API access with simple query. + + Returns: + CheckResult: API access check result. 
+ """ + from .client import PlatformClient + + start = time.perf_counter() + + try: + async with PlatformClient() as platform_client: + # Make a simple API call that requires permissions + # Use health/adapters endpoint with limit for minimal overhead + # Use the underlying ipsdk client for the HTTP call + res = await platform_client.client.get( + "/health/adapters", params={"limit": 1} + ) + data = res.json() + + return self._create_check_result( + name="api_access", + status=CheckStatus.PASSED, + message="API access verified", + duration_ms=(time.perf_counter() - start) * 1000, + details={ + "api_call": "GET /health/adapters?limit=1", + "total_adapters": data.get("total", 0), + }, + ) + + except Exception as e: + return self._create_check_result( + name="api_access", + status=CheckStatus.FAILED, + message=f"API access verification failed: {e}", + duration_ms=(time.perf_counter() - start) * 1000, + error=e, + suggestion=( + "API access verification failed. Possible causes:\n" + "1. User lacks required permissions\n" + "2. API endpoint is unavailable\n" + "3. Request format is invalid" + ), + ) + + async def run_all_checks(self, timeout: int = 30) -> ConnectionTestResult: + """Run all connection checks in sequence. + + Args: + timeout: Maximum time for all checks in seconds. + + Returns: + ConnectionTestResult: Overall test results. + """ + overall_start = time.perf_counter() + checks: list[CheckResult] = [] + + try: + # Run checks in order, fail-fast on failures + check_methods = [ + self.check_configuration, + self.check_dns_resolution, + self.check_tcp_connection, + self.check_tls_handshake, + self.check_authentication, + self.check_platform_health, + self.check_api_access, + ] + + for check_method in check_methods: + try: + # Run with timeout + result = await asyncio.wait_for( + check_method(), + timeout=timeout, + ) + checks.append(result) + + # Fail fast - stop on first failure + if result.status == CheckStatus.FAILED: + break + + except asyncio.TimeoutError: + checks.append( + CheckResult( + name=check_method.__name__.replace("check_", ""), + status=CheckStatus.FAILED, + message="Check timed out", + duration_ms=timeout * 1000, + suggestion="Increase timeout or check network latency", + ) + ) + break + + # Determine overall success + success = all( + check.status in (CheckStatus.PASSED, CheckStatus.SKIPPED) + for check in checks + ) + + # Extract metadata from successful checks + platform_version = None + authenticated_user = None + + for check in checks: + if check.name == "health" and check.status == CheckStatus.PASSED: + if check.details: + platform_version = check.details.get("platform_version") + if ( + check.name == "authentication" + and check.status == CheckStatus.PASSED + ): + if check.details: + authenticated_user = check.details.get("user") + + duration_ms = (time.perf_counter() - overall_start) * 1000 + + return ConnectionTestResult( + success=success, + duration_ms=duration_ms, + checks=checks, + platform_version=platform_version, + authenticated_user=authenticated_user, + error=None if success else checks[-1].message, + ) + + except Exception as e: + duration_ms = (time.perf_counter() - overall_start) * 1000 + self._logger.exception("Unexpected error during connection test") + + return ConnectionTestResult( + success=False, + duration_ms=duration_ms, + checks=checks, + error=f"Unexpected error: {e}", + ) diff --git a/src/itential_mcp/runtime/commands.py b/src/itential_mcp/runtime/commands.py index b501d3f..601e265 100644 --- a/src/itential_mcp/runtime/commands.py +++ 
b/src/itential_mcp/runtime/commands.py @@ -2,7 +2,10 @@ # GNU General Public License v3.0+ (see LICENSE or https://www.gnu.org/licenses/gpl-3.0.txt) # SPDX-License-Identifier: GPL-3.0-or-later -from typing import Any, Coroutine, Sequence, Mapping, Tuple +import json +import os +from datetime import datetime +from typing import Any, Coroutine, Sequence, Mapping, Tuple, Optional from . import runner from .. import server @@ -110,3 +113,222 @@ def call(args: Any) -> Tuple[Coroutine, Sequence, Mapping]: None """ return runner.run, (args.tool, args.params), None + + +async def _execute_test_connection( + config_file: Optional[str] = None, + format: str = "human", + verbose: bool = False, + timeout: int = 30, + quiet: bool = False, +) -> int: + """Execute connection test to Itential Platform. + + Args: + config_file: Path to configuration file. + format: Output format (human or json). + verbose: Show detailed diagnostic information. + timeout: Maximum time for test in seconds. + quiet: Suppress progress messages. + + Returns: + int: Exit code (0=success, 1=failure). + """ + from .. import config + from ..platform.connection_test import ConnectionTestService + from ..cli.terminal import print_error, print_info + + # Load configuration + try: + if config_file: + os.environ["ITENTIAL_MCP_CONFIG"] = config_file + + cfg = config.get() + except Exception as e: + print_error(f"Failed to load configuration: {e}") + return 1 + + # Create service + service = ConnectionTestService(cfg) + + # Run checks + if not quiet and format == "human": + print_info("Testing connection to Itential Platform...") + print() + + try: + result = await service.run_all_checks(timeout=timeout) + except Exception as e: + print_error(f"Connection test failed with unexpected error: {e}") + return 1 + + # Output results + if format == "json": + _output_json(result) + else: + _output_human(result, verbose=verbose) + + return 0 if result.success else 1 + + +def _output_human(result: Any, verbose: bool = False) -> None: + """Output results in human-readable format. + + Args: + result: Test results. + verbose: Show detailed information. 
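+
+    Illustrative rendering (colors omitted; messages and suggestions come
+    from the individual check results)::
+
+        ✓ Authentication successful (oauth)
+        ✗ Platform health check failed: timeout
+
+          💡 Suggestion:
+             1. Platform is starting up (wait and retry)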
+ """ + from ..platform.connection_test import CheckStatus + from ..cli.terminal import Colors + + # Print check results + for check in result.checks: + if check.status == CheckStatus.PASSED: + symbol = "✓" + color = Colors.GREEN + elif check.status == CheckStatus.FAILED: + symbol = "✗" + color = Colors.RED + elif check.status == CheckStatus.SKIPPED: + symbol = "○" + color = Colors.YELLOW + else: # WARNING + symbol = "⚠" + color = Colors.YELLOW + + print(f"{color}{symbol}{Colors.RESET} {check.message}") + + # Show timing in verbose mode + if verbose: + print(f" {Colors.BLUE}Duration:{Colors.RESET} {check.duration_ms:.0f}ms") + + # Show details in verbose mode + if verbose and check.details: + print(f" {Colors.BLUE}Details:{Colors.RESET}") + for key, value in check.details.items(): + print(f" • {key}: {value}") + + # Show error details in verbose mode + if verbose and check.error: + print( + f" {Colors.RED}Error:{Colors.RESET} {type(check.error).__name__}: {check.error}" + ) + + # Show suggestions for failures + if check.status == CheckStatus.FAILED and check.suggestion: + print() + print(f" {Colors.YELLOW}💡 Suggestion:{Colors.RESET}") + for line in check.suggestion.split("\n"): + if line.strip(): + print(f" {line}") + print() + + print() + print("─" * 60) + print() + + # Print summary + if result.success: + print(f"{Colors.GREEN}{Colors.BOLD}✓ Connection test: SUCCESS{Colors.RESET}") + print() + if result.platform_version: + print( + f" Platform version: {Colors.BOLD}{result.platform_version}{Colors.RESET}" + ) + if result.authenticated_user: + print( + f" Authenticated as: {Colors.BOLD}{result.authenticated_user}{Colors.RESET}" + ) + print( + f" Total duration: {Colors.BOLD}{result.duration_ms / 1000:.2f}s{Colors.RESET}" + ) + else: + print(f"{Colors.RED}{Colors.BOLD}✗ Connection test: FAILED{Colors.RESET}") + print() + if result.error: + print(f" {Colors.RED}Error: {result.error}{Colors.RESET}") + + print() + + +def _output_json(result: Any) -> None: + """Output results in JSON format. + + Args: + result: Test results. 
+ """ + from ..platform.connection_test import CheckStatus + + # Build comprehensive JSON output + data = { + "success": result.success, + "duration_ms": round(result.duration_ms, 2), + "timestamp": datetime.utcnow().isoformat() + "Z", + "checks": [], + } + + # Add check results + for check in result.checks: + check_data = { + "name": check.name, + "status": check.status, + "message": check.message, + "duration_ms": round(check.duration_ms, 2), + } + + if check.details: + check_data["details"] = check.details + + if check.suggestion: + check_data["suggestion"] = check.suggestion + + if check.error: + check_data["error"] = { + "type": type(check.error).__name__, + "message": str(check.error), + } + + data["checks"].append(check_data) + + # Add metadata + if result.platform_version: + data["platform_version"] = result.platform_version + if result.authenticated_user: + data["authenticated_user"] = result.authenticated_user + if result.error: + data["error"] = result.error + + # Add summary statistics + data["summary"] = { + "total_checks": len(result.checks), + "passed": sum(1 for c in result.checks if c.status == CheckStatus.PASSED), + "failed": sum(1 for c in result.checks if c.status == CheckStatus.FAILED), + "skipped": sum(1 for c in result.checks if c.status == CheckStatus.SKIPPED), + "warnings": sum(1 for c in result.checks if c.status == CheckStatus.WARNING), + } + + print(json.dumps(data, indent=2)) + + +def test(args: Any) -> Tuple[Coroutine, Sequence, Mapping]: + """Implement the `itential-mcp test` command. + + This function provides the implementation of the `test` command + that tests connectivity to the Itential Platform. + + Args: + args: The argparse Namespace instance containing command line arguments. + + Returns: + A tuple consisting of a coroutine function, a sequence that represents + the input args for the function, and a mapping that represents the + keyword arguments for the function. 
+ """ + kwargs = { + "config_file": getattr(args, "config", None), + "format": getattr(args, "format", "human"), + "verbose": getattr(args, "verbose", False), + "timeout": getattr(args, "timeout", 30), + "quiet": getattr(args, "quiet", False), + } + return _execute_test_connection, None, kwargs diff --git a/src/itential_mcp/runtime/constants.py b/src/itential_mcp/runtime/constants.py index 5973c81..89695af 100644 --- a/src/itential_mcp/runtime/constants.py +++ b/src/itential_mcp/runtime/constants.py @@ -76,4 +76,31 @@ class CommandConfig: description="Print the version information", arguments={}, ), + CommandConfig( + name="test", + description="Test connection to Itential Platform", + arguments={ + "--config": {"help": CONFIG_HELP_MESSAGE}, + "--timeout": { + "type": int, + "default": 30, + "help": "Maximum time for test in seconds (default: 30)", + }, + "--verbose": { + "action": "store_true", + "help": "Show detailed diagnostic information", + }, + "--format": { + "type": str, + "choices": ["human", "json"], + "default": "human", + "help": "Output format (default: human)", + }, + "--quiet": { + "action": "store_true", + "help": "Suppress progress messages (JSON output only)", + }, + }, + add_platform_group=True, + ), ] diff --git a/src/itential_mcp/server/server.py b/src/itential_mcp/server/server.py index bdba39b..02381fc 100644 --- a/src/itential_mcp/server/server.py +++ b/src/itential_mcp/server/server.py @@ -223,8 +223,67 @@ async def __init_bindings__(self) -> None: logging.debug(f"Successfully added tool: {kwargs['name']}") logging.info("Dynamic tool bindings is now complete") + async def _test_connection_on_startup(self) -> None: + """Test platform connection during startup. + + Raises: + ConnectionException: If connection test fails. + """ + from ..platform.connection_test import ConnectionTestService, CheckStatus + from ..core.exceptions import ConnectionException + + logger = logging.get_logger() + logger.info("Testing platform connection...") + + try: + service = ConnectionTestService(self.config) + result = await service.run_all_checks( + timeout=self.config.server.startup_test_timeout + ) + + if result.success: + logger.info("Connection test successful") + if result.platform_version: + logger.info(f"Platform version: {result.platform_version}") + if result.authenticated_user: + logger.info(f"Authenticated as: {result.authenticated_user}") + else: + logger.error("Connection test failed") + for check in result.checks: + if check.status == CheckStatus.FAILED: + logger.error(f" {check.name}: {check.message}") + if check.suggestion: + logger.info(f" Suggestion: {check.suggestion}") + + raise ConnectionException( + f"Connection test failed: {result.error}", + details={ + "checks": [ + { + "name": c.name, + "status": c.status, + "message": c.message, + } + for c in result.checks + ] + }, + ) + + except Exception as e: + logger.exception("Connection test failed with unexpected error") + raise ConnectionException(f"Connection test failed: {e}") from e + async def run(self): - """Run the server.""" + """Run the server. + + Raises: + ConnectionException: If startup connection test fails. 
+ """ + # Test connection if enabled + if self.config.server.test_connection_on_startup: + await self._test_connection_on_startup() + + # Continue with normal startup if self.config.server.transport in ("sse", "http"): app = self.mcp.http_app(path=self.config.server.path) diff --git a/tests/test_auth_oauth.py b/tests/test_auth_oauth.py index cd1c3c3..983c69b 100644 --- a/tests/test_auth_oauth.py +++ b/tests/test_auth_oauth.py @@ -113,10 +113,13 @@ class TestOAuthProviderBuilding: def test_build_oauth_provider_success(self, mock_oauth_provider): """Test successful OAuth provider building.""" from itential_mcp.config.converters import auth_to_dict - auth_config = auth_to_dict(make_auth_config( - type="oauth", - oauth_redirect_uri="http://localhost:8000/auth/callback", - )) + + auth_config = auth_to_dict( + make_auth_config( + type="oauth", + oauth_redirect_uri="http://localhost:8000/auth/callback", + ) + ) mock_provider = MagicMock() mock_oauth_provider.return_value = mock_provider @@ -129,6 +132,7 @@ def test_build_oauth_provider_success(self, mock_oauth_provider): def test_build_oauth_provider_missing_required_fields(self): """Test OAuth provider building with missing required fields.""" from itential_mcp.config.converters import auth_to_dict + auth_config = auth_to_dict(make_auth_config(type="oauth")) with pytest.raises(ConfigurationException) as exc_info: @@ -141,11 +145,14 @@ def test_build_oauth_provider_missing_required_fields(self): def test_build_oauth_provider_with_optional_fields(self, mock_oauth_provider): """Test OAuth provider building with optional fields.""" from itential_mcp.config.converters import auth_to_dict - auth_config = auth_to_dict(make_auth_config( - type="oauth", - oauth_redirect_uri="http://localhost:8000/auth/callback", - oauth_scopes="openid,email", - )) + + auth_config = auth_to_dict( + make_auth_config( + type="oauth", + oauth_redirect_uri="http://localhost:8000/auth/callback", + oauth_scopes="openid,email", + ) + ) mock_provider = MagicMock() mock_oauth_provider.return_value = mock_provider @@ -163,14 +170,17 @@ def test_build_oauth_proxy_provider_success( ): """Test successful OAuth proxy provider building.""" from itential_mcp.config.converters import auth_to_dict - auth_config = auth_to_dict(make_auth_config( - type="oauth_proxy", - oauth_client_id="test_client", - oauth_client_secret="test_secret", - oauth_authorization_url="https://accounts.google.com/oauth/authorize", - oauth_token_url="https://oauth2.googleapis.com/token", - oauth_redirect_uri="http://localhost:8000/auth/callback", - )) + + auth_config = auth_to_dict( + make_auth_config( + type="oauth_proxy", + oauth_client_id="test_client", + oauth_client_secret="test_secret", + oauth_authorization_url="https://accounts.google.com/oauth/authorize", + oauth_token_url="https://oauth2.googleapis.com/token", + oauth_redirect_uri="http://localhost:8000/auth/callback", + ) + ) mock_verifier_instance = MagicMock() mock_token_verifier.return_value = mock_verifier_instance @@ -193,15 +203,20 @@ def test_build_oauth_proxy_provider_success( def test_build_oauth_proxy_provider_missing_fields(self): """Test OAuth proxy provider building with missing required fields.""" from itential_mcp.config.converters import auth_to_dict - auth_config = auth_to_dict(make_auth_config( - type="oauth_proxy", - oauth_client_id="test_client", - )) + + auth_config = auth_to_dict( + make_auth_config( + type="oauth_proxy", + oauth_client_id="test_client", + ) + ) with pytest.raises(ConfigurationException) as exc_info: 
_build_oauth_proxy_provider(auth_config) - assert "OAuth proxy authentication requires the following fields" in str(exc_info.value) + assert "OAuth proxy authentication requires the following fields" in str( + exc_info.value + ) assert "client_secret" in str(exc_info.value) assert "authorization_url" in str(exc_info.value) assert "token_url" in str(exc_info.value) @@ -214,6 +229,7 @@ class TestProviderConfiguration: def test_google_provider_config(self): """Test Google provider configuration defaults.""" from itential_mcp.config.converters import auth_to_dict + auth_config = auth_to_dict(make_auth_config()) config = _get_provider_config("google", auth_config) @@ -222,6 +238,7 @@ def test_google_provider_config(self): def test_azure_provider_config(self): """Test Azure provider configuration defaults.""" from itential_mcp.config.converters import auth_to_dict + auth_config = auth_to_dict(make_auth_config()) config = _get_provider_config("azure", auth_config) @@ -230,6 +247,7 @@ def test_azure_provider_config(self): def test_github_provider_config(self): """Test GitHub provider configuration defaults.""" from itential_mcp.config.converters import auth_to_dict + auth_config = auth_to_dict(make_auth_config()) config = _get_provider_config("github", auth_config) @@ -238,6 +256,7 @@ def test_github_provider_config(self): def test_provider_config_custom_scopes(self): """Test provider configuration with custom scopes.""" from itential_mcp.config.converters import auth_to_dict + auth_config = auth_to_dict(make_auth_config(oauth_scopes="custom,scope")) config = _get_provider_config("google", auth_config) @@ -246,9 +265,10 @@ def test_provider_config_custom_scopes(self): def test_provider_config_custom_redirect_uri(self): """Test provider configuration with custom redirect URI.""" from itential_mcp.config.converters import auth_to_dict - auth_config = auth_to_dict(make_auth_config( - oauth_redirect_uri="http://custom.example.com/callback" - )) + + auth_config = auth_to_dict( + make_auth_config(oauth_redirect_uri="http://custom.example.com/callback") + ) config = _get_provider_config("google", auth_config) assert config["redirect_uri"] == "http://custom.example.com/callback" @@ -257,6 +277,7 @@ def test_provider_config_custom_redirect_uri(self): def test_unsupported_provider_type(self): """Test unsupported provider type raises exception.""" from itential_mcp.config.converters import auth_to_dict + auth_config = auth_to_dict(make_auth_config()) with pytest.raises(ConfigurationException) as exc_info: diff --git a/tests/test_platform_connection_test.py b/tests/test_platform_connection_test.py new file mode 100644 index 0000000..9d62221 --- /dev/null +++ b/tests/test_platform_connection_test.py @@ -0,0 +1,781 @@ +# Copyright (c) 2025 Itential, Inc +# GNU General Public License v3.0+ (see LICENSE or https://www.gnu.org/licenses/gpl-3.0.txt) +# SPDX-License-Identifier: GPL-3.0-or-later + +"""Tests for connection test service.""" + +from __future__ import annotations + +import asyncio +import socket +import ssl +from unittest.mock import AsyncMock, Mock, PropertyMock, patch + +import pytest + +from itential_mcp.config.models import Config +from itential_mcp.core.exceptions import AuthenticationException +from itential_mcp.platform.connection_test import ( + CheckResult, + CheckStatus, + ConnectionTestService, +) + + +@pytest.fixture +def mock_config(): + """Create mock configuration.""" + config = Mock(spec=Config) + config.platform = Mock() + config.platform.host = "platform.example.com" + 
config.platform.port = 3000 + config.platform.disable_tls = False + config.platform.disable_verify = False + config.platform.user = "admin" + config.platform.password = "password" + config.platform.client_id = None + config.platform.client_secret = None + config.auth = Mock() + config.auth.type = "oauth" + return config + + +@pytest.fixture +def service(mock_config): + """Create connection test service instance.""" + return ConnectionTestService(mock_config) + + +# Port Detection Tests + + +def test_get_actual_port_with_tls_enabled(mock_config): + """Test port detection returns 443 when TLS is enabled and port is 0.""" + mock_config.platform.port = 0 + mock_config.platform.disable_tls = False + service = ConnectionTestService(mock_config) + + assert service._port == 443 + + +def test_get_actual_port_with_tls_disabled(mock_config): + """Test port detection returns 80 when TLS is disabled and port is 0.""" + mock_config.platform.port = 0 + mock_config.platform.disable_tls = True + service = ConnectionTestService(mock_config) + + assert service._port == 80 + + +def test_get_actual_port_with_custom_port(mock_config): + """Test port detection returns custom port when specified.""" + mock_config.platform.port = 8443 + service = ConnectionTestService(mock_config) + + assert service._port == 8443 + + +# Authentication Type Detection Tests + + +def test_get_platform_auth_type_oauth(mock_config): + """Test auth type detection identifies OAuth.""" + mock_config.platform.client_id = "client123" + mock_config.platform.client_secret = "secret456" + mock_config.platform.user = None + mock_config.platform.password = None + service = ConnectionTestService(mock_config) + + assert service._auth_type == "oauth" + + +def test_get_platform_auth_type_basic(mock_config): + """Test auth type detection identifies basic auth.""" + mock_config.platform.client_id = None + mock_config.platform.client_secret = None + mock_config.platform.user = "admin" + mock_config.platform.password = "password" + service = ConnectionTestService(mock_config) + + assert service._auth_type == "basic" + + +def test_get_platform_auth_type_none(mock_config): + """Test auth type detection identifies no auth.""" + mock_config.platform.client_id = None + mock_config.platform.client_secret = None + mock_config.platform.user = None + mock_config.platform.password = None + service = ConnectionTestService(mock_config) + + assert service._auth_type == "none" + + +# Configuration Tests + + +@pytest.mark.asyncio +async def test_check_configuration_success(service): + """Test configuration check passes with valid config.""" + result = await service.check_configuration() + + assert result.name == "configuration" + assert result.status == CheckStatus.PASSED + assert "successfully" in result.message.lower() + assert result.details is not None + assert result.details["platform_host"] == "platform.example.com" + assert result.details["platform_port"] == 3000 + assert result.details["auth_type"] == "basic" + + +@pytest.mark.asyncio +async def test_check_configuration_missing_host(mock_config): + """Test configuration check fails with missing host.""" + mock_config.platform.host = None + service = ConnectionTestService(mock_config) + + result = await service.check_configuration() + + assert result.status == CheckStatus.FAILED + assert "host not configured" in result.message.lower() + assert result.suggestion is not None + assert "ITENTIAL_MCP_PLATFORM_HOST" in result.suggestion + + +@pytest.mark.asyncio +async def 
test_check_configuration_missing_auth_type(mock_config): + """Test configuration check fails with missing auth type.""" + mock_config.auth.type = None + service = ConnectionTestService(mock_config) + + result = await service.check_configuration() + + assert result.status == CheckStatus.FAILED + assert "authentication type not configured" in result.message.lower() + assert result.suggestion is not None + + +@pytest.mark.asyncio +async def test_check_configuration_exception(): + """Test configuration check handles generic exceptions.""" + # Create a config that will raise an exception during validation + mock_config = Mock(spec=Config) + mock_config.platform = Mock() + mock_config.platform.host = "platform.example.com" + mock_config.platform.port = 3000 + mock_config.platform.disable_tls = False + mock_config.platform.disable_verify = False + mock_config.platform.user = "admin" + mock_config.platform.password = "password" + mock_config.platform.client_id = None + mock_config.platform.client_secret = None + mock_config.auth = Mock() + + # Make auth.type raise an exception when accessed in a way that's not in __init__ + type(mock_config.auth).type = PropertyMock(side_effect=Exception("Config error")) + + service = ConnectionTestService(mock_config) + result = await service.check_configuration() + + assert result.status == CheckStatus.FAILED + assert "configuration error" in result.message.lower() + assert result.error is not None + + +# DNS Tests + + +@pytest.mark.asyncio +async def test_check_dns_resolution_success(service): + """Test DNS resolution succeeds.""" + with patch("socket.gethostbyname", return_value="192.168.1.100"): + result = await service.check_dns_resolution() + + assert result.status == CheckStatus.PASSED + assert "192.168.1.100" in result.message + assert result.details["ip_address"] == "192.168.1.100" + assert result.details["hostname"] == "platform.example.com" + + +@pytest.mark.asyncio +async def test_check_dns_resolution_failure(service): + """Test DNS resolution fails gracefully.""" + with patch("socket.gethostbyname", side_effect=socket.gaierror): + result = await service.check_dns_resolution() + + assert result.status == CheckStatus.FAILED + assert "resolve" in result.message.lower() + assert result.suggestion is not None + assert "DNS server" in result.suggestion + + +@pytest.mark.asyncio +async def test_check_dns_resolution_generic_exception(service): + """Test DNS resolution handles generic exceptions.""" + with patch("socket.gethostbyname", side_effect=Exception("Unexpected DNS error")): + result = await service.check_dns_resolution() + + assert result.status == CheckStatus.FAILED + assert "dns resolution error" in result.message.lower() + assert result.error is not None + + +# TCP Tests + + +@pytest.mark.asyncio +async def test_check_tcp_connection_success(service): + """Test TCP connection succeeds.""" + mock_sock = Mock() + mock_sock.close = Mock() + + async def mock_connect(*args): + pass + + with patch("socket.socket", return_value=mock_sock): + with patch.object( + asyncio.get_event_loop(), "run_in_executor", side_effect=mock_connect + ): + result = await service.check_tcp_connection() + + assert result.status == CheckStatus.PASSED + assert "established" in result.message.lower() + assert result.details["host"] == "platform.example.com" + assert result.details["port"] == 3000 + mock_sock.close.assert_called_once() + + +@pytest.mark.asyncio +async def test_check_tcp_connection_refused(service): + """Test TCP connection handles connection refused.""" + 
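+    # Simulate the blocking connect call (run via the executor) raising
+    # ConnectionRefusedError, as it would when nothing is listening on the port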
mock_sock = Mock() + + async def mock_connect_refused(*args): + raise ConnectionRefusedError() + + with patch("socket.socket", return_value=mock_sock): + with patch.object( + asyncio.get_event_loop(), + "run_in_executor", + side_effect=mock_connect_refused, + ): + result = await service.check_tcp_connection() + + assert result.status == CheckStatus.FAILED + assert "refused" in result.message.lower() + assert result.suggestion is not None + assert "platform is running" in result.suggestion.lower() + + +@pytest.mark.asyncio +async def test_check_tcp_connection_timeout(service): + """Test TCP connection handles timeout.""" + mock_sock = Mock() + + async def mock_timeout(*args): + raise socket.timeout() + + with patch("socket.socket", return_value=mock_sock): + with patch.object( + asyncio.get_event_loop(), "run_in_executor", side_effect=mock_timeout + ): + result = await service.check_tcp_connection() + + assert result.status == CheckStatus.FAILED + assert "timeout" in result.message.lower() + assert result.suggestion is not None + + +@pytest.mark.asyncio +async def test_check_tcp_connection_generic_exception(service): + """Test TCP connection handles generic exceptions.""" + mock_sock = Mock() + + async def mock_error(*args): + raise Exception("Unexpected network error") + + with patch("socket.socket", return_value=mock_sock): + with patch.object( + asyncio.get_event_loop(), "run_in_executor", side_effect=mock_error + ): + result = await service.check_tcp_connection() + + assert result.status == CheckStatus.FAILED + assert "tcp connection error" in result.message.lower() + assert result.error is not None + + +# TLS Tests + + +@pytest.mark.asyncio +async def test_check_tls_handshake_skipped_when_disabled(mock_config): + """Test TLS check skipped when TLS disabled.""" + mock_config.platform.disable_tls = True + service = ConnectionTestService(mock_config) + + result = await service.check_tls_handshake() + + assert result.status == CheckStatus.SKIPPED + assert "disabled" in result.message.lower() + + +@pytest.mark.asyncio +async def test_check_tls_handshake_success(service): + """Test TLS handshake succeeds.""" + mock_ssl_socket = Mock() + mock_ssl_socket.cipher.return_value = ("TLS_AES_256_GCM_SHA384", "TLSv1.3", 256) + mock_ssl_socket.version.return_value = "TLSv1.3" + mock_ssl_socket.__enter__ = Mock(return_value=mock_ssl_socket) + mock_ssl_socket.__exit__ = Mock(return_value=False) + + mock_socket = Mock() + mock_socket.__enter__ = Mock(return_value=mock_socket) + mock_socket.__exit__ = Mock(return_value=False) + + mock_context = Mock() + mock_context.wrap_socket.return_value = mock_ssl_socket + + with ( + patch("ssl.create_default_context", return_value=mock_context), + patch("socket.create_connection", return_value=mock_socket), + ): + result = await service.check_tls_handshake() + + assert result.status == CheckStatus.PASSED + assert "successful" in result.message.lower() + assert result.details["protocol"] == "TLSv1.3" + + +@pytest.mark.asyncio +async def test_check_tls_handshake_cert_verification_error(service): + """Test TLS handshake handles certificate verification error.""" + with patch( + "ssl.create_default_context", + side_effect=ssl.SSLCertVerificationError("cert verify failed"), + ): + result = await service.check_tls_handshake() + + assert result.status == CheckStatus.FAILED + assert "certificate verification failed" in result.message.lower() + assert result.suggestion is not None + assert "DISABLE_VERIFY" in result.suggestion + + +@pytest.mark.asyncio +async def 
test_check_tls_handshake_with_verify_disabled(mock_config): + """Test TLS handshake with certificate verification disabled.""" + mock_config.platform.disable_verify = True + service = ConnectionTestService(mock_config) + + mock_ssl_socket = Mock() + mock_ssl_socket.cipher.return_value = ("TLS_AES_256_GCM_SHA384", "TLSv1.3", 256) + mock_ssl_socket.version.return_value = "TLSv1.3" + mock_ssl_socket.__enter__ = Mock(return_value=mock_ssl_socket) + mock_ssl_socket.__exit__ = Mock(return_value=False) + + mock_socket = Mock() + mock_socket.__enter__ = Mock(return_value=mock_socket) + mock_socket.__exit__ = Mock(return_value=False) + + mock_context = Mock() + mock_context.wrap_socket.return_value = mock_ssl_socket + + with ( + patch("ssl.create_default_context", return_value=mock_context), + patch("socket.create_connection", return_value=mock_socket), + ): + result = await service.check_tls_handshake() + + assert result.status == CheckStatus.PASSED + assert "successful" in result.message.lower() + # Verify that certificate verification was disabled + assert mock_context.check_hostname is False + assert mock_context.verify_mode == ssl.CERT_NONE + + +@pytest.mark.asyncio +async def test_check_tls_handshake_generic_exception(service): + """Test TLS handshake handles generic exceptions.""" + with patch( + "ssl.create_default_context", side_effect=Exception("Unexpected SSL error") + ): + result = await service.check_tls_handshake() + + assert result.status == CheckStatus.FAILED + assert "tls handshake error" in result.message.lower() + assert result.error is not None + + +# Authentication Tests + + +@pytest.mark.asyncio +async def test_check_authentication_success(service): + """Test authentication check succeeds.""" + mock_client = AsyncMock() + mock_client.__aenter__.return_value = mock_client + mock_client.__aexit__.return_value = None + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_client, + ): + result = await service.check_authentication() + + assert result.status == CheckStatus.PASSED + assert "successful" in result.message.lower() + assert result.details["auth_type"] == "basic" + assert result.details["user"] == "admin" + + +@pytest.mark.asyncio +async def test_check_authentication_failure(service): + """Test authentication check handles failure.""" + mock_client = AsyncMock() + mock_client.__aenter__.side_effect = AuthenticationException("Invalid credentials") + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_client, + ): + result = await service.check_authentication() + + assert result.status == CheckStatus.FAILED + assert "authentication failed" in result.message.lower() + assert result.suggestion is not None + assert "username and password" in result.suggestion.lower() + + +@pytest.mark.asyncio +async def test_check_authentication_generic_exception(service): + """Test authentication check handles generic exceptions.""" + mock_client = AsyncMock() + mock_client.__aenter__.side_effect = Exception("Unexpected auth error") + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_client, + ): + result = await service.check_authentication() + + assert result.status == CheckStatus.FAILED + assert "authentication error" in result.message.lower() + assert result.error is not None + + +# Health Tests + + +@pytest.mark.asyncio +async def test_check_platform_health_success(service): + """Test platform health check succeeds.""" + mock_client = AsyncMock() + mock_client.__aenter__.return_value = 
mock_client + mock_client.__aexit__.return_value = None + mock_client.health = AsyncMock() + mock_client.health.get_status_health = AsyncMock( + return_value={"status": "healthy", "platform": {"version": "2024.1.0"}} + ) + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_client, + ): + result = await service.check_platform_health() + + assert result.status == CheckStatus.PASSED + assert "passed" in result.message.lower() + assert result.details["platform_version"] == "2024.1.0" + assert result.details["status"] == "healthy" + + +@pytest.mark.asyncio +async def test_check_platform_health_failure(service): + """Test platform health check handles failure.""" + mock_client = AsyncMock() + mock_client.__aenter__.return_value = mock_client + mock_client.__aexit__.return_value = None + mock_client.health = AsyncMock() + mock_client.health.get_status_health = AsyncMock(side_effect=Exception("API error")) + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_client, + ): + result = await service.check_platform_health() + + assert result.status == CheckStatus.FAILED + assert "failed" in result.message.lower() + assert result.suggestion is not None + + +@pytest.mark.asyncio +async def test_check_platform_health_generic_exception(service): + """Test platform health check handles generic exceptions.""" + mock_client = AsyncMock() + mock_client.__aenter__.side_effect = Exception("Unexpected health error") + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_client, + ): + result = await service.check_platform_health() + + assert result.status == CheckStatus.FAILED + assert "platform health check failed" in result.message.lower() + assert result.error is not None + + +# API Access Tests + + +@pytest.mark.asyncio +async def test_check_api_access_success(service): + """Test API access check succeeds.""" + mock_platform_client = AsyncMock() + mock_platform_client.__aenter__.return_value = mock_platform_client + mock_platform_client.__aexit__.return_value = None + + # Mock the underlying ipsdk client + mock_ipsdk_client = AsyncMock() + mock_response = Mock() + mock_response.json.return_value = {"total": 5, "results": [{"id": "adapter1"}]} + mock_ipsdk_client.get = AsyncMock(return_value=mock_response) + mock_platform_client.client = mock_ipsdk_client + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_platform_client, + ): + result = await service.check_api_access() + + assert result.status == CheckStatus.PASSED + assert "verified" in result.message.lower() + assert result.details["api_call"] == "GET /health/adapters?limit=1" + assert result.details["total_adapters"] == 5 + + +@pytest.mark.asyncio +async def test_check_api_access_failure(service): + """Test API access check handles failure.""" + mock_platform_client = AsyncMock() + mock_platform_client.__aenter__.return_value = mock_platform_client + mock_platform_client.__aexit__.return_value = None + + # Mock the underlying ipsdk client to raise an exception + mock_ipsdk_client = AsyncMock() + mock_ipsdk_client.get = AsyncMock(side_effect=Exception("Permission denied")) + mock_platform_client.client = mock_ipsdk_client + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_platform_client, + ): + result = await service.check_api_access() + + assert result.status == CheckStatus.FAILED + assert "failed" in result.message.lower() + assert result.suggestion is not None + + +@pytest.mark.asyncio +async 
def test_check_api_access_generic_exception(service): + """Test API access check handles generic exceptions.""" + mock_platform_client = AsyncMock() + mock_platform_client.__aenter__.side_effect = Exception("Unexpected API error") + + with patch( + "itential_mcp.platform.client.PlatformClient", + return_value=mock_platform_client, + ): + result = await service.check_api_access() + + assert result.status == CheckStatus.FAILED + assert "api access verification failed" in result.message.lower() + assert result.error is not None + + +# Overall Test Runner + + +@pytest.mark.asyncio +async def test_run_all_checks_success(service): + """Test running all checks successfully.""" + # Mock all checks to succeed + mock_checks = [ + CheckResult( + name="configuration", + status=CheckStatus.PASSED, + message="OK", + duration_ms=10, + ), + CheckResult( + name="dns", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + CheckResult( + name="tcp", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + CheckResult( + name="tls", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + CheckResult( + name="authentication", + status=CheckStatus.PASSED, + message="OK", + duration_ms=10, + details={"user": "admin"}, + ), + CheckResult( + name="health", + status=CheckStatus.PASSED, + message="OK", + duration_ms=10, + details={"platform_version": "2024.1.0"}, + ), + CheckResult( + name="api_access", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + ] + + with ( + patch.object(service, "check_configuration", return_value=mock_checks[0]), + patch.object(service, "check_dns_resolution", return_value=mock_checks[1]), + patch.object(service, "check_tcp_connection", return_value=mock_checks[2]), + patch.object(service, "check_tls_handshake", return_value=mock_checks[3]), + patch.object(service, "check_authentication", return_value=mock_checks[4]), + patch.object(service, "check_platform_health", return_value=mock_checks[5]), + patch.object(service, "check_api_access", return_value=mock_checks[6]), + ): + result = await service.run_all_checks() + + assert result.success is True + assert len(result.checks) == 7 + assert result.platform_version == "2024.1.0" + assert result.authenticated_user == "admin" + assert result.error is None + + +@pytest.mark.asyncio +async def test_run_all_checks_fails_fast(service): + """Test that checks stop on first failure.""" + # Mock first check to succeed, second to fail + config_check = CheckResult( + name="configuration", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ) + dns_check = CheckResult( + name="dns", + status=CheckStatus.FAILED, + message="Failed", + duration_ms=10, + suggestion="Fix DNS", + ) + + with ( + patch.object(service, "check_configuration", return_value=config_check), + patch.object( + service, "check_dns_resolution", return_value=dns_check + ) as mock_dns, + ): + result = await service.run_all_checks() + + assert result.success is False + assert len(result.checks) == 2 # Only ran first two checks + assert result.error == "Failed" + mock_dns.assert_called_once() + + +@pytest.mark.asyncio +async def test_run_all_checks_timeout(service): + """Test timeout handling.""" + + async def slow_check(): + await asyncio.sleep(10) + return CheckResult( + name="slow", status=CheckStatus.PASSED, message="OK", duration_ms=10000 + ) + + config_check = CheckResult( + name="configuration", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ) + + with ( + patch.object(service, "check_configuration", return_value=config_check), + 
patch.object(service, "check_dns_resolution", new=slow_check), + ): + result = await service.run_all_checks(timeout=1) + + assert result.success is False + assert len(result.checks) == 2 + # The second check should have timed out + assert result.checks[1].status == CheckStatus.FAILED + assert "timed out" in result.checks[1].message.lower() + + +@pytest.mark.asyncio +async def test_run_all_checks_skips_tls_when_disabled(mock_config): + """Test that TLS check is skipped when disabled.""" + mock_config.platform.disable_tls = True + service = ConnectionTestService(mock_config) + + mock_checks = [ + CheckResult( + name="configuration", + status=CheckStatus.PASSED, + message="OK", + duration_ms=10, + ), + CheckResult( + name="dns", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + CheckResult( + name="tcp", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + CheckResult( + name="tls", + status=CheckStatus.SKIPPED, + message="TLS disabled", + duration_ms=0, + ), + CheckResult( + name="authentication", + status=CheckStatus.PASSED, + message="OK", + duration_ms=10, + ), + CheckResult( + name="health", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + CheckResult( + name="api_access", status=CheckStatus.PASSED, message="OK", duration_ms=10 + ), + ] + + with ( + patch.object(service, "check_configuration", return_value=mock_checks[0]), + patch.object(service, "check_dns_resolution", return_value=mock_checks[1]), + patch.object(service, "check_tcp_connection", return_value=mock_checks[2]), + patch.object(service, "check_tls_handshake", return_value=mock_checks[3]), + patch.object(service, "check_authentication", return_value=mock_checks[4]), + patch.object(service, "check_platform_health", return_value=mock_checks[5]), + patch.object(service, "check_api_access", return_value=mock_checks[6]), + ): + result = await service.run_all_checks() + + assert result.success is True + assert len(result.checks) == 7 + tls_check = next((c for c in result.checks if c.name == "tls"), None) + assert tls_check is not None + assert tls_check.status == CheckStatus.SKIPPED + + +@pytest.mark.asyncio +async def test_run_all_checks_handles_unexpected_error(service): + """Test that unexpected errors are handled gracefully.""" + with patch.object( + service, "check_configuration", side_effect=Exception("Unexpected error") + ): + result = await service.run_all_checks() + + assert result.success is False + assert "Unexpected error" in result.error diff --git a/tests/test_server.py b/tests/test_server.py index 92b038f..9606072 100644 --- a/tests/test_server.py +++ b/tests/test_server.py @@ -733,6 +733,7 @@ async def test_server_run_sse_transport_with_uvicorn( mock_config.server.certificate_file = None mock_config.server.private_key_file = None mock_config.server.path = "/mcp" + mock_config.server.test_connection_on_startup = False # Create server instance server_instance = server_module.Server(mock_config) @@ -776,6 +777,7 @@ async def test_server_run_http_transport_with_uvicorn(self, mock_uvicorn_server) mock_config.server.certificate_file = "/path/to/cert.pem" mock_config.server.private_key_file = "/path/to/key.pem" mock_config.server.path = "/api" + mock_config.server.test_connection_on_startup = False # Create server instance server_instance = server_module.Server(mock_config) @@ -813,6 +815,7 @@ async def test_server_run_stdio_transport_uses_fastmcp(self): # Setup config for stdio transport mock_config = MagicMock() mock_config.server.transport = "stdio" + 
mock_config.server.test_connection_on_startup = False # Create server instance server_instance = server_module.Server(mock_config) @@ -834,6 +837,7 @@ async def test_server_run_missing_mcp_instance(self): mock_config.server.transport = "sse" mock_config.server.host = "127.0.0.1" mock_config.server.port = 8000 + mock_config.server.test_connection_on_startup = False server_instance = server_module.Server(mock_config) # mcp remains None @@ -886,6 +890,7 @@ async def test_server_run_tls_configuration_variations(self, mock_uvicorn_server mock_config.server.certificate_file = case["certificate_file"] mock_config.server.private_key_file = case["private_key_file"] mock_config.server.path = "/mcp" + mock_config.server.test_connection_on_startup = False server_instance = server_module.Server(mock_config) @@ -928,6 +933,7 @@ async def test_server_run_http_app_path_configuration(self, mock_uvicorn_server) mock_config.server.certificate_file = None mock_config.server.private_key_file = None mock_config.server.path = "/custom-path" + mock_config.server.test_connection_on_startup = False server_instance = server_module.Server(mock_config)