2 changes: 1 addition & 1 deletion .docfx/Dockerfile.docfx
@@ -1,4 +1,4 @@
ARG NGINX_VERSION=1.29.3-alpine
ARG NGINX_VERSION=1.29.4-alpine

FROM --platform=$BUILDPLATFORM nginx:${NGINX_VERSION} AS base
RUN rm -rf /usr/share/nginx/html/*
2 changes: 1 addition & 1 deletion .docfx/docfx.json
@@ -48,7 +48,7 @@
],
"globalMetadata": {
"_appTitle": "Bootstrapper by Codebelt",
"_appFooter": "<span>Generated by <strong>DocFX</strong>. Copyright 2024-2025 Geekle. All rights reserved.</span>",
"_appFooter": "<span>Generated by <strong>DocFX</strong>. Copyright 2024-2026 Geekle. All rights reserved.</span>",
"_appLogoPath": "images/50x50.png",
"_appFaviconPath": "images/favicon.ico",
"_googleAnalyticsTagId": "G-56S43SYNH9",
133 changes: 80 additions & 53 deletions .github/copilot-instructions.md
@@ -1,10 +1,10 @@
---
description: 'Writing Unit Tests in Bootstrapper'
description: 'Writing Unit Tests'
applyTo: "**/*.{cs,csproj}"
---

# Writing Unit Tests in Bootstrapper
This document provides instructions for writing unit tests in the Bootstrapper codebase. Please follow these guidelines to ensure consistency and maintainability.
# Writing Unit Tests
This document provides instructions for writing unit tests for a project/solution. Please follow these guidelines to ensure consistency and maintainability.

## 1. Base Class

@@ -48,35 +48,35 @@ namespace Your.Namespace
## 5. File and Namespace Organization

- Place test files in the appropriate test project and folder structure.
- Use namespaces that mirror the source code structure. The namespace of a test file MUST match the namespace of the System Under Test (SUT). Do NOT append ".Tests", ".Benchmarks" or similar suffixes to the namespace. Only the assembly/project name should indicate that the file is a test/benchmark (for example: Bootstrapper.Foo.Tests assembly, but namespace Bootstrapper.Foo).
- Example: If the SUT class is declared as:
```csharp
namespace Bootstrapper.Foo.Bar
{
public class Zoo { /* ... */ }
}
```
then the corresponding unit test class must use the exact same namespace:
```csharp
namespace Bootstrapper.Foo.Bar
{
public class ZooTest : Test { /* ... */ }
}
```
- Do NOT use:
```csharp
namespace Bootstrapper.Foo.Bar.Tests { /* ... */ } // ❌
namespace Bootstrapper.Foo.Bar.Benchmarks { /* ... */ } // ❌
- Use namespaces that mirror the source code structure. The namespace of a test file MUST match the namespace of the System Under Test (SUT). Do NOT append ".Tests", ".Benchmarks" or similar suffixes to the namespace. Only the assembly/project name should indicate that the file is a test/benchmark (for example: YourProject.Foo.Tests assembly, but namespace YourProject.Foo).
- Example: If the SUT class is declared as:
```csharp
namespace YourProject.Foo.Bar
{
public class Zoo { /* ... */ }
}
```
then the corresponding unit test class must use the exact same namespace:
```csharp
namespace YourProject.Foo.Bar
{
public class ZooTest : Test { /* ... */ }
}
```
- Do NOT use:
```csharp
namespace YourProject.Foo.Bar.Tests { /* ... */ } // ❌
namespace YourProject.Foo.Bar.Benchmarks { /* ... */ } // ❌
```
- The unit tests for the YourProject.Foo assembly live in the YourProject.Foo.Tests assembly.
- The functional tests for the YourProject.Foo assembly live in the YourProject.Foo.FunctionalTests assembly.
- Test class names end with Test and live in the same namespace as the class being tested, e.g., the unit tests for the Boo class that resides in the YourProject.Foo assembly would be named BooTest and placed in the YourProject.Foo namespace in the YourProject.Foo.Tests assembly.
- Modify the associated .csproj file to override the root namespace so the compiled namespace matches the SUT. Example:
```xml
<PropertyGroup>
<RootNamespace>YourProject.Foo</RootNamespace>
</PropertyGroup>
```
- The unit tests for the Bootstrapper.Foo assembly live in the Bootstrapper.Foo.Tests assembly.
- The functional tests for the Bootstrapper.Foo assembly live in the Bootstrapper.Foo.FunctionalTests assembly.
- Test class names end with Test and live in the same namespace as the class being tested, e.g., the unit tests for the Boo class that resides in the Bootstrapper.Foo assembly would be named BooTest and placed in the Bootstrapper.Foo namespace in the Bootstrapper.Foo.Tests assembly.
- Modify the associated .csproj file to override the root namespace so the compiled namespace matches the SUT. Example:
```xml
<PropertyGroup>
<RootNamespace>Bootstrapper.Foo</RootNamespace>
</PropertyGroup>
```
- When generating test scaffolding automatically, resolve the SUT's namespace from the source file (or project/assembly metadata) and use that exact namespace in the test file header.

- Notes:
@@ -91,7 +91,7 @@ using System.Globalization;
using Codebelt.Extensions.Xunit;
using Xunit;

namespace Bootstrapper
namespace YourProject
{
/// <summary>
/// Tests for the <see cref="DateSpan"/> class.
@@ -149,33 +149,61 @@ namespace Bootstrapper
- Before overriding methods, verify that the method is virtual or abstract; this rule also applies to mocks.
- Never mock IMarshaller; always use a new instance of JsonMarshaller.
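The marshaller rule above can be sketched as follows. This is a minimal illustration only: `HypotheticalSerializer` is an invented stand-in for a real SUT, and the exact `JsonMarshaller` constructor and `Serialize` signature should be taken from the actual API.
```csharp
[Fact]
public void ShouldSerializeUsingRealMarshaller()
{
    // ❌ new Mock<IMarshaller>() — never mock the marshaller
    // ✔ always construct the real implementation
    var marshaller = new JsonMarshaller();

    var sut = new HypotheticalSerializer(marshaller); // hypothetical SUT taking an IMarshaller dependency

    Assert.NotNull(sut.Serialize(new { Value = 42 }));
}
```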

For further examples, refer to existing test files such as
[`test/Bootstrapper.Core.Tests/DisposableTest.cs`](test/Bootstrapper.Core.Tests/DisposableTest.cs) and [`test/Bootstrapper.Core.Tests/Security/HashFactoryTest.cs`](test/Bootstrapper.Core.Tests/Security/HashFactoryTest.cs).
## 9. Avoid `InternalsVisibleTo` in Tests

- **Do not** use `InternalsVisibleTo` to access internal types or members from test projects.
- Prefer **indirect testing via public APIs** that depend on the internal implementation (public facades, public extension methods, or other public entry points).

### Preferred Pattern

**Pattern name:** Public Facade Testing (also referred to as *Public API Proxy Testing*)

**Description:**
Internal classes and methods must be validated by exercising the public API that consumes them. Tests should assert observable behavior exposed by the public surface rather than targeting internal implementation details directly.

### Example Mapping

- **Internal helper:** `DelimitedString` (internal static class)
- **Public API:** `TestOutputHelperExtensions.WriteLines()` (public extension method)
- **Test strategy:** Write tests for `WriteLines()` and verify its public behavior. The internal call to `DelimitedString.Create()` is exercised implicitly.
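The mapping above can be sketched as a test. Treat this as an assumption-laden illustration: the exact `WriteLines()` overloads, the `Test` base-class constructor, and the `TestOutput` member name should be verified against the real `Codebelt.Extensions.Xunit` API before use.
```csharp
using Codebelt.Extensions.Xunit;
using Xunit;
using Xunit.Abstractions;

namespace YourProject.Foo
{
    public class TestOutputHelperExtensionsTest : Test
    {
        public TestOutputHelperExtensionsTest(ITestOutputHelper output) : base(output)
        {
        }

        [Fact]
        public void WriteLines_ShouldExerciseInternalDelimitedStringImplicitly()
        {
            // No InternalsVisibleTo needed: the internal DelimitedString.Create()
            // is covered through the public WriteLines() extension method.
            TestOutput.WriteLines("first", "second");
        }
    }
}
```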

### Benefits

- Avoids exposing internal types to test assemblies.
- Ensures tests reflect real-world usage patterns.
- Maintains strong encapsulation and a clean public API.
- Tests remain resilient to internal refactoring as long as public behavior is preserved.

### When to Apply

- Internal logic is fully exercised through existing public APIs.
- Public entry points provide sufficient coverage of internal code paths.
- The internal implementation exists solely as a helper or utility for public-facing functionality.

---
description: 'Writing Performance Tests in Bootstrapper'
description: 'Writing Performance Tests'
applyTo: "tuning/**, **/*Benchmark*.cs"
---

# Writing Performance Tests in Bootstrapper
This document provides guidance for writing performance tests (benchmarks) in the Bootstrapper codebase using BenchmarkDotNet. Follow these guidelines to keep benchmarks consistent, readable, and comparable.
# Writing Performance Tests
This document provides guidance for writing performance tests (benchmarks) for a project/solution using BenchmarkDotNet. Follow these guidelines to keep benchmarks consistent, readable, and comparable.

## 1. Naming and Placement

- Place micro- and component-benchmarks under the `tuning/` folder or in projects named `*.Benchmarks`.
- Place benchmark files in the appropriate benchmark project and folder structure.
- Use namespaces that mirror the source code structure, i.e., do not suffix with `Benchmarks`.
- Namespace rule: DO NOT append `.Benchmarks` to the namespace. Benchmarks must live in the same namespace as the production assembly. Example: if the production assembly uses `namespace Bootstrapper.Security.Cryptography`, the benchmark file should also use:
- Namespace rule: DO NOT append `.Benchmarks` to the namespace. Benchmarks must live in the same namespace as the production assembly. Example: if the production assembly uses `namespace YourProject.Security.Cryptography`, the benchmark file should also use:
```csharp
namespace Bootstrapper.Security.Cryptography
namespace YourProject.Security.Cryptography
{
public class Sha512256Benchmark { /* ... */ }
}
```
The class name must end with `Benchmark`, but the namespace must match the assembly (no `.Benchmarks` suffix).
- The benchmarks for the Bootstrapper.Bar assembly live in the Bootstrapper.Bar.Benchmarks assembly.
- Benchmark class names end with Benchmark and live in the same namespace as the class being measured, e.g., the benchmarks for the Zoo class that resides in the Bootstrapper.Bar assembly would be named ZooBenchmark and placed in the Bootstrapper.Bar namespace in the Bootstrapper.Bar.Benchmarks assembly.
- Modify the associated .csproj file to override the root namespace, e.g., <RootNamespace>Bootstrapper.Bar</RootNamespace>.
- The benchmarks for the YourProject.Bar assembly live in the YourProject.Bar.Benchmarks assembly.
- Benchmark class names end with Benchmark and live in the same namespace as the class being measured, e.g., the benchmarks for the Zoo class that resides in the YourProject.Bar assembly would be named ZooBenchmark and placed in the YourProject.Bar namespace in the YourProject.Bar.Benchmarks assembly.
- Modify the associated .csproj file to override the root namespace, e.g., `<RootNamespace>YourProject.Bar</RootNamespace>`.

## 2. Attributes and Configuration

@@ -206,7 +234,7 @@
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;

namespace Bootstrapper
namespace YourProject
{
[MemoryDiagnoser]
[GroupBenchmarksBy(BenchmarkLogicalGroupRule.ByCategory)]
@@ -244,15 +272,14 @@ namespace Bootstrapper
- If a benchmark exposes regressions or optimizations, add a short note in the benchmark file referencing the relevant issue or PR.
- For any shared helpers for benchmarking, prefer small utility classes inside the `tuning` projects rather than cross-cutting changes to production code.

For further examples, refer to the benchmark files under the `tuning/` folder such as `tuning/Bootstrapper.Core.Benchmarks/DateSpanBenchmark.cs` and `tuning/Bootstrapper.Security.Cryptography.Benchmarks/Sha512256Benchmark.cs`.
For further examples, refer to the benchmark files under the `tuning/` folder.

---
description: 'Writing XML documentation in Bootstrapper'
description: 'Writing XML documentation'
applyTo: "**/*.cs"
---

# Writing XML documentation in Bootstrapper

# Writing XML documentation
This document provides instructions for writing XML documentation.
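For instance, a minimal sketch of the expected documentation shape (the member shown is hypothetical, not part of the codebase):
```csharp
/// <summary>
/// Converts the specified <paramref name="value"/> to its hexadecimal string representation.
/// </summary>
/// <param name="value">The byte sequence to convert.</param>
/// <returns>A lowercase hexadecimal <see cref="string"/> equivalent of <paramref name="value"/>.</returns>
public static string ToHexadecimalString(byte[] value)
{
    return Convert.ToHexString(value).ToLowerInvariant();
}
```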

## 1. Documentation Style
@@ -265,12 +292,12 @@ This document provides instructions for writing XML documentation.
using System;
using System.Collections.Generic;
using System.IO;
using Bootstrapper.Collections.Generic;
using Bootstrapper.Configuration;
using Bootstrapper.IO;
using Bootstrapper.Text;
using Cuemon.Collections.Generic;
using Cuemon.Configuration;
using Cuemon.IO;
using Cuemon.Text;

namespace Bootstrapper.Security
namespace Cuemon.Security
{
/// <summary>
/// Represents the base class from which all implementations of hash algorithms and checksums should derive.
@@ -540,7 +567,7 @@ namespace Bootstrapper.Security
}
}

namespace Bootstrapper.Security
namespace Cuemon.Security
{
/// <summary>
/// Configuration options for <see cref="FowlerNollVoHash"/>.
164 changes: 164 additions & 0 deletions .github/prompts/benchmark.prompt.md
@@ -0,0 +1,164 @@
---
mode: agent
description: 'Writing Performance Benchmarks'
---

# Benchmark Fixture Prompt (Tuning Benchmarks)

This prompt defines how to generate performance tests (“benchmarks”) for a project/solution using BenchmarkDotNet.
Benchmarks are *not* unit tests — they are micro- or component-level performance measurements that belong under the `tuning/` directory and follow strict conventions.

Copilot must follow these guidelines when generating benchmark fixtures.

---

## 1. Naming and Placement

- All benchmark projects live under the `tuning/` folder.
Examples:
- `tuning/<ProjectName>.Benchmarks/`
- `tuning/<ProjectName>.Console.Benchmarks/`

- **Namespaces must NOT end with `.Benchmarks`.**
They must mirror the production assembly’s namespace.

Example:
If benchmarking a type inside `YourProject.Console`, then:

```csharp
namespace YourProject.Console
{
public class Sha512256Benchmark { … }
}
```

* **Benchmark class names must end with `Benchmark`.**
Example: `DateSpanBenchmark`, `FowlerNollVoBenchmark`.

* Benchmark files should be located in the matching benchmark project
(e.g., benchmarks for `YourProject.Console` go in `YourProject.Console.Benchmarks.csproj`).

* In the `.csproj` for each benchmark project, set the root namespace to the production namespace, for example:

```xml
<RootNamespace>YourProject.Console</RootNamespace>
```

---

## 2. Attributes and Configuration

Each benchmark class should use:

```csharp
[MemoryDiagnoser]
[GroupBenchmarksBy(BenchmarkLogicalGroupRule.ByCategory)]
```

Optional but strongly recommended where meaningful:

* `[Params(...)]` — define small, medium, large input sizes.
* `[GlobalSetup]` — deterministic initialization of benchmark data.
* `[Benchmark(Description = "...")]` — always add descriptions.
* `[Benchmark(Baseline = true)]` — when comparing two implementations.

Avoid complex global configs; prefer explicit attributes inside the class.

---

## 3. Structure and Best Practices

A benchmark fixture must:

* Measure a **single logical operation** per benchmark method.
* Avoid I/O, networking, disk access, logging, or side effects.
* Avoid expensive setup inside `[Benchmark]` methods.
* Use deterministic data (e.g., seeded RNG or predefined constants).
* Use `[GlobalSetup]` to allocate buffers, random payloads, or reusable test data only once.
* Avoid shared mutable state unless reset per iteration.

Use representative input sizes such as:

```csharp
[Params(8, 256, 4096)]
public int Count { get; set; }
```

BenchmarkDotNet will run each benchmark for each parameter value.

---

## 4. Method Naming Conventions

Use descriptive names that communicate intent:

* `Parse_Short`
* `Parse_Long`
* `ComputeHash_Small`
* `ComputeHash_Large`
* `Serialize_Optimized`
* `Serialize_Baseline`

When comparing approaches, always list them clearly and tag one as the baseline.

---

## 5. Example Benchmark Fixture

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;

namespace YourProject
{
[MemoryDiagnoser]
[GroupBenchmarksBy(BenchmarkLogicalGroupRule.ByCategory)]
public class SampleOperationBenchmark
{
[Params(8, 256, 4096)]
public int Count { get; set; }

private byte[] _payload;

[GlobalSetup]
public void Setup()
{
_payload = new byte[Count];
// deterministic initialization
}

[Benchmark(Baseline = true, Description = "Operation - baseline")]
public int Operation_Baseline() => SampleOperation.Process(_payload);

[Benchmark(Description = "Operation - optimized")]
public int Operation_Optimized() => SampleOperation.ProcessOptimized(_payload);
}
}
```

---

## 6. Reporting and CI

* Benchmark projects live exclusively under `tuning/`. They must not affect production builds.
* Heavy BenchmarkDotNet runs should *not* run in CI unless explicitly configured.
* Reports are produced by the benchmark runner and stored under the configured artifacts directory.

---

## 7. Additional Guidelines

* Keep benchmark fixtures focused and readable.
* Document non-obvious reasoning in short comments.
* Prefer realistic but deterministic data sets.
* When benchmarks reveal regressions or improvements, reference the associated PR or issue in a comment.
* Shared benchmark helpers belong in `tuning/` projects, not in production code.

---

## Final Notes

* Benchmarks are performance tests, not unit tests.
* Use `[Benchmark]` only for pure performance measurement.
* Avoid `MethodImplOptions.NoInlining` unless absolutely necessary.
* Use small sets of meaningful benchmark scenarios — avoid combinatorial explosion.