A type-safe, predictable workflow orchestration engine for .NET that solves critical limitations in existing workflow solutions.
- Type Safety: Compile-time validation prevents runtime errors
- Performance: Skip unnecessary steps based on runtime conditions
- Reliability: Built-in validation and error recovery
- Flexibility: Runtime adaptability with optional execution
- Observability: Complete audit trail and debugging support
- Composability: Reusable blocks with clear responsibilities
- Persistence: Natural state management for long-running workflows
- Developer Experience: Intuitive API with full IntelliSense support
- Advanced Code Execution: Support for inline C#, assembly execution, and async patterns
- Debugging Support: Integrated debugging with breakpoints, stepping, and variable inspection
- Quick Start
- Core Concepts
- Usage Examples
- JSON Workflow System
- Guard System
- State Management
- Error Handling
- Advanced Features
- Code Execution System
- API Reference
Traditional workflow engines suffer from critical limitations:
Type Safety & Debugging
- Runtime type errors from dynamic inference
- No compile-time validation of workflow definitions
- Limited visibility into execution flow
- Black box approach makes troubleshooting difficult
Validation & Control
- No pre-execution validation before moving to the next step
- Missing business rule enforcement between workflow steps
- Fixed execution paths with no runtime adaptability
- Every step must execute regardless of necessity
State Management
- No native support for long-running tasks (hours/days)
- Complex state management across extended execution periods
- Handling partial failures in multi-step processes
// Problems this engine solves:
// Fraud Detection: expensive checks run even for trusted customers
// Email Verification: verification emails sent to already-verified users
// Document Processing: OCR runs even when text is already extracted
// Approval Workflows: manager approval required even for minor changes
// API Calls: external calls made even when data is cached
Add the package to your project:
dotnet add package FlowCore
FlowCore integrates seamlessly with Microsoft.Extensions.DependencyInjection:
using Microsoft.Extensions.DependencyInjection;
using FlowCore.Core;
// Configure services
var services = new ServiceCollection();
services.AddLogging(builder =>
{
builder.SetMinimumLevel(LogLevel.Information);
builder.AddConsole();
});
services.AddSingleton<IWorkflowBlockFactory, WorkflowBlockFactory>();
services.AddSingleton<IStateManager, InMemoryStateManager>();
var serviceProvider = services.BuildServiceProvider();
// Create a simple workflow using the fluent API
var workflow = FlowCoreWorkflowBuilder.Create("user-registration", "User Registration")
.WithVersion("1.0.0")
.WithDescription("User registration process")
.WithVariable("welcomeEmailTemplate", "Welcome to our platform!")
.StartWith("BasicBlocks.LogBlock", "validate_input")
.OnSuccessGoTo("create_user")
.WithDisplayName("Validate User Input")
.And()
.AddBlock("BasicBlocks.LogBlock", "create_user")
.OnSuccessGoTo("send_email")
.WithDisplayName("Create User Account")
.And()
.AddBlock("BasicBlocks.LogBlock", "send_email")
.WithDisplayName("Send Welcome Email")
.And()
.Build();
// Execute the workflow
var logger = serviceProvider.GetRequiredService<ILogger<WorkflowEngine>>();
var blockFactory = serviceProvider.GetRequiredService<IWorkflowBlockFactory>();
var stateManager = serviceProvider.GetRequiredService<IStateManager>();
var engine = new WorkflowEngine(blockFactory, stateManager: stateManager, logger: logger);
var input = new { Username = "john_doe", Email = "john@example.com" };
var result = await engine.ExecuteAsync(workflow, input);
Console.WriteLine($"Workflow completed: {result.Succeeded}");
Console.WriteLine($"Duration: {result.Duration?.TotalMilliseconds}ms");
// Define workflow in JSON
var jsonWorkflow = @"
{
""id"": ""order-processing"",
""name"": ""Order Processing"",
""version"": ""1.0.0"",
""startBlockName"": ""validate_order"",
""blocks"": {
""validate_order"": {
""id"": ""validate_order"",
""type"": ""BasicBlocks.LogBlock"",
""nextBlockOnSuccess"": ""process_payment""
}
}
}";
// Execute JSON workflow
var jsonEngine = new JsonWorkflowEngine(serviceProvider);
var result = await jsonEngine.ExecuteFromJsonAsync(jsonWorkflow, orderData);
Core Principles:
- Type Safety First: Compile-time validation over runtime discovery
- Predictable Execution: Clear success/failure paths with guard checks
- Optional Execution: Runtime adaptability with conditional block skipping
- Developer Experience: Feels natural to .NET developers
- Composability: Small, focused blocks over monolithic workflows
- State Persistence: Natural checkpointing at each block boundary
- Error Recovery: Built-in retry and compensation logic
- Guard Validation: Pre/post-execution validation at each step
Core Metaphor:
Workflow = Linked List of Code Blocks
Each block knows:
- What to execute
- What to do on success
- What to do on failure
- When to wait for approval
Key Advantages:
- Intuitive: Every developer understands linked lists
- Type-Safe: Compile-time validation prevents runtime errors
- Transparent: Clear execution flow, easy to debug
- Composable: Blocks can be reused and combined
public interface IWorkflowBlock
{
Task<ExecutionResult> ExecuteAsync(ExecutionContext context);
string NextBlockOnSuccess { get; }
string NextBlockOnFailure { get; }
}
public class ExecutionContext
{
public object Input { get; }
public IReadOnlyDictionary<string, object> State { get; }
public CancellationToken CancellationToken { get; }
public string WorkflowName { get; }
public DateTime StartedAt { get; }
public string CurrentBlockName { get; internal set; }
}
public class ExecutionResult
{
public bool IsSuccess { get; }
public string NextBlockName { get; }
public object Output { get; }
public ExecutionMetadata Metadata { get; }
// Factory methods for different outcomes
public static ExecutionResult Success(string nextBlock = null);
public static ExecutionResult Failure(string nextBlock = null);
public static ExecutionResult Skip(string nextBlock = null);
public static ExecutionResult Wait(TimeSpan duration, string nextBlock = null);
}
Runtime workflow optimization through intelligent block skipping:
public class SmartValidationBlock : IWorkflowBlock
{
public async Task<ExecutionResult> ExecuteAsync(ExecutionContext context)
{
// Low-risk customers skip the expensive screening and jump ahead to payment validation
if (context.Customer.RiskScore == RiskLevel.Low)
    return ExecutionResult.Skip("payment_validation_block");
// Small orders skip payment validation entirely
if (context.Order.Amount < 50)
    return ExecutionResult.Skip("payment_processing_block");
// Only run validation when necessary
var result = await RunValidation(context);
return ExecutionResult.Success("payment_processing_block");
}
}
Execution Strategies:
- Skip with Alternative: Jump to different block when skipping
- Skip Multiple Blocks: Skip sequences of unnecessary steps
- Conditional Insertion: Add blocks based on runtime conditions
- Dynamic Routing: Change workflow path based on execution results
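Dynamic routing boils down to computing the successor block name from runtime data instead of fixing it when the workflow is built. A standalone sketch (thresholds and block names are illustrative, not part of the FlowCore API):

```csharp
using System;

// Dynamic routing sketch: choose the next block name at runtime.
static class OrderRouting
{
    public static string NextBlock(decimal amount, bool isTrustedCustomer)
    {
        if (isTrustedCustomer && amount < 100m)
            return "payment_processing_block";   // trusted small order: skip all checks
        if (amount >= 10000m)
            return "manual_review_block";        // high-value order: escalate
        return "fraud_check_block";              // default path
    }
}
```

A block's ExecuteAsync would return ExecutionResult.Success(OrderRouting.NextBlock(...)) to steer the workflow down the computed path.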
Pre/post-execution validation at each block boundary:
public class SecurePaymentBlock : IWorkflowBlock
{
public async Task<ExecutionResult> ExecuteAsync(ExecutionContext context)
{
// Guard Check: Validate before execution
if (!await ValidatePaymentMethod(context.PaymentInfo))
return ExecutionResult.Failure("payment_method_invalid");
if (!await CheckAccountBalance(context.Amount))
return ExecutionResult.Failure("insufficient_funds");
if (!await VerifySecurityToken(context.UserToken))
return ExecutionResult.Failure("security_validation_failed");
// Only execute payment if all guards pass
var result = await ProcessPayment(context);
return ExecutionResult.Success("order_confirmation_block");
}
}
Guard Types:
- Data Validation Guards: Type checking and format validation
- Business Rule Guards: Policy and procedure compliance
- System State Guards: Resource availability and security validation
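Each guard category reduces to a predicate over the execution context. Minimal standalone sketches, one per category (these are illustrations, not FlowCore's CommonGuards API, which appears later):

```csharp
using System;
using System.Text.RegularExpressions;

// One illustrative predicate per guard category.
static class GuardSketches
{
    // Data Validation Guard: format check
    public static bool IsValidEmail(string value) =>
        Regex.IsMatch(value ?? string.Empty, @"^[^\s@]+@[^\s@]+\.[^\s@]+$");

    // Business Rule Guard: policy compliance (09:00-17:00)
    public static bool IsWithinBusinessHours(TimeSpan timeOfDay) =>
        timeOfDay >= TimeSpan.FromHours(9) && timeOfDay <= TimeSpan.FromHours(17);

    // System State Guard: resource availability
    public static bool HasCapacity(int activeExecutions, int maxConcurrent) =>
        activeExecutions < maxConcurrent;
}
```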
Natural checkpointing at each block boundary:
public class WorkflowEngine
{
public async Task<WorkflowState> ExecuteAsync(WorkflowDefinition definition, object input)
{
var context = new ExecutionContext(input);
var currentBlock = definition.StartBlock;
while (currentBlock != null)
{
// Execute block and handle result
var result = await currentBlock.ExecuteAsync(context);
// Persist state at each checkpoint
await PersistWorkflowState(definition.Id, context.State);
// Determine next block based on result
currentBlock = await GetNextBlock(definition, result.NextBlockName);
}
return context.State;
}
}
- Natural State Persistence: Each block execution is a checkpoint
- Error Recovery: Failed workflows can resume from last successful block
- Progress Tracking: Clear execution history and audit trail
- Resource Management: Memory and connection leak prevention
Declarative workflow definitions with runtime interpretation:
{
"id": "enterprise_security_workflow",
"version": "2.0.0",
"description": "Enterprise security workflow with guard checks",
"metadata": {
"author": "Security Team",
"tags": ["security", "enterprise", "validation"]
},
"variables": {
"securityLevel": "high",
"requireMFA": true,
"auditLevel": "detailed"
},
"nodes": {
"auth_guard": {
"type": "MyCompany.SecurityBlocks.AuthenticationGuardBlock, MyCompany.SecurityBlocks",
"configuration": {
"tokenValidation": "strict",
"allowedTokenTypes": ["bearer", "apikey"]
},
"guards": {
"preExecution": [
{
"type": "TokenValidationGuard",
"condition": "token != null && token.IsValid",
"errorBlock": "invalid_token_error"
}
]
},
"transitions": {
"success": "authz_guard",
"failure": "auth_failure_handler"
}
},
"authz_guard": {
"type": "AuthorizationGuardBlock",
"configuration": {
"requiredPermission": "admin_access",
"checkHierarchy": true
},
"transitions": {
"success": "business_validation",
"failure": "access_denied_handler"
}
},
"business_validation": {
"type": "BusinessRuleValidationBlock",
"configuration": {
"rules": [
{
"name": "BusinessHoursCheck",
"expression": "currentTime >= '09:00' && currentTime <= '17:00'",
"errorMessage": "Operation outside business hours"
}
]
},
"transitions": {
"success": "execute_business_logic",
"failure": "business_rule_violation_handler"
}
}
},
"execution": {
"startNode": "auth_guard",
"errorHandler": "global_error_handler",
"timeout": "00:30:00",
"retryPolicy": {
"maxRetries": 3,
"backoffStrategy": "exponential",
"initialDelay": "00:00:01"
}
},
"persistence": {
"provider": "SqlServer",
"connectionString": "Server=.;Database=WorkflowState;Trusted_Connection=True;",
"checkpointFrequency": "AfterEachNode"
}
}
Runtime Interpretation Engine:
public class JsonWorkflowEngine
{
private readonly IBlockFactory _blockFactory;
private readonly IGuardEvaluator _guardEvaluator;
private readonly IStateManager _stateManager;
public async Task<WorkflowResult> ExecuteFromJson(string jsonDefinition, object input)
{
// Parse and validate JSON
var workflowDefinition = JsonConvert.DeserializeObject<WorkflowDefinition>(jsonDefinition);
// Create execution context
var context = new JsonExecutionContext(input, workflowDefinition.Variables);
// Execute workflow using JSON-defined flow
return await ExecuteWorkflowAsync(workflowDefinition, context);
}
}
// Complex workflows built by composing focused blocks
var enterpriseWorkflow = new Workflow("enterprise_security_workflow")
.StartWith(new AuthenticationGuardBlock())
.Then(new AuthorizationGuardBlock("required_permission"))
.Then(new BusinessRuleValidationBlock())
.Then(new DataConsistencyGuardBlock())
.Then(new SecurityAuditBlock())
.Then(new ExecuteBusinessLogicBlock())
.Then(new PostExecutionValidationBlock())
.OnError(new IntelligentCorrectionBlock("adaptive_failure_handling"));
public class CompensatableBlock : IWorkflowBlock
{
public async Task<ExecutionResult> ExecuteAsync(ExecutionContext context)
{
var compensationPoint = await CreateCompensationPoint(context);
try
{
var result = await ExecuteBusinessLogic(context);
return ExecutionResult.Success("next_block");
}
catch (Exception ex)
{
await CompensateAsync(compensationPoint);
return ExecutionResult.Failure("error_handling_block");
}
}
}
The engine supports declarative workflow definitions through JSON, enabling runtime interpretation and business user control.
{
"id": "order-processing",
"name": "Order Processing",
"version": "1.0.0",
"description": "Complete order lifecycle with payment processing",
"startBlockName": "validate_order",
"blocks": {
"validate_order": {
"id": "validate_order",
"type": "BasicBlocks.LogBlock",
"nextBlockOnSuccess": "process_payment",
"nextBlockOnFailure": "reject_order"
},
"process_payment": {
"id": "process_payment",
"type": "BasicBlocks.LogBlock",
"nextBlockOnSuccess": "send_confirmation"
},
"send_confirmation": {
"id": "send_confirmation",
"type": "BasicBlocks.LogBlock"
},
"reject_order": {
"id": "reject_order",
"type": "BasicBlocks.LogBlock"
}
},
"variables": {
"minOrderAmount": 10.0,
"maxOrderAmount": 10000.0
}
}
var jsonEngine = new JsonWorkflowEngine(serviceProvider);
var jsonDefinition = File.ReadAllText("workflow-definition.json");
var orderData = new { OrderId = "ORD-001", Amount = 299.99m };
var result = await jsonEngine.ExecuteFromJsonAsync(jsonDefinition, orderData);
Guard conditions:
{
"guards": {
"preExecution": [
{
"type": "BusinessRuleGuard",
"condition": "order.Amount >= variables.minOrderAmount",
"errorBlock": "amount_too_low"
}
]
}
}
Conditional transitions:
{
"transitions": {
"success": {
"condition": "result.Amount > 1000",
"target": "high_value_processing"
},
"default": "standard_processing"
}
}
Variable substitution:
{
"configuration": {
"endpoint": "{{variables.apiBaseUrl}}/process",
"timeout": "{{variables.requestTimeout}}",
"headers": {
"Authorization": "Bearer {{context.user.token}}"
}
}
}
- Declarative: Define workflows without code compilation
- Business Control: Enable business user workflow modification
- Runtime Updates: Support hot-reloading of workflow definitions
- External Storage: Store definitions in databases or files
- A/B Testing: Test different workflow versions simultaneously
- Analytics: Monitor workflow performance and execution patterns
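The {{variables.*}} placeholders shown in the variable substitution example can be resolved with a simple token replacement. A simplified standalone sketch (the engine's real resolver also handles the {{context.*}} and {{result.*}} scopes; the type name here is illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Simplified sketch of {{variables.*}} token resolution.
static class TemplateResolver
{
    public static string Resolve(string template, IReadOnlyDictionary<string, object> variables)
    {
        return Regex.Replace(template, @"\{\{variables\.(\w+)\}\}", match =>
        {
            var key = match.Groups[1].Value;
            return variables.TryGetValue(key, out var value)
                ? value?.ToString() ?? string.Empty
                : match.Value; // leave unknown tokens untouched
        });
    }
}
```

Leaving unknown tokens in place (rather than replacing them with an empty string) makes misconfigured variable names easy to spot in logs.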
Basic user registration workflow:
var workflow = FlowCoreWorkflowBuilder.Create("user-registration", "User Registration")
.WithVersion("1.0.0")
.WithDescription("Simple user registration process")
.StartWith("BasicBlocks.LogBlock", "validate_input")
.OnSuccessGoTo("create_user")
.WithDisplayName("Validate User Input")
.And()
.AddBlock("BasicBlocks.LogBlock", "create_user")
.OnSuccessGoTo("send_email")
.WithDisplayName("Create User Account")
.And()
.AddBlock("BasicBlocks.LogBlock", "send_email")
.WithDisplayName("Send Welcome Email")
.And()
.Build();
var engine = new WorkflowEngine(blockFactory);
var input = new { Username = "john_doe", Email = "john@example.com" };
var result = await engine.ExecuteAsync(workflow, input);
Complete order lifecycle management with validation, payment processing, and inventory updates:
var workflow = FlowCoreWorkflowBuilder.Create("order-processing", "Order Processing")
.WithVersion("2.0.0")
.WithVariable("minOrderAmount", 10.0)
.WithVariable("maxOrderAmount", 10000.0)
.WithVariable("taxRate", 0.08m)
.WithVariable("shippingThreshold", 50.0m)
.StartWith("BasicBlocks.LogBlock", "validate_order")
.OnSuccessGoTo("process_payment")
.OnFailureGoTo("reject_order")
.WithDisplayName("Validate Order Details")
.And()
.AddBlock("BasicBlocks.LogBlock", "process_payment")
.OnSuccessGoTo("update_inventory")
.OnFailureGoTo("payment_failed")
.WithDisplayName("Process Payment")
.And()
.AddBlock("BasicBlocks.LogBlock", "update_inventory")
.OnSuccessGoTo("send_confirmation")
.WithDisplayName("Update Inventory")
.And()
.AddBlock("BasicBlocks.LogBlock", "send_confirmation")
.WithDisplayName("Send Order Confirmation")
.And()
.AddBlock("BasicBlocks.LogBlock", "reject_order")
.WithDisplayName("Reject Order")
.And()
.AddBlock("BasicBlocks.LogBlock", "payment_failed")
.WithDisplayName("Payment Failed")
.And()
.Build();
// Realistic order data with complete details
var orderData = new
{
OrderId = "ORD-2024-001",
CustomerId = "CUST-001",
Amount = 299.96m,
Items = new[]
{
new { ProductId = "PRD-001", Name = "Laptop", Quantity = 1, Price = 249.99m },
new { ProductId = "PRD-002", Name = "Wireless Mouse", Quantity = 1, Price = 29.99m },
new { ProductId = "PRD-003", Name = "USB-C Cable", Quantity = 2, Price = 9.99m }
},
ShippingAddress = new
{
Street = "123 Main St",
City = "San Francisco",
State = "CA",
ZipCode = "94102",
Country = "USA"
},
PaymentMethod = new
{
Type = "CreditCard",
Last4Digits = "4242",
ExpiryDate = "12/2025"
},
CustomerInfo = new
{
Email = "customer@example.com",
Phone = "+1-555-0123",
LoyaltyTier = "Gold"
}
};
var result = await engine.ExecuteAsync(workflow, orderData);
Enterprise customer onboarding with sequential profile creation, email verification, and automated notifications:
var workflow = FlowCoreWorkflowBuilder.Create("customer-onboarding", "Customer Onboarding")
.WithVersion("1.0.0")
.WithVariable("welcomeEmailTemplate", "Welcome to our platform!")
.WithVariable("verificationTimeout", TimeSpan.FromHours(24))
.WithVariable("defaultUserRole", "StandardUser")
.StartWith("BasicBlocks.LogBlock", "initialize_onboarding")
.OnSuccessGoTo("create_profile")
.WithDisplayName("Initialize Customer Onboarding")
.And()
.AddBlock("BasicBlocks.LogBlock", "create_profile")
.OnSuccessGoTo("verify_email")
.WithDisplayName("Create Customer Profile")
.And()
.AddBlock("BasicBlocks.LogBlock", "verify_email")
.OnSuccessGoTo("send_welcome")
.OnFailureGoTo("verification_failed")
.WithDisplayName("Send Email Verification")
.And()
.AddBlock("BasicBlocks.LogBlock", "send_welcome")
.OnSuccessGoTo("setup_preferences")
.WithDisplayName("Send Welcome Package")
.And()
.AddBlock("BasicBlocks.LogBlock", "setup_preferences")
.OnSuccessGoTo("schedule_followup")
.WithDisplayName("Setup User Preferences")
.And()
.AddBlock("BasicBlocks.LogBlock", "schedule_followup")
.OnSuccessGoTo("notify_admin")
.WithDisplayName("Schedule Follow-up Call")
.And()
.AddBlock("BasicBlocks.LogBlock", "notify_admin")
.WithDisplayName("Notify Admin Team")
.And()
.AddBlock("BasicBlocks.LogBlock", "verification_failed")
.WithDisplayName("Handle Verification Failure")
.And()
.Build();
// Comprehensive customer data
var customerData = new
{
CustomerId = "CUST-NEW-001",
CompanyName = "TechCorp Solutions",
Industry = "Technology",
CompanySize = "50-200 employees",
SubscriptionTier = "Enterprise",
PrimaryContact = new
{
Name = "Sarah Johnson",
Email = "sarah@techcorp.com",
Phone = "+1-555-0199",
Title = "CTO",
Department = "Engineering"
},
BillingInfo = new
{
BillingEmail = "billing@techcorp.com",
PaymentMethod = "Invoice",
BillingCycle = "Monthly"
},
Preferences = new
{
Timezone = "America/Los_Angeles",
Language = "en-US",
NotificationPreferences = new[] { "Email", "SMS" }
}
};
var result = await engine.ExecuteAsync(workflow, customerData);
Document processing pipeline with upload validation, OCR text extraction, classification, and storage:
var workflow = FlowCoreWorkflowBuilder.Create("document-processing", "Document Processing")
.WithVersion("3.0.0")
.WithVariable("maxFileSize", 10485760L) // 10MB
.WithVariable("supportedFormats", new[] { "PDF", "JPG", "PNG", "TIFF" })
.StartWith("BasicBlocks.LogBlock", "validate_document")
.OnSuccessGoTo("extract_text")
.OnFailureGoTo("validation_failed")
.WithDisplayName("Validate Document Upload")
.And()
.AddBlock("BasicBlocks.LogBlock", "extract_text")
.OnSuccessGoTo("classify_document")
.WithDisplayName("Extract Text (OCR)")
.And()
.AddBlock("BasicBlocks.LogBlock", "classify_document")
.OnSuccessGoTo("store_document")
.WithDisplayName("Classify Document")
.And()
.AddBlock("BasicBlocks.LogBlock", "store_document")
.OnSuccessGoTo("send_notification")
.WithDisplayName("Store Document")
.And()
.AddBlock("BasicBlocks.LogBlock", "send_notification")
.WithDisplayName("Send Processing Notification")
.And()
.Build();
var documentData = new
{
DocumentId = "DOC-2024-001",
FileName = "invoice_techcorp_001.pdf",
FileSize = 2048576L, // 2MB
FileFormat = "PDF"
};
var result = await engine.ExecuteAsync(workflow, documentData);
Continuous device monitoring with diagnostics, alert triggering, and automated responses:
var workflow = FlowCoreWorkflowBuilder.Create("iot-monitoring", "IoT Device Monitoring")
.WithVersion("1.0.0")
.WithVariable("temperatureThreshold", 75.0)
.WithVariable("humidityThreshold", 80.0)
.WithVariable("alertCooldown", TimeSpan.FromMinutes(15))
.WithVariable("criticalThreshold", 90.0)
.StartWith("BasicBlocks.LogBlock", "collect_telemetry")
.OnSuccessGoTo("analyze_metrics")
.OnFailureGoTo("connection_failed")
.WithDisplayName("Collect Device Telemetry")
.And()
.AddBlock("BasicBlocks.LogBlock", "analyze_metrics")
.OnSuccessGoTo("check_thresholds")
.WithDisplayName("Analyze Device Metrics")
.And()
.AddBlock("BasicBlocks.LogBlock", "check_thresholds")
.OnSuccessGoTo("normal_operation")
.OnFailureGoTo("trigger_alert")
.WithDisplayName("Check Warning Thresholds")
.And()
.AddBlock("BasicBlocks.LogBlock", "trigger_alert")
.OnSuccessGoTo("execute_action")
.WithDisplayName("Trigger Alert Notification")
.And()
.AddBlock("BasicBlocks.LogBlock", "execute_action")
.OnSuccessGoTo("log_incident")
.WithDisplayName("Execute Automated Action")
.And()
.AddBlock("BasicBlocks.LogBlock", "log_incident")
.OnSuccessGoTo("update_dashboard")
.WithDisplayName("Log Incident to Database")
.And()
.AddBlock("BasicBlocks.LogBlock", "update_dashboard")
.WithDisplayName("Update Monitoring Dashboard")
.And()
.AddBlock("BasicBlocks.LogBlock", "normal_operation")
.OnSuccessGoTo("update_dashboard")
.WithDisplayName("Normal Operation - Update Status")
.And()
.AddBlock("BasicBlocks.LogBlock", "connection_failed")
.WithDisplayName("Handle Connection Failure")
.And()
.Build();
// IoT device telemetry data
var deviceData = new
{
DeviceId = "IOT-SENSOR-001",
DeviceType = "EnvironmentalSensor",
Location = new
{
Building = "Warehouse A",
Floor = "2",
Zone = "Storage-North",
Coordinates = new { Latitude = 37.7749, Longitude = -122.4194 }
},
Telemetry = new
{
Temperature = 78.5,
Humidity = 65.2,
Pressure = 1013.25,
BatteryLevel = 87.3,
SignalStrength = -45
},
Status = new
{
IsOnline = true,
LastHeartbeat = DateTime.UtcNow.AddMinutes(-2),
FirmwareVersion = "2.1.4",
Uptime = TimeSpan.FromHours(156)
},
Alerts = new
{
TemperatureAlert = false,
HumidityAlert = false,
BatteryLowAlert = false,
ConnectionIssues = false
},
Metadata = new
{
InstallationDate = new DateTime(2024, 1, 15),
MaintenanceSchedule = "Quarterly",
NextMaintenanceDate = new DateTime(2024, 4, 15),
ResponsibleTeam = "Facilities"
}
};
var result = await engine.ExecuteAsync(workflow, deviceData);
Built-in guard examples:
// Business Hours Guard
var businessHoursGuard = new CommonGuards.BusinessHoursGuard(
TimeSpan.FromHours(9), TimeSpan.FromHours(17));
// Data Format Guard
var emailGuard = new CommonGuards.DataFormatGuard("Email", @"^[^\s@]+@[^\s@]+\.[^\s@]+$");
// Numeric Range Guard
var amountGuard = new CommonGuards.NumericRangeGuard("Amount", 100.0m, 10000.0m);
// Required Field Guard
var requiredGuard = new CommonGuards.RequiredFieldGuard("CustomerId", "Email", "Amount");
// Authorization Guard
var authGuard = new CommonGuards.AuthorizationGuard("admin", "administrator", "manager");
Optional block execution with runtime skipping:
var intelligentWorkflow = new Workflow("adaptive_order_processing")
.StartWith(new CustomerAnalysisBlock())
.Then(new OptionalFraudCheckBlock()) // Skip if low risk
.Then(new OptionalAddressValidationBlock()) // Skip if verified
.Then(new OptionalCreditCheckBlock()) // Skip if pre-approved
.Then(new PaymentProcessingBlock())
.Then(new OptionalEnhancedShippingBlock()) // Skip for standard orders
.Then(new OptionalPremiumSupportBlock()); // Skip for basic customers
var logger = serviceProvider.GetRequiredService<ILogger<WorkflowEngine>>();
var blockFactory = serviceProvider.GetRequiredService<IWorkflowBlockFactory>();
var engine = new WorkflowEngine(blockFactory, logger: logger);
var realisticInput = new
{
CustomerId = "CUST-PREMIUM-001",
OrderId = "ORD-2024-001",
Amount = 1250.50m,
Items = new[] { "Premium Laptop", "Wireless Mouse", "USB-C Hub" },
ShippingAddress = "123 Business Ave, Tech City, TC 12345",
PaymentMethod = "CreditCard",
CustomerTier = "Premium",
ProcessingPriority = "High"
};
var result = await engine.ExecuteAsync(workflowDefinition, realisticInput);
The same adaptive approach expressed as a JSON definition:
{
"id": "intelligent_order_processing",
"version": "2.0.0",
"description": "Adaptive order processing with optional validations",
"variables": {
"lowRiskThreshold": 100,
"highValueThreshold": 10000,
"trustedCustomerScore": 0.8
},
"nodes": {
"customer_analysis": {
"type": "CustomerRiskAssessmentBlock",
"configuration": {
"riskModel": "ML_v2.0",
"factors": ["orderHistory", "paymentHistory", "accountAge"]
},
"transitions": {
"success": "fraud_check"
}
},
"fraud_check": {
"type": "OptionalFraudCheckBlock",
"configuration": {
"skipConditions": [
"context.Customer.RiskScore < variables.trustedCustomerScore",
"context.Order.Amount < variables.lowRiskThreshold"
]
},
"transitions": {
"success": "address_validation",
"skip": "payment_processing"
}
},
"address_validation": {
"type": "OptionalAddressValidationBlock",
"configuration": {
"skipIfVerified": true,
"verificationService": "AddressService_v2"
},
"transitions": {
"success": "credit_check",
"skip": "payment_processing"
}
},
"credit_check": {
"type": "OptionalCreditCheckBlock",
"configuration": {
"skipForPreApproved": true,
"creditBureau": "Experian"
},
"transitions": {
"success": "payment_processing",
"skip": "payment_processing"
}
},
"payment_processing": {
"type": "AdaptivePaymentProcessingBlock",
"configuration": {
"providers": ["Stripe", "PayPal", "BankTransfer"],
"routingRules": {
"amount < 100": "PayPal",
"amount >= 10000": "BankTransfer",
"default": "Stripe"
}
},
"transitions": {
"success": "shipping_decision"
}
},
"shipping_decision": {
"type": "DecisionBlock",
"configuration": {
"conditions": [
{
"expression": "order.Amount >= variables.highValueThreshold",
"target": "enhanced_shipping"
},
{
"expression": "order.IsInternational",
"target": "international_shipping"
}
],
"defaultTarget": "standard_shipping"
}
},
"standard_shipping": {
"type": "ShippingBlock",
"configuration": {
"carrier": "UPS_Ground",
"signatureRequired": false
},
"transitions": {
"success": "completion_notification"
}
},
"enhanced_shipping": {
"type": "ShippingBlock",
"configuration": {
"carrier": "FedEx_Express",
"signatureRequired": true,
"insurance": "full_value"
},
"transitions": {
"success": "completion_notification"
}
},
"international_shipping": {
"type": "ShippingBlock",
"configuration": {
"carrier": "DHL_Express",
"customsDeclaration": true,
"tracking": "detailed"
},
"transitions": {
"success": "completion_notification"
}
},
"completion_notification": {
"type": "MultiChannelNotificationBlock",
"configuration": {
"channels": ["Email", "SMS"],
"template": "order_shipped",
"personalization": {
"customerName": "{{context.Customer.Name}}",
"orderNumber": "{{context.Order.Id}}",
"trackingNumber": "{{result.TrackingNumber}}"
}
},
"transitions": {
"success": "workflow_complete"
}
},
"workflow_complete": {
"type": "WorkflowCompletionBlock",
"configuration": {
"archiveData": true,
"triggerAnalytics": true
}
}
},
"execution": {
"startNode": "customer_analysis",
"timeout": "00:10:00",
"retryPolicy": {
"maxRetries": 2,
"backoffStrategy": "linear"
}
},
"persistence": {
"provider": "SqlServer",
"checkpointFrequency": "AfterEachNode"
}
}
public class WorkflowService
{
    private readonly JsonWorkflowEngine _engine;
    private readonly IWorkflowRepository _workflowRepository; // repository abstraction; interface name illustrative
public async Task<WorkflowResult> ExecuteOrderWorkflow(string customerId, OrderData order)
{
// Load workflow definition from database or file
var jsonDefinition = await _workflowRepository.GetWorkflowDefinition("intelligent_order_processing");
// Prepare input context
var input = new
{
CustomerId = customerId,
Order = order,
Timestamp = DateTime.UtcNow
};
// Execute workflow from JSON definition
return await _engine.ExecuteFromJson(jsonDefinition, input);
}
public async Task<WorkflowResult> ExecuteWorkflowById(string workflowId, object input)
{
var definition = await _workflowRepository.GetWorkflowDefinition(workflowId);
return await _engine.ExecuteFromJson(definition, input);
}
}
The framework includes a sophisticated guard system for pre/post-execution validation:
var businessHoursGuard = new CommonGuards.BusinessHoursGuard(
    TimeSpan.FromHours(9), TimeSpan.FromHours(17));
var emailGuard = new CommonGuards.DataFormatGuard(
    "Email", @"^[^\s@]+@[^\s@]+\.[^\s@]+$");
var amountGuard = new CommonGuards.NumericRangeGuard(
    "Amount", 10.0m, 5000.0m);
var requiredGuard = new CommonGuards.RequiredFieldGuard(
    "CustomerId", "Email", "Amount");
var authGuard = new CommonGuards.AuthorizationGuard(
    "admin", "administrator", "manager");
var workflow = FlowCoreWorkflowBuilder.Create("guarded-workflow", "Guarded Workflow")
.StartWith("BasicBlocks.LogBlock", "validate_request")
.OnSuccessGoTo("process_request")
.OnFailureGoTo("validation_failed")
.WithDisplayName("Validate Request with Guards")
.And()
.AddBlock("BasicBlocks.LogBlock", "process_request")
.OnSuccessGoTo("complete")
.WithDisplayName("Process Valid Request")
.And()
.Build();
For development and testing, use the in-memory state manager:
var engine = new WorkflowEngine(
blockFactory,
stateManager: new InMemoryStateManager(),
stateManagerConfig: new StateManagerConfig
{
CheckpointFrequency = CheckpointFrequency.AfterEachBlock
});
For production use with long-running workflows and fault-tolerant execution, use the SQLite-based state manager:
// Configure SQLite state manager
var dbPath = "workflow_states.db";
var stateManagerConfig = new StateManagerConfig
{
CheckpointFrequency = CheckpointFrequency.AfterEachBlock,
Compression = new StateCompressionConfig
{
Enabled = true,
MinSizeThreshold = 1024, // Compress data larger than 1KB
Algorithm = CompressionAlgorithm.GZip
}
};
var stateManager = new SQLiteStateManager(dbPath, stateManagerConfig, logger);
// Use with workflow engine
var engine = new WorkflowEngine(
blockFactory,
stateManager: stateManager,
logger: logger);
Production Configuration:
var productionConfig = new StateManagerConfig
{
// Enable automatic checkpointing after each block
CheckpointFrequency = CheckpointFrequency.AfterEachBlock,
// Enable compression for large state data
Compression = new StateCompressionConfig
{
Enabled = true,
MinSizeThreshold = 2048, // 2KB threshold
Algorithm = CompressionAlgorithm.GZip
},
// Enable encryption for sensitive data
Encryption = new StateEncryptionConfig
{
Enabled = true,
KeyIdentifier = "production-key-2024",
Algorithm = EncryptionAlgorithm.AES256
},
// State retention and cleanup
MaxStateAge = TimeSpan.FromDays(90),
// Enable versioning for state history
EnableVersioning = true,
MaxVersionsPerExecution = 5
};
var stateManager = new SQLiteStateManager(
"Data Source=/var/lib/flowcore/states.db",
productionConfig,
logger);
State Recovery Example:
// Save workflow state for recovery
var state = new Dictionary<string, object>
{
["currentStep"] = "payment_processing",
["orderAmount"] = 599.99m,
["customerId"] = "CUST-001",
["attemptCount"] = 1
};
await stateManager.SaveStateAsync(workflowId, executionId, state);
// Later, recover the state after interruption
var recoveredState = await stateManager.LoadStateAsync(workflowId, executionId);
if (recoveredState != null)
{
// Resume workflow from the saved checkpoint
Console.WriteLine($"Resuming from step: {recoveredState["currentStep"]}");
}
State Maintenance:
// Get database statistics
var stats = await stateManager.GetStatisticsAsync();
Console.WriteLine($"Total states: {stats.TotalStates}");
Console.WriteLine($"Database size: {stats.TotalSizeBytes} bytes");
Console.WriteLine($"Active executions: {stats.ActiveExecutions}");
// Cleanup old completed workflows
var deletedCount = await stateManager.CleanupOldStatesAsync(
DateTime.UtcNow.AddDays(-30),
status: WorkflowStatus.Completed);
Console.WriteLine($"Deleted {deletedCount} old workflow states");

The engine automatically persists state at each block boundary, based on the configured checkpoint frequency.
// Workflows can be suspended and resumed
await engine.SuspendWorkflowAsync(workflowId, executionId, context);
// Resume from checkpoint
var result = await engine.ResumeFromCheckpointAsync(
workflowDefinition, executionId);

- Persistent Storage: Workflow states are stored in a SQLite database for durability
- Automatic Schema Management: Database schema is created and managed automatically
- Compression: Optional GZip/Brotli compression for large state data
- Encryption: Optional AES-256 encryption for sensitive workflow data
- Concurrent Access: Thread-safe operations with proper locking
- State Cleanup: Built-in cleanup mechanisms for old workflow states
- Statistics: Real-time statistics about stored states and database usage
- Metadata Tracking: Rich metadata including status, timestamps, and custom fields
- Connection String Support: Flexible configuration with connection strings or file paths
- Use Encryption: Enable encryption for workflows handling sensitive data
- Enable Compression: Reduce storage costs with compression for large states
- Regular Cleanup: Implement scheduled cleanup of old completed workflows
- Monitor Database: Track state size and database growth over time
- Connection Pooling: Use connection pooling for high-throughput scenarios
- Regular Backups: Implement database backup strategy for disaster recovery
- Index Optimization: SQLite indexes are created automatically for performance
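The compression settings shown above (a size threshold plus GZip) can be sketched with plain `System.IO.Compression`. This is an illustrative helper, not FlowCore's internal implementation; the method name is invented for the example:

```csharp
using System;
using System.IO;
using System.IO.Compression;

public static class StateCompressionSketch
{
    // Compress only when the serialized state exceeds the threshold,
    // mirroring MinSizeThreshold in StateCompressionConfig.
    public static byte[] CompressIfLarge(byte[] data, int minSizeThreshold = 1024)
    {
        if (data.Length <= minSizeThreshold)
            return data; // small payloads are stored as-is

        using var output = new MemoryStream();
        using (var gzip = new GZipStream(output, CompressionLevel.Optimal))
        {
            gzip.Write(data, 0, data.Length);
        }
        return output.ToArray();
    }
}
```

Serialized workflow state (typically JSON with recurring keys) compresses well, which is why a 1-2KB threshold is a sensible default: below it, the GZip header overhead can outweigh the savings.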
var workflow = FlowCoreWorkflowBuilder.Create("resilient-workflow", "Resilient Workflow")
.StartWith("BasicBlocks.LogBlock", "process_critical_task")
.OnSuccessGoTo("continue_workflow")
.OnFailureGoTo("handle_error") // Custom error handling
.And()
.AddBlock("BasicBlocks.LogBlock", "handle_error")
.OnSuccessGoTo("retry_task")
.WithDisplayName("Handle Error and Retry")
.And()
.Build();

var executionConfig = new WorkflowExecutionConfig
{
RetryPolicy = new RetryPolicy
{
MaxRetries = 3,
BackoffStrategy = BackoffStrategy.Exponential,
InitialDelay = TimeSpan.FromSeconds(1)
}
};

var parallelWorkflow = new Workflow("parallel_processing")
.StartWith(new ParallelBlock(new[]
{
new ValidationBlock("validation_1"),
new ValidationBlock("validation_2"),
new ValidationBlock("validation_3")
}))
.Then(new AggregationBlock());

var conditionalWorkflow = new Workflow("conditional_processing")
.StartWith(new DecisionBlock(ctx =>
ctx.Input.Amount > 1000 ? "premium_path" : "standard_path"))
.Then(new PremiumProcessingBlock()) // Only for high-value orders
.Then(new StandardProcessingBlock()); // Default path

// Modify workflow at runtime based on conditions
var dynamicWorkflow = new Workflow("dynamic_processing")
.StartWith(new AdaptiveBlock(ctx =>
{
// Add blocks dynamically based on context
if (ctx.Input.RequiresSpecialHandling)
{
return ExecutionResult.Success("special_handling_block");
}
return ExecutionResult.Success("standard_processing_block");
}));

var currentTime = DateTime.UtcNow;
var isPeakHour = currentTime.Hour >= 9 && currentTime.Hour <= 17;
var adaptiveWorkflow = FlowCoreWorkflowBuilder.Create("adaptive-workflow", "Adaptive Processing")
.WithVariable("currentTime", currentTime)
.WithVariable("isPeakHour", isPeakHour)
.WithVariable("processingPriority", isPeakHour ? "Expedited" : "Standard")
.Build();

FlowCore includes a comprehensive code execution system that supports dynamic execution of inline C# code and pre-compiled assemblies, along with advanced debugging capabilities.
Execute C# code strings directly within workflows:
var inlineExecutor = new InlineCodeExecutor(securityConfig, logger);
// Execute simple C# code
var result = await inlineExecutor.ExecuteAsync(new CodeExecutionContext
{
Code = "return input.Amount * 1.1m;", // 10% markup
Parameters = new Dictionary<string, object> { ["input"] = order }
});

Enhanced support for async/await patterns with performance monitoring:
var asyncExecutor = new AsyncInlineCodeExecutor(securityConfig, logger);
// Execute async code with full async support
var asyncResult = await asyncExecutor.ExecuteAsyncCodeAsync(new AsyncCodeExecutionContext
{
Code = @"
await Task.Delay(1000); // Simulate async operation
var processed = await ProcessOrderAsync(input.Order);
return processed;
",
Parameters = new Dictionary<string, object> { ["input"] = orderData }
});

Execute methods from pre-compiled .NET assemblies with security validation:
var assemblyExecutor = new AssemblyCodeExecutor(securityConfig, logger);
// Execute code from a compiled assembly
var assemblyResult = await assemblyExecutor.ExecuteAsync(new CodeExecutionContext
{
AssemblyPath = "path/to/MyBusinessLogic.dll",
TypeName = "MyBusinessLogic.OrderProcessor",
MethodName = "ProcessOrder",
Parameters = new Dictionary<string, object> { ["order"] = order }
});

Integrated debugging capabilities for troubleshooting code execution:
var debugger = new BasicCodeExecutionDebugger(logger);
// Start a debug session
var session = await debugger.StartDebugSessionAsync(new DebugConfiguration
{
BreakOnFirstLine = true,
BreakOnExceptions = true,
EnableDetailedLogging = true
});
// Set breakpoints
await debugger.SetBreakpointAsync(new Breakpoint
{
LineNumber = 10,
Condition = "amount > 1000"
});
// Execute with debugging
var debugResult = await debugger.ExecuteWithDebuggingAsync(context);
// Inspect variables and step through code
await session.StepOverAsync();
var variableValue = await session.GetVariableValueAsync("totalAmount");

Debugging Features:
- Breakpoints: Set conditional breakpoints with hit counts
- Stepping: Step over, step into, step out functionality
- Variable Inspection: View and modify variables during execution
- Call Stack Tracking: Monitor execution flow and method calls
- Execution Tracing: Detailed trace of execution events
All code execution is secured with comprehensive validation:
- Namespace Validation: Restrict access to sensitive namespaces
- Type Validation: Prevent execution of dangerous types
- Assembly Security: Strong-name validation and whitelisting
- Sandboxing: AppDomain isolation for assembly execution
- Timeout Protection: Configurable execution timeouts
- Audit Logging: Complete security event logging
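The namespace and type restrictions above can be approximated in a few lines of plain C#. The check below is a simplified sketch for illustration, not FlowCore's actual validator:

```csharp
using System;
using System.Linq;

public static class NamespaceValidatorSketch
{
    // Allow a type only if its namespace matches an allowed prefix
    // and the type is not explicitly blocked.
    public static bool IsTypeAllowed(Type type, string[] allowedNamespaces, string[] blockedTypes)
    {
        var ns = type.Namespace ?? string.Empty;

        // Explicit block list wins over namespace allow list.
        if (blockedTypes.Contains(type.FullName))
            return false;

        return allowedNamespaces.Any(allowed =>
            ns == allowed || ns.StartsWith(allowed + ".", StringComparison.Ordinal));
    }
}
```

Note the ordering: the block list is checked first, so a type like `System.IO.File` stays blocked even though `System.IO` falls under the allowed `System` prefix.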
var securityConfig = new CodeSecurityConfig
{
AllowedNamespaces = ["System", "System.Linq", "MyCompany.Business"],
BlockedTypes = ["System.Reflection", "System.IO.File"],
MaxExecutionTime = TimeSpan.FromSeconds(30),
AllowDynamicAssemblyLoading = false
};

- WorkflowEngine: Main workflow execution engine
- WorkflowExecutor: Enhanced executor with monitoring and error handling
- FlowCoreWorkflowBuilder: Fluent API for building workflows
- JsonWorkflowEngine: JSON-based workflow execution
- ExecutionContext: Runtime execution context
- ExecutionResult: Block execution outcomes
- ICodeExecutor: Base interface for code executors
- IAsyncCodeExecutor: Interface for asynchronous code execution
- InlineCodeExecutor: Executes inline C# code strings
- AsyncInlineCodeExecutor: Executes async C# code with performance monitoring
- AssemblyCodeExecutor: Executes methods from pre-compiled assemblies
- CodeExecutionContext: Context for code execution
- AsyncCodeExecutionContext: Enhanced context for async execution
- ICodeExecutionDebugger: Interface for code debugging
- BasicCodeExecutionDebugger: Basic implementation of a code debugger
- IDebugSession: Represents a debugging session
- Breakpoint: Defines breakpoints in code
- DebugConfiguration: Configuration for debug sessions
- DebugExecutionResult: Result of a debug execution
- BasicBlocks.LogBlock: Logging and debugging
- BasicBlocks.WaitBlock: Delay execution
- BasicBlocks.SetStateBlock: Modify workflow state
- BasicBlocks.ConditionalBlock: Conditional logic
- BasicBlocks.FailBlock: Force failure scenarios
- CommonGuards.BusinessHoursGuard: Time-based validation
- CommonGuards.DataFormatGuard: Format validation
- CommonGuards.NumericRangeGuard: Range validation
- CommonGuards.RequiredFieldGuard: Required field validation
- CommonGuards.AuthorizationGuard: Permission validation
- IWorkflowBlock: Custom block implementation
- IWorkflowBlockFactory: Block creation and resolution
- IStateManager: State persistence abstraction
- IGuard: Custom guard implementation
- IWorkflowExecutor: Workflow execution abstraction
- IExecutionMonitor: Execution monitoring interface
- Type Safety: Compile-time validation prevents runtime errors
- Performance: Skip unnecessary steps based on runtime conditions
- Reliability: Built-in validation and error recovery
- Flexibility: Runtime adaptability with optional execution
- Observability: Complete audit trail and debugging support
- Maintainability: Composable blocks with clear responsibilities
- Scalability: Natural state persistence for long-running workflows
- Developer Experience: Intuitive API with full IntelliSense support
- Advanced Code Execution: Dynamic C# execution with security and performance
- Integrated Debugging: Step-through debugging for troubleshooting
- Declarative Definition: Define workflows without code compilation
- Runtime Modification: Update workflows without application restart
- Business User Control: Enable non-developers to modify workflows
- External Storage: Store workflow definitions in databases or files
- Version Management: Track and manage workflow definition versions
- A/B Testing: Test different workflow versions simultaneously
- Analytics: Monitor workflow performance and execution patterns
- Integration: Easy integration with external systems and tools
- Multi-Mode Execution: Support for inline, async, and assembly-based code
- Security First: Comprehensive validation and sandboxing
- Performance Optimized: Caching, async support, and monitoring
- Debugging Ready: Integrated tools for development and troubleshooting
- Flexible Integration: Seamless incorporation into workflows
FlowCore includes comprehensive security hardening across all execution modes:
- Type Safety: Compile-time validation prevents runtime type errors
- Guard Validation: Pre/post-execution validation at each block boundary
- State Isolation: Secure state management with access controls
- Dynamic Assembly Loading Protection: Assembly loading is disabled by default and requires explicit opt-in
- Assembly Whitelist Support: Only explicitly allowed assemblies can be loaded dynamically
- Strong-Name Signature Validation: All assemblies must have valid strong-name signatures
- Runtime Security Validation: Real-time validation of assembly security properties
- Namespace Restrictions: Prevent access to sensitive system namespaces
- Type Restrictions: Block execution of dangerous types (e.g., reflection, file I/O)
- Sandboxing: AppDomain isolation for assembly execution
- Timeout Protection: Configurable execution timeouts to prevent infinite loops
- Comprehensive Audit Trail: All security events are logged for compliance and monitoring
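Timeout protection, as listed above, typically amounts to cancelling user code after a deadline. The wrapper below is an illustrative sketch of that pattern (cooperative cancellation via the token), not the engine's implementation:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public static class TimeoutSketch
{
    // Run user code with a deadline, similar in spirit to
    // MaxExecutionTime in CodeSecurityConfig. The code must
    // honor the CancellationToken it receives.
    public static async Task<T> RunWithTimeoutAsync<T>(
        Func<CancellationToken, Task<T>> work, TimeSpan timeout)
    {
        // The token is cancelled automatically after `timeout`.
        using var cts = new CancellationTokenSource(timeout);
        try
        {
            return await work(cts.Token);
        }
        catch (OperationCanceledException) when (cts.IsCancellationRequested)
        {
            // Surface the cancellation as a timeout to the caller.
            throw new TimeoutException($"Code execution exceeded {timeout}.");
        }
    }
}
```

This cooperative form only works when the executed code observes the token; a hard stop for hostile code requires process- or sandbox-level isolation, which is what the AppDomain isolation above addresses.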
// Secure by default - dynamic loading disabled
var factory = new WorkflowBlockFactory(
serviceProvider: new ServiceCollection().BuildServiceProvider(),
securityOptions: new WorkflowBlockFactorySecurityOptions(),
logger: null);
// Explicit security configuration for trusted scenarios
var securityOptions = new WorkflowBlockFactorySecurityOptions
{
AllowDynamicAssemblyLoading = true,
AllowedAssemblyNames = new[] { "MyCompany.TrustedBlocks" },
ValidateStrongNameSignatures = true,
AllowedPublicKeyTokens = new[] { yourCompanyPublicKeyToken }
};
var secureFactory = new WorkflowBlockFactory(
serviceProvider: new ServiceCollection().BuildServiceProvider(),
securityOptions: securityOptions,
logger: null);

For comprehensive security guidelines, best practices, and safe extension patterns, see SECURITY.md.
The framework now includes 7 comprehensive examples that demonstrate real-world usage patterns:
- Dynamic Context: Peak hour detection and adaptive processing
- Realistic Data: Complete user profiles with timestamps and device info
- Adaptive Logic: Different processing based on registration volume
- Business Variables: Configurable delays and priority assignment
- Complete Lifecycle: From validation to fulfillment and notification
- Business Data: Orders, customers, payments, shipping, inventory
- Error Scenarios: Payment failures, inventory issues, shipping problems
- Stakeholder Integration: Notifications, confirmations, status updates
- Parallel Processing: Multiple setup tasks running simultaneously
- Enterprise Context: Company information, industry classification
- Stakeholder Management: Follow-up scheduling and notifications
- Business Logic: Subscription tiers, priorities, assigned managers
- Technical Implementation: OCR, classification, storage, thumbnails
- File Management: Size limits, format validation, metadata generation
- Department Integration: Cross-team workflows and processing
- Quality Assurance: Content validation and SEO optimization
- E-commerce Focus: Product information, pricing, inventory management
- Content Processing: Image handling, SEO generation, review workflows
- Publishing Logic: Auto-publish vs manual review based on business rules
- Stakeholder Communication: Catalog updates and notifications
- Financial Systems: Multiple payment gateways and retry mechanisms
- Risk Management: Circuit breakers, fallback options, authorization
- Recovery Patterns: Intelligent retry with exponential backoff
- Compliance: Audit trails, confirmation systems, webhook processing
- Real Guard Usage: Actual guard class instantiation and evaluation
- Business Rule Validation: Time-based, format, range, authorization checks
- Integration Patterns: Guards affecting workflow execution paths
- Enterprise Scenarios: Multi-guard validation with complex business rules
- Dynamic Code Execution: Runtime C# code for data transformation and analytics
- Workflow Orchestration: Multi-step pipeline with conditional transitions
- Error Handling & Recovery: Retry mechanisms and graceful failure handling
- State Persistence: Data flow and state management between blocks
- Security: Secure code execution with namespace restrictions
Pipeline Structure:
Input Validation → Data Transformation (CodeBlock) → Analytics (CodeBlock) → Report Generation → Notification
↓ (Error Path)
Retry Logic → Permanent Failure
Key Components:
- Transform Data Block: Cleans and enriches raw data, calculates discounts
- Analyze Data Block: Computes statistics (averages, min/max, totals)
- Error Recovery: Automatic retries with fallback to permanent failure
- State Management: Seamless data passing between execution steps
Usage:
// Run the complete example suite
await DataAnalyticsPipelineExample.RunAsync();

This example processes sample product data, demonstrates error handling with invalid entries, and showcases the engine's ability to handle complex business logic through dynamic code execution while maintaining security and reliability.
The DataAnalyticsPipeline follows FlowCore's core linked-list architecture where each block knows its next steps:
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ validate_input │───▶│ transform_data │───▶│ analyze_data │───▶│ generate_report │───▶│send_notification│───▶│pipeline_complete│
│ (LogBlock) │ │ (CodeBlock) │ │ (CodeBlock) │ │ (LogBlock) │ │ (LogBlock) │ │ (LogBlock) │
│ │ │ │ │ │ │ │ │ │ │ │
│ Next: transform_│ │ Next: analyze_d │ │ Next: generate_ │ │ Next: send_noti │ │ Next: pipeline_ │ │ Next: (End) │
│ Fail: input_vali│ │ Fail: transform │ │ Fail: analysis_ │ │ Fail: report_ge │ │ Fail: (End) │ │ Fail: (End) │
└─────────────────┘ └─────────────────┘ └─────────────────┘ └─────────────────┘ └─────────────────┘ └─────────────────┘
│ │ │ │
│ │ │ │
▼ ▼ ▼ ▼
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│input_validation_│ │transformation_ │ │ analysis_failed│ │report_generation│
│ failed │ │ failed │ │ (LogBlock) │ │ failed │
│ (LogBlock) │ │ (LogBlock) │ │ │ │ (LogBlock) │
│ │ │ │ │ Next: (End) │ │ │
│ Next: (End) │ │ Next: retry_tra │ │ Fail: (End) │ │ Next: (End) │
│ Fail: (End) │ │ Fail: transform │ └─────────────────┘ │ Fail: (End) │
└─────────────────┘ └─────────────────┘ └─────────────────┘
│
▼
┌─────────────────┐ ┌─────────────────┐
│retry_ │───▶│ analyze_data │
│transformation │ │ (CodeBlock) │
│ (LogBlock) │ │ │
│ │ │ Next: generate_ │
│ Next: analyze_d │ │ Fail: analysis_ │
│ Fail: max_retri │ └─────────────────┘
└─────────────────┘ │
│ │
▼ ▼
┌─────────────────┐ ┌─────────────────┐
│max_retries_ │ │ analysis_failed│
│exceeded │ │ (LogBlock) │
│ (LogBlock) │ │ │
│ │ │ Next: (End) │
│ Next: (End) │ │ Fail: (End) │
│ Fail: (End) │ └─────────────────┘
└─────────────────┘
| Block Name | Type | Purpose | Next on Success | Next on Failure |
|---|---|---|---|---|
| validate_input | LogBlock | Validate input data format | transform_data | input_validation_failed |
| transform_data | CodeBlock | Clean & transform raw data | analyze_data | transformation_failed |
| analyze_data | CodeBlock | Compute analytics/statistics | generate_report | analysis_failed |
| generate_report | LogBlock | Generate processing report | send_notification | report_generation_failed |
| send_notification | LogBlock | Send completion notification | pipeline_complete | (none) |
| pipeline_complete | LogBlock | Mark pipeline as complete | (none) | (none) |
| Block Name | Type | Purpose | Next on Success | Next on Failure |
|---|---|---|---|---|
| input_validation_failed | LogBlock | Handle input validation errors | (none) | (none) |
| transformation_failed | LogBlock | Handle transformation errors | retry_transformation | transformation_permanently_failed |
| retry_transformation | LogBlock | Retry failed transformation | analyze_data | max_retries_exceeded |
| analysis_failed | LogBlock | Handle analysis errors | (none) | (none) |
| report_generation_failed | LogBlock | Handle report generation errors | (none) | (none) |
| transformation_permanently_failed | LogBlock | Handle permanent transformation failures | (none) | (none) |
| max_retries_exceeded | LogBlock | Handle max retry attempts reached | (none) | (none) |
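The retry path above (transformation_failed → retry_transformation → max_retries_exceeded), combined with the RetryPolicy using BackoffStrategy.Exponential shown earlier, boils down to a simple delay schedule. The helper below is a sketch of that schedule, not the engine's implementation:

```csharp
using System;
using System.Collections.Generic;

public static class BackoffSketch
{
    // Exponential backoff: the delay doubles on each retry
    // (1s, 2s, 4s, ... for an initial delay of 1 second).
    public static List<TimeSpan> ExponentialDelays(TimeSpan initialDelay, int maxRetries)
    {
        var delays = new List<TimeSpan>();
        for (int attempt = 0; attempt < maxRetries; attempt++)
        {
            // 2^attempt multiplier, computed on ticks to stay exact.
            delays.Add(TimeSpan.FromTicks(initialDelay.Ticks * (1L << attempt)));
        }
        return delays;
    }
}
```

With MaxRetries = 3 and InitialDelay = 1s (the values from the earlier WorkflowExecutionConfig), the schedule is 1s, 2s, 4s before the workflow routes to max_retries_exceeded.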
// Cleans raw data, removes nulls/invalid entries
// Calculates discounted prices (10% off)
// Returns count of transformed items
var rawData = (List<Dictionary<string, object>>)context.GetState("RawData");
var transformedData = new List<Dictionary<string, object>>();
// ... data cleaning and transformation logic ...
context.SetState("TransformedData", transformedData);
return transformedData.Count;

// Computes statistics from transformed data
// Calculates average, min, max prices
// Returns count of analytics computed
var transformedData = (List<Dictionary<string, object>>)context.GetState("TransformedData");
var analytics = new Dictionary<string, object>();
// ... statistical calculations ...
context.SetState("Analytics", analytics);
return analytics.Count;

- Clear Execution Flow: Each block explicitly defines its next steps
- Error Recovery: Failed blocks route to appropriate error handling
- State Propagation: Data flows naturally from block to block via shared state
- Retry Logic: Transformation failures trigger retry attempts before permanent failure
- Type Safety: Compile-time validation ensures block connections are valid
- Composability: Blocks can be reused and combined in different workflows
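The statistics the analyze_data block computes (averages, min/max, totals) can be reproduced with plain LINQ. The helper below is a self-contained sketch mirroring the dictionary shape used in the snippets above; the key names are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class AnalyticsSketch
{
    // Summarize a price series into the kind of analytics dictionary
    // the analyze_data block stores in workflow state.
    public static Dictionary<string, object> Summarize(IEnumerable<decimal> prices)
    {
        var list = prices.ToList();
        return new Dictionary<string, object>
        {
            ["count"]   = list.Count,
            ["total"]   = list.Sum(),
            ["average"] = list.Average(),
            ["min"]     = list.Min(),
            ["max"]     = list.Max()
        };
    }
}
```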
This structure showcases how FlowCore's linked-list approach provides predictable, debuggable workflow execution while maintaining flexibility for complex business scenarios.
- Dynamic Variables: Runtime condition evaluation and assignment
- Adaptive Processing: Different execution paths based on current state
- Intelligent Timing: Conditional delays and processing priorities
- Load Balancing: Automatic resource allocation based on demand
- Realistic Identifiers: Meaningful IDs like "ORD-2024-001", "CUST-PREMIUM-001"
- Business Context: Industry terminology, department structures, realistic workflows
- Complete Data Models: Full object models representing real business entities
- Error Scenarios: Practical failure modes with appropriate recovery strategies
- Multi-System Coordination: Integration between different business systems
- Stakeholder Communication: Notifications, approvals, status updates
- Audit Compliance: Complete tracking and logging for regulatory requirements
- Performance Optimization: Load balancing, caching, and resource management
Type-Safe Workflow Definitions:
- No more runtime type inference errors
- Compile-time validation of workflow structure
- IntelliSense support for workflow creation
- Clear execution flow visualization
Declarative Workflow Definitions:
- Complete workflow definition in JSON format
- Runtime interpretation and execution
- Dynamic workflow loading and modification
- Separation of workflow logic from orchestration
Key Innovation Matrix:
| Feature | Code-Based | JSON-Based | Benefit |
|---|---|---|---|
| Definition | Compile-time | Runtime | Flexibility |
| Modification | Recompile required | Hot-reload | Agility |
| Storage | Embedded code | External files/DB | Manageability |
| Versioning | Code versioning | Definition versioning | Control |
| Testing | Code testing | Definition testing | Separation |
| Analytics | Code instrumentation | Execution tracking | Insights |
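To make the JSON-based column of the matrix concrete, a minimal declarative definition might look like the fragment below. The field names are illustrative assumptions, not FlowCore's documented schema; the block type and transition names come from the examples in this document:

```json
{
  "id": "order-processing",
  "name": "Order Processing",
  "startBlock": "validate_input",
  "blocks": [
    {
      "id": "validate_input",
      "type": "BasicBlocks.LogBlock",
      "onSuccess": "transform_data",
      "onFailure": "input_validation_failed"
    }
  ]
}
```

A definition in this shape can live in a file or database and be hot-reloaded, which is what enables the runtime modification and versioning rows of the matrix.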
The framework supports both programming paradigms:
Code-First Approach:
var workflow = new Workflow("process")
.StartWith(new ValidationBlock())
.Then(new ProcessingBlock())
.Then(new NotificationBlock());

JSON-First Approach:
var engine = new JsonWorkflowEngine();
var result = await engine.ExecuteFromJson(jsonDefinition, inputData);

Hybrid Approach:
// Load JSON workflow and extend with code blocks
var jsonWorkflow = await LoadWorkflowFromJson("base_process");
var extendedWorkflow = jsonWorkflow
.Then(new CustomPostProcessingBlock())
.Then(new CustomNotificationBlock());

This linked-list approach provides type-safe, predictable workflow orchestration that feels natural to .NET developers while solving the critical limitations of existing workflow solutions. The addition of JSON-based workflow definitions extends this capability to support declarative, runtime-modifiable workflows that enable true business user control and operational agility.