
Architectural patterns for building reliable, grounded LLM applications with Apache Camel. Features examples for Generative Parsing, Semantic Routing, and context injection.

ibek/camel-openai-patterns

Camel OpenAI Integration Patterns


Turn LLMs into boring, effective semantic processors.

This repository demonstrates architectural patterns for building reliable LLM applications using Apache Camel. Rather than relying on brittle prompt engineering alone, these examples show how to orchestrate interactions to ensure structured outputs, correct routing, and contextual integrity.

📖 Read the companion article: Making LLMs Boring: From Chatbots to Semantic Processors


Table of Contents

  • Architectural Patterns
  • Prerequisites
  • Install Camel Launcher
  • Quick Start
  • Repository Structure
  • How to Run
  • Using Adapters
  • Recommended Models
  • Developer Tips
  • Contributing
  • License

Architectural Patterns

  1. Generative Parsing: Constraining LLM output to valid formats (JSON, XML, POJOs) for seamless integration with downstream systems.
  2. Semantic Routing: Directing traffic flow based on the intent of the user's prompt rather than static headers.
  3. Grounded Pipelines: Injecting context to ensure the LLM responds based on specific, retrieved data rather than hallucinations.
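
All three patterns reduce to small Camel routes. As a rough sketch of pattern 1, Generative Parsing — the openai endpoint URI and options below are placeholders, not copied from this repository's YAML files; see the runnable quickstarts under generative-parsing/ for the real routes:

```yaml
# Illustrative shape of a generative-parsing route. The openai endpoint
# options are placeholders; the quickstart *.camel.yaml files define the
# actual ones used in this repository.
- route:
    id: generative-parsing-sketch
    from:
      uri: stream:in                # Console adapter: read one prompt per line
      steps:
        - to: openai:chat           # model is prompted to emit schema-valid JSON
        - to: stream:out            # structured output, ready for downstream systems
```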

Prerequisites

Before running the examples, ensure you have the following:

  • Java 17 or 21
  • Inference Server: Any server exposing OpenAI-compatible endpoints.
    • Local Runners: Ollama, vLLM, Llama.cpp, LocalAI.
    • Cloud Providers: OpenAI Platform, Groq, Mistral, or others (Amazon Bedrock/Google Vertex if using an OpenAI-compatible gateway).

Install Camel Launcher

You need the Camel CLI to run these examples; see the Camel Launcher documentation for installation details.

Linux / macOS

# Download and unpack the Camel launcher
wget https://repo1.maven.org/maven2/org/apache/camel/camel-launcher/4.17.0/camel-launcher-4.17.0-bin.zip
unzip camel-launcher-*-bin.zip
cd camel-launcher-*/
chmod +x bin/camel.sh
# Link the launcher as `camel` (make sure $HOME/.local/bin is on your PATH)
mkdir -p "$HOME/.local/bin"
ln -sf "$PWD/bin/camel.sh" "$HOME/.local/bin/camel"

Windows

  1. Download and unzip the package.
  2. Add the camel-launcher/bin directory to your System PATH.

Verify Installation

camel --version

Quick Start

1. Configure your environment:

export OPENAI_API_KEY=your-api-key
export OPENAI_BASE_URL=http://localhost:11434/v1  # Ollama example
export OPENAI_MODEL=ministral-3:8b

Note: If using the real OpenAI API, set OPENAI_BASE_URL to https://api.openai.com/v1.
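
Before running anything, a quick pre-flight check can save a confusing stack trace. This helper is a convenience sketch, not part of the repository:

```shell
# Report which of the given environment variables are still unset.
check_env() {
  for name in "$@"; do
    eval "value=\${$name:-}"        # indirect lookup, POSIX-sh compatible
    if [ -z "$value" ]; then
      echo "unset: $name"
    fi
  done
}

check_env OPENAI_API_KEY OPENAI_BASE_URL OPENAI_MODEL
```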

2. Run your first example:

cd generative-parsing/classify-leaf-node
echo "I lost my credit card and need to block it immediately" | camel run --source-dir=./

3. See structured output:

{
    "rationale": "The user explicitly states they lost their credit card and requests immediate blocking, which is a critical security action to prevent unauthorized use. This falls under the highest priority category of 'Security_and_Access' due to the urgency and potential fraud risk associated with a lost card.",
    "path": "Security_and_Access > Fraud_and_Disputes > Report_Lost_or_Stolen_Card",
    "confidence": 1.0,
    "status": "ACCEPTED"
}
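
The "status" field makes this output easy to branch on in a shell pipeline. The sample response above is inlined here so the snippet runs standalone; in practice you would capture the output of `camel run` instead:

```shell
# Branch on the classifier's status field using plain POSIX tools.
response='{"path": "Security_and_Access > Fraud_and_Disputes > Report_Lost_or_Stolen_Card", "confidence": 1.0, "status": "ACCEPTED"}'
status=$(printf '%s' "$response" | sed -n 's/.*"status": *"\([^"]*\)".*/\1/p')
if [ "$status" = "ACCEPTED" ]; then
  echo "routing to fulfilment"      # hypothetical downstream step
fi
```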

Repository Structure

The project is organized by pattern. Each directory contains a standalone quickstart with its own README and runnable Camel YAML files.

├── generative-parsing/       # Pattern 1: Structured Data Extraction
│   ├── classify-leaf-node/   # Deep taxonomy classification
│   ├── entity-resolution/    # Fuzzy matching to canonical IDs
│   └── pii-redaction/        # Identify and mask PII
├── semantic-routing/         # Pattern 2: Intent-based Routing
│   ├── detect-gaps/          # Compliance gap analysis
│   ├── moderation-policy/    # Content safety filtering
│   └── risk-scoring/         # Quantitative risk assessment
├── grounded-pipelines/       # Pattern 3: Context Injection
│   └── database-query/       # Air-gapped SQL querying
└── adapters/                 # Pluggable Input/Output definitions

How to Run

Navigate to a specific pattern directory and follow its README.md to use the camel run command.

Example: Running the Leaf Node Classification example

cd generative-parsing/classify-leaf-node
echo "I noticed a charge from a vendor in London that I never visited." | camel run --source-dir=./

Quiet Mode (No Logging)

If you want to focus on the output without Camel logs:

camel run --source-dir=./ --logging-level=OFF

Using Adapters

By default, all examples use the Console Adapter (Standard Input/Output) for simple CLI interactivity.

You can switch the interface to HTTP, Kafka, or File by replacing the adapter route in the *.camel.yaml file. See the adapters/README.md for detailed instructions.

Adapter   Use Case                   Endpoint
Console   CLI testing, piped input   stream:in / stream:out
HTTP      REST API integration       platform-http:/api/...
Kafka     Event-driven streaming     kafka:topic-name
File      Batch processing           file:data/inbox
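
Swapping an adapter changes only the consumer endpoint; the processing steps stay the same. A sketch using the endpoints above — the route structure and the /api/classify path are illustrative, and adapters/README.md has the repository's actual definitions:

```yaml
# Console adapter (default):
- route:
    from:
      uri: stream:in
      steps:
        - to: direct:process            # hypothetical shared processing route

# HTTP adapter alternative: same steps, different consumer
- route:
    from:
      uri: platform-http:/api/classify  # hypothetical path
      steps:
        - to: direct:process
```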

Recommended Models

These patterns work with any OpenAI-compatible model. For cost-effective local processing, we recommend:

Model                 Size / Active   Notes
Ministral-3-8B        8B              Excellent for structured output tasks
Qwen3-VL-8B           8B              Strong reasoning, multilingual
Granite-4.0-H-Small   32B / 9B        IBM's enterprise-focused model

Developer Tips

Visual Route Design

Don't just write YAML by hand! Use Kaoto for designing your Camel routes visually, or leverage AI coding assistants (Claude, Cursor, Copilot) with prompts like:

Create a Camel 4.17 YAML route that monitors a folder for new text files,
sends small files (<5KB) to the camel-openai component for summarization,
and saves the response to an output folder.
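
A route like the one that prompt describes might come out roughly as follows. This is a hand-written sketch, not verified assistant output; the openai endpoint options are placeholders:

```yaml
- route:
    from:
      uri: file:data/inbox                    # watch a folder for new files
      steps:
        - choice:
            when:
              - simple: "${file:size} < 5120" # only small files (< 5 KB)
                steps:
                  - to: openai:chat           # summarization (placeholder options)
                  - to: file:data/outbox
            otherwise:
              steps:
                - log: "Skipping large file ${file:name}"
```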

Export to Production

Convert these examples to Maven/Gradle projects for Quarkus or Spring Boot:

camel export --runtime=quarkus --directory=./my-project

Contributing

Contributions are welcome! Please read the Contributing Guide for details on:

  • Setting up your development environment
  • Submitting bug reports and feature requests
  • Creating new patterns

License

This project is licensed under the MIT License - see the LICENSE file for details.
