It’s the kind of minimalist “serverless-but-not-really” toy that spins up containers on demand, pretends to orchestrate things, and confidently behaves like it belongs in a cloud brochure. It fetches images, fires them up, returns results, and then casually wanders off like it just performed a miracle. No promises, no guarantees, just vibes, enthusiasm, and the sheer audacity to call itself an execution platform. Think of it as Lambda's chaotic younger sibling who runs on coffee, pulls random images on demand, skips production readiness checks, and proudly says, "Scaling? Never heard of her."
Features:
- Function registration and management
- Docker-based function execution
- Synchronous function invocation via RESTful API
- Async invocation infrastructure
- Prometheus metrics integration with invocation counters and duration histograms
- Timeout handling with configurable timeouts
- JSON-based function persistence
- Docker image upload support
- Concurrent function execution support
Prerequisites:
- Go 1.24.5 or higher
- Docker Engine running
- Clone the repository:
git clone <your-repo-url>
cd Mini-Lambda-System
- Install dependencies:
go mod tidy
- Build and run the server:
go run .
The server will start on port 8300.
First, let's build the included Python function:
# Navigate to the function directory
cd function
# Build the Docker image
docker build -f hello-image.dockerfile -t hello-python .
# Go back to root directory
cd ..
Next, register the function:
curl -X POST http://localhost:8300/functions \
-H "Content-Type: application/json" \
-d '{
"name": "hello-python",
"image": "hello-python"
}'
Response:
{
"id": "5ac94553-cd05-4cc0-b657-acae0c6559e1",
"name": "hello-python",
"image": "hello-python",
"created_at": "2025-01-16T16:24:48.719407+05:45"
}
List all registered functions:
curl http://localhost:8300/functions
Invoke the function:
curl -X POST http://localhost:8300/invoke/5ac94553-cd05-4cc0-b657-acae0c6559e1 \
-H "Content-Type: application/json" \
-d '{
"event": {
"message": "Hello from Mini Lambda!"
},
"timeout": 30
}'
Response:
{
"result": "{\"message\": \"Processed: Hello from Mini Lambda!\", \"input_received\": {\"message\": \"Hello from Mini Lambda!\"}}\n",
"duration": 1250,
"timestamp": "2025-01-16T16:30:15.123Z"
}
You can upload Docker images directly to the system:
# Save your Docker image to a tar file
docker save my-function:latest > my-function.tar
# Upload the image
curl -X POST http://localhost:8300/images \
-F "image=@my-function.tar"curl http://localhost:8300/images- POST
/functions - Body:
{ "name": "function-name", "image": "docker-image-name" }
- GET
/functions - Returns array of all registered functions
- POST
/invoke/{function-id} - Body:
{ "event": {}, // Any JSON payload "timeout": 120 // Timeout in seconds (optional, default: 120) } - Returns immediate results with output, duration, and timestamp
- POST
/images - Form Data:
image: Docker image tar file
- Uploads and loads a Docker image into the local Docker registry
- GET
/images - Returns array of all available Docker images in the local registry
- GET
/metrics - Returns Prometheus metrics including:
Total_Invocations: Counter of function invocations by function nameInvocation Duration ms: Histogram of invocation durations in milliseconds
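All of these endpoints can also be called programmatically. Below is a minimal Go sketch that invokes a registered function via POST /invoke/{function-id}; the function ID is a placeholder, and the response is decoded generically since only the fields shown above (result, duration, timestamp) are documented.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// invokeRequest mirrors the documented body of POST /invoke/{function-id}.
type invokeRequest struct {
	Event   map[string]any `json:"event"`
	Timeout int            `json:"timeout,omitempty"`
}

func main() {
	// Placeholder ID: use the "id" returned when registering the function.
	functionID := "5ac94553-cd05-4cc0-b657-acae0c6559e1"

	body, _ := json.Marshal(invokeRequest{
		Event:   map[string]any{"message": "Hello from Go"},
		Timeout: 30,
	})

	resp, err := http.Post("http://localhost:8300/invoke/"+functionID,
		"application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode the response generically (result, duration, timestamp).
	var out map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", out)
}
```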
How it works:
- Registration: Functions are registered with a name and Docker image
- Invocation: When invoked, the system:
  - Creates a new Docker container from the specified image
  - Passes the JSON payload via stdin
  - Captures stdout/stderr as output and logs
  - Measures execution duration
  - Records metrics
  - Cleans up the container
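A single invocation along these lines could be implemented with the Docker CLI and os/exec, roughly as in the sketch below. The helper name and flags are illustrative, not taken from the project's source: -i pipes the payload into the container's stdin, and --rm removes the container afterwards.

```go
package main

import (
	"bytes"
	"context"
	"fmt"
	"os/exec"
	"time"
)

// runFunction is a hypothetical helper: it starts a throwaway container from
// the given image, writes the JSON payload to its stdin, captures stdout and
// stderr, and enforces the timeout via the context.
func runFunction(image string, payload []byte, timeout time.Duration) (string, error) {
	ctx, cancel := context.WithTimeout(context.Background(), timeout)
	defer cancel()

	cmd := exec.CommandContext(ctx, "docker", "run", "-i", "--rm", image)
	cmd.Stdin = bytes.NewReader(payload)

	var stdout, stderr bytes.Buffer
	cmd.Stdout = &stdout
	cmd.Stderr = &stderr

	start := time.Now()
	err := cmd.Run()
	fmt.Printf("duration: %d ms\n", time.Since(start).Milliseconds())

	if err != nil {
		return "", fmt.Errorf("invocation failed: %w (stderr: %s)", err, stderr.String())
	}
	return stdout.String(), nil
}

func main() {
	out, err := runFunction("hello-python", []byte(`{"message":"hi"}`), 30*time.Second)
	if err != nil {
		panic(err)
	}
	fmt.Print(out)
}
```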
The system includes backend support for asynchronous invocations with:
- Invocation Status Tracking: PENDING → RUNNING → COMPLETED/FAILED
- Result Storage: Output, logs, duration, and error information
- Concurrent Execution: Multiple functions can run simultaneously
- Thread-Safe Operations: Mutex-protected invocation management
Note: Async API endpoints are not yet implemented but the infrastructure is ready.
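A minimal sketch of what such mutex-protected status tracking might look like is shown below; the type and field names are assumptions for illustration, not the project's actual code.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// InvocationStatus models the PENDING -> RUNNING -> COMPLETED/FAILED lifecycle.
type InvocationStatus string

const (
	StatusPending   InvocationStatus = "PENDING"
	StatusRunning   InvocationStatus = "RUNNING"
	StatusCompleted InvocationStatus = "COMPLETED"
	StatusFailed    InvocationStatus = "FAILED"
)

// Invocation holds the result of one asynchronous run.
type Invocation struct {
	ID       string
	Status   InvocationStatus
	Output   string
	Logs     string
	Duration time.Duration
	Err      string
}

// InvocationStore is a thread-safe, in-memory registry of invocations.
type InvocationStore struct {
	mu          sync.Mutex
	invocations map[string]*Invocation
}

func NewInvocationStore() *InvocationStore {
	return &InvocationStore{invocations: make(map[string]*Invocation)}
}

// Put records or updates an invocation under the lock.
func (s *InvocationStore) Put(inv *Invocation) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.invocations[inv.ID] = inv
}

// Get returns a snapshot of an invocation, if it exists.
func (s *InvocationStore) Get(id string) (Invocation, bool) {
	s.mu.Lock()
	defer s.mu.Unlock()
	if inv, ok := s.invocations[id]; ok {
		return *inv, true
	}
	return Invocation{}, false
}

func main() {
	store := NewInvocationStore()
	store.Put(&Invocation{ID: "abc123", Status: StatusPending})
	store.Put(&Invocation{ID: "abc123", Status: StatusCompleted, Duration: 1250 * time.Millisecond})
	if inv, ok := store.Get("abc123"); ok {
		fmt.Println(inv.Status, inv.Duration)
	}
}
```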
To create a custom Python function:
- Create your function file (my-function.py):
import json
import sys
from datetime import datetime
# Read input from stdin
input_data = sys.stdin.read()
event = json.loads(input_data) if input_data.strip() else {}
# Your function logic here
result = {
"message": f"Hello {event.get('name', 'World')}!",
"timestamp": str(datetime.now())
}
# Output result as JSON
print(json.dumps(result))
- Create Dockerfile (my-function.dockerfile):
FROM python:3.9-alpine
COPY my-function.py /app/function.py
WORKDIR /app
CMD ["python3", "function.py"]- Build and register:
docker build -f my-function.dockerfile -t my-function .
curl -X POST http://localhost:8300/functions \
-H "Content-Type: application/json" \
-d '{"name": "my-function", "image": "my-function"}'- Create
package.json:
{
"name": "node-function",
"version": "1.0.0",
"main": "index.js"
}
- Create index.js:
let input = "";
process.stdin.on("data", (chunk) => (input += chunk));
process.stdin.on("end", () => {
const event = input ? JSON.parse(input) : {};
const result = {
message: `Processed: ${event.message || "No message"}`,
nodeVersion: process.version,
};
console.log(JSON.stringify(result));
});
- Create Dockerfile:
FROM node:18-alpine
WORKDIR /app
COPY package.json index.js ./
CMD ["node", "index.js"]Your Docker function must:
- Read JSON input from stdin
- Output JSON result to stdout
- Exit with code 0 on success
- Handle empty/invalid input gracefully
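As a further illustration, the same contract can be met in Go. This is a sketch under the assumptions above (read JSON from stdin, print JSON to stdout, exit non-zero on failure), not part of the project; it would be packaged with a Dockerfile whose CMD runs the compiled binary, then built and registered exactly like the Python and Node.js examples.

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"os"
)

func main() {
	// Read the entire JSON event from stdin.
	raw, err := io.ReadAll(os.Stdin)
	if err != nil {
		fmt.Fprintln(os.Stderr, "failed to read stdin:", err)
		os.Exit(1)
	}

	// Handle empty or invalid input gracefully by falling back to an empty event.
	event := map[string]any{}
	if len(raw) > 0 {
		if err := json.Unmarshal(raw, &event); err != nil {
			event = map[string]any{}
		}
	}

	msg, _ := event["message"].(string)
	if msg == "" {
		msg = "No message"
	}

	// Write the JSON result to stdout; a clean return exits with code 0.
	result := map[string]any{"message": "Processed: " + msg}
	if err := json.NewEncoder(os.Stdout).Encode(result); err != nil {
		os.Exit(1)
	}
}
```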
Access Prometheus metrics at http://localhost:8300/metrics to monitor:
- Total Invocations: Total_Invocations counter by function name
- Invocation Duration: Invocation Duration ms histogram with linear buckets (10ms-1000ms)
- Function Performance: Per-function execution statistics
Example metrics output:
# HELP Total_Invocations Number of function invocations
# TYPE Total_Invocations counter
Total_Invocations{function="hello-python"} 5
# HELP Invocation Duration ms Invocation latency in ms
# TYPE Invocation Duration ms histogram
Invocation Duration ms_bucket{function="hello-python",le="10"} 0
Invocation Duration ms_bucket{function="hello-python",le="110"} 2
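Metrics of this shape could be declared with prometheus/client_golang roughly as in the sketch below. The bucket layout mirrors the example output above; note that Prometheus metric names cannot contain spaces, so the histogram in this sketch uses an underscore-separated placeholder name rather than the label shown in the output.

```go
package main

import (
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

var (
	// Counter of function invocations, labelled by function name.
	totalInvocations = promauto.NewCounterVec(prometheus.CounterOpts{
		Name: "Total_Invocations",
		Help: "Number of function invocations",
	}, []string{"function"})

	// Histogram of invocation latency with linear buckets (10ms, 110ms, ..., 910ms).
	invocationDuration = promauto.NewHistogramVec(prometheus.HistogramOpts{
		Name:    "invocation_duration_ms",
		Help:    "Invocation latency in ms",
		Buckets: prometheus.LinearBuckets(10, 100, 10),
	}, []string{"function"})
)

func main() {
	// Record a sample invocation.
	totalInvocations.WithLabelValues("hello-python").Inc()
	invocationDuration.WithLabelValues("hello-python").Observe(125)

	// Expose the metrics endpoint, as the server does at /metrics.
	http.Handle("/metrics", promhttp.Handler())
	http.ListenAndServe(":8300", nil)
}
```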
Troubleshooting tips:
- Ensure the function ID exists by calling /functions
- Check that the function was properly registered
- Verify Docker is running: docker ps
- Check that the Docker image exists: docker images
- Ensure the Docker image is executable and has a proper CMD/ENTRYPOINT
- Increase the timeout parameter in your invoke request
- Check function logs for performance bottlenecks
- Monitor execution time via /metrics
- Ensure the uploaded file is a valid Docker tar export
- Check file size limits and available disk space
- Verify the tar file was created with docker save
Planned enhancements:
- REST API endpoints for async invocation management
- Function versioning support
- Resource limits and quotas
- Log aggregation and search
- Function scaling and load balancing
- Authentication and authorization
- Function marketplace/registry