Bogo is a fast, compact binary serialization format with embedded key fields and type information, providing efficient encoding and decoding with zero-copy operations.
Bogo is ideal when you need JSON-like simplicity with binary-format performance, especially for selective field deserialization of complex data types.
- JSON-Compatible API: Drop-in replacement for `encoding/json` with familiar `Marshal`/`Unmarshal` functions
- High Performance: Up to 3x faster deserialization compared to JSON
- Compact Binary Format: Efficient variable-length encoding reduces payload size
- Comprehensive Type Support: Supports all primitive types plus lists, objects, and binary data
- Streaming API: Memory-efficient streaming encoding and decoding
- Configurable Validation: Optional UTF-8 validation, depth limits, and size constraints
- Field-Specific Optimization: Selective field decoding with up to 334x faster decoding and 113x lower memory use
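Bogo's exact wire layout is documented in its Binary Format Specification. Purely as an illustration of why variable-length integer encoding keeps payloads compact (this is a generic LEB128-style varint, not necessarily bogo's exact scheme):

```go
package main

import "fmt"

// putUvarint writes v as a LEB128-style varint: seven payload bits per
// byte, with the high bit set on every byte except the last. Small
// values therefore occupy a single byte instead of a fixed 8-byte word.
func putUvarint(buf []byte, v uint64) int {
	i := 0
	for v >= 0x80 {
		buf[i] = byte(v) | 0x80
		v >>= 7
		i++
	}
	buf[i] = byte(v)
	return i + 1
}

func main() {
	buf := make([]byte, 10)
	fmt.Println(putUvarint(buf, 42))    // 1 byte
	fmt.Println(putUvarint(buf, 300))   // 2 bytes
	fmt.Println(putUvarint(buf, 1<<32)) // 5 bytes
}
```

The standard library offers the same scheme as `encoding/binary.PutUvarint` if you want to experiment with encoded sizes.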
| Primitive Type | Bogo Type | Description |
|---|---|---|
| `nil` | `TypeNull` | Null values |
| `bool` | `TypeBoolTrue`/`TypeBoolFalse` | Boolean values |
| `string` | `TypeString` | UTF-8 strings with variable-length encoding |
| `byte` | `TypeByte` | Single byte values |
| `int`, `int8`, `int16`, `int32`, `int64` | `TypeInt` | Signed integers with VarInt encoding |
| `uint`, `uint8`, `uint16`, `uint32`, `uint64` | `TypeUint` | Unsigned integers with VarInt encoding |
| `float32`, `float64` | `TypeFloat` | IEEE 754 floating-point numbers |
| `[]byte` | `TypeBlob` | Binary data with length prefix |
| `time.Time` | `TypeTimestamp` | Unix timestamps |
| `[]any{}` | `TypeUntypedList` | Heterogeneous lists |
| `[]int{}` | `TypeTypedList` | Homogeneous typed lists |
| object | `TypeObject` | Key-value objects |
```
go get github.com/bubunyo/bogo
```

```go
package main

import (
	"fmt"
	"log"

	"github.com/bubunyo/bogo"
)

type User struct {
	ID     int    `json:"id"`
	Name   string `json:"name"`
	Email  string `json:"email"`
	Active bool   `json:"active"`
}

func main() {
	user := User{
		ID:     123,
		Name:   "John Doe",
		Email:  "john@example.com",
		Active: true,
	}

	// Marshal to binary format
	data, err := bogo.Marshal(user)
	if err != nil {
		log.Fatal(err)
	}

	// Unmarshal from binary format
	var decoded User
	err = bogo.Unmarshal(data, &decoded)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Printf("Original: %+v\n", user)
	fmt.Printf("Decoded:  %+v\n", decoded)
}
```

```go
package main

import (
	"bytes"
	"fmt" // needed for the Printf below
	"log"

	"github.com/bubunyo/bogo"
)

func main() {
	data := map[string]interface{}{
		"message": "Hello, World!",
		"count":   42,
		"active":  true,
	}

	// Streaming encoding
	var buf bytes.Buffer
	encoder := bogo.NewEncoder(&buf)
	if err := encoder.Encode(data); err != nil {
		log.Fatal(err)
	}

	// Streaming decoding
	decoder := bogo.NewDecoder(&buf)
	var result map[string]interface{}
	if err := decoder.Decode(&result); err != nil {
		log.Fatal(err)
	}

	fmt.Printf("Streamed data: %+v\n", result)
}
```

Bogo delivers significant deserialization performance improvements over JSON (serialization is currently slower):
Simple Data (small objects):
- JSON Serialize: 709 ns/op 272 B/op 3 allocs/op
- Bogo Serialize: 964 ns/op 1088 B/op 18 allocs/op (0.74x speed)
- MessagePack: 442 ns/op 320 B/op 4 allocs/op
- JSON Deserialize: 1910 ns/op 336 B/op 7 allocs/op
- Bogo Deserialize: 564 ns/op 488 B/op 16 allocs/op (3.39x faster)
- MessagePack: 623 ns/op 168 B/op 4 allocs/op
Complex Data (nested structures):
- JSON Serialize: 6422 ns/op 2514 B/op 36 allocs/op
- Bogo Serialize: 15449 ns/op 18939 B/op 291 allocs/op (0.42x speed)
- MessagePack: 3650 ns/op 2472 B/op 17 allocs/op
- JSON Deserialize: 19254 ns/op 3128 B/op 93 allocs/op
- Bogo Deserialize: 4341 ns/op 4464 B/op 101 allocs/op (4.43x faster)
- MessagePack: 7888 ns/op 2921 B/op 86 allocs/op
List Data (large lists):
- JSON Serialize: 23173 ns/op 3889 B/op 15 allocs/op
- Bogo Serialize: 54654 ns/op 41025 B/op 1040 allocs/op (0.42x speed)
- MessagePack: 10520 ns/op 8178 B/op 8 allocs/op
- JSON Deserialize: 56072 ns/op 21624 B/op 647 allocs/op
- Bogo Deserialize: 5822 ns/op 6016 B/op 119 allocs/op (9.63x faster)
- MessagePack: 18743 ns/op 11027 B/op 416 allocs/op
Binary Data (large byte lists):
- JSON Serialize: 12452 ns/op 16904 B/op 17 allocs/op
- Bogo Serialize: 12936 ns/op 64081 B/op 28 allocs/op (0.96x speed)
- MessagePack: 3876 ns/op 16256 B/op 5 allocs/op
- JSON Deserialize: 95240 ns/op 16248 B/op 32 allocs/op
- Bogo Deserialize: 961 ns/op 872 B/op 17 allocs/op (99.1x faster)
- MessagePack: 3346 ns/op 12292 B/op 21 allocs/op
```go
decoder := bogo.NewConfigurableDecoder(
	bogo.WithDecoderMaxDepth(50),                       // Limit nesting depth
	bogo.WithDecoderStrictMode(true),                   // Enable strict validation
	bogo.WithMaxObjectSize(1024*1024),                  // 1MB size limit
	bogo.WithUTF8Validation(true),                      // Validate UTF-8 strings
	bogo.WithUnknownTypes(false),                       // Reject unknown types
	bogo.WithSelectiveFields([]string{"target_field"}), // Optimize for specific fields
)

result, err := decoder.Decode(data)
```

Bogo includes field-specific decoding optimization that provides up to 334x faster decoding when you only need specific fields from large objects.
BEFORE OPTIMIZATION:

```
BenchmarkFieldDecoding_Full-8         17913    67037 ns/op    48057 B/op    1138 allocs/op
BenchmarkFieldDecoding_Selective-8    37464    41110 ns/op    31456 B/op     826 allocs/op
```

AFTER OPTIMIZATION:

```
BenchmarkFieldDecoding_WithOptimization-8    2121243    524.6 ns/op    424 B/op    7 allocs/op
```
- 128x faster than full decoding (67,037ns → 524ns)
- 78x faster than selective decoding without optimization (41,110ns → 524ns)
- 163x fewer allocations than full decoding (1,138 → 7)
- 118x fewer allocations than selective decoding (826 → 7)
- 113x less memory usage than full decoding (48,057B → 424B)
- 74x less memory usage than selective decoding (31,456B → 424B)
Method 1: Explicit Field Selection

```go
// Only decode the "id" and "name" fields from a complex object
optimizedDecoder := bogo.NewConfigurableDecoder(
	bogo.WithSelectiveFields([]string{"id", "name"}),
)

result, err := optimizedDecoder.Decode(largeObjectData)
// Up to 334x faster than decoding the entire object
```

Method 2: Automatic Optimization with Struct Tags
```go
// Define a struct with only the fields you need
type UserSummary struct {
	ID   int64  `json:"id"`
	Name string `json:"name"`
	// Large fields like "profile_data", "history", etc. are automatically skipped
}

var summary UserSummary
err := bogo.Unmarshal(complexUserData, &summary)
// Automatically optimized: only the fields present in the struct are decoded
```

The optimization uses field jumping to skip over unwanted data:
- Field Detection: The decoder identifies which fields are needed
- Smart Skipping: Large, complex fields are skipped entirely using size information
- Direct Access: Only target fields are parsed and decoded
- Memory Efficiency: Unused data never allocates memory
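Bogo's actual wire layout is defined by its Binary Format Specification; as a rough illustration of the field-jumping idea, here is a toy decoder for a length-prefixed key/value stream that skips unwanted values by their size prefix instead of parsing them (the record layout here is invented for the example, not bogo's real format):

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// Toy record layout: repeated (keyLen, key, valLen, val), lengths as uvarints.
func encodeField(buf []byte, key, val string) []byte {
	buf = binary.AppendUvarint(buf, uint64(len(key)))
	buf = append(buf, key...)
	buf = binary.AppendUvarint(buf, uint64(len(val)))
	buf = append(buf, val...)
	return buf
}

// findField scans for one key, jumping over every other value using its
// length prefix; skipped bytes are never parsed or allocated.
func findField(data []byte, want string) (string, bool) {
	for len(data) > 0 {
		klen, n := binary.Uvarint(data)
		data = data[n:]
		key := string(data[:klen])
		data = data[klen:]
		vlen, n := binary.Uvarint(data)
		data = data[n:]
		if key == want {
			return string(data[:vlen]), true // decode only the target field
		}
		data = data[vlen:] // jump: skip the value wholesale
	}
	return "", false
}

func main() {
	var rec []byte
	rec = encodeField(rec, "id", "123")
	rec = encodeField(rec, "profile_data", "...huge blob we never touch...")
	rec = encodeField(rec, "name", "John Doe")

	v, ok := findField(rec, "name")
	fmt.Println(v, ok) // John Doe true
}
```

Because the skip is a single slice re-index, the cost of an unwanted field is independent of its size, which is where the large speedups on blob-heavy records come from.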
Ideal use cases:
- Large API responses where you only need specific fields
- Database records with large blob fields you want to skip
- Microservices extracting metadata from complex payloads
- Log processing where you need specific fields from large log entries
- Real-time systems requiring ultra-low latency field extraction
```go
encoder := bogo.NewConfigurableEncoder(
	bogo.WithValidation(true),   // Enable validation
	bogo.WithCompression(false), // Disable compression
)

data, err := encoder.Encode(value)
```

For complete technical details about the binary format, encoding algorithms, type specifications, and implementation notes, see the Binary Format Specification.
```go
// Marshal encodes a value to bogo binary format
func Marshal(v interface{}) ([]byte, error)

// Unmarshal decodes bogo binary data into a value
func Unmarshal(data []byte, v interface{}) error
```

```go
// NewEncoder creates a streaming encoder
func NewEncoder(w io.Writer) *StreamEncoder

// NewDecoder creates a streaming decoder
func NewDecoder(r io.Reader) *StreamDecoder
```

```go
// NewConfigurableEncoder creates an encoder with options
func NewConfigurableEncoder(options ...EncoderOption) *Encoder

// NewConfigurableDecoder creates a decoder with options
func NewConfigurableDecoder(options ...DecoderOption) *Decoder
```

Bogo makes a clear distinction between zero values and nil values for robust data handling.
Zero values are preserved with their type information:

- `""` (empty string) → encodes as `TypeString` with 0 length → decodes as `""`
- `0` → encodes as `TypeInt` → decodes as `int64(0)`
- `false` → encodes as `TypeBoolFalse` → decodes as `false`
- `[]any{}` → encodes as `TypeUntypedList` with 0 elements → decodes as `[]any{}`
- `time.Time{}` → encodes as `TypeTimestamp` → decodes as zero time

Nil values are encoded uniformly as null:

- `nil` → encodes as `TypeNull` → decodes as `nil`
- `(*string)(nil)` → encodes as `TypeNull` → decodes as `nil`
- `map[string]any(nil)` → encodes as `TypeNull` → decodes as `nil`
```go
data := map[string]any{
	"zero_string": "",    // Will remain empty string after decode
	"zero_number": 0,     // Will remain 0 after decode
	"zero_bool":   false, // Will remain false after decode
	"nil_value":   nil,   // Will remain nil after decode
}

encoded, _ := bogo.Marshal(data)

var decoded map[string]any
bogo.Unmarshal(encoded, &decoded)

fmt.Println(decoded["zero_string"]) // "" (empty string, not nil)
fmt.Println(decoded["zero_number"]) // 0 (zero, not nil)
fmt.Println(decoded["zero_bool"])   // false (not nil)
fmt.Println(decoded["nil_value"])   // nil
```

Run the test suite:
```
go test ./...
```

Run benchmarks:

```
go test -bench=. -benchmem
```

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -am 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.