Fast, friendly CLI log analyzer built with Bun. Think "jq for logs."
- Auto-detects log formats - JSON, Apache, Nginx, Syslog, CLF
- Simple query syntax - `loq app.log where level=error`
- SQL-like queries - `loq "SELECT * FROM app.log WHERE status >= 400"`
- Powerful filtering - equality, comparison, contains, regex
- Aggregations - count, group by, sum, avg, min, max
- Multiple outputs - colorized, table, JSON, CSV
- Streaming - handles large files efficiently
- Pipe support - `cat *.log | loq where level=error`
- Tail mode - `loq -f app.log where level=error`
- Extensible - add custom log formats via config
# Install Bun if you haven't
curl -fsSL https://bun.sh/install | bash
# Clone and install
git clone https://github.com/code-with-auto/loq.git
cd loq
bun install
bun link

# After bun link, loq is available at ~/.bun/bin/loq
# Add to your PATH or use directly:
~/.bun/bin/loq --help

# View all logs with colorized output
loq app.log
# Filter by level
loq app.log where level=error
# Filter HTTP logs by status
loq access.log where "status >= 400"
# Search in messages
loq app.log where message contains "timeout"
# Regex matching
loq app.log where path matches "^/api/v[0-9]+"
# Boolean logic
loq app.log where level=error and message contains "database"
loq app.log where level=error or level=warn
# Aggregations
loq app.log count by level
loq access.log count by status
# Limit results
loq app.log where level=error limit 10
# Tail mode (like tail -f)
loq -f app.log where level=error
# Pipe support
cat logs/*.log | loq where level=error
kubectl logs pod/my-app | loq where level=error
# Output formats
loq app.log -o json where level=error
loq app.log -o csv where level=error
loq app.log -o table

loq <file> [where <conditions>] [count [by <field>]] [limit <n>]

Operators:
- `=`, `!=` - equality
- `>`, `<`, `>=`, `<=` - comparison
- `contains` - substring match
- `matches` - regex match
Boolean:
- `and`, `or`, `not`
- Parentheses for grouping: `(level=error or level=warn) and status>=500`
Time filters:
- `after yesterday`
- `before today`
- `between "10:00" and "11:00" today`
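Internally, each `where` clause is parsed into a small expression tree and evaluated against every parsed entry (see `src/query/` in the project layout below). The types and function here are an illustrative sketch, not the actual `src/query` API:

```ts
// Illustrative sketch of how a parsed `where` clause might be represented
// and evaluated. The real AST/executor types in src/query/ may differ.
type Expr =
  | { kind: "cmp"; field: string; op: "=" | "!=" | ">" | "<" | ">=" | "<="; value: string }
  | { kind: "contains"; field: string; value: string }
  | { kind: "matches"; field: string; pattern: RegExp }
  | { kind: "and" | "or"; left: Expr; right: Expr }
  | { kind: "not"; expr: Expr };

function evaluate(expr: Expr, fields: Record<string, unknown>): boolean {
  switch (expr.kind) {
    case "cmp": {
      const raw = String(fields[expr.field] ?? "");
      const a = Number(raw);
      const b = Number(expr.value);
      const numeric = !Number.isNaN(a) && !Number.isNaN(b);
      if (expr.op === "=") return numeric ? a === b : raw === expr.value;
      if (expr.op === "!=") return numeric ? a !== b : raw !== expr.value;
      if (!numeric) return false; // ordering comparisons only apply to numbers in this sketch
      if (expr.op === ">") return a > b;
      if (expr.op === "<") return a < b;
      if (expr.op === ">=") return a >= b;
      return a <= b;
    }
    case "contains":
      return String(fields[expr.field] ?? "").includes(expr.value);
    case "matches":
      return expr.pattern.test(String(fields[expr.field] ?? ""));
    case "and":
      return evaluate(expr.left, fields) && evaluate(expr.right, fields);
    case "or":
      return evaluate(expr.left, fields) || evaluate(expr.right, fields);
    case "not":
      return !evaluate(expr.expr, fields);
  }
}

// `level=error and message contains "database"` corresponds to:
const query: Expr = {
  kind: "and",
  left: { kind: "cmp", field: "level", op: "=", value: "error" },
  right: { kind: "contains", field: "message", value: "database" },
};
evaluate(query, { level: "error", message: "database connection lost" }); // true
```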
loq "SELECT * FROM app.log WHERE level='error' LIMIT 10"
loq "SELECT level, COUNT(*) FROM app.log GROUP BY level"
loq "SELECT path, AVG(response_time) FROM access.log GROUP BY path"| Format | Example |
|---|---|
| JSON | {"level":"info","message":"Server started"} |
| Apache/Nginx | 192.168.1.1 - - [20/Dec/2024:10:00:00 +0000] "GET /api HTTP/1.1" 200 1234 |
| Syslog | Dec 20 12:34:56 myhost myapp[1234]: Message |
| CLF | 127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET / HTTP/1.0" 200 2326 |
| Plain text | Fallback - entire line as message, no field extraction |
Note: Plain text mode has limited query support. Use `message contains "ERROR"` instead of `level=error`. For full query capabilities, use structured logging (JSON) or create a custom format.
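For example, an application that currently writes plain-text lines can switch to one JSON object per line and immediately get full field queries (a minimal illustration, not tied to any particular logging library):

```ts
// Emit one JSON object per line ("JSON lines") so every field becomes queryable,
// e.g. `loq app.log where level=error and service=api`.
console.log(JSON.stringify({
  timestamp: new Date().toISOString(),
  level: "error",
  message: "connection timeout",
  service: "api", // extra fields are kept and can be queried too
}));
```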
Create ~/.loqrc or ./loq.config.ts:
// loq.config.ts
export default {
formats: [
{
name: 'my-app',
detect: /^\[\d{4}-\d{2}-\d{2}/,
parse: {
pattern: /^\[(?<timestamp>[^\]]+)\] (?<level>\w+): (?<message>.+)$/,
fields: {
timestamp: 'timestamp',
level: 'level',
message: 'message',
},
},
},
{
name: 'nginx-json',
detect: (line: string) => {
try {
const obj = JSON.parse(line);
return 'request_uri' in obj;
} catch {
return false;
}
},
parse: (line: string) => {
const obj = JSON.parse(line);
return {
timestamp: obj.time_iso8601,
level: obj.status >= 400 ? 'error' : 'info',
message: `${obj.request_method} ${obj.request_uri}`,
fields: obj,
};
},
},
],
aliases: {
errors: 'where level=error',
slow: 'where response_time>1000',
},
};
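Aliases are shorthand for query text: each alias is expanded into its configured fragment before the query is parsed, so (with the config above) `errors` stands in for `where level=error`. A rough sketch of that expansion (names here are assumptions, not the actual loader code):

```ts
// Illustrative sketch: expand configured aliases into their query text
// before handing the arguments to the query parser.
const aliases: Record<string, string> = {
  errors: "where level=error",
  slow: "where response_time>1000",
};

function expandAliases(queryArgs: string[]): string {
  return queryArgs.map((arg) => aliases[arg] ?? arg).join(" ");
}

expandAliases(["errors", "limit", "10"]); // "where level=error limit 10"
```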
Or use JSON (`~/.loqrc`):

{
"formats": [
{
"name": "bracketed",
"detect": "^\\[\\d{4}",
"parse": {
"pattern": "^\\[([^\\]]+)\\] (\\w+): (.+)$",
"fields": {
"timestamp": 1,
"level": 2,
"message": 3
}
}
}
]
}
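In the JSON form, `detect` and `pattern` are regular expression strings, and the numbers under `fields` refer to capture groups (1-based) in `pattern`. Conceptually the config above compiles into a parser along these lines (an illustrative sketch, not the actual loader code):

```ts
// Illustrative sketch: a JSON-defined format compiled into a parser.
// Each "fields" entry maps an output name to a regex capture group.
const pattern = new RegExp("^\\[([^\\]]+)\\] (\\w+): (.+)$");
const fields: Record<string, number> = { timestamp: 1, level: 2, message: 3 };

function parseBracketed(line: string) {
  const match = pattern.exec(line);
  if (!match) return null;
  const entry: Record<string, string> = { raw: line };
  for (const [name, group] of Object.entries(fields)) {
    entry[name] = match[group];
  }
  return entry;
}

// parseBracketed("[2024-12-20 10:00:00] ERROR: disk full")
// → { raw: "[2024-...", timestamp: "2024-12-20 10:00:00", level: "ERROR", message: "disk full" }
```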
Options:

-f, --follow Tail mode (like tail -f)
-o, --output Output format: color, table, json, csv
--format Force log format: json, apache, syslog, clf
-n, --limit Limit number of results
-h, --help Show help
-v, --version Show version
# Install dependencies
bun install
# Run in development mode
bun run dev
# Run tests
bun test
# Run tests with coverage
bun test --coverage
# Type check
bun run typecheck
# Build standalone binary
bun build src/index.ts --compile --outfile loq
# Generate documentation
bun run docs

API documentation is automatically generated using TypeDoc and hosted on GitHub Pages.
# Generate docs locally
bun run docs
# Serve docs locally
bun run docs:serve
# Then open http://localhost:8080

loq/
├── src/
│ ├── index.ts # CLI entry point
│ ├── cli/
│ │ └── args.ts # Argument parsing
│ ├── config/
│ │ └── loader.ts # Config file loading
│ ├── parser/
│ │ ├── types.ts # Types & plugin system
│ │ ├── auto-detect.ts # Format detection
│ │ └── formats/ # Built-in parsers
│ ├── query/
│ │ ├── lexer.ts # Tokenizer
│ │ ├── parser.ts # Query parser
│ │ ├── ast.ts # AST types
│ │ └── executor.ts # Query execution
│ ├── output/
│ │ ├── formatter.ts # Output formatting
│ │ └── colors.ts # Terminal colors
│ └── utils/
│ └── time.ts # Time utilities
├── tests/ # Test files
├── .github/workflows/ # CI/CD
└── package.json
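When adding a built-in parser (next section), the plugin contract defined in `src/parser/types.ts` looks roughly like the following. This is a sketch inferred from the examples in this README; the real definitions may differ:

```ts
// Rough sketch of the plugin contract; see src/parser/types.ts for the real one.
export interface LogEntry {
  raw: string;                      // the original, unparsed line
  timestamp?: string;               // parsed timestamp, if the format has one
  level?: string;                   // log level, if the format has one
  message?: string;                 // human-readable message
  fields?: Record<string, unknown>; // any additional structured fields
}

export interface LogParser {
  name: string;                          // unique format name
  detect(line: string): boolean;         // cheap check: does this line look like the format?
  parse(line: string): LogEntry | null;  // full parse; null if the line doesn't fit after all
}
```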
Create ~/.loqrc with your format definitions (see above).
- Create a parser in `src/parser/formats/`:
// src/parser/formats/myformat.ts
import type { LogEntry, LogParser } from '../types';
export const myFormatParser: LogParser = {
name: 'myformat',
detect(line: string): boolean {
// Return true if this line matches your format
return line.startsWith('MYFORMAT:');
},
parse(line: string): LogEntry | null {
// Parse the line and return a LogEntry
const match = line.match(/^MYFORMAT: \[(.+?)\] (\w+) - (.+)$/);
if (!match) return null;
return {
raw: line,
timestamp: match[1],
level: match[2],
message: match[3],
fields: { /* any additional fields */ },
};
},
};

- Register in `src/parser/auto-detect.ts`:
import { myFormatParser } from './formats/myformat';
const builtinParsers: LogParser[] = [
myFormatParser, // Add here
jsonParser,
// ...
];

- Add tests in `tests/parsers/myformat.test.ts` (a minimal sketch follows below)
- Submit a PR!
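A minimal test for the example parser above, using Bun's built-in test runner (illustrative; adapt the paths and assertions to your format):

```ts
// tests/parsers/myformat.test.ts
import { describe, expect, test } from "bun:test";
import { myFormatParser } from "../../src/parser/formats/myformat";

describe("myformat parser", () => {
  test("detects its own lines", () => {
    expect(myFormatParser.detect("MYFORMAT: [2024-12-20 10:00:00] ERROR - boom")).toBe(true);
    expect(myFormatParser.detect('{"level":"info"}')).toBe(false);
  });

  test("parses timestamp, level and message", () => {
    const entry = myFormatParser.parse("MYFORMAT: [2024-12-20 10:00:00] ERROR - boom");
    expect(entry).not.toBeNull();
    expect(entry?.level).toBe("ERROR");
    expect(entry?.message).toBe("boom");
  });
});
```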
MIT
Contributions welcome! See CONTRIBUTORS.md for guidelines.