An Expo app demonstrating Decart's Realtime Video models for transforming live camera feeds in real time.
- Video Restyling (Mirage) - Transform your camera feed into different artistic styles (anime, cyberpunk, Pixar-style, and 90+ more)
- Video Editing (Lucy) - Make real-time edits to your appearance (change outfits, add accessories, transform into characters)
- Multiple View Modes - Switch between transformed-only, picture-in-picture, vertical split, and horizontal split views
- Front/Back Camera - Switch between cameras on the fly
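The four view modes above can be modeled as a simple cycle. A minimal sketch, assuming a hypothetical `ViewMode` type and `nextViewMode` helper (neither is taken from the app's source):

```typescript
// Hypothetical view-mode type mirroring the modes listed above.
type ViewMode = "transformed" | "pip" | "split-vertical" | "split-horizontal";

const VIEW_MODES: ViewMode[] = [
  "transformed",
  "pip",
  "split-vertical",
  "split-horizontal",
];

// Advance to the next mode, wrapping back to the first at the end.
function nextViewMode(current: ViewMode): ViewMode {
  const i = VIEW_MODES.indexOf(current);
  return VIEW_MODES[(i + 1) % VIEW_MODES.length];
}
```

A single "toggle view" button can then call `nextViewMode` on each press.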
- Bun (or npm/yarn)
- Expo CLI
- Physical iOS or Android device (camera required - simulators won't work)
- Decart API key from platform.decart.ai
1. Clone the repository:

   ```sh
   git clone https://github.com/DecartAI/decart-example-expo-realtime.git
   cd decart-example-expo-realtime
   ```

2. Install dependencies:

   ```sh
   bun install
   ```

3. Create your environment file:

   ```sh
   cp .env.example .env.local
   ```

4. Add your Decart API key to `.env.local`:

   ```
   EXPO_PUBLIC_DECART_API_KEY=your-api-key-here
   ```

5. Run on your device:

   ```sh
   # iOS
   bun ios --device

   # Android
   bun android --device
   ```
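Expo inlines `EXPO_PUBLIC_*` variables into the app bundle at build time, so the key can be read from `process.env`. A minimal validation sketch; the `getApiKey` helper is ours, not part of the example app:

```typescript
// Read the Decart API key that Expo inlines from .env.local at build time.
// getApiKey is an illustrative helper, not a function from this repo.
function getApiKey(env: Record<string, string | undefined>): string {
  const key = env.EXPO_PUBLIC_DECART_API_KEY;
  if (!key) {
    throw new Error(
      "Missing EXPO_PUBLIC_DECART_API_KEY - add it to .env.local and restart the dev server"
    );
  }
  return key;
}
```

Failing fast here gives a clearer error than letting the SDK reject an undefined key later.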
The app uses Decart's Realtime SDK to:
- Capture video from the device camera via WebRTC
- Stream frames to Decart's inference servers
- Receive transformed frames back in real-time
- Display the transformed video alongside or instead of the original
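The four steps above amount to a small session lifecycle: connect, stream, and tear down or fail. A self-contained sketch of how that state could be tracked (the state names and helper are illustrative; the real app drives the connection with `@decartai/sdk` inside `useWebRTC.ts`):

```typescript
// Illustrative session states for the capture -> stream -> receive loop.
type SessionState = "idle" | "connecting" | "streaming" | "error";

// Allowed transitions; anything else is rejected as a bug.
const TRANSITIONS: Record<SessionState, SessionState[]> = {
  idle: ["connecting"],
  connecting: ["streaming", "error"],
  streaming: ["idle", "error"],
  error: ["idle"],
};

// Return the new state, or throw if the transition is not allowed.
function transition(from: SessionState, to: SessionState): SessionState {
  if (!TRANSITIONS[from].includes(to)) {
    throw new Error(`invalid transition ${from} -> ${to}`);
  }
  return to;
}
```

Making invalid transitions throw (e.g. streaming before the connection is up) keeps UI state and connection state from drifting apart.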
Key integration points:
- `useWebRTC.ts` - Establishes the WebRTC connection using `@decartai/sdk`
- `mirage-skin-list.ts` / `lucy-skin-list.ts` - Style definitions with prompts
- `Camera.tsx` - Orchestrates the camera, styles, and model switching
- Mirage v2 (`mirage_v2`) - Video restyling that transforms the entire scene
- Lucy v2v (`lucy_v2v_720p_rt`) - Video editing that modifies specific elements
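Switching models then reduces to swapping the model identifier that gets passed to the SDK. A hedged sketch; the identifiers come from the list above, but the lookup helper is ours:

```typescript
// Model identifiers from the list above; the mapping helper is illustrative.
const MODELS = {
  mirage: "mirage_v2",
  lucy: "lucy_v2v_720p_rt",
} as const;

type ModelKind = keyof typeof MODELS;

// Resolve the SDK model id for a given kind of transformation.
function modelId(kind: ModelKind): string {
  return MODELS[kind];
}
```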
MIT