⚠️ BETA VERSION: This app is currently in active development and testing. Features may change and bugs may exist. Report issues →
Features · Requirements · Getting Started · Architecture · Privacy
Visual Assist is a native iOS application designed to help visually impaired users navigate their environment safely and independently. Built with Apple's latest frameworks, it leverages the power of:
- LiDAR + ARKit Depth Sensing
- Vision Text Recognition
- Core ML Object Detection
- SwiftUI Modern Interface
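As a rough illustration of how the LiDAR depth sensing is driven, here is a minimal sketch of an ARKit session configured for scene depth. It is an assumed usage pattern, not code from this repository; the `DepthSessionController` name and the hand-off comment are hypothetical.

```swift
import ARKit
import CoreVideo

// Hypothetical sketch: enable LiDAR scene depth in an ARKit session.
// DepthSessionController is an illustrative name, not a type from the project.
final class DepthSessionController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Scene depth requires a LiDAR-equipped device (iPhone Pro models).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    // Each ARFrame carries a depth map: a CVPixelBuffer of Float32 distances in meters.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let size = (CVPixelBufferGetWidth(depthMap), CVPixelBufferGetHeight(depthMap))
        _ = size // hand the buffer off to obstacle-distance logic here
    }
}
```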
Visual Assist is currently in beta testing. This means:
| Area | Status | Notes |
|---|---|---|
| Navigation Mode | ✅ Working | Core functionality complete |
| Text Reading | ✅ Working | OCR may vary with lighting |
| Object Detection | 🔄 Testing | Accuracy improvements ongoing |
| Voice Commands | ✅ Working | English only for now |
| Apple Watch | 📅 Planned | Coming in a future release |
- Navigation Mode: Real-time obstacle detection powered by LiDAR sensor technology.
- Text Reading: Point-and-read OCR with natural speech synthesis.
- Object Awareness: AI-powered scene understanding and description.
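For the point-and-read flow, a Vision text-recognition request over a camera frame looks roughly like the sketch below. This is an assumed usage of `VNRecognizeTextRequest`, not the project's actual text-reading or camera code.

```swift
import Foundation
import Vision

// Hypothetical sketch: recognize text in a captured frame and return the lines found.
func recognizeText(in pixelBuffer: CVPixelBuffer,
                   completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Take the top candidate string from each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate     // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```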
```bash
# Clone the repository
git clone https://github.com/yadava5/VisualAssist.git

# Navigate to project
cd VisualAssist

# Open in Xcode
open VisualAssist.xcodeproj
```

| Step | Action |
|---|---|
| 1️⃣ | Select your Development Team in Signing & Capabilities |
| 2️⃣ | Connect your iPhone Pro via USB |
| 3️⃣ | Press ⌘ + R to build and run |
Grant permissions → App announces "Visual Assist ready" → Start using!
```
VisualAssist/
├── 📁 App/              # Entry point & state
│   ├── VisualAssistApp.swift
│   └── AppState.swift
├── 📁 Views/            # SwiftUI interface
│   ├── HomeView.swift
│   ├── NavigationModeView.swift
│   ├── TextReadingModeView.swift
│   ├── ObjectAwarenessModeView.swift
│   └── Components/
├── 📁 Services/         # Business logic
│   ├── LiDARService.swift
│   ├── CameraService.swift
│   ├── SpeechService.swift
│   └── HapticService.swift
├── 📁 Models/           # Data structures
└── 📁 Utilities/        # Helpers
```
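As an example of what a service in this layout might contain, the sketch below wraps `AVSpeechSynthesizer` for spoken announcements. It is an assumption about the shape of such a service, not the contents of `SpeechService.swift`.

```swift
import AVFoundation

// Hypothetical sketch of a speech service: speak a phrase, or cut speech off immediately.
final class SpokenAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()

    func announce(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }

    func stop() {
        synthesizer.stopSpeaking(at: .immediate)
    }
}
```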
| Framework | Purpose |
|---|---|
| ARKit | LiDAR depth sensing |
| Vision | Text recognition (OCR) |
| Core ML | Object detection |
| AVFoundation | Camera capture |
| Speech | Voice commands |
| Core Haptics | Haptic feedback |
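As one concrete example of how these frameworks are typically wired up, here is a hedged Core Haptics sketch that plays a single transient tap (for instance, to signal a nearby obstacle). The `HapticTap` type is illustrative and not taken from `HapticService.swift`.

```swift
import CoreHaptics

// Hypothetical sketch: play one transient haptic tap with a given intensity.
final class HapticTap {
    private var engine: CHHapticEngine?

    init() {
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    func tap(intensity: Float = 1.0) {
        let parameters = [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        ]
        let event = CHHapticEvent(eventType: .hapticTransient, parameters: parameters, relativeTime: 0)
        guard let pattern = try? CHHapticPattern(events: [event], parameters: []),
              let player = try? engine?.makePlayer(with: pattern) else { return }
        try? player.start(atTime: CHHapticTimeImmediate)
    }
}
```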
| Pattern | Usage |
|---|---|
| MVVM | Clean view/logic separation |
| Combine | Reactive @Published properties |
| Swift Concurrency | Modern async/await |
| iOS 26 Design | Liquid Glass UI effects |
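A minimal sketch of how these patterns combine, assuming a hypothetical `DetectionViewModel` that is not part of the repository: an `ObservableObject` publishes state for SwiftUI while an `async` task does the work on the main actor.

```swift
import Combine
import Foundation

// Hypothetical view model: published state driven by an async detection call.
@MainActor
final class DetectionViewModel: ObservableObject {
    @Published var statusText = "Idle"

    func refresh() {
        Task {
            statusText = "Scanning…"
            // Placeholder for an async Vision / Core ML call.
            try? await Task.sleep(nanoseconds: 500_000_000)
            statusText = "Done"
        }
    }
}
```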
| Feature | Description |
|---|---|
| On-Device Processing | All ML runs locally on your iPhone |
| No Network Required | Works completely offline |
| No Data Collection | Nothing leaves your device |
| No Analytics | Zero tracking or telemetry |
| No Account | Use immediately, no sign-up |
Visual Assist is built with accessibility as a core principle.
Planned for future releases:

- Apple Watch companion app
- Indoor mapping & saved locations
- Currency recognition
- Multi-language support
- Siri Shortcuts integration
- CarPlay navigation support
This project is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.
For commercial licensing, contact the author.
This project uses DocC for API documentation.
```bash
# Build documentation in Xcode
# Product → Build Documentation (⌃⇧⌘D)

# Or via command line
xcodebuild docbuild -scheme VisualAssist -derivedDataPath ./docs
```