Camera-Based Arrow Scoring
Purpose: Document the ML-powered camera scoring feature that automatically detects arrow positions in target photos
PR: #354
Platforms: Android and iOS
Overview
Camera-based arrow scoring allows users to photograph their target and have arrow positions automatically detected using machine learning. The feature uses a YOLOv8s model trained on archery target images to identify arrow locations, then provides an adjustment UI where users can fine-tune positions before confirming scores.
Feature Flow
┌─────────────┐ ┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ Capture │ ──► │ Detect │ ──► │ Adjust │ ──► │ Submit │
│ Photo │ │ Arrows │ │ Positions │ │ Scores │
└─────────────┘ └─────────────┘ └─────────────┘ └─────────────┘
│ │ │ │
▼ ▼ ▼ ▼
CameraX (Android) TFLite/CoreML Draggable UI Score applied
UIImagePicker (iOS) YOLOv8s inference Arrow markers to current end
Architecture
ML Model
| Property | Value |
|---|---|
| Architecture | YOLOv8s |
| Input Size | 640×640 pixels |
| Output Format | [1, 6, 8400] |
| Detection Classes | 1 (arrow) |
| Confidence Threshold | 0.35 |
| IoU Threshold | 0.50 |
Output Tensor Layout:
- Indices 0-3: x, y, width, height (center-based coordinates)
- Index 4: confidence score
- Index 5: class ID (always 0 for arrow)
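The layout above can be decoded into candidate detections before NMS. The sketch below assumes a flattened, channel-major `FloatArray` (attribute `c` for anchor `i` at index `c * numAnchors + i`); the `Detection` type and `decodeOutput` name are illustrative, not the actual service API.

```kotlin
// Sketch: decode a flattened [1, 6, 8400] output tensor into candidate
// detections. Assumes channel-major layout; names are illustrative.

data class Detection(
    val cx: Float, val cy: Float,   // center-based box coordinates
    val w: Float, val h: Float,     // box size
    val confidence: Float,
    val classId: Int                // always 0 (arrow) for this model
)

fun decodeOutput(
    output: FloatArray,             // flattened [1, 6, 8400]
    numAnchors: Int = 8400,
    confThreshold: Float = 0.35f
): List<Detection> {
    val detections = mutableListOf<Detection>()
    for (i in 0 until numAnchors) {
        // Attribute c for anchor i lives at c * numAnchors + i
        val conf = output[4 * numAnchors + i]
        if (conf < confThreshold) continue   // drop low-confidence anchors early
        detections += Detection(
            cx = output[0 * numAnchors + i],
            cy = output[1 * numAnchors + i],
            w  = output[2 * numAnchors + i],
            h  = output[3 * numAnchors + i],
            confidence = conf,
            classId = output[5 * numAnchors + i].toInt()
        )
    }
    return detections
}
```

Filtering at the 0.35 confidence threshold here keeps the candidate list small before the (more expensive) NMS pass.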
Platform Implementations
| Component | Android | iOS |
|---|---|---|
| ML Runtime | TensorFlow Lite | CoreML + Vision |
| Model File | arrow_detector.tflite | ArrowDetector.mlpackage |
| Camera | CameraX | UIImagePickerController |
| Image Source | Camera only | Camera or Photo Library |
Coordinate System
The feature uses a normalized coordinate system for arrow positions:
| Property | Range | Description |
|---|---|---|
| X-axis | -1.0 to +1.0 | Left edge to right edge |
| Y-axis | -1.0 to +1.0 | Top edge to bottom edge |
| Center | (0, 0) | Center of target face |
| Edge | distance = 1.0 | Edge of scoring area |
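Mapping a pixel position inside the detected target face into this coordinate system only needs the face's center and radius in pixels. A minimal sketch (type and function names are illustrative, not the actual codebase API):

```kotlin
import kotlin.math.sqrt

// Sketch: map a pixel inside the target-face circle to the normalized
// [-1, 1] coordinate system with the origin at the face center.

data class NormalizedPosition(val x: Float, val y: Float) {
    /** Distance from center; 1.0 corresponds to the edge of the scoring area. */
    val distanceFromCenter: Float
        get() = sqrt(x * x + y * y)
}

fun toNormalized(
    pixelX: Float, pixelY: Float,
    faceCenterX: Float, faceCenterY: Float,
    faceRadiusPx: Float
): NormalizedPosition = NormalizedPosition(
    x = (pixelX - faceCenterX) / faceRadiusPx,
    y = (pixelY - faceCenterY) / faceRadiusPx
)
```

Normalizing against the face radius makes arrow positions independent of photo resolution and framing, which is what lets the same score table apply to every capture.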
Score Calculation
Scores are calculated based on normalized distance from center:
| Distance | Score |
|---|---|
| ≤ 0.05 | X-ring (10) |
| ≤ 0.10 | 10 |
| ≤ 0.20 | 9 |
| ≤ 0.30 | 8 |
| ≤ 0.40 | 7 |
| ≤ 0.50 | 6 |
| ≤ 0.60 | 5 |
| ≤ 0.70 | 4 |
| ≤ 0.80 | 3 |
| ≤ 0.90 | 2 |
| ≤ 1.00 | 1 |
| > 1.00 | Miss (0) |
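The table above translates directly into a lookup function. This sketch returns the score together with an X flag for the X-ring distinction (the `ArrowScore` type and function name are illustrative):

```kotlin
// Sketch of the distance → score mapping from the table above.
// X-ring arrows score 10 but carry an isX flag.

data class ArrowScore(val value: Int, val isX: Boolean)

fun scoreForDistance(distance: Float): ArrowScore = when {
    distance <= 0.05f -> ArrowScore(10, isX = true)
    distance <= 0.10f -> ArrowScore(10, isX = false)
    distance <= 0.20f -> ArrowScore(9, isX = false)
    distance <= 0.30f -> ArrowScore(8, isX = false)
    distance <= 0.40f -> ArrowScore(7, isX = false)
    distance <= 0.50f -> ArrowScore(6, isX = false)
    distance <= 0.60f -> ArrowScore(5, isX = false)
    distance <= 0.70f -> ArrowScore(4, isX = false)
    distance <= 0.80f -> ArrowScore(3, isX = false)
    distance <= 0.90f -> ArrowScore(2, isX = false)
    distance <= 1.00f -> ArrowScore(1, isX = false)
    else -> ArrowScore(0, isX = false)   // miss
}
```

An explicit `when` chain mirrors the table verbatim and sidesteps floating-point edge cases that arithmetic on the 0.10 ring width (e.g. `ceil(distance / 0.10f)`) can introduce at ring boundaries.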
Key Components
Detection Service
Handles ML inference and post-processing:
| Android | iOS |
|---|---|
| ArrowDetectionService.kt | ArrowDetectionService.swift |
| TFLite Interpreter | VNCoreMLModel |
| ByteBuffer preprocessing | CGImage processing |
| Manual NMS implementation | Vision framework NMS |
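Since Android runs NMS by hand (iOS gets it from the Vision framework), the core of that step is worth sketching. This is a standard greedy NMS over center-format boxes at the 0.50 IoU threshold; it is not the service's exact code, and all names are illustrative:

```kotlin
// Sketch: greedy non-maximum suppression over center-format boxes,
// keeping the highest-scoring box and dropping heavy overlaps.

data class Box(val cx: Float, val cy: Float, val w: Float, val h: Float, val score: Float)

fun iou(a: Box, b: Box): Float {
    // Convert center format to corner format for the intersection test
    val ax1 = a.cx - a.w / 2; val ay1 = a.cy - a.h / 2
    val ax2 = a.cx + a.w / 2; val ay2 = a.cy + a.h / 2
    val bx1 = b.cx - b.w / 2; val by1 = b.cy - b.h / 2
    val bx2 = b.cx + b.w / 2; val by2 = b.cy + b.h / 2
    val interW = maxOf(0f, minOf(ax2, bx2) - maxOf(ax1, bx1))
    val interH = maxOf(0f, minOf(ay2, by2) - maxOf(ay1, by1))
    val inter = interW * interH
    val union = a.w * a.h + b.w * b.h - inter
    return if (union <= 0f) 0f else inter / union
}

fun nms(boxes: List<Box>, iouThreshold: Float = 0.50f): List<Box> {
    val remaining = boxes.sortedByDescending { it.score }.toMutableList()
    val kept = mutableListOf<Box>()
    while (remaining.isNotEmpty()) {
        val best = remaining.removeAt(0)      // highest score wins
        kept += best
        remaining.removeAll { iou(it, best) > iouThreshold }
    }
    return kept
}
```

NMS matters here because YOLO emits many overlapping candidates per arrow; without it a single arrow would be counted several times.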
Adjustment UI
Interactive arrow position adjustment:
| Feature | Description |
|---|---|
| Draggable markers | Long-press + drag to move arrows |
| Remove arrows | Drag off target edge |
| Add arrows | Tap in placement mode |
| Score preview | Real-time score calculation |
| X-ring detection | Automatic X vs 10 distinction |
Flow Coordinator
Orchestrates the complete capture → detect → adjust → confirm flow:
| State | Description |
|---|---|
| RequestingPermission | Requesting camera permission |
| Capturing | Camera viewfinder active |
| Processing | ML inference running |
| Adjusting | Arrow adjustment UI shown |
| Error | Error state with retry option |
Target Face Rendering
The adjustment UI renders a standard 10-ring target face:
| Ring | Score | Color |
|---|---|---|
| 1-2 | 1-2 | White |
| 3-4 | 3-4 | Black |
| 5-6 | 5-6 | Blue (#00B4D8) |
| 7-8 | 7-8 | Red |
| 9-10 | 9-10 | Gold (#FFD700) |
| X | 10 (X) | Gold (#FFD700) |
Colors match DomainColor from the existing scoring system to ensure visual consistency.
User Interactions
| Gesture | Action | Result |
|---|---|---|
| Long-press + drag | Move arrow | Arrow position updates, score recalculates |
| Drag off edge | Remove arrow | Arrow removed from count |
| Tap (placement mode) | Add arrow | New arrow placed at tap location |
| Tap “Place” button | Enter placement mode | Target turns green, tap to add |
| Tap “Confirm” | Submit scores | Scores applied to current end |
| Tap “Cancel” | Exit flow | Return to scoring screen |
Confidence Indicators
Arrow marker colors indicate detection confidence:
| Confidence | Color |
|---|---|
| ≥ 70% | Red |
| 50-69% | Orange |
| < 50% | Deep Orange |
| Manual placement | Green |
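The mapping above is a straightforward threshold chain. A minimal sketch (the enum and function names are illustrative, not the actual UI code):

```kotlin
// Sketch: pick a marker color from detection confidence; manually placed
// arrows are always green regardless of confidence.

enum class MarkerColor { RED, ORANGE, DEEP_ORANGE, GREEN }

fun markerColor(confidence: Float, isManual: Boolean): MarkerColor = when {
    isManual            -> MarkerColor.GREEN
    confidence >= 0.70f -> MarkerColor.RED
    confidence >= 0.50f -> MarkerColor.ORANGE
    else                -> MarkerColor.DEEP_ORANGE
}
```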
Integration Points
The camera scoring flow integrates with existing scoring screens:
| Screen | Integration |
|---|---|
| ActiveScoringScreen (Android) | Camera button in score input section |
| TargetScoreInput (Android) | Camera button in visual scoring header |
| ScoringView (iOS) | Camera tab in scoring mode selector |
| ActiveScoringView (iOS) | Camera button for legacy view |
Key Implementation Patterns
Drag Gesture Fix (Critical Discovery)
Both platforms encountered a bug where dragging arrows caused them to snap back to their original positions. The root cause was closure capture of stale arrow positions.
| Platform | Problem | Solution |
|---|---|---|
| Android | detectDragGestures callback captures arrow position at render time | Delta-based updates: pass (dx, dy) to ViewModel which reads current state |
| iOS | DragGesture translation added to already-updated position | Track drag start position in @State, calculate startPos + translation |
See platform guides for detailed implementation.
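The essence of the Android fix can be sketched without any Compose dependencies: the gesture callback forwards only the per-event delta, and the state holder applies it to whatever the *current* position is, so a position captured at composition time is never reused. All names below are illustrative:

```kotlin
// Sketch of the delta-based fix: moveBy() reads the latest position from
// state instead of trusting a position captured in a gesture closure.

data class ArrowPosition(val x: Float, val y: Float)

class AdjustmentState {
    private val arrows = mutableMapOf<Int, ArrowPosition>()

    fun place(id: Int, position: ArrowPosition) { arrows[id] = position }

    /** Called from the drag callback with the per-event delta only. */
    fun moveBy(id: Int, dx: Float, dy: Float) {
        val current = arrows[id] ?: return    // latest state, not a stale copy
        arrows[id] = ArrowPosition(current.x + dx, current.y + dy)
    }

    fun positionOf(id: Int): ArrowPosition? = arrows[id]
}
```

Because each delta compounds on the freshest state, successive drag events accumulate correctly instead of snapping back to the render-time position.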
Letterbox Preprocessing (Android)
YOLO models expect square 640×640 input. Letterboxing preserves aspect ratio by:
- Scaling image to fit within 640×640
- Centering on gray (114, 114, 114) canvas
- Storing scale/offset for coordinate back-transformation
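The scale/offset bookkeeping in the steps above can be sketched as pure math, independent of the Bitmap pipeline (the `Letterbox` type and function names are illustrative):

```kotlin
// Sketch: compute letterbox scale and offsets for a square model input,
// and invert them to map model-space detections back to source pixels.

data class Letterbox(val scale: Float, val offsetX: Float, val offsetY: Float)

fun letterboxFor(srcW: Int, srcH: Int, target: Int = 640): Letterbox {
    // Uniform scale so the whole image fits inside target × target
    val scale = minOf(target.toFloat() / srcW, target.toFloat() / srcH)
    val scaledW = srcW * scale
    val scaledH = srcH * scale
    // Remaining space becomes gray (114, 114, 114) padding, split evenly
    return Letterbox(scale, (target - scaledW) / 2f, (target - scaledH) / 2f)
}

/** Map a model-space coordinate back to source-image pixels. */
fun backTransform(modelX: Float, modelY: Float, lb: Letterbox): Pair<Float, Float> =
    Pair((modelX - lb.offsetX) / lb.scale, (modelY - lb.offsetY) / lb.scale)
```

For a 1280×720 photo the scale is 0.5, the 640×360 result sits 140 px from the top of the canvas, and `backTransform` undoes exactly that before coordinates are normalized against the target face.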
Manual Entry Fallback
When ML detection fails, users can enter scores manually via an “Enter Manually” button on the error screen. This initializes the adjustment UI with zero arrows in placement mode.
Known Issues
TFLite 16KB Page Size (Android)
Issue: TensorFlow Lite 2.14.0 libraries are incompatible with Android 15+ devices that use 16KB page alignment.
Impact: App may crash on affected devices when loading the ML model.
Status: Upstream issue - awaiting TensorFlow release with compatible libraries.
Workaround: Feature flag to disable camera scoring on affected devices (future work).
Database Migration
Camera scoring settings are stored in the app settings:
Migration 36 → 37:
| Field | Type | Default | Description |
|---|---|---|---|
| cameraScoreEnabled | Boolean | true | Enable camera scoring feature |
| saveArrowImages | Boolean | false | Persist scanned images |
| arrowImageStorageUri | String? | null | Storage location for images |
Detailed Guides
- Android Implementation - TFLite integration, CameraX, Compose UI
- iOS Implementation - CoreML/Vision, SwiftUI components
Files Added
Android
| File | Lines | Purpose |
|---|---|---|
| domain/camera/ArrowDetectionResult.kt | 214 | Detection data models |
| domain/camera/ArrowDetectionService.kt | 385 | TFLite inference wrapper |
| ui/camera/ArrowAdjustmentScreen.kt | 478 | Draggable arrows UI |
| ui/camera/ArrowAdjustmentViewModel.kt | 257 | Adjustment logic |
| ui/camera/CameraScoringFlow.kt | 240 | Flow coordinator |
iOS
| File | Lines | Purpose |
|---|---|---|
| Services/Camera/ArrowDetectionService.swift | 533 | CoreML model wrapper |
| Views/Camera/CameraScoringAdjustmentView.swift | 526 | Adjustment UI |
| Views/Camera/CameraScoringFlow.swift | 287 | Flow coordinator |
Last Updated: 2025-12-04 PR: #354