

Camera-Based Arrow Scoring

Purpose: Document the ML-powered camera scoring feature that automatically detects arrow positions in target photos

PR: #354

Platforms: Android and iOS

Overview

Camera-based arrow scoring allows users to photograph their target and have arrow positions automatically detected using machine learning. The feature uses a YOLOv8s model trained on archery target images to identify arrow locations, then provides an adjustment UI where users can fine-tune positions before confirming scores.

Feature Flow

┌─────────────┐     ┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Capture   │ ──► │   Detect    │ ──► │   Adjust    │ ──► │   Submit    │
│   Photo     │     │   Arrows    │     │   Positions │     │   Scores    │
└─────────────┘     └─────────────┘     └─────────────┘     └─────────────┘
     │                    │                   │                   │
     ▼                    ▼                   ▼                   ▼
 CameraX (Android)   TFLite/CoreML      Draggable UI      Score applied
 UIImagePicker (iOS) YOLOv8s inference  Arrow markers     to current end

Architecture

ML Model

| Property | Value |
|---|---|
| Architecture | YOLOv8s |
| Input Size | 640×640 pixels |
| Output Format | [1, 6, 8400] |
| Detection Classes | 1 (arrow) |
| Confidence Threshold | 0.35 |
| IoU Threshold | 0.50 |

Output Tensor Layout:

  • Indices 0-3: x, y, width, height (center-based coordinates)
  • Index 4: confidence score
  • Index 5: class ID (always 0 for arrow)
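The layout above can be decoded with a short, platform-agnostic sketch. This is illustrative Python (the real decoders live in the platform `ArrowDetectionService` files); the function name and result shape are hypothetical, but the tensor indexing follows the layout documented above.

```python
import numpy as np

CONF_THRESHOLD = 0.35  # matches the model's confidence threshold above

def decode_detections(output: np.ndarray) -> list[dict]:
    """Decode a [1, 6, 8400] YOLOv8 output tensor into candidate boxes.

    Rows 0-3 are center-based x, y, width, height; row 4 is the
    confidence score; row 5 is the class ID (always 0 for "arrow").
    """
    preds = output[0]                      # -> [6, 8400]
    confidences = preds[4]                 # one score per candidate column
    keep = confidences >= CONF_THRESHOLD   # drop low-confidence candidates
    boxes = []
    for i in np.flatnonzero(keep):
        boxes.append({
            "x": float(preds[0, i]), "y": float(preds[1, i]),
            "w": float(preds[2, i]), "h": float(preds[3, i]),
            "confidence": float(confidences[i]),
            "class_id": int(preds[5, i]),
        })
    return boxes
```

The surviving candidates still overlap heavily and must be de-duplicated with NMS before use.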

Platform Implementations

| Component | Android | iOS |
|---|---|---|
| ML Runtime | TensorFlow Lite | CoreML + Vision |
| Model File | arrow_detector.tflite | ArrowDetector.mlpackage |
| Camera | CameraX | UIImagePickerController |
| Image Source | Camera only | Camera or Photo Library |

Coordinate System

The feature uses a normalized coordinate system for arrow positions:

| Property | Range | Description |
|---|---|---|
| X-axis | -1.0 to +1.0 | Left edge to right edge |
| Y-axis | -1.0 to +1.0 | Top edge to bottom edge |
| Center | (0, 0) | Center of target face |
| Edge | distance = 1.0 | Edge of scoring area |
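Mapping a detection from image pixels into this coordinate system is a simple translation and scale. The sketch below is illustrative Python; `center_x`, `center_y`, and `radius` (the detected target face in image pixels) are hypothetical parameter names, not the app's actual API.

```python
def to_normalized(px: float, py: float,
                  center_x: float, center_y: float,
                  radius: float) -> tuple[float, float]:
    """Map a pixel position into the normalized target coordinate system,
    where (0, 0) is the target center and distance 1.0 is the scoring edge.
    """
    return (px - center_x) / radius, (py - center_y) / radius
```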

Score Calculation

Scores are calculated based on normalized distance from center:

| Distance | Score |
|---|---|
| ≤ 0.05 | X-ring (10) |
| ≤ 0.10 | 10 |
| ≤ 0.20 | 9 |
| ≤ 0.30 | 8 |
| ≤ 0.40 | 7 |
| ≤ 0.50 | 6 |
| ≤ 0.60 | 5 |
| ≤ 0.70 | 4 |
| ≤ 0.80 | 3 |
| ≤ 0.90 | 2 |
| ≤ 1.00 | 1 |
| > 1.00 | Miss (0) |
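The table above translates directly into a lookup over distance bands. This is an illustrative Python sketch of that logic (function and constant names are hypothetical), returning both the numeric score and the X-ring flag used for the X vs. 10 distinction:

```python
import math

# (upper distance bound, score) pairs from the table above
SCORE_BANDS = [
    (0.10, 10), (0.20, 9), (0.30, 8), (0.40, 7), (0.50, 6),
    (0.60, 5), (0.70, 4), (0.80, 3), (0.90, 2), (1.00, 1),
]
X_RING_RADIUS = 0.05

def score_arrow(x: float, y: float) -> tuple[int, bool]:
    """Return (score, is_x) for a normalized arrow position."""
    distance = math.hypot(x, y)
    if distance <= X_RING_RADIUS:
        return 10, True                 # X-ring: scores 10, flagged as X
    for upper, score in SCORE_BANDS:
        if distance <= upper:
            return score, False
    return 0, False                     # beyond the scoring area: miss
```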

Key Components

Detection Service

Handles ML inference and post-processing:

| Android | iOS |
|---|---|
| ArrowDetectionService.kt | ArrowDetectionService.swift |
| TFLite Interpreter | VNCoreMLModel |
| ByteBuffer preprocessing | CGImage processing |
| Manual NMS implementation | Vision framework NMS |
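On Android, non-maximum suppression has to be implemented by hand (the Vision framework handles it on iOS). A minimal greedy NMS, sketched in Python for illustration rather than copied from the Kotlin implementation, looks like this:

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2) corners."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_threshold=0.50):
    """Greedy NMS: keep the highest-scoring box, drop overlapping ones."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order
                 if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep
```

The 0.50 default matches the IoU threshold in the model table above.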

Adjustment UI

Interactive arrow position adjustment:

| Feature | Description |
|---|---|
| Draggable markers | Long-press + drag to move arrows |
| Remove arrows | Drag off target edge |
| Add arrows | Tap in placement mode |
| Score preview | Real-time score calculation |
| X-ring detection | Automatic X vs 10 distinction |

Flow Coordinator

Orchestrates the complete capture → detect → adjust → confirm flow:

| State | Description |
|---|---|
| RequestingPermission | Requesting camera permission |
| Capturing | Camera viewfinder active |
| Processing | ML inference running |
| Adjusting | Arrow adjustment UI shown |
| Error | Error state with retry option |
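The states above form a small state machine. The transition map below is an illustrative Python sketch inferred from the flow description and the manual-entry fallback, not a transcription of the real coordinators in CameraScoringFlow.kt/.swift:

```python
from enum import Enum, auto

class FlowState(Enum):
    REQUESTING_PERMISSION = auto()
    CAPTURING = auto()
    PROCESSING = auto()
    ADJUSTING = auto()
    ERROR = auto()

# Allowed transitions in the capture -> detect -> adjust flow
TRANSITIONS = {
    FlowState.REQUESTING_PERMISSION: {FlowState.CAPTURING, FlowState.ERROR},
    FlowState.CAPTURING: {FlowState.PROCESSING, FlowState.ERROR},
    FlowState.PROCESSING: {FlowState.ADJUSTING, FlowState.ERROR},
    FlowState.ADJUSTING: set(),  # flow ends with confirm or cancel
    # retry returns to capture; "Enter Manually" jumps to adjustment
    FlowState.ERROR: {FlowState.CAPTURING, FlowState.ADJUSTING},
}

def can_transition(src: FlowState, dst: FlowState) -> bool:
    return dst in TRANSITIONS[src]
```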

Target Face Rendering

The adjustment UI renders a standard 10-ring target face:

| Ring | Score | Color |
|---|---|---|
| 1-2 | 1-2 | White |
| 3-4 | 3-4 | Black |
| 5-6 | 5-6 | Blue (#00B4D8) |
| 7-8 | 7-8 | Red |
| 9-10 | 9-10 | Gold (#FFD700) |
| X | 10 (X) | Gold (#FFD700) |

Colors match DomainColor from the existing scoring system to ensure visual consistency.


User Interactions

| Gesture | Action | Result |
|---|---|---|
| Long-press + drag | Move arrow | Arrow position updates, score recalculates |
| Drag off edge | Remove arrow | Arrow removed from count |
| Tap (placement mode) | Add arrow | New arrow placed at tap location |
| Tap “Place” button | Enter placement mode | Target turns green, tap to add |
| Tap “Confirm” | Submit scores | Scores applied to current end |
| Tap “Cancel” | Exit flow | Return to scoring screen |

Confidence Indicators

Arrow marker colors indicate detection confidence:

| Confidence | Color |
|---|---|
| ≥ 70% | Red |
| 50-69% | Orange |
| < 50% | Deep Orange |
| Manual placement | Green |
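The mapping is a straightforward threshold check. In this illustrative Python sketch (function name and color strings are hypothetical), a `None` confidence marks a manually placed arrow:

```python
def marker_color(confidence):
    """Pick a marker color; `confidence` is None for manual placement."""
    if confidence is None:
        return "green"          # manually placed arrow
    if confidence >= 0.70:
        return "red"
    if confidence >= 0.50:
        return "orange"
    return "deep_orange"
```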

Integration Points

The camera scoring flow integrates with existing scoring screens:

| Screen | Integration |
|---|---|
| ActiveScoringScreen (Android) | Camera button in score input section |
| TargetScoreInput (Android) | Camera button in visual scoring header |
| ScoringView (iOS) | Camera tab in scoring mode selector |
| ActiveScoringView (iOS) | Camera button for legacy view |

Key Implementation Patterns

Drag Gesture Fix (Critical Discovery)

Both platforms encountered a bug where dragging arrows caused them to snap back to their original positions. The root cause was closure capture of stale arrow positions.

| Platform | Problem | Solution |
|---|---|---|
| Android | detectDragGestures callback captures arrow position at render time | Delta-based updates: pass (dx, dy) to ViewModel, which reads current state |
| iOS | DragGesture translation added to already-updated position | Track drag start position in @State, calculate startPos + translation |

See platform guides for detailed implementation.
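The essence of the delta-based fix can be shown in a platform-agnostic Python sketch (class and method names are hypothetical): the gesture layer forwards only the movement delta, and the model applies it to the position it holds *now*, so nothing stale captured at render time can snap the arrow back.

```python
class ArrowAdjustmentModel:
    """Sketch of the delta-based drag fix: the gesture callback forwards
    only (dx, dy); the model reads the *current* position when applying
    the delta, rather than a position captured in a render-time closure.
    """
    def __init__(self, arrows):
        self.arrows = dict(arrows)  # arrow id -> normalized (x, y)

    def on_drag(self, arrow_id, dx, dy):
        x, y = self.arrows[arrow_id]          # read current state here
        self.arrows[arrow_id] = (x + dx, y + dy)
```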

Letterbox Preprocessing (Android)

YOLO models expect square 640×640 input. Letterboxing preserves aspect ratio by:

  1. Scaling image to fit within 640×640
  2. Centering on gray (114, 114, 114) canvas
  3. Storing scale/offset for coordinate back-transformation
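The scale/offset math from the steps above can be sketched as follows (illustrative Python; function names are hypothetical, and the actual Android code also draws onto the gray canvas):

```python
def letterbox_params(src_w: int, src_h: int, dst: int = 640):
    """Compute the scale and padding used to letterbox an image into a
    dst x dst square; both are needed to map detections back later.
    """
    scale = min(dst / src_w, dst / src_h)      # fit inside the square
    new_w, new_h = round(src_w * scale), round(src_h * scale)
    pad_x = (dst - new_w) / 2                  # gray padding per side
    pad_y = (dst - new_h) / 2
    return scale, pad_x, pad_y

def to_source_coords(x: float, y: float, scale: float,
                     pad_x: float, pad_y: float) -> tuple[float, float]:
    """Map a model-space (640x640) coordinate back to the source image."""
    return (x - pad_x) / scale, (y - pad_y) / scale
```

For example, a 1280×720 photo scales by 0.5 to 640×360 and is padded by 140 px above and below.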

Manual Entry Fallback

When ML detection fails, users can enter scores manually via an “Enter Manually” button on the error screen. This initializes the adjustment UI with zero arrows in placement mode.


Known Issues

TFLite 16KB Page Size (Android)

Issue: TensorFlow Lite 2.14.0 libraries are incompatible with Android 15+ devices that use 16KB page alignment.

Impact: App may crash on affected devices when loading the ML model.

Status: Upstream issue - awaiting TensorFlow release with compatible libraries.

Workaround: Feature flag to disable camera scoring on affected devices (future work).


Database Migration

Camera scoring settings are stored in the app settings:

Migration 36 → 37:

| Field | Type | Default | Description |
|---|---|---|---|
| cameraScoreEnabled | Boolean | true | Enable camera scoring feature |
| saveArrowImages | Boolean | false | Persist scanned images |
| arrowImageStorageUri | String? | null | Storage location for images |

Detailed Guides


Files Added

Android

| File | Lines | Purpose |
|---|---|---|
| domain/camera/ArrowDetectionResult.kt | 214 | Detection data models |
| domain/camera/ArrowDetectionService.kt | 385 | TFLite inference wrapper |
| ui/camera/ArrowAdjustmentScreen.kt | 478 | Draggable arrows UI |
| ui/camera/ArrowAdjustmentViewModel.kt | 257 | Adjustment logic |
| ui/camera/CameraScoringFlow.kt | 240 | Flow coordinator |

iOS

| File | Lines | Purpose |
|---|---|---|
| Services/Camera/ArrowDetectionService.swift | 533 | CoreML model wrapper |
| Views/Camera/CameraScoringAdjustmentView.swift | 526 | Adjustment UI |
| Views/Camera/CameraScoringFlow.swift | 287 | Flow coordinator |

Last Updated: 2025-12-04 PR: #354