Sollertia

Measuring Fine Motor Control in XR

Anaya Yorke & Garret S. Stand

GitHub Research Project

Abstract

Sollertia is a system for measuring fine motor control in XR using task-based interaction and wearable force input. The system replicates the same button-pressing task in both physical and XR conditions, enabling direct comparison of motor performance across environments. By combining behavioral metrics (reaction time, spatial accuracy, movement trajectory) with continuous force data from finger-mounted sensors, Sollertia captures how people move, react, and apply pressure during interaction. The goal is to evaluate whether XR interaction can capture meaningful fine motor behavior comparable to physical tasks.

Research Questions

  1. Can XR-based motor tasks capture meaningful fine motor behavior?
  2. How does motor performance in XR compare to equivalent physical tasks?
  3. What role does force modulation play in characterizing motor control quality?

Background

Traditional motor assessments provide limited quantitative insight, often yielding only summary scores that obscure the temporal dynamics of movement. XR environments offer the potential for precise, objective measurement of motor behavior, but validation against physical task performance is needed.

A fundamental question remains: do motor behaviors observed in XR accurately reflect real-world motor capabilities? Sollertia addresses this by replicating the same task in both physical and XR conditions.

Theoretical Foundation

Fitts' Law provides a foundational model for understanding speed-accuracy tradeoffs in aimed movements:

$$MT = a + b \cdot ID$$

where $ID = \log_2(2D/W)$ is the index of difficulty, $D$ is the movement distance to the target, and $W$ is the target width.
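
As a sketch, the model can be expressed in Python. The intercept `a` and slope `b` below are illustrative placeholders, not values from Sollertia; in practice they are fit per participant by linear regression over many trials.

```python
import math

def index_of_difficulty(distance_mm: float, width_mm: float) -> float:
    """Index of difficulty in bits: ID = log2(2D / W)."""
    return math.log2(2.0 * distance_mm / width_mm)

def predicted_movement_time_ms(distance_mm: float, width_mm: float,
                               a: float = 100.0, b: float = 150.0) -> float:
    """Fitts' Law prediction MT = a + b * ID, with placeholder
    coefficients a (ms) and b (ms/bit)."""
    return a + b * index_of_difficulty(distance_mm, width_mm)
```

Doubling the distance at fixed width adds exactly one bit of difficulty, which is why the model predicts a linear increase in movement time as targets get farther or smaller.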

| Principle | Application in Sollertia |
|---|---|
| Fitts' Law | Target size and distance parameters follow established speed-accuracy tradeoffs |
| Motor Learning Theory | Repeated trials enable assessment of skill acquisition and variability reduction |
| Force Control Models | FSR data captures grip force modulation and stability |

System Architecture

Sollertia comprises three integrated components:

                                   +------------------+
                                   |   Meta Quest 3   |
                                   |  XR Application  |
                                   +--------+---------+
                                            |
                                            | WebSocket
                                            |
+------------------+              +---------+---------+
|     Wearable     |              |     Clinical      |
|     Hardware     |------------->|     Dashboard     |
|   (FSR Glove)    |  USB Serial  |      (Rust)       |
+------------------+              +-------------------+

XR Application

Unity-based application for Meta Quest 3 with hand tracking. Presents discrete pointing tasks with millisecond-precision event logging.

Wearable Hardware

Finger-mounted FSR sensors (index and middle fingers) capture continuous pressure data at 10 Hz via Arduino.

Dashboard

Rust-based application for real-time visualization, session management, and data export.

Hardware Specifications

| Component | Details |
|---|---|
| Microcontroller | ELEGOO UNO R3 (Arduino-compatible) |
| Sensors | 2x FSR402 on index and middle fingers |
| Sampling Rate | 10 Hz |
| Communication | USB Serial (9600 baud) |
| Resolution | 10-bit ADC (0-1023) |
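
A minimal host-side Python sketch of working with these specifications. The comma-separated `index,middle` serial line format is an assumption for illustration, not the firmware's documented protocol, and converting raw counts to newtons additionally requires a per-sensor FSR calibration curve, which is omitted here.

```python
def parse_sample(line: str) -> tuple[int, int]:
    """Parse one serial line (hypothetical 'index,middle' format) into
    raw 10-bit ADC counts for the index and middle fingers."""
    index_raw, middle_raw = (int(v) for v in line.strip().split(","))
    for raw in (index_raw, middle_raw):
        if not 0 <= raw <= 1023:
            raise ValueError(f"out of 10-bit ADC range: {raw}")
    return index_raw, middle_raw

def adc_to_volts(raw: int, v_ref: float = 5.0) -> float:
    """Convert a 10-bit ADC count (0-1023) to volts on the UNO's 5 V
    reference. Force in newtons requires an FSR calibration step."""
    return raw * v_ref / 1023.0
```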

Measurement Framework

Behavioral Metrics

| Metric | Description | Unit |
|---|---|---|
| Reaction Time | Interval from target onset to movement initiation | ms |
| Movement Time | Duration from movement start to target contact | ms |
| Spatial Error | Euclidean distance from fingertip to target center | mm |
| Movement Trajectory | 3D path of hand/finger during reach | Vector3[] |
| Trial Variability | Standard deviation of metrics across trials | varies |
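
The first three metrics fall directly out of the logged timestamps and positions. A minimal Python sketch (function and field names are illustrative, not Sollertia's API):

```python
import math

def behavioral_metrics(stimulus_ms: float, onset_ms: float, contact_ms: float,
                       contact_pos: tuple[float, float, float],
                       target_pos: tuple[float, float, float]) -> dict[str, float]:
    """Per-trial behavioral metrics from logged trial events."""
    return {
        "reaction_time_ms": onset_ms - stimulus_ms,   # target onset -> movement start
        "movement_time_ms": contact_ms - onset_ms,    # movement start -> contact
        "spatial_error_mm": math.dist(contact_pos, target_pos),  # fingertip vs. center
    }
```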

Force Metrics

| Metric | Description | Unit |
|---|---|---|
| Peak Force | Maximum pressure during press | N |
| Force Onset Time | Time from contact to force threshold | ms |
| Force Duration | Time force exceeds threshold | ms |
| Force Variability | Coefficient of variation across trials | % |
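
A Python sketch of these force metrics, assuming a per-trial force trace already converted to newtons. Note that at the 10 Hz sampling rate, onset and duration estimates are quantized to 100 ms sample periods.

```python
import statistics

def force_metrics(samples_n: list[float], threshold_n: float = 1.5,
                  sample_rate_hz: float = 10.0) -> dict[str, float]:
    """Per-trial force metrics from a sampled force trace.
    Onset is approximated by the first sample at or above threshold."""
    period_ms = 1000.0 / sample_rate_hz
    above = [i for i, s in enumerate(samples_n) if s >= threshold_n]
    return {
        "peak_force_n": max(samples_n),
        "force_onset_ms": above[0] * period_ms if above else float("nan"),
        "force_duration_ms": len(above) * period_ms,
    }

def force_cv_percent(peak_forces_n: list[float]) -> float:
    """Force variability: coefficient of variation of peak force across trials."""
    return statistics.pstdev(peak_forces_n) / statistics.mean(peak_forces_n) * 100.0
```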

Derived Metrics

$$\text{Throughput} = \frac{ID}{MT}$$
$$\text{Accuracy Index} = 1 - \frac{\text{error}}{\text{target radius}}$$
$$\text{Force Stability} = \frac{1}{CV_{\text{force}}}$$
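
The three derived metrics translate directly into Python. The unit conventions below (ms for MT, mm for error and radius) are assumptions for illustration:

```python
def throughput_bits_per_s(id_bits: float, mt_ms: float) -> float:
    """Throughput = ID / MT, in bits per second."""
    return id_bits / (mt_ms / 1000.0)

def accuracy_index(error_mm: float, target_radius_mm: float) -> float:
    """Accuracy Index = 1 - error / target radius (1.0 = dead-center hit)."""
    return 1.0 - error_mm / target_radius_mm

def force_stability(cv_force: float) -> float:
    """Force Stability = 1 / CV_force (higher = steadier pressing)."""
    return 1.0 / cv_force
```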

Methods

Task Protocol

The button-pressing task follows a discrete pointing paradigm:

  1. Fixation (500 ms): Brief pause before target onset
  2. Target Onset: Visual target illuminates at randomized location
  3. Reach: Participant moves hand toward target
  4. Contact: Fingertip makes contact with target surface
  5. Press: Force applied until threshold reached
  6. Feedback: Visual/auditory confirmation of successful press
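
The protocol above is naturally a linear state machine. A minimal Python sketch (phase names and transition logic are illustrative, not the XR application's implementation):

```python
from enum import Enum

class TrialPhase(Enum):
    FIXATION = 1
    TARGET_ONSET = 2
    REACH = 3
    CONTACT = 4
    PRESS = 5
    FEEDBACK = 6

# Each phase advances to the next; FEEDBACK ends the trial.
_NEXT = {
    TrialPhase.FIXATION: TrialPhase.TARGET_ONSET,
    TrialPhase.TARGET_ONSET: TrialPhase.REACH,
    TrialPhase.REACH: TrialPhase.CONTACT,
    TrialPhase.CONTACT: TrialPhase.PRESS,
    TrialPhase.PRESS: TrialPhase.FEEDBACK,
}

def advance(phase: TrialPhase) -> TrialPhase:
    """Return the next trial phase; FEEDBACK is terminal."""
    return _NEXT.get(phase, TrialPhase.FEEDBACK)
```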

Task Parameters

| Parameter | Default | Range |
|---|---|---|
| Session Duration | 45 s | 30-120 s |
| Target Count | 9 | 4-16 |
| Target Diameter | 40 mm | 20-60 mm |
| Force Threshold | 1.5 N | 0.5-3.0 N |

Data Collection

Each trial generates a synchronized data record:

Trial Record
├── trial_id: uint32
├── target_id: uint8
├── stimulus_time: timestamp_ms
├── movement_onset_time: timestamp_ms
├── contact_time: timestamp_ms
├── press_time: timestamp_ms
├── release_time: timestamp_ms
├── contact_position: Vector3
├── target_position: Vector3
├── trajectory: Vector3[]
└── force_signal: uint16[] (10 Hz)
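
The record maps naturally onto a Python dataclass. This mirror of the schema is only illustrative; Python has no fixed-width unsigned integer types, so the intended widths are noted in comments.

```python
from dataclasses import dataclass, field

Vector3 = tuple[float, float, float]

@dataclass
class TrialRecord:
    trial_id: int                  # uint32
    target_id: int                 # uint8
    stimulus_time: int             # timestamp, ms
    movement_onset_time: int       # timestamp, ms
    contact_time: int              # timestamp, ms
    press_time: int                # timestamp, ms
    release_time: int              # timestamp, ms
    contact_position: Vector3
    target_position: Vector3
    trajectory: list[Vector3] = field(default_factory=list)
    force_signal: list[int] = field(default_factory=list)  # raw ADC counts at 10 Hz
```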

Analysis Pipeline

  1. Preprocessing: Timestamp alignment, outlier removal, signal filtering
  2. Feature Extraction: Compute behavioral and force metrics per trial
  3. Aggregation: Session-level statistics (mean, SD, trends)
  4. Comparison: Physical vs. XR condition analysis
  5. Visualization: Performance dashboards and trajectory plots
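
Steps 1 and 3 of the pipeline can be sketched in a few lines of Python. The outlier cutoffs below are illustrative placeholders (anticipations and lapses by reaction time), not Sollertia's actual criteria:

```python
import statistics

def remove_outliers(reaction_times_ms: list[float],
                    low_ms: float = 100.0, high_ms: float = 1000.0) -> list[float]:
    """Preprocessing sketch (step 1): drop anticipations (< low_ms)
    and lapses (> high_ms). Cutoffs are placeholder values."""
    return [rt for rt in reaction_times_ms if low_ms <= rt <= high_ms]

def aggregate(per_trial_values: list[float]) -> dict[str, float]:
    """Aggregation sketch (step 3): session-level mean and sample SD."""
    sd = statistics.stdev(per_trial_values) if len(per_trial_values) > 1 else 0.0
    return {"mean": statistics.mean(per_trial_values), "sd": sd}
```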

References

[1] Lu, X., et al. (2019). Modeling Endpoint Distribution of Pointing Selection Tasks in Virtual Reality Environments.
[2] Chen, D., et al. (2024). Metrics of Motor Learning for Analyzing Movement Mapping in Virtual Reality.
[3] Schoen, A., et al. (2025). From Pegs to Pixels: A Comparative Analysis of the Nine Hole Peg Test and a Digital Copy Drawing Test for Fine Motor Control Assessment.
[4] Wei, X., et al. (2019). Accurate and Low-Latency Sensing of Touch Contact on Any Surface with Finger-Worn IMU Sensor.
[5] Schneider, O., et al. (2021). Accuracy Evaluation of Touch Tasks in Commodity Virtual and Augmented Reality Head-Mounted Displays.
[6] Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47(6), 381-391.