3. Telemetry Collection
Controlling high-resolution kinematic capture.
Starting and Stopping Capture
While the Nx10 SDK requires zero manual instrumentation of your UI, you control whether telemetry capture starts automatically when you initialise the library. If you have set autoStartTelemetry to false, you must call the methods below to begin capture.
This gives you full control over when the LFM receives data. We recommend capturing telemetry across the entire app experience to build the most accurate baseline, but you may wish to disable it for certain areas or actions, especially if you believe they would skew the overall shape of your data.
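To defer capture, set autoStartTelemetry to false when you initialise the library. The sketch below is illustrative only: the initialiser name and its parameters are assumptions, not the SDK's confirmed API.

```swift
import NX10CoreSDK

// Hypothetical initialisation sketch. The `initialise` call and both
// parameter names are assumed for illustration; only the
// `autoStartTelemetry` flag is named in this documentation.
NX10Core.shared.initialise(
    apiKey: "YOUR_API_KEY",      // placeholder credential
    autoStartTelemetry: false    // capture waits until startTelemetry() is called
)
```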
Zero Instrumentation
Once you call startTelemetry(), or initialise with capture automatically enabled, the SDK uses method swizzling to automatically hook into the main UIWindow event loop. You do not need to subclass your UIView or UIButton elements. It happens entirely in the background.
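For readers unfamiliar with the technique, method swizzling exchanges a method's implementation at runtime via the Objective-C runtime. The sketch below shows the general pattern for intercepting UIApplication.sendEvent(_:); it is a conceptual illustration of swizzling, not the SDK's internal code.

```swift
import UIKit

extension UIApplication {
    // Illustrative sketch of method swizzling, not the SDK's internals.
    static func installEventHook() {
        guard
            let original = class_getInstanceMethod(
                UIApplication.self, #selector(UIApplication.sendEvent(_:))),
            let hook = class_getInstanceMethod(
                UIApplication.self, #selector(UIApplication.hooked_sendEvent(_:)))
        else { return }
        // Swap implementations so every UI event passes through the hook first.
        method_exchangeImplementations(original, hook)
    }

    @objc private func hooked_sendEvent(_ event: UIEvent) {
        // A telemetry SDK would record the event's touches here.
        // After the swap, this call invokes the original sendEvent(_:).
        hooked_sendEvent(event)
    }
}
```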
The API Surface
Call these methods in your view controllers or app lifecycle managers to control the data flow.
```swift
import SwiftUI
import NX10CoreSDK

struct ContentView: View {
    var body: some View {
        VStack {
            // Your View code here
        }
        .onAppear {
            Task {
                do {
                    try await NX10Core.shared.startTelemetry()
                } catch {
                    // Handle or log the failure; telemetry did not start.
                    print("Failed to start telemetry: \(error)")
                }
            }
        }
    }
}
```

You may wish to stop telemetry recording when the app moves into a background state, for example.
```swift
import SwiftUI
import NX10CoreSDK

struct ContentView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        VStack {
            Text("Your App Content")
        }
        .onChange(of: scenePhase) { newPhase in
            if newPhase == .background {
                // Stop telemetry when the app goes to the background
                NX10Core.shared.telemetryService?.stopTelemetry()
            }
        }
    }
}
```

Manual Telemetry (Custom Events)
While the SDK handles standard UIKit and SwiftUI interactions automatically via method swizzling, you may need to log telemetry manually, for example if you are using highly customised canvas views that intercept OS touches, or if you want higher-fidelity data than the default swizzling can provide. The SDK provides features for this.
Tracking Key Presses
You can log individual keystrokes.
```swift
// Log individual characters or interactions
NX10Core.shared.telemetryService?.keyPressed("a")
NX10Core.shared.telemetryService?.keyPressed(" ")
NX10Core.shared.telemetryService?.keyPressed("b")
```

Tracking Touch Data
You can log detailed touch paths manually by providing the exact coordinates for where a touch began, where it moved, and where it ended. This is exceptionally useful for custom gesture recognizers or game-like interfaces.
```swift
// Example: Tracking a simple tap
NX10Core.shared.telemetryService?.appendTouch(at: (
    began: CGPoint(x: 150, y: 200),
    movedTo: nil,
    endedAt: nil
))

// Example: Tracking a swipe/drag
NX10Core.shared.telemetryService?.appendTouch(at: (
    began: CGPoint(x: 100, y: 200),
    movedTo: CGPoint(x: 150, y: 250),
    endedAt: CGPoint(x: 200, y: 300)
))
```

What does the SDK capture?
When telemetry is active, the SDK seamlessly collects:
- General Touch Events: Down, Move, Up events with highly accurate X/Y coordinates, force (3D Touch/Haptic Touch if available), and swipe velocity.
- Device Kinematics: Continuous CMMotionManager readings (rotational velocity of the device via the gyroscope, and linear acceleration via the accelerometer), captured simultaneously with touch events.
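Taken together, a single captured sample combines touch and motion data. The record below is purely an illustrative assumption of how such a sample could be modelled; it is not the SDK's actual schema, and every field name is hypothetical.

```swift
// Hypothetical shape of one telemetry sample; all field names are
// assumptions for illustration, not the SDK's real data model.
struct KinematicSample {
    let x: Double                   // touch X coordinate
    let y: Double                   // touch Y coordinate
    let force: Double?              // 3D Touch / Haptic Touch pressure, if available
    let velocity: Double?           // swipe velocity, if the event is a move
    let rotationRate: (x: Double, y: Double, z: Double)   // gyroscope reading
    let acceleration: (x: Double, y: Double, z: Double)   // accelerometer reading
}
```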
Network & Battery Optimization
High-resolution telemetry generates a massive amount of data. Sending standard JSON objects over REST for every touch frame would instantly drain an iPhone's battery and saturate the network.
The Nx10 SDK solves this by encoding events into highly compressed, fixed-order Tuples. Furthermore, rather than stamping every event with a heavy ISO-8601 string, the SDK generates a single base UTC timestamp for a "capture window" and records all events simply as millisecond offsets from that base time. These compressed batches are sent periodically using URLSession background tasks to ensure a near-zero impact on device resources.
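The batching scheme described above can be sketched in plain Swift. This is a conceptual illustration of the base-timestamp-plus-offset idea, not the SDK's actual wire format; the type and method names are assumptions.

```swift
import Foundation

// Illustrative sketch: one base UTC timestamp per capture window, with each
// event stored as a fixed-order tuple of millisecond offset and coordinates,
// instead of a full ISO-8601 string per event.
struct CaptureWindow {
    let baseTimestamp: Date                              // single base UTC timestamp
    var events: [(offsetMs: Int, x: Double, y: Double)] = []

    mutating func append(x: Double, y: Double, at time: Date) {
        // Record the event as a millisecond offset from the window's base time.
        let offsetMs = Int(time.timeIntervalSince(baseTimestamp) * 1000)
        events.append((offsetMs, x, y))
    }
}

var window = CaptureWindow(baseTimestamp: Date())
window.append(x: 150, y: 200,
              at: window.baseTimestamp.addingTimeInterval(0.016))
// The batch now carries one Date plus compact integer offsets.
```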
