Predictive Forecasts (LFM)
Look into the future: forecast your users' emotional trajectories up to 15 minutes ahead.
Time as a Training Variable
While real-time indices (GBI/BCI) tell you how a user feels now, the true power of the Nx10 ecosystem lies in the Large Feelings Model (LFM). By analyzing real-time kinematic data alongside historical baselines and SAAQ ground-truth labels, the LFM builds a deeply personal cognitive-affective profile for every user.
Because the keyboard travels with the user across all their apps (messaging, email, browsing), the LFM gets an incredibly holistic view of their day. It uses this sequence of data to forecast short-horizon state transitions, typically up to 15 minutes into the future.
The Forecast Data Contract
Probability matrices and trajectory horizons.
When the LFM generates a forecast, it provides a probability distribution across various affective trajectories. This allows your app to intervene before a negative state occurs.
{
  "horizonMinutes": 15,
  "primaryTrajectory": "SessionAbandonment",
  "baselineDeviationScore": 2.4,
  "probabilities": {
    "SustainedFlow": 0.05,
    "CognitiveBurnout": 0.15,
    "Tilt": 0.10,
    "SessionAbandonment": 0.65,
    "Recovery": 0.05
  }
}

Accessing the Data
Both real-time indices and long-horizon sequence predictions are generated by the LFM in the cloud, which keeps the keyboard extension ultra-lightweight. You can consume these predictive forecasts via two routes.
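Whichever route you choose, it is worth validating the payload shape before branching on it. The sketch below checks a forecast against the data contract shown above; the field names come from the contract, but the validation helper itself is illustrative, not part of the Nx10 SDK:

```javascript
// Trajectory labels from the forecast data contract above.
const TRAJECTORIES = [
  'SustainedFlow',
  'CognitiveBurnout',
  'Tilt',
  'SessionAbandonment',
  'Recovery'
];

// Returns true if `payload` matches the forecast data contract.
function isValidForecast(payload) {
  if (!payload || typeof payload !== 'object') return false;
  if (typeof payload.horizonMinutes !== 'number') return false;
  if (!TRAJECTORIES.includes(payload.primaryTrajectory)) return false;

  const probs = payload.probabilities;
  if (!probs || typeof probs !== 'object') return false;

  // Every probability must be a number in [0, 1]...
  const values = Object.values(probs);
  if (!values.every((p) => typeof p === 'number' && p >= 0 && p <= 1)) {
    return false;
  }

  // ...and the distribution should sum to ~1 (allowing for float error).
  const total = values.reduce((a, b) => a + b, 0);
  return Math.abs(total - 1) < 1e-6;
}
```

Rejecting malformed payloads up front means your intervention logic only ever sees a complete probability distribution.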
Route 1: Server-to-Server Webhooks
This is the most powerful way to consume predictive data. If the LFM predicts that a user will enter a highly negative emotional state in the next 15 minutes, it can ping your backend. Your backend can then send a push via the Apple Push Notification service (APNs) or an email to re-engage the user.
app.post('/api/nx10/webhooks', express.json(), (req, res) => {
  const { deviceId, eventType, payload } = req.body;

  // React to predictive LFM forecasts
  if (eventType === 'lfm_forecast_alert') {
    const forecast = payload.lfmForecast;
    const churnRisk = forecast.probabilities['SessionAbandonment'] || 0;

    if (churnRisk > 0.80) {
      console.log(`[WARNING] User ${deviceId} has an 80% chance of quitting.`);

      // Proactively send a push notification with a reward to save the session
      pushNotificationService.send(deviceId, {
        title: "Here's a gift for you!",
        body: "Take 10% off your next trade, valid for the next hour."
      });
    }
  }

  res.status(200).send('Received');
});

Route 2: In-App (Nx10Insights SDK)
If the user is actively using your Host App, you can use the lightweight Nx10Insights Swift SDK to poll the LFM for the latest 15-minute forecast. This is perfect for deciding whether to show an ad or offer a discount right at the moment the user opens your app.
import UIKit
import Nx10Insights

class HostAppViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Fetch the user's latest 15-minute emotional trajectory from the cloud
        Nx10Insights.shared.fetchLatestForecast { [weak self] result in
            switch result {
            case .success(let forecast):
                let churnProb = forecast.probabilities[.sessionAbandonment] ?? 0.0
                if churnProb > 0.75 {
                    print("High churn risk detected. Skipping interstitial ad.")
                    self?.skipAdAndShowContent()
                } else {
                    self?.showStandardAdFlow()
                }
            case .failure(let error):
                print("Failed to fetch forecast: \(error)")
            }
        }
    }
}

The Data Moat: Cold Start vs. Longitudinal
The LFM's real advantage is that it learns your specific users over time. But how does it work on day one?
1. The Cold Start (Phenotyping)
When a brand new user installs your keyboard, the LFM has no history for them. Within their first few minutes of typing, the SDK collects their base TAG data and compares it against population-wide Affective Phenotypes. It clusters them into a known archetype (e.g., "Heavy Presser, Fast Cadence") to provide a highly accurate initial baseline and generalized forecast.
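One way to picture the phenotype lookup is a nearest-centroid match against population archetypes. The archetype names (beyond the "Heavy Presser, Fast Cadence" example above), the feature set, and the centroid values below are all invented for this sketch; the real phenotypes live inside the LFM:

```javascript
// Illustrative population-wide phenotype centroids. The feature values and
// the second and third archetype names are made up for this sketch.
const PHENOTYPES = {
  'Heavy Presser, Fast Cadence': { pressure: 0.9, cadence: 0.8 },
  'Light Presser, Slow Cadence': { pressure: 0.2, cadence: 0.3 },
  'Heavy Presser, Slow Cadence': { pressure: 0.8, cadence: 0.2 }
};

// Assign a brand-new user to the nearest archetype by Euclidean distance
// over their first few minutes of typing features.
function coldStartArchetype(sample) {
  let best = null;
  let bestDist = Infinity;
  for (const [name, c] of Object.entries(PHENOTYPES)) {
    const d = Math.hypot(
      sample.pressure - c.pressure,
      sample.cadence - c.cadence
    );
    if (d < bestDist) {
      bestDist = d;
      best = name;
    }
  }
  return best;
}
```

The matched archetype's baseline then stands in for the user's own history until enough longitudinal data accumulates.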
2. Longitudinal Observation (Per-User Embeddings)
As the user types across various apps over days and weeks, the LFM continually retrains. It generates a unique, compact vector (a Per-User Embedding) that represents their individual affective pattern. It learns what their specific stress or fatigue looks like, independent of the general population. This builds a massive Data Moat: the longer they use the keyboard, the more profoundly the model understands them.
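One simple way to picture how such an embedding drifts toward an individual is an exponential moving average of per-session feature vectors. This is an illustrative stand-in, not the LFM's actual training procedure:

```javascript
// Illustrative per-user embedding update: an exponential moving average of
// session feature vectors. This is a stand-in for intuition, not the LFM's
// actual training procedure.
function updateEmbedding(embedding, sessionFeatures, alpha = 0.1) {
  // The first session seeds the embedding directly.
  if (embedding === null) return sessionFeatures.slice();
  // Later sessions nudge each dimension by a fraction alpha.
  return embedding.map((v, i) => (1 - alpha) * v + alpha * sessionFeatures[i]);
}

// Over many sessions the vector settles onto the user's own pattern,
// away from the cold-start population baseline.
let embedding = null;
for (const session of [[0.8, 0.7], [0.82, 0.74], [0.78, 0.72]]) {
  embedding = updateEmbedding(embedding, session);
}
```

With a small alpha, accumulated history dominates, so one unusual session barely moves the profile; that slow convergence is what makes the longitudinal data hard for a newcomer to replicate.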
