AIHumanity
AI Perception SDK

See What Your AI Cannot Feel

Real-time biometric perception from a standard RGB camera. No wearables. No specialised hardware. Give your AI the ability to sense heartbeat, breath, calmness, and gaze — all from the human face.

Get SDK Access
View Docs
< 100 ms · end-to-end latency
30+ fps · real-time processing
5 signals · from one camera
Model-agnostic · works with any LLM

Five Perception Signals. One SDK.

Each signal can be consumed independently or fused into a unified emotional-state stream.

rPPG · Contactless

Heartbeat & Rhythm Detection

Measure heart rate and heart rate variability in real time from a standard camera feed — no wearables required. Our remote photoplethysmography (rPPG) model detects subtle skin-colour fluctuations caused by blood flow, delivering medical-grade signal quality at 30+ fps.

  • Heart rate accuracy within ±3 BPM at 1–2 metres
  • Works across skin tones and lighting conditions
  • HRV metrics for stress and engagement scoring
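Time-domain HRV metrics such as RMSSD can be computed from the inter-beat intervals behind the heart-rate signal. A minimal sketch — the `ibis` array of millisecond intervals is a stand-in for what the SDK would emit, not its actual API:

```javascript
// RMSSD: root mean square of successive differences between inter-beat
// intervals (ms) — a common time-domain HRV metric used in stress scoring.
function rmssd(ibis) {
  const diffs = [];
  for (let i = 1; i < ibis.length; i++) {
    diffs.push(ibis[i] - ibis[i - 1]);
  }
  const meanSq = diffs.reduce((sum, d) => sum + d * d, 0) / diffs.length;
  return Math.sqrt(meanSq);
}

// Perfectly steady 800 ms beats (75 BPM) → zero variability
console.log(rmssd([800, 800, 800, 800])); // 0
```

Lower RMSSD generally tracks higher sympathetic arousal, which is why it feeds naturally into stress and engagement scores.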
Respiratory Rate · Passive

Breath Detection

Track breathing rate and depth from micro-movements in the face and upper chest region. No contact sensors, no IR arrays — just vision. Ideal for wellness applications, meditation guidance, and detecting stress-induced breath-holding.

  • Respiratory rate estimation (breaths per minute)
  • Detects breath-hold and hyperventilation patterns
  • Integrates with calming feedback loops in real time
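Respiratory rate estimation reduces to counting cycles in the extracted chest-movement waveform. A simplified peak-counting sketch (the real pipeline would filter the signal first; `fs` is the sample rate in Hz):

```javascript
// Estimate breaths per minute by counting local maxima in a respiratory
// waveform sampled at `fs` Hz — each peak approximates one inhalation.
function breathsPerMinute(signal, fs) {
  let peaks = 0;
  for (let i = 1; i < signal.length - 1; i++) {
    if (signal[i] > signal[i - 1] && signal[i] > signal[i + 1]) peaks++;
  }
  const seconds = signal.length / fs;
  return (peaks / seconds) * 60;
}
```

A 0.25 Hz sine wave (one breath every 4 seconds) sampled at 10 Hz yields 15 breaths per minute, as expected.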
Emotional State · Composite Signal

Calmness Detection

Fuse heart rate, breathing rhythm, micro-expression analysis, and blink rate into a single real-time calmness score. Give your AI companions and game characters the ability to sense when a user is relaxed, anxious, or frustrated — and adapt accordingly.

  • 0–100 calmness index updated every 500 ms
  • Multi-signal fusion for robust accuracy
  • Trigger adaptive NPC dialogue, pacing, or music
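Driving adaptive behaviour from the calmness index is a matter of smoothing the 500 ms updates and thresholding. A hedged sketch — the mood labels and thresholds here are illustrative, not part of the SDK:

```javascript
// Exponential moving average to smooth the 500 ms calmness updates
// before acting on them, avoiding jittery NPC behaviour.
function smooth(prev, next, alpha = 0.3) {
  return prev + alpha * (next - prev);
}

// Map the 0–100 calmness index onto discrete NPC moods.
function npcMood(calmness) {
  if (calmness < 30) return "soothing"; // user looks anxious
  if (calmness < 70) return "neutral";
  return "playful";                     // user is relaxed
}

console.log(npcMood(85)); // "playful"
```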
Blink · Wink · Squint

Eye Gesture Recognition

Detect intentional eye gestures — blinks, winks, extended squints, and double-blinks — as discrete input events. Enable hands-free UI control, accessibility triggers, and expressive avatar responses driven entirely by natural eye movement.

  • Single blink, double blink, and wink classification
  • Squint intensity for analog expression mapping
  • Sub-100 ms latency for responsive interaction
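Turning raw blink timestamps into discrete gestures comes down to inter-blink timing. A minimal classifier sketch, assuming a 400 ms double-blink window (the actual SDK thresholds are not published here):

```javascript
// Classify a sorted list of blink timestamps (ms) into gestures:
// two blinks within `windowMs` of each other form a double-blink.
function classifyBlinks(timestampsMs, windowMs = 400) {
  const gestures = [];
  let i = 0;
  while (i < timestampsMs.length) {
    if (i + 1 < timestampsMs.length &&
        timestampsMs[i + 1] - timestampsMs[i] <= windowMs) {
      gestures.push("double-blink");
      i += 2;
    } else {
      gestures.push("single-blink");
      i += 1;
    }
  }
  return gestures;
}

console.log(classifyBlinks([0, 300, 2000])); // ["double-blink", "single-blink"]
```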
Screen Gaze · 3D Scene

Eye Look-at Location

Estimate where on screen — or in a 3D scene — the user is looking, using only a webcam. No expensive eye-tracking hardware. Map attention heatmaps, detect when a character has lost the player's focus, or drive gaze-aware NPC eye contact for dramatically more believable AI companions.

  • Screen-space gaze estimation at 60 fps
  • 3D look-direction vector for in-world targeting
  • Attention heatmaps for UX analytics
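An attention heatmap is just gaze samples binned into a grid. A sketch assuming the normalized `{x, y}` screen coordinates shown in the integration example below (grid size is arbitrary):

```javascript
// Accumulate normalized screen-space gaze samples ({x, y} in [0, 1])
// into a coarse grid of hit counts for an attention heatmap.
function buildHeatmap(samples, cols = 4, rows = 4) {
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const { x, y } of samples) {
    const col = Math.min(cols - 1, Math.floor(x * cols));
    const row = Math.min(rows - 1, Math.floor(y * rows));
    grid[row][col] += 1;
  }
  return grid;
}
```

Two near-centre samples land in the same cell, so dwell time on a region accumulates naturally as a count.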

Built for Every Emotional Interface

🎮

Games & VR

NPCs that read player stress and fatigue. Adaptive difficulty triggered by physiological state. Gaze-driven conversation systems.

🤖

AI Companions

Companions that sense your mood without you saying a word. Calm you when you're stressed. Engage more when you're relaxed.

🏥

Health & Wellness

Contactless biometric monitoring for meditation apps, telehealth check-ins, and mental wellness platforms.

🏫

EdTech

Detect student attention and stress levels. Adapt lesson pacing automatically. Identify when focus is lost.

Integration

Drop-in SDK. Works with Any Stack.

AI Perception plugs into your existing pipeline via a lightweight SDK. Stream perception events as JSON over WebSocket, REST, or native callbacks. Supports Unity, Unreal, Python, Node.js, and browser environments.

Single WebSocket stream for all perception signals
Configurable sample rate per signal type
On-device inference — no cloud round-trip required
GDPR-friendly: video never leaves the device
// Connect to perception stream
const sdk = AIPerception.connect({
  apiKey: "YOUR_API_KEY",
  signals: ["heartbeat", "calmness", "gaze"]
});

sdk.on("perception", event => {
  console.log(event.heartRate); // 72
  console.log(event.calmness);  // 0.84
  console.log(event.gaze);      // { x: 0.51, y: 0.48 }
});
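Because events are plain JSON, the WebSocket stream can also be consumed without the SDK wrapper. A sketch assuming an illustrative wire format with `signal`, `timestamp`, and `value` fields — not the published schema:

```javascript
// Parse one perception event from a hypothetical JSON wire format
// into typed fields; shape is illustrative only.
function parsePerceptionEvent(message) {
  const { signal, timestamp, value } = JSON.parse(message);
  return { signal, timestamp, value };
}

const event = parsePerceptionEvent(
  '{"signal":"heartbeat","timestamp":1700000000,"value":72}'
);
console.log(event.signal, event.value); // → heartbeat 72
```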

Ready to Give Your AI a Sense of Feeling?

Request early access to the AI Perception SDK and start integrating biometric signals today.

Request Access →
Talk to David