Posture Pal
Multisensory nudges for everyday posture
A real-world demonstration, through a mobile app with over half a million users, of how personalized, data-driven, multisensory design can improve daily habits and quality of life.
Introduction
Across behavioral science, small, timely cues called “nudges” help translate intention into habit. Habits stick when feedback is immediate, frequent, and low-friction; when cues are multimodal (visual, tactile, auditory), they remain salient without demanding attention. Ear-worn devices are uniquely positioned for this kind of always-available, body-proximal feedback. Posture Pal explores this space: a lightweight, multisensory system that closes the loop between sensing and subtle correction to support long-term behavior change.
Problem
Commercial posture correctors often require dedicated hardware: devices that are expensive, taped or strapped to the body, and easy to abandon once the novelty wears off. Users report inconvenience (charging, attachment) and social intrusiveness, barriers that undermine consistent use and habit formation.
Solution
Posture Pal is an iOS app that transforms your headphones, a device many people already wear for hours, into a non-invasive posture coach. Using the AirPods' built-in motion sensors, the app estimates neck tilt relative to each user's calibrated neutral. When sustained forward head posture is detected, Posture Pal provides gentle, multisensory cues that prompt micro-corrections without interrupting flow.
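As a rough illustration of the sensing layer, the sketch below streams head attitude from AirPods via Core Motion's CMHeadphoneMotionManager and reports tilt relative to a calibrated neutral pitch. It is a minimal sketch, not the app's actual code; PostureSensor, neutralPitch, and the degree conversion are illustrative choices.

```swift
import CoreMotion

/// Illustrative sketch of the sensing layer: stream head attitude from AirPods
/// and report tilt relative to a user-calibrated neutral pitch.
/// `PostureSensor` and `neutralPitch` are hypothetical names, not the app's API.
final class PostureSensor {
    private let motionManager = CMHeadphoneMotionManager()
    private var neutralPitch: Double = 0        // radians, captured during calibration

    /// Store the current head pitch as the user's comfortable neutral posture.
    func calibrate(with motion: CMDeviceMotion) {
        neutralPitch = motion.attitude.pitch
    }

    /// Start streaming head motion; the handler receives the pitch delta from
    /// neutral in degrees (the sign convention depends on how the device reports pitch).
    func start(onTilt: @escaping (Double) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let motion else { return }
            let deltaRadians = motion.attitude.pitch - self.neutralPitch
            onTilt(deltaRadians * 180 / .pi)
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

Smoothing the raw pitch and handling sensor disconnects would be natural next steps, but are omitted here for brevity.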
How it works
- Personal calibration. Users define a comfortable neutral posture; the app adapts thresholds per person and context.
- Closed-loop sensing. Continuous head-pose estimation runs in the background with low battery impact, enabling all-day awareness.
- Multisensory feedback:
  - Visual: a live neck-tilt visualization with color-coded status indicators.
  - Vibration: discreet iPhone or Apple Watch haptics for quick, eyes-free nudges.
  - Sound: brief signals through AirPods that are audible yet unobtrusive.
- Plays well with audio. Alerts coexist with music and podcasts, so tracking doesn't intrude on daily routines (see the audio-session sketch below).
- Sensitivity controls. Adjustable thresholds and grace periods reduce false positives and respect user preferences (see the decision-loop sketch after this list).
- Accessibility. Built with Dynamic Type, VoiceOver descriptions, and Voice Control actions to make setup and use inclusive (a small SwiftUI fragment below illustrates the idea).
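To make the thresholding and grace-period behavior concrete, here is a simplified decision loop: only tilt that stays past an adjustable threshold for a full grace period triggers a nudge, and the nudge fans out to haptic and audible channels. This is a hypothetical sketch of the logic, not the app's internals; PostureMonitor, tiltThresholdDegrees, gracePeriod, and the chosen system sound are illustrative.

```swift
import UIKit
import AudioToolbox

/// Illustrative closed-loop monitor: debounce tilt samples with a grace period,
/// then deliver a haptic plus a short sound. Names and defaults are hypothetical.
final class PostureMonitor {
    var tiltThresholdDegrees: Double = 15       // user-adjustable sensitivity
    var gracePeriod: TimeInterval = 10          // seconds of sustained tilt before a nudge

    private var slouchStart: Date?
    private let haptics = UINotificationFeedbackGenerator()

    /// Feed each tilt sample (degrees past neutral) from the sensing layer.
    func process(tiltDegrees: Double) {
        guard tiltDegrees > tiltThresholdDegrees else {
            slouchStart = nil                   // posture recovered: reset the grace timer
            return
        }
        let start = slouchStart ?? Date()
        slouchStart = start
        if Date().timeIntervalSince(start) >= gracePeriod {
            nudge()
            slouchStart = nil                   // one nudge per sustained-slouch episode
        }
    }

    private func nudge() {
        haptics.notificationOccurred(.warning)  // discreet vibration on iPhone
        AudioServicesPlaySystemSound(1057)      // short built-in sound; ID chosen only for illustration
    }
}
```

Raising the threshold or lengthening the grace period trades responsiveness for fewer false positives, which is exactly the user-facing sensitivity control described above.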
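The "plays well with audio" behavior maps onto a standard AVAudioSession configuration. The minimal sketch below assumes cues are played as short app sounds (the function name is illustrative): it mixes them with other audio and briefly ducks it instead of pausing playback.

```swift
import AVFoundation

/// Minimal sketch: let short posture cues mix with, and briefly duck,
/// whatever the user is already listening to.
func configureCueAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default,
                            options: [.mixWithOthers, .duckOthers])
    try session.setActive(true)
}
```

Deactivating the session afterward with setActive(false, options: .notifyOthersOnDeactivation) lets ducked music return to full volume once the cue finishes.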
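For the accessibility point, a small SwiftUI fragment suggests how the pieces fit together; TiltReadout, tiltDegrees, and recalibrate are hypothetical names, and the real views are richer than this.

```swift
import SwiftUI

/// Illustrative fragment: a readout that scales with Dynamic Type,
/// describes itself to VoiceOver, and keeps a Voice Control-friendly button label.
struct TiltReadout: View {
    let tiltDegrees: Double
    let recalibrate: () -> Void

    var body: some View {
        VStack {
            Text("Neck tilt: \(Int(tiltDegrees))°")
                .font(.title)                                  // system text style scales with Dynamic Type
                .accessibilityLabel("Posture status")
                .accessibilityValue("\(Int(tiltDegrees)) degrees forward of neutral") // spoken by VoiceOver
            Button("Recalibrate", action: recalibrate)         // visible label doubles as its Voice Control name
        }
    }
}
```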
Context & Process
I built Posture Pal together with Jordi Bruin. The project began with Jordi's experience of wearing a hardware posture device and the observation that many "smart" posture tools fail not on sensing accuracy but on adherence. We reframed the design target from "perfect measurement" to sustainable engagement. The head-tilt inference was prototyped with Core Motion and iterated with users. The resulting experience foregrounds ambient awareness over constant correction.
Outcome
Posture Pal has passed 500,000 downloads and has been featured by Apple in multiple countries, with particularly strong interest in China. The project demonstrates that ear-based sensing combined with multisensory micro-cues can deliver posture support that is accessible, private, and convenient. Because it requires no extra hardware, the approach lowers the adoption barrier and enables consistent use, which is the real engine of behavior change.
Research Alignment & Broader Vision
- Fluid Interfaces: Posture Pal turns a ubiquitous wearable into an ambient interface that augments self-regulation through gentle, just-in-time cues.
- Multisensory Intelligence: It operationalizes multimodal feedback (visual, haptic, auditory) and explores how combinations of cues affect adherence, fatigue, and learning over time.
- Personal Robots: The system is a closed-loop assistive agent, sensing state, deciding when to act, and delivering human-aligned interventions.
Beyond posture, the same pattern of commodity sensors + personalized models + multisensory, low-burden feedback can scale to break reminders, breathing regulation, screen-time ergonomics, and rehabilitation micromovements. Posture Pal is a concrete step toward everyday multisensory computing that quietly improves health behaviors in the background of life.
Role: Research, iOS engineering (Swift/SwiftUI), sensing/algorithms, UX, and accessibility.