The first thing you feel is nothing. Then you realize that’s the problem.
I’m sitting in a product demo at a major tech company. They’re showing off their latest notification system. The room erupts in delight at a cascade of perfectly crafted chimes—a playful triple-tap for messages, a warm ascending tone for calendar reminders, a crisp ping for emails.
And I can’t experience any of it.
When they remember I’m in the room, someone turns on haptic mode. My phone buzzes. Once. The same dull, formless shudder for every single notification. My mother calling. A spam email. A severe weather alert. A breaking news notification.
All reduced to one dumb vibration.
This system was designed by someone who thinks communication is sound.
But I design interfaces with my body. And let me tell you what they’re missing.
The Day I Learned My Skin Could Listen
I’ve been deaf my whole life. The thing that surprised people wasn’t what I couldn’t do—it was what I could feel.
As a kid, I’d press my hand against our family dog’s ribcage and feel his growl build before anyone saw him bare his teeth. I’d read the subway through the soles of my shoes—the pitch of rail vibration told me which train was coming before it rounded the bend. I could feel someone open a door three rooms away from the shift in a building’s HVAC bass.
Vibration isn’t noise. It’s a language. One that hearing people never had to learn and therefore never bothered to design for.
Most interaction designers are trained to think in audio hierarchies. Distinct chimes, branded tones, emotional soundscapes. But those hierarchies collapse when your user can’t hear them. And what designers put in their place—a single crude buzz—is the design equivalent of replacing an entire dictionary with the word “something.”
I learned to read vibration the way a sommelier reads wine. And what I discovered changed everything about how I think devices should communicate.
The Haptic Ghetto
Here’s what drives me crazy. The hardware is already there.
Apple’s Taptic Engine is a precise linear actuator capable of producing textures, rhythms, intensities, and durations with the expressive range of a small percussion ensemble. Android devices ship with similarly capable motors. The engineering is done.
But nobody builds anything with it.
Your iPhone offers hundreds of distinct notification sounds. Carefully crafted. Obsessively refined. Emotionally resonant. Now count the distinct haptic patterns. You’ll barely get past one hand.
The asymmetry is staggering. Sound gets a lexicon. Vibration gets a grunt.
I call this the haptic ghetto: the sad little room off the main hall of interaction design where tactile feedback is permanently subordinated to audio. Given fewer resources. Less iteration. Less grammar. Not because of technical constraints, but because of an ableist hierarchy of the senses in which hearing is treated as cognitively superior, more nuanced, more worthy of investment.
That hierarchy is wrong. And it’s costing everyone—not just deaf users—a massive amount of information.
Christine Sun Kim, the deaf artist whose work relentlessly interrogates the social ownership of sound, has explored how hearing people treat sound as “their property”—something they control, define, and gatekeep. Haptic design in tech is a direct extension of that gatekeeping. The people who design these systems hear. They test them by listening. They evaluate information through audio fidelity.
Vibration is what they add at the end, if the accessibility checklist demands it, and they give it the design equivalent of a shrug.
What My Body Already Knows
When designers hear “vibration,” they picture that single crude buzz of a phone rattling on a table. That buzz is to haptic communication what a foghorn is to spoken language: one blunt signal in a universe of possible articulations.
I’ve been navigating that universe my entire life.
I feel the structural bass of a building’s HVAC system shift when someone enters a room. I read the subsonic tremor of truck engines through pavement to navigate city streets. I can tell you whether a washing machine is on spin cycle or rinse from across the house through the floor.
This isn’t a superpower. This isn’t “overcoming.” This is what happens when you actually live inside a sensory channel that the dominant culture considers secondary.
You develop granularity. You build a haptic vocabulary that hearing people never need and therefore never acquire. And then those same people design your devices and give you a single buzz for everything from “your partner texted” to “your house is on fire.”
Deaf and DeafBlind communities have always known that vibration carries complex information. Haben Girma has described how tactile communication between DeafBlind people involves pressure, speed, location on the body, and rhythm—a full grammar transmitted through touch. The Protactile movement, developed by DeafBlind scholars and community members like Jelica Nuccio and aj granda, as documented in the Protactile Language Emergence Project, has formalized an entire linguistic system based on touch and vibration. One that operates with the sophistication of any spoken or signed language.
These aren’t workarounds. These are mature communication systems that predate and outperform anything the tech industry has attempted with haptics.
And yet when Apple or Google designs a new haptic vocabulary, they don’t consult Protactile linguists. They don’t hire deaf interaction designers to lead the work. They hand the project to audio designers who’ve been asked to “also think about vibration,” as if that’s the same thing as knowing what vibration can do when it’s your primary channel.
What a Real Haptic Language Looks Like
A notification system built on haptic grammar would use at least five distinct parameters:
- Rhythm: the temporal pattern of pulses—a quick double-tap versus a slow rolling wave
- Intensity: how hard each pulse hits—a gentle brush versus an urgent press
- Duration: how long each pulse sustains—a sharp snap versus a lingering hold
- Texture: sharp versus smooth attack, a control the Taptic Engine already exposes as “sharpness”
- Location: which part of the device or wearable produces the vibration
Combine these and you get a design space with hundreds of distinguishable signals: even a modest grammar of five rhythms, four intensities, three durations, two textures, and two locations yields 240 combinations. More than enough to tell me whether my partner is calling, my calendar is reminding, the weather is threatening, or the news is breaking—all without looking at a screen or hearing a sound. One such signal is sketched below.
This isn’t speculative design fiction. This is engineering that could ship in a software update.
If anyone with decision-making power understood the sensory channel well enough to prioritize it.
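To be concrete, here is a minimal sketch of one such signal written against Apple’s existing Core Haptics API. The API calls are real and ship today; the specific pattern (a gentle double-tap with a lingering hold, standing in for “partner is calling”) is my illustration, not a proposed standard. On-body location would need wearable hardware this API doesn’t address.

```swift
import CoreHaptics

// One entry in a hypothetical haptic vocabulary:
// a gentle double-tap plus a lingering hold ("partner is calling").
func partnerCallingPattern() throws -> CHHapticPattern {
    // Rhythm comes from relativeTime; intensity from .hapticIntensity;
    // texture from .hapticSharpness (low feels rounded, high feels crisp).
    func tap(at time: TimeInterval) -> CHHapticEvent {
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
            ],
            relativeTime: time
        )
    }
    // Duration as a first-class parameter: a soft sustained tail after the taps.
    let hold = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.3),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0.35,
        duration: 0.4
    )
    return try CHHapticPattern(events: [tap(at: 0.0), tap(at: 0.15), hold], parameters: [])
}

// Playing it is three more lines:
// let engine = try CHHapticEngine()
// try engine.start()
// try engine.makePlayer(with: partnerCallingPattern()).start(atTime: CHHapticTimeImmediate)
```

Rhythm, intensity, texture, duration: four of the five axes are already sitting there as first-class parameters, unused.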
Instead, what we get are what Liz Jackson calls “disability dongles”—well-meaning but useless products designed by nondisabled people to solve problems disabled people don’t actually have. The current state of haptic design fits that pattern precisely. Dongles all the way down.
Vibrating alarm clocks “for deaf people” that offer a single brutal motor set to maximum because the designers assumed that if you can’t hear, what you need is more force. The entire framing reveals the poverty of imagination: vibration as loudness substitute rather than vibration as information architecture.
What I want is not a louder buzz. What I want is a haptic system that communicates as much as a ringtone does for hearing users—identity, urgency, category, source—all through the skin.
Designed by people who live in that channel. Not people visiting it out of compliance obligation.
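And to sketch what “vibration as information architecture” might mean one level up (every name here is hypothetical, not a platform API), the haptic rendering would be derived from what a notification means rather than bolted on per app:

```swift
import Foundation

// Hypothetical vocabulary: semantics in, haptic parameters out.
// None of this is a real platform API; it sketches the mapping.
enum Category { case person, calendar, weather, news }
enum Urgency  { case ambient, standard, critical }

struct HapticSignature {
    var pulseCount: Int        // rhythm encodes category identity
    var intensity: Float       // 0...1, scales with urgency
    var sharpness: Float       // 0...1, texture distinguishes sources
    var sustain: TimeInterval  // a lingering tail marks criticality
}

func signature(category: Category, urgency: Urgency) -> HapticSignature {
    // Category picks the rhythmic identity.
    let pulses: Int
    switch category {
    case .person:   pulses = 2  // double-tap: someone wants you
    case .calendar: pulses = 3  // rolling triple: time-based
    case .weather:  pulses = 1  // single long swell
    case .news:     pulses = 4  // staccato run
    }
    // Urgency scales force and adds a sustained tail.
    switch urgency {
    case .ambient:  return HapticSignature(pulseCount: pulses, intensity: 0.3, sharpness: 0.2, sustain: 0)
    case .standard: return HapticSignature(pulseCount: pulses, intensity: 0.6, sharpness: 0.5, sustain: 0)
    case .critical: return HapticSignature(pulseCount: pulses, intensity: 1.0, sharpness: 0.9, sustain: 0.6)
    }
}
```

The particular numbers don’t matter. What matters is that meaning drives the mapping, so “my mother calling” and “a spam email” can never again collapse into the same formless shudder.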
The Chair Where Deaf Designers Always Sit
Disability justice, as articulated by Mia Mingus and in the work of Sins Invalid, insists on the leadership of those most impacted. In haptic design, those most impacted are deaf and DeafBlind communities who have already done the foundational work of understanding touch as language.
The knowledge exists. The expertise exists.
What doesn’t exist is the institutional willingness to treat that expertise as design leadership rather than “user research.”
There’s a difference between being the person who defines the product and being the person who reacts to a prototype someone else already built.
I’ve sat in both chairs. The second chair is where deaf designers are almost always placed. Brought in late to validate decisions made without us. Nodding politely at haptic patterns that communicate nothing because they were designed by people who experience vibration as noise rather than signal.
Why This Matters Beyond Deafness
The hearing world has spent decades refining the semiotics of digital sound. The Intel chime. The Netflix ta-dum. The Mac startup chord. Sonic logos designed with obsessive attention to emotional resonance, memorability, and meaning.
No equivalent craft exists for vibration.
That absence isn’t an oversight. It’s a statement about whose perception counts. Every time a designer says “we added haptic feedback” and means a single formless buzz, they’re telling me that my sensory world doesn’t merit the same creative investment as theirs.
But here’s what the hearing world is starting to realize: they need this too. People silence their phones in meetings, on trains, in theaters. They work in open offices.