The fluorescent lights hit first.
Not “hit” metaphorically. They pulse 120 times per second, a strobe that my brain refuses to ignore. Most people walk into this grocery store and see light. I walk in and feel a mathematical assault: 120Hz of visual flicker colliding with the low constant hum of refrigerator compressors, creating interference patterns that rattle through my skull like a badly tuned orchestra warming up for a concert that never starts.
I’m standing in aisle seven, not shopping. I’m conducting a sensory audit.
Red signs for meat. Green for produce. Yellow for sale items. Each color fires different photoreceptors at different frequencies. My internal cognitive load meter — a gauge I’ve learned to read the way a pilot watches altitude — ticks upward. 60%. 75%. 85%. My hand hovers over a can of tomatoes. I can’t decide whether to pick it up. Not because I don’t want tomatoes. Because my brain has spent its entire decision-making budget on surviving the lights.
This isn’t a medical condition. It’s an engineering problem. And I’ve spent fifteen years learning to solve it.
The Workshop That Changed Nothing
Last year I sat in an “empathy exercise” at a major tech company. Good people, good intentions. The facilitator asked thirty designers to close their eyes and navigate their own app using only a keyboard. They stumbled. They laughed nervously. They said things like “Wow, I had no idea” and “This really opens your eyes.”
Then they went back to their desks and designed the same interfaces they’d always designed.
I know this because three months later, the company hired me to figure out why their “fully accessible” app was hemorrhaging users with cognitive disabilities. The app passed every automated audit. It met every WCAG guideline. Color contrast? Perfect. Screen reader support? Flawless. Alt text? On every image.
And it was practically unusable for anyone whose brain processes information the way mine does.
The problem wasn’t technical compliance. The problem was that the entire design philosophy rested on a fiction: that imagining someone else’s experience is the same as understanding it.
I call this the empathy fallacy. It’s the belief that a two-hour workshop can substitute for a lifetime of navigating a world built for different nervous systems. It’s like trying to compose a symphony by imagining what deafness might feel like — rather than learning from Beethoven, who actually lived it.
What I See When I See an Interface
My autism gives me something I’ve come to think of as mathematical perception. It’s not a metaphor. I literally experience interfaces as systems of equations.
Where a neurotypical designer sees a “clean layout,” I see cognitive load coefficients — the precise mathematical cost of each decision point a user has to process. Where someone says the navigation feels “intuitive,” I see pattern recognition thresholds — the exact moment when information density tips into noise.
A cluttered dashboard isn’t just “busy.” It creates sensory interference patterns I can almost measure in hertz. A poorly designed menu isn’t “confusing.” It has high cognitive entropy — too many equally weighted options competing for a finite processing budget.
This sounds abstract until you see what it does in practice.
Take the hospital waiting room I was brought in to assess. Patients described the space as “stressful” but couldn’t say why. Staff felt it too. Families sat rigid in chairs, grinding their teeth, staring at phones to escape — something in the air felt wrong.
I walked in and the math was screaming.
Fluorescent lights flickering at one frequency. Digital displays refreshing at another. The HVAC system vibrating at yet another. Three competing frequencies, none harmonically related, creating a sensory dissonance that stressed every nervous system in the room. Most brains filtered it below conscious awareness. The stress still landed in their bodies.
My solution wasn’t empathy. It was arithmetic. We switched to matte displays running at higher refresh rates. Adjusted lighting frequencies. Dampened mechanical vibrations until the competing rhythms resolved into something the body could tolerate.
Complaints about the “stressful” waiting room dropped significantly.
[Image: Mathematical harmony in sensory design — frequency patterns and cognitive load equations that reduce environmental stress]
Nobody called it an accessibility project. They called it “fixing the vibe.” But the fix came from disability expertise — from a brain that can’t ignore what most brains politely overlook.
The Cathedral Principle
I think about medieval architects more than any software designer probably should.
The builders of Chartres Cathedral didn’t rely on subjective taste. They followed mathematical principles — the golden ratio, harmonic proportions, geometric relationships that had been tested across centuries. When you stand inside Chartres, you don’t just see beauty. You feel a mathematical rightness that registers somewhere below language, in the body itself.
The accessibility was baked into the blueprint, written in numbers.
Modern design took a different path. Individual expression. Subjective aesthetics. “Clean” and “intuitive” and “minimal” — words that sound precise but measure nothing. The result is interfaces that win design awards and give people headaches. Apps that look beautiful on Dribbble and break down when an actual human brain — any human brain, not just mine — tries to use them under stress, fatigue, or cognitive load.
We lost the math. And when you lose the math, you lose the people who need it most.
Three Patterns the Math Reveals
Here’s what mathematical perception shows me that empathy exercises miss:
Cognitive load isn’t linear. It’s exponential. Most designers assume that adding features adds complexity proportionally. Five features, five units of complexity. But for brains like mine — and research on cognitive load suggests this extends well beyond autism — the curve is exponential. Each new decision point doesn’t just add to the pile. It multiplies the cost of every decision around it. The fix is what I call logarithmic complexity scaling: each new feature should cost less attention than the one before it, and the best features reduce overall demand. Like adding the right wrench to a toolbox. The toolbox gets more capable. The job gets easier.
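To make the curves concrete, here is a toy model in Python. The interference factor, the unit costs, and the logarithmic target are illustrative assumptions of mine, not measured values; the point is only the shape of each curve.

```python
import math

def linear_load(decision_points, unit_cost=1.0):
    """Naive model: each feature adds a flat cost."""
    return decision_points * unit_cost

def compounding_load(decision_points, unit_cost=1.0, interference=1.3):
    """Each new decision point multiplies the cost of the ones around it:
    a geometric series, so total load grows exponentially."""
    return unit_cost * (interference ** decision_points - 1) / (interference - 1)

def logarithmic_budget(decision_points, base_cost=1.0):
    """Target curve: each added feature costs less attention than the last."""
    return base_cost * math.log2(1 + decision_points)

# Illustrative comparison at 5, 10, and 20 decision points.
for n in (5, 10, 20):
    print(n, linear_load(n),
          round(compounding_load(n), 1),
          round(logarithmic_budget(n), 2))
```

At twenty decision points the compounding model is already hundreds of times the linear estimate, which is the gap the essay is pointing at: intuition budgets linearly, brains pay exponentially.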
Hidden complexity creates anxiety, not simplicity. Designers love progressive disclosure — tucking options away until someone needs them. It looks elegant. Minimal. Clean. But for brains that rely on pattern recognition to feel safe in a system, hidden elements are hidden threats. In my experience, autistic users need substantially more system visibility before they feel comfortable navigating. My approach is transparent complexity: show everything, but organize it mathematically, like a well-structured proof where every element has a visible place and a clear purpose. It looks busier. It works better.
Sensory harmony is measurable. We audit color contrast down to decimal points. We specify font sizes in pixels. But we completely ignore sensory frequencies — the actual mathematical properties of light, sound, and vibration that different nervous systems process differently. That hospital waiting room proved these frequencies aren’t subjective preferences. They’re engineering variables. And harmonizing them doesn’t just help people like me. It helps everyone.
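A sketch of what such a frequency audit could look like, assuming one simple operationalization: treat two frequencies as harmonically related when their ratio is close to a ratio of small integers, and flag every pair that is not. The tolerance, the small-integer cutoff, and the sample frequencies are placeholders, not the actual site measurements.

```python
from fractions import Fraction
from itertools import combinations

def harmonically_related(f1, f2, max_denominator=4, tolerance=0.01):
    """Treat two frequencies as harmonic if their ratio is (nearly) a
    ratio of small integers, e.g. 2:1, 3:2, 4:3."""
    ratio = max(f1, f2) / min(f1, f2)
    approx = Fraction(ratio).limit_denominator(max_denominator)
    return abs(ratio - float(approx)) < tolerance

def audit(frequencies):
    """Return every pair of environmental frequencies that clashes,
    i.e. shares no simple harmonic relationship."""
    return [(a, b) for a, b in combinations(frequencies, 2)
            if not harmonically_related(a, b)]

# Illustrative numbers only: a fluorescent flicker, a display refresh
# rate, and an HVAC vibration that share no simple ratios.
print(audit([120.0, 75.0, 47.0]))   # every pair clashes
print(audit([120.0, 60.0, 240.0]))  # all harmonically related: no clashes
```

The design choice mirrors the waiting-room fix: you don’t need every frequency to match, only for the set to resolve into simple ratios instead of competing ones.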
[Image: Neurodivergent design in practice — sensory frequency calibration reducing cognitive load in a clinical waiting environment]
The Gap Between Passing and Working
There’s a distance — I think of it as the accessibility gap — between what passes an audit and what actually works for a human being.
The gap exists because we can measure the easy things and we can’t measure the hard things. Color contrast is a math problem, so we solved it. Cognitive load is treated as subjective experience, so we wave our hands and say “intuitive” and hope for the best.
The gap exists because designers can imagine physical disabilities more easily than cognitive ones. Blindness? Add alt text. Mobility limitation? Support keyboard navigation. But how do you imagine experiencing fluorescent light as a strobe attack? How do you empathize with a brain that processes every decision point as a mathematical equation with diminishing energy reserves?
And the gap exists because of a stubborn hierarchy: a designer with ten years of portfolio experience gets more credibility than an autistic user with a lifetime of navigating hostile interfaces. Disability expertise is filed under “special needs” rather than recognized as specialized engineering knowledge.
Beethoven’s contemporaries eventually understood this. His deafness didn’t diminish his music. It transformed his relationship to composition itself. He developed mathematical approaches to harmony and structure that hearing composers never needed — and those approaches produced some of the most universally moving music ever written. He wasn’t composing despite deafness. He was composing through it.
Design needs the same recognition. Disability expertise isn’t a perspective to be politely consulted. It’s a measurement instrument that detects what other instruments miss.
What Changes When You Take the Math Seriously
I’m now working with three companies to build what I call cognitive load auditing into their design process — not as an accessibility add-on, but as a core engineering practice. We use adapted versions of NASA’s Task Load Index to set maximum cognitive thresholds for user journeys. We treat every interface element the way a mathematical proof treats every step: justified or removed.
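For flavor, here is what a minimal version of that gate could look like. The six TLX scales and the weights-summing-to-15 convention come from the standard instrument; the scores, the weights themselves, and the 55-point journey budget are made-up placeholders, not published norms.

```python
# The six scales of NASA's Task Load Index.
TLX_SCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def weighted_tlx(ratings, weights):
    """Classic weighted TLX: ratings 0-100 per scale, weights summing
    to 15 (from the instrument's pairwise-comparison step)."""
    assert set(ratings) == set(TLX_SCALES) == set(weights)
    return sum(ratings[s] * weights[s] for s in TLX_SCALES) / sum(weights.values())

def audit_journey(step_scores, weights, threshold=55.0):
    """Flag every step in a user journey whose workload exceeds the budget."""
    return {step: round(weighted_tlx(r, weights), 1)
            for step, r in step_scores.items()
            if weighted_tlx(r, weights) > threshold}

# Placeholder weights and scores for a hypothetical checkout journey.
weights = {"mental": 5, "physical": 1, "temporal": 2,
           "performance": 3, "effort": 3, "frustration": 1}
checkout = {
    "choose_plan":   dict(zip(TLX_SCALES, (70, 10, 40, 50, 60, 65))),
    "enter_payment": dict(zip(TLX_SCALES, (40, 10, 30, 30, 35, 20))),
}
print(audit_journey(checkout, weights))  # flags only the over-budget step
```

The point of the gate is the same as a proof checker’s: a step either stays under its justified budget or it gets redesigned.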
We’re building sensory frequency maps for digital environments. Visual refresh rates, auditory frequencies, animation timing, notification rhythms. We audit for conflicting patterns the same way an acoustic engineer audits a concert hall. Not because someone with autism might use the product. Because sensory dissonance stresses every nervous system. Mine just reports the data more loudly.
And we’re creating roles — not consulting gigs, but permanent design positions — for people whose disability experience gives them engineering insights that no empathy workshop can replicate.
Not accommodation. Instrumentation.
The grocery store still flickers at 120Hz. The compressors still drone their low constant hum. But I don’t just suffer the chaos anymore. I see the equations behind it. And I’m learning to rewrite them.