Right now I know exactly where I am.

The café two doors south vents its espresso machine through a side grate. That hiss sits at roughly 4 kHz and bounces off the limestone façade across the street at a delay that means “seven meters wide.” It rained an hour ago—I can tell because wet asphalt speaks differently than dry, shifting the pitch of tire noise just enough. And I know the intersection ahead is dangerous because it goes acoustically dead: no audible pedestrian signal, no textured surface change, just a wash of undifferentiated traffic noise that swallows directionality whole.

I navigate a city that was never designed for me to hear.

This isn’t a complaint about missing accommodations, though accommodations are missing. This is about something bigger: cities are acoustic environments, sound is spatial infrastructure, and the near-total absence of intentional acoustic design in urban planning is a form of structural exclusion so normalized it doesn’t even register as a choice.

The Visual Monopoly

Here is a fact that should bother you more than it probably does: the entire modern discipline of urban design is organized around a single sense.

Zoning codes regulate building heights, setbacks, sight lines, façade materials, signage colors, lighting levels. Historic preservation boards argue about visual coherence. Traffic engineers model “level of service” using car throughput and signal visibility. Pedestrian infrastructure means sidewalks (visual-spatial), painted crosswalks (visual), walk signals (visual), wayfinding signage (visual), and—as a grudging afterthought in some jurisdictions—an audible pedestrian signal that beeps like a pager from 1997.

The profession calls this “the built environment.” Built for whom?

I’ve been a researcher in acoustic ecology for nine years. I’ve consulted on three municipal accessibility plans. And I can tell you that in every planning document I’ve ever reviewed, sound appears in exactly one context: noise abatement. Sound is a problem to be mitigated. It is never—not once, in any document I’ve seen—described as a resource to be designed.

That’s like a profession that treats light exclusively as glare.

How I Actually Move Through a City

Let me walk you through this, because the mechanics matter and they’re poorly understood—even by well-meaning accessibility advocates.

I use passive echolocation, active echolocation, ambient sound reading, and learned acoustic landmarks. These aren’t metaphors. They’re techniques.

Passive echolocation means listening to how ambient sound—traffic, voices, wind—reflects off surfaces around me. A building wall to my left compresses the soundscape on that side. An open parking lot to my right lets sound dissipate. A recessed doorway creates a brief acoustic shadow I can feel as a micro-absence as I walk past. I’m reading architecture with my ears the way you read it with your eyes, continuously, unconsciously, the way you process depth from binocular vision without thinking about the math.

Active echolocation means I sometimes produce sounds—tongue clicks, finger snaps, the tap of my cane—and listen to what comes back. The return tells me about surface distance, material density, aperture size. A glass storefront sounds different from a brick wall. An open alley sounds different from a closed one. This isn’t superhuman. This is physics. Sound reflects. I’ve spent years learning to read those reflections, the way a sighted architect learns to read a blueprint.
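For the skeptical: none of this requires exotic math, just the speed of sound. Here is a minimal sketch (the 41 ms delay is an illustrative number, not a measurement from my own clicks) of how a round-trip echo delay becomes a distance, the same arithmetic behind the seven-meter street width above:

```python
SPEED_OF_SOUND = 343.0  # m/s in dry air at about 20 degrees C

def surface_distance(echo_delay_s: float) -> float:
    """Distance to a reflecting surface: the sound travels there and back,
    so the one-way distance is half of (speed of sound * delay)."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

print(f"{surface_distance(0.041):.1f} m")  # a ~41 ms echo -> about 7 m
```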

Ambient sound reading means I use the soundscape as a map. The Doppler shift of passing cars tells me traffic direction and speed. The acoustic signature of a bus—diesel engine, air brakes, that particular pneumatic door hiss—tells me I’m near a transit stop. A fountain in a plaza gives me a fixed auditory landmark. The change from hard-surface reverberation to soft-surface absorption tells me I’ve moved from sidewalk to park.
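The Doppler reading is arithmetic too. A toy sketch, with made-up frequencies, of how the pitch drop as a car passes encodes its speed:

```python
SPEED_OF_SOUND = 343.0  # m/s

def speed_from_doppler(f_approach: float, f_recede: float) -> float:
    """Estimate a source's speed from the pitch heard before and after it passes.
    Approaching: f_a = f0 * c / (c - v); receding: f_r = f0 * c / (c + v).
    Taking the ratio eliminates the unknown source frequency f0."""
    return SPEED_OF_SOUND * (f_approach - f_recede) / (f_approach + f_recede)

# Illustrative: tire noise heard at 1042 Hz approaching, 961 Hz receding.
print(f"{speed_from_doppler(1042.0, 961.0):.1f} m/s")  # ~13.9 m/s, about 50 km/h
```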

Learned acoustic landmarks are exactly what they sound like: a mental database of specific sounds at specific locations. The espresso machine grate. The particular rattle of a subway ventilation shaft at 23rd Street. The way sound pools in the atrium of the public library entrance.

These are my street signs.
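If you wanted to write that mental database down, a hypothetical entry might look something like this (the fields and record structure are my invention, purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class AcousticLandmark:
    """One entry in the mental map: a sound anchored to a place."""
    label: str        # what the sound is
    location: str     # where it lives
    signature: str    # what to listen for
    reliability: str  # when it can actually be counted on

landmarks = [
    AcousticLandmark("espresso vent", "café, two doors south",
                     "~4 kHz hiss off the limestone façade", "business hours"),
    AcousticLandmark("subway ventilation shaft", "23rd Street",
                     "low, steady rattle", "always, until someone repairs it"),
]
```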

Now here’s the critical point: none of this was designed. Every acoustic cue I rely on is accidental—a byproduct of decisions made for other reasons. The espresso grate exists because the café needed ventilation. The subway rattle exists because the infrastructure is aging. The fountain exists because a landscape architect thought it would look pleasant.

I’m navigating a city using its waste sound.

What Happens When the Waste Sound Disappears

Because these cues are accidental, they’re also unstable. They vanish without warning, without consultation, without anyone understanding that they were infrastructure for someone.

[Image: Acoustic wayfinding in practice — sound as spatial infrastructure, intersection mapped by ear]

Three examples from my last two years:

The fountain was turned off. A public plaza I crossed daily had a small fountain near its northwest corner. For me, that fountain was a GPS pin—I oriented my entire route through the plaza relative to its sound. October 2024, the city turned it off during drought restrictions. No announcement to accessibility services. No alternative landmark. I arrived at the plaza and it was acoustically flat. I lost twenty minutes and eventually asked a stranger to walk me to the far side.

The road was repaved. A stretch of rough road I used as an auditory reference—its distinctive tire noise told me exactly which block I was on—was resurfaced with smooth asphalt. The tire noise dropped by roughly 8–10 decibels (a near-tenfold drop in sound power, roughly half as loud to the ear) and lost its texture. My landmark vanished overnight.

The building went up. A seven-story mixed-use development replaced a single-story commercial strip on a street I knew well. The acoustic geometry of the entire block changed. Sound that used to dissipate over low rooftops now reflected off a glass curtain wall. Reverberation times increased. The soundscape turned muddy. Directionality degraded. I had to relearn the block from scratch.
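You can see the mechanism in that last example with Sabine's room-acoustics formula, RT60 = 0.161·V/A: decay time grows as total absorption shrinks. A street canyon is not a room, so treat the absolute numbers in this sketch as fiction (every area and absorption coefficient below is an illustrative guess), but the ratio shows what happens when low rooftops that let sound escape are replaced by glass, which reflects almost everything:

```python
def rt60(volume_m3: float, absorption_sabins: float) -> float:
    """Sabine's formula: reverberation time in seconds."""
    return 0.161 * volume_m3 / absorption_sabins

# Toy street canyon, 100 m long x 20 m wide x 20 m tall.
# Treat open sky as a perfect absorber (alpha = 1.0: sound leaves and never
# returns) and a glass curtain wall as alpha ~= 0.03 (barely absorbs).
VOLUME = 100 * 20 * 20    # m^3
SKY = 2000 * 1.00         # open top, unchanged before and after
BASE = 2000 * 0.05 * 2    # ground plus the existing masonry wall
open_side = 2000 * 1.00   # before: low rooftops, sound mostly escapes sideways
glass_wall = 2000 * 0.03  # after: seven stories of glass

before = rt60(VOLUME, SKY + BASE + open_side)
after = rt60(VOLUME, SKY + BASE + glass_wall)
print(f"decay time roughly x{after / before:.1f}")  # ~x1.9: noticeably muddier
```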

In each case, professionals considered visual impact, structural integrity, traffic flow, environmental sustainability. In none of these cases did anyone consider acoustic impact on spatial navigation. Because acoustic navigation doesn’t exist as a category in their professional framework.

This isn’t negligence. It’s an epistemological gap. The knowledge that sound is navigational infrastructure hasn’t entered the disciplines that shape cities. And the people who hold that knowledge—blind and low-vision navigators, acoustic ecologists, sound designers—aren’t in the rooms where planning decisions get made.

The Beep That Passes for Access

Let me talk about the one piece of acoustic infrastructure that does exist, because it illustrates the problem perfectly.

Audible pedestrian signals—APS—are the beeping or chirping sounds at crosswalks that indicate when it’s safe to cross. They’re the single most common acoustic accessibility feature in urban environments. The 2023 U.S. Public Right-of-Way Accessibility Guidelines (PROWAG) require them at all new and reconstructed signalized intersections. Progress.

Except.

Most APS systems use a simple tone that says “walk” or “don’t walk.” Binary. Go or stop. They don’t tell you which direction to walk. They don’t tell you the width of the intersection. They don’t tell you about the median island halfway across. They don’t describe the geometry of a complex offset intersection. They don’t localize well in noisy environments—the beep bounces off buildings and vehicles and becomes directionless. At large intersections with multiple crossings, the signals from different corners blur into undifferentiated chirping that tells you almost nothing.
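The localization failure is not mysterious. One of the brain’s main direction cues is the tiny difference in a sound’s arrival time at the two ears, at most around 0.6–0.7 milliseconds. A toy model (simplified head geometry, illustrative numbers) shows why a high-pitched beep defeats that cue:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
EAR_SPACING = 0.22      # m, rough ear-to-ear distance

def itd_ms(angle_deg: float) -> float:
    """Interaural time difference for a source angle_deg off straight ahead
    (simplified: straight-line path, ignores diffraction around the head)."""
    return EAR_SPACING / SPEED_OF_SOUND * math.sin(math.radians(angle_deg)) * 1000

max_itd = itd_ms(90)       # ~0.64 ms, source directly to one side
beep_period = 1000 / 2000  # a 2 kHz beep repeats every 0.5 ms
print(f"max ITD {max_itd:.2f} ms vs tone period {beep_period:.2f} ms")
# The tone repeats faster than the largest possible interaural delay, so
# different source angles produce identical phase differences: the beep
# carries no unambiguous direction. Reflections off buildings make it worse.
```

Broadband sounds, like rushing water or the tap of a cane, don’t suffer this ambiguity, which is part of why the accidental landmarks I described earlier localize better than the engineered signal.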

Now consider what a sighted person gets at the same intersection: crosswalk markings, the median island, turn lanes, the pedestrian countdown timer, the geometry of the entire crossing, the behavior of other pedestrians, the trajectory of turning vehicles. A rich, multi-dimensional, real-time spatial model.

The blind pedestrian gets a beep.

This isn’t “equivalent access.” This isn’t even close. And the reason it persists is that the people who design these systems are solving for compliance—does this intersection have an audible signal, yes or no—rather than for navigability. They’re checking a box. They’re not asking: what does this intersection need to sound like for someone to cross it safely and independently?

That question would require a fundamentally different design process. It would require understanding the intersection as an acoustic environment. It would require consulting with the people who navigate it by ear.

It would require sound design.

What Sound-Designed Streets Actually Look Like

I’m tired of articles that diagnose a problem and then gesture vaguely toward “doing better.” Here’s what intentional acoustic wayfinding infrastructure could include:


Directional audible beacons. Not the omnidirectional beep of current APS systems, but focused sound sources aimed along the crossing path, so the signal itself tells you which way to walk, not just when.