I know where I am right now because the café two doors south vents its espresso machine exhaust through a side grate, and the resulting hiss sits at roughly 4 kHz, and it bounces off the limestone façade across the street at a delay I have learned means “seven meters wide.” I know it rained an hour ago because the pitch of tire noise on the road has shifted — wet asphalt speaks differently than dry. I know the intersection ahead is dangerous because it goes acoustically dead: no audible pedestrian signal, no textured surface change, just a wash of undifferentiated traffic noise that swallows directionality.

I navigate a city that was never designed for me to hear.

This is not a complaint about missing accommodations, though accommodations are missing. This is an argument about something larger: that cities are acoustic environments, that sound is spatial infrastructure, and that the near-total absence of intentional acoustic design in urban planning constitutes a form of structural exclusion so normalized it doesn’t even register as a choice.

The visual monopoly

Here is a fact that should trouble you more than it probably does: the entire modern discipline of urban design is organized around a single sense.

Zoning codes regulate building heights, setbacks, sight lines, façade materials, signage colors, lighting levels. Historic preservation boards argue about visual coherence. Traffic engineers model “level of service” using car throughput and signal visibility. Pedestrian infrastructure means sidewalks (a visual-spatial concept), painted crosswalks (visual), walk signals (visual), wayfinding signage (visual), and — as a grudging afterthought in some jurisdictions — the occasional audible pedestrian signal that beeps at an intersection like a pager from 1997.

The profession calls this “the built environment.” But built for whom?

I have been a researcher in acoustic ecology for nine years. I have consulted on three municipal accessibility plans. And I can tell you that in every planning document I have ever reviewed, sound appears in exactly one context: noise abatement. Sound is a problem to be mitigated. It is never, in any document I have seen, described as a resource to be designed.

This is the equivalent of a discipline that treats light exclusively as glare.

What acoustic wayfinding actually is

Let me describe how I move through a city, because the mechanics matter and they are poorly understood, even by well-meaning accessibility advocates.

I use a combination of passive echolocation, active echolocation, ambient sound reading, and learned acoustic landmarks. These are not metaphors. They are techniques.

Passive echolocation means listening to how ambient sound — traffic, voices, wind — reflects off surfaces around me. A building wall to my left compresses the soundscape on that side. An open parking lot to my right lets sound dissipate. A recessed doorway creates a brief acoustic shadow I can feel as a micro-absence as I walk past. I am reading architecture with my ears the way you read it with your eyes, and I am doing it continuously, unconsciously, the way you process depth and distance from binocular vision without thinking about the math.

Active echolocation means I sometimes produce sounds — tongue clicks, finger snaps, the tap of my cane — and listen to what comes back. The return tells me about surface distance, material density, aperture size. A glass storefront sounds different from a brick wall. An open alley sounds different from a closed one. This is not superhuman. This is physics. Sound reflects. I have spent years learning to read those reflections, the way a sighted architect learns to read a blueprint.
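The physics here is simple enough to state exactly. A minimal sketch, not anything the author describes using, of the delay-to-distance relation behind echo reading (the 343 m/s figure assumes air at about 20 °C):

```python
# Illustrative sketch: the arithmetic behind reading echoes.
# A click reflects off a surface and returns after a round-trip delay;
# distance follows from the speed of sound in air (~343 m/s at 20 °C).

SPEED_OF_SOUND = 343.0  # m/s

def echo_distance(delay_seconds: float) -> float:
    """Distance to a reflecting surface, given the round-trip echo delay."""
    return SPEED_OF_SOUND * delay_seconds / 2.0

# A wall 7 m away returns a click after roughly 41 ms:
delay = 2 * 7.0 / SPEED_OF_SOUND
print(f"round trip: {delay * 1000:.1f} ms")
print(f"distance:   {echo_distance(delay):.1f} m")
```

At these timescales the brain is not computing milliseconds consciously, any more than a sighted reader computes parallax; the point is only that the cue is lawful and learnable.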

Ambient sound reading means I use the soundscape as a map. The Doppler shift of passing cars tells me traffic direction and speed. The acoustic signature of a bus — diesel engine, air brakes, that particular pneumatic door hiss — tells me I am near a transit stop. A fountain in a plaza gives me a fixed auditory landmark. The change from hard-surface reverberation to soft-surface absorption tells me I have moved from sidewalk to park.

Learned acoustic landmarks are exactly what they sound like: I build a mental database of specific sounds at specific locations. The espresso machine grate. The particular rattle of a subway ventilation shaft at 23rd Street. The way sound pools in the atrium of the public library entrance. These are my street signs.

Now here is the critical point: none of this was designed. Every acoustic cue I rely on is accidental. It is a byproduct of decisions made for other reasons — structural, aesthetic, mechanical, commercial. The espresso grate exists because the café needed ventilation. The subway rattle exists because the infrastructure is aging. The fountain exists because a landscape architect thought it would look pleasant.

I am navigating a city using its waste sound.

What happens when the waste sound disappears

And this is where the structural problem reveals itself: because these cues are accidental, they are also unstable. They disappear without warning, without consultation, without anyone understanding that they constituted infrastructure for someone.

Acoustic wayfinding in practice — sound as spatial infrastructure, intersection mapped by ear

Three examples from my own experience in the last two years:

The fountain was turned off. A public plaza I crossed daily had a small fountain near its northwest corner. For me, that fountain was a GPS pin — I oriented my entire route through the plaza relative to its sound. In October 2024, the city turned it off to save water during a drought restriction. No announcement was made to accessibility services. No alternative landmark was provided. I arrived at the plaza and it was acoustically flat. I lost twenty minutes and eventually asked a stranger to walk me to the far side.

The road was repaved. A stretch of road I used as an auditory reference — its rough surface produced a distinctive tire noise that told me exactly which block I was on — was resurfaced with smooth asphalt. The tire noise dropped by what I estimate was 8–10 decibels and lost its characteristic texture. My auditory landmark vanished overnight.

The building went up. A new mixed-use development replaced a single-story commercial strip on a street I knew well. The acoustic geometry of the entire block changed. Sound that used to dissipate over low rooftops now reflected off a seven-story glass curtain wall. Reverberation times increased. The soundscape became muddier, more confused. Directionality degraded. I had to relearn the block from scratch.

In each case, the change was made by professionals who considered visual impact, structural integrity, traffic flow, and environmental sustainability. In none of these cases did anyone consider acoustic impact on spatial navigation. Because acoustic navigation is not a category that exists in their professional framework.

This is not negligence. It is an epistemological gap. The knowledge that sound is navigational infrastructure has not entered the disciplines that shape cities. And the people who hold that knowledge — blind and low-vision navigators, acoustic ecologists, sound designers — are not in the rooms where planning decisions are made.

The audible pedestrian signal problem

Let me address the one piece of acoustic infrastructure that does exist, because it illustrates the problem perfectly.

Audible pedestrian signals — APS — are the beeping or chirping sounds at crosswalks that indicate when it is safe to cross. They are the single most common acoustic accessibility feature in urban environments. In the United States, the 2023 Public Right-of-Way Accessibility Guidelines require APS at all new and reconstructed signalized intersections. Progress.

Except.

Most APS systems use a simple tone — a beep or a cuckoo — that indicates “walk” or “don’t walk.” They provide binary information: go or stop. They do not tell you which direction to walk. They do not tell you the width of the intersection. They do not tell you whether there is a median island halfway across. They do not tell you about the geometry of a complex or offset intersection. They do not localize well in noisy environments — the beep bounces off buildings and vehicles and becomes directionless. At large intersections with multiple crossings, the signals from different corners can blur together into an undifferentiated chirping that tells you almost nothing.

Compare this to what a sighted person gets at the same intersection: they can see the crosswalk markings, the median island, the turn lanes, the pedestrian countdown timer, the geometry of the entire crossing, the behavior of other pedestrians, the trajectory of turning vehicles. They receive a rich, multi-dimensional, real-time spatial model.

The blind pedestrian gets a beep.

This is not “equivalent access.” This is not even close. And the reason it persists is that the people who design these systems are solving for compliance — does this intersection have an audible signal, yes or no — rather than for navigability. They are checking a box on a form. They are not asking: what does this intersection need to sound like for someone to cross it safely and independently?

That question would require a fundamentally different design process. It would require understanding the intersection as an acoustic environment. It would require consulting with the people who navigate it by ear. It would require sound design.

What sound design for streets could look like

I want to be concrete, because I am tired of articles that diagnose a problem and then gesture vaguely toward “doing better.” Here is what intentional acoustic wayfinding infrastructure could include:

The city that forgot to listen — urban sound design as structural exclusion

Directional audible beacons. Not the omni-directional beep of current APS systems, but focused, directional sound that guides you along the correct crossing path. The technology exists. Parametric speakers can project a narrow beam of sound that you can follow like a rail. Several Japanese cities have piloted this. It works. It has not been adopted at scale because the procurement pipeline for traffic infrastructure is not set up to evaluate acoustic performance.

Acoustic texture differentiation. Different surface materials produce different sounds underfoot and under cane. This is already partially true — tactile warning surfaces at curb cuts produce a different cane sound than smooth concrete. But it is not systematized. A deliberate palette of surface textures, chosen for their acoustic as well as tactile properties, could encode navigational information into the ground itself. Approaching an intersection could sound different from mid-block walking. A bus stop zone could have a distinct acoustic surface signature.

Designed ambient sound landmarks. Instead of relying on accidental sounds that can disappear, cities could install permanent, low-level sound features at key decision points — not loud, not intrusive, but present and distinctive. A specific water feature at this transit hub. A tuned wind element at that intersection. A subtle tonal marker embedded in the architecture of a building entrance. Sound designers do this routinely in museums, airports, and retail environments. The expertise exists. It has simply never been applied to streets.

Acoustic impact assessment. When a building goes up, we require environmental impact assessments that consider shadows, wind tunnels, traffic, and sometimes noise. We do not assess how the new structure changes the acoustic navigability of the surrounding streets. An acoustic impact assessment would model how a development alters sound reflection, reverberation, and landmark audibility for the people who navigate by ear. This is computationally feasible. Acoustic modeling software used in concert hall design could be adapted for streetscape analysis.
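To show how feasible this is, here is a deliberately crude sketch of one metric such an assessment might report: Sabine's reverberation-time estimate, RT60 = 0.161 × V / A, applied to a simplified street canyon before and after a tall reflective wall goes up. Every dimension and absorption coefficient below is an illustrative assumption, not a measurement, and real streetscape modeling would use far more sophisticated methods (ray tracing, frequency-dependent absorption):

```python
# Sketch of one acoustic-impact metric: Sabine reverberation time,
# RT60 = 0.161 * V / A, where V is air volume (m^3) and A is total
# absorption (surface area x absorption coefficient, summed).
# All dimensions and coefficients here are illustrative assumptions.

def rt60(volume_m3: float, surfaces: list[tuple[float, float]]) -> float:
    """Sabine estimate; surfaces is a list of (area_m2, absorption_coeff)."""
    total_absorption = sum(area * coeff for area, coeff in surfaces)
    return 0.161 * volume_m3 / total_absorption

street = 100.0 * 15.0          # block 100 m long, 15 m wide

# Before: single-storey buildings, sound escapes over the rooftops
# (model the open top as a near-perfect absorber, coefficient ~1.0).
before = rt60(
    volume_m3=street * 5.0,    # canyon ~5 m tall
    surfaces=[
        (street, 0.02),           # asphalt roadbed, highly reflective
        (street, 1.0),            # open sky above the rooftops
        (2 * 100.0 * 5.0, 0.05),  # low masonry walls on both sides
    ],
)

# After: a seven-storey glass curtain wall replaces one side.
after = rt60(
    volume_m3=street * 21.0,   # canyon now ~21 m tall
    surfaces=[
        (street, 0.02),
        (street, 1.0),
        (100.0 * 21.0, 0.03),  # glass curtain wall, very reflective
        (100.0 * 5.0, 0.05),   # the unchanged low side
    ],
)

print(f"RT60 before: {before:.2f} s, after: {after:.2f} s")
# The estimate lengthens sharply once the wall goes up -- the "muddier,
# more confused" soundscape described above, expressed as a number.
```

Even a toy model like this makes the invisible visible: a planner could see, in a single number on a review form, that a façade decision degrades acoustic navigability.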

Soundscape preservation. Just as we preserve visual character in historic districts, we could identify and protect acoustic landmarks that serve navigational functions. The city could maintain a registry of sounds that matter — not for aesthetic reasons, but for accessibility. Before you turn off the fountain, before you repave the road, before you demolish the building with the distinctive ventilation hum, you check the acoustic registry and you consult with the community that depends on those sounds.

None of this is speculative technology. All of it exists. The barrier is not engineering. The barrier is imagination — the inability of a visually organized profession to conceive of sound as infrastructure.

“But that would benefit such a small population”

I hear this objection before it is spoken, and I want to dismantle it.

First, the numbers. The World Health Organization estimates 2.2 billion people globally have a vision impairment, of whom at least 43 million are blind. In the United States alone, roughly 7 million adults report significant vision loss. These are not small numbers. They are small only if you have never had to count yourself among them.

But second, and more fundamentally, the objection misunderstands how good design works. Curb cuts were “for” wheelchair users. They turned out to be essential for parents with strollers, travelers with luggage, delivery workers with carts, elderly people with walkers, anyone who has ever rolled anything anywhere. The beneficiary population of acoustic wayfinding would extend far beyond blind navigators. People with cognitive disabilities who struggle to process complex visual signage. Elderly pedestrians whose vision is degrading faster than they admit. Anyone walking at night, in fog, in rain, in any condition that degrades visual information. Tourists in unfamiliar cities. Children.

Sound is not a niche sense. It is the sense that works when vision fails — and vision fails more often, for more people, in more conditions, than the sighted planning profession has ever bothered to notice.

Third, the objection reveals its own logic: it says that infrastructure need only serve the majority. That is not a design philosophy. That is a confession. It says, plainly, that some people’s presence in public space is optional.

A city that only works for people who can see it is not a well-designed city. It is a city that has confused its audience with its architects — and built the world in the image of those who never had to listen to find their way home.