February 9, 2026

The Strip-Mining of Trust

Adam Butler

I.

The handshake used to close deals at the Chicago Board of Trade. Not metaphorically—literally. Traders in colored jackets screaming across the pit, and when two hands clasped, that was the contract. No timestamp, no algorithm, no audit trail. Just two men who knew that if they reneged, they would never work the floor again. The infrastructure of global commodity pricing ran on something that couldn’t be measured, couldn’t be optimized, couldn’t be put on a balance sheet: the accumulated trust of men who had learned, over decades, that their word was worth more than their margin.

What made the system work was what made it invisible to the logic that would eventually replace it. The handshake was bound—not by contract but by community, not by surveillance but by memory. The traders saw each other every day. They would see each other tomorrow. Reputation accumulated over years; betrayal echoed for careers. The binding was social, and therefore cheap. No lawyers drafted the agreement. No compliance officers monitored the execution. No auditors verified the settlement. Trust was the lubricant that made the gears turn without friction, and because it was lubricant, it generated no profit for anyone selling friction.

The pits are gone now. The building still stands on LaSalle Street, but the trading floors have been converted to event spaces where corporations host product launches. The handshakes have been replaced by matching engines executing trades in microseconds—faster than human betrayal, faster than human loyalty. More efficient. More legible. More profitable, at least in the quarterly reports.

The algorithms inherited a reservoir of institutional trust built over a century, the sediment of ten thousand handshakes honored. They cannot replenish a drop of it. The new systems extract from the commons they did not build, drawing down the principal while calling the withdrawals “innovation.” And in the space where trust once operated for free, a new industry has emerged: lawyers to draft the contracts the handshake didn’t need, compliance officers to monitor the behavior that reputation once constrained, auditors to verify the settlements that memory once guaranteed. The destruction of trust is not merely a cost. It is a profit center for everyone selling the machinery of distrust.

At almost exactly the same time the last traders were clearing out their lockers on LaSalle Street, a different kind of infrastructure was failing six hours east. Beneath Dort Highway in Flint, Michigan, a seventy-year-old water main—iron, municipal, invisible—was beginning to corrode. The city’s emergency managers had switched the water source to save money, fifty dollars per connection, and the new water’s chemistry was eating through the protective scale that lined the pipes. The lead that had been safely encased for decades began leaching into the water supply. Children drank it. Families bathed in it. The corrosion crept through the system for eighteen months before anyone with the authority to act would admit what was happening.

Here is the paradox: the water system that poisoned Flint’s children could only do so because it had inherited a reservoir of institutional trust built over generations. Trust that the water was safe, that the officials were competent, that the systems worked. When residents turned on their taps, they did not perform chemical analyses. When parents filled their children’s glasses, they did not demand assurances. They simply drank, as their parents had drunk, as their grandparents had drunk, because decades of reliable service had taught them that the water could be trusted. This trust was not naivety; it was the rational response to a system that had earned it. And it was precisely this earned trust that made the extraction possible.

The emergency managers who switched the water source were not operating in a vacuum of public scrutiny. They were operating in a cushion of public confidence—confidence they had inherited but not built, confidence they spent without acknowledging the withdrawal. Had Flint’s residents been vigilant, suspicious, equipped with home testing kits and skeptical of official assurances, the corrosion might have been caught in weeks rather than eighteen months. But such vigilance would have been its own kind of tax, its own erosion of the social fabric. A community that must verify every interaction with its own government is a community already impoverished, already paying the costs of lost trust in currencies of time and anxiety and social friction. The emergency managers drew on reserves of confidence to avoid exactly this kind of scrutiny—and in doing so, they dissolved the very confidence they required.

This is the essential structure: decisions that are only viable because trust exists, but which deplete the trust they depend upon. Markets, left unbound, will extract from any commons they encounter. Trust is the commons that makes all other commons possible. It cannot be measured, cannot be priced, does not appear on any balance sheet. Therefore it will be treated as an externality, a cost to be minimized, an inefficiency to be eliminated. The extraction will continue until the commons is empty, because the commons was never on the ledger to begin with.

II.

To understand why this extraction is not accident but structure, we must first understand what trust actually does—the work it performs invisibly, the load it bears that nothing else can carry.

Consider the mundane miracle of buying coffee from a stranger. You hand over money—itself a token of trust in institutions you will never see—and receive in return a cup of liquid you did not witness being prepared, from beans whose origin you cannot verify, in a transaction completed in seconds with someone whose name you do not know. This exchange, repeated billions of times daily across the planet, would be impossible without the substrate of trust that allows strangers to cooperate at scale. Every handoff that doesn’t require a lawyer, every purchase that doesn’t demand inspection, every loan extended without collateral—these are trust’s invisible dividends, transaction costs that never appear on any ledger because trust has already paid them.

Trust is lubricant. It is the grease that lets the gears turn without grinding, the oil that prevents the machinery of cooperation from seizing. And because it is lubricant, it is invisible when it works. No one notices the oil until the engine burns out.

What does the market see? Trust does work that markets might otherwise intermediate. Friction creates demand for products. When trust disappears, someone must sell its replacement. The lawyer who drafts the contract the handshake didn’t need. The compliance software that monitors the worker who was once trusted to do her job. The credentialing body that certifies the competence that reputation once established. The background check that investigates the neighbor who was once known. The audit that verifies the books that were once taken on faith. Each of these is a profitable service, and each becomes more profitable as trust erodes. There is now an entire industry—call it the friction industry—whose revenues depend on the continued destruction of the commons it has learned to replace.

Trust does not appear in GDP. Its absence does. When neighbors watch each other’s houses, no transaction occurs; when they install surveillance systems, GDP rises. When a handshake closes a deal, no lawyer bills; when contracts replace handshakes, the legal industry grows. The successful destruction of trust registers as economic growth—more services sold, more professionals employed, more activity to measure. A high-trust society looks, to the econometrician, poorer than a low-trust society frantically purchasing substitutes for what it has lost. We have built our measure of prosperity around what can be counted, and trust cannot be counted—only its expensive replacements can. This is how a civilization can impoverish itself while every metric shows it getting richer.

The economists call trust “social capital,” but the name obscures the crucial asymmetry. Trust is not like money. It is more like topsoil—built up over generations through countless small depositions, easily washed away in a single season of careless extraction, and once gone, recoverable only on timescales that mock human planning. The farmer who strip-mines his soil for a decade of bumper harvests will leave his children a desert. The society that strip-mines its trust for a generation of efficiency gains will leave its children something worse: institutions that still stand in their accustomed places but have been hollowed out from within, and a friction industry selling services to navigate the rubble.

III.

Gloria Reyes has worked the cardiac step-down unit at Regional Medical for twenty-three years. At 6:47 AM, she stands in room 412, watching the charting software’s countdown timer pulse orange in her peripheral vision: fourteen minutes remaining to complete Mr. Okonkwo’s morning assessment, vitals documentation, and medication administration before the algorithm flags her as non-compliant.

Mr. Okonkwo is describing chest tightness that started around 3 AM. The sensation is “different,” he says—not quite pain, more like someone resting a sandbag on his sternum. Gloria has learned to read the particular timbre of worry in a patient’s voice, the way fear tightens the soft palate and thins the consonants. Twenty-three years of reading these signals tells her something is wrong that the telemetry isn’t catching.

The timer drops to eleven minutes.

What Gloria knows—what the productivity dashboard cannot know—is that Mr. Okonkwo spent his first two days refusing to report symptoms. His son had told him the family couldn’t afford another week of his absence from the restaurant. It took Gloria three shifts of unhurried conversation, sitting at his bedside during moments the algorithm would classify as “idle time,” to build enough trust that he would tell her when something felt wrong. That relational investment—invisible to the metric, illegible to the extraction apparatus—is now paying its dividend.

She pages cardiology. The timer expires.

The cath lab will find a 94% occlusion in his left anterior descending artery—the “widow-maker.” Mr. Okonkwo will go home to his family in eight days.

The ledger for this shift will show one non-compliant assessment, one efficiency violation, one nurse 12% slower than the unit benchmark. Here is what the ledger cannot show: the three shifts of trust-building that made this morning’s disclosure possible, the twenty-three years of pattern recognition that heard “different” and understood danger, the social capital that transforms a terrified man’s silence into lifesaving speech.

The productivity software that tracks Gloria’s movements was sold to the hospital by a consulting firm that will return next quarter to assess the implementation and recommend enhancements. The algorithm that flags her for non-compliance was designed by engineers who have never held a dying man’s hand. The compliance report that will note her “opportunity for improvement” will be written by analysts whose jobs exist because the hospital no longer trusts its nurses to manage their own time. Each of these services is billable. Each is more billable the less the hospital trusts. The friction industry does not need Gloria to fail. It needs the hospital to believe she might.

At 7:23 AM, Gloria silences the tablet and sits back down beside Gerald Murray in room 408—a seventy-three-year-old retired electrician who came in for arrhythmia but whose real problem is that his wife of forty-nine years died eight months ago and he has stopped caring whether his heart keeps beating. She lets him tell her about his wife’s garden—the roses she planted in 1987, the year their son was born. Somewhere, a dashboard records this as waste. Gloria knows it is the only real work she will do today.

IV.

The algorithm that tracks Gloria’s movements cannot perceive kindness—only deviation. It cannot value judgment—only compliance. This is not a failure of implementation but a structural limitation: to manage at scale what it cannot see, the system must convert discretionary judgment into measurable proxies. And here Goodhart’s Law begins its corrosive work. Once the proxy becomes the target, it ceases to measure what it was designed to capture. What gets measured gets managed; what gets managed gets gamed; what gets gamed gets hollowed out.

But the extraction does not stop at institutional walls. It propagates through competition itself. Consider a regional delivery company—call it Swift Logistics—that for decades operated on trust: drivers knew their routes, dispatchers knew their drivers, and the accumulated knowledge of which customers needed patience lived in relationships between people who saw each other’s faces. When the optimization software arrived—sold by a vendor whose growth depended on replacing exactly these relationships—the company gained efficiency. When its competitors adopted the same systems, Swift Logistics faced a choice: join the race or be priced out of existence.

The regional carrier that refused—that kept the veteran dispatchers who knew which drivers were struggling at home—found itself operating at a 15% cost disadvantage. Within three years it was acquired. The dispatchers who were load-bearing walls of institutional knowledge were offered severance packages. The relationships that made the enterprise cohere were converted to licensing fees for the software that replaced them.

This is the race to the bottom that market fundamentalism calls “competition.” Each firm that defects from trust-preserving practices gains a temporary advantage. Each defection degrades the commons. The tragedy plays out not in a shared pasture but in a shared reservoir of social capital that no one owns and everyone draws down—until nothing remains to draw. And in the aftermath, the friction industry collects its rents: compliance consultants to manage the workforce that is no longer trusted, HR software to monitor the employees who are no longer known, legal teams to paper the relationships that used to operate on a handshake.

Private equity has refined this extraction into a playbook called brand arbitrage. Acquire an institution that has accumulated reputational capital over generations—a regional hospital, a beloved newspaper, a century-old consumer brand. Extract value by cutting the quality that built the reputation while riding the residual trust. The brand will carry the degraded product for years before anyone notices. By the time trust has fully eroded, the fund has exited, returns secured. The shell remains. The commons has been enclosed.

Mercy Health Partners served its community for eighty years before acquisition by a chain backed by private equity. Within eighteen months: nursing ratios increased from four patients per nurse to seven; the chaplaincy program was eliminated as “non-revenue generating”; the wound care specialists who knew which patients had transportation barriers were replaced by a centralized algorithm. Patient satisfaction scores held steady for two years, coasting on reputation, then collapsed. By then, the fund had refinanced its acquisition debt onto the hospital’s balance sheet and extracted a special dividend. The community was left with an institution that looked the same from outside but had been hollowed from within.

V.

Margaret Delacroix has administered elections in Beaumont County for thirty-one years. She knows which polling locations need extra parking, which precincts run out of “I Voted” stickers by 2 PM, which poll workers need reminding to take lunch. The infrastructure of democratic legitimacy runs through her office, through relationships with precinct captains built over decades, through institutional memory that tells her how many provisional ballots to expect when it rains.

The 2020 certification went smoothly, as it always had. But this time, the county commissioner who had signed off without comment for years demanded a private meeting. He’d seen videos online. He wanted Margaret to explain why she was part of a conspiracy.

Here is what the accusers understand, perhaps better than they know: their charges only land because of the trust Margaret built. If everyone already assumed elections were stolen, the accusation would have no purchase. But because generations of officials had made the system trustworthy, the charge of fraud carries weight. The accusers are strip-mining the civic commons, converting decades of institutional credibility into the currency of political grievance—exploiting inherited confidence to destroy the conditions that made such confidence reasonable.

Margaret retired early. Her replacement lasted eight months before the threats became unbearable. The position sat vacant for six weeks. When someone finally accepted, it was a recent graduate with no institutional memory, no relationships with precinct captains, no sense of which polling locations flood when it rains. The office still exists. The commons it drew upon has been enclosed.

VI.

Dr. Susan Chen spent twenty-two years as public health director in a county where trust had to be earned face to face. She knew which pastors could help spread vaccination messages, which radio stations reached the farming communities, which phrases landed wrong with residents who’d learned to distrust outsiders. When the pandemic arrived, she had something her urban counterparts lacked: a reservoir of relational credibility built through decades of showing up.

But she was communicating into a landscape that had been strip-mined while she wasn’t watching. The local newspaper that once carried her advisories had been acquired, gutted, and reduced to wire copy. The radio station had replaced local programming with syndicated content. The information ecosystem her trust-building had relied upon no longer existed. In its place: social media feeds optimized for engagement, which meant optimized for outrage, which meant optimized for the precise opposite of careful public health messaging.

This is Goodhart’s Law applied to the epistemic commons. Once engagement becomes the metric, engagement becomes the game. Platforms discovered that doubt is more shareable than certainty, that controversy sustains attention, that the algorithms cannot perceive truth—only traffic. The proxy crowded out the substance, just as delivery times had crowded out service, just as throughput metrics had crowded out care. And the platforms profited from every click, every share, every moment of attention paid to content that corroded the epistemic commons they had neither built nor would be asked to repair.

Susan watched patients she’d known for years—people whose children she’d vaccinated, whose parents she’d counseled through chronic illness—refuse guidance she’d spent decades earning the standing to give. The relational trust she’d built still held. But it had been swamped by an information environment that treated her expertise as just another signal in a cacophony of signals, her credentials as evidence of elite capture rather than competence. The information infrastructure that had made her work possible—built by journalists who stayed late, editors who demanded verification, publishers who understood credibility as a commons to maintain—had been enclosed. The masthead remained. The commons had been converted to shareholder value and shipped elsewhere.

VII.

What emerges from these examples is not separate pathologies but a coherent system, a machine whose parts mesh precisely. Algorithmic management, proxy metrics, and financialization do not merely coexist; they amplify each other through feedback loops that accelerate trust extraction while obscuring its occurrence. The same logic that hollows hospitals hollows newsrooms; the same logic that strips institutional knowledge from delivery companies strips it from election offices. The machine is indifferent to the domain. It sees only the commons, and it knows only how to extract.

Consider the causal architecture. Financialization establishes the imperative: quarterly returns must rise, and the time horizon for acceptable ROI compresses from decades to months. This temporal compression cascades downward, creating demand for legible metrics that can demonstrate value creation within the reporting cycle. Enter proxy metrics—the measurable stand-ins for unmeasurable goods. But metrics require enforcement, and enforcement at scale requires automation. Thus algorithmic management completes the circuit: surveillance systems that monitor compliance with proxies that serve financial extraction that demands ever-more granular surveillance.

The system exhibits what engineers call positive feedback—each turn of the cycle intensifies the next. When a hospital implements algorithmic scheduling to optimize “patient throughput,” nurses lose the discretionary time that allowed them to notice a patient’s unusual pallor, to sit with a frightened family, to mentor a new colleague. These relational goods—invisible to the algorithm—constituted precisely the trust infrastructure that made the institution function. Their erosion manifests as “burnout” and “turnover,” metrics that trigger further algorithmic intervention: mandatory resilience training, automated shift-filling, predictive analytics for flight risk. The machine responds to the damage it causes by tightening its grip.

The extraction also triggers arms races that consume what trust would have conserved. Surveillance begets gaming begets more surveillance; credentials that can be gamed demand more credentials; fraud that exploits low trust demands detection that sophisticated actors learn to evade. Each cycle forces greater investment in verification, compliance, and enforcement—resources a high-trust society would have devoted to cooperation and production. But these resources do not disappear. They flow to the industries that sell verification, compliance, and enforcement. The arms race is not merely overhead; it is revenue—and because it is revenue, it registers as growth. The econometrician sees prosperity where a society sees decay. An entire sector of the economy now depends on the perpetuation of distrust, and that sector has every incentive to ensure the arms race continues. The extraction machine doesn’t merely deplete trust; it creates the incentives for its own metastasis.

What makes the machine so difficult to oppose is its distributed agency. No individual actor intends to destroy trust. The fund manager responds to investor expectations. The CEO responds to the fund manager. The middle manager responds to the CEO’s targets. The algorithm responds to the manager’s parameters. The compliance consultant responds to the algorithm’s outputs. Responsibility dissolves across the system while the extraction compounds. Everyone is “just doing their job,” the phrase that excuses so much.

The extraction also distributes across time. Unfunded pension obligations are claims on future workers’ wages—debts incurred by those who will not pay them. Carbon emissions externalize costs onto those not yet born. Housing markets that treat shelter as an investment vehicle transfer wealth from those who must buy to those who already own, from the young to the old, from the future to the present. Each represents the same structure: decisions viable only because trust exists, but which deplete the trust they depend upon. Trust that obligations will be honored. That the climate will remain livable. That shelter will remain attainable. The intergenerational contract, the assumption that each generation leaves its successors no worse than it found them, is being strip-mined like every other commons. The extractors will not be present when the bill comes due.

The crowding-out mechanism adds a final efficiency. As extrinsic incentives replace intrinsic motivation, as surveillance replaces discretion, the workers who remain are increasingly those who have made their peace with the transactional frame. The nurse who found meaning in presence leaves; her replacement finds meaning in metrics met. The professor who cared about intellectual formation retires; her replacement understands the game. Through selection effects, the machine gradually populates institutions with people adapted to its logic—not villains, but survivors. The trust-rich culture that once made violation feel transgressive gives way to a trust-poor culture where extraction feels like common sense.

This is the machine’s final output: not just extracted value but transformed subjects. It does not merely harvest the commons; it renders the very concept of commons illegible to those who remain.

VIII.

We have bound markets before.

Child labor was efficient. The factory owner who employed children at subsistence wages outcompeted the one who did not, and the market, left to its own logic, selected for the employment of children. It took decades of agitation, documentation, and political mobilization to establish that some things must not be left to market logic—that certain goods, certain people, certain relationships must be held outside the optimization frame. The same pattern repeated with workplace safety, with environmental protection, with consumer fraud. In each case, the market produced extraction until external forces—law, regulation, social sanction enforced through political power—bound it.

These bindings were not anti-market. They were the conditions under which markets could function without consuming the substrate on which they depend. The factory that maims its workers externalizes costs onto families and communities; eventually, there are no workers healthy enough to employ. The plant that poisons the river externalizes costs onto downstream users; eventually, there is no water clean enough to use. Binding markets against these extractions was not an imposition on prosperity but its precondition.

Trust may be the most consequential commons we have failed to protect—perhaps because its depletion is harder to photograph than a maimed child or a burning river. There is no trust EPA, no trust inspector, no trust spill that makes the evening news. Worse, our central measure of national prosperity actively rewards its destruction: every lawyer hired to replace a handshake, every compliance officer employed to replace discretion, every surveillance system sold to replace trust—all of it counts as growth. We have defined success in terms that guarantee we will not see the failure until it is complete. The extraction proceeds invisibly until the system fails, and even then, the failure is blamed on anything but its actual cause: bad actors, aging infrastructure, cultural decay, anything except the market logic that made the extraction rational and the bindings we removed that once prevented it.

The language of market fundamentalism has made this blindness worse. For forty years, we have been told that binding is the problem—that regulation stifles, that constraints impede, that the market knows best. But the market does not know. The market optimizes. It optimizes for what it can measure, and trust cannot be measured. It optimizes for the short term, and trust is built over generations. It optimizes for private capture, and trust is a commons. To expect the market to preserve trust is to expect the mining company to preserve the mountain. The extraction is not a bug. It is the function.

And yet the friction industry will tell us that binding is impossible, that we cannot return to the world of the handshake, that the complexity of modern life requires the machinery they sell. They are not wrong that we cannot simply recreate the trading pit or the small-town hospital or the local newspaper. But they are wrong that the only alternative is the world they profit from. The question is not whether we can eliminate friction; some friction is necessary, even salutary. The question is whether we will protect the conditions under which trust can form—the repeated interactions, the long time horizons, the scale at which reputation can function, the professional autonomy that allows judgment to operate—or whether we will allow the friction industry to hollow out these conditions and sell us replacements forever.

Trust cannot be legislated into existence. But its destruction can be slowed. The conditions that allow trust to form can be protected. The extraction can be made less profitable, the time horizons lengthened, the ownership structures reformed, the professional autonomy defended. These are not utopian demands. They are the same work that earlier generations did when they bound markets against child labor and pollution and fraud. They knew what we have forgotten: that markets are tools, not masters, and that their considerable power must be channeled by constraints they cannot generate themselves.

IX.

Return now to Flint, to the seventy-year-old water main corroding beneath Dort Highway. Every system running on extracted trust is that pipe—stable until it catastrophically isn’t, still delivering what looks like water until the test comes back positive for lead.

We are living off that inheritance now, drawing down the principal while calling the withdrawals “productivity gains.”

The results are in. In Flint, they came back positive for lead. Elsewhere, they come back as Gloria Reyes, silencing her tablet to sit with a grieving man while the dashboard records it as waste. As Margaret Delacroix, retiring early because the threats have made her position untenable. As Susan Chen, watching patients she vaccinated as children refuse her guidance because an algorithm decided doubt was more engaging than trust. As Gerald Murray in room 408, telling a stranger about his wife’s roses because there is no one else who remembers.

The corrosion is already advanced. The only question is what we do now.

The traders in the pit knew something the algorithms cannot know: that they would see each other tomorrow. That memory mattered. That reputation was worth maintaining because they would have to live in the world they were building. We have optimized away the tomorrow. We have automated away the memory. We have sold the world we live in to people who will not have to live in what it becomes.

It was not progress. It was liquidation.

And the national accounts call it prosperity. Trust does not appear in GDP; its expensive replacements do. By our measures, the high-trust society looks poorer than the one frantically purchasing surveillance and compliance and litigation to replace what it has lost. We have defined wealth in terms that reward its destruction.

The handshake is not coming back. But the binding—the slow, difficult, political work of insisting that some things cannot be left to the market—can return. Previous generations faced this choice. It is our turn now.