
The United States is not drifting toward inequality by accident or market force. It is being legislated there, deliberately, through a decades-long campaign of regulatory capture, tax code reengineering, and institutional defunding — all packaged as freedom, innovation, and fiscal responsibility. What follows is a MECE breakdown of the six interlocking mechanisms that built two countries inside one border.
Regulatory Capture: The Shell Game Has a Name (Structural)
Regulatory capture is the process by which the agencies created to police an industry come to be controlled by that industry. It is not a conspiracy theory — it is an academic term with decades of documented evidence. When a bank CEO's former chief of staff runs the Consumer Financial Protection Bureau, when a pharmaceutical lobbyist is appointed to head the FDA's drug approval office, the regulation doesn't disappear — it is redirected.
The Powell Memo of 1971 — written by future Supreme Court Justice Lewis Powell for the U.S. Chamber of Commerce — is the founding document of modern corporate influence strategy. It explicitly called for business to invest in think tanks, law schools, media, and political campaigns to shift the ideological terrain. Within a decade, the architecture was in place: Heritage Foundation, Cato Institute, ALEC, and a revolving door between industry and the agencies meant to regulate it.
- Revolving door: Industry veterans staff regulatory agencies, then return to industry at 3–5x salary
- Rulemaking capture: Technical complexity of regulations allows industry to write the rules they're supposed to follow
- Enforcement budget cuts: Congress defunds agencies like the IRS, SEC, EPA — enforcement collapses without new laws needed
- Think-tank laundering: Industry-funded research presented as neutral policy analysis, cited in legislation
Deregulation is rarely the removal of rules. It is the replacement of rules that protect the public with rules that protect the regulated.
From Public Good to Debt Trap (Education)
The GI Bill of 1944 was the single greatest wealth-building investment in American history. It sent 7.8 million veterans to college or vocational training at essentially zero cost, creating the suburban middle class, the engineering workforce that built the space program, and the consumer economy of the postwar boom. That policy worked. So it was systematically dismantled.
1944: The GI Bill provides free college for returning veterans. Average private tuition: roughly $400 a year at the time.
1965: The Higher Education Act creates federal student loans — the first structural shift from grants to debt.
1981: Reagan cuts federal education spending. States follow. Universities pivot to tuition as primary revenue.
1998: Bankruptcy reform makes student loans nearly impossible to discharge — the debt trap locks permanently.
2005: The Bankruptcy Abuse Prevention Act exempts private student loans as well. Lenders face zero default risk.
2024: Average 4-year public university cost exceeds $100,000 total. Student debt surpasses $1.77 trillion.
This wasn't market failure. It was legislative construction. The laws that made student debt non-dischargeable were written with heavy lobbying by Sallie Mae and the banking industry. The defunding of state universities was a political choice. The result: a generation entering the workforce with average debt of $37,000+ that compounds at rates designed to be unpayable, feeding an industry that profits from the inability to repay.
The Most Expensive, Sickest Rich Country (Healthcare)
The United States spends more per capita on healthcare than any nation on Earth — nearly double the OECD average — and gets worse outcomes than almost all of its wealthy peers on life expectancy, infant mortality, and maternal mortality. This is not a system that is failing to work. It is a system that is working exactly as designed — to extract revenue, not produce health.
- 1973 HMO Act: Nixon administration, working with insurance industry, embedded profit motive into healthcare delivery structure
- Pharmacy Benefit Managers: An invented middleman layer that now controls drug pricing, with no price transparency requirements
- Drug patent extension laws: "Evergreening" allows minor reformulations to reset 20-year monopoly patents, blocking generics
- Medicare Modernization Act (2003): Explicitly prohibited Medicare from negotiating drug prices — written by PhRMA lobbyists
- Certificate of Need laws: In many states, hospitals must get government permission to add capacity — enforced by existing hospital lobbies to block competition
- Out-of-network billing: A legal mechanism that allows providers to charge uninsured rates to insured patients without consent
When the law explicitly bans the government from negotiating drug prices, that is not policy — that is a protection racket with a congressional seal.
The Tax Code Was the Heist (Fiscal)
No mechanism of wealth concentration is more powerful or more invisible than the tax code. Since the 1980s, every major tax reform has followed the same arc: headline cuts presented as broad relief that disproportionately benefit the top 1%, paired with the elimination of deductions that primarily benefited the middle class, and the gradual shift of tax burden from capital to labor.
1940s–1970s: Top marginal income tax rate of 91% (WWII era), later 70%. The era of maximum middle-class expansion in U.S. history.
1981: Reagan's Economic Recovery Tax Act cuts the top rate from 70% to 50%, then (with the 1986 Tax Reform Act) to 28%. The largest top-rate cut in U.S. history.
2003: Bush dividend and capital gains tax cuts mean investment income is taxed at 15% while top salary income is taxed at 35%.
2017: The Tax Cuts & Jobs Act cuts the corporate rate from 35% to 21%, adds the pass-through deduction, and doubles the estate tax exemption. The top 1% captured 83% of the benefit.
2025: IRS budget cut proposals and staff reductions under DOGE — enforcement capacity against high-income filers further weakened.
The carried interest loophole — which allows private equity and hedge fund managers to pay capital gains rates (20%) on income that is functionally a salary — has survived 40 years of bipartisan promises to close it. It survives because the industry spends $600M+ per election cycle in lobbying and campaign contributions. The loophole is worth approximately $18 billion annually to a few thousand people.
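To make the loophole's arithmetic concrete, the short sketch below compares the tax owed on a hypothetical $10 million carried-interest allocation under the two rates cited above (20% capital gains vs. the 37% top ordinary rate). The dollar figure and the flat-rate treatment are illustrative assumptions; a real return would involve brackets, payroll taxes, the net investment income tax, and deductions.

```python
# Illustrative only: tax on a hypothetical $10M "carry" allocation taxed as
# long-term capital gains vs. as ordinary salary income, using the stylized
# 20% and 37% rates cited in the text. Ignores brackets, payroll taxes,
# the 3.8% net investment income tax, and deductions.

CARRY = 10_000_000          # hypothetical performance fee
CAP_GAINS_RATE = 0.20       # carried-interest (capital gains) treatment
ORDINARY_RATE = 0.37        # top marginal rate on salary income

tax_as_carry = CARRY * CAP_GAINS_RATE
tax_as_salary = CARRY * ORDINARY_RATE

print(f"Taxed as carried interest: ${tax_as_carry:,.0f}")
print(f"Taxed as ordinary income:  ${tax_as_salary:,.0f}")
print(f"Difference:                ${tax_as_salary - tax_as_carry:,.0f}")
# Roughly $1.7 million less tax on every $10 million of carry under these assumptions.
```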
When Shelter Became an Asset Class (Housing)
Housing in the United States has been converted from a social good — a place to live — into a financial instrument whose primary purpose is return on investment. This did not happen through market forces. It happened through a specific set of laws and tax policies that made housing speculation more profitable than productive investment, while making it structurally illegal to build enough homes in most American cities.
- 1031 Exchange: Tax code allows indefinite deferral of capital gains on investment property sales — incentivizing holding over sale
- Single-family zoning: Mandated by local law on roughly 75% of residential land in major U.S. cities, making multifamily housing illegal by default
- Prop 13-style laws: Freeze property taxes for existing owners, creating massive incentive to hold rather than sell or build
- Opportunity Zone abuses: 2017 TCJA created tax shelters nominally for development; billions flowed to luxury projects in already-wealthy areas
- Institutional landlord deregulation: Private equity firms like Blackstone have assembled hundreds of thousands of single-family homes; algorithmic pricing software coordinates rents across "competing" landlords
- Short-term rental loopholes: Airbnb/VRBO removed long-term rental supply in high-demand cities; most regulation blocked at state level via industry lobbying
The financialization of housing has created a self-reinforcing system: the more expensive housing becomes, the higher the return for institutional investors, the more political power they accumulate to block zoning reform, the less housing is built, the more expensive it becomes. Workers cannot afford to live near jobs. Young people cannot build generational wealth through homeownership as their parents did. The asset class that built the American middle class is now inaccessible to it.
The Propaganda Is the Product (Cultural)
Every mechanism above requires public acquiescence — or at least public confusion. The final pillar is the manufactured ideology that makes ordinary people vote against their material interests, oppose programs that benefit them, and direct their anger at other working people rather than at the system extracting from all of them.
- Welfare queen mythology: Reagan's racialized "welfare queen" narrative — based on one disputed case — reshaped public opposition to social programs for four decades
- "Job creator" framing: Billionaires rebranded as benevolent producers; taxation reframed as theft from people who earned everything alone
- Anti-union culture war: Decades of legal and PR assault on unions — the primary mechanism through which workers captured a share of productivity — reduced membership from 35% to 10%
- "Innovation" as distraction: Tesla, crypto, and SPACs elevated as proof the system works, while median real wages stagnated for 50 years
- Deficit hawkery selectivity: Deficits are a crisis when funding schools or healthcare; they are invisible when funding tax cuts or defense — the asymmetry is the message
- "Anti-woke" misdirection: Culture war issues consuming political oxygen while major wealth transfers happen through reconciliation bills and regulatory rollbacks
When billionaires declare it "almost racist" to discuss taxing them, they are deploying the language of civil rights to protect the infrastructure of aristocracy. This is not coincidence. It is craft.
The success of this narrative capture is measurable: the United States has among the lowest levels of class consciousness and labor organizing in the OECD, despite having among the highest inequality. Americans consistently overestimate the amount of redistribution that already occurs and underestimate the concentration of wealth at the top. This is not natural. It is produced.
The Verdict: Two Countries, One Border
This is not a story of a system that tried and failed. It is a story of a system that succeeded — for the people who designed it. The laws did exactly what they were written to do. The regulations were captured on purpose. The defunding was deliberate.
The United States is not becoming a "third world country" in the technical sense. It is becoming something more precisely described: a first-world oligarchy with third-world distribution — world-class aggregate wealth, developing-nation outcomes for the bottom half.
The single most important thing to understand is that none of this required a secret meeting. It required sustained, well-funded, decades-long pressure on legislators, regulators, judges, and media — and it worked. The question is not whether this is happening. The evidence is in the law books. The question is what sufficient public clarity about it might eventually demand in response.
FROM AFFORDABLE TO UNAFFORDABLE
How Policy, Law, and the Search for Profit Transformed American Healthcare
A Fact-Filled FAQ
Based on CMS, OECD, KFF, Commonwealth Fund & peer-reviewed data
PART I: WHAT HEALTHCARE LOOKED LIKE AFTER WORLD WAR II
Before the systematic financialization of American medicine, healthcare was a different world. A doctor’s visit was affordable to a working-class family. A hospital stay would not bankrupt you. The United States had built, through union bargaining and nonprofit insurance, a system that broadly worked — not perfect, not universal, but functional and humane in a way today’s system is not.
The Postwar Healthcare Landscape (1945–1965)
Following World War II, American healthcare was characterized by:
- Blue Cross (hospital insurance) and Blue Shield (physician insurance) — both nonprofit plans — covered the majority of insured Americans.
- Employer-sponsored insurance, which had taken root during the war when the 1942 Wage Stabilization Act prevented employers from raising wages, became the dominant coverage model. By 1958, three-quarters of the 123 million Americans with private coverage were in employer-based programs.
- The plans were designed as genuine risk-pooling, not profit extraction. Premiums were community-rated, meaning a factory worker paid the same as an executive.
- A doctor’s office typically had one physician and a nurse-receptionist. Patients paid by check. Overhead was minimal. There was no prior authorization, no coding department, no insurance liaison.
- A 1933 hospital birth in North Carolina cost $60 total, including a 10-day stay for the mother. By the mid-1950s, an average family could see a doctor, cover a hospital stay, and manage prescription costs without financial ruin.
When I was young in the 1950s and 1960s, we would go to the doctor’s office. There were the doctor and his nurse/receptionist. My mother would pay by check. The doctor did not have billing staff. The doctor did not accept insurance — patients gathered receipts for reimbursement. It was simple.
— First-person account, EH.net oral history archive
The Numbers: Then vs. Now
The contrast between postwar healthcare costs and today’s reality is not merely striking — it is a documented indictment of six decades of policy choices.
KEY DATA COMPARISON
- 5% — Healthcare as % of GDP (1960): 1 in every 20 dollars in the economy
- 13.3% — Healthcare as % of GDP (2000): more than doubled in 40 years
- 18.0% — Healthcare as % of GDP (2024): nearly 1 in every 5 dollars (CMS official data)
- $74.1B — Total U.S. health spending (1970), all healthcare combined
- $5.3 trillion — Total U.S. health spending (2024), a 71x increase (CMS National Health Expenditure data)
- $15,474 — Per-capita spend (2024), highest of any nation on Earth
- ~$6,500 — OECD average per-capita spend; the U.S. spends more than twice the rich-country average
- ~$2,500/year — Household healthcare cost (1984), typical family annual spend
- ~$10,000+/year — Household healthcare cost (now), a 300%+ real increase; wages grew only 18% in the same period
- 740% — Insurance premium increase since 1984, the single biggest driver of household cost growth
PART II: THE LAWS THAT CHANGED EVERYTHING
American healthcare did not become unaffordable by accident or market force alone. It was legislated into its current form through a series of specific laws, regulatory decisions, and political choices — each sold to the public with different justifications, each serving to transfer wealth from patients to corporations.
1942: The Accident That Created Employer-Based Insurance
The Stabilization Act of 1942, designed to prevent wartime inflation, banned employers from raising wages. To compete for scarce workers, employers began offering health insurance as a tax-free benefit. The IRS formalized this in 1954, cementing employer-based insurance as the structural foundation of American healthcare. This was never designed as a permanent system — but it became one, with enormous consequences: when Americans lose their jobs, they lose their healthcare.
1965: Medicare and Medicaid — Landmark Progress, Unintended Inflation
The creation of Medicare and Medicaid under President Johnson was a genuine moral achievement, extending coverage to the elderly and the poor. But the payment model — which reimbursed hospitals at their “usual, customary, and reasonable” fees without hard price controls — triggered the first major healthcare cost explosion. From 1960 to 1970, healthcare’s share of GDP grew faster than in any decade before or since. The lesson policymakers drew was not to control prices — it was to let the system grow.
1973: The HMO Act — When Profit Entered the Exam Room
This is the critical turning point. Before 1973, for-profit managed-care organizations were prohibited or tightly restricted in many states. The Health Maintenance Organization Act of 1973, signed by Nixon after intensive lobbying, required employers with 25+ workers to offer federally certified HMOs as an option and — crucially — cleared the path for for-profit insurance structures. The stated goal was cost control through managed care. The actual result was the industrialization of medicine and the insertion of shareholder returns as a primary design principle of American healthcare.
1982–1983: DRGs and Prospective Payment — The Coding Revolution
The Tax Equity and Fiscal Responsibility Act of 1982 and the subsequent Prospective Payment System (1983) changed how Medicare paid hospitals: instead of costs incurred, it paid fixed rates by diagnosis code (DRG — Diagnosis Related Group). This created massive pressure to game codes, discharge patients faster, and maximize billing. The army of coders, billing specialists, and compliance officers that now consumes 25-35 cents of every healthcare dollar was born here.
1992: The RUC — The Secret Committee That Sets Doctor Pay
The AMA’s Relative Value Scale Update Committee (RUC) is a private, physician-dominated body that effectively sets Medicare’s payment rates for every medical service. Since 1992, it has systematically overvalued specialist procedures and undervalued primary care — driving medical students toward specialties and away from general practice, contributing to the primary care shortage. This committee operates with no public transparency and no government oversight.
2003: The Medicare Drug Prohibition — Written by Lobbyists
The Medicare Modernization Act of 2003 created the prescription drug benefit (Part D) — a genuine expansion of coverage. But it contained a poison pill: Section 1860D-11(i) explicitly prohibits Medicare from negotiating drug prices. The Veterans Administration negotiates drug prices and pays 40-58% less than Medicare for identical drugs. Every other large OECD government negotiates. This prohibition — worth hundreds of billions to pharmaceutical companies — was drafted with direct industry participation. Billy Tauzin, the congressman who shepherded the bill, left Congress shortly afterward to become president of PhRMA at a salary of $2 million per year.
2010: The ACA — Genuine Reform, Structural Preservation
The Affordable Care Act achieved real gains: 20+ million newly insured Americans, elimination of pre-existing condition exclusions, children covered until 26, Medicaid expansion in 37 states. It also preserved the fundamental profit-extraction structure of American healthcare, mandating that Americans buy private insurance rather than creating a public option. The ACA made a broken system more inclusive without fixing the underlying brokenness. Costs continued to rise at rates above inflation throughout the decade.
PART III: FREQUENTLY ASKED QUESTIONS
Q: How did the U.S. go from 5% of GDP on healthcare in 1960 to 18% today?
A: The trajectory is documented by the Centers for Medicare & Medicaid Services (CMS) and is unambiguous: 5% of GDP in 1960, rising to 13.3% by 2000, and reaching 18.0% in 2024. In nominal dollars, total U.S. health spending went from $74.1 billion in 1970 to $5.3 trillion in 2024 — a 71-fold increase in roughly 50 years. The drivers, per CMS analysis, were not increased utilization (Americans do not go to the doctor more than Europeans) but higher prices for identical services, administrative overhead, and the profit extraction layer inserted into every transaction.
CMS projects that by 2033, healthcare could reach 20% of GDP — one-fifth of the entire economy — representing $8.6 trillion per year.
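As a rough arithmetic check on the scale of that increase, the sketch below computes the compound annual growth rate implied by the two CMS endpoints quoted above; it is an illustration of the growth math only, not an inflation-adjusted or per-capita figure.

```python
# Rough check: compound annual growth rate (CAGR) implied by the CMS figures
# cited above -- $74.1 billion in 1970 to $5.3 trillion in 2024 (nominal).

start, end = 74.1e9, 5.3e12
years = 2024 - 1970

multiple = end / start                    # ~71x, matching the text
cagr = multiple ** (1 / years) - 1        # implied annual nominal growth rate

print(f"Growth multiple: {multiple:.1f}x over {years} years")
print(f"Implied growth rate: {cagr:.1%} per year")
# Roughly 8% per year, every year, for more than five decades.
```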
Q: Was healthcare actually affordable in the 1950s and 1960s? Didn’t people struggle then too?
A: Yes, there were people without coverage then, and healthcare had genuine gaps — the elderly and the poor were largely uncovered before Medicare and Medicaid (1965). But for the working middle class with employer coverage, healthcare costs were a small fraction of income. The nonprofit Blue Cross/Blue Shield model spread risk without profit extraction. A doctor’s visit was a direct transaction, not a billing event processed by three intermediary companies. The defining difference: no working family with insurance filed for bankruptcy because of a medical bill. That phenomenon — medical bankruptcy — is a modern American invention.
Q: How many Americans go bankrupt because of medical bills today?
A: Medical costs are a contributing factor in roughly 59% of all U.S. personal bankruptcies, according to KFF/NYT survey data and Investopedia analysis. Each year, hundreds of thousands of Americans are pushed into bankruptcy by healthcare debt. This is virtually unknown in any other wealthy country. Medical bankruptcies in Europe are, statistically, nearly nonexistent — because other systems do not allow medical costs to be catastrophic.
About 50% of U.S. adults say they believe they would go bankrupt if faced with a significant health event. More than 250,000 Americans resort to GoFundMe crowdfunding for medical bills every year — that is 1 in 3 GoFundMe campaigns. Approximately 90% of those campaigns fail to reach their goal.
Q: If the U.S. spends so much on healthcare, why are outcomes so bad?
A: The United States spends 2-4 times what peer nations spend per capita on healthcare and achieves worse outcomes on nearly every major health metric. Per the 2025 America’s Health Rankings report (citing OECD data):
- U.S. life expectancy: 79.0 years (2024) vs. an OECD peer-country average of 82.7 years — a gap of 3.7 years.
- The U.S. ranked No. 32 out of 38 OECD countries in life expectancy — below Colombia and just above Poland.
- U.S. infant mortality: 5.6 deaths per 1,000 live births — ranked No. 32 of 38 OECD nations. Estonia leads at 1.7. Japan: 1.7. The U.S. rate is more than 3x the best performers.
- Mississippi’s infant mortality rate (9.0 per 1,000) is more than twice the OECD average — a rate comparable to developing nations.
- U.S. maternal mortality: 23.8 deaths per 100,000 live births in 2020 — more than 3 times the rate of most other high-income countries.
- Black Americans’ life expectancy: 74.8 years. American Indians/Alaska Natives: 71.8 years. These figures are below the averages of many middle-income countries.
The high spending is not purchasing health. It is purchasing administrative overhead, pharmaceutical profit, insurance company margins, and executive compensation.
Q: What percentage of healthcare dollars goes to administration, not care?
A: The U.S. spends roughly 25-35% of all healthcare dollars on administrative costs — billing, coding, prior authorization, insurance liaison staff, claims processing, and compliance. Canada, with a single-payer system, spends approximately 12% on administration. The difference — estimated at $400-500 billion annually — is the cost of maintaining the multi-payer complexity. Many American hospitals, clinics, and doctors’ offices employ more billing staff than clinical staff.
Q: Why does insulin — a 100-year-old drug — cost hundreds of dollars in the U.S.?
A: Insulin was discovered in 1921. Its inventors sold the patent to the University of Toronto for $1, explicitly so it would remain affordable. The cost to manufacture a vial of modern insulin is estimated at $2-4. Yet in 2019, one vial of Humalog (insulin lispro) cost $332 — a 1,400% increase from $21 in 1999. The mechanism is not research cost (this drug is over 20 years old); it is a legal structure that allows pharmaceutical companies to serially reformulate drugs to reset patent protections, blocking generic competition. This is called “evergreening.” Three companies — Eli Lilly, Novo Nordisk, and Sanofi — control essentially all U.S. insulin supply and raised prices in lockstep for two decades.
Americans with Type 1 diabetes, who require insulin to survive, have rationed doses, used expired vials, and in documented cases, died — because they could not afford a medication that costs $4 to manufacture. This is a policy outcome, not a market outcome.
Q: What is a Pharmacy Benefit Manager and why does it matter?
A: Pharmacy Benefit Managers (PBMs) are intermediary companies that manage prescription drug benefits on behalf of insurers and employers. The three largest — CVS Caremark, Express Scripts, and OptumRx — control approximately 80% of U.S. prescription drug transactions. They negotiate “rebates” from drug manufacturers that they largely retain as profit rather than pass to patients. They determine which drugs are covered (formularies), at what cost-share to patients, and at which pharmacies. They operate with minimal transparency requirements. The rebate system creates a perverse incentive: higher list prices generate larger rebates, so PBMs benefit from expensive drugs, not cheap ones.
Q: Why can’t Medicare negotiate drug prices?
A: Because Congress made it illegal. Section 1860D-11(i) of the Medicare Modernization Act of 2003 explicitly prohibits the Secretary of Health and Human Services from negotiating drug prices. The Veterans Administration, which is not subject to this prohibition, pays 40-58% less for the same drugs. The Inflation Reduction Act of 2022 allowed negotiation on a limited set of drugs (beginning with 10 in 2026, expanding slowly thereafter) — a partial, contested reform achieved after 20 years of lobbying failure. PhRMA spent $400 million lobbying against this provision over five years.
Q: What is ‘prior authorization’ and is it medically necessary?
A: Prior authorization is a requirement that physicians get permission from an insurance company before providing certain treatments, drugs, or procedures. Originally a narrow cost-control tool, it has expanded to cover thousands of routine medical decisions. Physicians now spend an average of 14.9 hours per week on prior authorization paperwork, according to AMA surveys. The AMA reports that 94% of physicians say prior authorizations delay necessary care; 33% say delays have led to serious adverse events for patients, including hospitalization or permanent impairment.
Insurance companies deny prior authorization requests at rates ranging from 5-20% depending on plan type. Of those denials that are appealed, the majority are eventually approved — meaning the initial denial served primarily to delay care and discourage patients from persisting. This is not medicine. It is administrative attrition.
Q: How does hospital billing actually work, and why is it so opaque?
A: Every U.S. hospital maintains a “chargemaster” — an internal price list for every service. These prices are largely fictional: they are the starting point for negotiation with insurers, who pay a fraction of the chargemaster rate. Uninsured patients are typically billed at chargemaster rates — the highest possible price, charged to those least able to pay. A heart bypass surgery that costs $78,318 in the U.S. costs $25,059 in the UK and significantly less in other wealthy nations. Hospital price transparency regulations (fully effective 2021) required hospitals to publish their prices. A 2022 study found that more than 70% of hospitals were still not in full compliance.
Q: Who profits from this system?
A: The major beneficiaries are publicly documented through SEC filings and executive compensation data:
- UnitedHealth Group (2024 revenue): $371 billion — larger than the GDP of many countries.
- CVS Health/Aetna: $372 billion in combined revenue.
- The top 5 health insurance CEOs averaged $18-25 million in annual compensation in 2023.
- Hospital systems: the top 25 nonprofit hospital systems had combined operating surpluses of $37 billion in 2022.
- Pharmaceutical companies: the top 10 pharma companies earned over $500 billion in revenue in 2023.
Many of the large hospital systems are classified as “nonprofit,” which exempts them from corporate taxes, while paying executive teams tens of millions annually and lobbying aggressively against any policy that would constrain their revenue.
Q: How does the U.S. compare to other wealthy countries on health coverage?
A: Every country in the Commonwealth Fund’s comparison study — Australia, Canada, France, Germany, Japan, Netherlands, New Zealand, Norway, South Korea, Sweden, Switzerland, UK — guarantees government or public health coverage to all residents. The United States is the sole exception. In 2021, 8.6% of Americans — approximately 28 million people — had no health coverage of any kind. Many more are “underinsured”: technically covered but with deductibles and co-pays so high that they effectively cannot access care without financial risk.
Q: Is this about to get better or worse?
A: As of 2026, the trajectory is worsening on several fronts: proposals to restructure Medicaid (which covers 80+ million Americans) through block grants and work requirements would reduce coverage for millions of low-income adults and children. IRS enforcement budget cuts reduce collection of taxes that fund public programs. Drug price negotiation provisions of the Inflation Reduction Act face ongoing legal and legislative challenges. The 2025 CMS data show healthcare spending growing at 7.2% annually — well above both wage growth and general inflation. Without structural reform, the system is projected to consume 20% of GDP by 2033.
PART IV: WHAT THE EVIDENCE SAYS ABOUT SOLUTIONS
The comparative international data is not ambiguous. Countries that achieve better health outcomes at lower cost share common structural features. This is not ideology — it is health systems research.
What Works, Based on Evidence
- Administrative simplicity: single-payer or tightly regulated multi-payer systems — Canada spends ~12% on admin vs. 25-35% in the U.S.
- Drug price negotiation: government negotiates directly with manufacturers — the VA pays 40-58% less than Medicare for identical drugs.
- Primary care investment: systems that pay GPs well attract doctors to primary care — the U.S. has a severe primary care shortage due to the specialist pay gap.
- Universal coverage: eliminates charity-care cost-shifting and ER overuse — every peer nation covers all residents.
- Preventive investment: early intervention dramatically reduces downstream costs — OECD data consistently links preventive spending to lower total costs.
The United States does not have a healthcare system. It has a healthcare market. Markets optimize for profit. Healthcare optimizes for health. These are not the same objective, and for 50 years, we have been pretending they are.
The Cost of Inaction
Every year the United States delays structural healthcare reform:
- Approximately 26,000-45,000 Americans die annually from lack of health coverage, according to peer-reviewed estimates (American Journal of Public Health).
- Hundreds of thousands of families file for bankruptcy due to medical bills.
- The portion of GDP consumed by healthcare grows, crowding out wages, housing, education, and other productive investment.
- Employer-sponsored insurance premiums rise, effectively functioning as a tax on employment that falls hardest on small businesses and middle-income workers.
- The uninsured and underinsured delay care until conditions become acute and expensive, entering the system through emergency rooms at maximum cost to everyone.
The question is not whether reform costs money. The question is whether the current system’s costs — in dollars, in lives, in bankruptcies, in rationed insulin, in maternal deaths — are acceptable. The data say they are not.
SOURCES & DATA
- Centers for Medicare & Medicaid Services (CMS) — National Health Expenditure Data, Historical Series 1960–2024
- Peterson-KFF Health System Tracker — U.S. Health Spending Over Time; Life Expectancy Comparisons
- Commonwealth Fund — Mirror, Mirror 2023: A Portrait of the Failing U.S. Health System
- America’s Health Rankings — 2025 Annual Report, International Comparison (OECD data)
- AJMC — U.S. Has Highest Infant & Maternal Mortality Despite Most Healthcare Spending (2023)
- CMS / Health Affairs — History of Health Spending in the United States, 1960–2013 (Catlin & Cowan, 2015)
- NCBI — Taxpayers’ Share of U.S. Prescription Drug and Insulin Costs (2024)
- Policy Matters Ohio — Insulin Price-Gouging Kills (2025); original insulin patent history
- EH.net — Health Insurance in the United States; The Blues: History of Blue Cross and Blue Shield
- NCBI Bookshelf — Origins and Evolution of Employment-Based Health Benefits
- AMA — Prior Authorization Survey Data (annual)
“Third World America?”
Wealth Inequality, Poverty, and the Changing American Dream
An AP High School Reading Article
For much of the 20th century, the United States was viewed as one of the wealthiest and most economically powerful nations in history. Millions of families believed in the “American Dream”—the idea that hard work, education, and persistence could lead to economic security and upward mobility.
Yet in recent decades, many economists, historians, and political commentators have argued that the United States is becoming increasingly divided between the wealthy and everyone else. Some critics use the phrase “Third World America” to describe growing poverty, crumbling infrastructure, underfunded schools, housing insecurity, and widening inequality in parts of the country.
The term is controversial. The United States remains one of the richest nations on Earth, with enormous technological innovation and high overall economic output. However, critics argue that many Americans now experience conditions more commonly associated with struggling developing nations: unstable housing, unaffordable healthcare, poor public transportation, food insecurity, and declining economic opportunity.
One of the most famous discussions of this idea appeared in Third World America by Arianna Huffington. Economist Peter Temin later described the United States as developing into a “dual economy,” where one group benefits from globalization and wealth concentration while another struggles with stagnant wages and limited opportunity.
The Rise of Economic Inequality
Economic inequality refers to the uneven distribution of wealth and income across society. Since the 1970s, income growth in the United States has increasingly favored top earners.
Research from the Pew Research Center shows that the percentage of Americans considered middle class fell from about 61% in 1971 to around 50% by the early 2020s. Meanwhile, the share of upper-income households increased significantly.
Today, the wealthiest Americans control a historically large percentage of national wealth. At the same time, many working- and middle-class families report living paycheck to paycheck despite working full-time jobs.
Some economists point to several major causes:
- Globalization and outsourcing of manufacturing jobs
- Automation replacing certain forms of labor
- Declining union membership
- Rising healthcare and housing costs
- Tax policies favoring corporations and high earners
- Reduced government investment in public goods
- Increasing costs of higher education
Critics argue that beginning in the 1970s and 1980s, economic policies shifted away from New Deal-style protections for workers toward deregulation, privatization, and tax reductions for wealthy individuals and corporations. Supporters of these policies argue they encouraged investment and economic growth. Opponents argue they accelerated wealth concentration.
The Disappearing Middle Class
The American middle class historically represented stable jobs, home ownership, savings, pensions, and access to education. Many families could survive on one income during the 1950s and 1960s.
Today, that reality has changed dramatically.
According to recent economic reports, household debt has reached historic highs, including credit card debt, medical debt, student loans, and housing costs. Even many households earning middle-class incomes report financial insecurity.
A major issue is that wages for many workers have not risen at the same pace as:
- Housing prices
- Healthcare costs
- Childcare costs
- College tuition
- Inflation
For many Americans, economic growth feels uneven. Stock markets and corporate profits may rise, while ordinary families struggle to afford rent, groceries, or emergency expenses.
Poverty in America
Despite enormous national wealth, poverty remains a major issue in the United States.
According to the U.S. Census Bureau, about 35.9 million Americans lived in poverty in 2024, representing approximately 10.6% of the population.
Child poverty remains especially concerning. Millions of children experience food insecurity, unstable housing, or lack access to adequate healthcare and educational resources.
Researchers also note that poverty looks different across regions:
- Rural poverty in former manufacturing or mining communities
- Urban poverty linked to housing costs and segregation
- Suburban poverty hidden in areas previously considered middle class
Critics argue that official poverty statistics may underestimate hardship because they do not always fully account for modern living costs such as housing, transportation, healthcare, and debt.
Infrastructure and Public Decline
Another reason some commentators use the term “Third World America” is concern about declining infrastructure.
Across the nation:
- Bridges and roads require repair
- Water systems are aging
- Public transportation systems are uneven or outdated
- Schools in low-income areas are often underfunded
Events such as the Flint Water Crisis shocked many Americans because they revealed that basic public services could fail dramatically even in a wealthy nation.
Infrastructure experts regularly warn that the United States has underinvested in public systems for decades compared to many other developed nations.
Healthcare and Life Expectancy
The United States spends more on healthcare than almost any country in the world, yet health outcomes are uneven.
Many Americans face:
- High medical debt
- Lack of affordable insurance
- Limited access to healthcare in rural regions
- Shortages of mental health services
Researchers have also documented “deaths of despair,” including increases in:
- Drug overdoses
- Alcohol-related illness
- Suicide rates in some communities
Some economists argue these trends reflect deeper social and economic instability.
Education and Opportunity
Education has traditionally been viewed as the pathway to upward mobility in America. However, critics argue that educational inequality mirrors economic inequality.
Schools in wealthier neighborhoods often have:
- Better funding
- More experienced teachers
- Advanced academic programs
- Stronger extracurricular opportunities
Meanwhile, schools in low-income areas may face:
- Teacher shortages
- Larger class sizes
- Aging facilities
- Limited resources
Student loan debt has also exploded over the past several decades, leaving many young adults burdened financially before beginning their careers.
Competing Perspectives
Not everyone agrees that the United States is becoming a “Third World” country.
Some economists argue:
- Living standards remain historically high overall
- Technology has improved quality of life
- Extreme poverty globally is much worse than poverty in the U.S.
- Many middle-class households moved upward economically rather than downward
Others counter that averages hide severe inequality and regional hardship. They argue that economic growth increasingly benefits a small percentage of society while many workers experience insecurity and declining stability.
This debate remains highly political and emotionally charged.
Historical Roots of the Debate
Historians disagree about when these economic changes began.
Some trace the roots back to:
- Post-World War II suburbanization
- The decline of unions in the 1950s and 1960s
- Deindustrialization in the 1970s
- Reagan-era economic policies in the 1980s
- Globalization and outsourcing in the 1990s
- Financialization of the economy in the 2000s
Others point to broader global forces such as automation and technological change that affected many industrial nations, not just the United States.
Conclusion
The phrase “Third World America” is not a literal classification. The United States remains a wealthy, technologically advanced nation. However, the term reflects growing concern that millions of Americans experience economic insecurity, declining opportunity, and failing public systems despite the nation’s immense wealth.
At the center of this debate are fundamental questions:
- Who benefits from economic growth?
- What responsibilities does government have toward citizens?
- What defines a healthy middle class?
- Can the American Dream still be achieved by most people?
These questions continue to shape political debates, economic policy, and the future of American society.
Socratic Seminar Questions
- What defines a “middle class” society?
- Is the American Dream still attainable for most Americans? Why or why not?
- Should wealth inequality be considered a major threat to democracy?
- What role should government play in reducing poverty?
- Are high levels of economic inequality inevitable in capitalist societies?
- Which is more important: economic growth or economic equality?
- How does education affect economic opportunity?
- Should healthcare be considered a human right?
- What causes poverty more often: personal choices or systemic issues?
- How does infrastructure influence economic opportunity?
- Are tax cuts for wealthy individuals beneficial or harmful overall?
- What responsibilities do corporations have toward workers and communities?
- How has globalization changed the American economy?
- Should college education be publicly funded?
- What policies could strengthen the middle class?
- Is the phrase “Third World America” accurate, exaggerated, or politically manipulative?
- How does media influence perceptions of economic reality?
- What historical turning points most changed the American economy after World War II?
- How does poverty affect children differently than adults?
- What would a fair economic system look like to you?
THE BROKEN PROMISE
How America Dismantled Its Education System — From the GI Bill to the Student Debt Trap
An analytical essay for AP / A-Level students
Sources: OECD Education at a Glance 2024 • CMS • Federal Reserve • Education Commission of the States • Learning Policy Institute
I. THE CONSTITUTIONAL PROMISE: FREE EDUCATION FOR ALL
The founding logic of American public education was radical for its time and remains radical today: that a democracy cannot survive without an educated citizenry, and that the state therefore has an obligation to provide that education at public expense. This was not an abstract ideal. It was written into the constitutions of all fifty states.
The Constitutional Foundation
Every state in the United States has an “education article” in its constitution, mandating the creation of a public education system. The language is often striking in its ambition. A sample:
- Indiana: “Knowledge and learning, generally diffused throughout a community, being essential to the preservation of a free government; it should be the duty of the General Assembly to provide, by law, for a general and uniform system of Common Schools, wherein tuition shall be without charge, and equally open to all.”
- Florida: “The education of children is a fundamental value of the people of the State of Florida. It is, therefore, a paramount duty of the state to make adequate provision for the education of all children…”
- Arizona: Article XI, Section 6 of the state constitution stated that instruction at state universities “shall be as nearly free as possible.” The University of California operated on the same principle, charging no tuition to state residents until 1968.
- Thirty-six of thirty-seven states at the time of the Fourteenth Amendment’s passage (1868) included some right to education in their state constitutions, according to legal scholars Calabresi and Agudo.
These provisions were not decoration. They reflected a consensus forged over more than a century: that education was a public good, a democratic imperative, and an economic investment — not a private commodity to be priced by the market.
Knowledge and learning, generally diffused throughout a community, being essential to the preservation of a free government. — Indiana Constitution, Article 8, still in force today
The Land-Grant Legacy
The federal commitment to higher education predates the Constitution itself. The Land Ordinance of 1785 reserved portions of surveyed western territories for local schools. The Morrill Acts of 1862 and 1890 donated federal land to each state to fund colleges focused on agriculture, engineering, and the practical sciences. These became the great land-grant universities: Michigan State, Texas A&M, Wisconsin, Ohio State, Cornell, MIT. Their founding mandate was explicit: provide a rigorous, practical university education to ordinary citizens, not just elites.
By the early twentieth century, this system had produced a network of publicly funded universities offering education at minimal cost. Tuition was largely a token charge, not a barrier. The American university was conceived as a public institution, like a library or a courthouse — open to all, funded by all.
II. THE GI BILL: PROOF THAT IT WORKED
If any single episode demonstrates what an open, publicly funded education system can accomplish, it is the Servicemen’s Readjustment Act of 1944 — the GI Bill. It is simultaneously the greatest investment in human capital in American history and the clearest proof that the system which followed it was a choice, not an inevitability.
What the GI Bill Did
President Roosevelt signed the GI Bill on June 22, 1944 — two weeks after D-Day, while the war was still raging. The legislation was designed to prevent mass unemployment as 15 million men and women returned from combat. Its education provisions were straightforward: veterans would receive coverage of tuition and fees, a living allowance, and access to university education at essentially no personal cost.
Universities responded by building remedial programs, expanding capacity, and creating flexible entry pathways to ensure returning soldiers — many of whom had left school years earlier — could succeed. The system did not say: “You aren’t qualified.” It said: “We will make you qualified.”
What followed was transformational:
- 8 million+ veterans used GI Bill education benefits in the first few years (U.S. Department of Defense)
- 2× — the number of university degrees more than doubled between 1940 and 1950
- 4.6% → 25% — the percentage of Americans with bachelor’s degrees or higher, 1945 to 1995 (Our Documents Initiative / NCES)
- 60% of University of Iowa enrollment was veterans in 1947 (documented by NPR)
- $525 — the cost of one year at Harvard in 1947, equivalent to 17.5% of average household income
- $66,104 — the cost of one year at Harvard/Penn equivalent today, a 337% real increase since 1959
The GI Bill created the American middle class. The engineers, teachers, scientists, doctors, and managers who built the postwar economy — who purchased the houses in the suburbs, raised the tax base, funded Social Security, and formed the backbone of American productivity for two generations — were disproportionately GI Bill graduates. The return on investment was not subtle. Every dollar invested in GI Bill education is estimated to have returned five to seven dollars to the economy.
What Changed
The University of California system charged no tuition to California residents until 1968. The shift came not from economics but from politics: Ronald Reagan, campaigning for Governor of California in 1966, explicitly attacked “free education” as un-American and pushed the university toward imposing its first tuition charges after taking office. The argument that public higher education was a charitable gift rather than a civic right began here, and it spread nationally over the following decades.
The 1972 reauthorization of the Higher Education Act shifted federal student aid from grants (which do not require repayment) toward loans (which do), cementing the debt model as the policy default. This was a political choice, not an economic necessity.
III. THE DEBT TRAP: HOW EDUCATION BECAME A PRODUCT
Between 1980 and 2026, the United States transformed its higher education system from a publicly subsidized ladder of opportunity into a privately financed debt market. The transformation was not accidental. It was a series of deliberate legislative and regulatory choices, each framed as expanding access or fiscal responsibility, each in practice transferring the cost of education from the collective (taxpayers) to the individual (students and families).
The Key Legislative Turning Points
1978: The Middle Income Student Assistance Act
Opened federal student loan eligibility to middle-income families, not just the poor. A genuine expansion of access, but it also primed the market for commercial lenders to enter student lending at scale. As more federal money flowed to education, institutions began raising prices to capture it — a dynamic economists call the “Bennett Hypothesis,” named after Reagan’s Education Secretary, who articulated it in 1987.
1980: The Parent PLUS Loan Program
Created federal lending for parents, not just students. No cap on borrowing. Parents could now take on unlimited federal debt to finance children’s education, removing another natural price brake from the system.
1992: The Unsubsidized Stafford Loan
Previously, federal student loans only accrued interest once a student graduated. Unsubsidized loans began charging interest from the day of disbursement — including while the student was still enrolled. For a student taking four years to graduate, interest compounded during every year of attendance. This structural change transformed student loans from a bridge to a trap.
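To illustrate why charging interest during enrollment matters, here is a simplified sketch of a hypothetical unsubsidized borrower: $10,000 disbursed at the start of each of four years at an assumed 6.5% rate, with the in-school interest capitalized when repayment begins. The amounts and rate are assumptions for illustration, not figures from the text.

```python
# Simplified illustration of in-school interest on a hypothetical
# unsubsidized loan: $10,000 disbursed each year for four years at 6.5%.
# Unsubsidized loans accrue interest from disbursement; the accrued
# interest is capitalized (added to principal) when repayment begins.

RATE = 0.065
YEARLY_DISBURSEMENT = 10_000
YEARS_ENROLLED = 4

principal = 0.0
accrued_interest = 0.0
for year in range(YEARS_ENROLLED):
    principal += YEARLY_DISBURSEMENT      # new disbursement each fall
    accrued_interest += principal * RATE  # simple interest on all outstanding principal

balance_at_repayment = principal + accrued_interest

print(f"Total borrowed:                ${principal:,.0f}")
print(f"Interest accrued in school:    ${accrued_interest:,.0f}")
print(f"Balance when repayment begins: ${balance_at_repayment:,.0f}")
# A subsidized borrower would enter repayment owing only the $40,000 borrowed;
# this borrower starts roughly $6,500 deeper in the hole before the first payment.
```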
1998: The Bankruptcy Reform That Locked the Trap
The Higher Education Amendments of 1998 made federal student loans nearly impossible to discharge in bankruptcy. Unlike almost every other category of consumer debt — credit card debt, medical debt, business loans, even gambling debts — student loans survive bankruptcy. The stated rationale was preventing abuse. The practical effect was that lenders faced essentially no default risk. When lenders face no default risk, they have no incentive to ensure borrowers can repay. The commercial student lending market exploded.
2005: Private Loans Join the Non-Dischargeable Club
The Bankruptcy Abuse Prevention and Consumer Protection Act extended non-dischargeability to private student loans as well. Sallie Mae, the largest student lender, spent $9.5 million lobbying for this provision. Students taking private loans at variable interest rates, with no income-based repayment options, were now equally trapped.
2008: The For-Profit College Explosion
Under federal rules, for-profit colleges could receive up to 90% of their revenue from federal student aid. The result: a massive industry with a direct financial incentive to enroll as many students as possible, collect their federal loan money, and deliver credentials of minimal market value. The University of Phoenix at its peak enrolled 600,000 students. Corinthian Colleges collapsed in 2015 amid fraud investigations, having left 350,000 students with worthless degrees and tens of thousands of dollars in debt.
The Numbers Today
- $1.77 trillion — total outstanding student loan debt in the U.S. (2024) (Federal Reserve Bank of New York)
- #3 — student debt is the third-largest category of consumer debt, after mortgages and auto loans
- $37,000+ — average student debt at graduation for bachelor’s degree holders
- $57,000 — average debt for dual-student households, projected to cost $200,000+ in lost retirement savings and home equity
- 300%+ — real increase in tuition since 1980, adjusted for inflation
- 157.5% — increase in average college tuition since 2000 alone
- 5.91% — average annual tuition increase every year since 1963–64, consistently above wage growth and general inflation
- $13,900 — in-state tuition at the University of Arizona today, vs. $0 when the Arizona constitution called for instruction “as nearly free as possible”
- $21,101 — in-state tuition and fees at UC Berkeley today, vs. $0 until 1968
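To show how the 5.91% average annual increase cited above compounds, the sketch below grows an arbitrary index value of 1,000 from 1963–64 to today. The starting value is purely illustrative; the point is the multiple, which is nominal (not adjusted for inflation).

```python
# Compounding illustration: the cited 5.91% average annual tuition increase,
# applied every year from 1963-64 to 2023-24, starting from an arbitrary
# index value of 1,000 (nominal dollars, no inflation adjustment).

ANNUAL_INCREASE = 0.0591
YEARS = 2024 - 1964

index = 1_000.0
for _ in range(YEARS):
    index *= 1 + ANNUAL_INCREASE

print(f"{YEARS} years at 5.91%/yr -> {index / 1_000.0:.0f}x the starting price")
# About a 31x nominal increase over six decades, far outpacing the roughly
# tenfold rise in the general price level over the same period.
```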
The consequences extend beyond individual debt burdens. A 2024 study found that dual-income households graduating with average debt lose more than $200,000 in projected lifetime retirement savings and home equity compared to debt-free peers. The GI Bill built a generation’s wealth. The debt model is dismantling it.
The policy pivots away from affordable prices and toward debt-financing of higher education did not take long to generate a massive debt bubble. — VetEdSuccess.org analysis of Title IV federal aid programs
IV. THE PIPELINE BROKEN: EARLY CHILDHOOD, PARENTAL LEAVE, AND THE MISSING FOUNDATION
A serious education system does not begin at age five. It begins at birth — or, as the neuroscience increasingly shows, before birth. The first three years of life are the period of greatest brain development in human existence. Experiences during these years shape cognitive capacity, emotional regulation, language acquisition, and social skills in ways that persist across an entire lifetime. Countries that understand this invest heavily in those years. The United States, almost uniquely among wealthy nations, does not.
The Parental Leave Catastrophe
The United States is the only OECD member country — and one of only six nations on Earth — without a national paid parental leave policy. The company of nations sharing this distinction includes Lesotho, Liberia, Papua New Guinea, and Swaziland. Every other developed democracy guarantees paid leave to new parents. The U.S. does not.
- 17.3 weeks — average paid maternity leave across the 37 OECD countries offering it
- 0 weeks — federal paid parental leave in the United States
- 12 weeks — unpaid leave available under FMLA (1993), only for workers at companies with 50+ employees; covers roughly 60% of workers
- 18 weeks — OECD average paid maternity leave, in addition to some paternity leave for fathers
The consequence is not merely a quality-of-life issue for parents. It is an early childhood development crisis. Infants whose parents must return to work within weeks of birth are placed in whatever childcare arrangement the family can afford — which, for millions of Americans, is inadequate, unstable, or non-existent. Brain development in the first year of life is significantly shaped by consistent, responsive caretaking. Policy that makes such caretaking economically impossible is educational policy, with permanent consequences.
The Childcare Void
Between the end of parental leave (for the fraction of workers who have it) and the beginning of kindergarten, American families are largely on their own. The United States has created what OECD researchers call a “childcare gap” — the period between the end of available leave and the start of publicly funded pre-primary education. In the U.S., this gap is approximately five years. In most OECD countries, it is covered by subsidized or free childcare.
- 0.7% — average OECD public spending on childcare and early education as % of GDP
- <0.5% — U.S. public spending on childcare and early education, among the lowest in the OECD
- 1.8% — Iceland’s public spending on childcare and early education, the OECD leader
- 61% — share of U.S. 3–5 year-olds enrolled in pre-primary education, vs. close to 90% OECD average for this age group
- $20,000+ — annual cost of full-time childcare for an infant in major U.S. cities, often exceeding in-state college tuition
For families in the bottom half of the income distribution, the math is often impossible. Full-time infant childcare in a metropolitan area frequently costs more than a year of public university tuition. Many parents — disproportionately mothers — leave the workforce entirely, not because they choose to, but because working nets less money than childcare costs. This is not individual failure. It is policy failure, with ripple effects across child development, gender equity, family finances, and lifetime earnings.
Full-Day Kindergarten: Still Not Universal
Kindergarten in the United States is not universally full-day. As of the mid-2020s, a number of states and districts still offer only half-day kindergarten, or offer full-day as an optional extra cost. In countries like Germany, France, and the Nordic nations, full-day early education beginning at age three is standard and publicly funded. The research on full-day versus half-day kindergarten is unambiguous: children in full-day programs demonstrate stronger literacy and numeracy outcomes through at least third grade, with larger effects for children from lower-income families — precisely the children who most need the intervention.
V. TESTING, STANDARDS, AND THE ACHIEVEMENT GAP THAT POLICY BUILT
The No Child Left Behind Act of 2001 (NCLB) is perhaps the most consequential piece of domestic education legislation since the original Elementary and Secondary Education Act of 1965. It was presented as a civil rights measure: a demand for accountability that would finally close the persistent gap in achievement between white and wealthy students on one hand and Black, Hispanic, and low-income students on the other. Its actual effects were far more complex and, for many students, actively harmful.
What NCLB Did
NCLB required annual standardized testing
in reading and mathematics for students in grades 3–8, plus once in high school. Schools that did not
meet “Adequate Yearly Progress” benchmarks faced escalating sanctions, up to
and including state takeover and forced staff replacement. The theory of change
was simple: measure outcomes, hold schools accountable, and improvement will
follow.
The reality was more complicated:
• Schools serving low-income students, where the “achievement gap” was largest, faced the most pressure and the fewest resources. High-stakes testing in under-resourced environments does not produce learning; it produces teaching to the test.
• Curriculum narrowed dramatically. Subjects not tested (history, arts, music, physical education, civics) were cut to make time for test preparation in reading and math. The rich, broad curriculum that high-income schools maintained became a marker of privilege.
• The law’s definition of “highly qualified teacher” imposed certification requirements that, in practice, made teacher shortages in already understaffed schools worse by restricting who could fill vacancies.
• NCLB explicitly defined the “achievement gap” as a school and curriculum problem, ignoring the substantial evidence that the gap is primarily an income and opportunity gap. Schools cannot feed children who come to school hungry. Schools cannot house children who are experiencing homelessness. Schools cannot undo the cognitive effects of exposure to lead, chronic stress, or childhood trauma, all of which are significantly more prevalent in low-income communities.
The Testing Industry
Standardized testing at scale is not free.
The testing mandated by NCLB and its successor, the Every Student Succeeds Act
(2015), generated a multi-billion-dollar industry. Companies like Pearson,
McGraw-Hill, and Educational Testing Service became major beneficiaries of
federal education policy. Curriculum development, test preparation materials,
scoring systems, and teacher training programs followed the same logic: public
education spending was redirected into private corporate revenue.
This is the monetization of childhood in
its most direct form. When a public school district in a low-income area spends
$200 per student per year on testing and $50 per student on school libraries,
it is not making a pedagogical choice. It is making a political one, driven by
legislative mandates whose design prioritizes accountability optics over
educational substance.
The Teacher Shortage: A Consequence, Not a Coincidence
The NCLB era accelerated a trend that
continues to worsen: the depletion of the teaching workforce. Per the Learning
Policy Institute’s 2025 analysis:
- 411,549: U.S. teaching positions either unfilled or filled by uncertified teachers (2024–25)
- 1 in 8: share of all teaching positions nationally affected by shortage
- 48 states plus DC: reported shortages in at least one subject area in 2024–25, and every reporting state reported shortages in multiple subjects
- 45 states: reported special education shortages, the most common shortage area
- 41 and 40 states: reported science and math shortages, respectively
Teacher attrition is not mysterious. Real
teacher salaries in the United States have declined relative to other
college-educated professionals for decades. The emotional burden of high-stakes
testing accountability in under-resourced environments is documented and
significant. Administrative burdens — paperwork, compliance, data entry —
consume hours that used to be spent on lesson planning and student
relationships. And in many states, teachers purchase their own classroom
supplies from personal income.
The shortage falls hardest on the students
who most need experienced, qualified teachers: children in low-income,
high-need schools. The achievement gap that NCLB was designed to close is
widened, not narrowed, when the children on the wrong side of it are taught by
a revolving door of uncertified substitutes while children in affluent
districts retain stable, experienced teaching staffs.
NCLB faults curriculum and schools themselves for student
failure, while ignoring old and damaged school buildings, class size,
homelessness, hunger, and lack of healthcare. — ConnectUS analysis of NCLB
effects
VI. CLASS SIZE, FUNDING, AND THE PROPERTY
TAX TRAP
American public schools are primarily
funded through local property taxes. This is not a technical quirk. It is a
structural guarantee of inequality: wealthy communities with high property
values generate abundant school funding; poor communities with low property
values do not. The Supreme Court upheld this system in 1973 (San Antonio
Independent School District v. Rodriguez), ruling that education is not a
fundamental right under the federal constitution and that property-tax-based
funding disparities did not require federal remedy.
The consequences are stark and measurable.
Per-pupil spending in the wealthiest school districts in America can be three
to four times higher than in the poorest, within the same state. Class sizes in
low-income schools are larger. Libraries are smaller or nonexistent. Counselors
cover caseloads of 400–1,000 students when professional guidelines recommend
250. Advanced placement courses, music programs, and extracurricular activities
— the enrichments that appear on college applications — are standard in wealthy
districts and absent in poor ones.
California’s Proposition 13 (1978) capped
property tax rates and sharply limited growth in assessed values, starving school funding. Within two decades, California
fell from among the nation’s highest per-student spenders to 43rd. The pattern has
repeated in state after state: tax limitation measures, originally marketed as
relief for homeowners, have systematically defunded the public schools that
serve the children of those homeowners.
VII. THE AMERICAN DREAM, DEFERRED: DEGREES
THAT DON’T DELIVER
For most of the twentieth century, a
college degree was a reliable passport to the middle class. The earnings
premium on a bachelor’s degree — the income advantage over a high school
diploma — was real and substantial. That premium still technically exists in
aggregate. What has changed is the cost side of the ledger: when a degree costs
$50,000–$200,000 and takes a decade or more to pay off, the net lifetime benefit is
dramatically reduced, and it is unevenly distributed by field of study, institution type,
and the economic conditions graduates enter.
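The payback arithmetic can be sketched directly. The figures below (degree cost, financing rate, annual earnings premium) are hypothetical and purely illustrative, since the real premium varies enormously by field, institution, and the labor market a graduate enters:

```python
# Rough payback-period sketch for a financed bachelor's degree
# (illustrative numbers only).

def payback_years(degree_cost: float,
                  annual_earnings_premium: float,
                  annual_interest_rate: float) -> int:
    """Years until the cumulative earnings premium covers the financed cost."""
    balance = degree_cost
    years = 0
    while balance > 0 and years < 60:
        # Interest accrues on the remaining balance; the premium pays it down.
        balance = balance * (1 + annual_interest_rate) - annual_earnings_premium
        years += 1
    return years

# A $100,000 degree financed at 6%, set against a $12,000/year earnings premium:
print(payback_years(100_000, 12_000, 0.06))   # ~12 years
# The same premium against a $30,000 degree clears in about 3 years, which is
# why cost, not the premium itself, drives the "decade to pay off" problem.
print(payback_years(30_000, 12_000, 0.06))
```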
The Credential Trap
The United States has created what
sociologists call “credential inflation”: as more people obtain bachelor’s
degrees, employers raise degree requirements for jobs that previously did not
require them. A study by Harvard Business School found that 65% of job postings
for new production supervisors required a bachelor’s degree, while only 16% of
currently employed supervisors in those roles held one. The credential is
demanded not because the job requires it, but because it serves as a cheap
screening mechanism.
The result: students take on substantial
debt to obtain credentials that signal eligibility for jobs their grandparents
got through union apprenticeships or on-the-job training. The debt is real. The
credential’s functional value is often marginal. And the for-profit college
sector — which enrolled millions of lower-income students through aggressive
marketing and federal loan access — issued credentials that, in many cases, had
no labor market value at all.
The International Comparison
The contrast with peer nations is
instructive. Germany’s dual apprenticeship system routes approximately half of
secondary students into structured vocational training — three years of
combined classroom and workplace learning, leading to nationally recognized
qualifications and well-paying careers in manufacturing, healthcare, trades,
and technology. Germany does not have a student debt crisis because it does not
require most of its population to finance their own education. France, the
Nordic countries, and Scotland offer tuition-free or heavily subsidized
university education. South Korea, Japan, and Canada maintain significant
public subsidies for higher education.
The OECD’s 2024 Education at a Glance
report notes that U.S. bachelor’s-level tuition fees average $9,596 per year —
at the upper end of the OECD range. Many OECD countries charge nothing, or
under $2,000. The United States is, in comparative terms, an outlier in how
much of the cost of higher education it places on individuals rather than
sharing it socially.
- $9,596: average annual U.S. bachelor’s-level tuition (OECD, 2024), at the upper end of the OECD range
- $0–$2,000: typical annual tuition in Germany, France, and the Nordic countries, publicly subsidized or free
- 83%: degree completion rate for U.S. students with at least one college-educated parent, versus significantly lower rates for first-generation students
- 50%: share of Post-9/11 GI Bill-eligible veterans who used their benefits and completed a degree within six years
VIII. WHAT COUNTRIES THAT ACTUALLY WORK DO
INSTEAD
This is not a problem without solutions.
The solutions exist, are documented, and are in operation in multiple countries
with strong track records. They are not radical experiments. They are proven
models.
The Common Features of High-Performing Education Systems
• Universal early childhood education beginning at age 1–3, publicly funded or heavily subsidized, with trained and well-paid staff.
  ◦ Finland: childcare is a legal right from birth; preschool from age 5 is free; class sizes are small; homework is minimal in primary years.
  ◦ Norway and Sweden: 80%+ of under-3s are in publicly funded childcare; parental leave exceeds one year at significant wage replacement.
• Paid parental leave of substantial duration (6–18 months) at meaningful wage replacement, available to both parents.
  ◦ Germany: up to 14 months of parental leave at 65% of prior salary. Japan: 52 weeks. UK: 52 weeks (though wage replacement tapers).
  ◦ The U.S.: 0 weeks of federally mandated paid leave.
• Tuition-free or heavily subsidized higher education, with investment in vocational training as a high-status, high-value alternative to academic degrees.
  ◦ Germany’s apprenticeship system: three-year programs combining classroom and workplace learning, leading to recognized national qualifications.
  ◦ Scotland: free university tuition for Scottish residents at Scottish universities.
• Teacher pay, status, and training competitive with other graduate professions. In Finland, teaching is among the most selective and prestigious careers. In South Korea and Singapore, teachers are drawn from the top third of graduates.
• School funding systems that do not reproduce community wealth disparities. Most high-performing nations fund schools nationally or regionally, not through local property taxes.
The Evidence on Returns
James Heckman, Nobel laureate in economics,
has produced decades of peer-reviewed research demonstrating that the economic
return on early childhood investment is approximately 7-12% per year — among
the highest of any public investment — driven by better educational outcomes,
higher adult earnings, lower rates of crime, and reduced dependence on social
services. The OECD consistently finds that countries investing heavily in early
childhood education have lower inequality, higher mobility, and better long-run
fiscal positions than those that do not.
The United States is not unable to afford
this. It is choosing not to fund it. The choice is visible in the budget: the
U.S. spends vastly more per capita on incarceration than on early childhood
education. It spends more on standardized testing infrastructure than on school
libraries. It has allowed the cost of higher education to expand without bound
while real per-pupil K-12 funding has stagnated or declined in many states.
Every dollar invested in quality early childhood development
returns seven to twelve dollars to society over the long run. The United States
is choosing to forgo this return. — James Heckman, University of Chicago, Nobel
Laureate in Economics
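The annual rate cited above and the dollars-returned-per-dollar framing in the quotation are two views of the same arithmetic once compounding over a childhood-to-adulthood horizon is considered. The sketch below illustrates that relationship under assumed horizons; it is not a reproduction of Heckman’s benefit-cost methodology.

```python
# Minimal sketch (an illustration, not Heckman's actual calculation): how an
# annual return of 7-12% on an early-childhood dollar compounds into a
# "seven to twelve dollars to society" multiple over the two to three decades
# between preschool and mid-career adulthood.

def cumulative_return(annual_rate: float, years: int) -> float:
    """Value of $1 of early-childhood investment compounded at annual_rate."""
    return (1 + annual_rate) ** years

for rate in (0.07, 0.10, 0.12):
    for horizon in (20, 25, 30):   # assumed horizons, in years
        print(f"{rate:.0%} over {horizon} years -> ${cumulative_return(rate, horizon):.2f}")

# 7% over 30 years is roughly $7.60; 12% over 20 years is roughly $9.60 --
# the same order of magnitude as the benefit-cost ratios quoted above.
```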
IX. CONCLUSION: THE CHOICE THAT WAS MADE
The American education system did not
arrive at its current state by accident or market force. It was shaped by a
series of deliberate choices — legislative, judicial, and political — each of
which shifted the cost of education from the public to the individual, each of
which moved the system further from the constitutional promise of free,
uniform, thorough education that every state enshrined.
The GI Bill proved, beyond reasonable
dispute, that universal access to higher education produces extraordinary
social and economic returns. The University of California proved, for nearly
three decades, that a world-class public university system can operate at no
cost to residents. Every OECD peer nation demonstrates, currently, that paid
parental leave, subsidized childcare, and affordable higher education are not
utopian fantasies but functioning policy choices made by countries that
decided, as a matter of democratic priority, to invest in their people.
The United States made a different choice.
Beginning in the late 1970s and accelerating through the 1980s, 1990s, and
2000s, it chose to convert education from a public good into a private
investment, to be financed individually by the people who needed it most. The
testing regimes of NCLB defined the “achievement gap” as a school problem while
refusing to address it as a poverty problem. The bankruptcy reforms of 1998 and
2005 made student debt non-dischargeable while leaving lenders free to extend
credit without assessing repayability. The property-tax funding model was
upheld by the Supreme Court and has perpetuated per-pupil spending disparities
for five decades.
The consequences are measurable: $1.77
trillion in student debt. A teaching workforce in crisis. An early childhood
system that leaves millions of children without stable care or education in
their most developmentally critical years. A credential market that demands
expensive degrees for jobs that do not require them. And a generation for whom
the American Dream — the idea that hard work and education lead to a better
life — is statistically less accessible than it was for their parents or
grandparents.
This is not drift. This is not entropy.
This is a policy record. It can be read, debated, and — if the political
will exists — reversed. The evidence about what works is not in dispute. The
question is whether the democratic institutions of the United States are
capable of choosing the public good over private extraction, as they briefly,
brilliantly, did in 1944.
SOURCES & FURTHER READING
- OECD, Education at a Glance 2024: Country Notes for the United States
- OECD / Bipartisan Policy Center, Paid Family Leave Across OECD Countries (2022)
- Education Commission of the States, Constitutional Obligations for Public Education: 50-State Review (2016)
- Learning Policy Institute, Overview of Teacher Shortages 2025 Fact Sheet
- Federal Reserve Bank of New York, Student Loan Debt Statistics, Q4 2024
- VetEdSuccess.org, Lessons for Title IV from an Analysis of GI Bill Outcomes (2026)
- EducationData.org, Average Cost of College by Year (inflation-adjusted series)
- Committee for Economic Development (CED), Public Investment in Childcare and Early Education (2020)
- GigaFact / Our Documents Initiative, GI Bill Enrollment and Degree Data, 1944–1950
- ConnectUS, Pros and Cons of the No Child Left Behind Act
- Ballard Brief (BYU), The Socioeconomic Achievement Gap in U.S. Public Schools
- University of Chicago Law Review, Education’s Deep Roots: Historical Evidence for the Right to Education
- Race Forward, Historical Timeline of Public Education in the U.S.
- StateUniversity.com, Constitutional Requirements Governing American Education
- Heckman, J.J., multiple peer-reviewed studies on returns to early childhood investment (University of Chicago)