The (Mis)Measure of Prosperity: Morning in America and the Decline of the Social Wage
The three decades of broadly shared prosperity in the United States after World War II were largely the product of intentional public policy decisions, not the private ones of a theoretically autonomous market or “natural” economy left to its own devices. These decisions—including the GI Bill, which made it possible for millions of World War II veterans to attend college or vocational school; the creation and strengthening of labor market institutions, through measures such as the minimum wage and collective bargaining; and sensible financial regulation that limited asset bubbles and other abuses—helped make a vibrant middle class a reality.
The rise of conservatism in the mid-1970s, and the subsequent election of Ronald Reagan in 1980, brought a decisive halt to this long wave of investment in the social infrastructure that was necessary to build and sustain a broad middle class. Conservatives claimed their radical ideas would result in greater economic growth and productivity, to the benefit of all Americans. President Reagan, for example, opened his first Economic Report, in 1982, with the assertion that the conservative “reorientation of the role of the federal government in our economy” would mean “more jobs, more opportunity, and more freedom for all Americans.”
This reorientation did not deliver on its promises. Over the last three decades, the economy and productivity have grown, on average, at slower rates than during the post-World War II era of shared prosperity. And the share of American jobs that can be categorized as “good”—ones that pay wages that will produce at least a moderate income for a full-time worker, and provide health and retirement benefits—has barely budged, despite substantial increases in workers’ education levels and the technology they use on the job.1 On top of this, the benefits of the gains in growth and productivity that did occur were disproportionately captured by the few at the very top. Between 1979 and 2007, the share of total market income (before individual income taxes and social transfers) going to the top 10 percent of taxpayers increased from a little over a third to almost half.2 By 2007, workers in low-wage jobs were being paid no more per hour, in real terms, than they were in 1979.3 So much for the conservative promise of “more jobs, more opportunity, and more freedom” for all Americans.
These dismal trends in market wages and compensation levels during the conservative era have been well documented, but they do not tell the full story of what living standards have been like for middle- and working-class families. We must also assess trends in the “social wage.” As we define it, the social wage includes publicly subsidized benefits and services that relate to health, income security, homeownership and housing, food and nutrition, education, training, and social services. These benefits and services may be subsidized by the public sector through direct federal expenditures, or partially subsidized by the public through individual tax subsidies (tax expenditures), including the exclusion of benefits from taxable income, tax deductions, and tax credits; we call the former “direct social wage” benefits, and the latter “indirect social wage” benefits (see Table 1).
Social Wage Trends During the Conservative Era
One basic way to assess the social wage involves examining trends in the level of public expenditures on the benefits, services, and investments that constitute it. A conventional way to do this is by looking at expenditure trends in inflation-adjusted dollars. This, however, does not account for population growth; if expenditures keep pace with inflation but not with population growth, the social wage will decline as the same number of dollars, in real terms, is spread across a larger population base, or some people are excluded or provided with less. Therefore, we compare expenditures to the size of the overall economy, as measured by the gross domestic product (GDP).4
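The measurement point above can be made concrete with a toy calculation (the figures below are hypothetical, chosen only to illustrate the logic, not drawn from the article’s data): a budget that keeps pace with inflation but not with population or economic growth looks “flat” in real dollars while shrinking in per-capita and GDP-share terms.

```python
# Toy illustration (hypothetical numbers): why "constant real dollars"
# can mask a declining social wage.

def real_dollars(nominal, cpi_base, cpi_now):
    """Convert nominal dollars into base-year (inflation-adjusted) dollars."""
    return nominal * cpi_base / cpi_now

# Year 0: $100B budget, 220M people, $2.5T nominal GDP.
budget0, pop0, gdp0 = 100e9, 220e6, 2.5e12

# Year 1: prices up 50%, population up 20%, real GDP up 30%.
budget1 = budget0 * 1.5          # budget grows exactly with inflation
pop1 = pop0 * 1.2
gdp1 = gdp0 * 1.3 * 1.5          # nominal GDP: real growth plus inflation

real_budget1 = real_dollars(budget1, cpi_base=100, cpi_now=150)

print(real_budget1 == budget0)               # True: unchanged in real dollars
print(real_budget1 / pop1 < budget0 / pop0)  # True: less per person
print(budget1 / gdp1 < budget0 / gdp0)       # True: smaller share of GDP
```

This is why the comparisons that follow are expressed as shares of GDP rather than in inflation-adjusted dollars.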
Table 2 shows the change, between 1979 and 2007, in direct social wage expenditures on benefits and services in six broad functional categories that, taken together, roughly comprise the social wage: 1) health care—including Medicaid, Medicare, and other health-services expenditures; 2) retirement and disability income security—including Social Security and other forms of retirement and disability insurance; 3) “other income security”—including unemployment insurance, Supplemental Security Income (SSI), temporary assistance, and child care assistance; 4) housing and food assistance—including food stamps, child nutrition programs, rental housing assistance, and public housing; 5) veterans benefits and services; and 6) education, training, employment services, and social services.
Table 3 shows the change, between 1979 and 2007, in selected indirect social wage expenditures (tax expenditures) in roughly the same set of categories, with the main difference being the addition of a separate category for children that includes the child credit and various child care-related credits. A full treatment of tax expenditures is beyond the scope of this article so, in Table 3, we focus on the major expenditures as well as notable new expenditures related to education and children.
As Table 2 shows, direct expenditures on the social wage were higher in 2007 than in 1979—12.89 percent of GDP in 2007 compared with 10.69 percent in 1979. While the size of the public sphere overall has remained relatively flat over the same period—federal spending generally hovered around 20 percent of GDP—and national defense spending was modestly lower (going from 4.6 percent of GDP in 1979 to 4.0 percent in 2007), spending on the social wage, broadly defined, increased. However, this is entirely due to increasing expenditures on health care. If we exclude health care, direct spending on the rest of the social wage actually declined as a share of GDP by 7 percent. In what follows, we discuss the trends in the various categories of the social wage, and also assess the extent to which the increase in health care spending reflects an increase in health care access and quality for most workers.
The U.S. health care system is a public-private hybrid, a system in which roughly half of the spending is private and half is public (and is likely to remain as such, even after reform). Even the private part of the system—which covers the majority of people under age sixty-five through employer-based coverage—is partially subsidized, via the exclusion of employer health benefits from the federal income tax. As Table 3 shows, this subsidy amounted to nearly $134 billion in 2007—or about a third of public expenditures on Medicare that year—making it a not insignificant part of the social wage.
Direct federal expenditures on health care as a share of GDP increased by roughly 150 percent, from about 1.9 percent in 1979 to 4.7 percent in 2007. In 1979, health care accounted for fewer than one in every ten dollars spent on the social wage; by 2007, it was one in every five. Tax expenditures for employer-provided health insurance rose even faster, from 0.3 percent of GDP in 1979 to 0.98 percent in 2007.
Increased expenditures on health care did not, however, lead to broad-based improvements in access and quality; rather, they largely reflect the rising costs of services, with the money flowing primarily to health care professionals and corporations. While public expenditures finance nearly half of the system, the actual costs of health care services are largely determined in unregulated private markets. In recent decades, health care costs have increased at an extraordinary rate, outpacing economic growth, inflation, and workers’ earnings. In 1979, health care (both public and private expenditures) accounted for 8.6 percent of GDP; by 2007, it had almost doubled to 16.2 percent.5
Even though health care costs have doubled, there has not been a commensurate “doubling of care” or improvement in basic health outcomes. At the most basic level, the percentage of working-age adults (ages eighteen to sixty-four) with health insurance fell from 86.3 percent in 1979 to 80.4 percent in 2008.6 Americans also appear to get less for each health care dollar: an analysis comparing the health systems of the thirty OECD (Organization for Economic Cooperation and Development) member countries found that even though the United States spends considerably more on health care than any other OECD country, it falls below the OECD median on most measures of health services use.7 This confirms earlier research, comparing the United States with the United Kingdom and Germany, that found that Americans paid more per capita, but received fewer “real health care resources” per capita.8 Even though costs have risen faster in the United States, we fall short on core aggregate health measures, such as life expectancy: in 1980, life expectancy in the United States was more than a month longer than the average in other wealthy nations; but by 2005, it was about twenty months shorter. Over the same period, the U.S. also saw increasing disparities in life expectancies across income groups.9
The share of Americans with health insurance provides the most succinct indicator of the decline in the health care component of the social wage. Despite a doubling in expenditures, the percentage of Americans between the ages of eighteen and sixty-five with health insurance was lower in 2007 (80.4 percent) than in 1979 (84.3 percent).10 The overall decline in health insurance coverage is almost entirely due to a decline in employment-based coverage, which fell from 68.8 percent in 1979 to 64.2 percent in 2007. There have, however, been some bright spots in the health care component of the social wage, as the overall decline was offset to some extent by an increase in public coverage due to expansions in eligibility, particularly in Medicaid. When Medicaid was established in 1965, eligibility was largely limited to parents and children receiving AFDC (Aid to Families with Dependent Children), seniors, and people with disabilities. During the second half of the 1980s, coverage was gradually extended to all children between the ages of six and eighteen with family incomes below 100 percent of the poverty line, and to all pregnant women and children under the age of six with family incomes under 133 percent of the poverty line. As a result of these and other changes, the share of children age fifteen and under eligible for Medicaid increased from about 13 percent in 1983 to 29 percent in 1996.11
Health care provides the sole exception to the general rule of conservative opposition to significant expansions of the social wage. In 2003, the Bush administration presided over the addition of a prescription drug benefit to Medicare, which began in 2006. While an important improvement to the social wage, it was done in a uniquely conservative manner that prohibits the federal government from negotiating discounts with drug companies, and establishes a system that is “excessively complicated for [the elderly], with too many plans, and unnecessary variation across the plans in terms of premiums, benefits, covered drugs, rules, forms, and procedures.”12 As a result, it will end up costing the public considerably more than it should have, while making it more difficult for the elderly to access benefits.
Aside from prescription drugs for seniors, the increase in expenditures has generally not brought about expansions in what is covered by public insurance. Of particular importance is long-term care, which is not covered by Medicare, and is only partially covered for low-income seniors through Medicaid.
The largest share of public expenditures on income security in the United States goes to retired workers. Social Security retirement and disability benefits amounted to 4.3 percent of GDP in 2007, a percentage that is just slightly higher than it was in 1979. However, as with health care, the overall retirement security of workers has been most detrimentally affected by what has happened with employer-sponsored benefits. Overall, participation in any kind of plan declined from about 90 percent in the mid-1980s to 66 percent in 2007, and these plans are less likely to provide a traditional pension. Workers are now less likely to be covered by a traditional pension that pays out a set amount each year during the worker’s retirement years (a “defined-benefit” plan), and more likely to be in a 401(k) program in which the amount the worker receives in retirement is determined by his or her own contributions and the vagaries of the stock market (a “defined-contribution” plan). In 1980, 84 percent of employees in medium and large private establishments participated in a defined-benefit plan; by 2007, the share had declined to 32 percent.13
Unemployment Insurance, the EITC, and “Other Income Security” Benefits
Taken together, the two broad categories of “other income security” and housing and food assistance—including unemployment insurance, AFDC/TANF, SSI, child care assistance, the Earned Income Tax Credit (EITC), food stamps, WIC (Women, Infants, and Children), school lunch, rental housing assistance, and other smaller programs—amounted to 1.86 percent of GDP in 2007, a modest increase from 1.58 percent of GDP in 1979. There have, however, been significant shifts within these categories, particularly a substantial decline in unemployment insurance, which has been offset by increases elsewhere.
Unemployment Insurance. Expenditures on unemployment insurance (UI) as a share of GDP were 40 percent lower in 2007 than in 1979. Part of this change is attributable to a lower unemployment rate—4.6 percent in 2007 versus 5.8 percent in 1979. However, when the unemployment rate grew to 5.8 percent in 2008, expenditures on UI were still about 25 percent lower, as a share of GDP, than they were in 1979. An additional factor in the decline of UI expenditures is that the amount of UI benefits paid as a percentage of total wages is almost half of what it was in 1979, as are contributions collected.
Moreover, some groups of workers have been particularly hard hit by the failure to modernize the unemployment insurance system. Most notably, the share of unemployed workers without a high school diploma who receive UI benefits has declined steadily since 1979.14 Similarly, as the Government Accountability Office has found, low-wage workers are about half as likely as higher-wage workers to receive unemployment insurance, a disparity that persists even when controlling for job tenure.15 This decline is largely attributable to eligibility restrictions that disproportionately affect today’s low-wage workers. For example, according to the National Employment Law Project, more than thirty states do not recognize serious illness or the disability of a child or another family member as good cause for leaving employment.
Fortunately, this problem has received increased attention over the last decade. A growing number of states have considered policy changes that would expand UI eligibility for low-wage and part-time workers. And earlier this year, the American Recovery and Reinvestment Act provided financial incentives for states to adopt reforms that would modernize UI. The National Employment Law Project has documented that nineteen states with reforms already in place immediately qualified for incentive funding, six other states quickly adopted reforms, and legislation has been introduced in most remaining states.
The EITC and Other In-Work Benefits. Established in 1975, the EITC was a small benefit—providing $4.7 billion in benefits in 1979—until a series of expansions was enacted in the second half of the 1980s and the early 1990s. By 2007, the EITC provided more than $40 billion to working-class families. In 1979, expenditures on the EITC were roughly one-fifth of those on unemployment insurance. By 2007, EITC expenditures were 14 percent higher than UI expenditures.
The rise of the EITC is conventionally presented as a substantial gain for low-wage workers, but these gains have been offset by declines in the inflation-adjusted value of the minimum wage. In 1979, a full-time, full-year minimum wage worker earned $17,665 a year (in 2009 dollars), compared to $15,080 in 2009—and the EITC does not fully fill in that gap. The combined income from the EITC/minimum wage for a full-time, full-year minimum wage worker who is a single parent with one child is less in 2009 than in 1979; single parents with two or three children would receive about $1,000 more per year. For minimum wage workers without children, the EITC/minimum wage combination is also lower in 2009 than in 1979. For workers without children, the maximum EITC is just $457 a year, providing them with far less than the minimum wage did thirty years ago.
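The minimum-wage arithmetic above can be reproduced directly. The only figure not stated in the text is the 1979-to-2009 inflation factor, which is backed out here from the article’s own numbers (using the $2.90 nominal 1979 federal minimum wage), so treat it as an inferred assumption rather than an official CPI value:

```python
# Annual earnings for a full-time, full-year worker (52 weeks x 40 hours).
HOURS_PER_YEAR = 52 * 40  # 2,080 hours

min_wage_2009 = 7.25  # federal minimum wage after the July 2009 increase
annual_2009 = min_wage_2009 * HOURS_PER_YEAR
print(annual_2009)  # 15080.0 -- the article's $15,080 figure

min_wage_1979 = 2.90  # nominal federal minimum wage in 1979
# Inflation factor implied by the article's $17,665 (in 2009 dollars) figure:
cpi_factor = 17665 / (min_wage_1979 * HOURS_PER_YEAR)  # roughly 2.93
annual_1979_real = min_wage_1979 * HOURS_PER_YEAR * cpi_factor

# The gap that the EITC would have to fill to restore 1979 purchasing power:
print(round(annual_1979_real - annual_2009))  # 2585
```

The roughly $2,600 shortfall is the gap the text describes: for childless workers, whose maximum EITC is $457, the credit does not come close to closing it.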
Who benefits from the EITC, as compared to the minimum wage? In a forthcoming article, Jesse Rothstein, currently a senior economist for the Council of Economic Advisers, finds evidence to suggest that the EITC increases labor supply in a way that depresses wages.16 In basic supply-and-demand terms, the EITC increases the number of single parents in the labor market. This greater supply means there are more available workers—and less pressure on employers to raise wages to fill jobs. All of this affects both the low-wage workers who receive a significant EITC benefit (workers with children) and those who do not (workers without children), with the latter group being harmed the most by the wage decline because they receive no offsetting benefit.
The Personal Responsibility and Work Opportunity Reconciliation Act and Welfare Reform. If there is one conventionally agreed upon “conservative success story,” it is the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 (PRWORA)—legislation that originated as one of the ten bills included in the House Republicans’ 1994 “Contract with America.” The PRWORA replaced AFDC, one of the original programs created by the Social Security Act of 1935, with a block grant named Temporary Assistance for Needy Families (TANF).
The conservative foundation of the PRWORA was the conversion of welfare funds into a block grant program for the states. This conversion froze nominal funding for TANF, eliminating its counter-cyclical nature at the federal level, and allowed states both to substantially reduce their own expenditures on the program below the nominal levels of the early 1990s, and to use federal funds to supplant state expenditures, freeing state money to be diverted to unrelated uses (including, in some notorious cases, the financing of tax cuts). These features of TANF have rendered it largely unresponsive to the economic declines of the 2000s and the increase in economic insecurity. Before the passage of the PRWORA in 1996, about 60 percent of children experiencing income poverty received AFDC income supplements; by 2007, only 24 percent of such children received TANF supplements. Since the start of the recession in December 2007, there has been little evidence that TANF will reverse direction and operate in a counter-cyclical fashion.
The conventional narrative of welfare reform starts in 1996 with the passage of the PRWORA, and then points to increases in single-parent employment (and income) and declines in teen pregnancy, income poverty, and the number of parents receiving income supplements through AFDC/TANF, as proof that the conservative policy shift was a real-world success. But these trends were well established before most states actually implemented the PRWORA. For example, after peaking in 1992, both the income poverty rate for single mothers and the number of families receiving AFDC/TANF income supplements started on a steady downward path that continued for the rest of the decade. Furthermore, the bulk of the research suggests that the 1993 expansion of the EITC—enacted with no conservative support in Congress—was a more important catalyst of positive income and employment trends for single parents than the PRWORA or related welfare reform provisions.17
Education, Training, and Employment Services
The education and training components of the social wage have fallen sharply. At the federal level, overall direct expenditures on education, training, and employment services declined from 0.88 percent of GDP in 1979 to 0.43 percent in 2007. Massive cuts in expenditures on training and employment services for adults and youth accounted for nearly the entire decline. The Clinton administration had some success in increasing tax expenditures for education, establishing the Hope Tax Credit and the Lifetime Learning Tax Credit, which, by 2007, provided tax credits equal to about $5.8 billion, or 0.04 percent of GDP. These expansions have offset some of the decline in direct expenditures on higher education. However, both credits are non-refundable, which means that lower-income, working-class families do not benefit from them; as a result, the majority of expenditures on both credits goes to families in the top 40 percent of the income distribution, with only about 14 percent of the benefits going to families in the bottom 40 percent.18 The American Recovery and Reinvestment Act provides some help here by making the Hope Tax Credit partially refundable; but the Act did not rectify the problems with the Lifetime Learning Tax Credit, which is especially important in the face of massive job losses, as it is aimed at working adults who seek training.
Looking forward, real improvements in the overall living standards of low-wage workers will require a renewed emphasis on boosting their market wages, and expanding the social insurance available to them when they are temporarily away from work. This should include further and sustained increases in the minimum wage, increased access to collective bargaining, and mandatory provision of paid sick days and paid family leave.
Among the most notable failures to update the social wage, to adjust for changing times, is the failure to establish the kinds of basic national guarantees of paid sick days and family leave that exist in nearly all other wealthy nations. Mothers are now the breadwinners or co-breadwinners in nearly two-thirds of all families, and they now account for half of all workers on U.S. payrolls.19 Yet the national conception of the social wage has not incorporated the need for paid time off for caregiving, or the need for workers to have access to workplace flexibility. The Family and Medical Leave Act of 1993 provides unpaid leave to approximately half the labor force, but it does not address the issue of how workers can afford unpaid time off.
But even in areas in which elements of the social wage were expanded, these expansions were effectively countermanded by the failure to strengthen labor market institutions or regulate market failures in ways that would ensure overall progress. For example, expansions of both Medicaid and the EITC did not result in overall gains for workers because of declines in real wages and employer-provided health insurance coverage.
Two trends that cut across various categories of the social wage should be noted. First, policymakers increasingly use tax expenditures to finance the social wage. This is evidenced by the increase in the EITC, the establishment of new education tax credits, and the conversion of traditional health and retirement tax benefits (for employer-sponsored health coverage and defined-benefit plans) to more privatized benefit plans—such as 401(k)s and IRAs— that come with greater risks for individuals. Second, conservative policymakers increasingly look toward the privatization of government services. The data presented here are not fine-grained enough to show that trend; but it is striking in light of the considerable evidence from the health care sector that private delivery of services does little to control costs.
1. John Schmitt, The Good, the Bad, and the Ugly: Job Quality in the United States over the Three Most Recent Business Cycles (Washington, D.C.: Center for Economic and Policy Research, November 2007), available at http://www.cepr.net/documents/publications/goodjobscycles.pdf. Schmitt defines a “good job” as one that pays at least $17 an hour (the inflation-adjusted median male wage in 1979, roughly $34,000 a year in 2006 dollars), offers employer-provided health insurance for which the employer pays at least part of the premium, and offers an employer-sponsored pension or retirement savings plan (including 401(k) plans) in which the worker currently participates. Even in 2006, a year shy of the highest point in the business cycle reached over the last decade, only 23.6 percent of jobs were “good jobs.”
2. Thomas Piketty and Emmanuel Saez, “Income Inequality in the United States, 1913-1998,” Quarterly Journal of Economics 118, no. 1 (2003): 1-39. See Saez’s website for 2007-updated tables and figures, available at http://elsa.berkeley.edu/%7Esaez/.
3. Heather Boushey and Shawn Fremstad, “The Wages of Exclusion: Low-Wage Work and Inequality,” New Labor Forum 17, no. 2 (Summer 2008): 9-19.
4. In this article, we limit our focus to federal expenditures. State spending is not insignificant—as a share of GDP, it averaged around 5 percent of GDP during the 1990s—but state and local expenditures account for a relatively small portion of the social wage; moreover, such expenditures are generally part of federal-level programs (most notably Medicaid), and tend to follow federal-level trends.
5. National Health Expenditure Accounts, summary table (including share of GDP), CY 1960-2007, available at http://www.cms.hhs.gov/NationalHealthExpendData/02_NationalHealthAccountsHistorical.asp#TopOfPage.
6. Hye Jin Rho and John Schmitt, Consistent Estimates of Health Insurance Coverage for Adults and Workers, 1979-2008 (Washington, D.C.: Center for Economic and Policy Research, December 2009).
7. Gerard Anderson, Uwe Reinhardt, Peter Hussey, and Varduhi Petrosyan, “It’s the Prices, Stupid: Why the United States Is So Different from Other Countries,” Health Affairs 22, no. 3 (2003): 89-105.
8. Ibid., citing McKinsey Global Institute, Health Care Productivity (Los Angeles: McKinsey and Company, 1996).
9. See http://www.cbo.gov/ftpdocs/91xx/doc9104/04-17-LifeExpectancy_Brief.pdf.
10. This information is based on an analysis of March 2009 Current Population Survey (CPS) numbers by Hye Jin Rho and John Schmitt of the Center for Economic and Policy Research.
11. Jonathan Gruber, “Medicaid,” in Robert Moffitt, ed., Means-Tested Transfer Programs in the United States (Chicago: National Bureau of Economic Research/University of Chicago Press, 2003).
12. J. James, T. Neuman, and M.K. Strollo, Early Experiences of Medicare Beneficiaries in Prescription Drug Plans (Kaiser Family Foundation, 2006).
13. Employee Benefit Research Institute, Databook on Employee Benefits (July 2008): table 4.1a.
14. Philip Levine, “Unemployment Insurance over the Business Cycle,” in Rebecca Blank, Sheldon Danziger, and Robert Schoeni, eds., Working and Poor: How Economic and Policy Changes Are Affecting Low-Wage Workers (New York: Russell Sage Foundation, 2006).
15. United States Government Accountability Office, Unemployment Insurance: Low-Wage and Part-Time Workers Continue to Experience Low Rates of Receipt, September 2007, available at http://www.gao.gov/new.items/d071147.pdf.
16. Jesse Rothstein, “Is the EITC as Good as an NIT? Conditional Cash Transfers and Tax Incidence,” forthcoming in the American Economic Journal: Economic Policy (see http://www.aeaweb.org/aejpolicy/accepted.php, or http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1405974).
17. See Bruce Meyer and Dan Rosenbaum, “Welfare, the Earned Income Tax Credit, and the Labor Supply of Single Mothers,” Quarterly Journal of Economics 116, no. 3 (August 2001): 1063-1114. EITC expansions accounted for as much as 60 percent of the increase in employment rates of lone mothers between 1984 and 1996; see also Jeffrey Grogger, “The Effects of Time Limits, the EITC, and Other Policy Changes on Welfare Use, Work, and Income Among Female-Headed Families,” The Review of Economics and Statistics 85, no. 2 (May 2003): 394-408. Examining changes between 1993 and 1999, he concluded that “the EITC may be the most important policy measure for explaining the decrease in welfare and the rise in work and earnings among female-headed families in recent years.”
18. See “Table 1, Share of Select Federal Tax Expenditures by Cash Income Percentile, 2006,” in Adam Carasso, Gillian Reynolds, and C. Eugene Steuerle, How Much Does the Federal Government Spend to Promote Economic Mobility and For Whom?, available at http://www.pewtrusts.org/uploadedFiles/wwwpewtrustsorg/Reports/Economic_Mobility/EMP_Mobiilty_Budget.pdf.
19. Heather Boushey, “The New Breadwinners,” in Heather Boushey and Ann O’Leary, eds., The Shriver Report: A Woman’s Nation Changes Everything (Washington, D.C.: Center for American Progress, 2009).
New Labor Forum 19(1): 45-56, Winter 2010
Copyright © Joseph S. Murphy Institute, CUNY
ISSN: 1095-7960/10 print, DOI: 10.4179/NLF.191.0000008