ERS Charts of Note


Fentanyl and other illicit opioids replaced prescription drugs as drivers of the opioid epidemic in 2011

Friday, April 2, 2021

Since the late 1990s, an opioid epidemic has afflicted the U.S. population, particularly those in the prime working ages of 25-54. As a result, the national age-adjusted mortality rate from drug overdoses rose from 6.1 per 100,000 people in 1999 to 21.7 per 100,000 in 2017, then dipped to 20.7 per 100,000 in 2018 and rose back to 21.6 in 2019. Among the prime working age population, the drug overdose mortality rate was 37.8 deaths per 100,000 people in 2019. Among major causes of death in this population, that rate was exceeded only by cancer (39.2 deaths per 100,000). ERS researchers, examining the opioid epidemic from 1999 to 2018, observed two distinct phases: a “prescription opioid phase” (1999-2011) and a succeeding “illicit opioid phase” (2011-2018), marked especially by the spread of fentanyl and its analogs. Updated data show the second phase has extended into 2019. Mortality data indicate that in the prescription opioid phase, drug overdose deaths were most prevalent in areas with high rates of physical disability, such as central Appalachia. Rural residents and middle-aged men and women in their 40s and early 50s were most affected, as were Whites and American Indian/Alaskan Natives. Opioid prescriptions ceased driving the epidemic in 2011 as increased regulation and greater awareness of prescription addiction problems took hold. The illicit opioid phase that followed involved primarily heroin and synthetic opioids, such as fentanyl. Fentanyl and its analogs are often used to spike other addictive drugs, including prescription opioids, creating powerful combinations that make existing drug addictions more lethal. During the study period, this second phase was concentrated in the northeastern United States, particularly in areas of employment loss. This phase most often involved urban young adult males, ages 25 to 39. All the racial/ethnic groups studied—Hispanics, Blacks, American Indian/Alaskan Natives, and Whites—were affected. This chart updates data found in the Economic Research Service report The Opioid Epidemic: A Geography in Two Phases, released April 2021.
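
For readers unfamiliar with the term, an age-adjusted mortality rate is a weighted average of age-specific death rates, with the weights taken from a fixed standard population so that rates remain comparable as the age distribution shifts over time. The sketch below illustrates that calculation with hypothetical age groups, rates, and weights; these are not the standard-population values or overdose data behind the figures cited above.

```python
# Minimal sketch of direct age standardization (hypothetical inputs).
# An age-adjusted rate is a weighted average of age-specific rates,
# with the weights taken from a fixed standard population.

# Hypothetical age-specific drug overdose death rates per 100,000 people
age_specific_rates = {"25-34": 30.0, "35-44": 35.0, "45-54": 28.0, "55-64": 20.0}

# Hypothetical standard-population weights (population shares summing to 1)
standard_weights = {"25-34": 0.28, "35-44": 0.26, "45-54": 0.25, "55-64": 0.21}

age_adjusted_rate = sum(
    rate * standard_weights[group] for group, rate in age_specific_rates.items()
)
print(f"Age-adjusted rate: {age_adjusted_rate:.1f} deaths per 100,000")
```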

Rural death rates from COVID-19 surpassed urban death rates in early September 2020

Friday, March 19, 2021

During the initial COVID-19 surge between March and June 2020, large urban areas had the highest weekly death rates from the virus in the United States. Those numbers declined as medical professionals learned more about the virus, how to treat it, and how to prevent its spread. As the virus spread from major urban areas to rural areas, the second COVID-19 surge, from July to August 2020, brought more deaths to rural areas. The peak in deaths associated with this surge was smaller because testing was more widespread, the infected population was younger and less vulnerable, and treatments were more effective. However, in early September 2020, COVID-19 death rates in rural areas surpassed those in urban areas. This trend continued into a third, still ongoing, surge that spiked in rural areas during the holiday season and again shortly thereafter. Rural areas have shown higher death rates per 100,000 adults since September in part because they had higher rates of new infections than urban areas, but that is not the whole story. Rural COVID-19 deaths per 100 new infections 2 weeks prior (a lag that accounts for the time between infection and death) were 2.2 in the first 3 weeks of February—35 percent higher than the corresponding urban rate of 1.6 deaths per 100 new infections 2 weeks earlier. The rural population appears to be more vulnerable to serious infection because of the older age of its population, higher rates of underlying medical conditions, lack of health insurance, and greater distance to an intensive care hospital. As of early February, death rates had started decreasing, possibly because of more widespread vaccinations among the most vulnerable populations. This chart updates data found in the February 2021 Amber Waves data feature, “Rural Residents Appear to be More Vulnerable to Serious Infection or Death from COVID-19.”
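
The deaths-per-100-new-infections measure pairs each period's deaths with the infections reported 2 weeks earlier, approximating the lag between infection and death. Below is a minimal sketch of that calculation using hypothetical weekly counts, not the data behind the 2.2 and 1.6 figures above.

```python
# Minimal sketch: deaths per 100 new infections reported 2 weeks earlier
# (all weekly counts below are hypothetical, not the ERS source data).

LAG_WEEKS = 2

def lagged_death_rate(deaths, new_infections, week, lag=LAG_WEEKS):
    """Deaths in `week` per 100 new infections reported `lag` weeks earlier."""
    return 100 * deaths[week] / new_infections[week - lag]

new_infections = {1: 50_000, 2: 48_000, 3: 45_000, 4: 42_000, 5: 40_000}
deaths = {3: 1_050, 4: 980, 5: 900}

for week in sorted(deaths):
    rate = lagged_death_rate(deaths, new_infections, week)
    print(f"Week {week}: {rate:.1f} deaths per 100 infections 2 weeks prior")
```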

Disparities in educational attainment by race, ethnicity persist in rural America

Wednesday, February 24, 2021

Higher educational attainment generally is associated with higher median earnings, higher employment rates, and greater workforce opportunity. Among all rural residents who are 25 years old or older, the percentage who had completed a bachelor’s degree or higher rose from 15 percent in 2000 to 21 percent in 2019. In addition, the share of the rural population 25 or older without a high school degree or equivalent dropped from 24 percent in 2000 to 12 percent in 2019. However, ethnic and racial disparities persist in education. Rural Hispanics continued to have the highest share of people without a high school degree in 2019 at 34 percent, despite significant gains in high school and higher educational attainment rates since 2000. Over the same period, Blacks or African Americans had the largest decrease in the share of rural individuals without a high school degree (21 percentage points). This change narrowed the gap between the shares of Blacks or African Americans and Whites who had graduated from high school but had not completed a bachelor’s degree. Nevertheless, the share of rural Blacks or African Americans without a high school degree (20 percent) was nearly double that of Whites (11 percent) in 2019. This chart updates data found in the November 2020 Amber Waves finding, “Racial and Ethnic Disparities in Educational Attainment Persist in Rural America.”

Disparities in educational attainment by race and ethnicity persist in rural America

Monday, December 7, 2020

Higher educational attainment is associated with higher median earnings, higher employment rates, and greater workforce opportunity. Among all rural residents who are 25 years old or older, the percentage of those who had completed a bachelor’s degree or higher rose from 15 percent in 2000 to 20 percent in 2018. In addition, the share of the rural population 25 or older without a high school degree or equivalent dropped from 24 percent in 2000 to 13 percent in 2018. Even so, ethnic and racial disparities persist in education. Rural Hispanics continued to have the highest share (35 percent) without a high school degree, despite significant gains in high school and higher educational attainment rates between 2000 and 2018. Over the same period, Blacks or African Americans had the largest decrease (20 percentage points) in the share of rural individuals without a high school degree. This change eliminated the gap between the shares of Blacks or African Americans and Whites who had graduated from high school but had not completed a bachelor’s degree. Nevertheless, the share of rural Blacks or African Americans without a high school degree remained nearly double that of Whites in 2018. This chart appears in the November 2020 Amber Waves finding, “Racial and Ethnic Disparities in Educational Attainment Persist in Rural America.”

More COVID-19 cases per capita in U.S. metro than nonmetro areas, but the share of cases in nonmetro areas is increasing

Friday, October 2, 2020

COVID-19 has spread to nearly every nation in the world, and to every State and nearly every county in the United States. The virus initially spread most rapidly to large metropolitan areas, and most confirmed cases are still in metro areas with populations of at least 1 million, according to the Economic Research Service’s (ERS) analysis of data from the Johns Hopkins University Center for Systems Science and Engineering. This is consistent with most of the U.S. population living in large metro areas. Even in per capita terms, the prevalence of COVID-19 cases has been greater in metro than in nonmetro areas since the initial appearance of the pandemic in the United States (the first confirmed case was reported on January 20, 2020). As of September 1, cumulative confirmed cases per 100,000 residents reached 1,877 in metro areas, compared with 1,437 cases in nonmetro areas. Although the prevalence of COVID-19 cases remains lower in nonmetro areas, the share of cases in nonmetro areas has grown since late March. The nonmetro share of all confirmed U.S. COVID-19 cases grew from 3.6 percent on April 1 to 11.1 percent on September 1. ERS regularly produces research on rural America, including demographic changes in rural communities and drivers of rural economic performance. This chart appears in the ERS topic page, The COVID-19 Pandemic and Rural America, updated September 2020.

Rural poverty rates dropped across all race/ethnicity groups between 2013 and 2018

Monday, August 17, 2020

In 2013, rural poverty reached a 30-year peak at 18.4 percent of the rural population. Between 2013 and 2018, the rural poverty rate fell 2.3 percentage points, equivalent to about 1 million fewer rural residents in poverty. Rural poverty rates declined for all race/ethnicity groups. The rural Black population showed the largest decline in poverty rates, from 37.3 percent in 2013 to 31.6 percent in 2018. Despite this decrease, Blacks continued to have the highest poverty rate among all rural race/ethnicity groups. While Blacks made up 7.6 percent of the rural population, they accounted for 14.9 percent of the rural poor in 2018. American Indians had the second-highest poverty rate (30.9 percent) among all rural race/ethnicity groups in 2018, 3.5 percentage points lower than in 2013. Hispanics had the lowest poverty rate among rural minority groups (23.8 percent) in 2018, an improvement of 4.4 percentage points from 2013. Whites have historically had a much lower rural poverty rate (14.0 percent in 2018), and their rate fell 1.9 percentage points from 2013 to 2018. However, the majority of the rural poor are White: Whites accounted for 84.8 percent of the overall rural population and 73.4 percent of the rural population in poverty in 2018. This chart updates data that appeared in the November 2018 ERS report, Rural America at a Glance, 2018 Edition.

Counties with low educational attainment were concentrated in rural areas

Monday, July 13, 2020

Between 2014 and 2018, the United States had 316 counties with low levels of educational attainment, meaning 20 percent or more of working-age adults (ages 25-64) living in the county lacked a high school diploma or equivalent. The majority of those counties—about 4 out of 5—were in rural (nonmetro) areas. Low-education rural counties were predominantly in the South (nearly 80 percent, or 208 counties), and the economies of more than one-third (116 counties) relied on farming or manufacturing. Nearly half (156 counties) were high-poverty counties, with a poverty rate of 20 percent or more, and most of those counties (113 counties) also had persistently high poverty over three or more decades. In addition, almost 60 percent of low-education rural counties were in areas where African Americans alone (70 counties) or Hispanics of any race (115 counties) accounted for 20 percent or more of the total population. In these counties, the low-education rates for African Americans or Hispanics were substantially higher than the corresponding rates for White (non-Hispanic) individuals. This chart appears on the ERS topic page for Rural Education, updated May 2020.

All counties with extreme poverty in 2018 were rural (nonmetro) counties

Tuesday, June 16, 2020

In 2018, the United States had 664 high-poverty counties, where 20 percent or more of the population, on average over 2014-18, lived below the Federal poverty level. The majority were rural (78.9 percent, or 524 counties). These high-poverty counties represented about one of every four rural counties, compared with about one of every ten urban counties. Fifteen of the 664 counties were extreme poverty areas, where the poverty rate was 40 percent or greater. The extreme poverty areas were also persistent poverty counties, with poverty rates of at least 20 percent over the past 30 years. In 2018, all of the extreme poverty counties were in rural America. These counties are not evenly distributed, but rather are geographically concentrated and disproportionately located in regions with above-average populations of racial minorities. Several extreme poverty counties, for instance, were in Mississippi, including four counties where there has historically been a high incidence of poverty among the African-American population. Extreme poverty counties were also found in South Dakota, including six counties where Native Americans made up more than 50 percent of the population. This chart appears on the Economic Research Service topic page for Rural Poverty & Well-being, updated February 2020. It is also in the May 2020 Amber Waves article, “Extreme Poverty Counties Found Solely in Rural Areas in 2018.”

Highest U.S. poverty rates are in the South, with over 20 percent poor in its rural areas

Friday, March 13, 2020

People living in poverty tend to be clustered in certain U.S. regions, counties, and neighborhoods, rather than being spread evenly across the Nation. Poverty rates in rural (nonmetro) areas have historically been higher than in urban (metro) areas, and the rural/urban poverty gap is greater in some regions of the country than others. At the regional level, poverty is disproportionately concentrated in the rural South. In 2014-18, the South had an average rural poverty rate of 20.5 percent—nearly 6 percentage points higher than the average rate in the region’s urban areas. An estimated 42.7 percent of the Nation’s rural population and 51.3 percent of the Nation’s rural poor lived in this region between 2014 and 2018. By comparison, 37.1 percent of the urban population and 39.4 percent of the urban poor lived in the South during that period. The poverty gap was smallest in the Midwest and the Northeast—with less than a percentage point difference between rural and urban poverty rates. This chart appears on the Economic Research Service topic page for Rural Poverty & Well-being, updated February 2020.

ICYMI... Rural families headed by single adults had higher poverty rates than urban counterparts in 2017

Tuesday, December 10, 2019

Family type has a significant bearing on poverty. For example, families headed by two adults are likely to have more sources of income than single-adult families—and are therefore less likely to be poor. In 2017, 33.8 percent of rural families headed by a female with no spouse present and 18.5 percent of those headed by a male with no spouse present fell below the poverty threshold. In contrast, 6 percent of rural families with a married couple were poor. On average, 11.6 percent of all rural families were poor. Single-adult families in urban areas also had higher-than-average poverty rates in 2017, but overall family poverty rates were higher in rural than in urban areas. This chart appears in the ERS topic page for Rural Poverty & Well-being, updated March 2019. This Chart of Note was originally published May 29, 2019.

Poverty rates in 2017 were highest for children, particularly among those living in rural areas

Monday, November 4, 2019

U.S. poverty rates differ by age group. In 2017, the difference between rural and urban poverty rates was greatest for children under the age of 5 (26.0 percent in rural areas versus 19.3 percent in urban areas). Federal poverty thresholds vary by household composition. For a family of two adults and one child, the poverty line in 2017 was an annual income of $19,730. Overall, poverty rates for children under age 18 were 22.8 percent in rural areas and 17.7 percent in urban areas. In contrast, the poverty rates for senior adults (age 65 and older) were much closer at 10.1 percent in rural areas and 9.1 percent in urban areas. Working-age adults (ages 18–64) followed the pattern of the other age groups, in that they had higher poverty rates in rural areas (16.0 percent) than in urban areas (12.1 percent). Poverty rates do not indicate how long individuals have experienced poverty. Some families cycle into and out of poverty over time, while others are persistently poor. Persistent poverty among children is of particular concern, as the cumulative effects may lead to poor health, limited education, and other negative outcomes. Also, research suggests that the more time a child spends in poverty or living in a high-poverty area, the greater the chance of being poor as an adult. This chart appears in the ERS topic page for Rural Poverty & Well-being, updated March 2019.

SNAP redemptions had larger effect on county employment during the Great Recession than before or after

Monday, August 12, 2019

The Supplemental Nutrition Assistance Program (SNAP) provided benefits to an average of more than 46 million recipients per month and accounted for 52 percent of USDA’s spending in 2014. That year, SNAP recipients redeemed more than $69 billion worth of benefits. Recent ERS research estimated the effect of SNAP redemptions on county-level employment. During and immediately after the Great Recession (2008–10), each additional $10,000 in SNAP redemptions contributed on average 1.04 additional jobs in rural counties and 0.41 job in urban counties. By contrast, before the recession (2001–07), SNAP redemptions had a much smaller positive effect on employment in rural counties (about 0.25 job per $10,000 in redemptions) and a negative effect in urban counties (a loss of about 0.22 job per $10,000 in redemptions). After the recession (2011–14), SNAP redemptions had a statistically insignificant effect on employment in both rural and urban counties. Per dollar spent, the effect of SNAP redemptions on local employment during the recession was greater than the employment effect of other government transfer payments combined—including Social Security, Medicare, Medicaid, unemployment insurance compensation, and veterans’ benefits—and also greater than the employment effect of total Federal Government spending. SNAP’s relatively large effect on employment during the recession may be due to the fact that, unlike many other government programs, SNAP payments are provided directly to low-income people, who tend to spend additional income immediately. This chart uses data found in the May 2019 ERS report, The Impacts of Supplemental Nutrition Assistance Program Redemptions on County-Level Employment. Also see the May 2019 article, “SNAP Redemptions Contributed to Employment During the Great Recession” in ERS’s Amber Waves magazine.
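
Read as multipliers, the estimates above support a simple back-of-the-envelope calculation: estimated jobs supported equal redemptions divided by $10,000, times the jobs-per-$10,000 estimate for the county type. The sketch below applies the 2008–10 estimates to a hypothetical county redemption total; the dollar amount is illustrative, not ERS data.

```python
# Back-of-the-envelope use of the estimated 2008-10 employment effects:
# jobs supported ~ (SNAP redemptions / $10,000) * jobs per $10,000.
# The county redemption amount below is hypothetical.

JOBS_PER_10K = {"rural": 1.04, "urban": 0.41}  # 2008-10 estimates cited above

def estimated_jobs(redemptions_dollars, county_type):
    """Estimated jobs supported, given redemptions and the county type."""
    return (redemptions_dollars / 10_000) * JOBS_PER_10K[county_type]

redemptions = 2_000_000  # hypothetical county redemption total, in dollars
for county_type in ("rural", "urban"):
    jobs = estimated_jobs(redemptions, county_type)
    print(f"{county_type}: roughly {jobs:.0f} jobs supported")
```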

SNAP participation and benefits grew rapidly during and after the Great Recession

Thursday, June 20, 2019

The Supplemental Nutrition Assistance Program (SNAP) is the largest USDA program. During fiscal year 2014, it provided benefits to an average of more than 46 million recipients per month and accounted for 52 percent of USDA’s spending. That year, SNAP recipients redeemed more than $69 billion worth of benefits at SNAP-authorized stores—83 percent of which were located in urban areas and 17 percent in rural areas. Between fiscal years 2000 and 2013, average monthly SNAP participation nearly tripled, while the inflation-adjusted value of benefits paid under the program nearly quadrupled. The growth in program participation and the value of benefits paid were particularly rapid during and immediately after the Great Recession, which officially began in December 2007 and ended in June 2009. However, the recession resulted in high poverty rates well after it officially ended. The increase in program spending between 2009 and 2013 was due in part to rising SNAP participation in response to high levels of poverty during this period. A temporary increase in benefit rates mandated by the American Recovery and Reinvestment Act (ARRA) in early 2009 and other policies to increase access to the program also likely expanded SNAP participation and spending. This chart appears in the May 2019 ERS report Investigating Impacts of SNAP Redemptions on County-Level Employment. Also see the May 2019 article, “SNAP Redemptions Contributed to Employment During the Great Recession” in ERS’s Amber Waves magazine.

Rural families headed by single adults had higher poverty rates than urban counterparts in 2017

Wednesday, May 29, 2019

Family type has a significant bearing on poverty. For example, families headed by two adults are likely to have more sources of income than single-adult families—and are therefore less likely to be poor. In 2017, 33.8 percent of rural families headed by a female with no spouse present and 18.5 percent of those headed by a male with no spouse present fell below the poverty threshold. In contrast, 6 percent of rural families with a married couple were poor. On average, 11.6 percent of all rural families were poor. Single-adult families in urban areas also had higher-than-average poverty rates in 2017, but overall family poverty rates were higher in rural than in urban areas. This chart appears in the ERS topic page for Rural Poverty & Well-being, updated March 2019.

Rural telehealth participation rates vary by activity

Friday, April 19, 2019

Compared with traditional medical delivery systems, telehealth—personal health services or activities conducted through the internet—allows people to participate more actively in their health care. It also facilitates timely and convenient monitoring of ongoing conditions for those who participate in connected telehealth practices. To better understand the factors affecting telehealth use, ERS researchers examined rural residents’ participation in three telehealth activities: online health research; online health maintenance (such as contacting providers, maintaining records, and paying bills); and online health monitoring (the transmission of data gathered by remote medical devices to medical personnel). Findings show that participation rates for these telehealth activities varied in 2015. Many participants reported conducting only one telehealth activity, such as the 10.7 percent of participants who conducted only online health research. Some people conducted more than one telehealth activity, such as the 0.8 percent who conducted online health research, online health maintenance, and online health monitoring. The majority of participants who conducted both health maintenance and health monitoring also conducted online health research. This chart appears in the November 2018 ERS report, Rural Individuals' Telehealth Practices: An Overview.

Poverty rates in rural and urban areas vary across U.S. regions

Wednesday, April 3, 2019

Poverty rates in rural (nonmetro) areas have historically been higher than in urban (metro) areas, and the rural/urban poverty gap is greater in some regions of the country than others. For example, the gap has historically been largest in the South. In 2013–17, the South had an average rural poverty rate of 20.8 percent—nearly 6 percentage points higher than the average rate in the region’s urban areas. The difference in the South’s poverty rates is particularly important because an estimated 42.6 percent of the Nation’s rural population and 51.1 percent of the Nation’s rural poor lived in this region between 2013 and 2017. By comparison, 36.9 percent of the urban population and 39.1 percent of the urban poor lived in the South during that period. The poverty gap was smallest in the Midwest and the Northeast—with less than a percentage point difference between rural and urban poverty rates. This chart appears on the ERS topic page “Rural Poverty & Well-being,” updated March 2019.

In rural areas, single-parent families have higher poverty rates than families headed by married couples

Monday, February 25, 2019

Rural parents often face challenges—such as a lack of jobs, physical isolation, and limited transportation choices—that may put their children at risk of being poor. That risk is greatest among single-parent families, particularly those headed by a female. Research shows that, compared with other family types, single parents are less likely to have an education beyond high school and are more likely to be without employment or to work in a job that is not secure or does not pay a living wage. In 2016, rural female-headed families with no spouse present made up 26 percent of all rural families with children and 60 percent of all rural families with children that were poor. The poverty rate for rural female-headed families was 46 percent, compared with about 23 percent for rural male-headed (no spouse) families and 9 percent for rural married-couple families with children. These rates were nearly unchanged from 2007, indicating the persistently high likelihood of remaining in poverty for rural children in single-parent families. The U.S. Census Bureau’s American Community Survey, the source of this data, does not include sufficient information to explore the economic status of households headed by unmarried partners, which other ERS research has shown to be important for the study of rural child poverty. This chart appears in the July 2018 Amber Waves data feature, “Child Poverty Heavily Concentrated in Rural Mississippi, Even More So Than Before the Great Recession.”

Rural poverty remains regionally concentrated

Monday, June 11, 2018

Rural Americans living in poverty tend to be clustered in certain U.S. regions and counties. Rural (nonmetro) counties with a high incidence of poverty are mainly concentrated in the South, where rural counties had an average poverty rate of over 21 percent between 2012 and 2016. By comparison, urban (metro) counties in the South had an average poverty rate of about 16 percent. Rural counties with the most severe poverty are found in historically poor areas of the Southeast—including the Mississippi Delta and Appalachia—as well as on Native American lands. The incidence of rural poverty is relatively low elsewhere, but it is generally more widespread than in the past. This chart appears in the ERS topic page for Rural Poverty & Well-being, updated April 2018.

Rural median household income remains about 25 percent below the urban median

Thursday, July 20, 2017

In 2015, the median household income for rural (nonmetro) counties rose to $44,212, a 3.4 percent increase over the prior year. This was the second year in a row of rising real (adjusted for inflation) income for the median rural household, ending 6 years of income declines during and after the Great Recession of 2007-09. By comparison, urban (metro) median income has risen for 3 straight years, reaching $58,260 in 2015. However, these 2015 median incomes remain below their 2007 peaks of $45,816 for rural households and $60,661 for urban ones. Generally, rural median household income has remained about 25 percent below the urban median. Because the cost of living is generally lower in rural areas, the gap in purchasing power is likely smaller between rural and urban households. This chart appeared in the ERS topic page for Income, updated June 2017.
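
The "about 25 percent" gap is simply one minus the ratio of the rural median to the urban median; a quick check against the 2015 figures cited above gives roughly 24 percent.

```python
# Rural-urban income gap implied by the 2015 medians cited in the text
rural_median_2015 = 44_212
urban_median_2015 = 58_260

gap = 1 - rural_median_2015 / urban_median_2015
print(f"Rural median income is {gap:.0%} below the urban median")  # roughly 24%
```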

Over half of farmers had health insurance coverage through an employer

Thursday, March 9, 2017

Health insurance can help people and households manage the cost and uncertainty of healthcare expenses. Most Americans with health insurance coverage receive it through their employers, and farm households are no exception. Although many farm operators are self-employed, in the majority of farm households either the operator or spouse is employed off-farm. In 2015, more than half of farm household members had health insurance coverage through an employer—close to the rate for the overall U.S. population. Farmers purchased health insurance directly from an insurance company at rates similar to the general population—and were less likely to receive health insurance from a government-provided program, such as Medicare or Medicaid. Over 89 percent of farmers had some form of health insurance, similar to the general population (nearly 91 percent). This chart appears in the topic page for Health Insurance Coverage, updated December 2016.