ERS Charts of Note
Monday, July 11, 2022
Farming activities in the United States accounted for 11.2 percent of U.S. greenhouse gas emissions in 2020. From 2019 to 2020, agricultural greenhouse gas emissions declined from 699 to 670 million metric tons of carbon dioxide equivalent but increased from 10.6 percent to 11.2 percent as a share of total U.S. emissions. The Environmental Protection Agency estimated that in 2020 agriculture produced 5.6 percent of total U.S. emissions as nitrous oxide (N2O), 4.2 percent as methane (CH4), and 0.8 percent as on-farm carbon dioxide (CO2), with another 0.6 percent emitted indirectly through the electricity that agriculture consumes. Emissions come from cropping activities such as fertilizer application, which emit nitrous oxide, and from manure storage and management and enteric fermentation (a normal digestive process in animals), which produce methane. Of the economic sectors in the United States defined by the Energy Information Administration, industry (excluding agricultural emissions) accounted for the largest portion of total greenhouse gas emissions (30.3 percent), followed by transportation, residential, commercial, agriculture, and U.S. territories (no specific consumption data can be attributed within the territories, so they are listed as a group). This chart appears in the USDA, Economic Research Service data product Ag and Food Statistics: Charting the Essentials.
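As a quick consistency check, the component shares cited above can be summed and used to back out the implied economy-wide total. This is a minimal sketch using only the rounded figures from the text, so the derived total is approximate:

```python
# Agriculture's 2020 shares of total U.S. greenhouse gas emissions,
# by source, as cited in the text (percent of the U.S. total).
ag_shares = {
    "nitrous oxide (N2O)": 5.6,
    "methane (CH4)": 4.2,
    "on-farm carbon dioxide (CO2)": 0.8,
    "indirect, via electricity use": 0.6,
}

total_ag_share = sum(ag_shares.values())
print(f"Agriculture's share of U.S. emissions: {total_ag_share:.1f}%")

# Given agriculture's 670 million metric tons of CO2 equivalent (MMT CO2-eq),
# the implied economy-wide total (approximate, due to rounding in the source):
ag_emissions_mmt = 670
us_total_mmt = ag_emissions_mmt / (total_ag_share / 100)
print(f"Implied U.S. total: {us_total_mmt:,.0f} MMT CO2-eq")
```

The four components sum to the 11.2 percent headline share, which is how the chart decomposes agriculture's footprint.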
Friday, March 25, 2022
Irrigated cropping patterns have shifted significantly in the United States during the past 50 years. In 1964, alfalfa hay and cotton were the most widely irrigated crops, but acreage under those crops has stayed relatively constant since then. Meanwhile, irrigated acres planted in corn for grain and soybeans have increased substantially. In 1964, farmers planted less than 2 million acres of irrigated land in corn for grain. By 2017, irrigated acreage planted in corn grew to more than 12 million acres, making corn for grain the most commonly irrigated crop. Over the same period, irrigated acreage planted in soybeans also increased substantially, from fewer than 1 million acres to nearly 10 million acres. The growth in irrigated corn and soybean acreage reflects, in part, increasing demand for these crops as feedstock sources for bioenergy production and feed for livestock operations, both domestically and abroad. Irrigated corn and soybean expansion also reflects a broader eastward shift in irrigated production acreage over the past five decades. This chart was drawn from the USDA, Economic Research Service report “Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity,” published December 28, 2021.
Monday, March 14, 2022
Surface and groundwater are the two primary water supply sources for irrigated agriculture. Groundwater is pumped from aquifers, while surface water is diverted from natural streams, rivers, and lakes. The predominance of surface versus groundwater use varies regionally. Groundwater is the most common source of water applied for irrigation in the Mississippi Delta, Northern Plains, and Southern Plains regions. The prevalence of groundwater-fed irrigated agriculture in the Northern and Southern Plains relates to the regions’ historically abundant groundwater resources. The High Plains Aquifer, the largest aquifer in North America and also known as the Ogallala Aquifer, underlies significant portions of the Plains regions. The Mississippi Delta region also has abundant groundwater resources that are relatively shallow, making groundwater-based irrigation less expensive. Irrigated agriculture relying on surface water is most prevalent in the Mountain and Pacific regions. The extent of surface water use for irrigation in these regions reflects past Federal, State, and local investments in water conveyance and storage infrastructure, as well as characteristics of the regions’ legal institutions, which grant water rights based on historical beneficial use rather than ownership of land along streams and rivers. This chart was drawn from the USDA, Economic Research Service report “Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity,” published December 28, 2021.
Wednesday, March 2, 2022
Irrigation methods vary by crop because of differences in production practices, crop value, water source, and soil characteristics. Irrigation application methods can be broadly categorized as either gravity or pressurized systems. Pressurized irrigation systems apply water under pressure through pipes or other tubing, while gravity irrigation systems use field slope to advance water across the field surface. In general, pressurized irrigation systems are more efficient than gravity irrigation systems under most field settings, as less water is lost to evaporation and seepage. Rice has the largest share of acres irrigated by gravity systems, which is related to the flooding requirements of most rice production systems in the United States. Peanuts have the largest proportion of acres irrigated by pressurized systems. Peanut cultivation is concentrated in the Southeastern United States (i.e., Alabama, Georgia, and Florida), where the prevalence of sandy, well-drained soils makes gravity irrigation methods generally unsuitable because of seepage losses. Pressurized systems are also prevalent among high-value specialty crops, such as vegetables and orchards. Pressurized irrigation systems, particularly low-flow micro irrigation systems, are generally more expensive than gravity irrigation systems, precluding their use among lower value crops. Pressurized systems are also more prominent among crops concentrated in regions more reliant on groundwater, including irrigated corn across the Eastern and Central United States. This chart was drawn from the USDA, Economic Research Service report “Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity,” published December 28, 2021.
Wednesday, January 19, 2022
The importance of irrigation for the U.S. agricultural sector has evolved significantly over the past century. Irrigated acreage in the country has grown from fewer than 3 million acres in 1890 to more than 58 million acres in 2017. The expansion of irrigated acreage during this period reflects Federal, State, and local investment in irrigation infrastructure to deliver surface water to farms and ranches. Additionally, this expansion is partly due to advancements in well drilling and pumping technologies, which have facilitated growth in groundwater-based irrigated agriculture. Since 1969, the amount of water used per acre irrigated has decreased substantially. Average water use per irrigated acre exceeded 2 acre-feet (1 acre-foot = 325,851 gallons) in 1969 but declined to about 1.5 acre-feet by 2018. More efficient water application technologies, adopted as producers transitioned from gravity-based to pressurized irrigation systems, have driven the reduction in water use per acre of irrigated land. This chart was drawn from the USDA, Economic Research Service report “Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity,” published December 2021.
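The acre-foot definition given above makes the per-acre decline easy to express in gallons. A minimal conversion sketch, assuming the 2- and 1.5-acre-foot rates as the approximate figures from the text:

```python
GALLONS_PER_ACRE_FOOT = 325_851  # definition cited in the text

def acre_feet_to_gallons(acre_feet: float) -> float:
    """Convert a water volume from acre-feet to gallons."""
    return acre_feet * GALLONS_PER_ACRE_FOOT

rate_1969 = 2.0  # acre-feet applied per irrigated acre, 1969 (approximate)
rate_2018 = 1.5  # acre-feet applied per irrigated acre, 2018 (approximate)

print(f"1969: ~{acre_feet_to_gallons(rate_1969):,.0f} gallons per acre")
print(f"2018: ~{acre_feet_to_gallons(rate_2018):,.0f} gallons per acre")
print(f"Decline: ~{1 - rate_2018 / rate_1969:.0%} per irrigated acre")
```

At these rounded rates, the shift works out to roughly a quarter less water applied per irrigated acre.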
Tuesday, January 4, 2022
The regional distribution of U.S. irrigated acreage changed significantly from 1949 to 2017. Trends in irrigated cropping patterns, technological advances, water availability, and changing growing-season weather drove this evolution. The arid Mountain and Pacific regions consistently irrigated the most farmland until 2007, when irrigated acreage in the Northern Plains region surpassed acreage in the Pacific region. Irrigated acreage in the Mountain and Pacific regions remained relatively constant over the nearly 70-year period, despite increasingly limited opportunities for additional water development and increasing competition for water from non-agricultural sectors. The Northern Plains region has experienced the most substantial increase in irrigated acreage, expanding from less than 2 million acres in 1949 to nearly 12 million acres in 2017. The expansion of irrigated acreage in the Northern Plains is related to advances in groundwater pumping technologies, the diffusion of center pivot irrigation application systems, and the region’s abundant aquifer resources. The Southern Plains region experienced similar growth in irrigation until the 1980s, when dwindling groundwater supplies resulted in irrigated acreage declines. The Mississippi Delta and Southeast regions also have expanded irrigated acreage since 1949, reflecting, in part, changing cropping patterns, abundant aquifer water supplies, and producer responsiveness to changing precipitation levels during growing seasons. This chart was drawn from the USDA, Economic Research Service report Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity, published December 2021.
Monday, December 13, 2021
Irrigation organizations use a variety of methods to calculate on-farm water use so they can accurately track water use within their delivery systems. The methods used to calculate on-farm water use partially determine ways organizations can price water deliveries. For example, implementing volumetric water pricing is difficult unless organizations can directly meter on-farm water use. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, about 44 percent of irrigation water delivery organizations use direct metering to calculate on-farm water use, and about 42 percent of organizations use time-of-use estimation to determine water deliveries. The time-of-use method estimates the volume of water delivered based on the duration of deliveries and the characteristics of the conveyance infrastructure. About 17 percent of organizations calculate water deliveries based on self-reporting from irrigated farms and ranches. Many organizations use more than one method to determine on-farm water use. This chart was drawn from the USDA, Economic Research Service report Irrigation Organizations: Water Storage and Delivery Infrastructure, published October 2021.
Monday, November 29, 2021
Water storage infrastructure includes dams and reservoirs that provide a way to store water across seasons and years to meet the demands of irrigators. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, less than 20 percent of water delivery organizations own and manage their own water storage reservoirs. The remaining water delivery organizations rely on natural streamflow or storage infrastructure owned by State or Federal agencies or other irrigation organizations. Large irrigation organizations, defined as those organizations that serve more than 10,000 irrigable acres, are the most likely to own water storage infrastructure. Almost 37 percent of large irrigation organizations have at least one water storage reservoir. Meanwhile, 21 percent of medium organizations and 10 percent of small organizations have at least one reservoir. Storage infrastructure is particularly important in snowpack-dependent basins where the timing of spring runoff does not align with peak irrigation water demand. The role of water storage infrastructure will be critical as snowpack decreases, snowmelt runoff shifts to earlier in the growing season, and water demand increases. This chart can be found in the USDA, Economic Research Service report Irrigation Organizations—Water Storage and Delivery Infrastructure, published October 19, 2021.
Wednesday, November 3, 2021
Irrigation organizations that deliver water to farms and ranches use main and lateral canals, tunnels, and pipelines to transport water from natural waterways, reservoirs, or other infrastructure to irrigated farms and ranches. Transporting water to farms and ranches can result in conveyance losses, or water that is unavailable for irrigation use because of evaporation or seepage. Lining water canals with quasi-impermeable materials, such as concrete or plastic membranes, can reduce conveyance losses as less water is lost to seepage. However, the cost of lining canals may be prohibitively high for many irrigation organizations. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, almost 76 percent of water delivery organizations cite expense as a reason for leaving conveyance infrastructure unlined. In some scenarios, lining canals may not be feasible or warranted. For example, unlined canals may beneficially recharge aquifers or soil and geologic attributes may minimize seepage losses. A smaller percentage of organizations cite those as reasons for not lining main and lateral canals. This chart can be found in the USDA, Economic Research Service report, Irrigation Organizations—Water Storage and Delivery Infrastructure, published October 19, 2021.
Monday, September 13, 2021
There are two methods to apply irrigation water to crops: gravity and pressurized irrigation systems. Gravity irrigation systems use on-field furrows, basins, or poly-pipe to advance water across the field surface by gravity alone. Pressurized systems apply water under pressure through pipes or other tubing directly to crops (e.g., sprinkler and micro/drip irrigation systems). Under many field conditions, pressurized irrigation systems use water more efficiently than gravity systems, as less water is lost to evaporation, deep percolation, and field runoff. Over the last 30 years, the number of acres irrigated using pressurized irrigation systems roughly doubled, while acreage irrigated using gravity systems declined substantially in the 17 Western States. In 2018, 72 percent of all irrigated cropland acres in the 17 Western States (28.96 million of 40.31 million total irrigated acres) used pressurized irrigation systems, up from 37 percent in 1984. This chart appears in the USDA, Economic Research Service topic page for Irrigation & Water Use, updated August 2021.
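The 72-percent figure follows directly from the cited acreages; a quick check using the numbers in the text:

```python
pressurized_macres = 28.96       # million irrigated acres, pressurized, 2018
total_irrigated_macres = 40.31   # million irrigated acres, all systems, 2018

gravity_macres = total_irrigated_macres - pressurized_macres
pressurized_share = pressurized_macres / total_irrigated_macres
print(f"Pressurized share of irrigated acres: {pressurized_share:.0%}")
print(f"Gravity-irrigated acres: ~{gravity_macres:.2f} million")
```

The ratio rounds to 72 percent, with the remaining roughly 11 million acres still under gravity systems.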
Monday, November 16, 2020
In 2019, Wisconsin’s production of fluid milk was second only to California’s. According to data from USDA’s National Agricultural Statistics Service, Wisconsin generated 30.6 billion pounds of milk that year, with milk sales totaling $5 billion. In recent years, Wisconsin dairy farms have been exposed to substantial weather volatility characterized by frequent droughts, storms, and temperature extremes (both hot and cold), resulting in considerable fluctuations in dairy productivity. Researchers from the Economic Research Service (ERS), among others, found that total factor productivity (TFP), which measures the rate of growth in total output (aggregate milk produced) relative to the rate of growth in total inputs (such as the number of cows, farm labor, feed, and machinery), increased at an average annual rate of 2.16 percent for Wisconsin dairy farms between 1996 and 2012. This increase was primarily driven by technological progress, which contributed 1.91 percent per year through advances such as improved herd genetics, better feed formulations, and improvements in milking and feed handling equipment. However, trends in rainfall and temperature variation were responsible for a 0.32 percent annual decline in the productivity of Wisconsin dairy farms during the same period. For example, an average increase in temperature of 1.5 degrees Fahrenheit reduced milk output for the average Wisconsin dairy farm by 20.1 metric tons per year, equivalent to reducing the herd size of the average farm by 1.6 cows every year. This chart appears in ERS’s October 2020 Amber Waves finding, “Climatic Trends Dampened Recent Productivity Growth on Wisconsin Dairy Farms.”
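Tallying the growth components cited above shows how they net out. The residual computed below is simply whatever the text does not itemize, and the per-cow figure is a derived, approximate implication of the 20.1-ton and 1.6-cow equivalence, not a number stated in the source:

```python
# Average annual TFP growth for Wisconsin dairy farms, 1996-2012,
# and its cited components (percent per year, from the text).
tfp_growth = 2.16    # total factor productivity growth
technology = 1.91    # contribution of technological progress
climate = -0.32      # contribution of rainfall and temperature trends

# Whatever remains comes from sources the text does not itemize.
residual = tfp_growth - technology - climate
print(f"Residual from other sources: {residual:+.2f} percent per year")

# Derived implication of the herd-size equivalence: 20.1 metric tons of
# lost milk output corresponds to about 1.6 cows' worth of production.
milk_per_cow = 20.1 / 1.6
print(f"Implied annual output per cow: ~{milk_per_cow:.1f} metric tons")
```

Note that technology and climate alone do not sum to the 2.16 percent total, so other unlisted factors must account for the remainder.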
Wednesday, August 12, 2020
Knowing where natural resource use accumulates is fundamental to understanding what factors influence resource-use decisions. A recent Economic Research Service (ERS) study estimated natural resource use by the U.S. food system in 2007 (2007 data were the latest available with the level of detail needed for the analysis). Farm production was the smallest user of fossil fuels (12 percent of fossil fuel use); households were the largest users (35 percent). Over 40 percent of greenhouse gas emissions in food production were from farms and ranches, followed by households, and then companies that distribute and market food. For forest products, the greatest use occurred during food processing and packaging, with paper-based packaging accounting for most of this use. Farm production was the dominant user of freshwater withdrawals due to irrigation, but slightly over a third of water use by the food system in 2007 occurred after the farm, including in household kitchens (20 percent) and in the energy industry (12 percent). This chart appears in the ERS report, Resource Requirements of Food Demand in the United States, and Amber Waves article, “A Shift to Healthier Diets Likely To Affect Use of Natural Resources,” May 2020.
Thursday, May 28, 2020
Conserving natural resources starts with identifying where they are used. A recent Economic Research Service (ERS) study examined how much of five of the Nation’s natural resources were used in 2007 to feed Americans aged 2 and above. (2007 data were the latest available with the level of detail needed for the analysis.) The researchers looked at the entire U.S. food system, from production of farm inputs—such as fertilizers and feed—through points of consumer purchases in grocery stores and eating-out places to home kitchens. Their estimates show that agricultural land use in the U.S. food system was 25.5 percent of the country’s 2.3 billion acres of total land. Although the study does not account for other food-related land use, such as by forestry and mining industries serving the food system, it does show that about half of agricultural land was dedicated to food production for the U.S. market, while the other half was devoted to nonfood crops, like cotton and corn for producing ethanol, and to export crops, like soybeans. The U.S. food system also accounted for an estimated 28 percent of 2007’s freshwater withdrawals, 11.5 percent of the fossil fuel budget, and 7.2 percent of marketed forest products. Air is a natural resource that is degraded by the addition of greenhouse gases. The food system accounted for an estimated 18.1 percent of U.S. greenhouse gas emissions in 2007. A version of this chart appears in the ERS report, Resource Requirements of Food Demand in the United States, May 2020, and the Amber Waves feature article, “A Shift to Healthier Diets Likely To Affect Use of Natural Resources.”
Wednesday, April 22, 2020
The U.S. Environmental Protection Agency estimated that agriculture and forestry together accounted for 10.5 percent of U.S. greenhouse gas emissions in 2018. This includes carbon dioxide (CO2) emissions associated with agricultural electricity consumption. The greenhouse gases with the largest contributions to rising temperature are CO2, methane (CH4), and nitrous oxide (N2O). Globally, CO2 emissions are the largest contributor to climate change. However, the emissions profile for agriculture differs from that of the economy as a whole. U.S. agriculture emitted 698 million metric tons of carbon-dioxide equivalent in 2018: 12.3 percent as carbon dioxide, 36.2 percent as methane, and 51.3 percent as nitrous oxide. Increases in carbon storage (sinks) offset 11.6 percent of total U.S. greenhouse gas emissions in 2018. Carbon sinks include forest management to increase carbon in forests, increases in tree carbon stocks in settlements, conversion of agricultural to forest land (afforestation), and crop management practices that increase carbon in agricultural soils. This chart updates data that appears in the Economic Research Service data product Ag and Food Statistics: Charting the Essentials.
Monday, September 9, 2019
U.S. farm output since 1948 has grown by 170 percent. Increases in total factor productivity (TFP), measured as total output per unit of total input, accounted for more than 90 percent of that output growth. However, TFP growth rates fluctuate considerably year-to-year, mostly in response to adverse weather, which can lower productivity estimates. Recent ERS research modeled a future climate-change scenario with an average temperature increase of 2 degrees Celsius (3.6 degrees Fahrenheit) and a 1-inch decrease in average annual precipitation. Results showed that the “TFP gap index”—the difference in total-factor productivity levels between the projected period (2030–40) and the reference period (2000–10)—varies by State. For some States, those climate changes fall within the range of what is historically observed, while for other States they do not, which accounts for regional variation. States in the latter category are projected to experience larger effects. The States experiencing the greatest impacts would include Louisiana and Mississippi in the Delta region; Rhode Island, Delaware, and Connecticut in the Northeast region; Missouri in the Corn Belt region; Florida in the Southeast region; North Dakota in the Northern Plains region; and Oklahoma in the Southern Plains region. This chart appears in the Amber Waves article, “Climate Change Likely to Have Uneven Impacts on Agricultural Productivity,” released August 2019.
Friday, September 30, 2016
In 2010, to help meet water quality goals, the U.S. Environmental Protection Agency (EPA) adopted a limit on the amount of pollutants that the Chesapeake Bay can receive. Nitrogen and phosphorus, in particular, can lead to adverse effects on public health, recreation, and ecosystems when present in excess amounts. The EPA estimates that applications of manure contribute 15 percent of nitrogen and 37 percent of phosphorus loadings to the Bay. Furthermore, ERS estimates that animal feeding operations (AFOs), which raise animals in confinement, account for 88 percent of manure nitrogen and 84 percent of manure phosphorus generation in that watershed. ERS also estimates that about a third of nitrogen and half of phosphorus produced at AFOs can be recovered for later use. That adds up to about 234 million pounds of nitrogen and 106 million pounds of phosphorus recovered. These nutrients can then be redistributed regionally to fertilize agricultural land, thereby lessening nutrient run-off problems in the Bay. The remaining nutrients cannot be recovered. Both nitrogen and phosphorus may be lost during collection, storage, and transportation; nitrogen may also volatilize into the atmosphere. This chart is based on the ERS report Comparing Participation in Nutrient Trading by Livestock Operations to Crop Producers in the Chesapeake Bay Watershed, released in September 2016.
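Working backward from the recovery fractions cited above gives a rough sense of total nutrient generation at AFOs. This is a sketch only, since "about a third" and "half" are approximate fractions, so the implied totals are ballpark figures rather than ERS estimates:

```python
recovered_n = 234   # million pounds of nitrogen recoverable at AFOs
recovered_p = 106   # million pounds of phosphorus recoverable at AFOs

n_recovery_fraction = 1 / 3   # "about a third" of nitrogen (approximate)
p_recovery_fraction = 1 / 2   # "half" of phosphorus (approximate)

implied_total_n = recovered_n / n_recovery_fraction
implied_total_p = recovered_p / p_recovery_fraction
print(f"Implied N generated at AFOs: ~{implied_total_n:,.0f} million pounds")
print(f"Implied P generated at AFOs: ~{implied_total_p:,.0f} million pounds")
```

The implied totals, on the order of 700 million pounds of nitrogen and 200 million pounds of phosphorus, underscore how much manure nutrient loading the watershed's AFOs generate beyond what is recoverable.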
Friday, September 9, 2016
Climate models predict U.S. agriculture will face changes in local patterns of precipitation and temperature over the next century. These climate changes will affect crop yields, crop-water demand, water-supply availability, farmer livelihoods, and consumer welfare. Using projections of temperature and precipitation under nine different scenarios, ERS research projects that climate change will result in a decline in national fieldcrop acreage in 2080 when measured relative to a scenario that assumes continuation of reference climate conditions (precipitation and temperature patterns averaged over 2001-08). Acreage trends show substantial variability across climate change scenarios and regions. When averaged over all climate scenarios, total acreage in the Mountain States, Pacific, and Southern Plains is projected to expand, while acreage in other regions, most notably the Corn Belt and Northern Plains, declines. Over half of all fieldcrop acreage in the U.S. is found in the Corn Belt and Northern Plains, and projected declines in these regions represent 2.1 percent of their combined acreage. Irrigated acreage for all regions is projected to decline, but in some regions increases in dryland acreage offset irrigated acreage losses. The acreage response reflects projected changes in regional irrigation supply as well as differential yield impacts and shifts in relative profitability across crops and production practices under the climate change scenarios. This chart is from the ERS report Climate Change, Water Scarcity, and Adaptation in the U.S. Fieldcrop Sector, November 2015.
Wednesday, May 11, 2016
Agriculture accounted for an estimated 10 percent of U.S. greenhouse gas (GHG) emissions in 2014. In agriculture, crop and livestock activities are important sources of nitrous oxide and methane emissions, notably from fertilizer application, enteric fermentation (a normal digestive process in animals that produces methane), and manure storage and management. GHG emissions from agriculture have increased by approximately 10 percent since 1990. During this time period, total U.S. GHG emissions increased approximately 7 percent. This chart is from the Land and Natural Resources section of ERS’s Ag and Food Statistics: Charting the Essentials data product.
Wednesday, February 17, 2016
ERS research projects that climate change will result in a decline in national fieldcrop acreage over analysis years 2020, 2040, 2060, and 2080, when measured relative to a scenario that assumes continuation of reference climate conditions (precipitation and temperature patterns averaged over 2001-08). Acreage trends are explored for nine climate change scenarios, and substantial variability exists across climate change scenarios and crop sectors. When averaged over all climate scenarios, U.S. acreage in rice, hay, and cotton is projected to expand, while acreage in corn, soybeans, sorghum, wheat, and silage declines. Acreage response varies across crops as a function of the sensitivity of crop yields to changes in precipitation, temperature, and atmospheric carbon dioxide; the resulting changes in relative crop profitability; the coincidence of climatic shifts with geographic patterns of crop production; and variables related to the extent of crop reliance on irrigation. This chart is from the ERS report Climate Change, Water Scarcity, and Adaptation in the U.S. Fieldcrop Sector, November 2015.
Wednesday, January 6, 2016
About 75 percent of irrigated cropland in the United States is located in the 17 westernmost contiguous States, based on USDA’s 2013 Farm and Ranch Irrigation Survey (the most recent available). Between 1984 and 2013, while the amount of irrigated land in the West remained fairly stable (at about 40 million acres) and the amount of water applied was mostly flat (between 70 and 76 million acre-feet per year), the use of more efficient irrigation systems to deliver the water increased. In 1984, 71 percent of Western crop irrigation water was applied using gravity irrigation systems, which tend to use water inefficiently. By 2013, operators used gravity systems to apply just 41 percent of water for crop production, while pressure-sprinkler irrigation systems (including drip, low-pressure sprinkler, or low-energy precision application systems), which can apply water more efficiently, accounted for 59 percent of irrigation water use and about 60 percent of irrigated acres. This chart is found in the ERS topic page on Irrigation & Water Use, updated October 2015.