ERS Charts of Note

Per-acre water use on irrigated farmland shows a declining trend

Wednesday, January 19, 2022

The importance of irrigation for the U.S. agricultural sector has evolved significantly over the past century. Irrigated acreage in the country grew from fewer than 3 million acres in 1890 to more than 58 million acres in 2017. The expansion of irrigated acreage during this period reflects Federal, State, and local investment in irrigation infrastructure to deliver surface water to farms and ranches, as well as advancements in well drilling and pumping technologies that have facilitated growth in groundwater-based irrigated agriculture. Since 1969, the amount of water used per irrigated acre has decreased substantially: average water use exceeded 2 acre-feet per irrigated acre (1 acre-foot = 325,851 gallons) in 1969 but declined to nearly 1.5 acre-feet by 2018. The adoption of more efficient water application technologies, including the transition from gravity-based to pressurized irrigation systems, has driven this reduction. This chart was drawn from the USDA, Economic Research Service report “Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity,” published December 2021.
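
As a back-of-the-envelope illustration, the short Python sketch below converts that per-acre decline into gallons using the approximate endpoints cited above; the inputs are rounded figures from the text, not survey microdata.

```python
# Back-of-the-envelope: the per-acre decline in gallons, using the
# approximate endpoints cited above.
GALLONS_PER_ACRE_FOOT = 325_851

use_1969 = 2.0  # acre-feet applied per irrigated acre, 1969 (approx.)
use_2018 = 1.5  # acre-feet applied per irrigated acre, 2018 (approx.)

savings_af = use_1969 - use_2018
print(f"Reduction: {savings_af:.1f} acre-feet per acre "
      f"(~{savings_af * GALLONS_PER_ACRE_FOOT:,.0f} gallons)")
# Reduction: 0.5 acre-feet per acre (~162,926 gallons)
```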

The distribution of U.S. irrigated acreage has shifted eastward since the mid-20th century

Tuesday, January 4, 2022

The regional distribution of U.S. irrigated acreage changed significantly from 1949 to 2017. Trends in irrigated cropping patterns, technological advances, water availability, and changing growing-season weather drove this evolution. The arid Mountain and Pacific regions consistently irrigated the most farmland until 2007, when irrigated acreage in the Northern Plains region surpassed acreage in the Pacific region. Irrigated acreage in the Mountain and Pacific regions remained relatively constant over the nearly 70-year period, despite increasingly limited opportunities for additional water development and increasing competition for water from non-agricultural sectors. The Northern Plains region experienced the most substantial increase in irrigated acreage, expanding from less than 2 million acres in 1949 to nearly 12 million acres in 2017. The expansion of irrigated acreage in the Northern Plains is related to advances in groundwater pumping technologies, the diffusion of center pivot irrigation application systems, and the region’s abundant aquifer resources. The Southern Plains region experienced similar growth in irrigation until the 1980s, when dwindling groundwater supplies resulted in irrigated acreage declines. The Mississippi Delta and Southeast regions also have expanded irrigated acreage since 1949, reflecting, in part, changing cropping patterns, abundant aquifer water supplies, and producer responsiveness to changing precipitation levels during growing seasons. This chart was drawn from the USDA, Economic Research Service report “Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity,” published December 2021.

Irrigation delivery organizations use a variety of methods to calculate on-farm water use

Monday, December 13, 2021

Irrigation organizations use a variety of methods to calculate on-farm water use so they can accurately track water within their delivery systems. The calculation methods used partially determine how organizations can price water deliveries. For example, implementing volumetric water pricing is difficult unless organizations can directly meter on-farm water use. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, about 44 percent of irrigation water delivery organizations use direct metering to calculate on-farm water use, and about 42 percent of organizations use time-of-use estimation to determine water deliveries. The time-of-use method estimates the volume of water delivered based on the duration of deliveries and the characteristics of the conveyance infrastructure. About 17 percent of organizations calculate water deliveries based on self-reporting from irrigated farms and ranches. Because many organizations use more than one method, these shares sum to more than 100 percent. This chart was drawn from the USDA, Economic Research Service report “Irrigation Organizations: Water Storage and Delivery Infrastructure,” published October 2021.
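
As an illustration of the time-of-use approach, the sketch below estimates a delivered volume from an assumed conveyance flow rate and delivery duration. The survey does not prescribe a particular formula, so the function and its inputs here are hypothetical simplifications.

```python
# A simplified time-of-use estimate: volume = assumed flow rate x duration.
# The flow rate and hours below are hypothetical.
CUBIC_FEET_PER_ACRE_FOOT = 43_560  # 1 acre-foot = 43,560 cubic feet

def time_of_use_estimate(flow_rate_cfs: float, hours: float) -> float:
    """Delivered volume in acre-feet from a conveyance's rated flow
    (cubic feet per second) and the duration of the delivery."""
    return flow_rate_cfs * hours * 3_600 / CUBIC_FEET_PER_ACRE_FOOT

# Example: a lateral rated at 10 cfs running for 24 hours.
print(f"{time_of_use_estimate(10, 24):.1f} acre-feet")  # 19.8 acre-feet
```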

Large irrigation organizations are most likely to own water storage infrastructure

Monday, November 29, 2021

Water storage infrastructure includes dams and reservoirs that provide a way to store water across seasons and years to meet the demands of irrigators. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, less than 20 percent of water delivery organizations own and manage their own water storage reservoirs. The remaining water delivery organizations rely on natural streamflow or on storage infrastructure owned by State or Federal agencies or other irrigation organizations. Large irrigation organizations, defined as those serving more than 10,000 irrigable acres, are the most likely to own water storage infrastructure: almost 37 percent of large organizations have at least one water storage reservoir, compared with 21 percent of medium organizations and 10 percent of small organizations. Storage infrastructure is particularly important in snowpack-dependent basins where the timing of spring runoff does not align with peak irrigation water demand. The role of water storage infrastructure will become increasingly critical as snowpack decreases, snowmelt runoff shifts to earlier in the growing season, and water demand increases. This chart can be found in the USDA, Economic Research Service report “Irrigation Organizations—Water Storage and Delivery Infrastructure,” published October 19, 2021.

Cost cited as major reason for not lining canals that carry water to farms, ranches in 2019

Wednesday, November 3, 2021

Irrigation organizations that deliver water to farms and ranches use main and lateral canals, tunnels, and pipelines to transport water from natural waterways, reservoirs, or other infrastructure to irrigated operations. Transporting water to farms and ranches can result in conveyance losses, or water that becomes unavailable for irrigation because of evaporation or seepage. Lining water canals with quasi-impermeable materials, such as concrete or plastic membranes, can reduce conveyance losses because less water is lost to seepage. However, the cost of lining canals may be prohibitively high for many irrigation organizations. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, almost 76 percent of water delivery organizations cite expense as a reason for leaving conveyance infrastructure unlined. In some scenarios, lining canals may not be feasible or warranted. For example, unlined canals may beneficially recharge aquifers, or soil and geologic attributes may already minimize seepage losses. A smaller percentage of organizations cite these conditions as reasons for not lining main and lateral canals. This chart can be found in the USDA, Economic Research Service report “Irrigation Organizations—Water Storage and Delivery Infrastructure,” published October 19, 2021.

Local residents living in oil-dependent counties experienced long-term effects following the oil boom and bust of the 1980s

Monday, October 18, 2021

Errata: On October 22, 2021, the map presented in this Chart of Note was revised to show the correct number of counties in the contiguous United States.

Focusing on the rapid rise and decline of oil production in the 1970s and 1980s, researchers at USDA’s Economic Research Service (ERS), the University of Oregon, and the University of Wisconsin-Madison studied the cumulative effects of oil booms (and subsequent busts) on households living in counties most dependent on oil extraction. The authors identified individuals living in “boom counties” in 1980, defined as those with greater than 2.5 percent employment in oil and natural gas extraction. On average, the incomes of boom households increased by $5,000 annually during the early years of the 1975-1979 oil boom and by $6,900 per year during the later boom of 1980-1984, compared with similar households in counties that were not producing oil. The subsequent bust, however, reduced household incomes by more than $8,000 annually, on average, from 1985 to 1992. These losses were driven in part by increased unemployment and the dissipation of relative wage gains made during the boom. The earlier oil boom and bust appeared to have no effect on household income after 1993. The average household in a boom county saw cumulative income losses of $7,600 between 1969 and 2012, the final year of the study, compared with households in non-boom counties. These income losses were experienced entirely by workers in their prime working age of 25-54. Boom county household heads older than 54 were also about 15 percent less likely to retire from 1989 to 1992, compared with non-boom household heads. To estimate the effects of booms and busts on employment, the researchers used annual household-level survey data from the Panel Study of Income Dynamics. This chart appears in the Amber Waves finding “Oil Booms Can Reduce Lifetime Earnings and Delay Retirement,” published October 2021.

Rates of cover crop adoption vary depending on the cash crop being planted

Friday, October 1, 2021

Farmers typically add cover crops to a rotation between two commodity or forage crops to provide seasonal living soil cover. According to data from USDA’s Agricultural Resource Management Surveys, the level of cover crop adoption varies according to the primary commodity. In the fall preceding the survey year, farmers adopted cover crops on 5 percent of corn-for-grain (2016), 8 percent of soybean (2018), 13 percent of cotton (2015), and 25 percent of corn-for-silage (2016) acreage. The adoption rate was lowest for winter wheat in its survey year (2017). This reflects the fact that farmers typically plant cover crops at about the same time as winter wheat in the fall, making it difficult to grow both winter wheat and a fall-planted cover crop in the same crop year. In contrast, the rate of cover crop adoption was highest on corn-for-silage fields in the 2016 survey. Because corn silage is used exclusively for feeding livestock, farmers planting corn-for-silage may also grow cover crops for their forage value. Corn-for-silage also affords a longer planting window for cover crops than corn planted for grain because of an earlier harvest. In addition, harvesting corn-for-silage removes both the grain and the stalks of the corn plant, leaving little plant residue on the field, and cover crops can help address the resulting soil health and erosion concerns. This chart appears in the ERS report Cover Crop Trends, Programs, and Practices in the United States, released in February 2021.

Use of pressurized irrigation systems in the western United States roughly doubled from 1984 to 2018

Monday, September 13, 2021

There are two broad methods of applying irrigation water to crops: gravity and pressurized systems. Gravity irrigation systems use on-field furrows, basins, or poly-pipe to advance water across the field surface by gravity alone. Pressurized systems apply water under pressure through pipes or other tubing directly to crops (e.g., sprinkler and micro/drip irrigation systems). Under many field conditions, pressurized irrigation systems use water more efficiently than gravity systems because less water is lost to evaporation, deep percolation, and field runoff. Between 1984 and 2018, the number of acres irrigated using pressurized systems in the 17 Western States roughly doubled, while the acreage irrigated using gravity systems declined substantially. In 2018, 72 percent of all irrigated cropland acres in the 17 Western States (28.96 million acres out of 40.31 million acres of total irrigated area) used pressurized irrigation systems, up from 37 percent in 1984. This chart appears in the USDA, Economic Research Service topic page for Irrigation & Water Use, updated August 2021.
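
The 2018 share can be recomputed directly from the acreages cited above; a short check confirms the rounding:

```python
# Recomputing the 2018 pressurized share from the acreages cited above.
pressurized = 28.96       # million acres under pressurized systems
total_irrigated = 40.31   # million irrigated acres, 17 Western States
print(f"Pressurized share, 2018: {pressurized / total_irrigated:.0%}")
# Pressurized share, 2018: 72%
```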

Fields with cover crops more likely to be in a conservation crop rotation, especially in soybean and corn fields

Friday, August 20, 2021

A conservation crop rotation involves a sequence of crops grown on the same ground over a period of time for conservation purposes, such as soil erosion control, soil health, and increased crop diversity. To meet the conservation practice standard for a conservation crop rotation as determined by USDA, Natural Resources Conservation Service (NRCS), a given field must include crops, such as many small grains, that generate greater residue (crop materials such as stalks, stems, or leaves that are left in the field after the crop has been harvested) and meet crop diversity requirements across years. Cropping systems that include cover crops are more likely to meet the standard. Cover crops are typically added to a crop rotation between two commodity or forage crops to provide living, seasonal soil cover. For corn, 70 percent of acres with cover crops in 2016 were in fields that met the criteria for a conservation crop rotation, compared to 26 percent of acres without a cover crop that also met the criteria. For cotton in 2015, 34 percent of acres that used a cover crop were in a conservation crop rotation, compared to only 4 percent of acres without a cover crop that met conservation crop rotation criteria. For soybeans in 2018, 94 percent of acres that used a cover crop met conservation crop rotation criteria, compared to only 13 percent of acres without a cover crop that also met those criteria. The association between cover crops and the use of conservation rotations in corn and cotton is more limited than for soybeans because corn and cotton fields may not include a legume or other crop with low-nitrogen fertilizer demands. This chart appears in the ERS report Cover Crop Trends, Programs, and Practices in the United States, released February 2021.

Rye and winter wheat were the most common cover crops on corn, soybean, and cotton fields

Friday, July 9, 2021

Cover crops—which farmers add to a crop rotation in between the planting of two crops—provide living, seasonal soil cover with a variety of benefits, such as increased soil moisture capacity, weed suppression, and reduced nutrient runoff. Researchers from USDA, Economic Research Service (ERS) reported which cover crops were grown in the fall before planting corn, cotton, and soybeans. For corn fields intended for use as grain or silage (the harvesting of the entire plant for forage) in 2016, more than 90 percent of acres with cover crops used a grass or small grain cover crop, such as rye, winter wheat, or oats. Rye, planted on 63 percent of corn-for-grain acreage with cover crops, was more than twice as common as winter wheat (26 percent). Rye and winter wheat were also the most common cover crops on soybean fields in 2018. Winter wheat was the most common cover crop used on cotton fields in 2015, likely reflecting the role of wheat stubble in protecting cotton seedlings from wind and the potentially negative impact of certain chemicals produced by cereal rye on growing cotton plants. This chart appears in the ERS report Cover Crop Trends, Programs, and Practices in the United States, released in February 2021. It also appears in the July 2021 Amber Waves finding Grass Cover Crops, Such as Rye and Winter Wheat, Were the Most Common Cover Crops Used Before Planting Corn, Soybeans, and Cotton.

Cover crop use is more persistent on cotton and corn silage fields

Thursday, April 22, 2021

The use of cover crops on U.S. cropland increased 50 percent between 2012 and 2017, according to data in the U.S. Census of Agriculture. Cover crops—such as unharvested cereal rye, oats, winter wheat, and clover—are typically added to a crop rotation during the period between two commodity or forage crops. Persistent year-after-year adoption of cover crops (defined as 3 or 4 years of adoption within a 4-year crop rotation) can increase the accumulation of soil organic matter and provide a living, seasonal coverage of soil. Together, those two outcomes benefit farmers and the ecosystem. For example, healthier soils with consistent living cover can reduce the runoff of sediments and nutrients into waterways, increase soil moisture capacity, and sequester carbon. Among fields that adopted a cover crop in at least 1 year of the rotation, persistent cover crop use occurred on 69 percent of cotton acres (2015), 56 percent of corn-for-silage acres (2016), 19 percent of corn-for-grain acres (2016), and 32 percent of soybean acres (2018). Nationally, cover crop acreage has increased over time as conservation programs have promoted cover crop adoption through research, technical assistance, and financial assistance. Many of the fields with only 1 or 2 years of cover crops are those that started planting in the third or fourth year surveyed, suggesting that they may be new adopters. This chart appears in the Economic Research Service report Cover Crop Trends, Programs, and Practices in the United States, released February 2021, and in the March 2021 Amber Waves article, Persistent Cover Crop Adoption Varies by Primary Commodity Crop.
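
The persistence definition above lends itself to a simple classification rule. The sketch below applies it to a few hypothetical field histories; the field names and records are illustrative only, not survey data.

```python
# The persistence rule described above: 3 or 4 adoption years within a
# 4-year rotation window. Field histories below are hypothetical.
def classify(history: list[bool]) -> str:
    assert len(history) == 4, "expects a 4-year window"
    years = sum(history)
    if years >= 3:
        return "persistent adopter"
    if years >= 1:
        return "occasional adopter (possibly a new adopter)"
    return "non-adopter"

fields = {
    "field A": [True, True, True, False],
    "field B": [False, False, False, True],  # started in year 4
    "field C": [False, False, False, False],
}
for name, history in fields.items():
    print(f"{name}: {classify(history)}")
```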

Irrigation delivery organizations released water from systems for a variety of users, purposes

Thursday, February 18, 2021

According to USDA’s 2019 Survey of Irrigation Organizations, irrigation delivery organizations such as irrigation districts and ditch companies supplied an estimated 41.4 million acre-feet of off-farm water to U.S. farms and ranches in 2019. These organizations also delivered water to other customers: 2.3 million acre-feet to domestic users, 1.5 million acre-feet to industrial users, and 1.5 million acre-feet to other irrigation organizations. In addition, organizations intentionally released water from their systems for other purposes, including 3.1 million acre-feet for downstream users, 1.2 million acre-feet for managed groundwater recharge, and 1.0 million acre-feet to meet environmental requirements. Beyond these intentional deliveries and releases, a total of 10.7 million acre-feet of water left organization systems as conveyance losses, which represents water lost to groundwater seepage or evaporation during transport or storage. This implies an average conveyance loss rate of 16 percent. Because conveyance losses are the second largest outflow from water delivery systems, reducing them is an important focus of water conservation efforts. However, hydrologic systems are complex, and conveyance losses in many cases provide benefits elsewhere in the environment. For example, conveyance losses may provide unmanaged groundwater recharge or indirect flows into surface water systems that can support wildlife habitat. This chart is based on data found in USDA’s Survey of Irrigation Organizations, updated December 17, 2020.
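
The loss rate can be roughly checked by summing the volumes cited above. The survey's exact accounting likely uses a slightly different denominator, so this back-of-the-envelope figure only approximates the reported 16 percent:

```python
# Summing the 2019 outflows cited above (million acre-feet). The survey's
# exact accounting may use a slightly different denominator, so this only
# approximates the reported 16 percent loss rate.
outflows = {
    "farm and ranch deliveries": 41.4,
    "domestic users": 2.3,
    "industrial users": 1.5,
    "other irrigation organizations": 1.5,
    "downstream users": 3.1,
    "managed groundwater recharge": 1.2,
    "environmental requirements": 1.0,
    "conveyance losses": 10.7,
}
total = sum(outflows.values())
print(f"Total listed outflows: {total:.1f} million acre-feet")  # 62.7
print(f"Loss share: {outflows['conveyance losses'] / total:.0%}")
# -> 17% of listed outflows; against the ~70 million acre-feet of
# total supplies, ~15%. The reported 16 percent sits in between.
```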

Irrigation delivery organizations acquired most water from Federal water projects and natural water bodies

Monday, February 8, 2021

USDA’s 2019 Survey of Irrigation Organizations identified 2,543 irrigation organizations that delivered off-farm water directly to U.S. farms and ranches, including irrigation districts, ditch companies, acequias, and similar entities. Water is measured in “acre-feet,” or the amount of water needed to cover one acre of land under a foot of water. Irrigation delivery organizations obtained their water supplies, which totaled more than 70 million acre-feet, from a variety of sources. About 29 million acre-feet came from Federal water projects, which are large water storage and distribution systems built and maintained by the Bureau of Reclamation, the Army Corps of Engineers, and the Bureau of Indian Affairs. Irrigation organizations diverted an additional 22 million acre-feet directly from natural water bodies, such as rivers, streams, lakes, and ponds. The next largest sources were State water projects and private or local water projects, which together delivered a combined 14 million acre-feet of water to organizations in 2019. Other water sources included water from other reservoirs, often owned by the organizations themselves (2 million acre-feet); water purchased or contracted from other suppliers (2 million acre-feet); groundwater pumped from well fields into water conveyance infrastructure (1 million acre-feet); water obtained directly from municipal and industrial suppliers (0.5 million acre-feet); and water captured from agricultural drainage systems (0.3 million acre-feet). This chart is based on data found in USDA’s Survey of Irrigation Organizations, updated December 17, 2020.
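
Summing the cited sources confirms the total of more than 70 million acre-feet:

```python
# Summing the 2019 water sources cited above (million acre-feet).
sources = {
    "Federal water projects": 29.0,
    "natural water bodies": 22.0,
    "State and private/local projects": 14.0,
    "other reservoirs": 2.0,
    "purchases from other suppliers": 2.0,
    "groundwater pumped into conveyance": 1.0,
    "municipal and industrial suppliers": 0.5,
    "agricultural drainage capture": 0.3,
}
print(f"Total: {sum(sources.values()):.1f} million acre-feet")  # Total: 70.8
```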

U.S. irrigation organizations performed a variety of water management functions

Friday, January 15, 2021

The 2019 Survey of Irrigation Organizations (SIO), jointly conducted by USDA’s Economic Research Service and National Agricultural Statistics Service, collected information about different types of organizations involved in the local management of water supplies for irrigated farms and ranches. Irrigation organizations directly influence on-farm water use through delivery of irrigation supplies and management of groundwater withdrawals. According to the survey’s data, in 2019, there were an estimated 2,677 irrigation organizations in the 24 States where most U.S. irrigation occurred. About 95 percent of these organizations—such as irrigation districts and ditch companies—had a primary function of delivering water directly to farms, typically through a system of irrigation storage facilities, canals, pipelines, acequias, and ditches. About 27 percent of organizations were involved in at least some aspect of groundwater management as a primary function, with 23 percent of organizations engaging in both water delivery and groundwater management. Groundwater management may include monitoring aquifer conditions, collecting pumping data, charging pumping fees, issuing permits for new wells, or overseeing aquifer recharge efforts. Some irrigation organizations perform secondary functions, such as delivering water to municipal and residential users (14 percent of organizations); managing agricultural water drainage (11 percent); and generating electricity (3 percent). This chart is based on data found in USDA’s Survey of Irrigation Organizations, updated December 17, 2020.

Climatic trends dampened productivity growth on Wisconsin dairy farms

Monday, November 16, 2020

In 2019, Wisconsin’s production of fluid milk was second only to California’s. According to data from USDA’s National Agricultural Statistics Service, Wisconsin generated 30.6 billion pounds of milk that year, with milk sales totaling $5 billion. In recent years, Wisconsin dairy farms have been exposed to substantial weather volatility characterized by frequent droughts, storms, and temperature extremes (both hot and cold), resulting in considerable fluctuations in dairy productivity. Researchers from the Economic Research Service (ERS), among others, found that total factor productivity (TFP), which measures the rate of growth in total output (aggregate milk produced) relative to the rate of growth in total inputs (such as the number of cows, farm labor, feed, and machinery), increased at an average annual rate of 2.16 percent for Wisconsin dairy farms between 1996 and 2012. This growth was driven primarily by technological progress, which contributed 1.91 percentage points annually through improvements such as better herd genetics, advanced feed formulations, and upgraded milking and feed handling equipment. However, trends in rainfall and temperature variation reduced the productivity growth of Wisconsin dairy farms by 0.32 percentage points annually during the same period. For example, an average temperature increase of 1.5 degrees Fahrenheit reduced milk output for the average Wisconsin dairy farm by 20.1 metric tons per year, equivalent to reducing the herd size of the average farm by 1.6 cows every year. This chart appears in ERS’s October 2020 Amber Waves finding, “Climatic Trends Dampened Recent Productivity Growth on Wisconsin Dairy Farms.”
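
For readers unfamiliar with TFP, growth is conventionally computed as output growth minus input growth (in log differences), and the aggregate rate can be decomposed into components such as technical change and climate effects. The sketch below illustrates this with hypothetical index values alongside the figures cited above; the residual term is an inference covering components (such as scale and efficiency effects) not itemized in the text.

```python
import math

# TFP growth is output growth minus input growth (log-difference form).
def tfp_growth(output_0, output_1, input_0, input_1):
    return math.log(output_1 / output_0) - math.log(input_1 / input_0)

# Hypothetical one-year indexes: milk output up 4.0%, aggregate inputs
# up 1.8% -> TFP growth of roughly 2.1%, near the 2.16% reported average.
print(f"TFP growth: {tfp_growth(100, 104.0, 100, 101.8):.2%}")

# Decomposition implied by the figures cited above (percentage points/year).
total_growth = 2.16
technical_change = 1.91
climate_effect = -0.32
other = total_growth - technical_change - climate_effect  # 0.57
print(f"Components not itemized in the text: {other:.2f} points/year")
```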

USDA’s Conservation Reserve Enhancement Program in Kansas largely attracts participants with higher rates of groundwater depletion

Wednesday, October 21, 2020

Agriculture in the semi-arid region overlying the High Plains Aquifer, which spans parts of eight States, relies on groundwater. In several areas, significantly more groundwater is extracted than is returned to the aquifer each year, leading to declining water levels. In Kansas, USDA’s Conservation Reserve Enhancement Program (CREP) specifically focuses on retiring irrigated cropland to reduce stress on limited water resources. To estimate the amount of water that retired rights would have used in the absence of CREP (in effect, the amount of water use reduced by the program), ERS researchers matched the 98 enrolled farmers to a comparison group of 98 similar unenrolled farmers, based on factors such as farm size, crops grown, and soil quality. Trends among the matched unenrolled farmers are largely representative of the average unenrolled farmer in the Western District, where most enrollments have occurred and where aquifer depletion has been most severe. From 1996 to 2017, matched unenrolled farmers decreased their water use by 0.94 percent a year relative to 1996 levels, compared with 0.64 percent a year for the average unenrolled farmer in the Western District. Furthermore, although the matched unenrolled farmers initially experienced more rapid depletion, declines in saturated thickness have been very similar for the two groups since 2008. This chart appears in the October 2020 Amber Waves feature, “Incentives to Retire Water Rights Have Reduced Stress on the High Plains Aquifer.”
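
One common way to build such a comparison group is nearest-neighbor matching on standardized covariates. The sketch below is a generic illustration with randomly generated placeholder data; it is not the study's actual estimator or matching protocol.

```python
import numpy as np

# A generic sketch of nearest-neighbor matching on standardized covariates
# (e.g., farm size, crops grown, soil quality). Data are randomly generated
# placeholders; the study's actual matching procedure may differ.
rng = np.random.default_rng(0)
enrolled = rng.normal(size=(98, 3))     # covariates for 98 CREP enrollees
unenrolled = rng.normal(size=(500, 3))  # pool of unenrolled farmers

# Standardize so no single covariate dominates the distance metric.
pooled = np.vstack([enrolled, unenrolled])
mean, std = pooled.mean(axis=0), pooled.std(axis=0)
z_enr = (enrolled - mean) / std
z_une = (unenrolled - mean) / std

# For each enrollee, select the closest unenrolled farmer
# (with replacement, for simplicity).
dists = np.linalg.norm(z_enr[:, None, :] - z_une[None, :, :], axis=2)
matches = dists.argmin(axis=1)
print("First 10 matched comparison indices:", matches[:10])
```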

Conservation spending remained roughly level in recent years, a trend projected to continue through 2023

Monday, October 5, 2020

USDA’s voluntary conservation programs form the backbone of U.S. agricultural conservation policy. These programs include the Conservation Reserve Program, Agricultural Conservation Easement Program, Environmental Quality Incentives Program, Conservation Stewardship Program, Regional Conservation Partnership Program, and Conservation Technical Assistance. The programs help agricultural producers improve their environmental performance related to soil health, water quality, air quality, wildlife habitat, and greenhouse gas emissions. Between 1996 and 2011, real (inflation-adjusted) conservation spending grew by roughly 50 percent, largely due to the expansion of the major working lands programs. Since 2011, annual spending has remained between $6.0 billion and $6.5 billion (except in 2015) and is projected to remain within that range between 2019 and 2023. Under the Agriculture Improvement Act of 2018 (also known as the 2018 Farm Act), the Congressional Budget Office (CBO) estimates mandatory conservation spending of $29.5 billion over 5 years. This is about $560 million more than CBO projected for 2019-23 spending under an extension of the programs and provisions of the 2014 Farm Act. Although most conservation programs receive “mandatory” funding, the funding levels are not guaranteed and could be revised in future years. This chart appears in the ERS topic page for Conservation Programs, updated September 2019.

Most land exiting USDA’s Conservation Reserve Program was used for annual crop production

Monday, September 14, 2020

Errata: On October 30, 2020, the Chart of Note was revised to correct shares of land exiting the Conservation Reserve Program (CRP) by land use category. Land used for crop land was corrected to 79 percent. Land used for trees was corrected to 6 percent. No other values were affected.

Between 2013 and 2016, contracts for about 7.6 million acres of land enrolled in USDA’s Conservation Reserve Program (CRP) expired. About 2.76 million acres of expiring land reenrolled in the CRP. Of the almost 4.89 million acres that exited the program during the period, 57 percent transitioned to annual crop production. At least half of the exiting CRP land transitioned to annual crop production in each of the four years. The most common annual crops grown on expired CRP land were soybeans (21 percent of the exiting CRP land that went into annual crop production), corn (16 percent), and wheat (16 percent). Perennial forage (such as alfalfa) and specialty crop (such as pecans) production accounted for 12 and 11 percent, respectively. Taken together, 79 percent of former CRP land was put to some type of crop production (annual, perennial forage, or perennial specialty) after exiting the program. The remaining exiting land was most often used as grass cover (14 percent) or tree cover (6 percent). Post-CRP acreage under grass cover may be used as pastureland or represent acres that are untouched after expiring from a grassland practice in CRP. This chart appears in the December 2019 ERS report, The Fate of Land in Expiring Conservation Reserve Program Contracts, 2013-2016.
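
The acreage flows cited above can be reconciled with a few lines of arithmetic; the small gaps reflect rounding in the published figures:

```python
# Reconciling the CRP acreage flows cited above (million acres, 2013-16).
expired = 7.6
reenrolled = 2.76
exited = expired - reenrolled
print(f"Exited: {exited:.2f} million acres")  # 4.84; reported as ~4.89

# Shares of exiting land by subsequent use (percent).
uses = {"annual crops": 57, "perennial forage": 12,
        "perennial specialty crops": 11, "grass cover": 14, "trees": 6}
crop_share = (uses["annual crops"] + uses["perennial forage"]
              + uses["perennial specialty crops"])
print(f"Any crop production: {crop_share} percent")
# -> 80 from the rounded shares; the report gives 79 percent
#    from unrounded data.
```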

A large share of land enrolled in USDA’s Conservation Reserve Program was located in the Plains, from Texas to Montana

Tuesday, May 26, 2020

USDA’s Conservation Reserve Program (CRP) covered about 22.3 million acres of environmentally sensitive land at the end of fiscal 2019. With an annual budget of roughly $1.8 billion, CRP was USDA’s largest single conservation program in terms of spending that year. CRP enrollees receive annual rental and other incentive payments for taking eligible land out of production for 10 years or more. Voluntary retirement of cropland under CRP provides numerous environmental benefits related to soil erosion, water quality, wildlife habitat provision, and other environmental services. As of January 2020, total CRP enrollment was 21.9 million acres, with a large share of that land located in the Plains (from Texas to Montana), where rainfall is limited and much of the land is subject to potentially severe wind erosion. Smaller concentrations of CRP land were found in eastern Washington, southern Iowa, northern Missouri, and the Mississippi Delta. Approximately 5.4 million acres will expire in September 2020. CRP General Signup 54, which concluded in February 2020, accepted more than 3.8 million acres, with enrollment beginning in October 2020. Acreage continues to be accepted under continuous signup. This chart updates data found in the Economic Research Service data product, Ag and Food Statistics: Charting the Essentials, updated March 2020.

Agriculture contributed 10.5 percent of U.S. greenhouse gas emissions in 2018

Wednesday, April 22, 2020

The U.S. Environmental Protection Agency estimated that agriculture and forestry together accounted for 10.5 percent of U.S. greenhouse gas emissions in 2018. This includes carbon dioxide (CO2) emissions associated with agricultural electricity consumption. The greenhouse gases with the largest contributions to rising temperature are CO2, methane (CH4), and nitrous oxide (N2O). Globally, CO2 emissions are the largest contributor to climate change. However, the emissions profile for agriculture differs from that of the economy as a whole. U.S. agriculture emitted 698 million metric tons of carbon-dioxide equivalent in 2018: 12.3 percent as carbon dioxide, 36.2 percent as methane, and 51.3 percent as nitrous oxide. Increases in carbon storage (sinks) offset 11.6 percent of total U.S. greenhouse gas emissions in 2018. Carbon sinks include forest management to increase carbon in forests, increases in tree carbon stocks in settlements, conversion of agricultural to forest land (afforestation), and crop management practices that increase carbon in agricultural soils. This chart updates data that appears in the Economic Research Service data product Ag and Food Statistics: Charting the Essentials.
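
Applying the cited shares to the 698-million-metric-ton total yields the approximate quantities by gas:

```python
# Converting the 2018 shares cited above into quantities
# (million metric tons of CO2 equivalent, MMT CO2e).
total = 698
shares = {"carbon dioxide": 0.123, "methane": 0.362, "nitrous oxide": 0.513}
for gas, share in shares.items():
    print(f"{gas}: {total * share:.0f} MMT CO2e")
# carbon dioxide: 86, methane: 253, nitrous oxide: 358
# (shares sum to 99.8 percent due to rounding)
```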