Data Quality and Accuracy

Evaluation of the First Round of FoodAPS (FoodAPS-1)

Under contract to ERS, Mathematica Policy Research designed and conducted the first National Household Food Acquisition and Purchase Survey (FoodAPS-1), which was fielded as the National Food Study. After the survey was completed, Mathematica prepared a "lessons learned" report that explains the limitations and challenges encountered throughout FoodAPS and provides recommendations for future improvements. ERS also awarded a contract to Westat, a social science research firm, to perform an independent assessment of the quality and accuracy of the survey's data and data collection procedures.

Lessons Learned from Designing and Conducting FoodAPS-1

Upon completion of FoodAPS-1, ERS asked Mathematica to prepare a report explaining the challenges encountered throughout FoodAPS-1 and the "lessons learned" that ERS could incorporate to improve future data collection efforts. The report covers several areas, including the FoodAPS-1 field test; sample design; field staff recruitment and training; telephone staff recruitment and training; food-at-home (FAH) acquisitions; food-away-from-home (FAFH) acquisitions; and household interviews and self-administered forms. The report's Executive Summary reviews the key findings.

ERS has made minor edits to the report to reduce the risk of disclosing confidential information about respondents.

See: Lessons Learned from the Design and Implementation of the National Household Food Acquisition and Purchase Survey (FoodAPS)

Major Findings from Review and Evaluation of FoodAPS-1

After FoodAPS-1 was completed, ERS awarded a contract to Westat to perform an independent assessment of the quality and accuracy of the survey's data and data collection procedures. The assessment was conducted in line with guidance from the Office of Management and Budget (OMB). To ensure the independence of Westat's review, ERS's only role in the preparation of the reports was normal contractual oversight. Additionally, Westat's evaluation reports were peer-reviewed.

The assessment comprises five reports, which focus on: instrument design, response burden, use of incentives, and response rates; sample design; completeness and accuracy; potential for nonresponse bias; and imputation approaches for income and price data. Each report includes an executive summary. The reports may be accessed via the links below:

  1. Instrument Design, Response Burden, Use of Incentives, and Response Rates
  2. Sample Design
  3. Completeness and Accuracy
  4. Potential for Nonresponse Bias
  5. Imputation Approaches for Income and Price Data

ERS has determined that the major findings from the entire evaluation are as follows:

  • FoodAPS-1’s reliance on paper-and-pencil recording of data reduced data accuracy and increased the need to process collected data prior to conducting analyses.
  • Improvements could be made in framing income questions and questions about SNAP participation.
  • There is some evidence of underreporting of food acquisitions in FoodAPS-1. Larger households (more than 4 persons) reported fewer food acquisitions on a per-person basis than did smaller households. In addition, the percentage of households that did not report any daily expenditure on food increased over the course of the data collection week, suggesting respondent fatigue. Several sociodemographic factors, including race, ethnicity, and education, are related to the likelihood of reporting a food acquisition. In addition, household members with a weaker connection to the household, such as nonrelatives of the primary respondent, were less likely to report their food acquisitions.
  • The overall response rate for FoodAPS-1 was relatively low (41.5 percent), raising concern about potential nonresponse bias, especially because FoodAPS-1 respondents differed significantly from nonrespondents on several socioeconomic characteristics. Adjustments to the sample weights reduced the risk of bias, however, and the overall analysis did not indicate that nonresponse bias is a concern.
  • There appear to be relatively few income outliers in the data. These outliers, however, can affect modeling results, especially if they occur in households with large sample weights.
  • The effort to confirm SNAP participation through matching to administrative data does not appear to affect analyses of SNAP effects. In several analyses, the effect of using a measure of SNAP participation based on self-reports was compared with using a participation measure based on data matching; no significant differences in results were found. The inability to obtain administrative records from all States, however, may be partly responsible for these results.
  • The missing rates for income variables ranged from 1.5 percent to 4 percent, with monthly earnings having the highest missing rate. The iterative imputation approach used to fill in missing income data was implemented reasonably well, although some improvements could have been made to the underlying models. The approach for imputing missing price information was also appropriate, but because it relied on other FoodAPS-1 data, it could not impute all missing food prices. Other methods could have been used to impute the remaining missing prices.
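For readers unfamiliar with iterative imputation, the general idea is to fill each variable's missing values using models fit on the other variables, cycling through the variables until the filled-in values stabilize. The sketch below illustrates this chained-equations-style approach on a NumPy array with NaN marking missing entries; the function name and the use of simple linear regression are illustrative assumptions, not the actual models used for FoodAPS-1 income data.

```python
import numpy as np

def iterative_impute(X, n_iter=10):
    """Illustrative chained-equations-style imputation.

    Missing entries (NaN) in each column are initialized with the
    column mean, then repeatedly re-predicted by regressing that
    column on all the other columns. Observed values are never changed.
    """
    X = X.astype(float).copy()
    mask = np.isnan(X)

    # Step 1: initialize missing entries with column means.
    col_means = np.nanmean(X, axis=0)
    for j in range(X.shape[1]):
        X[mask[:, j], j] = col_means[j]

    # Step 2: cycle through columns, refitting and re-imputing.
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            if not mask[:, j].any():
                continue  # nothing missing in this column
            others = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(X.shape[0]), others])
            obs = ~mask[:, j]
            # Fit on observed rows only, then predict the missing rows.
            beta, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
            X[mask[:, j], j] = A[mask[:, j]] @ beta
    return X
```

In a production survey setting, the regression step would typically be replaced with richer models and the procedure repeated across multiple imputed datasets to reflect imputation uncertainty.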

ERS researchers are using the results of this third-party evaluation in framing requirements for the design and characteristics of the second round, FoodAPS-2.