Quality Data for Monitoring and Evaluation

Data is a key component of Monitoring and Evaluation (M&E) work. On one level, M&E planning involves identifying information needs and defining measures and indicators. On another level, evaluators must determine what data are available, what gaps exist, and how those gaps can be filled, and identify the sources the information will come from. As Bamberger, Rugh, and Mabry (2012) discuss, in the face of budget, political, data, or time constraints, the data acquisition process can easily and understandably prioritize data availability and quantity while paying less attention to data quality assessment.

However, data quality assessment matters because the quality of the data affects the quality of the decisions made from the findings. Spending resources to gather data of verified quality, rather than of unchecked quality, is an efficient use of limited resources. Good quality data is relevant, accurate, reliable, timely, punctual, accessible, clear, coherent, and comparable. Being aware of the weaknesses of your data also helps you factor those shortfalls into your analysis and conclusions.

When data collection is required, note that data quality is set when the dataset is created. It is helpful to be conscious of data quality from the outset and to integrate quality control into the data collection and acquisition process to the extent possible. If hiring a data collection firm, it helps to agree up front on how data quality will be assessed and on a plan of action if the agreed standards are not met. A variety of factors can affect data quality, including budgeting, timing, survey design, questionnaire design, the data collection mode, the interviewer, and the respondent.

Budgeting and timing: Ensure that the budget carefully and correctly captures the value of each item without underestimating it. Be conscious of possible fluctuations in the price of materials between drafting the budget and purchasing them; failing to account for these could seriously endanger the success of the task. Also budget for incidental costs. Commonly overlooked or undervalued costs include training of field staff, data analysis, data archiving, and dissemination.

It is important to have a good understanding, before the budget is prepared, of the size of the sample, the period during which data will be collected, and the length of the interview. Only then can the number of field teams be estimated. These basic parameters affect staff costs, outlays on transport and travel allowances, and equipment used. If these parameters are not clearly defined at the start, it is advisable to prepare the budget in several variants, based on different assumptions of size (IHSN, undated).
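
Because these parameters drive staff, transport, and equipment costs, it can help to put the budget variants into a small script. Below is a minimal sketch, with entirely hypothetical unit costs and productivity assumptions, of how field-work costs might be compared across different sample-size scenarios; it is an illustration of the idea, not a costing template.

```python
# Minimal sketch: rough field-work cost under different sample-size
# assumptions. All unit costs and productivity figures are hypothetical.

def estimate_field_budget(sample_size, interview_minutes, collection_days,
                          field_minutes_per_day=360,
                          daily_enumerator_cost=40, daily_transport_cost=25):
    """Return a rough field-work cost for one budget variant."""
    interviews_per_day = max(1, field_minutes_per_day // interview_minutes)
    interviews_per_enumerator = interviews_per_day * collection_days
    enumerators = -(-sample_size // interviews_per_enumerator)  # ceiling division
    staff_cost = enumerators * collection_days * daily_enumerator_cost
    transport_cost = enumerators * collection_days * daily_transport_cost
    return {"enumerators": enumerators, "staff_cost": staff_cost,
            "transport_cost": transport_cost,
            "total": staff_cost + transport_cost}

# Compare budget variants under different assumptions about sample size.
for n in (800, 1200, 1600):
    print(n, estimate_field_budget(n, interview_minutes=45, collection_days=20))
```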

According to the Office of Quality Improvement at the University of Wisconsin-Madison (2010), developing a detailed timeline is a crucial step in planning your survey. Determine when the survey data is needed and when the best time would be to contact potential participants. If your survey is going to a campus population, you may want to avoid coinciding with one of the periodic campus-wide surveys or with exams. Ask yourself what other workload issues may affect those involved in sending the survey and in collecting and analyzing the results. How long will it take to design the survey and obtain approval for it? How much time is needed for collecting responses? The answers to these questions will help you develop a realistic timeline.

Survey design: Know your target audience and the best way to reach them. Research sampling methods or hire a sampling expert to help you draw your sample.
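
As one illustration of what drawing a sample can involve, here is a minimal sketch of proportional stratified random sampling using only Python's standard library. The sampling frame, strata, and sample size are hypothetical; a real design would be worked out with a sampling expert.

```python
# Minimal sketch: proportional stratified random sampling from a sampling
# frame. The frame, strata, and sample size are hypothetical.
import random
from collections import defaultdict

def stratified_sample(frame, stratum_key, total_n, seed=42):
    """Draw a roughly proportional random sample within each stratum."""
    random.seed(seed)
    strata = defaultdict(list)
    for unit in frame:
        strata[unit[stratum_key]].append(unit)
    sample = []
    for name, units in strata.items():
        n = round(total_n * len(units) / len(frame))
        sample.extend(random.sample(units, min(n, len(units))))
    return sample

# Hypothetical frame: households tagged by district.
frame = [{"hh_id": i, "district": "North" if i % 3 else "South"}
         for i in range(1, 301)]
print(len(stratified_sample(frame, "district", total_n=60)))
```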

Questionnaire preparation:

  • Reduce bias by avoiding leading, ambiguous, confusing, double-barreled, or complex questions. Be mindful of how you formulate questions. If you want to measure the same variable as an earlier study, or you want your statistic to be comparable to what has been measured before, consider looking at how those variables were asked previously. A difference in the way a question is formulated can lead to incomparable statistics. For instance, asking “How many children go to school?” does not yield a response that is comparable to “How many children ages 5-14 are enrolled in primary school?”
  • There are a variety of resources available online to assist with question formulation. One example is the IHSN Question Bank, maintained by a network of international agencies. It provides a central repository of international questionnaires, interviewer instructions, classifications, concepts and indicators.
  • Order your questions carefully, making use of follow-up questions and skip patterns when possible (a minimal sketch of skip-pattern logic follows this list).
  • Make sure translations are appropriately worded so as to preserve the meaning, and to ensure the concept remains the same across cultures.
  • Test your questionnaire before data collection, taking note of its length and addressing any unclear areas. If the questionnaire is too long, respondents could lose interest, which could affect the quality of the data collected. As much as time and resources allow, prepare an enumerator manual and train fieldworkers on how to collect the data.
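
The skip-pattern idea mentioned above can be expressed as simple conditional logic. The sketch below is illustrative only: the questions and variable names are hypothetical, and a real instrument would encode such routing in the questionnaire or the interviewing software.

```python
# Minimal sketch: a skip pattern in an interview script. The questions and
# variable names are hypothetical.
def administer(ask):
    answers = {}
    answers["school_age_children"] = int(
        ask("How many children ages 5-14 live in this household?"))
    if answers["school_age_children"] > 0:
        # Follow-up question is asked only when relevant; otherwise skipped.
        answers["enrolled_primary"] = int(
            ask("How many of them are enrolled in primary school?"))
    else:
        answers["enrolled_primary"] = 0
    return answers

# Example run with scripted responses instead of a live interview.
scripted = iter(["2", "1"])
print(administer(lambda question: next(scripted)))
```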

Data collection/interview process:

  • Reduce interviewer bias by working on your interviewing skills. Whenever possible, use data collection tools that minimize data entry error, safeguard data, and make efficient use of time. Technology now allows enumerators to collect and capture data, perform cross-validations, and securely store data from the field. An example is the Computer-Assisted Personal Interviewing (CAPI) technology developed by the World Bank, which is offered free of charge for use on low-cost Android tablets; a minimal sketch of the kind of consistency checks such tools run follows this bullet.
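
The cross-validations such tools run amount to consistency rules applied as the data come in. Here is a minimal sketch of the idea, using hypothetical variable names and plausibility ranges rather than any real CAPI configuration.

```python
# Minimal sketch: consistency checks of the kind CAPI tools run during an
# interview. The variables, ranges, and rules are hypothetical.
def validate_record(record):
    errors = []
    if not 0 <= record.get("respondent_age", -1) <= 110:
        errors.append("respondent_age outside plausible range")
    if record.get("enrolled_primary", 0) > record.get("school_age_children", 0):
        errors.append("more children enrolled than school-age children reported")
    start, end = record.get("interview_start"), record.get("interview_end")
    if start is not None and end is not None and end < start:
        errors.append("interview ends before it starts")
    return errors

print(validate_record({"respondent_age": 34,
                       "school_age_children": 2,
                       "enrolled_primary": 3}))
```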

Clean your data and make sure it is well labeled and documented so that variables and values are clear during analysis.
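
As a minimal sketch of what cleaning and labeling can look like, assuming the data are exported to a CSV file and handled with pandas (the file name, columns, and value labels below are hypothetical):

```python
# Minimal sketch: basic cleaning and labeling with pandas. The file name,
# columns, and value labels are hypothetical.
import pandas as pd

df = pd.read_csv("household_survey.csv")          # hypothetical raw export
df = df.drop_duplicates(subset="hh_id")           # remove duplicate submissions
df["district"] = df["district"].str.strip().str.title()

# Attach human-readable labels so the meaning of codes survives into analysis.
sex_labels = {1: "Male", 2: "Female"}
df["respondent_sex_label"] = df["respondent_sex"].map(sex_labels)

# Keep a simple data dictionary alongside the cleaned file.
data_dictionary = {
    "hh_id": "Unique household identifier",
    "respondent_sex": "1 = Male, 2 = Female",
    "district": "District of residence, title case",
}
df.to_csv("household_survey_clean.csv", index=False)
```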

Analysis: Good data will not guarantee good decisions (Shah et al. 2012). It takes sound analytical skills to make sense of the data and support good decisions. After spending resources gathering the data, the appropriate skills need to be applied to interpret it for the audience. If a weighted sample was used, the weights need to be properly factored into the analysis. Use methods appropriate to your data, and employ visualizations suited to the type of data you are analyzing, using a program or service such as Tableau.
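
Ignoring the weights changes the estimate. A minimal sketch with made-up values and weights shows the difference between an unweighted and a weighted mean.

```python
# Minimal sketch: factoring survey weights into an estimate. Values and
# weights are made up; a weighted mean is used as a simple example.
import numpy as np

income = np.array([120, 340, 210, 560, 180])   # reported values
weight = np.array([1.2, 0.8, 1.5, 0.9, 1.1])   # design/sampling weights

unweighted_mean = income.mean()
weighted_mean = np.average(income, weights=weight)
print(f"unweighted: {unweighted_mean:.1f}, weighted: {weighted_mean:.1f}")
```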

Curation: Last, but not least, come data curation and dissemination. It is good practice to maintain reproducible research: as you conduct your data collection, carefully document your work so others can trace your steps. This adds credibility and transparency to the process, and feedback from users may even improve the quality of future surveys. After you have collected and used the data, prepare it for future reference, reuse, and educational purposes, as the evaluation agreement dictates. If data dissemination is part of the contract, make sure to anonymize the data.
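
What anonymization involves depends on the data and the agreement, but at a minimum it usually means removing direct identifiers. The sketch below, with hypothetical column names, replaces a household ID with a random pseudonym and drops name and phone columns; real statistical disclosure control typically also treats indirect identifiers.

```python
# Minimal sketch: preparing a file for dissemination by removing direct
# identifiers and replacing them with random pseudonyms. Column and file
# names are hypothetical; full disclosure control usually goes further.
import uuid
import pandas as pd

df = pd.read_csv("household_survey_clean.csv")   # hypothetical cleaned file
pseudonyms = {hh: uuid.uuid4().hex[:10] for hh in df["hh_id"].unique()}
df["hh_pseudo_id"] = df["hh_id"].map(pseudonyms)
df = df.drop(columns=["hh_id", "respondent_name", "phone_number"],
             errors="ignore")
df.to_csv("household_survey_public.csv", index=False)
```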

From my work, I have found that proper assessment and constant awareness of data quality helps to make informed decisions regarding programs under evaluation. This requires careful planning and attention to quality of data during acquisition, from budgeting and survey design to data collection and analysis.

Resources

Bamberger, Michael, Jim Rugh, and Linda Mabry, RealWorld Evaluation, 2nd ed, Thousand Oaks: Sage, 2012.

International Household Survey Network, undated, available on: http://www.ihsn.org/.

Office of Quality Improvement, University of Wisconsin-Madison, Survey Fundamentals, 2010, available on: https://oqi.wisc.edu/resourcelibrary/uploads/resources/Survey_Guide.pdf.

Shah, Shvetank, Andrew Horne, and Jaime Capella, “Good Data Won’t Guarantee Good Decisions,” Harvard Business Review, April 2012, available on: https://hbr.org/2012/04/good-data-wont-guarantee-good-decisions.

Sheppard, S. Andrew and Loren Terveen, Quality is a Verb: The Operationalization of Data Quality in a Citizen Science Community, 2011, available on: https://wq.io/research/quality.

About the Author

Cathrine Machingauta is a development professional and a graduate of American University's Graduate Certificate Program in Project Monitoring and Evaluation. Her work over the past 8 years includes research on the effect of antiretroviral treatment on the quality of life of HIV/AIDS patients and families; the World Development Report 2013: Jobs; and statistical disclosure control and microdata curation. She has an MA in International Development Studies from the National Graduate Institute for Policy Studies in Tokyo, Japan, and a Bachelor of Science in Economics from the University of Zimbabwe. She currently works in the Survey Unit of the World Bank's Development Data Group.

To learn more about American University’s online MS in Measurement & Evaluation or Graduate Certificate in Project Monitoring & Evaluation, request more information or call us toll free at 855-725-7614.