Beverly Peters, Assistant Professor
I started my career in measurement and evaluation much by happenstance*. In 1992, while I was attending the University of Zimbabwe for master's coursework, the country experienced an extreme drought. NGOs provided food assistance throughout the countryside, and as a young graduate student, I got involved in these efforts.
Entering the Field
I studied the Shona people and was in a village doing research on women's access to land. I spoke the local language and had spent time learning about the local culture, so I had a good understanding of community realities and perceptions.
Early on, I faced an ethical dilemma that I often share in my classes. As a 20-something female graduate student, I tried to earn acceptance in the local culture of a village that was food insecure. The lack of food meant that women in the village were skipping meals every other day; men and children ate daily. The anthropologist in me wanted to fit in, to keep to this rotational eating schedule with the other women. But the young graduate student in me wanted something entirely different. I wanted to tell the women that I did not think it was right, that the food assistance should not be going mostly to men, and that men should also be sacrificing and eating less.
As I talked to people from local civil society organizations, I realized the food assistance programs in the region were not taking this situation into account. Their assistance was not meeting the needs of the women in the village.
Exploring Cultural and Ethical Factors in Evaluation
Why was this occurring? Food security programs were a big priority for organizations, given the severity of the drought in the early 1990s. Perhaps detailed project monitoring took a back seat to handing out foodstuffs.
Also, 25 years ago when I was in Zimbabwe, quantitative data seemed to dominate evaluation efforts. Donors wanted numbers. Those who managed food security programs seemed more interested in how many kilograms of maize were distributed than in rich qualitative data and explanation. One might argue there was not enough time to monitor project implementation qualitatively, but it was precisely qualitative data and explanation that were needed to provide insight into program implementation, including the nutritional needs and realities of the women in these villages.
These early experiences had a profound influence on what I call my personal measurement and evaluation ethos — the importance of understanding local-level emic perspectives.
The emic (or local) perspective is the insider perspective that those at the local level have of themselves, their culture, or their situation. For example, the perception of those in the village in Zimbabwe included notions of gender and local-level politics. Conversely, the etic perspective is the perspective that outsiders have of the local population. In this case, the etic perspective came from international development organizations managing the food security program in Zimbabwe. This etic perspective did not appreciate the gender dynamics taking place that influenced the household allocations of food in the village.
This ethos has been solidified by subsequent work throughout Africa. My practical experiences show that qualitative methods yield rich emic data that can inform project planning, implementation, management, and evaluation. For example, when working in the field of microcredit in southern Africa, I used qualitative methods and emic perceptions to help explain why women with access to formal banking institutions preferred to save with informal credit associations. I also found emic perceptions helpful while managing and evaluating programs supporting women's political participation in West Africa.
Teaching From Experience
I integrate my personal measurement and evaluation ethos into all my Measurement and Evaluation courses, encouraging students to use qualitative data collection techniques to uncover emic perspectives. I encourage students to think ethically about their role as project managers and evaluators and to respect the diverse cultures they encounter. I also use case studies to help enrich classroom discussion.
Combining theory and practice in the online classroom strengthens the quality of the student experience and my interactions with those in my classroom. For my Measurement and Evaluation courses, I require practical assignments such as research or evaluation projects, so students apply their studies to practical examples. This approach helps students develop practical measurement and evaluation skills during their studies in American University's program.
About the Author
Assistant Professor Beverly Peters, Ph.D., is a specialist in human security in Africa and has written extensively on economic development, democratization, and HIV/AIDS. She has more than 20 years of experience teaching, conducting research, and managing projects in southern and West Africa.
An expert on political and economic development in Zimbabwe, she has provided political analyses to the government of South Africa and the private sector, and she is regularly featured in local and international media including the South African Broadcasting Corporation news, The New York Times, Voice of America, and Radio France International.
To learn more about American University's online MS in Measurement & Evaluation or Graduate Certificate in Project Monitoring & Evaluation, you can request more information or call us toll-free at 855-725-7614.
*This blog article is based on Beverly Peters' presentation "Using Qualitative Methods Effectively: The Why and The How," American Evaluation Association Coffee Break Webinar, Washington, D.C.: American Evaluation Association, May 1, 2018.