Suggested Citation: Zhang, Kelly and Chaning Jang. 2020. “Behavioral Science in the Field Course Syllabus.” Massachusetts Institute of Technology Governance Lab (United States) and Busara Center for Behavioral Economics (Kenya).

Course Instructors: Kelly Zhang (MIT GOV/LAB), Chaning Jang (Busara)
Contacts: kwzhang@mit.edu, chaning.jang@busaracenter.org
Location: Nairobi, Kenya
Dates: Pilot course — January 5, 2020 – January 31, 2020

Additional contributors:
Nathanial Peterson, Busara Center for Behavioral Economics
Rebecca Littman, Massachusetts Institute of Technology
Lily Tsai, Massachusetts Institute of Technology
Alexandra Scacco, WZB Berlin
Macartan Humphreys, Columbia University & WZB Berlin
Edward Miguel, University of California Berkeley
Daniel Posner, University of California Los Angeles
Guy Grossman, University of Pennsylvania

Course Overview

The Behavioral Science in the Field Course is a collaboration between MIT GOV/LAB and the Busara Center for Behavioral Economics to train graduate students from the U.S. and local universities in East Africa in cutting-edge behavioral science research. Conducted in Kenya, the course is structured as an intensive deep dive into interdisciplinary behavioral science and gives students the opportunity to develop novel behavioral science games to answer original research questions, culminating in data collection.

In recent years, the behavioral sciences, which seek to better understand the determinants of human behavior, have emerged as a leading innovation across disciplines and sectors. These novel methods and data allow us to measure what incentivizes individual and group behavior and to inform numerous interventions: for example, targeted online marketing, incentives for healthier eating, improved educational pedagogy, enhanced community policing protocols, and effective policy design for compliance with public health ordinances or tax payment. Lab-in-the-field experiments, in particular, are a gold standard for testing behavior with a method that closely resembles real life. Training in the behavioral sciences is a critical skill for students across disciplines to master, and it is one of the most exciting developments for bridging the gap between theory and practice, with proven potential to achieve real-world impact.

The course will provide students with practical experience in implementing a lab-in-the-field experiment. To encourage innovative thinking beyond disciplinary boundaries, the course is open to PhD students across the social sciences (e.g., political science, economics, business, psychology).

The pilot course was made possible with generous support from MIT’s J-WEL Grant in Higher Education Innovation and MITx Express Exploration Grant, and core contributions from Busara and MIT GOV/LAB.

Goals

Graduate students participating in the course will gain hands-on experience in designing and implementing behavioral games in the lab, and come away with a novel behavioral game and data for an academic publication. At the end of this course, students will be able to:

  • Identify and recognize the fundamental assumptions/premises of different disciplines
  • Judge and assess the appropriate tools to tackle a research question
  • Combine inductive and deductive learning in the development of research questions
  • Develop original research questions that are meaningful to the local context
  • Recognize and respect the perspective of the research subject in the research process
  • Design a novel lab experiment or behavioral measure
  • Plan a well-designed and well-implemented lab-in-the-field project
  • Develop productive and long-lasting relationships between disciplines and cultures

Outputs

Original research question
Lab protocol for new game
Pre-analysis plan
Descriptive summary and pre-analysis plan results
Blog post

Expert Mentoring

The course is designed around comprehensive mentoring and iterative feedback sessions with disciplinary and topic experts throughout. Experts from economics, social psychology, and political science who contributed to the pilot include:

Guest speakers:
Joshua Dean, University of Chicago
Alexandra Scacco, WZB Berlin and Shana Warren, New York University
Daniel Posner, University of California Los Angeles

Pre-analysis plans and game design:
Macartan Humphreys, Columbia University and WZB Berlin
Alexandra Scacco, WZB Berlin

Issue area experts:
Amanda Robinson, Ohio State University
George Ofosu, London School of Economics
Johannes Haushofer, Princeton University
Joshua Dean, University of Chicago
Lorenzo Casaburi, University of Zurich
Julian Dyer, University of Toronto
Catherine Thomas, Stanford University

Schedule

June-October 2019 – Recruitment
Recruitment and application stage for PhD students in the U.S. and East Africa

January 2020 – Course and Fieldwork Intensive

Week 1 – Literature: (Location: Busara, Nairobi, Kenya)
Seminars on experiments in economics, political science, and social psychology.

Behavioral Economics (Chaning Jang, Nathanial Peterson)
Social Behavior (Rebecca Littman)
Political Behavior (Kelly Zhang)
Guest speaker call-ins (Joshua Dean, Alexandra Scacco, Shana Warren, Daniel Posner)
Student presentations on research interests
Common tools in behavioral science experiments
Busara lab session to participate in common lab games (a minimal sketch of one such game appears below)
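
As a concrete illustration of the kind of game students will play in the lab session, the sketch below simulates the payoff logic of a dictator game, one of the most common behavioral games. It is a minimal Python sketch for orientation only; the endowment size, transfer amount, and class name are hypothetical and do not reflect Busara's actual lab protocols.

    # Minimal dictator-game payoff logic (illustrative; all parameters are hypothetical)
    from dataclasses import dataclass

    @dataclass
    class DictatorRound:
        endowment: int    # tokens given to the dictator, e.g. 100
        amount_sent: int  # tokens the dictator transfers to the recipient

        def payoffs(self):
            """Return (dictator_payoff, recipient_payoff) in tokens."""
            if not 0 <= self.amount_sent <= self.endowment:
                raise ValueError("Transfer must be between 0 and the endowment.")
            return self.endowment - self.amount_sent, self.amount_sent

    # Example: a dictator endowed with 100 tokens sends 30, keeping 70.
    print(DictatorRound(endowment=100, amount_sent=30).payoffs())  # (70, 30)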

Week 2 – Contextualize: (Location: Field Site)
Exploratory fieldwork to refine research questions.

Visit to field site in Kenya for focus groups and interviews
Best practices for contextualizing games and qualitative research
Best practices for field research and research ethics

Week 3 – Design: (Location: Busara, Nairobi, Kenya)
Develop novel experiments using open science tools.

Students propose an original research question and design
Initial draft of pre-analysis plan (see the simulation sketch following this list)
Initial draft of lab protocol
Game programming for pilot
Office hours with course facilitators, lab staff, and issue area experts
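
To make the pre-analysis plan step concrete, the sketch below shows one common way to justify a sample size in such a plan: estimating statistical power by simulation for a two-arm design. It is an illustrative sketch only; the effect size, sample size, and outcome distribution are placeholder assumptions, not values used in the course.

    # Simulation-based power estimate for a two-arm design (placeholder parameters)
    import numpy as np
    from scipy import stats

    def simulated_power(n_per_arm=150, effect=0.25, sd=1.0, alpha=0.05, n_sims=2000, seed=0):
        """Share of simulated experiments in which a two-sample t-test detects the assumed effect."""
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_sims):
            control = rng.normal(0.0, sd, n_per_arm)
            treated = rng.normal(effect, sd, n_per_arm)
            _, p_value = stats.ttest_ind(treated, control)
            rejections += p_value < alpha
        return rejections / n_sims

    print(simulated_power())  # roughly 0.6 with these placeholder values

Varying n_per_arm or effect in a sketch like this shows how the planned sample size trades off against the smallest effect the design can reliably detect.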

Week 4 – Pilot: (Location: Busara, Nairobi, Kenya and Field Site)
Pilot new behavioral game and finalize research design.

Students pilot their game with a small sample
Refine pre-analysis plan, lab protocol, and game programming
Final student presentations of research design

March-May 2020 – Data collection

Finalize lab protocols
Refine pre-analysis plans
Register pre-analysis plans
Submit IRB protocols
Data collection for 1,500 individuals, to be used by graduate students for publication

Behavioral Economics

Instructors: Chaning Jang and Nathanial Peterson, Busara Center for Behavioral Economics

Pre-Course Reading

  • Thaler, Richard H. Prize Lecture. NobelPrize.org. Nobel Media AB, 2019. Accessed 29 December 2019.

Course Readings

  1. O’Donoghue, Ted, and Matthew Rabin. “Doing it now or later.” American Economic Review 89, no. 1 (1999): 103-124.
  2. Kremer, Michael, Jean Lee, Jonathan Robinson, and Olga Rostapshova. “Behavioral biases and firm behavior: Evidence from Kenyan retail shops.” American Economic Review 103, no. 3 (2013): 362-68.
  3. Tversky, Amos, and Daniel Kahneman. “The framing of decisions and the psychology of choice.” Science 211, no. 4481 (1981): 453-458.
  4. Mani, Anandi, Sendhil Mullainathan, Eldar Shafir, and Jiaying Zhao. “Poverty impedes cognitive function.” Science 341, no. 6149 (2013): 976-980.
  5. Berkouwer, Susanna B., and Joshua T. Dean. 2019. “Inattention and credit constraints in energy efficiency adoption: Evidence from Kenya.” Working paper.

Supplemental Readings

Background

  • Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5(4), 297-323.
  • Thaler, R. (1985). Mental accounting and consumer choice. Marketing Science, 4(3), 199-214.
  • Rabin, M. (1998). Psychology and economics. Journal of Economic Literature, 36(1), 11-46.

Time and Risk Preference

  • Laibson, D. (1997). Golden eggs and hyperbolic discounting. The Quarterly Journal of Economics, 112(2), 443-478.
  • Frederick, S., Loewenstein, G., & O’Donoghue, T. (2002). Time discounting and time preference: A critical review. Journal of Economic Literature, 40(2), 351-401.
  • Tanaka, T., Camerer, C. F., & Nguyen, Q. (2010). Risk and time preferences: Linking experimental and household survey data from Vietnam. American Economic Review, 100(1), 557-71.
  • Charness, G., Gneezy, U., & Imas, A. (2013). Experimental methods: Eliciting risk preferences. Journal of Economic Behavior & Organization, 87, 43-51.
  • Andreoni, J., Kuhn, M. A., & Sprenger, C. (2015). Measuring time preferences: A comparison of experimental methods. Journal of Economic Behavior & Organization, 116, 451-464.
  • Balakrishnan, U., Haushofer, J., & Jakiela, P. (2017). How soon is now? Evidence of present bias from convex time budget experiments (No. w23558). National Bureau of Economic Research.

Social Preference

  • Camerer, C. F., & Thaler, R. H. (1995). Anomalies: Ultimatums, dictators and manners. Journal of Economic Perspectives, 9(2), 209-219.
  • Charness, G., & Rabin, M. (2002). Understanding social preferences with simple tests. The Quarterly Journal of Economics, 117(3), 817-869.
  • Henrich, J., Boyd, R., Bowles, S., Camerer, C., Fehr, E., Gintis, H., & McElreath, R. (2001). In search of homo economicus: behavioral experiments in 15 small-scale societies. American Economic Review, 91(2), 73-78.
  • Bénabou, R., & Tirole, J. (2006). Incentives and prosocial behavior. American Economic Review, 96(5), 1652-1678.
  • Mazar, N., Amir, O., & Ariely, D. (2008). The dishonesty of honest people: A theory of self-concept maintenance. Journal of Marketing Research, 45(6), 633-644.
  • Ariely, D., Bracha, A., & Meier, S. (2009). Doing good or doing well? Image motivation and monetary incentives in behaving prosocially. American Economic Review, 99(1), 544-55.
  • Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33(2-3), 61-83.

Scarcity / Poverty

  • Haushofer, Johannes, and Ernst Fehr. “On the psychology of poverty.” Science 344.6186 (2014): 862-867.
  • Shah, A. K., Mullainathan, S., & Shafir, E. (2012). Some consequences of having too little. Science, 338(6107), 682-685.

Social Networks

  • Breza, E. (2015). Field experiments, social networks, and development. The Oxford Handbook on the Economics of Networks.

Development

  • Kremer, Michael, Gautam Rao, and Frank Schilbach. “Behavioral development economics.” Handbook of Behavioral Economics 2 (2019).
  • Woodhouse, Philip, et al. “African farmer-led irrigation development: re-framing agricultural policy and investment?.” The Journal of Peasant Studies 44.1 (2017): 213-233.
  • Emerick, Kyle, et al. “Technological innovations, downside risk, and the modernization of agriculture.” American Economic Review 106.6 (2016): 1537-61.
  • Dean, J. T. (2019). Noise, cognitive function, and worker productivity. Mimeo.
  • Bhanot, S. P., Han, J., & Jang, C. (2018). Workfare, wellbeing and consumption: Evidence from a field experiment with Kenya’s urban poor. Journal of Economic Behavior & Organization, 149, 372-388.
  • Niederle, M., & Vesterlund, L. (2007). Do women shy away from competition? Do men compete too much? The Quarterly Journal of Economics, 122(3), 1067-1101.
  • Cardenas, J. C., & Carpenter, J. (2008). Behavioural development economics: Lessons from field labs in the developing world. The Journal of Development Studies, 44(3), 311-338.

Social Behavior

Instructor: Rebecca Littman, Massachusetts Institute of Technology

Pre-Course Reading

  • Tankard, M. E., & Paluck, E. L. (2016). Norm perception as a vehicle for social change. Social Issues and Policy Review, 10(1), 181-211.

Course Readings

  1. Peysakhovich, A., Nowak, M. A., & Rand, D. G. (2014). Humans display a ‘cooperative phenotype’ that is domain general and temporally stable. Nature Communications, 5, 4939.
  2. Gächter, S., & Schulz, J. F. (2016). Intrinsic honesty and the prevalence of rule violations across societies. Nature, 531(7595), 496.
  3. Everett, J. A., Ingbretsen, Z., Cushman, F., & Cikara, M. (2017). Deliberation erodes cooperative behavior—Even towards competitive out-groups, even when using a control condition, and even when eliminating selection bias. Journal of Experimental Social Psychology, 73, 76-81.
  4. Scacco, A., & Warren, S. S. (2018). Can social contact reduce prejudice and discrimination? Evidence from a field experiment in Nigeria. American Political Science Review, 112(3), 654-677.
  5. Blum, A., C. Hazlett, and D. Posner (2019). Measuring Ethnic Biases: Can Misattribution-Based Tools from Social Psychology Reveal Group Biases that Economics Games Cannot? Working Paper.

Supplemental Readings

Cooperation and Punishment

  • Jordan, J. J., & Rand, D. G. (2019). Signaling when no one is watching: A reputation heuristics account of outrage and punishment in one-shot anonymous interactions. Journal of Personality and Social Psychology.
  • Arechar, A. A., & Rand, D. G. (2019). Learning to be selfish? A large-scale longitudinal analysis of Dictator games played on Amazon Mechanical Turk.

Intergroup Interactions

  • Cikara, M., Bruneau, E., Van Bavel, J. J., & Saxe, R. (2014). Their pain gives us pleasure: How intergroup dynamics shape empathic failures and counter-empathic responses. Journal of Experimental Social Psychology, 55, 110-125.

Morality

  • Cushman, F., Gray, K., Gaffey, A., & Mendes, W. B. (2012). Simulating murder: The aversion to harmful action. Emotion, 12(1), 2.

Social Norms

  • Stagnaro, M. N., Arechar, A. A., Rand, D. G. (2017). From good institutions to generous citizens: Top-down incentives to cooperate promote subsequent prosociality but not norm enforcement. Cognition, 167, 212-254.
  • Tankard, M. E., & Paluck, E. L. (2017). The effect of a Supreme Court decision regarding gay marriage on social norms and personal attitudes. Psychological Science, 28(9), 1334-1344.

Political Behavior

Instructor: Kelly Zhang, Massachusetts Institute of Technology Governance Lab

Pre-Course Reading

  • Ofosu, George Kwaku. “Experimental Research in African Politics.” Oxford Research Encyclopedia of Politics. 25 June 2019.

Course Readings

  1. Habyarimana, James, Macartan Humphreys, Daniel N. Posner, and Jeremy M. Weinstein. 2007. “Why Does Ethnic Diversity Undermine Public Goods Provision?” American Political Science Review 101(4): 709–25.
  2. Huber, Gregory A., Seth J. Hill, and Gabriel S. Lenz. 2012. “Sources of Bias in Retrospective Decision-Making: Experimental Evidence on Voters’ Limitations in Controlling Incumbents.” American Political Science Review 106(4): 720-741.
  3. McClendon, Gwyneth, and Rachel Beatty Riedl. “Religion as a Stimulant of Political Participation: Experimental Evidence from Nairobi, Kenya,” The Journal of Politics 77, no. 4 (October 2015): 1045-1057.
  4. Voors, Maarten J., Eleonora E. M. Nillesen, Philip Verwimp, Erwin H. Bulte, Robert Lensink, and Daan P. Van Soest. 2012. “Violent Conflict and Behavior: A Field Experiment in Burundi.” American Economic Review, 102 (2): 941-64.
  5. Finan, F. and Schechter, L. (2012), Vote‐Buying and Reciprocity. Econometrica, 80: 863-881.

Supplemental Readings

Background

  • Blair, G., Cooper, J., Coppock, A., & Humphreys, M. (2019). Declaring and Diagnosing Research Designs. American Political Science Review, 113(3), 838-859.
  • Blair, Graeme, Alexander Coppock, and Margaret Moor. “When to Worry About Sensitivity Bias: A Social Reference Theory and Evidence from 30 Years of List Experiments.” Revise and resubmit, American Political Science Review.
  • Grossman, G. 2011. “Lab-in-the-field Experiments.” Newsletter of the APSA Experimental Section 2(2): 13-19.

Governance

  • Grossman, G. and Baldassarri, D. (2012), The Impact of Elections on Cooperation: Evidence from a Lab‐in‐the‐Field Experiment in Uganda. American Journal of Political Science, 56: 964-985.
  • Martin, Lucy. 2016. “Taxation, Loss Aversion, and Accountability: Theory and Experimental Evidence for Taxation’s Effect on Citizen Behavior.” Working Paper.
  • Lierl, M. (2019). Can Elections Reduce Embezzlement? Experimental Evidence on Selection Effects, Public Trust and Citizens’ Tolerance for Embezzlement. Working Paper.
  • Casey, Katherine, Rachel Glennerster, and Edward Miguel. 2012. “Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan.” Quarterly Journal of Economics 127 (4): 1755-1812.

Conflict/Post-Conflict

  • Zeitzoff, Thomas. 2014. “Anger, Exposure to Violence and Intragroup Conflict: A ‘Lab in the Field’ Experiment in Southern Israel.” Political Psychology. 35(3): 309-335.
  • Blair, Robert, Legitimacy After Violence: Evidence from Two Lab-in-the-Field Experiments in Liberia (January 23, 2018).
  • Gilligan, M.J., Pasquale, B.J. and Samii, C. (2014), Civil War and Social Cohesion: Lab‐in‐the‐Field Evidence from Nepal. American Journal of Political Science, 58: 604-619.
  • Paluck, E.L. & Green, D.P. (2009). Deference, dissent, and dispute resolution: A field experiment on a mass media intervention in Rwanda. American Political Science Review, 103(4), 622-644.

Identity Politics 

  • Carlson, Elizabeth. 2015. “Ethnic Voting and Accountability in Africa: A Choice Experiment in Uganda.” World Politics 67(2): 353–85.
  • Adida, C., Laitin, D., & Valfort, M. (2016). “One Muslim is Enough!” Evidence from a Field Experiment in France. Annals of Economics and Statistics, (121/122), 121-160.
  • Nugent, Elizabeth, The Psychology of Repression and Polarization in Authoritarian Regimes (August 11, 2017).
  • Berge, Lars Ivar Oppedal, Kjetil Bjorvatn, Simon Galle, Edward Miguel, Daniel N. Posner, Bertil Tungodden, and Kelly Zhang. 2019. “Ethnically Biased? Experimental Evidence from Kenya.” Journal of the European Economic Association.
  • Robinson, Amanda. 2016. “Nationalism and Interethnic Trust: Experimental Evidence from an African Border Region.” Comparative Political Studies 49(14): 819-854.
  • Cilliers, Jacobus, Oeindrila Dube, and Bilal Siddiqi. 2015. “The white-man effect: How foreigner presence affects behavior in experiments.” Journal of Economic Behavior & Organization 118: 397-414.

Open Science Resources

Compiled with the assistance of BITSS (Berkeley Initiative for Transparency in the Social Sciences).

Pre-Analysis Plan Registries

Pre-Analysis Plan Resources

Relevant Papers

  • Christensen, Garret, and Edward Miguel. “Transparency, Reproducibility, and the Credibility of Economics Research.” Journal of Economic Literature 56, no. 3 (September 2018): 920–80.
  • Coffman, Lucas C., and Muriel Niederle. 2015. “Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible.” Journal of Economic Perspectives, 29 (3): 81-98.
  • Blair, Graeme, Jasper Cooper, Alexander Coppock, and Macartan Humphreys. “Declaring and Diagnosing Research Designs.” American Political Science Review 113, no. 3 (2019): 838–59. doi:10.1017/S0003055419000194.
  • Ofosu, George K. and Daniel N. Posner. “Pre-analysis Plans: A Stocktaking.” 2019. Working Paper.
  • Casey, Katherine, Rachel Glennerster, and Edward Miguel. 2012. “Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan.” Quarterly Journal of Economics 127 (4): 1755-1812.