Behavioral Insights Briefs

CBEAR is committed to disseminating research results from the behavioral sciences and applying them to topics within agri-environmental programs. We have developed a number of Behavioral Insights Briefs: quick one-page documents that are easily accessible to policy makers, program administrators, and others interested in these lessons.

If you have questions about the Behavioral Insights Brief series or problems accessing any of the Behavioral Insights Briefs, please contact Maddi Valinski.


Climate Change Mitigation Outreach Experiment

Behavioral Insights Brief no. 4


Should climate change programs mention climate change? Unlike the environmental issues typically addressed by USDA programs—such as soil conservation, water quality, soil health, and wildlife habitat—climate change and its causes are more controversial. A 2012 poll of about 5,000 U.S. Corn Belt farmers reported that only 8% of respondents agreed that climate change is taking place and that human activity is the primary cause.1 In light of such reports, USDA must carefully consider, and test, different ways to engage farmers in its new Building Blocks initiative.

References: (1) Arbuckle, J.G., L.S. Prokopy, T. Haigh, J. Hobbs, T. Knoot, C. Knutson, A. Loy, A.S. Mase, J. McGuire, L.W. Morton, J. Tyndall, and M. Widhalm. 2013. Climate change beliefs, concerns, and attitudes toward adaptation and mitigation among farmers in the Midwestern United States. Climatic Change 117(4): 943–950.

More details are available in the appendix. 


The Costs of Complexity

Behavioral Insights Brief no. 3


The most sophisticated programs are the simplest. In trying to ensure that only eligible people apply for our programs, we often inadvertently dissuade many eligible people from ever applying. The application process should be as simple as possible to maximize participation. Thinking critically about the information that is truly needed from participants is one way to simplify a program. As a number of examples from tax programs show, simplifying sign-up procedures can not only increase participation but also increase the impacts of the program itself. As with any new program design, rigorous testing through randomized controlled trials allows for the design of evidence-based programs.

References: (1) Bhargava, Saurabh, and Dayanand Manoli. 2015. Psychological Frictions and the Incomplete Take-Up of Social Benefits: Evidence from an IRS Field Experiment. American Economic Review 105: 3489–3529. (2) Ross, Rebecca, Shannon White, Josh Wright, and Lori Knapp. 2013. Using Behavioral Economics for Postsecondary Success. Ideas42.


The Pull of Social Comparisons

Behavioral Insights Brief no. 2

People look to what others are doing as a guide for their own behavior. When designing programs, including information about the behavior of peers can influence participant behavior. Where the encouraged behavior is already popular, social comparisons can serve as a cost-effective way to boost a program's impacts. Their use is supported by evidence from a number of government programs, and CBEAR can help design randomized controlled trials to test how social comparisons affect program impact.

References: (1) Bernedo, M., P.J. Ferraro, and M. Price. 2014. The Persistent Impacts of Norm-based Messaging and their Implications for Water Conservation. Journal of Consumer Policy 37(3): 437–452. (2) Ferraro, P.J., J.J. Miranda, and M. Price. 2011. Persistence of Treatment Effects with Norm-based Policy Instruments: Evidence from a Randomized Environmental Policy Experiment. American Economic Review: Papers and Proceedings 101(3): 318–322. (3) Ferraro, P.J., and M. Price. 2013. Using Non-pecuniary Strategies to Influence Behavior: Evidence from a Large-scale Field Experiment. The Review of Economics and Statistics 95(1): 64–73. (4) Behavior Change Advisory Group. Behaviour Change: Tax & Compliance. British Psychology Society Behaviour Change Briefings. (5) Schultz, P. 1999. Changing Behavior With Normative Feedback Interventions: A Field Experiment on Curbside Recycling. Basic and Applied Social Psychology 21(1). Note: the figure uses the treatment effect estimates from Bernedo et al.'s sample of non-movers.


The Power of Defaults

Behavioral Insights Brief no. 1


Are your program’s defaults helping or hurting? Humans are prone to inertia and tend to stick with the status quo, which can have profound effects. When we design policies and programs, we need to consider how the default options we select influence participant decisions. Changing defaults can be a simple and inexpensive way to increase program participation, and it has produced dramatic results in a number of applications.

References: (1) Beshears, John, James J. Choi, David Laibson, and Brigitte C. Madrian. 2009. The Importance of Default Options for Retirement Savings Outcomes: Evidence from the United States. In Social Security Policy in a Changing Environment, eds. Brown, Liebman, and Wise. (2) Messer, Kent, William Allen, and Paul Ferraro. 2015. Using Field Experiments to Improve Conservation Program Performance. AAEA & WAEA Joint Annual Meeting, San Francisco, CA, July 2015.


The Behavioral Insights Briefs are prepared with the assistance of the Johns Hopkins Carey Business School Office of Marketing and Communications.
