Behavioral Insights Briefs

CBEAR is committed to disseminating the results of research from the behavioral sciences and applying them to topics within agri-environmental programs. We’ve developed a number of Behavioral Insights Briefs: quick one-page documents that are easily accessible to policy makers, program administrators, and others interested in these lessons.

If you have questions about the Behavioral Insights Brief series or problems accessing any of the Behavioral Insights Briefs, please contact Maddi Valinski.



Behavioral Insights Brief no. 8
Use this framework to strengthen your agri-environmental program with behavioral insights. In our budget-constrained world, we’re always looking for ways to make our voluntary programs more cost-effective. How can we design programs so that farmers and landowners want to participate and take actions to improve our environment?

Footnotes: (1) The MINDSPACE framework was developed by Paul Dolan and his coauthors and published in the Journal of Economic Psychology in 2012 (Vol. 33). (2) Lead author of this Brief: Leah Palm-Forster.

References: (1) Palm-Forster, L.H., P.J. Ferraro, N. Janusch, C.A. Vossler, and K.D. Messer. 2019. “Behavioral and Experimental Agri-Environmental Research: Methodological Challenges, Literature Gaps, and Recommendations.” Environmental and Resource Economics 73(3): 719–742. (2) Dolan, P., M. Hallsworth, D. Halpern, D. King, R. Metcalfe, and I. Vlaev. 2012. “Influencing behaviour: The mindspace way.” Journal of Economic Psychology 33(1): 264–277.


Labeling Success

Behavioral Insights Brief no. 7
People’s behaviors are motivated by labels. Labeling environmental practices used in food production can lead to higher consumer satisfaction, producer profits, and improved environmental outcomes. Could your program use labels to be more successful?

References: (1) Messer, K.D., M. Costanigro, and H. Kaiser. 2017. “Labeling Food Processes: The Good, the Bad and the Ugly.” Applied Economic Perspectives and Policy 39(3): 407–427. (2) Teisl, M.F., B. Roe, and R.L. Hicks. 2002. “Can Eco-Labels Tune a Market? Evidence from Dolphin-Safe Labeling.” Journal of Environmental Economics and Management 43(3): 339–59.


Test, Learn, Adapt

Behavioral Insights Brief no. 6
Innovate by running experiments in your program. In business, health, public policy, and nearly every other field, people want to know how actions affect outcomes. Do farmers obtain higher yields using fertilizer A or fertilizer B? Do patients live longer taking pill A or pill B? How much more money do workers earn if they study high school curriculum A relative to curriculum B?
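Questions like these can be answered with a randomized controlled trial: randomly assign units to A or B, then compare average outcomes. A minimal sketch in Python, with entirely hypothetical fields, yields, and effect sizes:

```python
import random

random.seed(42)

# Hypothetical example: 200 fields randomly assigned to fertilizer A or B
# (a two-arm randomized controlled trial). All numbers are invented.
fields = list(range(200))
random.shuffle(fields)
group_a, group_b = fields[:100], fields[100:]

def simulated_yield(treated):
    # Base yield (bushels/acre) with field-to-field noise;
    # fertilizer B adds a hypothetical +5 bushel effect.
    base = 150 + random.gauss(0, 10)
    return base + (5 if treated else 0)

yields_a = [simulated_yield(treated=False) for _ in group_a]
yields_b = [simulated_yield(treated=True) for _ in group_b]

# Because assignment was random, the difference in group means is an
# unbiased estimate of the average treatment effect of fertilizer B.
effect = sum(yields_b) / len(yields_b) - sum(yields_a) / len(yields_a)
print(f"Estimated effect of fertilizer B: {effect:.1f} bushels/acre")
```

The same skeleton applies to pills, curricula, or program designs: randomize, measure, compare means.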


Gains from Avoiding Losses

Behavioral Insights Brief no. 5
It’s not just how much incentive you offer, but how you offer it. Paying people based on performance is common, whether in our jobs or in voluntary conservation programs. But did you know that the way one describes, or “frames,” the payment can affect performance?

References: (1) Hossain, Tanjim and John A. List. 2012. “The Behavioralist Visits the Factory: Increasing Productivity Using Simple Framing Manipulations.” Management Science 58(12): 2151–67. (2) Tversky, A. and D. Kahneman. 1981. “The Framing of Decisions and the Psychology of Choice.” Science 211(4481): 453–458. (3) Fryer, Roland G., Steven D. Levitt, and Sally Sadoff. 2012. “Enhancing the Efficacy of Teacher Incentives through Loss Aversion: A Field Experiment.” NBER Working Paper No. 18237.



The Costs of Complexity

Behavioral Insights Brief no. 3
The most sophisticated programs are the simplest. In trying to ensure that only eligible people apply for our programs, we often inadvertently dissuade many eligible people from ever applying. The application process should be as simple as possible to maximize participation. Thinking critically about the information that is truly needed from participants is one way to simplify a program. As a number of examples from tax programs show, simplifying sign-up procedures can not only increase participation but also increase the impacts of the program. As with any new program design, rigorous testing through randomized controlled trials allows for the design of evidence-based programs.


References: (1) Bhargava, Saurabh and Dayanand Manoli. 2015. “Psychological Frictions and the Incomplete Take-Up of Social Benefits: Evidence from an IRS Field Experiment.” American Economic Review 105: 3489–3529. (2) Ross, Rebecca, Shannon White, Josh Wright, and Lori Knapp. 2013. “Using Behavioral Economics for Postsecondary Success.” Ideas42.



The Pull of Social Comparisons

Behavioral Insights Brief no. 2
People look to what others are doing as a guide for their own behavior. When designing programs, including information about the behavior of peers can influence participants’ decisions. Including social comparisons where the encouraged behavior is already popular can be a cost-effective way to boost a program’s impact. Social comparisons are supported by evidence from a number of government programs, and CBEAR can help design randomized controlled trials to test their effect on program outcomes.
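As a concrete illustration, norm-based mailers of the kind studied in the water-conservation references below typically compare each participant to their peers. A minimal sketch of generating such messages in Python (all farm names and usage figures are invented):

```python
import statistics

# Hypothetical water use per participant (gallons/acre); all data invented.
usage = {"Farm A": 120, "Farm B": 95, "Farm C": 150}

def comparison_message(name, gallons, peer_median):
    # Frame each participant's usage relative to the peer median,
    # the core of a norm-based (social comparison) message.
    if gallons <= peer_median:
        return (f"{name}: you used {gallons} gal/acre, "
                f"less than the typical peer ({peer_median}). Great work!")
    return (f"{name}: you used {gallons} gal/acre, "
            f"more than the typical peer ({peer_median}).")

peer_median = statistics.median(usage.values())
for name, gallons in usage.items():
    print(comparison_message(name, gallons, peer_median))
```

In a real program, the messages would be tested in a randomized controlled trial before scaling up.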


References: (1) Bernedo, M., P.J. Ferraro, and M. Price. 2014. “The Persistent Impacts of Norm-Based Messaging and Their Implications for Water Conservation.” Journal of Consumer Policy 37(3): 437–452. (2) Ferraro, P.J., J.J. Miranda, and M. Price. 2011. “Persistence of Treatment Effects with Norm-Based Policy Instruments: Evidence from a Randomized Environmental Policy Experiment.” American Economic Review: Papers and Proceedings 101(3): 318–22. (3) Ferraro, P.J. and M. Price. 2013. “Using Non-Pecuniary Strategies to Influence Behavior: Evidence from a Large-Scale Field Experiment.” The Review of Economics and Statistics 95(1): 64–73. (4) Behavior Change Advisory Group. “Behaviour Change: Tax & Compliance.” British Psychological Society Behaviour Change Briefings. (5) Schultz, P.W. 1999. “Changing Behavior with Normative Feedback Interventions: A Field Experiment on Curbside Recycling.” Basic and Applied Social Psychology 21(1). Note: the figure uses the treatment effect estimates from Bernedo et al.’s sample of non-movers.



The Power of Defaults

Behavioral Insights Brief no. 1

Are your program’s defaults helping or hurting? Humans are prone to inertia and tend to stick with the status quo, which can have profound effects. When we design policies and programs, we need to consider how the default options we select influence participant decisions. Changing defaults can be a simple and inexpensive way to increase program participation, and it has produced dramatic results in a number of applications.


References: (1) Beshears, John, James J. Choi, David Laibson, and Brigitte C. Madrian. 2009. “The Importance of Default Options for Retirement Savings Outcomes: Evidence from the United States.” In Social Security Policy in a Changing Environment, edited by Brown, Liebman, and Wise. (2) Messer, Kent, William Allen, and Paul Ferraro. 2015. “Using Field Experiments to Improve Conservation Program Performance.” AAEA & WAEA Joint Annual Meeting, San Francisco, CA, July 2015.


The Behavioral Insights Briefs are prepared with the assistance of the Johns Hopkins Carey Business School Office of Marketing and Communications.
