Social Sciences & Behavioral Nudges

Our research uses tools such as behavioral nudges to influence the behavior and decision-making of individuals and groups through positive reinforcement and indirect suggestion.

Understanding how choice architecture, social cues, and framing effects influence behavior has recently become a shared aim of academics, businesses, and governments working to promote social welfare. For example, behavioral tools have proven successful at increasing voter turnout, encouraging policy compliance, and promoting healthier behaviors.

In the lab, we study how technology paired with social science insights can be used to “nudge” individuals and groups toward more socially beneficial behaviors — in educational apps, on social media platforms, and in online markets. Using flexible approaches for research design and analysis built on machine learning methods, we bring new insights to pressing research questions from across the social sciences. Our lab also develops public-facing tools for researchers to learn about and apply these methods, and collaborates with social scientists in academia, industry, and government.

Project Abstracts

Read about a few of the research projects the lab is currently working on.

Texts Encouraging Low-Income Students to Apply for Federal Financial Aid

Lab researchers Susan Athey, Henrike Steimer, Matt Schaelling, and Niall Keleher are working with ideas42 to examine the effects of an email and SMS program that encourages community college students to maintain their financial aid packages. Using administrative records, they apply machine learning algorithms to test for differential impacts of the reminders on educational outcomes across different types of students.
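The kind of subgroup analysis described above can be sketched as a simple T-learner for heterogeneous treatment effects. This is an illustrative sketch only: the data are simulated, and the covariates, assignment, and outcome names are hypothetical, not the lab's actual variables or method.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical data: student covariates X, reminder assignment T,
# and an educational outcome Y (e.g., a continued-enrollment index).
n = 2000
X = rng.normal(size=(n, 3))           # e.g., standardized GPA, credits, age
T = rng.integers(0, 2, size=n)        # 1 = received reminders, 0 = control
# Simulated outcome whose treatment effect varies with the first covariate
Y = X[:, 0] + T * (0.5 + 0.3 * X[:, 0]) + rng.normal(scale=0.5, size=n)

# T-learner: fit separate outcome models for treated and control students,
# then estimate each student's effect as the difference in predictions.
m1 = GradientBoostingRegressor().fit(X[T == 1], Y[T == 1])
m0 = GradientBoostingRegressor().fit(X[T == 0], Y[T == 0])
cate = m1.predict(X) - m0.predict(X)  # per-student effect estimates

# Compare estimated effects across student subgroups.
high = X[:, 0] > 0
print(f"avg effect, above-median covariate: {cate[high].mean():.2f}")
print(f"avg effect, below-median covariate: {cate[~high].mean():.2f}")
```

In practice, researchers in this area often use purpose-built estimators (e.g., causal forests) with honest sample splitting rather than this plain two-model approach.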

Financial Health

Increasing Voter Turnout with Text Reminders

In collaboration with ideas42, lab researchers are investigating whether an SMS campaign increased voter participation in a recent election. Using fine-grained voter data, they apply causal inference methods to test whether the reminders raised turnout among particular subgroups of potential voters.

Government Services

Reducing Unanticipated Financial Burdens of Ticketing and Fines Among Car Owners

Through an email campaign run with New York City’s Department of Finance, lab researchers and ideas42 are targeting behavioral bottlenecks that prevent at-risk ticket holders from paying their tickets. Machine learning and causal inference tools are used in randomized experiments to test whether personalized emails reduce the number of vehicles booted, especially vehicles owned by people least able to pay escalating fines.
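A randomized experiment like the one described above is often analyzed, at its simplest, with a two-proportion z-test on the outcome rate in each arm. The counts below are invented for illustration; they are not results from the study.

```python
from math import erf, sqrt

# Hypothetical tallies: vehicles booted among ticket holders who
# received a personalized email vs. a control group (assumed numbers).
n_treat, booted_treat = 5000, 180
n_ctrl, booted_ctrl = 5000, 240

p1 = booted_treat / n_treat   # booting rate, treated arm
p0 = booted_ctrl / n_ctrl     # booting rate, control arm

# Two-proportion z-test under the pooled null of equal booting rates.
p_pool = (booted_treat + booted_ctrl) / (n_treat + n_ctrl)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_treat + 1 / n_ctrl))
z = (p1 - p0) / se
# Two-sided p-value from the standard normal tail.
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"booting rate: treated {p1:.3f} vs control {p0:.3f}")
print(f"z = {z:.2f}, p = {p_value:.4f}")
```

With heterogeneous effects of interest, this simple comparison would typically be extended with subgroup or machine-learning-based analyses like those in the other projects.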

Financial Health

Academic Publications



Learn firsthand from researchers and practitioners associated with the lab.

Thought Leadership


A report on the collaboration between ideas42 and the Golub Capital Social Impact Lab to advance practical applications of machine learning to behavioral science policy and field experimentation. The report is designed for anyone looking to learn how machine learning can add new techniques to behavioral design, causal inference, and experimentation.

Stanford GSB Insights

The Innovation for Shared Prosperity Conference focused on innovative financial, social, and research models that can reduce inequality, and on how universities and private industry can partner to bring about social change.

Stanford GSB School News

Led by initial Faculty Director Susan Athey, the Golub Capital Social Impact Lab will help leading social sector organizations solve pressing challenges, providing a platform for faculty and students to advance digital innovation in the social sector while catalyzing groundbreaking research and tangible social impact.

Stanford GSB Insights

Using GPS data to analyze people’s movements, Stanford researchers found that in most U.S. metropolitan areas, people’s day-to-day experiences are less segregated than traditional measures suggest.