Anchoring
The phenomenon where an individual’s judgment is influenced by a reference point, or anchor, used as a basis for comparison. In communications it is important to be aware of the reference point being set, so as not to misinform the target audience.
Anchoring is a cognitive bias in which an individual’s judgment is influenced by an initial value or reference point, known as the “anchor,” that is used as a basis for comparison. This can happen in a variety of ways, such as through the use of numerical values, past experiences, or social norms.
For example, if a salesperson starts a negotiation with a high initial asking price, the final price the buyer is willing to pay may be higher than it would have been had the salesperson started lower. This is known as the “anchoring effect”: the initial price acts as an anchor, and the buyer’s subsequent offer is influenced by it, even if the anchor bears no rational relation to the true value of the item.
Anchoring can also occur in non-numerical ways, such as when an individual’s judgment is influenced by their past experiences, or by social norms. For example, if an individual has always been told that a certain behavior is right or wrong, they may find it difficult to see things differently.
In international development, anchoring can play a role in decision-making and communication, specifically in situations where a development initiative must be presented and understood. If a message is framed in a certain way, it can lead the audience to form a certain opinion or have a certain understanding, which could be different if the message was framed differently.
It’s important for professionals working in international development to be aware of how anchoring can influence decision-making and communication, to watch for anchoring in their own work, and to counteract it.
Loss Aversion
The tendency to prefer avoiding losses to acquiring equivalent gains. This matters in communications around development projects, where people may be reluctant to change their current situation even if it is not optimal.
Loss aversion is a cognitive bias in which individuals tend to prefer avoiding losses to acquiring equivalent gains. This means that people tend to feel more negative about losing something than they do positive about gaining something of the same value. This bias is rooted in the way the human brain processes information and is thought to be an evolutionary mechanism that helped humans to survive in uncertain environments by motivating them to protect what they have.
For example, if an individual is offered a gamble with a 50% chance of winning $100 and a 50% chance of losing $50, many people will avoid it for fear of losing the $50. The perceived pain of losing outweighs the perceived pleasure of winning, even though the gamble has a positive expected value of $25 (0.5 × $100 − 0.5 × $50).
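The arithmetic behind this example can be sketched in a few lines of Python. The gamble’s expected value is positive, yet a loss-averse decision-maker still declines it; the loss-aversion coefficient of 2.25 is the estimate from Tversky and Kahneman’s 1992 study and is used here purely for illustration.

```python
def expected_value(p_win, win, lose):
    """Objective expected value of the gamble."""
    return p_win * win - (1 - p_win) * lose

def loss_averse_value(p_win, win, lose, lam=2.25):
    """Subjective value when losses are weighted lam times more than gains."""
    return p_win * win - (1 - p_win) * lam * lose

ev = expected_value(0.5, 100, 50)      # 0.5*100 - 0.5*50 = 25.0
lv = loss_averse_value(0.5, 100, 50)   # 0.5*100 - 0.5*2.25*50 = -6.25

print(f"Expected value: {ev:+.2f}")    # positive: 'rational' to accept
print(f"Felt value:     {lv:+.2f}")    # negative: the gamble is declined
```

Even a modest loss-aversion weight flips the sign of the gamble’s subjective value, which is exactly why objectively attractive offers can still be refused.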
In the context of international development, loss aversion can play a significant role in the decision-making of communities and stakeholders that are being impacted by development projects. Many times, the benefits of these projects are not immediately apparent, and people may be hesitant to change their current situation even if it is not optimal. They may also be reluctant to give up what they already have, even if the project could potentially bring benefits in the long term.
Understanding loss aversion helps development practitioners, organizations, and governments design effective communication strategies for their programs: reducing the salience of potential losses and showing that long-term benefits outweigh short-term sacrifices. Useful techniques include reframing the message, providing more detail and evidence to reduce the feeling of uncertainty, and offering safety nets that lower the risk of loss.
Nudge
A small, subtle change in the environment that influences behavior without restricting freedom of choice. This could be used in development communication to encourage good behavior such as saving money or using more sustainable resources.
A nudge is a small, subtle change in the environment that influences behavior without restricting freedom of choice. The concept of nudging was popularized by Richard Thaler and Cass Sunstein in their book “Nudge: Improving Decisions About Health, Wealth, and Happiness.” They argue that small changes in the environment can be used to “nudge” people towards making better decisions for themselves and society as a whole.
For example, a company might place healthy snacks in an easy-to-reach place in the office kitchen and unhealthy snacks in a less accessible place. This small change in the environment could nudge employees to make healthier food choices without restricting their freedom to choose whatever they want to eat.
Nudges can also be used to influence decisions that have a social impact, such as increasing tax compliance, saving energy, and reducing littering. For example, in the context of international development, nudges can be used to encourage behaviors such as saving money or using more sustainable resources.
Nudges are different from traditional regulation or persuasion because they do not restrict people’s freedom of choice. Instead, they aim to create a “choice architecture” that makes it easier for people to make the decision that is best for them, without limiting their ability to make other choices.
In international development, nudging can be an effective strategy for promoting positive behavior change, as it can help people to make better decisions for themselves and society without restricting their freedom or violating their autonomy. It is also cost-effective and easy to implement, making it accessible for a wide range of organizations and governments.
Prospect Theory
A theory that describes how people value potential gains and losses differently. This can be used to understand the decision-making process of communities or stakeholders in development projects.
Prospect theory is a theory that describes how individuals value potential gains and losses differently. The theory was developed by Amos Tversky and Daniel Kahneman in the 1970s, and it has since been widely used to explain how people make decisions under uncertainty.
One of the key insights of prospect theory is that individuals have a greater sensitivity to losses than to gains. For example, people tend to experience more distress from losing $100 than they do pleasure from winning $100. This is known as loss aversion (see above), which is a fundamental aspect of prospect theory.
Another key insight is that people evaluate potential outcomes relative to a reference point, and their behavior shifts with that reference point. For example, an individual with $500 who is offered a gamble with a 50% chance of winning $100 and a 50% chance of losing $50 will evaluate it differently depending on the reference point: relative to a reference point of $500 (the status quo), the potential loss looms large and the gamble looks unattractive; relative to a reference point of $1,000 (a level the individual feels they should have reached), they are in the domain of losses and become more willing to take risks. This sensitivity to how outcomes are framed is known as the framing effect, which is also a fundamental aspect of prospect theory.
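The two insights above are captured by the prospect-theory value function, which can be sketched as follows. The parameters α = 0.88 and λ = 2.25 are the estimates from Tversky and Kahneman’s 1992 paper; the specific reference points are illustrative.

```python
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAM = 2.25     # loss aversion: losses loom ~2.25x larger than gains

def value(outcome, reference=0.0):
    """Subjective value of an outcome relative to a reference point."""
    x = outcome - reference
    if x >= 0:
        return x ** ALPHA
    return -LAM * ((-x) ** ALPHA)

# The same $500 feels like a gain from a $400 reference point,
# but like a large loss from a $1,000 reference point.
print(value(500, reference=400))    # positive subjective value
print(value(500, reference=1000))   # negative, and larger in magnitude
```

Note that the function takes differences from the reference point, not absolute wealth, which is what makes framing so consequential.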
Prospect theory can be applied in international development to understand the decision-making of communities or stakeholders affected by development projects. It can also help development practitioners, organizations, and governments design effective communication strategies, make choices that align with the preferences of the target population, and make the outcomes of their actions more acceptable.
Prospect theory also has implications on how people evaluate risks and how they value different outcomes. Development practitioners and organizations can apply this knowledge in their program design and communication strategies to improve the perceived value of the projects and to increase the chances of acceptance.
Behavioral Game Theory
The application of game theory to the study of decision-making in interactive situations. This can be used in the communication of international negotiation and trade agreements.
Behavioral game theory is the application of game theory to the study of decision-making in interactive situations. Game theory is a branch of mathematics that studies how individuals or groups of people make decisions and interact with each other, and it is often used to model strategic situations in fields such as economics, political science, and psychology. Behavioral game theory adds the insights from behavioral economics and psychology to the traditional model of game theory and considers how cognitive biases and emotions affect decision-making in strategic situations.
Behavioral game theory can be applied in a variety of settings, such as in business, politics, and international relations. In international negotiations, behavioral game theory can be used to understand how emotions and cognitive biases influence the decision-making of negotiators and how that affects the outcome of the negotiation.
An example of behavioral game theory being used in international negotiations is the study of the “ultimatum game.” In this game, one player is given a sum of money and must propose how to divide it between themselves and another player. The second player can either accept or reject the proposal. If the proposal is accepted, the money is divided as proposed. If the proposal is rejected, neither player receives any money. Traditional game theory predicts that the first player will offer the smallest possible amount, as any amount is better than nothing, and the second player will accept any non-zero offer. However, in reality, people often reject low offers as a form of punishment or fairness. Behavioral game theory can help to understand how emotions, such as resentment or fairness concerns, can influence the decision-making of negotiators and affect the outcome of the negotiation.
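The contrast between the two predictions can be made concrete with a small simulation. The 30% rejection threshold is an illustrative assumption, roughly in line with experimental findings that offers below about a third of the pot are frequently rejected.

```python
def respond(offer, pot, fairness_threshold=0.30):
    """Responder accepts only offers at or above a fairness threshold."""
    return offer >= fairness_threshold * pot

def play(pot, offer):
    """Return (proposer payoff, responder payoff) for one round."""
    if respond(offer, pot):
        return pot - offer, offer
    return 0, 0   # rejection: both players get nothing

# Classical game theory predicts any non-zero offer is accepted;
# a fairness-sensitive responder rejects it, and both walk away empty-handed.
print(play(100, 1))    # (0, 0): the $1 offer is rejected
print(play(100, 40))   # (60, 40): a fair-looking offer is accepted
```

The proposer who anticipates fairness concerns leaves money on the table deliberately, and ends up better off than one who follows the classical prediction.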
Likewise, in the context of international development, behavioral game theory can be used to understand how different stakeholders interact and make decisions, and how their behavior might be influenced by cognitive biases and emotions.
Understanding behavioral game theory can be important for development practitioners, as it can help them to design effective negotiation strategies, understand the decision-making process of the stakeholders and anticipate their behavior, and design programs that take into account the strategic context in which they are implemented. It can also be used to understand the role of emotions and biases in cooperation and conflicts in international development projects and programs.
A concrete example of why behavioral game theory matters for development practitioners is negotiating land rights with indigenous communities. Traditional game theory would predict that the community will accept any offer of compensation for their land, since anything is better than nothing. Behavioral game theory, however, suggests that the community may place a high value on the cultural significance of their land and reject offers they perceive as unfair or disrespectful.
Understanding this, development practitioners can use this information to design a negotiation strategy that takes into account the emotional and cultural significance of the land to the community, such as involving representatives of the community in the land-use planning process, or providing alternatives for compensation that respect their cultural heritage. Additionally, practitioners can anticipate the behavior of the stakeholders and design programs that take into account the strategic context in which they are implemented, such as involving community leaders in decision making, and providing alternatives to land rights that are respectful of cultural heritage and fair.
Default Effect
The tendency for people to go along with the default option. This is important in communication for NGOs or governments trying to encourage certain behavior, as people may follow the default option without considering alternatives.
The default effect is a cognitive bias in which individuals tend to go along with the default option, or the option that is pre-selected or pre-set for them. This is because people often have a bias toward the status quo and tend to stick with the existing situation unless there is a compelling reason to do otherwise. The default effect can also be thought of as a form of inertia, where people are reluctant to change their behavior and tend to stick with what is familiar.
For example, when people sign up for a new service, the default option for privacy settings might be to share personal information with third parties. If this is the default option, most people will go along with it, even if they would prefer not to share their information.
In the context of international development, the default effect can play a big role in how people respond to programs or initiatives. For example, if people at a community meeting are told that they are not enrolled in a program unless they come forward to sign up, participation will tend to be lower than if they are told they are enrolled automatically unless they explicitly opt out.
Understanding the default effect can be important for development practitioners, organizations, and governments when trying to encourage certain behaviors, as people may follow the default option without considering alternatives. One way to harness the default effect is to make the desired behavior the default option. It is also helpful to provide clear information and easy-to-understand instructions on how to change the default option where one exists.
Self-Control Problem
The struggle to resist immediate temptations in favor of longer-term goals. This is an important concept to consider when communicating about development projects or programs that have long-term benefits but may require short-term sacrifices.
The self-control problem is a cognitive bias in which individuals struggle to resist immediate temptations in favor of longer-term goals. This bias is rooted in the idea that the human brain has two systems for decision-making: one that is focused on short-term pleasure and immediate reward, and another that is focused on long-term goals and long-term consequences.
When individuals are faced with a temptation, such as the opportunity to indulge in an unhealthy treat or to procrastinate on a task, the brain’s short-term pleasure system can take over and make it difficult to resist. This can be especially challenging when the temptation is immediate, while the benefits of self-control are delayed.
In the context of international development, the self-control problem can become critical for how communities and stakeholders respond to development projects or programs. Many development projects require people to make short-term sacrifices in order to achieve long-term benefits. For example, a community may be asked to reduce their use of natural resources in the short term in order to preserve them for future generations.
Understanding the self-control problem can be important for development practitioners, organizations, and governments when designing programs and initiatives, as it can help them to anticipate potential obstacles and to design strategies that can help people to resist temptations and to stay on track toward their long-term goals. Examples of these strategies include providing incentives or rewards for self-control, increasing accountability and social pressure, and breaking down long-term goals into smaller, more manageable steps.
One example of a community-based self-control strategy in a development context could be a community savings and loan program. This type of program aims to help individuals and families in a community to save money and access credit for investments such as starting or expanding a small business. The program would be run by a community-based organization or a local NGO, and would involve training members of the community on financial management and providing them with small loans. The program would also establish a system of accountability and social pressure by requiring members to save a certain amount of money each month, and by having group meetings where members share their progress and hold each other accountable. Additionally, the program can break down long-term goals by setting small, intermediate savings goals, and tracking progress towards those goals.
This approach not only provides incentives and rewards for self-control, but also increases accountability and social pressure, and breaks down long-term goals into smaller, more manageable steps. Additionally, it emphasizes community building and collective efforts, which can be beneficial in a development context as the community can work together to achieve common goals and support each other in building sustainable livelihoods.
Hyperbolic Discounting
The tendency to strongly prefer small rewards that arrive sooner over larger rewards that arrive later. This is relevant when communicating about development projects that offer small immediate benefits versus larger long-term ones, and it is closely related to the self-control problem above.
Hyperbolic discounting is a cognitive bias in which individuals tend to strongly prefer small rewards that arrive sooner over larger rewards that arrive later. This bias reflects the human tendency to value the present more than the future, and it is thought to be caused by the fact that the brain’s reward system is more sensitive to immediate rewards than to delayed rewards.
For example, a person might choose to receive $100 today rather than $120 in a year, even though $120 is the larger reward. The subjective value of the delayed $120 is discounted so steeply that the immediate $100 feels worth more.
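This choice pattern can be sketched with the standard hyperbolic discounting formula, V = A / (1 + k·D), where D is the delay. The discount rate k = 0.8 per year is an illustrative assumption chosen to reproduce the example in the text, not an empirical estimate.

```python
def hyperbolic_value(amount, delay_years, k=0.8):
    """Present subjective value of a reward received after a delay."""
    return amount / (1 + k * delay_years)

# $100 now beats $120 in a year: 100.0 vs 120/1.8 ~= 66.7
print(hyperbolic_value(100, 0) > hyperbolic_value(120, 1))  # True

# Preference reversal: push both rewards four years into the future
# and the larger, later reward wins again: 100/4.2 ~= 23.8 vs 120/5 = 24.0
print(hyperbolic_value(100, 4) < hyperbolic_value(120, 5))  # True
```

The reversal in the second comparison is the signature of hyperbolic (as opposed to exponential) discounting: preferences between the same pair of rewards flip as both move further into the future, which is why commitments made for the distant future so often break down as the moment of sacrifice approaches.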
In the context of international development, hyperbolic discounting can play a role in how stakeholders respond to development projects. Many development projects require people to make sacrifices in the present in order to achieve benefits in the future. For example, a community may be asked to invest in a long-term sustainable livelihood project that requires short-term sacrifices, such as not cutting down trees, while they see the benefits of the program in the future.
Understanding hyperbolic discounting can be important for development practitioners as it can help them to design effective communication strategies that take into account the fact that people are more likely to prioritize short-term rewards over long-term benefits, and to design interventions that have immediate and visible benefits. Examples include providing incentives or rewards that are delivered sooner or tie benefits to short-term targets or milestones.
A concrete example of delivering rewards sooner and tying benefits to short-term targets or milestones is a performance-based cash transfer program for smallholder farmers. Such a program provides financial incentives to farmers who meet performance benchmarks, such as increasing crop yields, adopting sustainable farming practices, or participating in trainings. It would be implemented by a government agency or development organization, which would regularly monitor the farmers’ progress and provide cash transfers to those who meet the benchmarks.
This approach provides farmers with a direct and immediate financial benefit for meeting short-term targets or milestones, which can increase motivation and participation in the program. Additionally, by providing cash transfers, rather than in-kind rewards or benefits, it allows farmers to use the money in the way that they see fit, which can be more empowering and effective in addressing their specific needs. It also creates a positive feedback loop, as farmers who receive cash transfers are more likely to meet the performance benchmarks in the future, which can result in continued support and improvement in the long run.
Self-Signaling
The use of one’s own behavior to signal one’s own characteristics. This could be used in the communication of development programs that could be perceived as ‘too good to be true’ and to signal the credibility of the program.
Self-signaling refers to the use of one’s own behavior as a signal of one’s underlying characteristics, such as beliefs, values, or abilities. Strictly, behavioral economists use the term for actions that signal these characteristics to oneself, reinforcing self-image, but the same behaviors typically signal to others as well.
For example, if a person is environmentally conscious, they may choose to recycle and reduce their use of plastics. By engaging in these behaviors, they signal to others that they care about the environment. Similarly, if a person is trying to signal that they are successful, they may choose to buy and use luxury goods. By doing so, they are signaling to others that they have the financial means to afford those luxury goods, which is often associated with success.
In the context of international development, self-signaling can play a role in how communities and stakeholders respond to development projects or programs. For example, if a development program is perceived as “too good to be true” or if people question the credibility of the program, self-signaling can be used to signal the credibility of the program or the organization.
Understanding self-signaling can be important for development practitioners, organizations and governments, as it can help them to design effective communication strategies that take into account the importance of signaling, especially in situations where trust and credibility are essential. It can also be used to design programs that align with the values and beliefs of the target population, making the program more appealing and increasing the chances of acceptance and success.
I hope this gives you a sense of some of the key terms and concepts in behavioral economics, and how they relate to communications in international development cooperation.📈
Compilation by Corbecoms from pictures by NASA, Michael Michelovski and Jack Carter, retrieved from Unsplash.