Chapter 11 Quiz: Self-Assessment

Instructions: Answer each question without looking back at the chapter. After completing all questions, check your answers against the key at the bottom. If you score below 70%, revisit the relevant sections before moving on to Chapter 12.


Multiple Choice

Q1. In the one-shot prisoner's dilemma, defection is the rational strategy because:

a) Players are assumed to be immoral
b) Defection yields a better payoff regardless of what the other player does
c) Cooperation is never stable in any game
d) The players cannot communicate

Q2. A Nash equilibrium is a state in which:

a) Both players achieve the best possible outcome
b) Neither player can improve their payoff by unilaterally changing their strategy
c) Both players cooperate perfectly
d) The game ends in a tie

Q3. In Axelrod's tournaments, tit-for-tat won primarily because it was:

a) The most mathematically sophisticated strategy
b) The most aggressive strategy
c) Nice, retaliatory, forgiving, and clear
d) Designed to exploit weaker strategies

Q4. Which of the following is NOT one of tit-for-tat's four key properties?

a) Nice -- never defects first
b) Retaliatory -- punishes defection immediately
c) Deceptive -- hides its strategy from opponents
d) Forgiving -- returns to cooperation after the opponent cooperates

Q5. Quorum sensing in bacteria is an example of:

a) Individual selection against cooperation
b) Cooperative behavior triggered when population density crosses a threshold
c) Central command by a dominant bacterium
d) Random mutation producing cooperative behavior

Q6. Hamilton's rule (rB > C) states that cooperative behavior will spread when:

a) The cost to the cooperator exceeds the benefit to the recipient
b) The benefit to the recipient, weighted by genetic relatedness, exceeds the cost to the cooperator
c) Relatedness between cooperator and recipient is zero
d) The cooperator receives direct compensation

Q7. In Cold War deterrence, Mutual Assured Destruction (MAD) maintained cooperation because:

a) Both superpowers trusted each other's good intentions
b) A central authority enforced disarmament
c) Defection (first strike) was guaranteed to result in catastrophic retaliation
d) Both sides lacked nuclear weapons

Q8. The free rider problem in open source software refers to:

a) Contributors who write too much code
b) Users who benefit from the software without contributing to its development
c) Developers who charge for free software
d) Bugs that appear in open source but not proprietary software

Q9. Open source communities sustain cooperation primarily through:

a) Legally binding contracts between all contributors
b) Central government mandates to contribute
c) Indirect reciprocity (reputation), modularity, and shared tools
d) Direct payment to all contributors

Q10. Coral bleaching can be understood in game-theoretic terms as:

a) A defection cascade caused by environmental change altering the payoff structure
b) A successful cooperative strategy by the coral
c) Evidence that mutualism is always stable
d) An example of the free rider problem

Q11. In a blockchain network, honest mining is the Nash equilibrium because:

a) Miners are legally required to be honest
b) Miners know each other personally and maintain reputations
c) Dishonesty costs more than honesty due to the incentive structure of the protocol
d) A central authority verifies all transactions

Q12. Mechanism design is best described as:

a) Analyzing existing games to predict outcomes
b) Engineering rules and incentives so that self-interested behavior produces cooperative outcomes
c) Designing computer hardware for game simulations
d) Creating games for entertainment purposes

Q13. Garrett Hardin's "Tragedy of the Commons" describes:

a) A situation where shared resources are sustainably managed
b) The overuse and destruction of shared resources when each user bears only a fraction of the cost
c) The success of privatization in all contexts
d) The failure of all cooperation mechanisms

Q14. Elinor Ostrom's research showed that the tragedy of the commons can be avoided through:

a) Only privatization of shared resources
b) Only government regulation and enforcement
c) Self-governing communities that develop their own rules for managing shared resources
d) Eliminating all shared resources

Q15. Which of Ostrom's design principles most closely parallels tit-for-tat's "forgiving" property?

a) Clearly defined boundaries
b) Graduated sanctions
c) Conflict resolution mechanisms
d) Rules match local conditions

Q16. According to the chapter's threshold concept, cooperation is best understood as:

a) A moral achievement that requires altruistic individuals
b) An emergent equilibrium that arises from the structure of repeated interactions
c) Something that requires a central authority to enforce
d) An anomaly that contradicts rational self-interest

Q17. Which of Nowak's five mechanisms for the evolution of cooperation does NOT require repeated interaction between the same individuals?

a) Direct reciprocity
b) Indirect reciprocity
c) Kin selection
d) All five require repeated interaction

Q18. The difference between direct reciprocity and indirect reciprocity is:

a) Direct reciprocity involves money; indirect reciprocity does not
b) In direct reciprocity, you cooperate because the other person cooperated with you; in indirect reciprocity, you cooperate because your reputation is at stake
c) Direct reciprocity is biological; indirect reciprocity is cultural
d) There is no meaningful difference between them

Q19. In the cleaner fish example, audience effects (being watched by potential future clients) help maintain cooperation. This is an instance of:

a) Kin selection
b) Direct reciprocity
c) Indirect reciprocity
d) Group selection

Q20. The chapter argues that cooperation does not require trust. Which of the following systems best illustrates this claim?

a) A family sharing household chores
b) A blockchain network maintaining a ledger among anonymous participants
c) A group of friends cooperating on a road trip
d) A married couple dividing responsibilities


Short Answer

Q21. Explain in two to three sentences why the one-shot prisoner's dilemma has a different outcome than the iterated prisoner's dilemma.

Q22. Give one example from the chapter of a system where cooperation broke down. What structural condition failed?

Q23. In your own words, explain what "incentive compatibility" means and give one example.

Q24. Why did Ostrom's work challenge Hardin's claim that the tragedy of the commons had only two solutions?

Q25. Name one forward connection mentioned in the chapter. What future chapter is referenced, and what concept from Chapter 11 will it build upon?


Answer Key

Multiple Choice:

Q1: b -- Defection is a dominant strategy (better regardless of the other player's choice). (Section 11.1)

Q2: b -- Nash equilibrium is defined as a state where no player can improve by unilaterally changing strategy. (Section 11.1)
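Both definitions can be verified mechanically. The sketch below uses illustrative prisoner's dilemma payoffs (T=5, R=3, P=1, S=0; these specific numbers are an assumption, not taken from the chapter) to confirm that defection dominates and that mutual defection is the only Nash equilibrium:

```python
# Illustrative one-shot prisoner's dilemma payoffs (assumed values).
# payoff[a][b] = payoff to a player choosing a against an opponent choosing b.
C, D = "cooperate", "defect"
payoff = {C: {C: 3, D: 0}, D: {C: 5, D: 1}}

# Defection is dominant: it pays more against every possible opponent choice.
dominant = all(payoff[D][opp] > payoff[C][opp] for opp in (C, D))

def is_nash(a, b):
    """A strategy pair is a Nash equilibrium if neither player gains by
    unilaterally switching to the other strategy."""
    return (payoff[a][b] >= max(payoff[x][b] for x in (C, D)) and
            payoff[b][a] >= max(payoff[x][a] for x in (C, D)))

equilibria = [(a, b) for a in (C, D) for b in (C, D) if is_nash(a, b)]
print(dominant)    # True
print(equilibria)  # [('defect', 'defect')]
```

Note that the unique equilibrium, mutual defection, is worse for both players than mutual cooperation, which is exactly the dilemma.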

Q3: c -- Axelrod identified these four properties as the keys to tit-for-tat's success. (Section 11.2)

Q4: c -- Tit-for-tat is clear (transparent), not deceptive. Clarity is one of its four key properties. (Section 11.2)

Q5: b -- Quorum sensing coordinates cooperative gene expression when bacterial density crosses a threshold. (Section 11.3)

Q6: b -- Hamilton's rule: cooperation spreads when relatedness times benefit exceeds cost (rB > C). (Section 11.3)
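The inequality is simple enough to check numerically. A minimal sketch, with hypothetical fitness values (only r = 0.5 for full siblings is standard; the benefit and cost figures are made up for illustration):

```python
def hamilton_favors(r, benefit, cost):
    """Hamilton's rule: cooperation spreads when rB > C, i.e. when the
    benefit to the recipient, weighted by relatedness, exceeds the cost."""
    return r * benefit > cost

# Helping a full sibling (r = 0.5): benefit 3 at a personal cost of 1.
print(hamilton_favors(0.5, 3, 1))  # True: 0.5 * 3 = 1.5 > 1
# Helping an unrelated stranger (r = 0) at any positive cost is never favored.
print(hamilton_favors(0.0, 3, 1))  # False
```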

Q7: c -- MAD maintained cooperation by ensuring that defection (first strike) would be met with guaranteed annihilation. (Section 11.4)

Q8: b -- Free riders use the software without contributing, which standard economics predicts should undermine production. (Section 11.5)

Q9: c -- Open source communities use reputation, modular architecture, and shared tools to sustain cooperation. (Section 11.5)

Q10: a -- Bleaching occurs when heat stress changes the payoff structure, turning mutualism into a harmful relationship. (Section 11.6)

Q11: c -- The protocol's incentive structure makes honesty more profitable than dishonesty under normal conditions. (Section 11.7)

Q12: b -- Mechanism design engineers rules so that self-interested behavior aligns with desired outcomes. (Section 11.7)

Q13: b -- Hardin described the overuse of shared resources when costs are shared but benefits are individual. (Section 11.9)

Q14: c -- Ostrom documented successful self-governance of commons by communities using eight design principles. (Section 11.10)

Q15: b -- Graduated sanctions mirror tit-for-tat's forgiveness: first offenses are punished lightly, repeated offenses more harshly. (Section 11.10)

Q16: b -- The threshold concept frames cooperation as an emergent equilibrium from game structure, not moral virtue. (Section 11.11)

Q17: c -- Kin selection requires only genetic relatedness, not repeated interaction between the same individuals. (Section 11.8)

Q18: b -- Direct reciprocity is bilateral (I cooperate because you did); indirect reciprocity is community-mediated (I cooperate because of my reputation). (Section 11.8)

Q19: c -- Audience effects are a form of indirect reciprocity: behavior is shaped by observers, not just the direct partner. (Section 11.6)

Q20: b -- Blockchain enables cooperation among anonymous, untrusting parties through mechanism design rather than trust. (Section 11.7)

Short Answer Rubric:

Q21: In a one-shot game, there is no future interaction to incentivize cooperation, so defection dominates. In an iterated game, the shadow of the future creates incentives to cooperate: defection is punished in subsequent rounds, and mutual cooperation yields higher cumulative payoffs. The possibility of future rounds changes the cost-benefit calculation.
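The changed cost-benefit calculation can be made concrete with a short simulation, again assuming the standard illustrative payoffs (T=5, R=3, P=1, S=0; not specified in the chapter):

```python
# PAYOFF[(a, b)] = (payoff to player choosing a, payoff to player choosing b).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds):
    """Return cumulative payoffs for two strategies over repeated rounds.
    Each strategy sees only the opponent's history of moves."""
    score_a = score_b = 0
    history_a, history_b = [], []
    for _ in range(rounds):
        move_a, move_b = strategy_a(history_b), strategy_b(history_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pa, score_b + pb
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]  # nice, retaliatory, forgiving
always_defect = lambda opp: "D"

# In a single round, defecting against a cooperator wins...
print(play(tit_for_tat, always_defect, 1))     # (0, 5)
# ...but over many rounds, mutual cooperation outscores mutual defection.
print(play(tit_for_tat, tit_for_tat, 10))      # (30, 30)
print(play(always_defect, always_defect, 10))  # (10, 10)
```

The same payoff table produces opposite incentives depending on whether the game is played once or repeatedly, which is the rubric's central point.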

Q22: Acceptable examples include coral bleaching (environmental change altered payoff structure), the Cold War near-misses (false positives in warning systems), or tragedy of the commons scenarios. The key is identifying which structural condition (repeated interaction, detection, punishment, or payoff alignment) failed.

Q23: Incentive compatibility means that the rules of a system are designed so that each participant's self-interested behavior also serves the system's goals. Example: in blockchain, miners earn rewards by validating honest transactions, so self-interest (earning rewards) aligns with the system's goal (maintaining an honest ledger).

Q24: Hardin claimed that commons could only be managed through privatization or government regulation. Ostrom documented real-world communities that successfully managed commons through self-governance -- developing their own rules, monitoring, and sanctions without either private ownership or external government control. This demonstrated a third option Hardin had not considered.

Q25: Acceptable forward connections include Chapter 14 (Overfitting -- overly specific rules creating loopholes for defectors), Chapter 17 (Scaling -- cooperative structures failing at larger scales), or Chapter 22 (Heuristics and Biases -- psychology of cooperation and fairness intuitions).