Chapter 5 Quiz: Ethics of Security Research

Multiple Choice Questions

1. Coordinated Vulnerability Disclosure (CVD) is best described as:
   a) Publishing vulnerability details immediately without notifying the vendor
   b) Reporting the vulnerability to the vendor and working with them on a fix, with a disclosure deadline
   c) Selling the vulnerability to the highest bidder
   d) Reporting the vulnerability exclusively to a government agency

2. Google Project Zero's current disclosure policy (as of the 90+30 iteration) provides:
   a) 90 days for the vendor to patch, plus 30 additional days before technical details are published if the patch is released on time
   b) 90 days total, with no extensions under any circumstances
   c) 30 days for patching and 90 days for technical detail publication
   d) Unlimited time as long as the vendor demonstrates good-faith progress

3. Which organization acts as a neutral intermediary to coordinate vulnerability disclosures between researchers and vendors?
   a) NSA
   b) CERT/CC (Carnegie Mellon's CERT Coordination Center)
   c) Zerodium
   d) Project Zero

4. Zerodium's published prices for zero-day exploits are generally:
   a) Lower than vendor bug bounty programs
   b) Comparable to vendor bug bounty programs
   c) Significantly higher than most vendor bug bounty programs
   d) Only available to academic researchers

5. The U.S. government's Vulnerabilities Equities Process (VEP) is designed to:
   a) Purchase all zero-day vulnerabilities discovered by U.S. researchers
   b) Weigh the intelligence value of retaining a vulnerability against the defensive benefit of disclosing it to the vendor
   c) Prosecute researchers who sell vulnerabilities to foreign governments
   d) Certify that vulnerability disclosures comply with the CFAA

6. The WannaCry ransomware attack of 2017 was enabled by:
   a) A vulnerability discovered by Google Project Zero
   b) The EternalBlue exploit, developed by the NSA and leaked by the Shadow Brokers
   c) A zero-day purchased from Zerodium
   d) A vulnerability disclosed through full disclosure on Bugtraq

7. The Wassenaar Arrangement's "intrusion software" provisions were controversial because:
   a) They were too narrow, exempting most offensive security tools
   b) They were broad enough to potentially cover legitimate penetration testing tools like Metasploit
   c) They only applied to U.S.-based researchers
   d) They required all security tools to be open source

8. Which ethical framework evaluates actions based on whether they produce the greatest good for the greatest number of people?
   a) Deontological ethics
   b) Virtue ethics
   c) Utilitarianism (Consequentialism)
   d) Social contract theory

9. Dan Kaminsky's handling of the 2008 DNS vulnerability is considered a model for coordinated disclosure primarily because:
   a) He published full details immediately to force rapid patching
   b) He coordinated with DNS vendors worldwide to develop and deploy a simultaneous multi-vendor patch before publishing technical details
   c) He sold the vulnerability to the U.S. government for use in intelligence operations
   d) He kept the vulnerability secret indefinitely

10. The CVE (Common Vulnerabilities and Exposures) system is maintained by:
   a) Google Project Zero
   b) The NSA
   c) MITRE Corporation
   d) HackerOne

Short Answer Questions

11. Explain the "researcher's paradox" described in this chapter. How does this paradox create an inherent ethical tension in security research?

12. A security researcher discovers a critical vulnerability in a widely used open-source library. The library is maintained by a single volunteer developer who responds to the researcher's disclosure email by saying: "I know about this issue but I don't have time to fix it. I might get to it in six months." Using two different ethical frameworks, analyze what the researcher should do next.

13. Compare and contrast the ethical positions of (a) a researcher who reports a vulnerability to a vendor's bug bounty program for $5,000, (b) a researcher who sells the same vulnerability to Zerodium for $500,000, and (c) a researcher who publishes the vulnerability on Full Disclosure without notifying the vendor. Which position do you find most ethically defensible, and why?

14. Describe three specific safeguards that a security researcher should implement when writing and publishing proof-of-concept exploit code to minimize the risk of misuse.

15. The text discusses the "front page test" as a tool for ethical decision-making. Describe a scenario in which a security research activity would pass the "front page test" but still be ethically questionable, and explain why.

16. Explain the concept of the Vulnerabilities Equities Process (VEP) and describe two specific criticisms that have been leveled against it. Include a reference to the Shadow Brokers incident in your answer.

17. A penetration testing firm is approached by a government client who wants the firm to develop custom exploit tools that will be used for "authorized law enforcement operations." The firm has no visibility into how the tools will actually be used. Apply virtue ethics and care ethics to this scenario. What should the firm do?

18. Why is it important for security researchers to develop a personal code of ethics, rather than relying solely on professional codes of conduct (like ISC2's or EC-Council's)? Give two specific examples of ethical dilemmas that professional codes do not adequately address.
Answer Key

1. b) Reporting the vulnerability to the vendor and working with them on a fix, with a disclosure deadline

2. a) 90 days for the vendor to patch, plus 30 additional days before technical details are published if the patch is released on time

3. b) CERT/CC (Carnegie Mellon's CERT Coordination Center)

4. c) Significantly higher than most vendor bug bounty programs

5. b) Weigh the intelligence value of retaining a vulnerability against the defensive benefit of disclosing it to the vendor

6. b) The EternalBlue exploit, developed by the NSA and leaked by the Shadow Brokers

7. b) They were broad enough to potentially cover legitimate penetration testing tools like Metasploit

8. c) Utilitarianism (Consequentialism)

9. b) He coordinated with DNS vendors worldwide to develop and deploy a simultaneous multi-vendor patch before publishing technical details

10. c) MITRE Corporation

11. The researcher's paradox is the inherent tension that security researchers must first find ways to make systems less secure (by discovering vulnerabilities) in order to make them more secure (by getting those vulnerabilities fixed). This creates ethical tension because the act of discovering a vulnerability creates a new risk — the risk that the discovery itself could be stolen, leaked, or misused before a fix is available. The researcher simultaneously serves as both a potential threat (by possessing knowledge that could be weaponized) and a protector (by enabling the vulnerability to be patched).

12. Under utilitarianism, the researcher should consider the consequences: waiting six months leaves millions of users vulnerable, so the researcher might publish a limited advisory (without full exploit details) to pressure the developer and enable users to implement workarounds, while also seeking other developers who could contribute a patch. Under deontological ethics, the researcher has a duty to protect users and to respect the developer's autonomy. The researcher might set a firm disclosure deadline (e.g., 90 days) and offer to help develop the fix, while preparing to disclose if the deadline passes — because the duty to protect users from known risk outweighs the duty to give the developer unlimited time.

13. The bug bounty researcher acts within the established disclosure ecosystem, receives fair (if modest) compensation, and ensures the vulnerability is fixed. The Zerodium seller receives higher compensation but loses control over how the exploit is used — it may be deployed against innocent targets by government clients. The full disclosure publisher may force rapid patching through public pressure but also gives attackers immediate access to the vulnerability. The bug bounty approach is most ethically defensible because it directly results in the vulnerability being fixed, aligns the researcher's financial interests with users' safety interests, and does not create a weapon that could be used against vulnerable populations. However, the systemic undervaluation of vulnerabilities by vendor bounty programs is an ethical issue in itself.

14. Three safeguards: (1) Minimize functionality — PoC code should demonstrate the vulnerability's existence (e.g., popping a calculator or printing a message) without including weaponized capabilities (reverse shells, data exfiltration). (2) Delay publication — publish PoC code only after the vendor has released a patch and users have had reasonable time to apply it. (3) Provide context — publish the PoC alongside defensive guidance, detection signatures, and mitigation advice so that defenders can use it to protect their systems.

15. A researcher who discovers a vulnerability in voting machine software and coordinates responsibly with the manufacturer would likely pass the front page test, since the public would view this as beneficial research. However, if the disclosure occurs one week before a national election, even responsible disclosure could undermine public confidence in the electoral process, potentially causing more harm than the vulnerability itself. The scenario illustrates that the front page test measures public perception, not actual ethical merit, and it can mislead when the ethical issues are complex or the harmful consequences are delayed or indirect.

16. The VEP is a U.S. government process for deciding whether to disclose discovered vulnerabilities to vendors (for patching) or retain them for intelligence/military use. Criticism 1: The process lacks transparency — the public cannot verify that decisions are being made in the public interest rather than in intelligence agencies' institutional interests. Criticism 2: The government's track record of protecting its vulnerability stockpile is poor — the Shadow Brokers' theft and publication of NSA hacking tools (including EternalBlue) led directly to WannaCry and NotPetya, causing billions in damages and demonstrating that stockpiled vulnerabilities create systemic risk for everyone.

17. Under virtue ethics, a virtuous firm would demonstrate prudence, integrity, and justice. Without visibility into actual use, the firm cannot ensure its tools serve just purposes. A virtuous firm would either decline the engagement or insist on transparency and contractual guardrails restricting use to lawful purposes. Under care ethics, which prioritizes protecting vulnerable parties, the firm should consider who might be harmed by the tools — potential targets include journalists, activists, and dissidents in authoritarian regimes. Without assurance that the tools will not be used against vulnerable populations, care ethics counsels declining the engagement.

18. Professional codes of conduct are necessarily general and cannot address every scenario a researcher will face. Example 1: ISC2's code requires members to "protect society" and "act legally," but does not address the specific question of whether to sell vulnerabilities to exploit brokers (which may be legal but ethically questionable). Example 2: EC-Council's code emphasizes authorized access and privacy, but does not address the ethical implications of discovering that a client is violating the law during an authorized pentest — a personal code must address competing loyalties between client confidentiality and public safety. A personal code fills these gaps by providing principles tailored to the researcher's specific work and values.