Quiz: Surveillance: From Panopticon to Platform

Test your understanding before moving to the next chapter. Target: 70% or higher to proceed.


Section 1: Multiple Choice (1 point each)

1. Jeremy Bentham's panopticon was designed so that:

  • A) Every prisoner could see the guard at all times, creating a sense of protection and reassurance.
  • B) Prisoners could be observed at any time without knowing whether they were currently being watched, inducing self-regulation.
  • C) Guards could communicate with prisoners through a central loudspeaker system to issue commands.
  • D) Prisoners were kept in total darkness so they could not see one another or the guards.
Answer **B)** Prisoners could be observed at any time without knowing whether they were currently being watched, inducing self-regulation. *Explanation:* Section 8.1 describes Bentham's 1791 design: a circular building with cells arranged around a central inspection tower. The tower's windows were designed so that prisoners could see the tower but could not see whether a guard was inside. The genius of the design was economic — you did not need a guard for every prisoner, because the *possibility* of observation produced compliance. This principle — that the awareness of potential surveillance disciplines behavior — is the foundation of Foucault's analysis in Section 8.2.

2. Foucault's analysis of the panopticon in Discipline and Punish is primarily concerned with:

  • A) The engineering specifications of prison architecture and how they could be improved.
  • B) Whether Bentham's design was more cost-effective than traditional prisons.
  • C) How the panoptic principle extends beyond prisons to produce a society-wide mechanism of disciplinary power through internalized surveillance.
  • D) The moral rights of prisoners to privacy and freedom from observation.
Answer **C)** How the panoptic principle extends beyond prisons to produce a society-wide mechanism of disciplinary power through internalized surveillance. *Explanation:* Section 8.2 explains that Foucault's interest in the panopticon was not architectural but sociological. He used the panopticon as a metaphor for modern disciplinary institutions — schools, hospitals, factories, barracks — that produce "docile bodies" through the internalization of surveillance norms. The key insight is that power does not require constant coercion; when individuals believe they *might* be observed, they regulate their own behavior. Foucault called this a shift from sovereign power (visible, violent, intermittent) to disciplinary power (invisible, normalized, continuous).

3. Which of the following best describes the NSA program known as PRISM, as revealed by Edward Snowden?

  • A) A program that placed physical wiretaps on telephone lines of suspected terrorists.
  • B) A program through which the NSA obtained direct access to user data from major technology companies, including email, chat logs, stored data, and file transfers.
  • C) A public-facing transparency initiative in which the NSA disclosed its surveillance practices to Congress.
  • D) A program that used AI to predict criminal activity before it occurred.
Answer **B)** A program through which the NSA obtained direct access to user data from major technology companies, including email, chat logs, stored data, and file transfers. *Explanation:* Section 8.4 describes PRISM as one of the key programs revealed in the Snowden disclosures of June 2013. PRISM enabled the NSA to collect data directly from the servers of companies including Google, Facebook, Apple, Microsoft, and Yahoo, pursuant to Section 702 of the FISA Amendments Act. The program's scope was vast: it captured communications of non-U.S. persons abroad but inevitably swept up enormous quantities of Americans' data as well. Option A describes older wiretapping methods. Options C and D are fabricated.

4. Roger Clarke's concept of "dataveillance" (1988) refers specifically to:

  • A) The use of video cameras to monitor physical spaces.
  • B) The systematic use of personal data systems to monitor and regulate the actions of individuals.
  • C) Government censorship of digital content.
  • D) The practice of citizens recording police activity on smartphones.
Answer **B)** The systematic use of personal data systems to monitor and regulate the actions of individuals. *Explanation:* Section 8.5 introduces Clarke's concept: dataveillance is surveillance not through watching bodies but through monitoring data trails — transactions, communications metadata, location records, browsing histories. Clarke argued in 1988 that this was qualitatively different from physical surveillance because it operates at scale (millions of people can be monitored simultaneously), is largely invisible to the subject, and creates permanent records. Option A describes traditional CCTV. Option D describes "sousveillance" — watching the watchers — which is a distinct concept discussed in Section 8.8.
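The scale-and-invisibility point can be made concrete with a small sketch. The data below is entirely hypothetical, and the code is only an illustration of the principle: no camera ever observes the subject, yet a sensitive pattern of life emerges from records alone.

```python
from collections import Counter

# Hypothetical location "pings" (place, hour of day) -- the kind of data
# trail Clarke's dataveillance concept describes.
pings = [
    ("home", 7), ("clinic", 9), ("home", 20),
    ("home", 7), ("clinic", 9), ("home", 21),
    ("home", 7), ("office", 9), ("home", 20),
]

# Count how often each (place, hour) pair recurs.
routine = Counter(pings)

# Any recurring pair is an inferred habit -- here, regular morning visits
# to a clinic: a sensitive fact that was never directly observed.
habits = [visit for visit, count in routine.items() if count >= 2]
print(habits)
```

The same loop runs unchanged over records for one person or for millions, which is precisely the qualitative difference Clarke identified.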

5. Shoshana Zuboff's concept of "surveillance capitalism" argues that:

  • A) Governments use capitalism to fund surveillance programs.
  • B) Surveillance is an unfortunate side effect of digital commerce that companies try to minimize.
  • C) Tech companies have created a new economic logic in which human experience is claimed as free raw material for extraction, prediction, and sale.
  • D) Consumers willingly trade their data for services in a fair market exchange.
Answer **C)** Tech companies have created a new economic logic in which human experience is claimed as free raw material for extraction, prediction, and sale. *Explanation:* Section 8.5.2 presents Zuboff's argument from *The Age of Surveillance Capitalism* (2019). Zuboff contends that companies like Google and Facebook pioneered a new form of capitalism in which "behavioral surplus" — the data generated by users beyond what is needed to improve services — is extracted, analyzed, and sold as prediction products to advertisers and other buyers. This is not a fair exchange (D) because users do not understand what they give up, and it is not a side effect (B) but the core business model. Option A reverses the relationship Zuboff describes.

6. Eli learns that Detroit's Project Green Light provides police with real-time access to surveillance cameras at participating businesses. According to the chapter, the most significant concern raised by community members about this program is:

  • A) The cameras are too expensive for taxpayers.
  • B) The cameras produce low-quality images that are useless for investigations.
  • C) The program concentrates surveillance in predominantly Black neighborhoods without meaningful community consent, extending police monitoring into everyday commercial spaces.
  • D) The cameras violate the Fourth Amendment because they are placed on private property.
Answer **C)** The program concentrates surveillance in predominantly Black neighborhoods without meaningful community consent, extending police monitoring into everyday commercial spaces. *Explanation:* Section 8.6.2 discusses Project Green Light as an example of surveillance infrastructure that disproportionately affects communities of color. The program installs high-definition cameras with green flashing lights at gas stations, restaurants, and retail stores — spaces that residents must use for daily life — with real-time feeds to a police command center. Critics, including civil rights organizations and community members, argue that the program normalizes constant police surveillance in Black neighborhoods while similar programs are not deployed in wealthier, whiter areas. The concern is not cost (A) or image quality (B) but the racialized distribution of surveillance and the absence of community input.

7. The chapter argues that the distinction between the "surveillance state" (government monitoring) and "surveillance capitalism" (corporate data extraction) is collapsing. Which of the following best illustrates this convergence?

  • A) A government agency builds its own social media platform to monitor citizens directly.
  • B) A law enforcement agency purchases commercially available location data from data brokers, bypassing the warrant requirement that would apply to direct government surveillance.
  • C) A tech company donates computers to a government office.
  • D) A university uses both government and corporate software in its classrooms.
Answer **B)** A law enforcement agency purchases commercially available location data from data brokers, bypassing the warrant requirement that would apply to direct government surveillance. *Explanation:* Section 8.7 describes the commercial-government surveillance pipeline: companies collect vast quantities of personal data through apps, devices, and platforms; data brokers aggregate and sell this data; and government agencies purchase it, obtaining surveillance capabilities that would require a warrant if conducted directly. This is not a hypothetical — the chapter cites cases in which the U.S. Department of Homeland Security, IRS, and FBI purchased location data from commercial brokers. The convergence lies in the fact that the legal protections designed to constrain government surveillance (the Fourth Amendment, warrant requirements) are circumvented through the commercial data market.

8. The "chilling effect" of surveillance, as discussed in the chapter, refers to:

  • A) The physical discomfort caused by surveillance camera equipment in cold weather.
  • B) The tendency of individuals to self-censor, avoid lawful activities, or suppress dissent when they know or suspect they are being monitored.
  • C) The slowing of economic growth caused by excessive regulation of surveillance technology.
  • D) The decrease in crime rates caused by visible surveillance cameras.
Answer **B)** The tendency of individuals to self-censor, avoid lawful activities, or suppress dissent when they know or suspect they are being monitored. *Explanation:* Section 8.7.2 discusses the chilling effect as one of the most significant harms of surveillance. When people know they are watched, they modify their behavior — not just illegal behavior but lawful behavior that they fear might attract scrutiny. Researchers, journalists, activists, lawyers, and ordinary people may avoid certain searches, conversations, associations, or movements. The PEN America study cited in the chapter found that after the Snowden revelations, writers in the United States reported self-censoring their internet searches, avoiding certain topics, and declining to communicate with sources abroad. This behavioral change is the panopticon at work — Foucault's insight realized in digital form.

9. Mira encounters a debate at VitraMed about whether continuous patient monitoring through wearable devices constitutes "surveillance." Which of the following arguments from the chapter supports the claim that it does?

  • A) VitraMed's system records patient behavior and uses it to generate alerts about non-compliance, creating a power asymmetry between the observed patient and the observing institution.
  • B) VitraMed's system uses the same camera technology as CCTV.
  • C) VitraMed's system is operated by the government.
  • D) VitraMed's patients are forced to wear the devices against their will.
Answer **A)** VitraMed's system records patient behavior and uses it to generate alerts about non-compliance, creating a power asymmetry between the observed patient and the observing institution. *Explanation:* Section 8.6.1 explores the question of whether health monitoring is surveillance. The chapter argues that the defining features of surveillance are not limited to cameras or government actors but include any system that (1) continuously monitors behavior, (2) creates a record of that behavior, (3) enables institutional evaluation of the individual, and (4) creates a power asymmetry between the observer and the observed. VitraMed's system meets all four criteria: it tracks patient activity, flags "non-compliance," and positions the institution as an evaluator of the patient's behavior. The beneficent purpose (improved health outcomes) does not eliminate the surveillance dynamic — it complicates it. Options B, C, and D misidentify the relevant criteria.

10. Which of the following is NOT identified in the chapter as a tool of resistance against surveillance?

  • A) End-to-end encryption
  • B) Anonymity networks such as Tor
  • C) Surveillance impact assessments and community oversight boards
  • D) Installing more surveillance cameras to create "mutual transparency"
Answer **D)** Installing more surveillance cameras to create "mutual transparency." *Explanation:* Section 8.8 discusses several categories of resistance: technical tools (encryption, Tor, VPNs, metadata-minimizing protocols), legal tools (surveillance impact assessments, warrant requirements, community oversight ordinances), and social tools (sousveillance, community organizing, policy advocacy). The chapter does not advocate for "more cameras" as a solution. While sousveillance — citizens recording police or institutions — is discussed, it is framed as counter-surveillance by the less powerful, not as an argument for proliferating cameras. The idea that "mutual transparency" solves the power asymmetry of surveillance is addressed and critiqued: transparency is not symmetrical when one party is an institution with vast resources and the other is an individual.

Section 2: True/False with Justification (1 point each)

For each statement, determine whether it is true or false and provide a brief justification.

11. "The panopticon was never actually built during Bentham's lifetime, but Foucault argued its principles became embedded in modern institutions regardless."

Answer **True.** *Explanation:* Section 8.1 notes that Bentham spent years promoting his panopticon design but never secured funding to build it as a prison in England during his lifetime (partial panoptic structures were later built in other countries). Foucault's argument in Section 8.2 was precisely that the *principle* mattered more than the building: panoptic logic — the idea that the possibility of observation produces self-discipline — was adopted by schools, hospitals, factories, and military institutions regardless of whether they used Bentham's circular architecture. The panopticon as metaphor proved far more influential than the panopticon as building.

12. "The Snowden revelations demonstrated that the NSA's surveillance programs were conducted without any legal authorization whatsoever."

Answer **False.** *Explanation:* Section 8.4 explains that the NSA's programs — including PRISM and the bulk metadata collection program — operated under legal authorities, principally Section 215 of the USA PATRIOT Act and Section 702 of the FISA Amendments Act. The programs were authorized by the Foreign Intelligence Surveillance Court (FISC), a secret court whose proceedings and opinions were classified. What Snowden revealed was not that the programs were *illegal* per se, but that they were far more expansive than the public or most members of Congress understood, that the legal interpretations authorizing them were secret, and that the oversight mechanisms were inadequate. The distinction matters: the scandal was not lawlessness but rather the vast gap between public understanding and classified legal reality.

13. "Surveillance capitalism, as Zuboff defines it, is simply digital advertising — companies show you ads based on your data."

Answer **False.** *Explanation:* Section 8.5.2 explains that Zuboff's concept goes well beyond advertising. Surveillance capitalism involves the extraction of "behavioral surplus" — data about human behavior that exceeds what is needed to improve services — and its transformation into prediction products that anticipate what individuals will do, think, feel, and buy. While advertising is the most visible manifestation, Zuboff argues that the logic extends to insurance pricing, political campaigns, workplace management, and any domain where prediction of human behavior has economic value. The deeper claim is that surveillance capitalism represents a new form of power — the power to shape and modify behavior at scale — not merely a new form of marketing.

14. "According to the chapter, facial recognition technology performs equally well across all demographic groups, so concerns about racial bias in its deployment are primarily about how the technology is used rather than how it performs."

Answer **False.** *Explanation:* Section 8.6 discusses the technical accuracy disparities documented in multiple studies, including NIST's 2019 evaluation and the "Gender Shades" study by Buolamwini and Gebru. These studies found significantly higher error rates for darker-skinned individuals and for women compared to lighter-skinned men. The NIST study found false positive rates up to 100 times higher for certain demographic groups. This means the technology itself performs unequally — the concern is not only about deployment but about the technical foundation. The chapter argues that both dimensions matter: a biased tool deployed in a biased system compounds harm.

15. "The chapter argues that encryption is a complete solution to the surveillance problem because encrypted communications cannot be read by any third party."

Answer **False.** *Explanation:* Section 8.8 presents encryption as an important but limited tool. While end-to-end encryption protects the *content* of communications, it does not hide *metadata* — who communicated with whom, when, for how long, and from what location. As the chapter notes (echoing the discussion in Chapter 1), metadata alone can reveal intimate details about a person's life. Furthermore, encryption does not protect against surveillance at the device level (if a device is compromised, encryption is bypassed), does not address the collection of behavioral data by platforms, and does not solve the structural power asymmetries that enable surveillance. The chapter describes encryption as "necessary but not sufficient."
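The content/metadata distinction can be shown in miniature. This is a toy model, not real cryptography (the "ciphertext" is a stand-in hash), and the field names are illustrative: the point is that routing metadata must remain readable for the network to deliver a message, and that is exactly what end-to-end encryption does not hide.

```python
import hashlib

def make_message(sender, recipient, body):
    """Toy model of an end-to-end-encrypted message in transit."""
    return {
        "from": sender,            # visible metadata: who
        "to": recipient,           # visible metadata: whom
        "sent_at": 1700000000,     # visible metadata: when (fixed for the demo)
        "size": len(body),         # visible metadata: how much
        # Stand-in for ciphertext: opaque to any carrier or observer.
        "body": hashlib.sha256(body.encode()).hexdigest(),
    }

msg = make_message("alice", "bob", "meet at the clinic at 9am")

# What a network observer learns without ever decrypting the content:
observed = {k: v for k, v in msg.items() if k != "body"}
print(observed)
```

Repeated over weeks of traffic, the `observed` fields alone reveal associations, schedules, and locations — the chapter's point that encryption is "necessary but not sufficient."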

Section 3: Short Answer (2 points each)

16. Explain the difference between "sousveillance" and "surveillance" as described in the chapter. Provide one example of sousveillance and explain the power dynamic it seeks to reverse.

Sample Answer
Surveillance is the observation of the many by the few — those with institutional power monitoring those without it (governments watching citizens, employers monitoring workers, platforms tracking users). Sousveillance, a term coined by Steve Mann, is the inverse: the observation of the powerful by the less powerful, or "watching from below." An example is citizens using smartphone cameras to record police conduct during arrests or protests. This reverses the traditional power dynamic by making the actions of authorities visible and accountable to the public. The chapter notes that sousveillance has been instrumental in documenting police violence, but it also has limitations — recording powerful institutions can be dangerous for the person doing the recording, and institutions can retaliate against those who expose their conduct.

*Key points for full credit:*
- Defines both terms correctly and notes the directional difference (top-down vs. bottom-up)
- Provides a concrete example of sousveillance
- Identifies the power dynamic at stake

17. The chapter discusses the Five Eyes alliance (Section 8.4.2). Explain what the Five Eyes is, why it matters for understanding surveillance governance, and how intelligence-sharing arrangements can circumvent domestic legal protections.

Sample Answer
The Five Eyes is an intelligence-sharing alliance comprising the United States, the United Kingdom, Canada, Australia, and New Zealand. Originating from World War II-era signals intelligence cooperation, it has evolved into the most extensive intelligence-sharing partnership in the world. It matters for surveillance governance because it enables member states to share intercepted communications and surveillance data across borders. This creates a governance loophole: if Country A's domestic law prohibits it from surveilling its own citizens without a warrant, Country A can potentially receive that same data from Country B, which collected it under different legal authorities. The chapter argues that such arrangements effectively allow governments to outsource surveillance to jurisdictions with fewer protections, circumventing the domestic legal constraints designed to protect citizens' rights. This makes purely national governance of surveillance inadequate — the problem is inherently transnational.

*Key points for full credit:*
- Identifies the Five Eyes members correctly
- Explains the intelligence-sharing mechanism
- Articulates how cross-border sharing can circumvent domestic legal protections

18. Using Section 8.6.2 and the Detroit thread, explain why Eli argues that surveillance is not experienced equally across communities. What evidence does the chapter provide to support this claim?

Sample Answer
Eli argues that surveillance infrastructure is disproportionately concentrated in low-income communities of color. In Detroit, Project Green Light cameras are predominantly located in neighborhoods with higher percentages of Black residents, creating a geography of surveillance that maps onto existing racial and economic inequalities. The chapter provides several forms of evidence: the spatial distribution of cameras (concentrated in certain zip codes, not citywide), the absence of equivalent surveillance in wealthier neighborhoods, and the disproportionate impact of facial recognition errors on Black individuals (due to higher error rates for darker-skinned faces). Eli's broader point is that surveillance is experienced as part of a larger pattern of over-policing, disinvestment, and institutional distrust in Black communities. A camera that might feel neutral in a wealthy suburb carries a different meaning in a neighborhood with a history of aggressive policing, stop-and-frisk, and mass incarceration. Surveillance is not an abstract policy question for these communities — it is lived experience.

*Key points for full credit:*
- Identifies the racial and economic dimensions of surveillance distribution
- References specific evidence (Project Green Light, facial recognition disparities)
- Connects surveillance to broader structural inequality

19. Describe two specific reforms enacted after the Snowden revelations (Section 8.4.4) and evaluate whether either reform meaningfully constrained mass surveillance practices.

Sample Answer
Two reforms discussed in the chapter are: (1) the USA FREEDOM Act of 2015, which ended the NSA's bulk collection of domestic phone metadata under Section 215 of the PATRIOT Act, replacing it with a system where the NSA must obtain FISC approval to query records held by telephone companies; and (2) the establishment of a panel of independent "amici curiae" (friends of the court) to provide opposing perspectives in FISC proceedings, addressing the criticism that the FISC had operated as a rubber stamp because only the government presented arguments. The chapter evaluates both reforms cautiously. The USA FREEDOM Act constrained one specific program but left Section 702 authorities — which enable collection of vast quantities of communications involving non-U.S. persons — largely intact. The FISC amici provision addressed a procedural deficit, but the amici are appointed at the court's discretion and do not participate in all cases. The chapter concludes that the reforms were meaningful but incremental, leaving the fundamental architecture of mass surveillance largely in place while adding procedural safeguards at the margins.

*Key points for full credit:*
- Identifies at least two specific reforms
- Evaluates their effectiveness with nuance (not simply "they worked" or "they failed")
- Demonstrates understanding that reform addressed symptoms without fully restructuring surveillance authorities

Section 4: Applied Scenario (5 points)

20. Read the following scenario and answer all parts.

Scenario: SafeWalk Campus

Ridgewood University announces a new safety initiative called "SafeWalk." The program combines three technologies: (1) a network of 200 high-definition cameras placed across campus, including in academic buildings, dormitory lobbies, parking lots, and outdoor walkways; (2) a facial recognition system that can identify every person on campus in real time by matching camera feeds against the university's student and employee photo databases; and (3) a mobile app that students can use to report safety concerns, which also passively tracks their GPS location "to enable rapid emergency response."

The university's press release states: "SafeWalk will make Ridgewood the safest campus in America. Students and parents can rest assured that every corner of our campus is monitored 24/7. We have nothing to hide, and neither do our students."

Students were not consulted before the program's announcement. A group of students objects, arguing that the system constitutes mass surveillance. The university responds: "This is about safety, not surveillance. We are protecting our students."

(a) Apply Foucault's concept of the panopticon (Section 8.2) to the SafeWalk system. In what ways does the system create panoptic conditions on campus? How might it produce "disciplinary power" over student behavior? (1 point)

(b) The university's statement — "We have nothing to hide, and neither do our students" — echoes the "nothing to hide" argument. Using both Chapter 7 and Chapter 8, identify at least three problems with this framing. (1 point)

(c) Analyze the facial recognition component using Section 8.6. What specific risks does campus-wide facial recognition create? Consider both technical risks (accuracy disparities) and social risks (chilling effects, power asymmetry). (1 point)

(d) The mobile app "passively tracks GPS location" for emergency response. Using the dataveillance framework from Section 8.5, analyze whether this data collection is proportionate to the stated purpose. What other uses might the location data be put to? (1 point)

(e) Drawing on Section 8.8, propose three governance mechanisms that Ridgewood University should implement if it proceeds with the SafeWalk program. For each mechanism, explain what harm it is designed to prevent. (1 point)

Sample Answer

**(a)** The SafeWalk system creates panoptic conditions by making students *perpetually visible* — 200 cameras with facial recognition mean that every student's movements across campus are tracked, identified, and potentially recorded. Like Bentham's panopticon, the system produces asymmetric visibility: the university sees the students, but students cannot see who is watching, when they are being watched, or how the data is used. Over time, this awareness of constant monitoring is likely to produce self-regulation — students may avoid certain locations, curtail certain activities, or modify their behavior not because those activities are prohibited but because they feel observed. This is Foucault's disciplinary power in action: control achieved not through direct enforcement but through the internalization of surveillance.

**(b)** Three problems with the "nothing to hide" framing:

1. **It conflates privacy with secrecy.** As Chapter 7 establishes, privacy protects autonomy, intellectual freedom, and the ability to develop ideas without scrutiny — not just the concealment of wrongdoing. A student browsing books on radical political theory, visiting a campus health clinic, or attending a support group for addiction has "nothing to hide" in a legal sense but still has legitimate privacy interests.
2. **It assumes the surveilling institution will always be benevolent.** The current administration may use the data only for safety, but leadership changes, policies evolve, and data persists. Data collected for safety can be repurposed for discipline, investigation of unpopular speech, or administrative decisions.
3. **It places the burden on the surveilled.** The argument requires individuals to justify their desire for privacy rather than requiring the institution to justify its surveillance. This reverses the proper accountability structure — in a democratic institution, the entity deploying surveillance bears the burden of demonstrating necessity and proportionality.

**(c)** Campus-wide facial recognition creates multiple risks:

- **Technical:** Facial recognition systems have documented accuracy disparities across race and gender. Students of color, particularly Black students, face higher rates of false identification, which could lead to wrongful security interventions, questioning, or disciplinary action.
- **Chilling effect:** Students who know their identity is tracked in real time may avoid attending political protests, visiting certain campus offices (counseling center, LGBTQ+ center, reproductive health clinic), or participating in controversial academic activities. This undermines the university's core mission of intellectual exploration.
- **Power asymmetry:** The university can track any student's movements comprehensively, but students have no equivalent ability to monitor how their data is used, who accesses it, or how long it is retained. This creates a profound imbalance between institution and individual.
- **Function creep:** A system deployed for "safety" could be expanded to monitor class attendance, identify students at parties, or track who visits whom in dormitories — uses far beyond the original justification.

**(d)** Passive GPS tracking is disproportionate to the stated purpose of emergency response. Emergency response requires location data *at the moment of an emergency* — not continuous tracking. A system that passively records location at all times collects far more data than is necessary for the stated goal. This is the hallmark of dataveillance: the collection of behavioral data at a scale and granularity that exceeds any specific functional purpose. The location data could be used for: monitoring class attendance, identifying students who spend time off campus, tracking which students visit which dormitories, profiling student social networks, or providing data to law enforcement without a warrant. The combination of continuous GPS tracking with facial recognition from cameras creates a surveillance infrastructure that can reconstruct a complete record of any student's daily life.

**(e)** Three governance mechanisms:

1. **Community oversight board with student representation (prevents unchecked institutional power).** A standing committee including elected students, faculty, and an independent privacy expert should have authority to review all surveillance policies, audit data access logs, and recommend modifications or termination of the program. This addresses the power asymmetry by giving the surveilled a voice in governance.
2. **Strict data minimization and retention limits (prevents function creep and accumulation harm).** Camera footage should be retained for no more than 72 hours unless it is relevant to a specific, documented safety incident. GPS data from the app should be collected only when a student activates an emergency alert, not passively. Facial recognition matches should not be stored in searchable logs. These limits prevent the accumulation of a comprehensive surveillance archive.
3. **Mandatory surveillance impact assessment with public report (prevents deployment without accountability).** Before deploying the system, the university should conduct and publish a surveillance impact assessment — modeled on the privacy impact assessments discussed in the chapter — that evaluates the system's effects on student privacy, academic freedom, and equitable treatment across demographic groups. The assessment should include analysis of facial recognition accuracy rates by demographic group and a plan for mitigating disparate impact.
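The data-minimization principle — collect location only while an emergency alert is active, rather than passively at all times — can be sketched in a few lines. All names here are illustrative; this is not part of any real SafeWalk system.

```python
class SafeWalkApp:
    """Hypothetical sketch of event-triggered (not continuous) collection."""

    def __init__(self):
        self.alert_active = False
        self.stored_locations = []

    def on_location_update(self, location):
        # The fix happens at collection time, not retention time:
        # data that is never gathered cannot later be repurposed,
        # subpoenaed, or breached.
        if self.alert_active:
            self.stored_locations.append(location)

app = SafeWalkApp()
app.on_location_update((42.30, -83.05))  # no alert: nothing is stored
app.alert_active = True
app.on_location_update((42.31, -83.06))  # alert active: stored for response
print(len(app.stored_locations))  # 1
```

The contrast with the announced design is the point: a passively tracking app stores every update, while this design stores only what the stated purpose actually requires.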

Scoring & Review Recommendations

| Score Range | Assessment | Next Steps |
| --- | --- | --- |
| Below 50% (0-13 pts) | Needs review | Re-read Sections 8.1-8.5 carefully, redo Part A exercises |
| 50-69% (14-19 pts) | Partial understanding | Review specific weak areas, focus on Part B exercises for applied practice |
| 70-85% (20-23 pts) | Solid understanding | Ready to proceed to Chapter 9; review any missed topics briefly |
| Above 85% (24-28 pts) | Strong mastery | Proceed to Chapter 9: Data Collection and Consent |
| Section | Points Available |
| --- | --- |
| Section 1: Multiple Choice | 10 points (10 questions × 1 pt) |
| Section 2: True/False with Justification | 5 points (5 questions × 1 pt) |
| Section 3: Short Answer | 8 points (4 questions × 2 pts) |
| Section 4: Applied Scenario | 5 points (5 parts × 1 pt) |
| **Total** | **28 points** |