Taking Down Project NOLA: The Carpenter Test and The Third-Party Doctrine
- TULJ

Kayden Green
Edited by Jordan Perlman, Samuel Huron, Judge Baskin, and Sahith Mocharla
In May 2025, The Washington Post published an exclusive report documenting the first known widespread facial recognition program used by law enforcement in the United States [1]. Project NOLA, a nationwide crime-prevention nonprofit, built a private network of over 200 facial recognition cameras that scan the public around the clock and, until April 2025, automatically alerted New Orleans police via a mobile app with the names and real-time locations of suspected matches [2] [3]. According to Project NOLA, the facial recognition surveillance network has contributed to at least 34 arrests since 2023 [4]. New Orleans police officers have not disclosed their reliance on facial recognition matches for most of the arrests linked to Project NOLA, and no public data is available on which of Project NOLA’s 5,000 cameras in New Orleans contain the nonprofit’s facial recognition technology [5] [6].
The publicization of Project NOLA sparked outcry as many Americans learned for the first time that the United States has no federal law regulating the use of artificial intelligence (AI) and facial recognition technology (FRT) for law enforcement purposes, allowing police-affiliated surveillance networks like Project NOLA’s to thrive [7]. Not only can these third-party surveillance companies exist and provide their data to government agencies unrestricted, but current federal law does not require governments to publicly disclose their use of facial recognition or AI [8]. Recent advances in machine learning, cloud computing, and condensed data storage have revolutionized security camera monitoring, enabling FRT-equipped organizations to increasingly (and often secretly) operate across realms of private American life [9]. As law enforcement agencies adopt FRT cameras, such as those in Project NOLA’s automated policing, privacy advocates grounded in the Fourth Amendment are increasingly asking: when does FRT surveillance data constitute an unreasonable search [10]? The lack of a federal statute allows organizations like Project NOLA to enjoy unchecked authority to pursue around-the-clock public monitoring. The Carpenter Test for emerging technologies, established in Carpenter v. United States (2018), provides a federal precedent for limiting law enforcement applications of digital surveillance technologies such as FRT. Extension of the Carpenter Test’s restrictions on the Third-Party Doctrine would severely curtail Project NOLA’s policing activities, restoring Fourth Amendment privacy to Americans and preventing the emergence of an oppressive, unregulated police state.
The Carpenter Test is not only critical for protecting Fourth Amendment privacy rights amid the rapid integration of surveillance technology into everyday life, but also for ending the legal ambiguity under which much of Project NOLA’s collaboration with law enforcement operates.
Section A: Carpenter v. United States
In 2018, the Supreme Court broke precedent in Carpenter v. United States by expanding what constitutes a “search” under the Fourth Amendment [11]. The 5-4 majority concluded that modern technologies can violate the basic purpose of the Fourth Amendment, which is “to safeguard the privacy and security of individuals against arbitrary invasions by government officials” [12]. This conclusion led the Justices to place the first limitations on the Fourth Amendment’s Third-Party Doctrine, which previously held that individuals have no right to privacy over information they voluntarily share with a non-government third party, and no control over how the third party uses or distributes such information once provided [13]. In Carpenter, the Court established the Carpenter Test as a framework for evaluating how emerging technology can infringe on citizens’ Fourth Amendment protection of a reasonable expectation of privacy. The Court held that the test applies even when the technology is operated by a third party or when data is voluntarily provided, noting that the exception is necessary as surveillance-capable tools become increasingly normalized in daily American life.
Carpenter established the first limitations on the use of voluntarily provided data to protect Fourth Amendment rights amid the past two decades’ exponential integration into daily life of emerging technologies that collect personal, often understood as private, information [14]. This development suggests that the Supreme Court recognizes a constitutional interest in data shared with third parties. In the context of surveillance-oriented private innovations, Americans can therefore retain Fourth Amendment protections over such information collected by FRT if the Carpenter Test’s limits on the Third-Party Doctrine are extended, ultimately undermining the backbone of surveillance programs like Project NOLA.
The Justice Department had charged Detroit resident Timothy Carpenter with aiding and abetting a robbery in April 2011 [15]. His arrest hinged on evidence authorities found in his cell phone location data. Carpenter argued that the government violated the Fourth Amendment by collecting his cell-site location information (CSLI) records under the Stored Communications Act rather than through a search warrant supported by probable cause. CSLI records log the cell sites a phone connects to and are routinely transmitted to the phone’s wireless carrier as long as the device is charged, regardless of whether it is powered ‘on’ [16]. As cell phone usage has grown and cell sites have become increasingly concentrated, time-stamped CSLI maps have become increasingly comprehensive and accurate, providing data on a cell phone and its user’s location through constant, seconds-long updates.
The government circumvented the need for a warrant to obtain Carpenter’s records because it collected only 129 days’ worth of CSLI data, while the threshold requiring government actors to obtain a warrant under the Stored Communications Act is 180 days [17] [18]. To obtain CSLI data covering less than six months, the government need only demonstrate that the data “might be pertinent to an ongoing investigation,” pursuant to the Stored Communications Act [19]. To obtain a search warrant supported by probable cause, the government would have needed to demonstrate “some quantum of individualized suspicion” before the CSLI data could be seized, which it did not seek to establish before collecting the data under the Stored Communications Act [20].
The Justices, however, recognized that these CSLI advancements, as well as the near-inescapable reliance people have on cellphones for daily life, create a new privacy concern [21]. They described the lower evidentiary burden of the Stored Communications Act for data requests under 180 days as a “‘gigantic’ departure from the probable cause rule” [22]. The 127 days’ worth of CSLI data collected by the government prosecution contained 12,898 location points generated by Carpenter’s cell phone, an average of 101 location points per day [23]. The Court majority found that this breadth of CSLI data collection violated Carpenter’s reasonable expectation of privacy in his physical movements and location [24]. The decision relied on a finding by the Court in United States v. Jones (2012), where the warrantless GPS tracking of a suspect’s vehicle for twenty-eight consecutive days constituted a search under the Fourth Amendment [25]. While police used only seven of the 127 days’ worth of CSLI to locate Carpenter, the near-perfect and routine location tracking provided by those seven days of data followed the precedent established by Jones [26]. Under Jones, a reasonable expectation of privacy exists where an individual “demonstrated an actual expectation of privacy” in a scenario that “society is prepared to recognize as reasonable” [27]. The Jones concurring opinions argued that a warrant is necessary when law enforcement seeks data related to “collective public monitoring over time,” such as data collected by generalized, routine surveillance [28].
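The majority’s per-day figure is simple arithmetic over the record: 12,898 location points spread across the 127 days of CSLI obtained from the carrier. A quick sketch of the calculation:

```python
# Arithmetic behind the "101 location points per day" figure in Carpenter:
# 12,898 CSLI location points across 127 days of records.
total_location_points = 12_898  # points generated by Carpenter's cell phone
days_of_records = 127           # days of CSLI obtained by the government

average_per_day = total_location_points // days_of_records
print(average_per_day)  # → 101
```

Integer division matches the figure cited in the opinion; the exact average is roughly 101.6 points per day, i.e., a location fix every fifteen minutes or so around the clock.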
After the Court determined that the CSLI data exceeded the scope of the surveillance standard established in Jones, it turned to the Third-Party Doctrine, which holds that an individual has no right to privacy in information they voluntarily share with a third party [29]. The Third-Party Doctrine, however, applies only when the individual’s participation in the data sharing is optional and consented to [30] [31]. Breaking from past Third-Party Doctrine cases, which held that data provided to a third party gives the individual no reasonable expectation of privacy, the Court distinguished CSLI data as belonging to a “qualitatively different category” of cell phone provider data [32]. The Court drew this distinction because government access to CSLI intrudes on individual privacy to a degree that exceeds the personal-information threshold considered by case law before Carpenter v. United States [33]. The Supreme Court found that CSLI data opens a “window into a person’s life, revealing not only his particular movements, but through them his ‘familial, political, professional, religious, and sexual associations’” to a greater degree than third-party data previously turned over to the government, such as GPS data or bank records [34] [35]. Not only did the Court recognize CSLI data as more invasive than past data obtained through the Stored Communications Act, but it also recognized the extent to which mobile devices enable near-perfect tracking of a cell phone’s operator, as phones have practically become a “feature of the human anatomy” [36].
The Justices further emphasized that the expansive historical record and the “retrospective quality” of CSLI location data enable law enforcement to significantly broaden the reach of police authority [37]. They noted that investigations have historically been limited by time, “the dearth of records,” and “the fragility of recollection,” a check on police’s investigative options that years of retained, near-perfect CSLI location data all but eliminate [38]. Considering the location-tracking nature of the technology, the routineness of time-stamped cell tower pings, and the expansion of investigative power CSLI data creates, the Court agreed with Carpenter that law enforcement “invaded Carpenter’s reasonable expectation of privacy in the whole of his physical movements” by obtaining CSLI without a probable cause search warrant [39]. Chief Justice Roberts summarized the break in precedent by stating that “[in] light of the deeply revealing nature of CSLI, its depth, breadth, and comprehensive reach, and the inescapable and automatic nature of its collection, the fact that such information is gathered by a third party does not make it [private information] any less deserving of Fourth Amendment protection” [40]. Justice Roberts’ focus on “depth,” “breadth,” “reach,” and the “inescapable and automatic” nature of collection makes it clear that the Court is concerned with privacy intrusions by general systems of digital surveillance.
In the eyes of the Court, what CSLI reveals is simply too broad in scope and too personal in nature to be acquired by the government without probable cause. The Court observed that CSLI is inescapable, as there is no way to opt out of “leaving behind a trail of location data” unless the device is not charged [41]. Therefore, the Court concluded that cell phone users cannot be said to voluntarily assume the risk of turning over CSLI [42]. Ultimately, the Supreme Court found that the “government’s acquisition of the cell-site records was a search within the meaning of the Fourth Amendment” and required a warrant issued with probable cause, establishing precedent for limiting the Third-Party Doctrine in the context of government use of privately managed surveillance technologies [43].
Section B: Project NOLA’s Facial Recognition and the Carpenter Test
Carpenter v. United States was based on 2011 CSLI capabilities [44]. Now consider 2025’s AI-enabled biometric facial recognition systems. The new decade has brought with it another “seismic shift in digital technology”: AI-enabled FRT cameras that generate real-time, precise location data going far beyond CSLI’s location-estimating capabilities. Even in 2020, standard FRT had a “near infallible” accuracy rate of over 99%, with FRT cameras creating an “exhaustive list of location information” by pinging an individual’s identity each time they pass a camera location throughout the surveilled region [45]. Therefore, it seems clear that law enforcement use of facial recognition data fits the pattern established by the Supreme Court in Carpenter v. United States, requiring a probable cause warrant to obtain, regardless of the data’s holder (third party or government entity).
CSLI was held to be protected under the Fourth Amendment because the technology met three elements that, in conjunction, render warrantless collection an unreasonable search under the Fourth Amendment:
I. The deeply revealing nature of the data derived from the technology
II. The technology’s “depth, breadth, and comprehensive reach”
III. The “inescapable and automatic nature of its collection” [46].
To solidify the legal linkage between Carpenter’s check on CSLI and FRT, the following explores how CSLI established the threshold for each element of the Carpenter Test, why the Third-Party Doctrine is irrelevant to technology that passes the test, and how FRT meets or exceeds each requirement, demonstrating that the Court should also necessitate a probable cause warrant for FRT.
First, the data collected must be “‘deeply revealing’ of some private quality of the person under surveillance” [47]. The Carpenter Test breaks precedent by establishing a framework for exceptions to the Third-Party Doctrine that considers the nature of what the collected information reveals, rather than simply the manner in which it was collected. To be “deeply revealing,” the data collected must “hold for many Americans the ‘privacies of life’” and show “familial, political, professional, religious, and sexual associations” [48] [49]. Camera feeds reveal these privacies of life and personal associations by going beyond the implied behavior of CSLI’s location data: they provide footage of a person’s location and physical movements, rather than simply the individual’s general location. Whether a person is going to a church, visiting a health clinic, or entering a voting station during an election, surveillance cameras capture intimate, often deeply personal activities involving religion, health, and elections that the public expects a democratic government to treat with a degree of privacy. The capture, cataloging, and searchable storage of day-to-day activities by FRT-equipped cameras meets the “deeply revealing nature” standard defined by the Carpenter Test because it enables law enforcement to track the routine movements of all people under a camera network. The biometric ‘face maps’ created by FRT allow authorities to search for a specific person’s identity within stored video recordings and images. Regardless of whether a person’s identity is targeted for search (i.e., suspected criminals, intelligence operations, etc.) across the camera network’s data, everyone captured by the FRT-enabled cameras can have their biometrics entered into and stored in the searchable biometric database operated by the third party.
With FRT, police can search for any FRT-recorded identity within the camera network’s biometric database and determine that specific person’s location and activities for as long as the video recordings are stored. Because FRT data contains the camera’s precise location and is nearly infallible in capturing identities, this ‘find any person at any time’ technology exceeds the Supreme Court’s threshold, set by CSLI data that revealed near-perfect location information. Therefore, FRT meets the first element of the Carpenter Test.
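Mechanically, the ‘find any person’ capability described above reduces to comparing a captured face embedding (a numeric ‘face map’) against a stored database and flagging similarity scores above a threshold. The following is a minimal illustrative sketch of that decision rule only; the embeddings, identity labels, and threshold are invented for illustration, and Project NOLA’s actual software is not public:

```python
import math

# Toy sketch of FRT database matching: each face is reduced to a numeric
# embedding ("face map"), and a match is declared when cosine similarity
# to a stored embedding exceeds a threshold. All names, vectors, and the
# threshold below are hypothetical.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical "biometric database": identity -> embedding
# (real systems use learned embeddings with hundreds of dimensions)
database = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}

def match(capture, threshold=0.95):
    """Return (best identity, score) if the score clears the threshold, else (None, score)."""
    best_id, best_score = None, 0.0
    for identity, embedding in database.items():
        score = cosine_similarity(capture, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# A capture close to person_a's stored embedding produces a match:
print(match([0.88, 0.12, 0.31]))
```

Production systems add approximate nearest-neighbor indexes for speed, but the core decision rule (similarity versus a tunable threshold) is the same, which is why the threshold choice directly controls the false-match rate across everyone in the database.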
The second element addresses the quantity of the information stored. The “depth, breadth, and reach” threshold can be understood as time, frequency, and how many people are affected [50]. In Carpenter, the Court observed that “accessing seven days of CSLI constitutes a Fourth Amendment search” and would require a probable cause warrant [51]. While the Court’s timeline referred specifically to CSLI, the heightened scope and locational information that can be deduced from FRT identity matches across camera footage would likely lead a court to apply a similar, if not shorter, “depth” timeline for FRT [52]. “Breadth,” as defined in Carpenter, refers to the frequency of the data points, such as the daily average of 101 cell phone pings stored in Carpenter’s CSLI [53]. Both “breadth” and “depth” concern the quantity of information collected, as more frequent data over a longer period progressively reveals more private details of a person’s daily life. When FRT data is collected over several hours or days within and across a network of cameras, the technology would likely meet the frequency and time threshold presented by the seven-day standard in Carpenter [54]. Finally, “comprehensive reach” is met when a large population is affected indiscriminately by facial recognition. In Carpenter, the Court recognized this because CSLI is “location information … continually logged for all of the 400 million devices in the United States” equipped with “this newfound tracking capacity [that] runs against everyone” [55]. FRT is indiscriminate: it creates a biometric profile for every person it matches within a still-image database (which typically contains a combination of mugshots, social media photos, passports, licenses, and other photographed identities) [56]. These surveillance systems also collect data on every person who walks within the capture range of a camera that feeds into a facial recognition network.
In large cities like New Orleans, where Project NOLA operates 5,000 cameras, the affected group constitutes a significant enough population to make FRT a privacy concern [57]. Like CSLI, FRT passes the threshold for the “breadth, depth, and reach” element and its subparts.
The final element of the Carpenter Test is the “inescapable and automatic nature of its [the data’s] collection” [58]. This element considers what the individual did, or failed to do, that ultimately led to the third party obtaining the data [59]. In Carpenter, “inescapable” referred to an individual’s choice to use a cell phone, which the Court ultimately concluded is not truly a choice, given the demands of participation in modern society [60]. A person who lives in an area like New Orleans, near-constantly surveilled by a 5,000-camera network, similarly has no “choice” about being surveilled, given the demands of daily life that require moving through the high foot-traffic public areas monitored by FRT-enabled cameras [61]. If carrying a cell phone is not considered a real “choice” by the Supreme Court, then participating in daily life within a city likely would not be considered much of a “choice” either. While a person could attempt to avoid surveillance by moving cities, courts have consistently rejected the notion that individuals must alter fundamental aspects of daily life to preserve constitutional protections, rendering such relocation an unreasonable burden. Compounding this concern, federal law does not require cities or private entities to disclose the use of FRT-enabled camera networks, leaving individuals unaware that they are being surveilled and unable to meaningfully consent [62]. The “automatic collection” aspect refers to the digital process that creates a data point, such as a cell phone in Carpenter connecting to the nearest cell tower without an affirmative act by the phone’s carrier [63]. FRT-enabled cameras meet this burden by using trained software that automatically scans people’s biometric ‘face maps’ as long as they are in frame, then compares these ‘face maps’ against a biometric database to look for matches [64]. Under Project NOLA, these matches are typically suspected criminals [65].
However, the lack of any federal regulation of FRT raises the concern that individuals targeted for their protest behavior, religious associations, or speech can be instantly identified by FRT systems that automatically alert law enforcement [66] [67]. While CSLI met the “inescapable and automatic nature” requirement because the Court ruled cell phones are “almost a ‘feature of human anatomy,’” FRT automatically tracks literal human anatomy, and therefore fulfills the third and final element of the Carpenter Test [68].
Facial recognition technology meets all three elements of the Carpenter Test, making the technology eligible for the warrant-necessitating exception to the Third-Party Doctrine established in Carpenter v. United States [69]. The Court acknowledged in creating the Carpenter Test that the rule it would ultimately adopt “must take account of more sophisticated systems that are already in use or in development” [70]. AI-enabled, large-scale surveillance camera networks enhanced by FRT databases and in-camera capabilities are among these more ‘sophisticated systems’ the Court alluded to. Carpenter was decided in 2018 and evaluated the privacy concern posed by 2011 capabilities [71]. The Supreme Court likely understood how rapidly technology had innovated and become integrated into modern society during the seven-year gap between the events and the ruling, motivating the majority to establish a safeguard within the Carpenter Test for emerging surveillance technologies like FRT [72]. Therefore, the legal principle that exceptions to the Third-Party Doctrine under the Carpenter Test cannot be limited to CSLI is already established and should not be a matter of significant debate if a case challenging warrantless FRT data collection were to come to trial.
The Carpenter Test presently serves as one of the only legal frameworks and precedents through which the threat of emerging technology to American privacy is regulated. However, the Justices declined to apply the Carpenter Test logic to other specific technologies, leaving the decision short of creating a tangible, specific framework for the continued integration of contemporary surveillance innovations into civil society [73]. This failure, alongside the lack of federal regulation for facial recognition and artificial intelligence, fosters the legal conditions for organizations such as Project NOLA to proliferate beyond reasonable Fourth Amendment protection [74].
Section C: How the Carpenter Test Could Kill Project NOLA
Facial recognition technology, as deployed throughout a camera network covering an entire region, presents a comparable, if not more severe, threat to individual privacy than CSLI did in Carpenter v. United States. Project NOLA’s FRT-aided wide-area camera network raises the same constitutional privacy concerns that CSLI did in Carpenter. Not only does FRT fulfill each element of the Carpenter Test, but Project NOLA’s use of its facial recognition cameras and biometric databases inherently disqualifies it from the Third-Party Doctrine.
Initially, it would appear that Project NOLA, as a 501(c)(3) nonprofit, can provide its digital data to law enforcement without police producing a warrant supported by probable cause, relying on the Stored Communications Act as demonstrated in Carpenter v. United States [75]. However, the Third-Party Doctrine holds that there is no reasonable expectation of privacy in information that is voluntarily provided to others [76]. While Project NOLA is a third party due to its nonprofit status, the people surveilled by its citywide network of cameras in New Orleans never voted on the cameras being installed [77]. The program was unannounced and largely secret until May 2025, when The Washington Post reported on Project NOLA in a report titled “Police secretly monitored New Orleans with facial recognition cameras” [78]. With no federal or local New Orleans law requiring disclosure of which cameras are enabled with facial recognition technology, and with the people surveilled by the secret cameras never having consented to them, Project NOLA does not meet the “voluntarily provided” requirement of the Third-Party Doctrine, unless simply living in New Orleans is considered voluntary disclosure [79]. While courts have recognized that a person does not have a reasonable expectation of privacy in a public space, such as the areas where Project NOLA cameras are typically installed, the scope of Project NOLA’s 5,000-camera network and the intimate personal-movement data automatically collected and stored by its cameras go beyond the invasive nature of CSLI in Carpenter [80]. Because FRT passes the Carpenter Test, Project NOLA’s provision of data to law enforcement without a probable cause warrant puts the organization’s current FRT police operations at risk of a warrant requirement.
Before the backlash following The Washington Post report, Project NOLA constantly monitored New Orleans streets for wanted suspects, with the facial recognition technology automatically pinging New Orleans police officers’ mobile phones through an app when FRT cameras located a potential suspect match [81]. The app provided the name and current location of the person whose biometric ‘face map’ the camera scanned and could track their movements throughout the city as other FRT cameras pinged the suspect [82]. The app automated the relationship between FRT and law enforcement by directly providing FRT location data, unrequested and unwarranted, to the New Orleans Police Department (NOPD) [83].
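The automated alert flow described in the reporting can be pictured as a small event pipeline: a camera match event carries a name and the camera’s fixed location, and is pushed to officers’ phones with no request, and no warrant checkpoint, in between. A hypothetical sketch (the field names, suspect, location, and message format are all invented; the real app’s internals are not public):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical model of an automated FRT alert: a match event is turned
# directly into a notification for officers. Note that no step in this
# flow involves a warrant or probable cause determination.

@dataclass
class MatchAlert:
    suspect_name: str      # name returned by the biometric database
    camera_location: str   # the camera's fixed, known location
    matched_at: datetime
    confidence: float

def notify_officers(alert: MatchAlert) -> str:
    """Format the alert as it might appear on an officer's phone (invented format)."""
    return (f"FRT match: {alert.suspect_name} near {alert.camera_location} "
            f"at {alert.matched_at:%H:%M} (confidence {alert.confidence:.0%})")

alert = MatchAlert("J. Doe", "Bourbon St & Canal St",
                   datetime(2024, 6, 1, 14, 30, tzinfo=timezone.utc), 0.98)
print(notify_officers(alert))
```

The structural point the sketch makes concrete is that the pipeline runs camera-to-officer with no intermediate legal check, which is precisely the gap a warrant requirement under the Carpenter framework would close.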
Project NOLA also accepts online requests from anyone to run pictures through its Facial Recognition Analysis ‘face-mapping’ software and biometric database [84]. Project NOLA claims this technology “routinely allows Project NOLA staff to identify suspects in just seconds, then transmit a subject’s probable name and date of birth to detectives” [85]. While its website clarifies that the NOPD is prohibited from directly accessing FRT, it notes that New Orleans city law permits NOPD detectives to accept and use FRT match results from Project NOLA when the FRT search was initiated by Project NOLA staff or a private citizen [86]. This creates a legal loophole: by placing search initiation on the third party, the NOPD can bypass the safeguards meant to protect anyone the NOPD decides to search from being scanned through an FRT database. In these cases the third party could be Project NOLA staff, a private citizen, or, most dangerously, a police officer acting in their ‘individual capacity’ [87]. All FRT match results, regardless of who initiated the search request, are then provided to NOPD detectives investigating particular incidents [88]. The ability of the NOPD to access FRT data without a probable cause search warrant, so long as someone fills out the online form on Project NOLA’s website, violates the additional protections Carpenter found necessary for emerging-technology data provided to law enforcement. The deeply personal nature, inescapable data provision, automated data collection, and breadth, depth, and scope of Project NOLA’s FRT put it at risk of needing to halt its unwarranted and, in some municipalities, automatic provision of FRT data to law enforcement [89].
Requiring a probable cause search warrant for law enforcement access to FRT, along with some form of public disclosure of where FRT-enhanced surveillance cameras are located, would resolve the disparities between Project NOLA’s current operations and Carpenter’s Fourth Amendment protections for emerging technologies. While this would simply be an operational shift for Project NOLA, reducing its ability to disclose data to law enforcement without a warrant, it would enhance Fourth Amendment protections at the federal level while setting a limiting precedent on law enforcement use of increasingly revealing AI-enhanced data collection technology. With the warrant requirement established, Project NOLA would likely be limited to the operations outlined in a 2022 New Orleans city ordinance restricting the relationship between Project NOLA and the NOPD, which Project NOLA violated from 2023 to 2025 by sending automated alerts to law enforcement [90].
Recent local actions demonstrate that FRT is already facing public and governmental pushback that may limit its future scope beyond Project NOLA’s operations. After the ordinance violations in New Orleans and the NOPD’s decision to de-automate alerts from Project NOLA’s FRT network, other cities have begun watching closely as they consider their own policies [91]. Some municipalities, such as Austin, explored adopting FRT but ultimately abandoned the proposal after community protests; other jurisdictions, including twenty-three states and twenty-one cities, have enacted bans or strict limits due to concerns about constitutional privacy rights and Fourth Amendment protections [92] [93]. These responses reflect growing public unease with commercial, private, and government surveillance as citywide surveillance networks become increasingly equipped with FRT and AI data management.
This growing patchwork of local and state regulations does not resolve the core constitutional issues posed by FRT. While some jurisdictions require warrants or restrict use to serious crimes, under Carpenter the warrantless use of FRT over long periods of time violates the Fourth Amendment regardless of the application (although a Supreme Court case addressing FRT specifically could establish different limitations) [94]. In the absence of comprehensive federal standards, government agencies face no uniform obligation to disclose or meaningfully limit their use of FRT, creating significant gaps in transparency and civil liberties protections. As personal technologies and third-party surveillance systems become increasingly embedded in jobs, homes, and daily life, meaningful privacy choice is being eroded, rendering fragmented local regulation inadequate and reinforcing the need for federal FRT limitations.
While Project NOLA specifically came under scrutiny in The Washington Post for its 500 FRT-enabled cameras in greater New Orleans violating local statute, the nonprofit operates nationwide, making the extension of the Carpenter Test to FRT critical for Fourth Amendment protection nationally [95]. Project NOLA uses its own funds to supply cameras to individuals and businesses that request them, charging only a $300 to $1,900 annual cloud fee, with an additional $350 annual fee to “upgrade any Project NOLA Crime Camera to automatically search” with FRT, using “powerful AI driven automated alerts” to alert government entities of targeted activity [96]. This means that any U.S. private entity can sign up to purchase a Project NOLA camera, with the biometric and personal data collected by the surveillance network automatically monitored by Project NOLA’s Real Time Crime Center (RTCC) and available to the third party hosting the camera [97]. Despite Project NOLA’s RTCC managing this national camera network in its own capacity, the “crime camera initiative” was created by criminologist and former police officer Bryan Lagarde to “reduce crime by dramatically increasing police efficiency,” demonstrating the third party’s intention to aid law enforcement [98]. In making crime detection more efficient, Project NOLA’s existence is rooted in its ability to circumvent requirements for police to establish probable cause. Even when the FRT data is processed by a third party, Project NOLA’s collusion with and intent to aid law enforcement make Big Brother fears a reality in the Big Easy and beyond. Allowing Project NOLA and third-party FRT surveillance to operate nationwide without federal restriction of its use by law enforcement creates a legal loophole that enables the U.S. to operate as an unrestrained police surveillance state, a reality inherently in tension with constitutional protections against unreasonable search and seizure and with civil liberties.
Conclusion
Regardless of the specific regulations a Supreme Court decision directly addressing FRT would provide, a Fourth Amendment privacy case challenging warrantless FRT collection under circumstances similar to Carpenter is essential to preventing unconstitutional searches by law enforcement. While some municipal and state laws have addressed the privacy law innovation gap between the 2011-era cell-site records at issue in Carpenter and 2025's scaled FRT deployment, this patchwork system does not protect all Americans from the constitutional inadequacies posed by automated facial recognition. The sheer volume and scope of data collected by mobile phones, surveillance cameras, and digital platforms render opting out of passive data sharing incompatible with participation in modern society, exemplifying the undue burden the Third-Party Doctrine imposes when not extended to technology like FRT. Conditioning access to such information on a probable cause search warrant is fundamental to enforcing constitutional limits on government power and protecting the privacy and First Amendment civil liberties that depend on those limits. Closing the FRT privacy loophole requires a court case that directly addresses law enforcement's collection of FRT data without a probable cause search warrant, meaning the Fourth Amendment protection gap could take several years to close.
While Fourth Amendment search protections do not extend to information in plain view (public spaces) or voluntarily provided to a third party, the scope of daily physical movements collected by FRT and the near impossibility for New Orleans residents of avoiding the 5,000 cameras placed in public spaces qualify FRT for the Carpenter Test exemption [99] [100]. As emerging technologies become increasingly equipped with basic law enforcement capabilities, such as AI’s ability to analyze video footage and compare a recorded person’s likeness against still images to determine an identity, these technologies automate the search process to the point of violating the individual’s Fourth Amendment right to privacy.
Project NOLA’s use of facial recognition technology fails both the Carpenter Test and the consent foundation of the Third-Party Doctrine, placing its FRT law enforcement activities squarely outside the bounds of the Fourth Amendment. FRT-enhanced surveillance conducted without a probable cause warrant therefore constitutes an unlawful expansion of police power: one that federal courts must curtail before it becomes a normalized feature of American policing.
[1] Douglas MacMillan & Aaron Schaffer, Police Secretly Monitored New Orleans With Facial Recognition Cameras, The Washington Post (May 19, 2025), https://www.washingtonpost.com/business/2025/05/19/live-facial-recognition-police-new-orleans/.
[2] See [1].
[3] Aliana Mediratta, Is the NOPD Breaking the Law by Using Tips From Project NOLA? Depends Who You Ask, Verite News (Nov. 5, 2025), https://veritenews.org/2025/11/05/project-nola-nopd-local-attorneys/.
[4] See [1].
[5] FOX 8 Staff, Project NOLA Blocks NOPD’s Remote Access to over 5,000 Crime Cameras amid Privacy Dispute, FOX 8 (Sept. 29, 2025), https://www.fox8live.com/2025/09/29/project-nola-blocks-nopds-remote-access-over-5000-crime-cameras-amid-privacy-dispute/.
[6] See [3].
[7] See [1].
[8] Daniel Weatherholt, Facing Carpenter: Facial Recognition Technology and the Fourth Amendment, 56 Tulsa L. Rev. 339 (2021), https://digitalcommons.law.utulsa.edu/tlr/vol56/iss2/9.
[9] See [8].
[10] See [8].
[11] Carpenter v. United States, 585 U.S. 296 (2018).
[12] See [11].
[13] Dmitry Gorin, The Third-Party Doctrine and the Fourth Amendment, Eisner Gorin LLP (Dec. 8, 2023), https://www.thefederalcriminalattorneys.com/third-party-doctrine.
[14] Wex Definitions Team, Expectation of Privacy, Legal Information Institute, https://www.law.cornell.edu/wex/expectation_of_privacy (last visited Nov. 19, 2025).
[15] See [11].
[16] Stephanie Lacambra, Cell Phone Tracking or CSLI: A Guide for Criminal Defense Attorneys, Electronic Frontier Foundation, https://www.defendyouthrights.org/wp-content/uploads/2017/10/Cell-Phone-Location-Tracking-or-CSLI-A-Guide-for-Criminal-Defense-Attorneys.pdf (last visited Nov. 19, 2025).
[17] Jimmy Balser, Overview of Governmental Action Under the Stored Communications Act (SCA), Library of Congress (Aug. 3, 2022), https://www.congress.gov/crs-product/LSB10801.
[18] See [11].
[19] See [16].
[20] See [11].
[21] See [8].
[22] See [11].
[23] See [11].
[24] See [8].
[25] United States v. Jones, 565 U.S. 400 (2012).
[26] See [11].
[27] See [11].
[28] See [24].
[29] See [13].
[30] See [13].
[31] See [11].
[32] See [11].
[33] See [11].
[34] See [12].
[35] See [24].
[36] See [11].
[37] See [11].
[38] See [11].
[39] See [11].
[40] See [11].
[41] See [11].
[42] See [11].
[43] See [11].
[44] See [11].
[45] William Crumpler, How Accurate Are Facial Recognition Systems – and Why Does It Matter?, CSIS (Apr. 14, 2020).
[46] See [11].
[47] See [25].
[48] See [11].
[49] See [8].
[50] See [11].
[51] See [11].
[52] See [11].
[53] See [11].
[54] See [11].
[55] Bennett Cyphers, Adam Schwartz & Nathan Sheard, Face Recognition Isn’t Just Face Identification and Verification: It’s Also Photo Clustering, Race Analysis, Real-time Tracking, and More, EFF (Oct. 7, 2021), https://www.eff.org/deeplinks/2021/10/face-recognition-isnt-just-face-identification-and-verification.
[56] See [1].
[57] See [11].
[58] See [8].
[59] See [11].
[60] See [1].
[61] See [1].
[62] See [11].
[63] See [11].
[64] See [1].
[65] See [8].
[66] Facial Recognition Request by Civilian, Project NOLA, https://www.projectnola.org/facial-recognition-search-request-by-civilian.html (last visited Nov. 19, 2025).
[67] See [11].
[68] See [11].
[69] See [11].
[70] See [11].
[71] See [8].
[72] See [11].
[73] See [8].
[74] Project NOLA, https://www.projectnola.org/ (last visited Nov. 19, 2025).
[75] See [13].
[76] See [73].
[77] See [1].
[78] See [1].
[79] See [1].
[80] See [1].
[81] See [1].
[82] See [1].
[83] See [65].
[84] See [65].
[85] See [65].
[86] See [65].
[87] See [65].
[88] See [73].
[89] New Orleans, Louisiana, Ordinance Cal. No. 33,809 (July 21, 2022).
[90] See [1].
[91] Luz Moreno-Lozano, KUT, Austin Drops AI Surveillance Cameras From Consideration as Residents Raise Privacy Concerns, Austin Monitor (Sept. 25, 2025).
[92] See [1].
[93] See [1].
[94] See [13].
[95] See [74].
[96] See [74].
[97] See [74].
[98] See [74].
[99] See [1].
[100] See [8].