We live in a world of rapidly advancing technology, where things once thought impossible suddenly become common, and we live in a legal system based upon stare decisis, where rapid decision-making is discouraged. The Supreme Court often takes years to accept a case for review, even when the underlying issues have existed for decades. This can leave the victims of sudden technological developments without legal recourse until the vagaries of the law are finally hashed out.
Such is the case with “nudification,” the process of taking a photograph of an individual and digitally removing that individual’s clothing so that he or she appears completely nude to the public. The technology itself is relatively new, dating back no earlier than 2017. Using artificial intelligence, or “AI,” a company can take any conventional photograph of an individual and create an image that looks like that individual, only without clothing. This naked image can then be shared online with essentially the entire world, and it is often so realistic that viewers believe it is an actual nude photograph of the person it purports to depict. In the case of the Lancaster Country Day School in Lancaster County, Pennsylvania, a student submitted photographs of at least 50 of his classmates for nudification and then displayed the resulting images on the internet.
Now, one might think that such an action is clearly illegal. If a nude photograph of a person were displayed in a public forum without the victim’s permission, the person posting it could be sued for invasion of privacy and other similar torts. Nudified images are different, however, because they are not actually photographs of anyone. Some would argue that they are, at most, virtual depictions created by AI that do not necessarily reflect the victim’s image in any way, and that the images are therefore protected by the First Amendment to the U.S. Constitution. They would rely on two Supreme Court cases to support this position.
Child pornography is a very serious felony, often carrying significant mandatory prison sentences. Virtual child pornography, however, consisting of computer-generated images of children engaged in sexual acts with adults or with each other, has been recognized by the Supreme Court as constitutionally protected under the First Amendment.[1] The Court focused on the fact that the images were not images of real children; rather, they had been created by a computer. Because no real children were depicted, the Supreme Court held that the images were not child pornography and were therefore protected speech under the First Amendment. Whether we like it or not, that is where the law stands.
[1] See Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002); United States v. Williams, 553 U.S. 285 (2008).
But doesn’t the nudification of images result in the image of a real child (or real person) being displayed naked without his or her permission? At this point, we do not know for sure. On one hand, the image is a “virtual” image, just like the images in Free Speech Coalition and Williams. On the other, it is made using AI and intended to look like a real, identifiable person. If an AI-generated image appears identical to a photograph, is it not just as “real” as the photograph? The courts have yet to answer that question.
It is first important to understand what AI is. According to the International Organization for Standardization, AI is “a technical and scientific field devoted to the engineered system that generates outputs such as content, forecasts, recommendations or decisions for a given set of human-defined objectives.” ISO/IEC 22989:2022. There are other definitions, but the basic principle is that AI allows a computer to think, decide, and predict the outcome of a set of facts much as a human would. There has even been talk of implanting AI into the brain of the President to improve thought processes, but that may be a joke.
If we look at nudification from an AI perspective, it is clear that the images being produced are not random images like “twelve-year-old girls engaged in lesbian sex.”[2] Rather, they are specific images of specific individuals, created precisely because the original subject is a real person. Are the fifty victims in the Lancaster Country Day School matter actual victims of the AI company that generated their images? There is every reason to believe that they are.
[2] As Justice Robert Jackson once said, “We are not final because we are infallible, but we are infallible only because we are final.” See Brown v. Allen, 344 U.S. 443 (1953) (Jackson, J., concurring).
Holding the AI Company Accountable.
Recently, the San Francisco City Attorney’s Office filed suit against a number of AI companies,[3] alleging violations of California Business and Professions Code Section 17200, which makes it unlawful to engage in any “unlawful, unfair or fraudulent business act or practice” or any “unfair, deceptive, untrue or misleading advertising.” While most states have similar business-fraud statutes, the actual victims may also be able to sue for defamation or invasion of privacy, allowing them to recover punitive as well as actual damages from the company whose product created the images.
[3] People of the State of California v. Sol Ecom, et al., Case No. BC430421.
It should be noted at the outset that it is difficult to imagine any legitimate use for nudification by third parties. An adult who wants his or her own nude image displayed in public is free to make that choice, and if a person ordered the nudification of his or her own photograph, there would be no complaint and no victim. It would therefore appear that the entire third-party nudification industry is illegitimate.
Invasion of Privacy.
Invasion of Privacy in Pennsylvania, where the Lancaster Country Day School disclosures took place, is both a crime[4] and a civil tort. For purposes of this article, however, we will focus on the civil tort, since the victim has greater control over a civil case than over a criminal case; a state prosecutor can simply decide not to pursue charges. The civil tort of invasion of privacy, generally speaking, involves the intentional revealing of personal and sensitive information in a public forum where the victim has a “reasonable expectation of privacy.”
[4] 18 Pa.C.S.A. § 7507.1. Violations of the law should be reported to the police and are prosecuted by the District Attorney’s Office; however, the victim does not control the prosecution, and the District Attorney does not recover damages for the victim in most cases. In addition, the crime of Invasion of Privacy is much narrower than a civil action for invasion of privacy.
“Reasonable expectation of privacy” is an element that receives a great deal of attention from the courts. Put simply, if there is no reasonable expectation of privacy, there is no invasion of privacy. In cases like the Lancaster Country Day School case, however, nude images created and distributed without the victim’s permission would, beyond question, violate any “reasonable expectation of privacy.”
There are essentially four types of invasion of privacy recognized in Pennsylvania: Intrusion of Solitude, Appropriation of Name or Likeness, Disclosure of Private Facts, and False Light. Based upon press reports concerning the Lancaster Country Day School, all of these causes of action could be relevant; however, they are legally complicated, and it is important that any victim speak with a knowledgeable lawyer before taking any legal action.
Intrusion of Solitude requires an intrusion upon someone’s privacy, either physically or through the use of a camera or other recording device, to capture images that a reasonable person would consider offensive. Most often, this involves recording someone in their home or in some other location they consider private. The key is that the images captured involve a private and personal moment that a reasonable person would not want displayed to the public. While recording someone nude in a private setting fits squarely within this tort, defendants in the Lancaster Country Day School case might argue that because the images are “virtual,” there has been no intrusion at all. This, too, is likely to become a complex issue, and again, it is important that any victim contact an experienced lawyer as soon as possible to determine what their rights are or might be.
The second type of invasion of privacy is Appropriation of Name or Likeness. Everyone has a privacy interest in their own name and likeness, and when that name or likeness is taken and used by another person, that person may have committed the tort of invasion of privacy. The key thing to remember, however, is that there must be a reasonable expectation of privacy; images captured in a location where there is no expectation of privacy are not going to give rise to liability for this tort. In the Lancaster Country Day School matter, it appears that private images were taken and used without the victims’ permission. This could give rise to the tort of invasion of privacy.
Disclosure of Private Facts, the third type of invasion of privacy, could also be relevant in a nudification case. The tort can sometimes be difficult to prove, depending on the specific information disclosed and whether the First Amendment allows its disclosure; in the Lancaster Country Day School matter, however, involving nude images of students, those concerns should not arise. It must again be emphasized that anyone suspecting a violation of their right to privacy should contact an experienced attorney knowledgeable in this area of the law.
The final type of Invasion of Privacy is “False Light” publication. Whenever someone publicizes information about another that is offensive or embarrassing and false, they may have committed False Light Invasion of Privacy. In a nudification case, if the defendant contends that the nude images are virtual and therefore “false,” this type of Invasion of Privacy may come into play. It is similar to, and may be filed in conjunction with, Defamation, and it cannot be emphasized enough that only an experienced lawyer knowledgeable about Invasion of Privacy can advise a victim as to the proper course of action.
Invasion of Privacy cases can be complicated, and there are many twists and turns that can affect the outcome of a case. When an invasion of privacy is proven, however, verdicts can be quite large. In one New York case, a Manhattan jury found Curtis Jackson, a/k/a 50 Cent, liable for invasion of privacy and awarded the victim $7,000,000 in damages.
Defamation.
Defamation is a cause of action that may also be appropriate in a nudification case. Unlike Invasion of Privacy, which focuses on the disclosure of private information, Defamation focuses upon the falsity of the information presented. To prevail in a defamation action, a plaintiff must generally prove:
- The existence of a statement that is false and damaging;
- That the defendant made (published) the statement;
- That the statement describes the plaintiff;
- That the recipient understood the statement to be defamatory;
- That the recipient(s) of the statement understood it to apply to the plaintiff; and,
- That the plaintiff suffered harm as a result of the statement’s publication.
Defamation can be a complex cause of action, especially when public figures or matters of public concern are involved, which can alter the burden of proof in some cases. In a nudification case involving a private citizen, however, it should be possible to prove Defamation. Again, both Invasion of Privacy and Defamation can involve complex litigation, and the advice of an experienced attorney, knowledgeable in this area of the law, is absolutely necessary before any action is taken by a victim.
At Boyle & Jasari, LLP, we represent victims of crimes and other intentional torts, including Invasion of Privacy and Defamation. If you have any questions about nudification or any other conduct at school that results in emotional or physical injury to a student, please call us. We may be able to help.
Dennis Boyle
Founder / Partner
Mr. Dennis Boyle is an accomplished white-collar criminal defense and complex civil litigation attorney who practices throughout the United States and internationally.