By J. Brett Smith
Criminal District Attorney in Grayson County
For those in our profession who have never handled a Possession or Promotion of Child Pornography case, consider yourself very blessed. Unfortunately, the images and videos in these cases may never be fully erased from one’s memory. If you can also keep yourself from imagining your own children or grandchildren in the place of the abused children in those images, you have accomplished a major feat.
All that said, some of the most rewarding cases we have prosecuted have been CSAM cases—CSAM stands for child sexual abuse material. (I tend to use that term instead of “child pornography” to distinguish pornography, which is created between consenting adults, from images of children’s sexual abuse.) We hate to receive and work on such cases—but we love stacking up counts and racking up time for those guilty of this crime. This article outlines some of what we have learned about CSAM cases.
Watching our language
To start with, as I already mentioned, I advocate for replacing the term “child pornography” with “child sexual abuse material” (abbreviated as CSAM). I still use the statutory term “child pornography” when necessary, but I will often say CSAM before a jury, and more and more law enforcement agencies are calling this material CSAM. CSAM is the more accurate description of these images and videos, according to the National Center for Missing and Exploited Children (NCMEC). The children in CSAM are victims; they are not willing or consenting participants. Referring to CSAM as any type of “pornography” seems to place it in the same category as material made by consenting adults, which we find abhorrent.
Changes to the law
Know the law for offenses committed on or after September 1, 2023. During the 88th Regular Session, the Texas Legislature passed and Governor Greg Abbott signed some very helpful changes to §43.26 (Possession or Promotion of Child Pornography). By now we have all watched or attended TDCAA’s Legislative Update, which noted that three very different Senate bills changed the law—and two of them conflict with each other. I strongly suggest reading the 2023 Legislative Note following §43.26 in the most recent edition of TDCAA’s Annotated Criminal Laws of Texas book. In essence, possession of fewer than 100 images of CSAM is a third-degree felony, possession of 100–499 images is now a second-degree felony, and possession of 500 or more images is now a first-degree felony. Any video or film that visually depicts conduct constituting an offense under §22.011(a)(2) of the Penal Code (Sexual Assault of a Child) is now a first-degree felony. Of course, the courts will ultimately have to reconcile these conflicting punishment schemes, so stay tuned—but based on our experience, most suspects have a large cache of images and, sadly, many videos.
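For quick reference at intake, the tiered scheme just described can be summarized in a short sketch. This is a simplification for illustration only; the statute and the conflicting session bills, not this summary, control:

```python
def csam_offense_degree(image_count: int, has_sexual_assault_video: bool) -> str:
    """Illustrative summary of the tiered scheme described above.

    A sketch only; always consult Penal Code Sec. 43.26 and the 2023
    legislative notes directly before charging.
    """
    # A video or film depicting conduct under Sec. 22.011(a)(2)
    # (Sexual Assault of a Child) is itself a first-degree felony.
    if has_sexual_assault_video or image_count >= 500:
        return "first-degree felony"
    if image_count >= 100:
        return "second-degree felony"
    return "third-degree felony"
```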
Art. 42A.054 of the Code of Criminal Procedure (Limitation on Judge-Ordered Community Supervision) has also been amended to add Subsection (16). The result is that a defendant found guilty of an offense under §43.26 cannot be sentenced to probation, nor can he receive deferred adjudication community supervision.[1] Government Code §508.145(d) also requires defendants convicted under §43.26 to serve at least half their sentences or 30 years, whichever is less, without consideration of good-time credit, before becoming eligible for parole. A good policy for heading off post-conviction writs is to ensure that someone—the State, the court, or the defense—puts the defendant’s understanding of his parole eligibility on the record following a plea agreement.
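The parole-eligibility arithmetic is simple but worth making explicit when putting that understanding on the record. A minimal sketch of the rule as described above (illustrative only, not legal advice):

```python
def parole_eligibility_years(sentence_years: float) -> float:
    """Calendar years to serve before parole eligibility under the rule
    described above: half the sentence or 30 years, whichever is less,
    with no good-time credit. A sketch only; consult Government Code
    Sec. 508.145(d) directly."""
    return min(sentence_years / 2, 30.0)

# Examples: a 20-year sentence means eligibility at 10 calendar years;
# an 80-year sentence caps out at 30 calendar years.
```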
Penal Code §3.03(b)(3)(A) allows the sentencing court to stack (run consecutively) sentences for offenses under §43.26. Section 3.03(b) states that if an accused is found guilty of more than one offense arising out of the same criminal episode, the sentences may run concurrently or consecutively if each sentence is a conviction for a listed offense (including §43.26). This applies whether sentencing follows a conviction at trial or a plea agreement.
The images’ origin
Know where the images and videos came from, how prosecutors got them, and how CSAM is identified. Throughout my time as a prosecutor, I have seen CSAM cases come from everywhere: suspects taking their computers into a repair shop and the shop owner discovering CSAM; civilian witnesses observing images on a suspect’s phone or computer; and law enforcement stumbling across CSAM during ancillary criminal investigations involving the review of electronic devices, particularly child sexual assault investigations.
Most frequently, a CSAM case will come from an NCMEC Cyber Tip. NCMEC is a private nonprofit based in Alexandria, Virginia, which was established by Congress in 1984.[2] A Cyber Tip is typically generated by NCMEC from an electronic service provider (ESP) that reports suspected CSAM to NCMEC. In 2022, nearly 99 percent of Cyber Tipline reports were submitted by ESPs—Facebook, Google, Instagram, Snapchat, Yahoo, Microsoft, etc. NCMEC reviews the CSAM and, if it is confirmed, generates a Cyber Tip. That tip will generally describe and detail the images, including their assigned titles, and provide an Internet Protocol (IP) address. An IP address is generally a unique number assigned by a local internet service provider. It is important to note that IP addresses can and do change over time: A local provider may reassign the IP address of a given customer, which is one reason a prompt investigation of all Cyber Tips is important.
Cyber Tips are generally sent to a regional law enforcement agency with an Internet Crimes Against Children (ICAC) Task Force in the geographic jurisdiction where the IP address is located. Once law enforcement receives the tip, officers can legally obtain the name and address of the subscriber associated with the IP address through a grand jury subpoena; seizing any actual data or documents, however, requires a search warrant. The subscriber information may provide sufficient probable cause for a warrant to search for and seize devices associated with the IP address for images of CSAM. Be aware that because law enforcement may not know in advance how many devices a suspect possesses, once all computers, tablets, cellphones, and other devices are seized, additional probable cause must be established in secondary search warrants to permit inspection and analysis of each device.
Another reason for prompt investigation is that suspects, or people helping them, can wipe or delete account information, perhaps even while a suspect is incarcerated. Law enforcement should obtain and serve a search warrant on the electronic service provider that submitted the Cyber Tip. The Cyber Tip may contain only information about a brief moment in time (e.g., a single CSAM image or video download), but the account contents may provide additional valuable investigative material.
Known images of CSAM are often identified through what is known as a Secure Hash Algorithm 1 value (SHA-1, or “hash value” for short).[3] Hashing is a cryptographic method that produces a digital signature or fingerprint of a file. As a practical matter, two files will not produce the same hash value unless their contents are identical down to the last byte. As a result, identification of CSAM through a SHA-1 hash value can be extremely accurate, more so than DNA results.
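To make the concept concrete, here is a minimal Python sketch of how such a fingerprint is computed, using Python’s standard hashlib library (the file path is hypothetical; forensic tools compute the same values, just at scale):

```python
import hashlib

def file_hashes(path: str) -> dict:
    """Compute SHA-1 and MD5 digests of a file, streaming it in chunks
    so that large video files need not fit in memory."""
    sha1, md5 = hashlib.sha1(), hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha1.update(chunk)
            md5.update(chunk)
    return {"sha1": sha1.hexdigest(), "md5": md5.hexdigest()}

# Example (hypothetical path): file_hashes("evidence/IMG_0820.MP4")
# A single changed byte anywhere in the file yields a completely different
# digest, which is why matching hash values are such strong evidence that
# two files are identical.
```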
Be sure you can prove possession. We have learned the hard way that just because CSAM is found on a computer doesn’t mean the State can prove a particular suspect possessed those images. Encourage law enforcement to interview suspects to obtain admissions and to determine who had access to the device(s). That includes reviewing search histories and being cognizant of methods commonly used to obtain and distribute CSAM, such as peer-to-peer networks (BitTorrent, for example), cloud sharing, and similar tools. Have law enforcement review ancillary documents, emails, and even benign photographs on the computer to link a suspect to a particular device.
Practice tips
Always allege the image or video as both “actual or simulated.” Many CSAM images and videos could be either, and it is often hard to determine what is real and what is simulated. And in the age of artificial intelligence (AI) and photo-editing software, one can only imagine what images or videos could be created.
Some images or videos will make it very clear that the victim is a child, but some cases may require that a sexual assault nurse examiner (SANE) or pediatrician review the images and testify about Tanner Stages of physical development and other indicators to prove beyond a reasonable doubt that the victim depicted is, in fact, a child.
Be prepared to show the images or videos to the trier of fact to prove beyond a reasonable doubt the elements contained in §43.26. Prosecutors may need to display an image for only a few seconds or play just 10–15 seconds of a video, but the judge or jury must see the evidence.
Finally, if there are hundreds of images, be sure to select the “best” (meaning the most awful or egregious) and charge an appropriate number of counts. For us, that is generally at least 10. When alleging multiple counts, be sure to end each charging paragraph by identifying the charged video or image with a unique file name, number, or SHA-1 hash value, e.g., Video IMG_0820_MP4. You need some type of identifier to distinguish each image or video to avoid a motion to quash for lack of specificity, as multiple photos may appear indistinguishable without one.
You can introduce additional evidence at punishment, but keep in mind that showing too many images or videos during trial can be attacked as unfairly prejudicial.[4] One safer practice is to simply re-call the investigator during punishment and have him testify to the total number of CSAM images or videos recovered.
Our experience is that, on average, a prosecutor will have to review the CSAM at least four times in each case: at intake, during discovery with defense counsel, in trial preparation, and at trial itself. Ensuring a good investigation and using strong charging practices should lead to a higher percentage of case resolutions and plea agreements—as well as fewer occasions on which you or your staff must review the CSAM.
Conclusion
While many studies have attempted to find a link between child pornography and pedophilia, several of them with mixed results, there does appear to be empirical data indicating that suspects who view CSAM have also engaged in contact sex crimes against children. At a very minimum, each time a CSAM video or image is circulated through the internet, the child portrayed in the image is, once again, victimized. The prosecution of these cases is very important.
Endnotes
[1] Tex. Code Crim. Proc. Arts. 42A.102(b)(2)(A) and 42A.453(b).
[2] We strongly encourage you to review and digest the abundance of information on the NCMEC website regarding CSAM.
[3] Categorized as MD5 or SHA-1 hashes, these are long alphanumeric identifiers of unique images or videos.
[4] See Pawlak v. State, 420 S.W.3d 807 (Tex. Crim. App. 2013).