Just recently in the local news a man a few towns over was arrested and charged with possessing child porn. This is not a new or rare event, of course; when I ran MCSNet we used to get subpoenas every few months for records related to these sorts of investigations -- and most of them were facially entirely-valid (in other words, I didn't have to go look given what was being alleged, and the subpoenas were signed by a judge, so there you are.)
But this one is a bit different; the allegations include that the suspect possessed child pornography entirely produced by an AI; that is, there was no actual child involved.
Tennessee, along with other states, has recently criminalized this act.
The man in question allegedly had both AI-generated and real material, so at least as to the real material the law is well-settled and, in my view, well-founded. Since he was caught with actual material (produced through the abuse of real children) he's cooked, and should be.
But where things get more-murky is on the AI side.
These statutes leave more questions than answers for two rather-obvious reasons. First, possession of "actual" (i.e. real-person) material cannot occur without an actual criminal sexual act against a child having first taken place: for someone to possess it, someone else had to produce it, and that production inherently involved a felonious act perpetrated on a person. Therefore the same sort of derivative liability theory found elsewhere in criminal law logically and reasonably applies: if you drive someone to a bank with reason to believe their intention is to make a rather-unlawful withdrawal, and they shoot a teller in the process, you get the same punishment as the robber because you are an accessory before the fact (and after the fact if you drive the getaway vehicle once the robbery takes place). Likewise, you become an accessory after the fact to the abuse by partaking of it, and since you know damn well how what you possessed came into being, the possessor logically meets the "mens rea" test (a guilty mind; that is, knowledge of wrongdoing) required for criminal prosecution.
But in the case of an AI-generated image where no actual child is involved the weeds get thick fast. There's no cogent argument that ties the creator or possessor to an inherently-felonious and abusive act. It is similar in form and fashion to criminalizing the thought of such an act, which of course we cannot do because there's no way to get into someone's mind and be certain of what they envisioned -- yet in terms of actual harm to actual children -- that is, in the real world -- the thought and the AI-generated image are precisely the same thing: neither involves an actual child at all.
But let's say we try to overcome this. Doing so is potentially problematic on a Constitutional basis due to a 2002 Supreme Court decision, Ashcroft v. Free Speech Coalition, 535 U.S. 234. That decision did not entirely legalize wholly machine-generated pornography (nor material produced by adults but intended to depict someone underage), but it did slam the door on what could otherwise have led, for example, to prosecution of the makers and distributors of The Blue Lagoon -- and by the way, Brooke Shields was underage when that was filmed. In short, that decision limited what can be criminalized to legally obscene (not merely pornographic) material, irrespective of the means of production.
Nonetheless times have changed and so has the Court.
That's the first obvious question.
The next one is of far more importance to the technology industry: if criminalizing the act of creation itself passes constitutional muster, who created the porn?
The problem for the tech industry is that the creator wasn't the possessor. The possessor solicited the creation (making him or her equally liable under the same derivative theory that criminalizes driving the getaway car), but the actual entity that created the images is one of the large corporate entities that owns and operates the AI -- and which, under the law, is thus also a felon, exactly as the photographer of such an act with a real child is equally guilty along with the adult actor(s). Worse, as such laws are written, not only did the corporation produce the porn, it also distributed it to the user, and Section 230 provides no safe harbor because the act of creation occurred on the corporation's premises, not on the user's personal machine.
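To make the architectural point concrete, here's a minimal sketch of how a typical hosted image generator is actually wired up -- the endpoint, field names and key format below are hypothetical stand-ins, not any particular vendor's API. The user's machine sends nothing but text; the image is synthesized on the operator's servers and then transmitted back down the wire.

```python
import requests

# Hypothetical hosted image-generation API; the URL, fields and key
# format are illustrative stand-ins, not any real vendor's interface.
API_URL = "https://api.example-ai.invalid/v1/images/generate"
API_KEY = "placeholder-key"

def generate_image(prompt: str) -> bytes:
    """Send a text prompt to the hosted model; return the finished image.

    Note the division of labor: the client transmits only text. The
    model runs -- and the image is created -- on the operator's servers,
    which then distribute the finished file back to the user.
    """
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "size": "1024x1024"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.content  # bytes produced on, and served from, the operator's premises

if __name__ == "__main__":
    with open("output.png", "wb") as f:
        f.write(generate_image("a lighthouse at dawn, oil painting"))
```

Every step that matters under such a statute -- production and transmission -- happens on the far side of that HTTP call, under the corporation's roof, which is exactly why the Section 230 hosting safe harbor doesn't map onto it.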
The specifics of the law in Tennessee are:
The legislation, sponsored by Senator Ken Yager (R-Kingston), was signed into law by Governor Bill Lee on April 24, and it will take effect on July 1. The law makes it a felony to “knowingly possess, distribute, or produce any software or technology specifically designed to create or facilitate the creation of AI-generated child sexual abuse material.” Possession will be a Class E felony, distribution will be a Class C felony and production will be a Class B felony.
That ain't no small-potatoes felony -- Class B felonies in Tennessee carry eight-to-thirty-year prison terms, never mind the distribution the company also engaged in when it transmitted the produced image, and each person involved in the production of said thing is equally-liable. Thus all the coders, IT people and executives at a firm with a public-facing AI that gets used for this purpose are exposed to three decades in the slammer for the production, plus the penalties for distribution -- assuming, of course, the law survives challenges to its constitutionality.
In short, if this law survives said challenges, then the owner of any "AI" that does not prevent it from being abused in this way, along with all employees of said firm, is individually and personally exposed to 30 years in the slammer.
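For what it's worth, the kind of prevention the statute effectively demands is, at minimum, a refusal gate that runs before the model does. Here's a minimal sketch under stated assumptions -- flags_prohibited_content and run_model are hypothetical placeholders, not any real vendor's moderation API:

```python
# Sketch of a server-side refusal gate in front of an image generator.
# Both helper functions are hypothetical placeholders for whatever
# classifier and model an operator actually runs.

class ProhibitedContentError(Exception):
    """Raised when a prompt is refused before any generation occurs."""

def flags_prohibited_content(prompt: str) -> bool:
    # Placeholder policy check; a real system would pair a trained
    # classifier with keyword and policy rules.
    banned_terms = {"example-banned-term"}
    return any(term in prompt.lower() for term in banned_terms)

def run_model(prompt: str) -> bytes:
    # Stand-in for the operator's actual image model.
    return b"fake-image-bytes"

def generate_with_gate(prompt: str) -> bytes:
    # Ordering is the whole point: the gate must fire *before* the model,
    # because once the server renders the image the operator has already
    # "produced" it within the meaning of the statute.
    if flags_prohibited_content(prompt):
        raise ProhibitedContentError("prompt refused by policy")
    return run_model(prompt)
```

Filtering on the client, or scanning after the fact, does the operator no good under this law; production is the crime, and production is complete the instant the server renders the image.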
Thus, since they caught this guy with AI-generated porn, the obvious question I'd like answered is: whose AI was it, and where are the indictments aimed at said operator's executives, directors and employees?