On Sunday, September 29, the lead article in the New York Times (https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html) was titled "The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong?" by Michael Keller and Gabriel Dance. This is my critique of it.

Both the title and the article are designed to maximize our outrage. And while there is genuinely outrageous behavior involved, the problem becomes more complicated when we think more carefully about it -- when we consider realistic remedies. The authors are gravely negligent for failing to consider those issues.

The article makes a big deal out of how dramatically the quantity of illegal images has grown. The introductory graphics wow us with how many little dots there are now, each representing an abuse image. But there has been explosive growth in all images and videos over the same time period. Revenge porn, hate speech, and Russian disinformation have all grown at the same rate, along with videos of cats and laughing babies. There is no evidence presented that the actual number of children being abused has gone up dramatically, just that far more video of it is now available online.

There is a comparatively small group of people who abuse children, film it, and distribute those films. There is a far larger group of people who only look at such videos but never create, buy or distribute. The article itself does not suggest this division, but it is apparent when articles like this are read with a careful eye. All the low-hanging fruit of the first group have already been captured, and the rest of them are protected by encryption and/or their location overseas. When US law enforcement is given money and told to solve the problem, the best they can usually do is go after the second group. While this is a very large group, their main "crime" is to feel a sexual attraction to children. The crime of looking at child sex abuse images is quite minor by any objective standards.

Why a minor crime? There is no allegation that all this CP is distributed for the money -- we used to hear it was a multi-billion-dollar business. Law enforcement has realized that so little money is changing hands that it's time to quietly stop talking about it. The other common allegation is that people looking at the material encourage others to make more. The article says, "A private section of the [Love Zone] forum was available only to members who shared imagery of children they abused themselves." It sounds like a small group of detestable people want more children abused so they can see new material, but these producers do not actively want their material seen by as many people as possible. Viewing by the second group, the passive consumers, does not encourage making more. All that passively accessing a CP image does is increase the number of hits, and it turns out that even those hits, aggregated by the thousands, don't encourage more production either.

The article interviews law enforcement personnel who are frustrated that they have the resources to deal with only a small fraction of the cases they could. They need more resources to perform the mission they have been given. But they have enough to investigate producers. The cases they cannot investigate are entirely from the second group of passive consumers. Though it may sound radical, society should instead stop supporting this mission entirely.

The article laments that not enough money and attention is directed to this problem -- even less than legislation requires. My hunch is that policy-makers who look at the problem can see that the real problem, the creation of child sex exploitation materials, cannot be solved by allocating further funds or issuing high-profile reports. For political reasons they cannot express their reservations directly, so they express them by passive means.

Another target of the article is US tech companies, who reputedly do not cooperate adequately with law enforcement. For instance, they do not keep records long enough or provide complete information. These problems can be easily remedied and probably will be without expenditure of government funds, though it is not clear that their cooperation will make any major difference.

The article lumps these child exploitation images with hate speech and terrorist propaganda as things that proliferate on online platforms and need attention by the big tech companies. They are all difficult problems without easy solutions. One possibility that the article implicitly suggests is to regulate encryption technology so that criminals cannot use it to hide from law enforcement. This is a grave step with serious civil liberties implications. Governments cannot be trusted to use private information only for certain clearly stated legitimate purposes.

What we most want to prevent is child sex abuse. This has been a problem throughout human history. What has changed dramatically is how aware we are of this problem because so many images are now available online. There is little evidence that the actual sexual abuse of children has increased very much because images can be distributed.

The article says many abuse survivors and their families have had their view of humanity changed for the worse by the crimes captured in the videos and people's apparent interest in viewing them. They have glimpsed an unfortunate truth, but no amount of law enforcement can ever change their view back. Similarly, abuse survivors are haunted by the idea that people might recognize them from the images -- but they would still have that fear even if 90% of the copies of the images were removed.

Child sex abuse is a terrible thing. Reducing it, including cases where it is filmed, is a high priority. But once the images are released, the damage is already done. Societal obsession with the images is akin to the man searching for his keys under the streetlight because that's where the light is, even though that's not where he dropped them.
