Sam Harris made a lengthy podcast called <"The Worst Epidemic">https://samharris.org/podcasts/213-worst-epidemic/. The worst epidemic is allegedly that of child pornography images on the web. His guest is Gabriel Dance, a co-author of a couple of New York Times pieces on the same subject in 2019. I responded to one of them in <this post>https://celibatepedos.blogspot.com/2019/10/hysteria-vs-analysis-regarding-cp.html.

I listened to the full 2 hours. There are certainly things that I agree with. People raping small children and recording it to share with others is a truly horrible crime. Others watching and urging them on is nearly as horrible. I welcome steps to combat such activities and bring the perpetrators to justice.

In the podcast, child pornography is relabeled child sexual abuse material (CSAM) and referred to as such throughout. I'm <not in favor of this terminology change>https://celibatepedos.blogspot.com/2019/05/terminology-cp-versus-csem.html but pick my battles. We also learn that most of the statutes use the term "child pornography".

My main complaint is that the podcast lumps together in one group all CSAM (and all those who view it), when there are important distinctions.

It's a bit as if you said "Lawbreakers! Disgusting people! An epidemic of vast proportions! Some of them murder and torture people!" That's true. But there are other lawbreakers who park in "no parking" zones or let the time expire on their meters. The first class is tiny, and the second class is huge. "Lawbreakers" covers a wide range and there are important distinctions to be made.

I propose that there is a similar distinction to be made between the small group of those who make and trade CSAM, and a far larger group who watch it -- never paying, and never communicating in any way with the makers.

This second group is fundamentally misunderstood. For the moment I will restrict my attention to pedophiles (though the article claims that many of those who watch are not pedophiles but just find it arousing). To most people, the entire group is "foreign" and "other". They have a gut-level revulsion against anyone finding children sexually attractive. They do not personally know any members of this group, and the only ones they hear of at all are lawbreakers.

Most people would like to punish such people just for existing. In many places, they punish them for any related activity that is detectable. As a result, countries such as Canada and the UK have banned text-only stories that involve fictional underage people (under 18) engaging in sexual activity. They have banned drawings or cartoons showing such activity. And they can impose harsh criminal penalties on those who are found with such things. The arguments in support are that it might lead to hands-on abuse, or that it stands in for the crime of hands-on abuse -- that all such people are going to molest children anyway. The evidence for such connections is very weak, and in any case nowhere else in the law would such reasoning be accepted. You cannot punish people for something that is merely correlated with crime and not a crime itself. For comparison, there is controversy about violent movies and video games and some talk of banning them. But there is absolutely no serious talk of sending to prison those who have such items in their possession.

With that as background, it is no surprise that all those who view CSAM are put in the same class as those who produce it or encourage others to produce it. There is no motivation to learn more about those who just watch, or to consider that different treatment might be called for -- lesser penalties and even <a measure of compassion>https://celibatepedos.blogspot.com/2015/05/compassion-for-cp-viewers.html.

The podcast laments the huge numbers of CSAM images that are on the web, without considering just what it is that makes this situation so bad. We can identify possible reasons and consider them separately.

One way they are bad is as a representation of child sex abuse. For such purposes, what is relevant is the number of unique images. The podcast gives numbers that are all about total reports/images, without ever emphasizing that the vast majority of them are duplicates. It notes that victims are notified when their images are found -- without noting that they can opt out of receiving such notifications. Those who do not opt out are inviting their own distress.
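
To make the counting distinction concrete, here is a minimal sketch, assuming reports can be matched by some image fingerprint. The fingerprint values and the report list are invented for illustration; this is not any real detection system such as PhotoDNA.

```python
# Illustrative only: total reports versus unique images.
# The "fingerprints" stand in for whatever hash a real matching
# system computes; the list of reports is invented.
from collections import Counter

reports = ["a1", "a1", "a1", "b2", "a1", "c3", "b2", "a1"]

total_reports = len(reports)       # what the headline numbers count
unique_images = len(set(reports))  # what measures distinct abuse imagery

print("Total reports:", total_reports)   # 8
print("Unique images:", unique_images)   # 3
print(Counter(reports))                  # a few images dominate the total
```

The same few images, reported over and over, can drive the total as high as you like without implying any more underlying abuse.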

Another way they are bad is as a representation of how many people are looking at such images for erotic satisfaction. I will examine that below. But that is not correlated with the number of images in any obvious way. One possibility is that the purveyors of such images flood the internet with them, most of them irrelevant and unwatched, as a way of making it hard for law enforcement to find the ones that matter.

Another way they are considered bad is simply the awareness that such images exist and can be found in such quantities -- this in itself is a mark of a sick society or a problem that needs urgent attention. I hope on a little reflection people would see this differently -- this is NOT "the worst epidemic". This is not where you should be sending your charitable contributions when there are so many other issues where people are suffering direct harm.

Let's look at the effect of a few different policies on the incidence of child sex abuse. There is an assumption that reporting of such things will lead to more children being rescued, but no numbers are offered. We hear of how the man raping a boy on Zoom was caught -- but also of how very rare this is, and how it came about because of an undercover agent, not detection of the images. But here's another possible policy: people of good will have suggested that if your goal were to reduce child sex abuse, you would make possession of such images legal and encourage wide distribution, to increase the chance that someone will recognize the child or the perp. THAT could rescue a child. Since this idea does not even get a hearing, it is fair to infer that protecting children is not really what motivates the campaign.

Going through the podcast, there is a clear line to be drawn between the worst-case scenarios, presented as anecdote without numbers, and the vast majority of passive watching.

Harris speaks of a vast audience willing to pay to see horrific abuse. His guest Gabriel Dance corrects him much later: money hardly ever changes hands.

We hear an anecdote of a man who put drugs in juice boxes to make children unconscious and then filmed himself abusing them. A horrible image -- but how common is it? It belongs on the anecdotal side of the divide.

There is anecdotal mention of some men who are interested in sexual activity with very young ("pre-verbal") children. There is no mention of how many people actually make such videos. I suspect it is quite small. I want a number, and we are not given even an estimate.

On the other side of the divide is the statement that New Jersey could arrest 400,000 people for viewing CSAM images -- nearly 5% of the population. But based on the <report of Michael Seto>https://celibatepedos.blogspot.com/2014/08/setos-internet-sex-offenders-on-cp.html, very few of them are watching videos of crying children, or unconscious children, or pre-verbal children. Mostly they are watching children who appear to be happy. I am not saying the children ARE happy, or that even if they are, it is OK -- they are being exploited and may come to feel bad about it later. But though data is scarce, <Virtuous Pedophiles surveys suggest>https://celibatepedos.blogspot.com/2019/10/an-actual-survey-of-cp-viewers-on.html that the majority of the men who watch it are not in favor of child suffering. Watching is an activity motivated by lust, and if there is no evidence on the screen of the child suffering, they can put that aside in the throes of sexual desire. They feel guilty as soon as they have climaxed. Much of society is disgusted by any of that, but they will have trouble finding any harm in an individual act. <Very few people actually believe in telepathy>https://celibatepedos.blogspot.com/2014/08/do-you-believe-in-telepathy.html. Yes, victims are upset to know that others are looking at their abuse, but sadly, this is just not a problem we can solve. If we removed 90% of the images of them being abused, they would still have to live with that basic psychological fact.

Much of the podcast takes as a premise that it is a high priority to remove such images from the web. I disagree with that premise. I'm not saying I am in favor of such images existing -- I would prefer that they not exist and were not present on the web at all, let alone in such numbers. But this is a problem which cannot be easily solved, and the benefit from throwing resources at it is very limited. The common measure of the size of the problem -- the number of images online -- has no remotely direct connection to the one fundamental horror at the heart of this -- the number of children being sexually abused.

We are presented with the astounding fact that it is a crime to see child pornography and not report it (including if someone sends it to you unsolicited), while it is not a crime to see a murder and not report it. It is hard to see how that is compatible with much of any concept of civil liberties. We are told that surely THIS (such an explosion of CSAM images) is a reason why full encryption is a terrible thing -- law enforcement needs a back door -- while also noting that the chances of actually rescuing any children by this method are tiny. The podcast suggests that the police want such back doors for other crimes, but are (cynically) happy to jump on the bandwagon of public outrage at child sex abuse images to get what they want for other purposes.

The solution to child sex abuse is to put a camera in every room of every house in America. That would be a way to make a stab at catching ALL the abuse, not just the tiny subset that is recorded, or the subset of that which is put up on the web, or the subset of that which "has legs" and is widely disseminated. There is real potential here for catching other serious crimes too. Harris asks us to consider whether there is anything we are doing online that is justifiably secret to the extent that it makes encryption necessary, if the cost is not being able to track down child sex abuse. Let me ask the very same question about what you are doing in your home. If you have nothing to hide, why should you object to cameras in every room?

Of course this solution isn't really enough -- the web would still be flooded with CSAM images made in other countries.

 
