The Internet Watch Foundation is a UK organization dedicated to removing illegal sexual images of children (child sexual exploitation material, or CSEM) from the internet wherever they can be found. A series of podcasts recently came to my attention: https://www.iwf.org.uk/what-we-do/why-we-exist/podcast. It is nearly three hours of somber, slow-paced audio addressing this problem.
Readers of my blog posts on this issue will know that my main contention is that most CSEM viewers are not monsters, that the harm caused by their viewing is almost impossible to detect, and that criminal penalties for their behavior are far too severe. But I'm certainly not in favor of CSEM images. I suspect many would horrify me and make me very angry if I saw them.
I would love to enter into a debate with some thoughtful person who defends current policies on CSEM viewing. But I cannot locate such people. All I find is people who repeat slogans. So here I am, making a series of blog posts in response to a series of podcasts.
If I adopt a narrow focus, I'm not against IWF's basic mission. My main hesitation has to do with the priority they place on this work. They ask for donations, and the people who give must compare this with other worthy causes.
When we look at it from one end of the problem, it seems clear enough: Here is an image of child sex abuse on my screen. It shouldn't be there. I will remove it. It may come back, but I will keep removing it over and over, because it shouldn't be there.
But look at it from the other end. A worthy goal is preventing the suffering of children. They suffer for many reasons, but one reason is that they are abused. Abuse can be physical, emotional, or sexual, and neglect is often put in the same category. So sexual abuse is something a decent person would like to reduce. The overwhelming majority of child sex abuse is never recorded. According to Seto (2013), of what is recorded, 3/4 is never released but kept for private use. We can also assume that large segments of what is released are not popular and quickly vanish. But some are popular (with certain groups of people), and are widely distributed. Sometimes the children in those images become aware of this, and are distressed by the knowledge -- it adds to their suffering, beyond what came from the abuse itself and the fact that the perpetrator chose to release it online.
Now, if you look back through that chain of steps, you see that the harm that IWF is seeking to lessen is one part of the suffering of the tiniest fraction of child abuse victims.
This mismatch might lead you to wonder what the actual problem is that is being addressed. I submit it is not primarily the desire to reduce child sex abuse, but to reduce publicly visible evidence of that abuse.
Let me turn to analogies. It's as if you dislodge a rock from the ground and find lots of creepy-crawly things under it. With a narrow focus, you kill those creepy-crawlies. But you look around and see rocks peppered everywhere in this vast field, and know the same unpleasant sight awaits beneath every one -- and many are too large to turn over. The podcast at one point addresses the concern that vigorous enforcement in the UK just pushes the illegal material to somewhere else in the world. The director's reply is that if every country did what they did, the problem would be solved. But this is a very weak defense, as of course there will never be unanimity, and it takes only a few countries to host the offending material.
Consider the parable of the starfish on the beach: https://starfishproject.com/the-parable/. There may be too many to throw back into the sea, but if you throw one back you've saved it. It's a very narrow focus. Which question would you really like to answer: "Is it crazy to throw this starfish back?" or "How should I spend my time?" But there is a further twist. In fact, once starfish are dislodged from the rocks where they live, they are doomed, and returning them to the sea off a sandy beach does them no good. I would never want to say that a child whose images appear in CSEM is doomed (quite the opposite), but their troubles include: premeditated sexual abuse, betrayal by someone they trusted, and knowledge that thousands of others have seen the images. IWF can take down the images. But they never get rid of them completely, and there is no control over what might be shared privately. The horse is gone, and closing the barn door can't solve the problem. Perhaps the victims feel some comfort that someone is trying to work on the issue, but if so it's a very expensive form of comfort.
Any organization that solicits funds from the public has an incentive to spin the story to make their work seem more important and relevant than it is. The IWF is no exception.
One key example is that they emphasize over and over that every image their analysts look at is of a real child -- it is at the heart of their appeal. Yet their report hotline (https://report.iwf.org.uk/en) immediately offers you the choice between reporting "Child sexual abuse images and videos" and "Non-photographic child sexual abuse images". This second category explicitly does NOT involve images of real children. Perhaps they have a rationale for why looking at those images is bad, but if so it's got to be a very different story from the main one they are telling.
We are told with indignation that babies appear in these videos. We are told that people in these videos are advocating the rape of babies. We are told that you can pay to order the sexual abuse you desire and someone from a poor country will do it for you. These surely are rare and in no way representative of the vast majority of the images.
All that said, I don't think what IWF is doing is a bad thing. It is better if hard-core CSEM cannot be accessed on the web, and that pertains to every individual copy of every image.