Troubling story in the FT looking at child abuse images being shared via groups on Facebook-owned WhatsApp.
It’s concerning not just because of the material, but because WhatsApp has apparently failed to tackle the specific complaints, months after being told about them. From the report:
These groups were monitored and documented for months by two charities in Israel dedicated to online safety, Netivei Rishet and Screensaverz.
Their purpose was often obvious from names such as “cp” and from explicit photographs used on their profile photos. Such identifiers were not encrypted, and were publicly viewable so as to advertise the illegal content, yet systems that WhatsApp said it had in place failed to detect them.
A review of the groups by the Financial Times quickly found several that were still extremely active this week, long after WhatsApp was warned about the problem by the researchers.
This story gives a hint, I think, of one of the biggest challenges going into 2019, not just for the social networks, but for those who are monitoring and reporting on them. Conversations of all kinds are moving to closed groups, on platforms where the contents of messages are hidden by encryption.