Ofcom combating child abuse online
Ofcom, the communications watchdog, has issued its first guidance for tech platforms on complying with the Online Safety Act. It calls on social media platforms to fight online grooming by ensuring that children are not suggested as "friends" by default, part of a wider effort to tackle illegal content, including child abuse, online.
In the guidance, Ofcom emphasizes preventive measures. It stresses changing default settings so that children are not added to suggested friends lists, protecting children's location information, and preventing children from receiving messages from people who are not in their contacts.
The guidance also requires content moderation teams to be adequately resourced. It recommends the use of hash-matching technology, which is effective in detecting known child sexual abuse material (CSAM). However, hash matching will not apply to private or encrypted messages: end-to-end encryption is designed so that not even the service provider can decrypt them.
If hashing technology were applied to private or encrypted messages, service providers would need to access and analyze the message content, which goes against the principles of end-to-end encryption. Ofcom emphasizes that its measures will not break encryption.
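The core idea of hash matching can be sketched in a few lines: a service computes a fingerprint of each uploaded file and checks it against a database of fingerprints of known illegal images. This is a minimal illustration using plain SHA-256; real systems such as PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, and the hash database here is a hypothetical placeholder, not real data.

```python
import hashlib

# Hypothetical database of hashes of known material, as would be
# supplied to platforms by a child-protection organisation.
# (This placeholder entry is simply the SHA-256 of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """True if the upload's fingerprint appears in the known-hash database."""
    return sha256_hex(data) in KNOWN_HASHES
```

The sketch also shows why the technique cannot reach end-to-end-encrypted messages: the provider never holds the plaintext bytes, so it has nothing meaningful to hash without first weakening the encryption.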
The powers in the act have been controversial, as they could be used to force private messaging apps to scan messages for CSAM. Ofcom has said those powers will not be consulted on until 2024, and even then they are not expected to be implemented until 2025.
Ofcom's chief executive, Dame Melanie Dawes, has acknowledged the difficulty of scanning encrypted messages without breaking encryption, though she has encouraged encrypted messaging companies to work on combating child abuse on their platforms.
This guidance presents a significant challenge for Ofcom. It runs to over 1,500 pages and affects more than 100,000 services based in the UK and elsewhere. Substantial resources are needed to implement it, but Ofcom is committed to the task. Managing the expectations of the public and of campaigners also poses a challenge, as the regulator may be criticized for being either too lenient or too strict with tech platforms. Ofcom's primary role is to ensure tech firms have robust systems for users to report illegal or harmful content, rather than to handle individual content complaints directly.