Another proposal it’s consulting on is that all general search services should deindex URLs identified as hosting CSAM. However, it is not yet formally recommending that users who share CSAM be blocked — citing a lack of evidence (and inconsistent existing platform policies on user blocking) for holding off at this point. The draft does say it is “aiming to explore a recommendation around user blocking related to CSAM early next year”.
Ofcom also suggests that services identified as medium or high risk should provide users with tools to block or mute other accounts on the service. (Which should be uncontroversial to pretty much everyone — except maybe X-owner, Elon Musk.)
It is also steering away from recommending certain more experimental, inaccurate or intrusive technologies. So while it recommends that larger and/or higher CSAM-risk services perform URL detection to pick up and block links to known CSAM sites, it is not suggesting they do keyword detection for CSAM, for example.
Other preliminary recommendations include that major search engines display predictive warnings on searches that could be associated with CSAM, and serve crisis prevention information for suicide-related searches.