Facebook is introducing the ability to control who comments on posts. The social giant says the measure is especially aimed at making public figures feel safer online. 
Facebook is introducing a new feature that lets users control who is allowed to comment on their posts. 
 
The move is aimed at tackling trolls and online harassment. Facebook says it is intended to make people feel "safe" on the platform, and it is targeted in particular at public figures, who frequently face torrents of abuse. 
 
Twitter introduced a similar feature last year. 
 
The new Facebook control will allow people to limit who can comment on their public posts, with options ranging from anyone who can see the post down to only the people and Pages tagged in it. 
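As a rough sketch of how such a setting might behave, the Python below models the two ends of that range. The `CommentAudience` scopes, `Post` fields and `can_comment` helper are illustrative assumptions, not Facebook's actual API.

```python
from dataclasses import dataclass, field
from enum import Enum

class CommentAudience(Enum):
    # Illustrative scopes only; Facebook's actual option names may differ.
    PUBLIC = "anyone who can see the post"
    TAGGED_ONLY = "only people and Pages tagged in the post"

@dataclass
class Post:
    author: str
    audience: CommentAudience
    tagged: set = field(default_factory=set)

def can_comment(viewer: str, post: Post, viewer_can_see_post: bool) -> bool:
    # The loosest setting admits anyone who can already see the post;
    # the strictest limits commenting to tagged profiles and Pages.
    if post.audience is CommentAudience.PUBLIC:
        return viewer_can_see_post
    return viewer in post.tagged

# A public figure restricts comments to a single tagged Page.
post = Post("celebrity", CommentAudience.TAGGED_ONLY, {"news_page"})
assert can_comment("news_page", post, viewer_can_see_post=True)
assert not can_comment("random_user", post, viewer_can_see_post=True)
```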
 
It is also introducing a new "feed filter bar" that will let people switch between chronological, algorithmic and "Favourites" views of the News Feed. 
 
Facebook allows users to add up to 30 friends and Pages to "Favourites", so that their posts appear higher in the ranked News Feed. 
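Taken together, the filter bar and the Favourites list amount to three ways of ordering the same feed, as the sketch below illustrates. The mode names, the `relevance` score and the favourites boost are hypothetical stand-ins, since Facebook's real ranking is proprietary.

```python
from dataclasses import dataclass

@dataclass
class FeedPost:
    author: str
    timestamp: float   # seconds since epoch
    relevance: float   # stand-in for Facebook's undisclosed ranking score

def filter_feed(posts, mode, favourites=frozenset()):
    if mode == "most_recent":
        # Chronological view: newest first, no ranking signals.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    if mode == "favourites":
        # Favourites view: only posts from the (up to 30) chosen sources.
        return [p for p in posts if p.author in favourites]
    # Default algorithmic view: ranked by relevance, with favourited
    # sources boosted above everything else.
    return sorted(posts, key=lambda p: (p.author in favourites, p.relevance),
                  reverse=True)

feed = [FeedPost("friend_a", 100.0, 0.2),
        FeedPost("page_b", 90.0, 0.9),
        FeedPost("friend_c", 110.0, 0.5)]
print([p.author for p in filter_feed(feed, "ranked", favourites={"friend_a"})])
# -> ['friend_a', 'page_b', 'friend_c']
```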
 
And it is expanding the “Why am I seeing this?” explanation option on algorithmic news feed posts to allow users to “learn more about the signals that influence” the inclusion of such posts. 
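To make the idea concrete, here is a minimal, hypothetical sketch of what such an explanation might look like when rendered from per-post signals; the signal names and weights are invented for illustration and are not drawn from Facebook.

```python
def explain_ranking(signals: dict) -> str:
    # Sort signals by weight so the strongest influence is listed first.
    ordered = sorted(signals.items(), key=lambda kv: kv[1], reverse=True)
    lines = [f"- {name} (weight {weight:.1f})" for name, weight in ordered]
    return "Why am I seeing this post?\n" + "\n".join(lines)

print(explain_ranking({
    "you follow this Page": 0.9,
    "friends reacted to this post": 0.6,
    "similar to posts you engaged with": 0.4,
}))
```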
 
“You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes, to alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform,” said Nick Clegg, Facebook’s vice president for global affairs and a former deputy British prime minister. 
 
“In the long run, people are only going to feel comfortable with these algorithmic systems if they have more visibility into how they work and then have the ability to exercise more informed control over them. Companies like Facebook need to be frank about how the relationship between you and their major algorithms really works. And we need to give you more control over how, and even whether, they work for you.” 
 
Mr Clegg also defended Facebook’s incentives and commercial motives, denying that the company is agnostic about promoting extremism and sensationalism. 
 
He disputed the “perception” that Facebook’s algorithm “fuels polarisation, exploits human weaknesses and insecurities, and creates echo chambers where everyone gets their own slice of reality … in a relentless pursuit of profit.” 
 
He said that "human nature" plays a more prominent role than critics tend to acknowledge. 
 
“Of course, on a platform built around people sharing things they are interested in or moved by, content that provokes strong emotions is invariably going to be shared,” he said. 
 
“At one level, the fact that people respond to sensational content isn’t new. As generations of newspaper sub-editors can attest, emotive language and arresting imagery grab people’s attention and engage them. It’s human nature. But Facebook’s systems are not designed to reward provocative content. In fact, key parts of those systems are designed to do just the opposite.” 
 
Mr Clegg also took issue with accusations that Facebook has a leading role in the polarisation of society. 
 
“Even if you agree that Facebook’s incentives do not support the deliberate promotion of extreme content, there is nonetheless a widespread perception that political and social polarisation, especially in the United States, has grown because of the influence of social media,” he said. 
 
“This has been the subject of swathes of serious academic research in recent years – the results of which are in truth mixed, with many studies suggesting that social media is not the primary driver of polarisation after all, and that evidence of the filter bubble effect is thin at best.” 
 