The company usually removes posts or images that potentially violate its policies promptly after they are reported.

But it gives preferential treatment to certain organizations and to politicians, business leaders, advertisers, journalists and celebrities, taking more time to review their content in order to avoid hasty decisions.

Meta's oversight board, a body described as independent but financed by the company, had lambasted these privileges in December, accusing the company of putting its economic interests ahead of the need to moderate content.

It then issued 32 recommendations to make the moderation program, known as "cross-check", more transparent, more responsive and fairer.

The group said on Friday that it would implement 26 of them, fully or partially, and study the feasibility of another.

However, it rejected the five others.

In particular, Meta refuses to disclose which public figures receive privileged treatment for commercial reasons (because they pay the company for its services or generate traffic), arguing that doing so could "identify them as potential targets for malicious actors".

The group also declined to set up a formal process allowing public figures, including government officials, to apply for cross-check status.

Mark Zuckerberg's firm also declined to bar its government relations team from deciding whether certain figures should be included on the list, even after the board pointed out that this created "inevitable" conflicts of interest.

Meta has, however, agreed to limit the visibility of potentially problematic posts while they are under review, and to distinguish between users protected for human rights reasons, such as NGOs or journalists, and those protected for commercial reasons.

Meta also plans to adjust its operational systems so that decisions are made more quickly, and to publish regular reports on cross-check.

© 2023 AFP