Instagram tweaks the algorithm after allegations of Palestinian censorship


Instagram is changing its app to surface more viral and timely posts, amid complaints from its own staff that users were not seeing pro-Palestinian content during the recent conflict in Gaza.

Until now, the social networking app prioritized original content in the "stories" displayed at the top of a user's feed over content that was shared or reposted from others.

Instagram will now rank original and reshared content equally, according to two people familiar with internal staff discussions and messages, a move that will help breaking-news posts reach a wider audience.

A spokesman said there had been an increase in users resharing posts about the recent conflict in Gaza, but that the way the app was set up had a "bigger impact than expected" on how many people saw those posts.

"Stories that reshare feed posts don't get the reach people expect, and that's not a good experience," the spokesman said. "Over time, we will be giving the same weighting to reshared posts as we do to originally produced stories."

Instagram said the change was not made specifically to address concerns about pro-Palestinian content, and had been under consideration for some time.

The spokesman said the algorithm "made people believe we were deleting stories about specific topics or points of view," but added: "We want to be very clear, that's not the case. This applies to any post that is reshared in stories, no matter what it is about."

One employee involved said a group of as many as 50 employees at Facebook, Instagram's owner, had raised concerns over the suppression of pro-Palestinian voices.

The employee said the group had filed more than 80 appeals over content that had been removed by the company's automated moderation systems. BuzzFeed earlier reported on the group's existence.

Facebook's algorithms had flagged words commonly used by Palestinian users, such as "martyr" and "resistance," as incitement to violence, and removed posts about the al-Aqsa mosque after mistakenly associating the third-holiest site in Islam with a terrorist organization, according to US media reports.

The employee told the Financial Times that he did not believe Facebook was engaged in deliberate censorship, but suggested that "large-scale moderation is biased against any marginalized group" and leads to over-enforcement of removals.

Facebook said: "We know there have been a number of issues that have affected people's ability to share on our apps. We apologize to anyone who felt unable to bring attention to important events, or who felt this was a deliberate suppression of their voice. That was never our intention, nor do we ever want to silence any particular community or point of view."
