House panel urges Meta to take responsibility for false, harmful content


Paige Javier,

ABS-CBN News


MANILA – In its fourth hearing on fake news, the House Tri-Committee zeroed in on the accountability of social media platforms when it comes to content labeled as false or misinformation.

Representatives of Meta, the parent company of social media platform Facebook, faced lawmakers on Tuesday for the first time since the panel opened its inquiry.

Antipolo Rep. Romeo Acop grilled the Meta representatives, asking whether the social media platform should bear responsibility for false news and harmful content.

"The posting of false information and misleading news, the content that harms not only the subjects of the questionable post but also the public trust in state institutions, should not be tolerated, would you agree with that," Acop asked.


Meta Director of Public Policy for Southeast Asia Dr. Rafael Frankel did not directly answer yes or no.

"I wanna say that when we come to topics like misinformation, there are different interpretations of that. And this is why we rely on third party fact-checkers to be the authoritative sources of what is true and false, and to make those determinations. So that as we as a neutral platform... do not have the responsibility to determine which is true or false," he said.

"We have the responsibility to act on the findings of the third party fact-checkers provided to us. If they provide that what they find is false, we have the responsibility, it is in our community standards to reduce the distribution of the content and properly label it," Frankel added.

The Meta representative said the platform will go as far as removing misinformation if it is labeled as false, incites violence, or violates election law.

He added that Meta is balancing the right to freedom of expression with the safety of users in the Philippines and globally.


"The approach we have is based on remaining a neutral platform that tries our best in very difficult circumstances, in a very robust democracies like the Philippines where there are very charged political debates going on. We're doing our best to balance freedom of expression and safety on the other," Frankel added.

"We take our content moderation, our fact-checking very seriously. We have 11 different channels between your government and our platforms to address any types of misinformation or harmful content. We undertake coordinated disruptions when we see there is an abusive pattern of disinformation or misinformation," he explained.

The Meta representative later said that individual users are liable for their own posts.

"Our responsibility is to make sure that we are doing the best we can to balance between a voice and safety," Frankel reiterated.

"The individual user bears the responsibility that they upload on our platform. Now we have a set of community standards that we need to uphold," he said.


Acop further pressed Frankel to give a more direct answer, saying he was evading his questions on whether or not Meta bears responsibility.

"I'm not asking whether you are the primordial entity to be responsible. No, I am not asking that. My question is whether you bear some responsibility," Acop said.

"It seems from our discussion that social media platforms are making some efforts to curb the spread of fake news in this country. However, I disagree with the impression that acting merely as hosts of user-generated content exempts these platforms from any liability for the harmful content they allow to be exhibited," he added.

Because of this, Acop said there may be a need to review the law to place more accountability on social media platforms.

"As part of our legislative duties, I think it may be time to revisit the current legal framework regarding these social media platforms if only to incentivize them to further crack down on these types of illegal posts," he said.


ABSENCE OF META ENTITY IN PH 

Earlier in the hearing, 1-Rider Party-list Rep. Rodge Gutierrez questioned Meta for not having a legal entity in the Philippines.

Frankel said Meta has a representative office in the Philippines.

"It is important to note that the representative office is not in terms of business or operations, It does not do anything involving content decisions. Those are undertaken by Meta Platforms Inc.," he said.

Gutierrez asked what the representative office, Facebook Philippines, does if it does not make crucial decisions.

"No, decisions are made here. These decisions undertaken by our parent entity in Singapore or in the US. This is really a sales or support office that exists here," Meta Head of Law Enforcement Outreach for Asia Pacific Rob Abrams said.


Frankel said Facebook Philippines also runs programs on digital literacy and digital entrepreneurship.

When asked about Facebook Marketplace, widely used by Filipinos, he said it is being operated by the United States company.

"There’s no commerce taking place in Marketplace. But it doesn’t mean we don't comply with local laws. We as a American global company, that lives up to the best legal standards of the world, we always want to comply with local legal laws. So just because any of the platforms is being operated by American company doesn't mean we’re not complying with local law," Frankel said.

"What we’re driving at is local accountability, which goes beyond legal requirements... However, I hope you understand is what makes it very difficult for us when we have problems, it’s very hard to reach Meta Platforms Inc. We are of the position if we have any problems or something we want to relay, we go through Facebook Philippines," Gutierrez said in response.

The lawmaker cited previous hearings, in which they had been tossed back and forth, as the reason they want to push for legislation.


Frankel expressed the platform's commitment to attend the hearings and contribute to a possible measure on the issue.

Acop likewise manifested that the absence of an accountable and empowered Meta entity in the Philippines is an important issue.

"I hope the Meta people will agree with that. As we have heard, Facebook Philippines does not control the platform, nor does it handle content regulation and policy enforcement. This makes it difficult for Filipinos to seek timely and effective recourse when harm occurs online," he said.

"That is why we are exploring the idea of requiring accreditation or registration for social media platforms operating in this country to ensure that there is a readily accessible responsible point of contact that understands and complies with our laws," Acop added.

HOW META HANDLES MISINFORMATION

Meta representatives emphasized they have a three-pronged approach to handle misinformation: remove, reduce and inform.


In the Philippines, the social media platform has been partnering with third party fact-checkers Vera Files, Agence France-Presse and Rappler to identify misinformation since it is more "nuanced" and highly contextual.

"Number one, we remove the most harmful misinformation that can lead to offline physical violence. Number two, we reduce the distribution. If it is found to false by our third party fact-checkers, we will reduce it on our platform. Number three, we seek to inform. If people have engaged in misinformation on our platforms, we want them to know. We want them to know that maybe they have unintentionally shared misinformation," Frankel said.

"I just do want to do a distinction between moderators that look at posts versus third party fact-checkers. Moderators that we have are really looking at potential violations of our community standards. Third party fact-checkers are looking to determine if pieces of content are misinformation," he added.

He added that they outright remove election-related misinformation from their platforms.

Frankel explained that repeated instances of misinformation from the same pages and same people can lead to overall less distribution of content and demonetization, among others.


The Department of Information and Communications Technology Cybercrime Investigation and Coordinating Center (DICT-CICC) said it takes about 6 hours to 24 hours before the requested content is removed from the platform.

When it comes to harmful content, Meta does content moderation through user reports, human review and artificial intelligence.

"With community standards moderation, what we do is we prioritize the most severe harms first. It is not necessarily a first in and first out moderation system," Frankel said.

"What we do is prioritize the most harmful pieces of content first. For example if we see reports of child exploitative imagery, that's gonna jump all the way to the top of the queue. If we see terrorism or incitement to violence, those are going to jump ahead because of the potential to cause harm," he explained.

Frankel said Meta's ability to remove content has no impact on the third party fact-checkers operating on misinformation.


He added that Meta has 11 different reporting channels with local government agencies and releases a transparency report every six months.

Bataan Rep. Geraldine Roman pointed out that the Philippines does not have legislation establishing the duties of social media platforms and other stakeholders.

"It is this void where I think our committee can be most useful, to regulate social media platforms... We can regulate social media platforms. But to actually regulate content creation is unconstitutional," she said.

"Content creation is our fundamental right to free speech and this is guaranteed by the constitution. We cannot regulate opinions, how and what content creators can produce. But we can regulate social media platforms by empowering them to act as gatekeepers," Roman added.

The lawmaker said that if they are to regulate, they must regulate only the social media platforms.


The social media platform expressed support for working with Congress on efforts to craft legislation on these issues.
