TikTok and content moderation: Should Indonesia worry?
By: Haikal Al-Asyari / The Jakarta Post / ANN
12:16pm, 08/04/2023

It is unfathomable what type of content we can encounter on our smartphones on a daily basis: from cooking tutorials and people playing with lato-lato (clackers) to elderly people bathing in mud and children taking part in dangerous trends.

Some of this content lies within a gray area of legality.

Recently, the global community was dumbfounded by the United States congressional hearing that grilled TikTok CEO Shou Zi Chew about data security and harmful content on the platform.

Some US lawmakers are striving to ban the app nationwide for fear of harmful content and security threats.

But is banning social media platforms really the answer? Are there any alternatives to strike a fine balance between citizens’ freedom of expression and protecting them from harmful and illegal content?

On March 23, Chew was questioned extensively about TikTok’s policies and practices on content moderation, data security and the protection of young users.

While some criticized Congress’ lack of research in its questioning, the hearing brought to light the urgency of strengthening data privacy protection and content moderation.

One congressman asked an interesting question: how does the TikTok algorithm “push” dangerous challenges to the feeds of young children?

The case of a 10-year-old girl participating in a “blackout” choking challenge served as an example of harmful content found on TikTok.

This is just one example of why content moderation is crucial, and Indonesia is no exception.

Early this year, the Communications and Information Ministry asked TikTok to ban all content related to mandi lumpur (mud baths).

The online begging phenomenon is one of many situations, alongside pornographic, inappropriate and harmful content, where platforms should cooperate effectively with the government for moderation.

It may come as a surprise that the debate over what counts as “graphic” or “obscene,” or what content contains “nudity,” has been going on for decades.

Ever since information was printed on paper, content moderation has been inseparable from the media.

Content moderation is aimed at protecting one user from another, or one group from their opposition, and removing content that is considered “offensive” or “illegal.”

Content moderation involves anything from removing content, flagging, filtering, labeling and demonetizing to deactivating comments or certain features and blocking users.
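
These actions form a spectrum of enforcement responses. As a loose illustration of how such graduated enforcement can be modeled in software (a generic sketch only, not TikTok’s or any platform’s actual system; the action names, severity scale and thresholds are hypothetical):

```python
from enum import Enum, auto


class ModerationAction(Enum):
    """Enforcement actions a platform might take, from mildest to harshest."""
    LABEL = auto()             # attach a warning or context label
    DEMONETIZE = auto()        # remove advertising revenue from the post
    DISABLE_COMMENTS = auto()  # deactivate comments or other features
    REMOVE = auto()            # take the content down
    BLOCK_USER = auto()        # suspend the offending account


def choose_action(severity: int, repeat_offender: bool) -> ModerationAction:
    """Map a reviewer-assigned severity score (0-10) to an enforcement action.

    Thresholds here are arbitrary placeholders, not any platform's real policy.
    """
    if repeat_offender and severity >= 7:
        return ModerationAction.BLOCK_USER
    if severity >= 7:
        return ModerationAction.REMOVE
    if severity >= 5:
        return ModerationAction.DISABLE_COMMENTS
    if severity >= 3:
        return ModerationAction.DEMONETIZE
    return ModerationAction.LABEL


print(choose_action(severity=8, repeat_offender=False))  # ModerationAction.REMOVE
```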

In principle, because user access to social media platforms is governed by agreements grounded in freedom of contract, platforms are free to restrict whatever content they deem necessary.

Nonetheless, they should still respect human rights, the freedom of expression, the right to privacy and due process.

The current approach to content moderation relies heavily on self-regulation, where individuals decide for themselves what information they share, with whom and for what purposes.

[Photo] TikTok has officially taken down “ngemis online” (online begging) content on its platform, which many deemed to be exploiting elderly individuals. (KOMPAS)

Moderation policies are reasonable compromises between users with different values and expectations, as well as with the competing demands of profit.

But the question is, which “values” do the platforms follow? Is it their own or the users’?

There is a new consensus to move from self-regulation toward co-regulation, where the burden of decision-making responsibility is shared between companies and the government.

In this sense, states need to take the role of “structuring self-regulation” while providing companies some room for decision-making.

Such an approach also involves multiple stakeholders, including industries and public interest groups, to establish a mandatory code of conduct.

Launched in August 2018, TikTok is a subsidiary of Chinese technology firm ByteDance Ltd., which is based in Beijing.

Given the company’s seat in Beijing and its perceived affiliation with the ruling Communist Party, the US government worries that Chinese authorities could force ByteDance to hand over data on American TikTok users, exposing sensitive information.

This is based on China’s 2017 National Intelligence Law, which states that “any organization” must assist or cooperate with state intelligence work.

Nonetheless, a spokeswoman for the Chinese Foreign Ministry has stated that China will never ask companies to “collect or provide data, information or intelligence” held in foreign countries.

During the congressional hearing, Chew also promised that data on American users would be stored on servers operated by an outside contractor, Oracle Corp., as part of “Project Texas.”

He claimed that all US citizens’ data would be stored in the US.

In terms of content moderation, China does not face the same problem as other countries.

China has an exclusive tool in Douyin, its domestic version of TikTok, which applies strict censorship rules to prohibit material deemed subversive or pornographic.

Such extensive filters block most users in China from content deemed to be against public morals and interests.
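
As a rough illustration of how keyword-based filtering of this kind can work in principle (a generic sketch, not Douyin’s actual system; the blocklist terms and function name are hypothetical, and real systems add far larger term lists plus machine-learning classifiers):

```python
import re

# Hypothetical blocklist; a real platform would maintain this centrally.
BLOCKED_TERMS = {"subversive-term", "banned-challenge"}


def is_blocked(caption: str) -> bool:
    """Return True if the caption contains any blocked term (case-insensitive)."""
    words = re.findall(r"\w+(?:-\w+)*", caption.lower())
    return any(term in words for term in BLOCKED_TERMS)


print(is_blocked("Join the banned-challenge today!"))  # True
print(is_blocked("A harmless cooking tutorial"))       # False
```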

One of the unique challenges Indonesia faces stems from its great diversity of social, cultural and religious values, which are embedded in deep historical roots.

These elements could become either a shield against, or a weapon for, the spread of false information, hate speech or violent extremist material.

According to Sherly Haristya in her report with Article 19, effective content moderation in Indonesia requires a transparent and sustainable dialogue between platforms and local civil society groups.

A local coalition on freedom of expression and content moderation is one of the keys to the multi-stakeholder mechanism for the oversight of content moderation.

This is because local decision-makers are well informed about the local context and understand its cultural, linguistic, historical, political and social nuances.

Some policy solutions Indonesia should consider include strengthening content moderation by platforms and adding new censorship and surveillance requirements for technology companies.

The government should also push toward greater transparency and data sharing among platforms, obliging companies to fully disclose self-regulatory efforts to address illegal and harmful content on their platforms.

(Haikal Al-Asyari is a lecturer in law at Gadjah Mada University and a PhD candidate at the University of Debrecen, Hungary.)
