Friday, January 17, 2025
Guarding facts in the digital age


By VK Lyngdoh

Guarding facts in the digital age is more crucial than ever, especially with Meta’s recent decision to abandon its fact-checking programme. There is a need to educate the public on how to critically evaluate information sources. This includes understanding biases, recognising credible sources and verifying facts before sharing. Independent fact-checking organisations should be encouraged and supported, as they play a vital role in verifying information and debunking false claims. Technology must also be leveraged: AI and machine learning can be used to detect and flag false information, helping to identify patterns of misinformation and alert users to potentially false content. Communities too can be involved through community-driven fact-checking initiatives. Though this approach has its challenges, it can be effective if combined with proper oversight and guidelines.
As far as regulatory measures are concerned, governments can implement regulations that hold social media platforms accountable for the spread of misinformation. This could include penalties for platforms that fail to address false information. Traditional media outlets, with their rigorous editorial standards, can serve as a reliable source of verified information, and collaboration between social media platforms and traditional media can help ensure the dissemination of accurate information. Social media platforms should be transparent about their content moderation policies and provide users with clear guidelines on how to report false information. By combining these strategies, we can create a more informed and resilient digital society.
Holding social media platforms accountable requires a multi-faceted approach. Among the proactive measures that can be taken, governments can enact laws that require social media platforms to be transparent about their content moderation policies and that hold them accountable for the spread of misinformation. For example, the PACT Act in the United States aims to make platforms more accountable for content moderation. Similar laws include the Online Safety Act in the United Kingdom, the Digital Services Act in the EU, the Sharing of Abhorrent Violent Material Act (2019) in Australia and the NetzDG (Network Enforcement Act, 2017) in Germany. The German law requires social media platforms to remove certain illegal content within 24 hours of it being reported or face fines; it is aimed at curbing hate speech and fake news. The Online Harms Act, a proposed legislation in Canada, would allow the government to fine companies for not following orders from a new Digital Safety Commission regarding the removal of content deemed “legal but harmful.” These acts and laws focus on different aspects of social media governance, from content to privacy, reflecting the complex landscape of regulating digital platforms. However, their effectiveness, enforcement and balance with free speech rights continue to be debated globally.
In our country we have the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which mandate social media platforms to exercise greater diligence in content moderation. They require platforms to appoint a Chief Compliance Officer, a Nodal Contact Person and a Resident Grievance Officer in India to address user complaints and grievances. The rules have introduced important measures to regulate social media content, but their effectiveness largely depends on the commitment of platforms to adhere to these guidelines and on the government’s ability to enforce them fairly. Despite these measures, there have been criticisms regarding the implementation and enforcement of the rules. Some argue that the rules may lead to over-censorship and stifle freedom of expression. Additionally, there are concerns about the potential misuse of the rules for political purposes.
In fact, social media companies should regularly publish transparency reports detailing their content moderation efforts, including the number of posts removed, the reasons for removal and the appeals process. Conducting independent audits of social media platforms can help ensure that they are adhering to their content moderation policies and not promoting harmful content for engagement. Platforms should provide users with tools to report misinformation and harmful content easily, and should educate users on how to identify and avoid misinformation. They should also be transparent about how their algorithms work and how they prioritise content, including providing users with options to customise their feed and reduce the impact of engagement-based algorithms that promote sensational content.
Social media companies should collaborate with independent fact-checking organisations to verify the accuracy of content and flag false information. Imposing financial penalties on platforms that fail to address misinformation and harmful content can incentivise them to take their responsibilities seriously. Last but not least, governments and organisations can run public awareness campaigns to educate people about the dangers of misinformation and how to critically evaluate the information they encounter online. Implementing these measures can create a safer and more accountable digital environment.
CS Krishnamurthy, in his article “Guarding facts in the digital age” (ST, January 2025), is absolutely correct that the absence of vigorous fact-checking mechanisms deepens societal rifts and fuels unrest. Without fact-checking, false information can spread rapidly, leading to misunderstandings and misconceptions. This can create divisions within communities and exacerbate existing tensions. Misinformation often reinforces existing biases and beliefs, leading to increased polarisation. People may become more entrenched in their views, making it harder to find common ground and engage in constructive dialogue.
When people are exposed to conflicting information, it can erode trust in institutions, media and even each other. This lack of trust can undermine social cohesion and make it difficult to address collective challenges. In extreme cases, misinformation can incite violence and unrest. False claims and conspiracy theories can provoke fear and anger, leading to protests, riots, and other forms of social unrest. A well-informed electorate is essential for a functioning democracy. Without reliable information, citizens cannot make informed decisions, which can undermine the democratic process and lead to the election of leaders who do not represent the best interests of the people. To mitigate these risks, it is crucial to promote media literacy, support independent fact-checkers, and hold social media platforms accountable for the content they host.
