Meta's Disinformation Woes Fuel Business Opportunities for Former Employees
By April Fowell
Since Israel and Hamas went to war last month, there has been a steady stream of false information and violent propaganda on the internet.
The issue is made worse by the fact that Meta, Google parent Alphabet, and X, formerly Twitter, have all cut positions related to content moderation, trust, and safety as part of broader cost-cutting initiatives that began in late 2022 and continued through 2023.
The world's most popular websites are currently struggling to keep up with the volume of content being shared, including violent recordings of terrorist attacks, manipulated audio in news clips, and out-of-context videos of past battles.
Apart from conventional social networks, online platforms that were not previously recognized for facilitating political conversations are now compelled to adopt preventive measures due to the intensely politicized nature of the Israel-Hamas conflict.
Terrorist organizations and other bad actors may exploit popular online messaging and discussion platforms such as Telegram and Discord, as they increasingly use multiple communication systems to plan and carry out their propaganda efforts.
Thousands of users of the children's game platform Roblox recently participated in virtual demonstrations in support of Palestine. According to a Roblox representative who spoke to CNBC in a statement, this has forced the company to keep a close watch for posts that violate its community standards.
The spokesperson stated that Roblox "allows for expressions of solidarity" but "does not allow for content that endorses or condones violence, promotes terrorism or hatred against individuals or groups, or calls for supporting a specific political party." Roblox has thousands of moderators and "automated detection tools in place to monitor," the spokesperson said.
Former Tech Titans Forge New Paths in Trust and Safety Sector
There's no shortage of talent in the trust and safety arena, and former Meta employees remain committed to the cause.
Cove, one of a handful of up-and-coming companies building trust and safety technology they can sell to other businesses under a well-established enterprise software model, was founded by a former Meta employee. Recently, two more Meta veterans launched Cinder and Sero AI to target the same general market.
The founding staff of TrustLab was drawn from Reddit, Google, and ByteDance, the parent company of TikTok. Additionally, the Intrinsic founders had previously worked at Apple on matters pertaining to safety and trust.
TrustCon Conference Navigates Online Safety Concerns Amidst Industry Shifts
Tech policy wonks and other industry professionals traveled to San Francisco in July for the TrustCon conference, where they discussed the latest hot topics in online trust and safety. Among their worries were the possible social implications of the industry's layoffs.
In the exhibition hall, a number of companies displayed their goods in an effort to attract talent, market their offerings, and engage with prospective customers. Among the attendees was ActiveFence, a company that bills itself as a "leader in providing Trust & Safety solutions to protect online platforms and their users from malicious behavior and content." Checkstep, a content moderation platform, also attended.
Social media platforms are currently deluged with news about two wars unfolding at once: one in the Middle East and the other between Russia and Ukraine. On top of that, in less than a year they must prepare for the 2024 presidential election. The front-runner for the Republican nomination is former president Donald Trump, who faces criminal charges in Georgia for allegedly interfering in the 2020 election.