What the TikTok government bans mean for you

 

By KC Ifeanyi

December 18, 2022

Looks like your favorite government officials won’t be doing any makeup tutorials or dance challenges anytime soon.

Earlier this week, the Senate passed legislation to ban TikTok from government devices. This comes on the heels of mounting pressure, predominantly from state Republican lawmakers, to curb TikTok’s presence across the United States, citing national security concerns over the app’s ownership by the Chinese company ByteDance.

In the past two weeks alone, governors from South Dakota, Maryland, Oklahoma, Alabama, South Carolina, Utah, and Texas have barred public employees from using TikTok on government devices.

This wave of banning employees from posting TikToks on their government-owned devices follows similar moves made by the U.S. military, State Department, and Department of Homeland Security. The Senate’s newly passed legislation would extend that restriction to the entire federal workforce.

These bans are not a critical blow to TikTok. There are about 2 million U.S. government employees; state public workers constitute another several million people. TikTok purportedly has one billion monthly active users. But these moves are hardly a sign that TikTok is on sure footing, either.

Should this recent avalanche of action against the popular short-form video app elicit concern that all citizens—and not just those who work for the government—will soon also find themselves banned from using TikTok? Not quite. Will TikTok remain the same in the near future? Maybe not.

A history of tension

TikTok has been caught up in geopolitical controversy from the very beginning. In 2019, the Committee on Foreign Investment in the United States (CFIUS) opened a national security review following ByteDance’s 2018 acquisition of Musical.ly, which was later relaunched as TikTok.

The CFIUS review is standard procedure for a transaction of this sort, but in the summer of 2020, then-President Donald Trump caused shockwaves of panic with an executive order to ban both TikTok and WeChat, a Chinese messaging and payment app, from U.S. app stores.

President Trump’s proposed ban stalled in court and was eventually revoked during President Joe Biden’s administration, which has since pursued a more measured approach to security concerns by calling for data collected from U.S. TikTok users to move to data centers within the country. TikTok has long maintained that data from U.S. users is, indeed, stored in the United States, with backup storage in Singapore.

TikTok now claims that 100% of U.S. user traffic is being routed through Oracle, a software company headquartered in Austin, Texas. (Oracle was TikTok’s would-be buyer in President Trump’s post-ban plans.) In a statement to Fast Company, a TikTok spokesperson says that politicians with national security concerns should encourage the Biden administration to conclude its national security review of the app.

“The agreement under review will meaningfully address any security concerns that have been raised at both the state and federal level,” the spokesperson says. “These plans have been developed under the oversight of our country’s top national security agencies—plans that we are well underway in implementing—to further secure our platform in the United States, and we will continue to brief lawmakers on them.”

More than jingoism

Although American exceptionalism and government fears of Chinese competition—particularly in the context of technological supremacy and artificial intelligence—should not be underestimated as motivations for these bans, TikTok’s own actions have further eroded its standing.

A 2019 blog post from the company asserted that “none of our data is subject to Chinese law.” However, recent reports have surfaced of ByteDance allegedly planning to use TikTok to monitor the location of specific Americans, and that ByteDance employees based in China have repeatedly accessed nonpublic data on U.S. TikTok users.

According to leaked audio reviewed by BuzzFeed News, a member of TikTok’s trust and safety department flat-out said in a 2021 meeting that “everything is seen in China.”

“A platform saying, ‘We don’t gather this,’ and then to have some investigative reporter show that’s actually not true, we’ve seen that pattern over and over again,” says Philip Napoli, a professor in the Sanford School of Public Policy at Duke University. “So, public pronouncements about what kind of data access is or is not there, what kind of data gathering is or is not taking place, don’t mean a ton to me.”

TikTok has also dealt with a cavalcade of other controversies, from censoring marginalized creators to accusations that it doesn’t do enough to stamp out disinformation, dangerous challenges, or pedophilic content. A recent study called the app “the social media equivalent of razor blades in candy.”

So while one can reasonably assume that TikTok has suddenly become Republicans’ new favorite punching bag as a way to score easy political points after a less-than-stellar midterm election, TikTok is also giving them a lot of material to work with. Even if its data is hermetically sealed within U.S. borders.

The political challenge

To Kirsten Martin, director of the University of Notre Dame’s tech ethics center, the TikTok dogpiling seems more like a reaction from politicians who haven’t been able to crack the app’s influence. Martin compares the data collected through TikTok to what marketers and other social media sites scoop up on a regular basis.

“That’s why politicians aren’t ever going to get rid of big data, because their own campaigns use them exhaustively,” she says. “I guess one way that TikTok could become really secure is politicians figuring out how to use it effectively. But right now they don’t, so they don’t see any need to protect it.”

Although the conversation to date has largely focused on data being collected, it’s at least equally important to consider how content is distributed, i.e., what surfaces on a user’s For You page, the highly coveted and mysterious stream of content algorithmically curated to a user’s tastes.

“How you sell products and how you sell political ideas and candidates is not that different,” Napoli says, referencing the popular “TikTok made me buy it” trend where users share viral products found on the app. “So, I feel comfortable assuming that a large-scale political influence operation carried out over TikTok could be quite effective.”

This is not an idle concern. The Digital Threat Analysis Center, which Microsoft acquired earlier this year, identified more than 200 content creators who’ve been producing pro-China content across multiple social media platforms, including TikTok, in 40 languages. Some creators were even state media employees doubling as influencers. According to its report:

By developing an audience around certain target characteristics like language, culture, and interests, influencers are able to tailor their messaging to reflect the CCP’s strategic objectives for a given population. For example, lifestyle influencers whose target audiences live primarily in developing countries often seek to portray China as a potential benefactor and a trustworthy diplomatic ally . . . .

Can you tell the difference between propaganda and an innocuous “get ready with me” video? If politicians are truly concerned with China’s influence in the West and elsewhere, there’s theoretically much more at play than just data harvesting.

But in all likelihood, they aren’t. The current focus on guarding national security by barring state and federal employees from using the app on government devices is too narrow to mean much to the average user or content creator. There are plenty of ways for those employees to work around the law and still get their TikTok fix without technically violating the orders, while still creating the same risks that ostensibly led to the government intervention in the first place.

The bans are currently as performative as the latest viral dance. But if they mean that there are fewer #trafficstophumor videos in your feed, maybe—for now—they’re a low-key win.
