In recent days, several TikTok content creators have had significant difficulty uploading videos about the activities of U.S. Immigration and Customs Enforcement (ICE). Among them is comedian Megan Stalter, known for her satirical character sketches and for her substantial followings on Instagram and TikTok. Moved by the death of Alex Pretti, a nurse allegedly shot and killed by federal immigration agents during a Minneapolis operation, Stalter set out to share a personal message urging Christians to oppose ICE raids. "We have to abolish ICE," she asserted, arguing that the stance aligns with Christian values.
While her video gained traction on Instagram, with more than 12,000 reposts, Stalter was unable to upload the same content to TikTok despite multiple attempts. Frustrated by the failures and suspecting suppression because of the political nature of her message, she ultimately deleted her TikTok account. Efforts to reach Stalter for further comment were unsuccessful.
Other TikTok users reported similar experiences: videos critical of ICE repeatedly failed to upload over the weekend. The pattern of technical trouble has sparked widespread speculation that TikTok may be censoring politically sensitive content. The issue drew public attention, including remarks from Connecticut Democratic Senator Chris Murphy, who called the alleged censorship on TikTok a significant threat to democracy, ranking it "at the top of the list" of modern challenges. Representatives from Senator Murphy's office were contacted for comment but have not provided a statement.
Responding to these concerns, TikTok issued a statement attributing the upload delays and video visibility issues to a power outage at a U.S. data center. A spokesperson for TikTok's U.S. joint venture emphasized that the technical problems are ongoing but unrelated to recent high-profile news or to any particular content themes, implying no intentional restriction of specific subject matter.
The timing coincides with a major change in TikTok's U.S. ownership structure. Last week, a newly formed joint venture with majority American ownership assumed control of TikTok's American assets. The transition stems from a 2024 statute, enacted under the prior administration, that required TikTok's Chinese parent company to divest the app's U.S. operations or face a ban in the United States. Oracle, whose executive chair Larry Ellison is an ally of President Donald Trump, is among the new investors. Oracle will manage U.S. user data in a "secure U.S. cloud environment," and the new joint venture will hold authority over trust and safety policies and content moderation decisions.
As a privately owned platform, TikTok is legally entitled to set the parameters of content sharing according to its own policies. However, the opacity of its recommendation algorithms and the recent ownership shift have cultivated distrust among American users, especially around sensitive political narratives. Casey Fiesler, an associate professor of technology ethics and internet law at the University of Colorado Boulder, noted that broader mistrust of social media leadership amplifies skepticism of TikTok's new governance. Given the new owners' ties to the Trump administration and the concurrent ICE raids in Minneapolis, she said, users' concerns about potential content bias are understandable.
Fiesler noted that misinformation about supposed revisions to TikTok's terms of service, particularly regarding location tracking and data collection, began circulating widely shortly after the ownership transfer. Those concerns have prompted users to question who controls their data and how the change in ownership could affect content visibility and recommendation practices.
Fiesler herself, for example, successfully posted several videos debunking rumors about TikTok's new terms. Nevertheless, some videos alluding to ICE activity in Minneapolis encountered adverse treatment during the upload process, including prolonged review periods and posts that published only partially. One such video eventually went live, but its captions were unresponsive and its view counter non-functional for hours afterward. She emphasized that even if such glitches are unintentional, they still affect users' perception of and trust in the platform.
Jen Hamilton, a veteran nurse and TikTok content creator with more than 4.5 million followers, reported that her uploads began failing on January 22, the day the new U.S. ownership was announced. One video, centered on Liam Conejo Ramos, a 5-year-old reportedly taken into custody by federal agents, never became publicly viewable. Hamilton called the timing ironic and said subsequent attempts to post videos about the incident also failed.
Hamilton expressed skepticism that the failures were accidental, suspecting a systematic change in how content about ICE is handled on the platform. While acknowledging there is no concrete evidence of intentional censorship, she pointed to the coincidence in timing with TikTok's change of ownership. The climate has prompted some users to sever ties with TikTok altogether; Hamilton herself is considering alternative platforms such as Substack and Patreon to maintain unfiltered communication with her audience.
Experts in media law and ethics, including University of Cincinnati professor Jeffrey Blevins, say that proving deliberate censorship by TikTok would be difficult given the non-transparent nature of social media algorithms. Moreover, as a private entity, TikTok may moderate content at its discretion without implicating First Amendment protections. Blevins cautions against the common misconception that social platforms are public forums in the legal sense.
Data provided to CNBC shows that TikTok's daily uninstall rate has surged roughly 150% over the past five days compared with its three-month average, suggesting growing user disaffection. Hamilton attributes the trend to a combination of fears about content suppression and uncertainty over the platform's evolving policies and ownership.
Despite these concerns, Hamilton has adapted her approach to political topics on TikTok. In a video posted to the platform, she ironically refers to herself as a "fashion influencer" and uses coded references to discuss sensitive subjects, including the federal actions involving children such as Liam Conejo Ramos. "Fashion influencing is in my blood, and even a company with bad customer service won't keep me from doing my fashion review," she remarks wryly, signaling her intent to keep posting, albeit with strategic changes to how she presents her content.
The evolving situation illustrates the complex challenges TikTok faces as it balances technical infrastructure problems, user trust, content moderation policies, and the political sensitivities tied to its ownership transition and U.S. governance demands. For affected creators and the broader user community, it raises enduring questions about transparency, the limits of platform control, and the implications for discourse on politically charged topics.