When the world is a sea of risk, you need to know your business, your people, and your intellectual property are safe.
We’re your Liferaft.
It’s true, we’ve seen some fringe networks come and go, and when they go, security folks breathe a sigh of relief. But how do you know they are really gone and deplatformed for good? The answer is, you don’t. Chatter on those “thought to be gone” platforms that once kept you up at night still deserves, at the very least, passive monitoring. To illustrate why, let’s look at a case study of the fringe platform Kiwi Farms.
Kiwi Farms is an online forum notorious for organizing targeted harassment against people it labels as “lolcows,” especially transgender, neurodivergent, and otherwise marginalized individuals. Its trajectory from a niche trolling board to a global flashpoint in debates over online abuse, deplatforming, and free speech shows how persistent and networked harassment can spill violently into the offline world.
Kiwi Farms began around 2013 as the “CWCki Forums,” a spin‑off space dedicated to obsessively cataloguing and mocking Christine Weston Chandler (“Chris Chan”), an autistic webcomic creator who had been targeted by trolls since the late 2000s. Under administrator Joshua Conner “Null” Moon, the forum rebranded as Kiwi Farms and broadened into a hub for tracking and ridiculing a growing roster of online personalities considered “eccentric.” Over time, its culture hardened around coordinated trolling and doxxing, with threads collecting personal information, family details, and work contacts for use in harassment campaigns.
Kiwi Farms users have repeatedly escalated from mockery to systematic abuse: doxxing targets, threatening violence, mass‑emailing employers, and “swatting” victims with false emergency reports designed to send armed police to their homes.
Several harassment targets, including software developer “Near,” are widely understood to have died by suicide after sustained campaigns originating on the forum, which users sometimes celebrated as proof of success. The site hosted content tied to real‑world atrocities, such as materials linked to the Christchurch mosque shooter, further cementing its reputation as an extremist‑friendly space rather than a neutral message board.
In 2022, a sustained campaign against Canadian trans streamer Clara “Keffals” Sorrenti, who faced doxxing, swatting, and death threats coordinated on Kiwi Farms, triggered the broader #DropKiwiFarms movement. Activists pressured infrastructure companies, including Cloudflare and hosting providers, to cut services. Cloudflare ultimately withdrew protection, calling the targeted threats emanating from the forum an “immediate threat to human life,” which briefly forced the site offline and into a game of host‑to‑host migration. Other organizers, including technologist Liz Fong‑Jones, pursued civil litigation against companies that helped keep Kiwi Farms online, winning a significant defamation judgment in 2023 and continuing behind‑the‑scenes deplatforming work.
Despite repeated takedowns, service withdrawals, and data‑hosting disruptions, Kiwi Farms has not disappeared. It periodically resurfaces on new domains and infrastructure, often with degraded performance or regional blocking, and remains a focus of regulatory and legal scrutiny. In 2025, for example, its operators joined a constitutional challenge to the UK’s Online Safety Act in a US court, underscoring that the forum still operates as a going concern. Indeed, they are not gone.
For activists, targets, and policymakers, Kiwi Farms now serves as a case study in both the potential and the limits of deplatforming: ousting a high‑risk harassment hub from mainstream infrastructure can dramatically reduce its reach and capacity for harm, but persistence on the fringes means the danger to vulnerable people has not fully gone away.