The Risk of Staying on X: Why Safety Must Come First

After the sale of Twitter to Elon Musk in October 2022, the platform began to change.

Two years later, now renamed X, the site was a shell of its former self. Without a trust and safety team, hate speech, pornographic material, and other abusive content proliferated across the platform, making it a hostile place to be. In 2024, Catnip Comms made the decision to leave X and recommended that clients do the same in order to prioritise brand safety.

For many of the organisations we work with, leaving X or scaling down activity on the platform in 2024 didn’t feel like the right move. For some, X was where their communities were most active, and as a result it was where the majority of their social media engagement came from. For others, X was part of their organisation’s legacy: teams had spent years building a following they didn’t want to lose overnight.

However, in the wake of the recent updates to Grok - X’s built-in AI bot - the risk of remaining active on the platform is now too great to ignore.

For those who spent the holidays offline, the Grok ‘backlash’ may have passed you by. Around Christmas Eve, a Grok update went live introducing an image editing feature that allows users on X to ‘edit’ any image posted to the platform simply by clicking a button and prompting the AI tool. Users immediately began testing the limits of the new feature, and by the end of the year, Grok was generating CSAM (child sexual abuse material).

As these illegal images spread across the platform, media outlets rushed to contact X for comment and were instead met with an automated message that simply read: ‘Legacy Media Lies’.

Heading into 2026, Elon Musk and co. have made their intentions entirely clear. Harassment, abuse, and hate speech have all been on the rise since 2022 - and now we can add CSAM to the list. Not only are these images triggering, harmful, and dangerous, they are also illegal, and X has chosen to let its newest AI tool run wild without taking a shred of responsibility.

X is not a safe platform. Remaining active on the platform now means accepting a level of risk - ethical, reputational, and legal - that can no longer be justified by engagement or reach. At this point, the conversation moves beyond performance metrics and towards the associations organisations are willing to accept with a platform that has shown blatant disregard for safety. 

If your organisation is still active on X and considering its next steps, we’re happy to talk. We can offer initial guidance to help you make an informed decision in a rapidly changing social media environment.
