A practical decision framework for non-profit organisations: X platform presence
Across the non-profit, public and higher-education sectors, many organisations are reassessing their presence on X. The shift is visible, but the decision is rarely straightforward.
This decision-making framework is designed to support organisations whose work serves the common or greater good. It does not recommend staying on X, nor does it recommend leaving. Instead, it provides a structured way to think through the operational, ethical and public-value considerations involved, from audience need and duty of care, to safeguarding people, institutional responsibility and the consequences of reducing public voice.
The framework is particularly relevant for organisations that communicate in the public interest, feature real people in their work, or rely on timely, trusted information to serve communities. It is intended to help leadership teams make decisions they can stand over, explain clearly, and review over time.
This resource can be used:
to support internal discussion and governance
to inform board-level decision-making
to assess risk and responsibility across platforms
as a starting point for developing a more intentional social media strategy
Leaving or staying can both be principled choices. What matters is that the decision is made deliberately, with a clear understanding of the trade-offs, and with the people you exist to serve at the centre.
Here are the considerations for any public-interest organisation to work through formally:
1. Audience reality
Who do we currently reach on X?
Which audiences would we lose if we left?
Which audiences would we gain if we reallocated effort elsewhere?
2. Content criticality
Is our information time-sensitive, safety-related, or essential for public understanding?
Is this primarily “brand communications”, or is it public service?
3. Harm and safety
What risks does the platform pose to staff, volunteers, community members, or the people we serve?
What is our tolerance for harassment, pile-ons, misinformation adjacency, or content that undermines our mission?
4. Featuring people and safeguarding participants
Do we regularly feature students, service users, community participants, volunteers or beneficiaries in our content?
Have people experienced hate, harassment or abuse as a result of being featured on this platform?
What additional duty of care do we owe to people who are not public figures and did not sign up for platform hostility?
Are our consent, safeguarding and risk-assessment processes still appropriate given the current platform environment?
Would continuing to feature people on this platform expose them to harm we cannot adequately mitigate?
5. Mission impact
Does being present help us deliver on our mission, or does it meaningfully erode trust?
Are we countering harm, or being pulled into it?
6. Capacity and governance
Do we have the resources to moderate replies, protect staff and participants, and respond responsibly?
Do we have clear escalation policies for harassment, misinformation, threats, impersonation or high-risk moments?
7. Alternatives and migration plan
If we reduce activity or leave, where exactly are we directing people?
Are we moving to platforms that our audiences actually use?
What’s our plan for continuity in emergencies or major announcements?
8. Communication and review
How will we explain the decision publicly, in a way that is calm, accountable and focused on service?
What would make us review or reverse this decision in future?
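For teams that record their answers in a shared tool rather than a document, the considerations above can be sketched as a simple working checklist. The Python below is purely illustrative; every class, field and example answer in it is hypothetical and not part of the framework itself.

```python
# A minimal checklist sketch for working through the framework's questions.
# All names here are hypothetical, not part of the framework.
from dataclasses import dataclass, field


@dataclass
class Consideration:
    topic: str                                   # e.g. "1. Audience reality"
    questions: list[str]
    answers: dict[str, str] = field(default_factory=dict)

    def open_questions(self) -> list[str]:
        # Questions the team has not yet recorded an answer for.
        return [q for q in self.questions if q not in self.answers]


# Only the first two considerations are filled in for brevity;
# a real checklist would carry all eight sections.
framework = [
    Consideration("1. Audience reality", [
        "Who do we currently reach on X?",
        "Which audiences would we lose if we left?",
        "Which audiences would we gain if we reallocated effort elsewhere?",
    ]),
    Consideration("2. Content criticality", [
        "Is our information time-sensitive, safety-related, or essential?",
        "Is this primarily brand communications, or is it public service?",
    ]),
]

# Record one answer, then list what still needs discussion at the next meeting.
framework[0].answers["Who do we currently reach on X?"] = (
    "Mainly journalists and local volunteers."
)
remaining = [(c.topic, q) for c in framework for q in c.open_questions()]
```

Keeping each question and its answer together this way also leaves an audit trail, which supports the board-level review and "review over time" uses described earlier.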

