Leading with openness, not fear: How unis can run national security reviews more effectively


OPINION

A frontier AI lab proposes a cross‑border project. One collaborator sits in a flagged jurisdiction; another has defence funding. Your research office raises export‑control issues; the faculty worries about academic freedom; the funder wants new attestations. Everyone is trying to do the right thing – and the lines are still fuzzy.

That’s the reality for Australian universities. As the Australian Department of Education’s latest Pulse Check on the implementation of the Guidelines to Counter Foreign Interference in the Australian University Sector highlights, the sector operates in an environment that government acknowledges is more complex and more exposed to foreign interference risks, even as our mission depends on global collaboration and open knowledge flows.

What should be clear to everyone is that national security reviews are now a core leadership discipline. They’re not the form you lodge at the end of a grant; they’re strategic decisions that balance protection with openness.

Australia can be a global leader in this space – if we double down on proportionality, transparency, and partnership, building on the collaborative architecture we already have through the University Foreign Interference Taskforce (UFIT) and strengthened processes at the Australian Research Council (ARC).

We’re not starting from scratch – so stop acting like we are

As we start another year of global geopolitical uncertainty, let’s not forget the positive steps that have already been taken and the tools already available to aid decision-making.

UFIT – co‑led by government and the sector – provides shared principles, guidance and a forum for joint problem‑solving, while the ARC’s research‑security processes move screening earlier in the grants lifecycle to surface issues before late‑stage surprises. The Department of Education’s sector‑wide Pulse Check shows maturing practice, but also persistent pain points – especially the need for proportionate, practical guidance that preserves academic freedom while addressing real risks.

Export controls have also been modernised. The Defence Trade Controls Amendment Act 2024 introduced “deemed export” offences (e.g., supplying certain controlled technology to non‑exempt foreign persons in Australia) and created a licence‑free environment for specific transfers among AUKUS partners, as outlined by Defence’s official implementation overview.

That raises the stakes, whilst also opening pathways to accelerate secure collaboration – if leaders can segregate sensitive and non‑sensitive flows and manage access with discipline.

Let’s also name a persistent problem: opacity. Universities want tailored advice; agencies are constrained by secrecy provisions. The result can be blunt, risk‑averse bans that chill legitimate work while missing genuine risks.

These are concerns echoed by sector experts in Times Higher Education’s analysis of secrecy’s unintended effects (here) and in the Education Department’s own reflections on sector needs (here). Higher education leaders should continue to push government, respectfully but firmly, for unclassified pattern‑of‑risk briefings, and then publish how those patterns translate into internal rules of engagement.

Our allies share the same intent, but use different levers

United States. Washington is codifying institutional capability. Under National Security Presidential Memorandum‑33 (NSPM‑33) and the CHIPS and Science Act, institutions receiving more than US$50 million a year in federal science and engineering funds must certify a research‑security program spanning cybersecurity, foreign travel security, research‑security training and export‑control training. The White House Office of Science and Technology Policy’s 2024 memo standardises certification (OSTP guidance), and NIST’s handbook translates expectations into operational steps (NIST CHIPS guidebook).

United Kingdom. The UK’s Trusted Research materials are plain‑English, risk‑based playbooks for academics and senior leaders; they defend collaboration by making it safer with partner‑vetting questions, legal hooks and leadership checklists (NPSA guidance; NCSC overview).

Canada. Ottawa has drawn bright lines for sensitive technology research. If a grant aims to advance listed sensitive technologies, named researchers cannot be affiliated with specified organisations linked to foreign militaries or security services; applicants must attest compliance (Policy on Sensitive Technology Research and Affiliations of Concern; Government publications record).

The convergence is clear: proportionate risk management that preserves openness and avoids discrimination. The differences are tactical: U.S. capability certification (OSTP), UK playbooks (Trusted Research), Canada’s domain‑specific prohibitions (STRAC), and Australia’s collaborative UFIT model now reinforced by modernised export controls (Defence overview).

Five decisive moves to improve outcomes for leaders

1) Open by default, controlled by exception. Reaffirm openness as the default and document exceptions with clear risk rationales and mitigations. Security should safeguard academic freedom and collaboration, not shrink them – an explicit UFIT principle (UFIT info).

2) Build institutional capability, not case‑by‑case heroics. Stand up (or uplift) a research‑security program office covering training, partner vetting, export controls and cyber baselines; the U.S. model shows how capability reduces late‑stage collapses and speeds good decisions (OSTP guidance; NIST playbook).

3) Move screening to the left. Emulate the ARC: run early checks (on partners, affiliations, export‑control flags) at the point of ideation or internal approvals, not after peer review, to catch problems when they’re still fixable (ARC research‑security; ARC CFI framework).

4) Use bright lines where they help; keep judgment where it matters. For quantum computing, advanced AI, and other sensitive technologies, consider Canada‑style attestations and clear, broad exclusions; for nuanced HASS collaborations, retain case‑by‑case discretion anchored in UFIT principles and open‑source due diligence (STRAC; UFIT info).

5) Demand two‑way guidance. Through UFIT and sector forums, continue to ask for regular unclassified pattern‑of‑risk briefings; in return, share anonymised campus insights so policy can adapt (Pulse Check; UFIT overview).

This is an opportunity, not just a compliance burden

Build visible, proportionate systems and you unlock a trust dividend with funders and partners: faster decisions, fewer late‑stage vetoes, and more confidence to green‑light the right projects. The ARC has already moved to earlier due diligence and U.S. funders are rewarding institutions that can certify research‑security capability (OSTP guidance).

The AUKUS licence‑free environment can be a growth engine for joint research – if you segregate sensitive workstreams, control access to DSGL technology, and manage “deemed export” risks with discipline (Defence overview; Act text). And there’s no shortage of practical tools: the UK’s Trusted Research checklists, the U.S. NIST playbook, and Australia’s UFIT partnership model are ready to adopt, adapt and publish today.

The best rebuttal to claims that proportionate security “chills collaboration” is clarity. Sector voices have warned that heavy‑handed crackdowns risk undermining collaboration without improving security (The Conversation analysis); a flexible, evidence‑based approach can protect both research and values (Pulse Check).

Lead with a trust‑and‑verify posture. Keep doors open, apply targeted mitigations, and document why you said yes, no, or not yet. Australia’s UFIT model, ARC early screening and updated export controls – judiciously combined with the UK’s Trusted Research playbooks, U.S. capability standards and Canada’s clarity on sensitive technologies – are more than enough to act with confidence now.

Ross McLennan is Pro Vice‑Chancellor, Research Services at Macquarie University. He is co-Chair of the Australian Government’s Trusted Information Sharing Network (TISN) Higher Education and Research Sector Group and a member of the Critical Infrastructure Advisory Council.

The views expressed are the author’s own and do not necessarily reflect those of his employer. He receives no external funding related to this article.
