Keeping it human: Governance professionals and AI
The real challenge for governance professionals isn’t whether to use AI, but how to use it well: making smart, informed choices that protect what matters most - trust, integrity, and human judgement.
AI is changing the way we all work, fast.
For governance professionals, it promises efficiency, speed and new ways to manage information. But with every innovation comes risk. The real challenge isn’t whether to use AI, but how to use it well: making smart, informed choices that protect what matters most – trust, integrity, and human judgement.
At our last Governance Professional Network meeting, we heard a range of experiences and viewpoints – from the reluctant:
"Not using and not keen to do so - not scared just think brain power is better. Also potentially losing control as we allow confidential matters to be 'seen' by AI seems foolhardy."
To those already experimenting:
"Yes, I use it to summarise information on non-confidential documents and write emails and summaries occasionally but always check first of course!"
AI can’t replace human quality assurance, but it can help free up time for what really matters: strategic thinking, stakeholder engagement and the advice and support that sits at the heart of the governance professional’s role.
NGA’s position is clear: every governing board meeting - whether maintained, local academy committee or trust board - needs to have a skilled governance professional in attendance.
While it’s tempting to focus on efficiency, it’s easy to forget why governing boards exist in the first place. The best decisions come from diverse voices, open conversations, and mutual respect, not just around the board table, but across our whole school communities.
Let’s explore the role AI might play within those communities, how we might exploit its capabilities, and what its limitations are.
What is AI?
AI refers to systems designed to perform tasks that normally require human intelligence, such as recognising patterns, making decisions, understanding language, or learning from data. Most of us already use AI every day, whether it's Netflix suggesting what to watch next, or a customer service chatbot answering a quick question online.
In our professional lives, the most common form we’ll see is generative AI. This type of AI uses large language models (LLMs) to recognise patterns and create content, from text and images to audio and video. It can be a useful tool for researching topics, summarising long documents, developing ideas, or drafting and editing written work.
What tools can I use?
We recommend that you stick to AI tools provided by or endorsed by your school or trust’s IT team or data protection officer (DPO).
Many tools now offer education-specific versions with stronger security features. Check that your DPO and school leaders have considered legal risks such as discrimination, data misuse and liability.
A data protection impact assessment (DPIA) is needed when a new digital tool could pose a higher risk to people's personal information. This includes any situation where an external company will have access to personal or private data.
For example, if a governing board is asked to approve a new online system that stores board papers, trustees’ contact details or records of meeting attendance, a DPIA should be carried out.
This is because a third-party provider will be handling confidential, and sometimes sensitive, information on the board's behalf.
The most robust AI tools come at a cost. Schools and trusts will need to evaluate whether those costs are justifiable for the benefits achieved.
NGA recently launched NGA Assist, a new AI-powered chatbot designed specifically for the governance community. The tool is available now for NGA Gold Governing Board and NGA MAT members at no extra cost.
Other AI platforms that NGA members tell us they have been exploring include:
- Generative AI tools, such as Microsoft Copilot, ChatGPT and Google Gemini
- Meeting transcription tools, including Otter and Fireflies
- Corporate board tools, like Board Intelligence
How can NGA Assist help governance professionals?
NGA Assist offers quick, reliable answers drawn directly from NGA’s guidance, resources and training materials.
It can also draft meeting agendas, board papers and policy documents aligned with best practice. NGA Assist will request some basic information to make a start, then compose the document for you.
Watch the short video with our director of marketing and communications, Rob Peters, to see how it works.
How can I avoid AI risks?
While AI offers many opportunities, it also carries risks. A key requirement for those using AI, including governance professionals, is to acknowledge those risks and take steps to minimise them.
- Refer to relevant school or trust policies, such as data protection, online safety and cyber security, and check that your use of AI falls within the expected standards. We’ve produced a model AI policy that covers roles and responsibilities, appropriate use of AI and AI misuse.
- Be transparent and ensure accountability. Document AI use and make sure final decisions are reviewed and approved by a human professional.
- Avoid inputting or uploading personal data unless you’re confident that the tool is GDPR compliant and handles data securely and legally. Never share information that could identify individual pupils, staff, or board members. DfE guidance recommends that personal data is not used in generative AI tools.
- Always check the accuracy of AI outputs. We know some governance professionals use AI for meeting transcription and note-taking. AI can sometimes struggle to pick up everything that is being said, so have a backup option, such as a recording, that can be used to produce minutes. AI is also known to ‘hallucinate’, meaning it can present fabricated or nonsensical information as fact.
- Check for bias and fairness. Machine learning uses data sets that are not necessarily representative of the populations that AI is designed to serve. Always review outputs critically.
- Seek permission where needed. Materials protected by copyright can only be used to train AI if there is permission from the copyright holder. For example, permission may be needed when publishing a policy that has been created by an AI tool that used input taken from another school’s policy.
Where do I start?
It’s important that everyone understands and feels comfortable with how AI is being used. Talk to your boards and the school leaders you work with about the benefits and risks before you introduce new practices like using AI for meeting transcription.
It’s likely that levels of understanding and experience of AI around the table will vary. You might like to share our Navigating AI: governance essentials webinar to start the conversation.
If you’re feeling ready to experiment or extend your use of AI, start by asking yourself:
- What are the repetitive tasks I need to complete regularly?
- Is there a task I would be willing to share with AI? (Hint: don’t choose something you enjoy!)
- Which AI tool is most helpful for this task?
Staying informed
AI is developing at an astonishing pace, with its reach and capability expanding every day. Its influence will be felt across education and far beyond, creating both opportunities and challenges. These will apply not only to our professional lives, but also to the future careers of children and young people.
By staying informed, we can make the most of what AI has to offer, while also understanding how it’s reshaping our world, our schools and the lives of those we serve.