Adopting AI with Empathy
By Emma Bayne, Principal Consultant / 7 October 2025

The more I read about AI, and the more it becomes part of our futures, the more I find myself contemplating the real impact it will have on our experiences, the importance of empathy in those experiences, and how we can forge a path that balances the two.
AI has such wide-ranging possibilities and is already shaping the UK higher education landscape. From chatbots fielding common queries to platforms tailoring learning pathways, AI has become embedded in university life and increasingly in our daily lives.
It is a driver of innovation, efficiency, and competitiveness. However, we cannot think of AI in terms of efficiency alone. If adoption is transactional, we could lose the qualities that make higher education transformative: empathy, trust, and human connection. The real challenge is not whether to adopt AI, but how to integrate it in ways that amplify the human experience.
The promise of AI in higher education
The opportunities are vast. AI can adapt to learners in real time, identify early signs of disengagement, and free staff from repetitive tasks so they can focus on high-value work. These capabilities hold the potential to make education more personalised, responsive, and engaging.
Universities are already demonstrating what is possible. The University of Birmingham is piloting Jisc’s Graide, reducing STEM marking time by nearly 90% while improving feedback quality. The University of Leeds and the Open University are using AI to analyse student voice data at scale, helping leaders respond more quickly to emerging needs. Most recently, the University of Oxford has become the first UK institution to give its entire community free access to ChatGPT Edu, backed by sound governance structures, training opportunities, and an AI Competency Centre.
These initiatives show how AI, thoughtfully deployed, can strengthen rather than weaken the relationships that underpin learning.
The risks of AI adoption without empathy
It is important to remember that technology is not neutral: it carries inherent bias. AI systems risk amplifying existing inequalities if access and digital confidence are uneven. Bias in training data could lead to unfair outcomes, while opacity in decision-making may undermine trust. If students experience AI as impersonal automation, universities risk eroding belonging, mentorship, and the many other aspects of the university experience that make it so valuable.
In short, an efficiency-first approach may make processes smoother, but this risks hollowing out the essence of education.
What empathy in AI adoption requires
Empathetic adoption of AI needs much more than technical competence; it requires leadership committed to care, inclusion, and accountability. These themes are common to all digital transformation and change programmes, yet unfortunately they are often the areas that receive the least attention. Here's how we can adopt AI with empathy:
Co-creation. Students and staff must be involved in shaping AI tools so that they reflect real needs and build trust.
Accessibility by design. Systems should work for diverse learners, including disabled and neurodiverse students and those for whom English is an additional language.
Transparency. Universities must explain how AI systems work and give students agency over how they use them.
Data stewardship. Privacy and academic freedom must be protected, with data used to support rather than police learners.
Some institutions are already modelling this approach. UCL's Academic Communication Centre and Students' Union co-run workshops, on topics including 'Critical use of GenAI tools in writing' and 'Language + Writing Workshop using GenAI', to help students use AI tools responsibly in their writing.
Another example is the University of Greenwich, which has partnered with Studiosity+ to give all students access to AI-for-learning technology that provides written feedback, with a focus on inclusive access and on building students' skills in using AI throughout their education journey and beyond.
These examples show AI as an enabler of inclusion when empathy is at the core of adoption.
The requirement for leadership
The UK higher education sector operates within a regulatory environment that encourages innovation while demanding compliance. The Office for Students (OfS) requires fairness and transparency in how AI is used in assessment and outcomes. Government strategies emphasise digital skills and safety alongside adoption.
Within this context, university leaders must act decisively. Vice-chancellors and senior teams must set a clear vision and governance structures for AI that balance ambition with responsibility. Staff development must extend beyond technical training to include reflection on ethics, fairness, and wellbeing. Leaders also have a responsibility to cultivate cultures where empathy and innovation reinforce one another.
Looking ahead
Moving from principles to practice requires steps like:
Investing in AI literacy for staff and students to ensure equitable participation.
Collaboratively piloting tools, involving stakeholders from the outset.
Continuously evaluating impact, measuring outcomes not only in efficiency but in wellbeing, inclusion, and trust.
Reinforcing human-AI partnerships, keeping teaching and mentorship at the heart of the student experience.
One promising model, from the Open University, pairs predictive analytics with dedicated student support. If a student is flagged as at risk of not completing, or of missing their next assignment, the system prompts staff to reach out personally and offer proactive support. This is a great example of combining machine efficiency with human empathy.
Conclusion
The choices universities make today will shape whether AI amplifies or diminishes the values of higher education. With empathy as the guiding principle, AI can enhance inclusion, strengthen relationships, and give both staff and students more space for the kind of learning that changes lives.
By embedding care, dignity, and trust into AI strategies, universities can ensure technology serves as one bridge to a more inclusive future of learning.
If you’d like a conversation with us about how to approach digital transformation with empathy, please get in touch.



