Mr Verry, Head of Digital Strategy
I recently had the privilege of running a workshop at the AI in Education conference that took place at Epsom College. Here are my main points and some of the lessons I took away from running the workshop.
AI now speaks human
For some time, I have quite crudely summarised the change in AI over the last 12 months as “AI now speaks human”. I have been told that it is a stochastic parrot, that it is a Generative Pre-trained Transformer, and that as a Large Language Model it has been trained on a vast array of data, but the nomenclature meant very little to me until I was ready to look ‘under the hood’. Instead, simply understanding ChatGPT and other LLMs as computers that speak human really helped me to grasp what was taking place.
AI in Education: clever, but not necessarily relevant
The next issue I had was that this incredibly powerful piece of software seemed unusable to me. Promptcraft helped with this somewhat, but I was still left feeling that I had to ask it all the right questions for it to be useful. That is clearly a skill I need to work on, but it is also a barrier to use (and I have the sneaking feeling it is a skill that would be made redundant pretty quickly). Surely it could surmise what I needed more quickly than I could tell it? Another way to make this software more useful, it seemed, would be to combine it with a local domain of knowledge. Some have given this the term ‘corpus’. I can understand why, as it conjures the right image of a body of knowledge, but I prefer the term ‘domain’ because it suggests a limited set of data over which you have some control.
Quizlet and Q-chat: AI platforms for education
When I discovered Q-chat on the Quizlet platform, I realised it provided a perfect model for exploring this approach. It combined a domain, a limited data set (the terms you enter along with their definitions), with the hugely powerful LLM behind ChatGPT. What Quizlet lacked in simulating understanding, ChatGPT made up for; what ChatGPT lacked in narrowing the scope of focus, Quizlet provided. I’ll write a further blog post on the case study I undertook with Q-chat across different departments in my school.
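Quizlet has not published how Q-chat is wired together, but the underlying “domain plus LLM” pattern is easy to picture: the study set becomes the bounded domain, and the model is instructed to stay inside it. Below is a minimal sketch of that pattern using the OpenAI Python SDK; the study set, model name and instructions are illustrative placeholders, not how Quizlet actually builds Q-chat.

```python
# A minimal sketch of the "domain + LLM" pattern, assuming the OpenAI Python SDK.
# The study set, model name and instructions are illustrative placeholders,
# not how Quizlet actually builds Q-chat.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The "domain": a small, teacher-controlled set of terms and definitions.
study_set = {
    "osmosis": "The net movement of water across a partially permeable membrane "
               "from a dilute solution to a more concentrated one.",
    "diffusion": "The net movement of particles from a region of higher "
                 "concentration to a region of lower concentration.",
}

domain = "\n".join(f"- {term}: {definition}" for term, definition in study_set.items())

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "You are a revision tutor. Quiz the pupil only on the terms "
                "below, one question at a time, and correct them using these "
                "definitions rather than anything else you know:\n" + domain
            ),
        },
        {"role": "user", "content": "Quiz me on osmosis, please."},
    ],
)
print(response.choices[0].message.content)
```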
What next?
After Sam Altman’s ‘Dev Day’ announcement (skipping over the soap opera with the board) that we would be able to build our own GPTs, I happily set about creating a marking GPT for my subject’s exam board. I trained it on all the publicly available data I could find: specifications, textbooks, notes I had written for students, mark schemes and past papers. I then started to direct how I wanted the feedback to be arranged. I found the accuracy of the GPT startling, but I am still going through iterations to get it to provide feedback that is clearly actionable for pupils. It’s already getting better, so I have every expectation that it will be usable shortly. It’s not the answer to how AI, and specifically LLMs, could be used in education, but it is an answer, and it could provide us with a model we can use: powerful LLMs paired with domains of knowledge that give them specific applications.
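For anyone curious what “directing how the feedback should be arranged” might look like outside the GPT builder, here is a rough sketch of the same idea expressed through the OpenAI Python SDK. The mark scheme, pupil answer, model name and feedback headings are illustrative placeholders, not the configuration of my actual GPT.

```python
# A rough sketch of the marking idea, assuming the OpenAI Python SDK.
# The mark scheme, pupil answer, model name and feedback headings are
# illustrative placeholders, not a real GPT configuration.
from openai import OpenAI

client = OpenAI()

mark_scheme = (
    "Q1 (4 marks), one mark per point: defines inflation; gives one cause; "
    "gives one effect on consumers; supports the answer with data from the extract."
)

pupil_answer = "Inflation is when prices rise. It is caused by too much demand."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "You are an examiner for this exam board. Mark strictly against "
                "the mark scheme below, then arrange your feedback under three "
                "headings: 1) Marks awarded, 2) What was creditworthy, "
                "3) One specific, actionable next step for the pupil.\n\n"
                "Mark scheme:\n" + mark_scheme
            ),
        },
        {"role": "user", "content": "Pupil answer:\n" + pupil_answer},
    ],
)
print(response.choices[0].message.content)
```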
AI & Education: Into the future
The main limitations of this approach seem to relate to safety. I cannot train the GPT on all the data I would like it to use, as much of it is sensitive and subject to GDPR. When we are given the chance to embed GPTs into our local environment, we can start to make full use of them, and there are signs that this might be possible in the near future. Ultimately, my hope is that we can use these tools so that teachers can spend more time being human in the classroom, letting AI run some of the mechanistic tasks. I want to be able to outsource some of my doing, but never my thinking.