We all know that ed tech is big business (by some estimates, the industry is worth around £170 million to the UK economy) and artificial intelligence is set to be a big part of it.
In fact, it may already be a bigger part of your school life than you realise.
A recent report from Nesta states that AI tools are already in use in schools, whether it’s students learning through interactive online platforms or staff using them behind the scenes.
But how well do we actually understand these systems?
Many of the apps we use at home or school - Class Charts among them - statistically analyse our past actions and answers; these records are known as training data, and they help AIs to adapt or to predict future outcomes.
This is machine learning: statistical pattern-recognition software.
No sentience, just statistics, replete with the vulnerabilities we learned about in GCSE maths.
We don’t have sentient, general-purpose AI yet; that’s the movie version.
A lot of AIs essentially just draw lines through data (or, to use the fancy term, fit a hyperplane to an n-dimensional dataset).
Imagine the AI plotting lots of crosses on graph paper, then drawing a line of best fit through them.
If the graph plotted students’ test scores against their ages, for example, the AI might flag those students below the line as having made below-average progress for their age, and refer them for extra support.
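To make that concrete, here is a minimal sketch in Python. The numbers are invented and the one-standard-deviation cut-off is an arbitrary choice, not any real product’s method; it simply fits a least-squares line to made-up age-and-score pairs and flags pupils who fall well below it.

```python
import numpy as np

# Invented (age in years, test score) pairs -- purely illustrative.
ages = np.array([11.2, 11.5, 11.9, 12.3, 12.8, 13.1, 13.6, 14.0])
scores = np.array([52.0, 55.0, 61.0, 58.0, 70.0, 64.0, 78.0, 80.0])

# Fit the "line of best fit" (ordinary least squares).
slope, intercept = np.polyfit(ages, scores, deg=1)
expected = slope * ages + intercept

# Flag pupils sitting well below the line: more than one residual
# standard deviation under their expected score (an arbitrary threshold).
residuals = scores - expected
flagged = residuals < -residuals.std()

for age, score, flag in zip(ages, scores, flagged):
    status = "refer for extra support" if flag else "on track"
    print(f"age {age:4.1f}, score {score:5.1f}: {status}")
```

A real system would need a far more careful definition of “below-average progress” than a line and a standard deviation, which is rather the point of the caveats that follow.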
Others simply create categories to be able to classify things, like predicting what grades students are likely to get based on their track records.
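Again purely as a sketch, with invented numbers and scikit-learn’s off-the-shelf decision tree standing in for whatever any named product actually uses, the classification version might look like this:

```python
from sklearn.tree import DecisionTreeClassifier

# Invented track records: [mock-exam score, homework completion %],
# paired with the grade each past student went on to achieve.
X_train = [[45, 60], [55, 70], [62, 80], [70, 85], [78, 90], [85, 95]]
y_train = ["D", "C", "C", "B", "B", "A"]

# The tree learns simple boundaries in the data that separate the grades.
model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X_train, y_train)

# Predict a likely grade for a new student's track record.
print(model.predict([[72, 84]]))  # should print ['B'] given the data above
```

Everything the model “knows” comes from those six past students, which is exactly why the quantity, provenance and quality of training data matter so much below.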
Not human
AIs work rapidly and can spot patterns that humans cannot see. In 2016, AlphaGo (Google DeepMind’s AI) played a winning move in the board game Go that a grandmaster described as a move no human would ever have played.
While this is great for gaming contexts, these AIs could also make non-human decisions about our human communities if we give them permission.
Since 2018, Ofsted has been using a machine-learning algorithm to help it decide which schools to inspect.
The for-profit Behavioural Insights Team, which sells the algorithm to Ofsted, says that it can identify the schools most in need of inspection, thus “freeing up Ofsted employees to work with other schools”.
But they don’t say which factors influence the outcomes most, “partly because they don’t want schools to know” (and, perhaps, game the system?), and also “partly because it’s difficult to know exactly how these algorithms are working”.
Problems with training data
Most AIs need to be trained with examples. They identify patterns from questions answered by students, and can use these to help adapt learning pathways and predict results.
But this model presents several possible issues:
Quantity
Many adaptive personalised learning platforms advertise the number of questions answered on their systems, to bolster confidence that their AIs have learned from plenty of student answers.
But what if most of those questions were answered by a minority of students who love working online? Then all of our students would be pushed along learning paths similar to those of the few tech-savvy students.
Provenance
If an education AI has mainly been used by inner-city schools, for example, it will not be as effective if used for predicting outcomes for students at other types of institutions.
IBM’s Watson medical AI reportedly suggested unsafe cancer treatments, due to being trained on synthetic, made-up training data (because the developers couldn’t get enough real training data).
Quality
How do we know the data is accurate? In 2011, 206 teachers in Washington DC were fired on the strength of an algorithm’s recommendations.
Investigations later revealed that the abnormal student scores influencing their dismissals were likely due to cheating by other staff (who perhaps didn’t want to get fired themselves).
Bias
AIs will blindly identify patterns in the data, oblivious to ethics. In 1986, a British medical school was found guilty of racial and sexual discrimination.
It had used an AI to screen applicants and the AI had quickly learned from its human predecessors’ decisions (training data) to more readily reject people with foreign-sounding surnames and birthplaces, as well as women.
Data management
What, where and how is our data stored, and what happens when it leaves our control? Could a tired staff member inadvertently attach too much information to an email (think Excel spreadsheets with hidden columns and tabs)?
AIs can be incredibly effective in all kinds of areas of education, from supporting students with special educational needs and disabilities to allowing students to learn at their own pace.
We should certainly try to identify and utilise the genuinely useful educational AI products being developed.
But we need to be cautious, or we run the risk of giving away the private data of our students and staff to unsubstantiated and unaccountable black boxes, as well as handing over scarce funds for immature systems that might be the latest incarnation of hi-tech snake oil.
Omar Al-Farooq is a secondary maths teacher and software engineer