Tarun Wadhwa, Visiting Instructor, Carnegie Mellon University; Ian Hart, Head of Product Category, Capita
As we enter a fourth industrial revolution, where new socio-economic paradigms are being shaped by fundamental changes in technology, it seems strange that one of our most critical societal pillars - education and learning - continues to operate largely within the structures and ways of working it has followed for centuries.
Schools still fundamentally operate in the same way they did 100 years ago. Although schools, colleges and universities have introduced various forms of technology, some argue that a lot of this is mere ‘digital window dressing’.
And with whole industries on the brink of transformation thanks to automation and AI, many corporations stick to a traditional approach to learning and re-skilling, which won’t cut it in the dynamic future we will compete in.
We want to explore how technology is positively transforming how these learning institutions operate, and the barriers preventing technology from reaching its full potential.
Schools are now filled with digital natives – students who have never known a life without smartphones, tablets and information at the touch of a button.
It’s therefore no surprise that technology is transforming how education takes place in classrooms – with interactive eBooks, tablets and educational cloud applications all on hand to help find instant answers to questions.
It’s not just changing the delivery of learning – it’s helping to shape the content too. One example is Angharad Holloway, Headteacher at Talbot Heath School, who has devised a new curriculum looking to 2050, based on business leader feedback, quotes from the Confederation of British Industry (CBI), and best practice to ensure kids are digitally proficient and understand future technology. From ages 3 to 18, their students learn coding, digital skills, material science, computer animation, ethics, sustainability and design thinking to prepare them for the digital world they’ll be working in.
Developing an entirely new programme of learning is, however, simply not an option available to many schools that are bound by the national curriculum, with budget restraints also a barrier to adding additional curricula choices. But it’s clear that technology is already playing a central role in shaping the learning content of the future.
A wide variety of technologies are helping to transform the role of the teacher. With workflow automation helping to alleviate the administrative burden of paperwork – from progress reports to permission forms – teachers are being freed up to spend more time engaging with students. But intelligent software is also helping teachers to personalise the learning experience for each individual.
Author Alex Beard talks of a Rocketship school in San Jose, where a programme learnt the strengths and weaknesses of students, tailoring the learning experience and generating data for teachers to act on. Desmond Bermingham from ACER talks about the organisation’s work with the Scottish government to develop the Scottish national assessment system, which will be delivered online when the teacher decides the time is right.
Schools are encouraged to use assessment to improve learning, rather than just to give a student a grade at the end of the year. These tools free up teachers to focus on the crucial human elements of their craft, and to arm them with data and insights to make informed decisions.
Technology has made it easier for us to solve problems, just like calculators gave us a shortcut to solving a maths equation. But understanding the process of getting to the answer is critical to learning. Do we risk hindering the learning process by using too much tech?
As Alex Beard says, “it’s good to be user-unfriendly sometimes. The struggle matters.” He references an experiment called ‘the paradox of the guided user’ where two groups were given a problem to solve – one with the help of a computer assistant, the other without. The group that struggled through were better able to solve future problems than the group who were told the answer.
Despite some great examples of technology augmenting the classroom experience, the adoption of technology in the classroom has been slow – and the overall education experience still feels a world apart from the reality that children experience in every other aspect of their lives. But why? With the pressure of league tables and significant constraints on budgets and time, many teachers are understandably concerned about the risks involved with introducing radical new technologies. There’s simply too much at stake to risk the unproven and the unfamiliar.
But schools are also facing an overwhelming number of options. They are inundated with sales calls from education technology companies, and it’s increasingly difficult to evaluate which solutions will actually be helpful, with too much choice and not enough time to evaluate what has been rigorously tested.
When purchasing decisions are made, it’s often done at the administration level, without teacher involvement or even a consideration of the educational benefits.
Furthermore, it’s hard for non-professionals to understand how various technologies can fit together, and what to do and expect when they don’t work as planned.
Improving the appraisal and implementation part of the process can be hugely valuable to everyone involved, but like any investment, clear objectives are key. Technology is only valuable if it provides a solution to a real issue.
Although technology has an obvious role to play in the classroom, it also has a crucial role to play in the school infrastructure. Capita’s SIMS management information system is used by over 21,000 schools in 49 countries, helping schools to manage daily life by making it easy to do everything from collect payments for trips to book rooms.
But it’s not just a tool to make admin a bit easier – data from these systems can drive real impact. Capturing a wealth of data on student attendance, device use and performance can flag potential issues and help schools decide on the most effective interventions.
Microsoft’s AI technology is already helping to predict which students are most likely to drop out of school, using complex data sets – including details about enrolment, student performance, gender and socio-economic demographics, school infrastructure and teacher skills – to find predictive patterns. The pilot project identified more than 60 patterns that helped track potential dropouts.
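The mechanics of this kind of prediction can be illustrated with a toy sketch. The feature names, weights and thresholds below are entirely hypothetical – a real system like the one described would learn them from historical enrolment and outcome data – but the logistic-regression-style scoring is representative of the approach:

```python
import math

# Hypothetical feature weights (illustrative only; a real model
# would learn these from historical enrolment and outcome data).
WEIGHTS = {
    "absence_rate": 3.0,    # fraction of school days missed
    "grade_decline": 2.0,   # drop in average grade, on a 0-1 scale
    "fee_arrears": 1.5,     # 1 if payments are outstanding, else 0
}
BIAS = -3.0

def dropout_risk(student: dict) -> float:
    """Return a logistic-regression-style risk score in [0, 1]."""
    z = BIAS + sum(WEIGHTS[k] * student.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Two made-up student records for comparison.
at_risk = dropout_risk({"absence_rate": 0.4, "grade_decline": 0.5, "fee_arrears": 1})
steady = dropout_risk({"absence_rate": 0.05, "grade_decline": 0.0, "fee_arrears": 0})
```

Scores above a chosen threshold would flag a student for early intervention; the value of the real system lies in learning which of the many candidate patterns actually predict dropout.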
Another pilot scheme used a data scientist and school data to uncover the reason behind a dip in school attendance on 15 December. It was Christmas Jumper Day and many students couldn’t afford to buy one – so they stayed at home. Imagine the impact this data could have on a larger scale – especially when coupled with predictive analytics to predict future trends and behaviours.
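A minimal sketch of that kind of attendance analysis might look like the following – the dates and rates are invented, but the idea of flagging days whose attendance falls unusually far below the period's average is the same:

```python
from statistics import mean, stdev

# Hypothetical daily attendance rates (fraction of students present).
attendance = {
    "2018-12-10": 0.95, "2018-12-11": 0.94, "2018-12-12": 0.96,
    "2018-12-13": 0.93, "2018-12-14": 0.95, "2018-12-15": 0.78,
    "2018-12-17": 0.94,
}

def flag_anomalies(daily: dict, z_threshold: float = 2.0) -> list:
    """Return dates whose attendance is unusually low for the period."""
    rates = list(daily.values())
    mu, sigma = mean(rates), stdev(rates)
    return [d for d, r in daily.items() if (mu - r) / sigma > z_threshold]
```

Here `flag_anomalies(attendance)` surfaces only 15 December – the anomaly itself; explaining it (Christmas Jumper Day) still needs a human asking why.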
Technology is key to keeping students attentive, motivated and curious in the classroom, and digitally dexterous outside of it – a vital skill for the future at a time where 4.3 million people in the UK are estimated to have no basic digital skills according to Lloyds Bank.
But access to powerful data is also key to schools and the wider community driving change outside of the classroom – and we’re still only just scratching the surface.
As the lines between industries are blurring, higher and further education institutions must adapt to prepare students for a world that is hyperconnected, globalised, and increasingly converging.
Technology is enabling more flexible learning within further education. Dr Kameel Khan, a fellow at Stanford University’s Careers Institute, describes Stanford’s Open Loop degree, which allows more flexibility in where you do your learning, meaning students can learn off-campus whilst working. It also links the university to the workplace, which dictates what skills students need to learn.
Instead of taking a test to receive a final grade, the degree takes into account a student’s entire skillset and the competencies they have learned, evaluating learning in a more holistic way. More cost-effective and accessible, it still gives students the quality and reputation of Stanford, but they are able to spread the costs and fit study around their work schedule.
There is also a focus on using Stanford’s ecosystem, giving students access to alumni with expertise in “co.labs” to develop their ideas.
Despite these advances with online learning and blended courses, we have yet to fundamentally change the way we receive higher education - it’s still dominated by the traditional three or four-year degree. The choice of university is critically important to students, mainly driven by location, prestige and ultimately its ability to prove their worth to future employers. And that often means that traditional institutions are the safer options.
But a business school dean told The Economist that the change facing universities was similar to what happened to the music industry – albums gave way to singles, just as degrees will give way to smaller credentials.
There has been a move towards “unbundling” the traditional approach, but it’s been met with only moderate success. By moving to “stackable,” “nano,” and “micro” degrees, as they are known, the idea is that a person can collect enough credentials to amount to something equivalent to the traditional bachelor’s degree. But the problem is that there’s still no way for employers to understand the value of these emerging credentials, meaning employers are even more dependent on the brand behind the certification.
The unintended consequence of the democratisation of learning content is that the powerful brands become even more important as a shortcut to cutting through the noise.
This potentially perpetuates the existing inequalities in the job market, whereby employers use brand as a first filter for job candidates.
As with schools, it’s not just the delivery of the learning that technology can facilitate. Jon Cole, Head of Management Information Services at Morley College describes how they use technology to identify ‘dumped carts’ in their online application system, which allows them to offer help to students in completing their applications. “It enables us to identify the exact stage the student exits the application process, as well as what courses they were enrolling on. We can then follow them up by talking to the student and offering advice, so that they find, and apply for, the course they want to study.”
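The core of a ‘dumped cart’ report like Morley College's can be sketched very simply. The stage names, applicant IDs and staleness window below are hypothetical, but the pattern – find applicants whose last recorded activity stalled before submission – is the one described:

```python
from datetime import datetime, timedelta

# Hypothetical event log: (applicant_id, last stage reached, last activity).
events = [
    ("A101", "submitted", datetime(2019, 9, 1)),
    ("A102", "course_selection", datetime(2019, 9, 1)),
    ("A103", "personal_details", datetime(2019, 9, 3)),
]

def dumped_carts(log, now, stale_after=timedelta(days=2)):
    """Applicants who stalled before submitting, with the stage they exited at."""
    return [
        (aid, stage) for aid, stage, last_seen in log
        if stage != "submitted" and now - last_seen > stale_after
    ]

# Applicants to follow up with a call and advice.
follow_up = dumped_carts(events, now=datetime(2019, 9, 5))
```

Knowing the exact exit stage – course selection versus personal details, say – tells staff what kind of help each applicant is likely to need.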
In the same way, chatbots - or digital campus assistants - are another form of AI being developed to support both students and teaching staff, from capturing initial course enquiries to delivering assessment deadline reminders. Technology is also key to driving research in universities, attracting funding and driving innovation. Cloud and AI technology in particular have revolutionised the approach to big data research in the sciences.
And this innovation is driving wider economic growth, with the National Graphene Institute in Manchester spawning a huge commercial opportunity, and Stanford now a core part of the Silicon Valley ecosystem. It’s clear that leveraging technology in higher and further education is not only critical for the learner, but for the future of the establishments and wider society too.
To survive and thrive in the digital economy, workers now need to continuously update and acquire new skills, with more of those skills involving the use, maintenance, and operation of different technologies. The idea of continuous learning is more pressing than ever before.
Companies these days have many ways of trying to improve the knowledge of their employees: they hold brown bag lunches, sponsor conferences, host webinars, provide training materials, and make material available online through wikis and websites.
Yet the vast majority of these options might as well be analogue. They provide no feedback, have little interactivity, and participation is largely passive.
Once again, it’s one-to-many rather than one-to-one training tailored to the individual. The best programmes are the most relevant – they don’t serve an abstract need but solve a recurring problem.
For example, if a salesperson is making a call, they should be able to bring up tips and best practices for how to approach a certain population segment in order to improve their odds of success. The less work they have to do to access that information, the more it can impact a company’s bottom line.
Sometimes the best people to teach you something are your colleagues, who are working through the same problems as you. Google found great success with their Googler to Googler learning platform – they put their focus on providing the infrastructure for employees to collaborate, removing barriers to sharing, and encouraging the development of communities. When lessons came from employees, they were seen as more credible.
It’s not necessarily about the technology though – the cultural shift required to enable technology-powered learning is key.
Businesses must give employees the permission and space to learn freely from each other, find ways to figure out what their employees need to know and how best to learn it, and ultimately make learning relevant to each individual, delivered in a way that suits them.
Two questions arise: how do we manage, retain and make accessible existing critical knowledge, and how do we upskill and re-educate? Answering them means decoupling constituent skills, qualities and attributes from the ‘qualifications’ or ‘job roles’ they are currently bundled into.
Outside of the workplace, employees are already turning to e-learning resources like Lynda, YouTube, Medium posts, message boards, and podcasts to fill in their knowledge gaps on their own time.
Businesses must look at the opportunity to support the creation of coaching services, automated skill matching, and truly interactive and exciting training programmes. In fact, Kameel Khan describes AT&T’s partnership with Coursera, the online learning platform, enabling employees to take any course they want, at any time. Each employee can choose what to study, with an account and a budget they can use to learn. Empowering employees to learn through technology will help to create a culture of learning.
In the years ahead, there will be no shortage of promising technologies to evaluate – there will be more ideas and inventions in the next decade than we saw in the last century.
What we need to do is focus our efforts on how we can best integrate the most appropriate technologies for each situation.