From Learning By Rote To Laptops In The Examination Hall


Any parent born pre-1970 will be familiar with the constant struggle to limit the time our children spend living out their lives in a virtual world. We desperately want to see them devote a little more time to playing in the park, building dens, making things and the like. The kind of stuff that we used to do. A common attitude to the digital age is that it presents amazing possibilities and that the technology behind it obviously has innumerable uses, but the idea of the computer completely dominating our lives is still somehow abhorrent.

The truth is that we really do have to wake up and smell the coffee. The computer is now inextricably linked with everything we do, and there’s no going back.

While many of us are reasonably comfortable with the fact that the computer has become an integral part of both our working and our leisure lives, education, it seems, remains a final bastion that shouldn’t be entirely annexed by technology.

Recent suggestions that laptops will soon be allowed into the examination room are causing something of a stir. Technology no longer simply dictates what we learn and how we learn it; it also dictates how we prove that we do indeed know what we’ve learned.

It effectively replaces the things that we (of a certain age) felt were the ‘sine qua non’ of learning – the core elements that provided the foundations of our education: learning our times tables by rote, using mnemonics to memorise all kinds of useful concepts and, of course, the physical aspects such as writing an essay with a pen and leafing through a dictionary or thesaurus.

Our reluctance to accept this new world is not simply attributable to our natural ‘resistance to change’ instinct. It is equally due to the fact that we had to work damn hard to memorise all kinds of information that we then had to regurgitate under ‘exam conditions’. It was all about having a sufficient bedrock of knowledge to draw upon and extract the relevant material from. Surely that’s what learning is all about.

Well, yes it is – and few would argue with that… BUT what exactly is the point of ‘exam conditions’? In the past, everyday life was much more akin to a stream of ‘exam condition’ moments, with decisions constantly needing to be made in a kind of vacuum, with no reference materials available. And so, in a way, it was right to focus on preparing people to deliver the fruits of their knowledge under those conditions.

Nowadays, however, our lives very rarely – if ever – present us with ‘exam condition’ moments. Information is ever-present and accessible from everywhere. We can ‘phone a friend’, ‘Google it’ or download the relevant documents from ‘the cloud’. Sure, we might have dropped our mobile down the toilet and not have the necessary equipment to hand, but the ‘just in case’ argument is likely to become increasingly invalid as 24/7 access to that knowledge bank becomes the norm.

It does raise the question of how we define an ‘expert’. Traditionally, an expert in any field is someone who has an extensive knowledge base to draw upon, the ability to extract relevant material from it, the ability to apply it, and long experience of having done so repeatedly.

Nowadays, to a degree – and certainly in the future – an expert is, or will be, defined as someone who simply has the ability to extract relevant material from a knowledge base (irrespective of where that knowledge base is stored). That, together with the ability to apply it and the experience of having done so, will be enough for them to claim authority status.

In terms of education, this is where the issue gets interesting. If the key to expertise is knowledge extraction rather than knowledge ‘per se’, then surely we should let go of some of our Victorian principles, reduce the amount of stuff that children are forced to memorise, and spend much more time teaching them how to use a knowledge bank. Or, in today’s currency, how to use the internet and how to streamline the multifaceted ‘signposting’ that Google provides. That in itself is a complex task.

From a parent’s perspective, I’m delighted that my kids are frequently given online research projects. What I would also like to see is better instruction in how to use the internet. There is a monumental amount of information available on any topic, and identifying where to find precisely what you need – without having to sift through a mountain of irrelevant material – is no mean feat. Call it a science or call it an art; either way, our children need help mastering it.

For the longer term, education needs a few visionaries. Education is right on the cusp, undergoing what several other industries went through a few years ago. The past few years have seen an incredible digital revolution as we try to keep up with, and make the best use of, ever more rapidly changing technologies. The next few years are likely to see more of the same.

But to a large extent we are simply using technology to do exactly what we used to do, albeit more quickly, more easily and more efficiently. What we really need are visionaries who look into the future, see how it’s going to pan out, understand what today’s kids will need for tomorrow and develop educational systems adapted to deliver it.

My hunch is that we have to look at the issue in an entirely new way. It is no longer a case of how to bring technology into education. Technology is in the driving seat and it is our image of education that will need to be completely overhauled.