Education has come a long way through the ages. Socrates liked to ask probing questions, encouraging his students to draw their own conclusions, and Aristotle walked the cloisters of the Lyceum, holding lengthy discussions with his students. Public education as we know it emerged in the 19th century, in the wake of the Enlightenment and the Industrial Revolution, and as such was driven by both intellectual ideals and economic needs: a model that experts today call factory-based.
Any discussion of how education has changed is a lengthy one, spanning shifting pedagogies and classroom cultures, and should also take in educational policies and cultural attitudes over the years. But perhaps the most engaging and relevant of these factors is the method of instruction, for it is by far the most visible (and measurable).
Educational technology was first introduced in the 1960s by Seymour Papert in the form of the Logo programming language. Informed by the work of Jean Piaget – the renowned developmental psychologist with whom Papert had studied – Logo was designed to teach programming to novices, including children. Since then, technology has been seeping into education steadily, even more so in the past few years. At first, technology was seen as a supplement to traditional teaching methods, with students queueing for their turn to have a go at Microsoft Word; today, it is fast becoming integral to the classroom.
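Logo's best-known feature was turtle graphics: children typed simple movement commands and watched an on-screen (or robotic) "turtle" draw the result. As a rough illustration of that style of exercise – not of Logo's own syntax – here is a minimal sketch using Python's built-in turtle module, a direct descendant of Logo's turtle graphics; the square-drawing task is an assumed example, not one taken from the article.

```python
# A minimal sketch of a Logo-style turtle-graphics exercise, written with
# Python's built-in turtle module (itself inspired by Logo). The specific
# task shown here (drawing a square) is an illustrative assumption.
import turtle

def draw_square(side_length=100):
    """Drive the turtle around a square, one side and one turn at a time."""
    pen = turtle.Turtle()
    for _ in range(4):
        pen.forward(side_length)  # move forward, drawing a line
        pen.right(90)             # turn 90 degrees clockwise

if __name__ == "__main__":
    draw_square()
    turtle.done()  # keep the drawing window open until it is closed
```

The appeal for beginners is the immediate feedback: every command produces a visible change on screen, which is precisely the kind of active engagement discussed below.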
In recent years, many schools have taken steps to integrate technology into the curriculum, usually by acquiring more electronic tablets. Yet modern technology moulded to a conventional curriculum can do very little to transform the learning experience. Time and again, research has indicated that simply putting devices in classrooms is not enough; to maximise the benefits, the technology has to be used well. In particular, it must support the following key components of learning: active engagement, contextualising curriculum topics within real-world problems, frequent classroom interaction with opportunities for feedback, and the ability to build on existing knowledge. There is also the valid concern that technology can prove detrimental in the classroom, with arguments surfacing that it functions more as a distraction than a help. Some have also expressed concern that a generation dependent on technology is less inclined to think critically, choosing instead simply to look up answers online. So, has education changed for the better? Or has technology made it worse?
There are some strong arguments for the former. One study, by the IT trade association CompTIA, found that around 75% of educators believe technology has a positive impact on the education process. Another, conducted in Auburn, Maine, found that kindergarten students using tablets scored higher on literacy tests than those who did not use them. Finally, research conducted by The Open University concluded that, used correctly, ICT can improve student achievement and creative thinking skills while also saving teachers time.
Studies aside, much of it comes down to practical sense. Students today, often dubbed millennials, grow up actively using technology; by the time they reach school, they are already competent users. For these children, entering a classroom modelled on 19th-century ideals and learning from a blackboard is both uninspiring and tedious. Another point to consider is this: how else can we prepare future generations to be competent and successful in a digital age without integrating technology at the basic classroom level? Like every instructional method, technology has had its share of teething problems, but the key to overcoming them lies in using it correctly. Already, newer inventions are replacing older models, with virtual reality the latest entrant to the arena. Handwriting, encyclopaedias and dictionaries are fast becoming things of the past, with more and more students doing their homework digitally and looking up their sources online. Technology has also allowed students in remote parts of the world to gain access to quality education through initiatives such as the Hole-in-the-Wall Education Project, the “granny cloud” and Twig Box.
While we are still a few years away from a full curriculum overhaul (although the NGSS framework in the US is a major step in that direction), technology has allowed educators to keep up with progress in the wider world and move forward – and that can only be a good thing.