From the Director
Ian Johnston
Director, UniServe Science
I must be getting old.
A couple of months ago, I attended the annual summer meeting of the American Association of Physics Teachers in San Antonio, Texas. (Aside: I thought I was inured to heat, having been raised in Queensland, but Texas was something else. Can anyone tell me why anyone would hold an international conference in the middle of summer in the southernmost big city in the US?) Anyhow, at this conference there were lots of bright young faces, all enthusing over the new way of teaching physics - with Java Applets. Their simulations looked splendid and wonderfully professional, but what was the physics they were showing? Cannon balls dropping from the tower of Pisa!!
Haven't we been there and done that?
I got interested in the use of computers and Information Technology in teaching in 1988, when there was a very influential conference at the University of North Carolina, with the simple title "Computers in Physics Instruction". While it is true that computers had been used in physics education for a long time before then, it is probably fair to say that a very large number of academics, myself included, got their first real taste of the possibilities of the new instructional methods at that conference. There we saw simulations of balls being dropped from the tower of Pisa, as well as waves diffracting and interfering, and electric lines of force being created. And much more. At the time it was all very impressive.
In the decade since then I've seen even more amazing simulations - fast moving charges radiating synchrotron radiation, quantum mechanical particles bouncing and diffracting at the same time, space-time being warped by the presence of a black hole, what the universe looks like to something moving near the speed of light. The power of the computer to visualize obscure physical processes has proved to be awesome indeed.
And of course the same is true of the other sciences, not just physics. Just think of all those beautiful three-dimensional representations of complex chemical molecules. Just think of the amazing virtual field trips made by the Open University, or the simulated laboratory instruments. And the teaching packages that take students step-by-step through even the most complex pieces of science. I must confess my all-time favourite is still Sniffy the Virtual Rat. I still find it almost unbelievable that the principles of Skinner conditioning can be reduced to a computer algorithm.

So you can see, I hope, why I am dismayed by yet another cannon ball being dropped from the tower of Pisa. I can understand that those who have been writing simulations for the last decade, and constructing CAL packages, might be daunted by the prospect of having to learn Java, or whatever is the next enthusiasm. It's only natural that the next generation has to take over and do things their way. But does the science, and the pedagogy, have to start from scratch each time?

Which brings me to what I really want to say. Perhaps the fault lies with us, the older generation. Is it really obvious that the things we developed shouldn't be thrown away? All those simulations and packages that looked so great to us, were they really successful where it counted? Did they improve student learning?
For many years now the educationalists have been warning us that we must evaluate the teaching materials we produce. Which means more than testing whether things are laid out properly, and the interface is intuitive. It must also include finding out what students learned from using it. And that's hard to do. Which is why so few of us have done it properly. But it's time to change our ways. If we want to tell the younger generation that they should learn from what we did, then we must be able to say with confidence that what we did, worked. We've just got to spend a lot more time evaluating.
So. Next year's annual UniServe Science workshop will target evaluation, not just of computer materials but of teaching techniques in general. We have already secured two keynote speakers: Professor Michael Prosser from La Trobe University, to talk about the educational aspects of evaluation; and Professor Ann Sefton, the leader of the Graduate Medical Program at The University of Sydney, to tell us about one of the largest Internet-based teaching programs in the country, which has had evaluation built in from the start. We hope you will all attend and give us your experiences with evaluations you have done in your own areas.
So put it in your diaries now:
UniServe Science Workshop
UniServe Science News Volume 14 November 1999