Ian Johnston is Director and Mary Peat is Co-Director of UniServe Science
UniServe Science News Volume 6 March 1997


The UniServe Science Computer Assessment Workshop

Ian Johnston and Mary Peat
director.science@uniserve.edu.au

Over the last decade or so the introduction of computers into mainstream teaching has spawned a number of sophisticated packages capable of providing Computer Based Assessment (CBA) on a large scale. In Australia a few university science departments have been using CBA in their mainstream teaching for some years. It seemed to us at UniServe Science that a discussion of the ways in which computers could be used in assessment would be timely. We invited representatives from several disciplines to share their experiences, and we put on our second workshop devoted to this subject on February 14, 1997.

What transpired at the workshop

The workshop was planned to be as hands-on as possible, so that everyone attending could gain some experience of what was actually being used in science departments throughout the country. Six sessions were planned, but one of the presenters fell ill, and in the end five presented: Fred Pamula, Department of Biology, Flinders University; Halima Goss, Teaching and Learning Support Services, Queensland University of Technology; Peter Ciszewski, Department of Biophysical Sciences and Electrical Engineering, Swinburne University of Technology; Roger Lewis, Department of Physics, University of Wollongong; and Sue Fyfe, Department of Human Biology, Curtin University of Technology.

In the middle of the day we felt it was important to have a plenary speaker, to remind people of the pedagogical issues involved in this kind of assessment. Professor Royce Sadler, from the Faculty of Education, Griffith University, presented these issues in a clear and concise manner, and his talk was singled out by many in their final evaluation forms as the highlight of the workshop.

On the day there were a few technical difficulties with some of the software, and some problems with the timetable because the weather delayed planes into Sydney by half an hour. But overall we believe the workshop was a success, and what many of those attending appreciated most was the chance to talk with others doing similar things.

Outcomes of the workshop

When registering for this workshop, members were asked to fill out a short survey form concerning the use of CBA in their home departments. This provided us with information to start formulating questions that the workshop might address. Then throughout the day, and at the final wrap-up session, these questions were discussed in open forum. The questions that raised most interest were the following.

  1. Degree of usage of CBA. Of the 67 members attending the workshop, representing about 50 different science departments, 24 stated that their home departments used some form of CBA. Some 170 science departments were not represented at the workshop, of course, but surveys we have conducted in the past give us reason to believe that very few departments using CBA were absent from the workshop. Furthermore, the relative use of CBA among the different science disciplines is very similar to what we have found elsewhere, viz. that Biology and Chemistry are the main users of IT in their first-year teaching.

  2. Feedback vs marks. Like most things in computer-aided teaching, there is a lot of work involved in developing and using CBA. The question that worries us all is: will the students use it? One's natural reaction, based on years of teaching experience, is that if it is not worth marks, they will not do it. One cautionary tale that surfaced during the workshop concerned physics students at QUT. Three years ago they were part of the campus-wide CBA system that Halima Goss described in her presentation. It was compulsory, and the participation rate was essentially 100%. Then a student survey suggested some of the students were unhappy about the inconvenience of the centralized system. The department chose to make it optional, and participation dropped to 10%. At the wrap-up session at the end of the workshop, the opinion was expressed that in well-designed systems students have proved that they will work without the carrot of marks. Nevertheless, of the 24 departments represented at the workshop which use CBA, 9 use it purely summatively, another 12 use it formatively as well as summatively, and only 3 use it purely formatively.

  3. Questions of security. Opinions were widely divergent on this issue. Some departments took the attitude that a lot of thought must be devoted to keeping databases of student marks as inviolate as possible, but that it did not matter overly if students found out about some of the questions likely to be asked. Others felt that developing good questions is a time-expensive business (a figure of one and a half hours per multiple-choice question was quoted by Royce Sadler in his plenary talk) and that their usefulness is severely compromised unless security can be guaranteed.

  4. Cost effectiveness. The biggest question, at least in the view of the bean-counters, is whether CBA can save money. Some institutions (Swinburne University of Technology and the University of Adelaide) claim to have demonstrated savings in their part-time teaching budgets. Others argued that it is naive to expect real money ever to be saved, especially when the replacement cost of computers and the time invested in developing questions are properly taken into account, and that we should instead focus our attention on increasing the quality of learning. CBA can relieve teachers of some of the drudge work and leave us free to put effort into what only we, as academics, can do. Then, even if there is no net saving of money or time, the whole exercise will truly be worthwhile.

Conclusions from the workshop

The most important conclusion that arose concerned the question of sharing the cost of using CBA. Writing questions for use with these systems needs a lot of time and energy, and it takes even more time and effort to evaluate whether or not they select out those students who understand our subject as we would like it to be understood. Those departments who have been using CBA for some years have built up item banks of these questions. Those who want to start using CBA are faced with the daunting task of building up their own from scratch, unless we can share our item banks with one another.

As has been alluded to in the editorial of this newsletter, UniServe Science finishes its initial funding period at the end of 1997, and needs to consider future directions carefully. Perhaps we might capitalize on the goodwill that has been generated at this workshop and arrange for the establishment of nationwide item banks in which all those working in CBA may share. We will be actively exploring this idea during 1997.





Page Maintained By: PhySciCH@mail.usyd.edu.au
Last Update: Monday, 30-Apr-2012 15:41:56 AEST
URL: http://science.uniserve.edu.au/newsletter/vol6/johnston.html