+353-66-714-5616, extension 2358
School of Science, Institute of Technology, Tralee, South Campus, Clash, Tralee, Co. Kerry, Ireland.
The module described here is called Business Mathematics and Statistics, and has been run as one half of a subject in what was an add-on Diploma in the School of Business. The students doing this subject had successfully completed a Certificate in Office Information Systems, and had gained entry to the Diploma in Information Systems Management. This typically required that the student had achieved a 55 percent average in their Certificate. In recent years, this requirement for entry to Diploma level courses has been dropped and now a grade of Pass (40 percent) will suffice. One of the most interesting aspects of this student group each year is that it is almost entirely female.
The e-learning platform Web Course Tools (WebCT) was introduced in the academic year 2003–04. The syllabus, full lecture notes, instructions for using the Statistical Package for the Social Sciences (SPSS), assessments and model solutions, and all other material previously printed for students are available on WebCT. The calendar facility is used to give an assessment schedule for the year and to link material relevant to each lecture and laboratory session. This academic year the assessment tool was also used, with students submitting a project on-line. To complement the course, the author found an excellent textbook with an associated e-pack, which is supported by WebCT. This has some very good student and instructor resources, links to websites for resources and a set of quizzes corresponding to each chapter of the book. The students were not expected to buy the (very expensive) book; however, multiple copies were bought for the library.
For the first half-year, the students attend two 1-hour lectures per week and one 2-hour laboratory session where they are introduced to the statistical package SPSS. Typically, continuous assessment has included two unseen tests and two projects and, as the module has been half of a full-year subject, the final examination was run at the end of the year.
With the implementation of the National Qualifications Awards of Ireland (NQAI) framework and the advent of modularisation within the Institute, the module has been reviewed and it will become a Level 8, 5-credit point module. To more closely align the assessment strategies with the learning outcomes, I changed the types of assessments, and trialled these changes in 2005–06. From 2006–07, continuous assessment will also include an integrated project with a marketing research module.
This module focuses on students becoming competent in manipulating and presenting data and being able to conduct appropriate hypothesis tests. The students learn how to use a typical statistical package (SPSS) to help solve statistical problems. They were assessed on their ability to set up and conduct appropriate statistical tests and interpret the results, and on their use of SPSS.
As the student profile has changed over the past few years, it has become more difficult to cover the material set down for the module, and assessing the later sections of the module has become more problematic. In particular, there was less time for students to submit a project based on the later material. So, a change in the schedule and the assessments was negotiated with the students. Instead of a project and an unseen written test, a set of quizzes from the e-pack and a laboratory assessment were agreed.
The laboratory assessment was a 2-hour assessment where students were to use SPSS and other technology and material available to solve a set of problems. It was an open-notes assessment, as described by Race (2006, p. 52). Students were allowed to bring in three hand-written pages of notes and they could access any notes on WebCT. The reason for limiting the material they could bring with them was to encourage them to study, and this was stated explicitly. As this was a new addition to the module, a sample assessment was covered in the last laboratory session before the assessment.
In previous years the WebCT multiple-choice quizzes had been used as self-study tools and, again this year, the first few quizzes were open to students for self-study. The students could attempt each of the remaining four quizzes as many times as they wished and the average mark was recorded. The average of each student’s top two quizzes would make up the assessment mark. There was no restriction on what they could use to help them with the questions.
The students were definitely more confident coming into the laboratory assessment; they knew that they were going to be tested on how well they could use their knowledge and skills to solve problems, rather than on what they could remember. Particularly after trying the sample assessment, the students’ studying was more focussed, as they were aware of the types of knowledge and the level of skills they would be expected to display.
For the lecturer, it was a much-improved method of assessing learning outcomes, allowing the students to show their ability to apply knowledge to unseen problems and to show the appropriate use of technology.
The main limitation for both the students and lecturer was the time limit. Two hours were allocated, which was the same time given for previous unseen tests, but it was not enough. More time needs to be allocated to cover the same material, mainly because of the time taken inputting data and printing the outputs required for the solutions.
The quizzes in the e-pack are made up of extremely well-designed multiple-choice questions; each time the students submit their answers, their result is given immediately. So, with this feedback, the students could identify where they made mistakes and redo questions, referring to notes and completed problems, if needed.
As the quizzes are all corrected electronically, there is very little work involved for the lecturer. The average mark for each quiz is produced by the e-pack and a spreadsheet was made so that the best two marks could be averaged.
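The scoring rule described above (each quiz mark is the average of a student's attempts, and the assessment mark is the average of the best two quiz marks) can be sketched as follows. This is a hypothetical illustration of the spreadsheet logic, not the actual spreadsheet used; the function names and example marks are invented.

```python
def quiz_mark(attempts):
    """Mark for one quiz: the mean of all recorded attempts."""
    return sum(attempts) / len(attempts)

def assessment_mark(quiz_attempts):
    """Assessment mark: the average of the two highest per-quiz means.

    quiz_attempts maps each quiz name to the list of marks the student
    scored on their attempts at that quiz.
    """
    marks = sorted(quiz_mark(a) for a in quiz_attempts.values())
    best_two = marks[-2:]
    return sum(best_two) / len(best_two)

# Hypothetical example: one student's attempts at the four assessed quizzes
student = {
    "quiz5": [40, 60],   # mean 50
    "quiz6": [70],       # mean 70
    "quiz7": [55, 65],   # mean 60
    "quiz8": [30],       # mean 30
}
print(assessment_mark(student))  # (70 + 60) / 2 = 65.0
```

Because only the best two quiz means count, a student can attempt all four quizzes without a weak quiz dragging down the assessment mark.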
Students needed to be on-line to do the quizzes. Those students who did not have on-line access at home needed to allocate sufficient time at college to complete the quizzes. As described below, some students decided simply to guess the answers until, hopefully, they obtained a result they were happy with. Some even guessed correctly the first time!
The lecturer was surprised by the students’ results, and questioned 22 of the students about their learning and preparation before and after releasing the results. This survey brought out some interesting aspects regarding the students’ experiences and perceptions of the assessment. Even though they were third year students, it was the first time they had experienced open-book exams. Table 21.1 summarises the student responses.
Table 21.1 Student responses to the open-book exam

| How did you feel about the open-book exam? | Frequency |
| --- | --- |
| Notes were helpful | 10 |
| Notes were a psychological help | 5 |
| Still have limitations | 4 |
Some of the comments from students who thought having notes in the exam was helpful include ‘couldn’t learn all the theory’, ‘sometimes my mind goes blank’ and ‘needed direction’. Others commented that it gave them a psychological boost, saying, for example, that they were ‘more confident and found it easier’ and that there was ‘not as much pressure to memorize SPSS’. A few students realized that there were still limitations to doing such an examination; they stated that they ‘still need to know how to do questions’ and that they ‘still need to know material for notes to be useful’. Only one student had a negative attitude to the open-book examination; she stated that she got more confused by the information she had brought in.
Research has shown that in open-book mathematics examinations, the better students do better, and the less able students do worse than in standard exams (Michael and Kierans in Brown et al., 2003, p. 43). The lecturer opted for open-notes examinations, and allowing the students to bring in a limited number of pages of notes proved to be a very good tactic. Most of the students brought in the maximum (three) pages.
When asked whether they studied more, less, or the same as they would have for an unseen test, students who said they studied at least as much as for a standard test made comments such as ‘wanted to ensure good notes’ and ‘deciding what notes to bring, learnt more’. In contrast, two students who brought in fewer than three pages said ‘couldn’t give any more, lack of access to SPSS’ and ‘Thought notes would help’ (and so studied less).
The results from the quizzes were disappointing. As noted above, the students were given the opportunity to complete the WebCT quizzes as many times as they wanted, in their own time, and with all available resources. The overall average was under 52 percent and one student did not attempt any quiz. It appears that the instant feedback was not taken into account, with most students repeating the quiz immediately and, sometimes, scoring lower. This points to the students’ lack of motivation to achieve their best, and the survey backs this up: half of those questioned admitted to guessing, half referred to notes, and only a small number (5 of the 22 surveyed) revised before and/or between attempts. The most telling statistic is the average time a student spent doing a quiz. Overall, the average time was only 8.5 minutes, which dropped to under 4 minutes when outliers were discounted. This is, in effect, less than 30 seconds per question.
The lecturer is repeating the procedure with another group of students. Although the possibility of negative marking for the quizzes was explored, the lecturer decided instead to discuss some of the pitfalls that the first group encountered and to set the same conditions. The current group has already started doing the quizzes, and is spending an average of 47 minutes per quiz and achieving much higher results.