Improving the Quality of Student Learning

University of Massachusetts Amherst

Based on available data about learning outcomes from the course pilot, what were the impacts of re-design on learning and student development?

In the fall 2000 semester, three sections of Introductory Biology were offered, each in a different course format, with 200-250 students in each section. One section was taught in a traditional lecture format. The second section incorporated active learning in the class using the in-class communication system, ClassTalk, to report student answers following small group discussion of posed problems. The third section followed the course redesign: students accessed a student preparation Web page as a guide to the reading and activities required to meet the course objectives for each class. Before each class, students also demonstrated their preparedness by completing an online quiz whose questions related to the reading material and class objectives. The goal of the redesign’s student preparation Web page and pre-class quiz was to prepare students for participation in the active learning class environment and to introduce content not covered in class.

The second half of the two-semester Introductory Biology sequence was taught in the spring 2001 semester. One section of this course was taught in the redesign format while the other section was a traditional lecture. While most of the information presented here pertains to the detailed assessment collected in the fall 2000 semester, we have included information obtained from the spring 2001 semester as well.

Learning Outcomes

We compared the course letter grade, four exam grades, lab grade, success rates (students receiving a C or higher), and withdrawal rates for each of the course formats. We found no significant difference in these measures among the three class formats.

We did observe, however, that student success was correlated with regular completion of the preparation page; we will test this result with data from the second-semester redesign course.

Although the quantitative measures of student success revealed no significant difference in learning outcomes between course formats, they did show that no particular group of students (identified by gender, ethnicity, major, SAT scores, or high school grade point average) was selectively harmed by the redesign. Given this lack of evidence that any group of students was being poorly served by the redesign, we implemented the redesign in both sections, thus serving all students in the second year of the project. We have just completed this full implementation of the redesign and do not yet have the analysis of the data regarding the impact of these variables on success in the course.

Attitudes toward Science

A multiple choice attitudinal Student Learning Survey was administered at the beginning of the fall 2000 semester and again at the beginning of the second semester to assess the impact of the course format on student perceptions and attitudes concerning science. Overall changes on attitude scales from the first to second survey were very small and may have been due to the composition of the population and the timing of the second survey.

Class Attendance

We did find striking differences in class attendance. The redesign and ClassTalk sections, which used in-class problem solving, had higher average attendance than the traditional section. The fact that students received credit for participation in small-group problem solving likely contributed to this effect. Student opinion from mid-semester reviews, teaching evaluations, and small-group interviews, however, indicated that students were more committed to attending classes that incorporated active learning and felt that they gained more from attending these classes than from a traditional lecture.

One unexpected outcome of the attendance data is that students in the ClassTalk and redesign sections were more likely to fill out the teaching evaluations administered at the end of the semester, since they were more likely to be in class. The net result is that the evaluations of these instructors better reflect whole-class opinion than those in a traditional section, where only about 50% of the students attend class and are available to respond.

Usage of Preparation Pages and Duck Quizzes

We also assessed student usage and opinions of the student preparation Web page and pre-class Duck quiz components of the redesign section. Most students accessed the relevant Web page and completed the Duck quiz by the designated time before each class meeting. Interestingly, we noted that students explored the instructor feedback for many or all of the answer choices even when they had initially selected the best or correct answer. One unanticipated outcome was that students also used the preparation pages and quizzes heavily for review before exams; there was a striking increase in preparation page logins on the days immediately before an exam.

Student Self-Reports

A survey of student opinion indicated that students felt better prepared for class by the preparation page and Duck quiz. Sixty-seven percent of students agreed that completing the preparation page and quiz made them better prepared for class (13% disagreed, 20% neutral), and 56% agreed that they learned more from class if they had completed the preparation page and quiz (13% disagreed, 32% neutral). Students also perceived that the preparation delivered additional course content: 58% of students agreed that they learned things from the preparation page and quiz that were not covered in class (18% disagreed, 24% neutral). Finally, students indicated that these resources were valued as a study tool throughout the semester: 89% agreed that it was useful to return to previous preparation pages and quizzes during the semester to review or prepare for exams (4% disagreed, 6% neutral).

Student Attitudes and Reactions

We assessed student attitudes and reactions using mid-semester reviews, teaching evaluations, surveys, and interviews. Student opinion of the redesign was strongly positive, with students identifying the pedagogical strengths of the preparation page, Duck quiz, and active learning environment. We specifically sought student reaction to these technologies and are making modifications based on their input. The positive student opinion was, of course, not unanimous; a small fraction of students who experienced the redesign in the fall semester elected the traditional class format in the spring semester. Those dissatisfied with the redesign complained that they: 1) want more lecturing, 2) do not want to be compelled to attend class, and 3) do not want to waste time hearing student opinions.

Student Profiles

When quantitative and qualitative measures of student learning are included in the analysis of the course formats, it is possible to characterize the students best suited to the redesign course. The course redesign was best received by students who

  • have easy access to the Internet;
  • are willing to participate in small group discussion;
  • can adapt to the daily routine of completing the preparation pages; and
  • are willing to come regularly to class.

Course Topics

The course redesign works well for a broad range of topics in biology. In addition to the first semester of Introductory Biology, which covers molecules, cells, development, metabolism, and genetics, we also used the redesign format in the second semester of the two-semester sequence, which covers physiology, ecology, behavior, and evolution. The redesign was judged equally effective in all content areas and was warmly received by students and instructors in the second-semester course.

Problem-Solving Skills

Students in the redesign section engaged in and practiced problem solving to a greater extent than students in traditional sections: they more frequently made statements about biological ideas, more often discussed and defended ideas in biology, came to class better prepared, learned from the preparation page as well as from class, and attended class more regularly.

Scientific Reasoning Skills

The redesigned course has become the subject of analysis of the Research in Education And Learning (REAL) project, an NSF-funded research effort based at Hampshire College. The REAL project team is working with the faculty of several introductory science courses, including Introductory Biology, to determine changes in college students’ scientific reasoning skills and views of the nature of science. Using multiple measures, the REAL project is comparing the scientific thinking and attitudes of students in the Pew redesign course with students in traditional course formats.

During the fall 2001 semester, the team has devoted considerable resources to observations of Introductory Biology. Lectures have been videotaped, student discussions during problem solving have been observed and videotaped, and several quizzes and exams have been administered to assess student scientific thinking skills. The analysis of these data is underway, and we look forward to this exciting expansion of the original assessment plan outlined in our proposal.

Students at Risk

We have not yet completed our analysis of student performance in the spring 2001 semester. In addition, we are in the process of determining whether student progress and effort in using the preparation page and completing the Duck quiz serve as early identifiers of students at risk. We piloted the use of the Supplemental Learning Services provided by the University in both semesters and will use these services to tutor at-risk students if they are identified before the first exam.

Problems with Assessment Methodology

We are concerned about the nature of our analysis of student success across different designs. In the pilot year, we were comparing the success rates of different groups of students, with different instructors and course designs, taking different tests. With this many variables, it is not possible to draw meaningful conclusions from the data. There was no prior agreement on the cutoffs for letter grades, and our analysis revealed that instructors used different numerical cutoffs for letter grade assignment, making course letter grades and success rates less useful as measures.

Similarly, there was enough variability in exam questions and degree of difficulty to make direct comparison of exam performance unreliable. Each instructor wrote an exam for the students in his or her section. An effort was made to include common exam questions on each of the four exams; though there were approximately 30 shared questions between two of the three sections, the exams were largely independent. (Unfortunately, the instructor of the traditional section did not use these questions in the final version of the exam. Further, subtle differences in the wording of several shared questions on the exams of the ClassTalk and redesign sections may account for observed differences.) Though the exams were all multiple choice, there were significant differences in the nature of the questions asked. Instructors in the traditional section wrote exams primarily designed to assess recall of information presented during the semester, while instructors in the redesign sections wrote exams primarily designed to assess problem-solving skills and conceptual understanding. Instructors also admitted that they wrote exams with the intention of making them “easier” or “harder” depending on the distributions of scores on previous exams. These factors made comparison of student performance on these different assessment tools highly suspect.

Thus, the comparative assessment of student performance among sections was not particularly productive. There were too many uncontrolled variables to allow reasonable interpretation of the data. In addition, the grading was different, so success in one section could not be equated with success in another. Finally, the comparisons with the “traditional” section were troubling because they created ill will toward the instructors of that section, who felt that the assessment was designed to “prove” that they were “not teaching right.” We are shifting our assessment efforts as a result of these observations.

Since a key goal of the redesign is to improve students’ abilities in problem solving and concept manipulation, we have shifted our focus to comparisons of exam questions from before and after the redesign. Initial analysis suggests that the problem-solving complexity of exam questions has increased substantially while student success on exams has remained constant or even improved slightly. In the second year of the redesign, the instructors collaborated extensively to write a single set of exams administered to both sections of the course. These exams are specifically designed to minimize pure recall; instead, they require the logical manipulation of concepts to solve problems, interpret data, or predict the results of manipulating processes.
