Improving the Quality of Student Learning

Tallahassee Community College

Based on available data about learning outcomes from the course pilot, what were the impacts of re-design on learning and student development?

The consensus of the redesign faculty is that the elements of the redesign that were fully implemented were effective for students at all levels. Specifically, the part of the course content best served by the redesign was the integration of reading into writing: it helped students develop a critical understanding of the texts they read and produce good-quality written work based on those texts (a key academic skill). The redesign also enhanced reading and writing outcomes. The data support some of these beliefs; however, there are some interesting findings that will need to be addressed as the redesign proceeds.

An Analysis of Covariance (ANCOVA) was conducted on pre- and post-test performance on Reading, English Language Skills, in-class writing assignments, and motivation and confidence instruments, with the pre-test data serving as covariates and the post-test data as the dependent variables. A random sample of final out-of-class essays from both the redesigned and the traditional sections was scored independently of the redesign faculty and analyzed using a t-test. Final grades and retention rates have also been examined for students in redesigned and traditional sections. To date, data have been analyzed by placement (prior enrollment in developmental English and/or reading courses), ethnicity, gender, and disability, with the following results.
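The "adjusted means" reported throughout the results below come from the ANCOVA: each group's post-test mean is adjusted for its pre-test mean using a common within-group slope, so that groups starting at different levels can be compared fairly. A minimal sketch of that adjustment in Python (the group names and scores are illustrative, not the pilot's data):

```python
def adjusted_means(groups):
    """ANCOVA-style adjusted post-test means.

    groups maps a group name to a list of (pre, post) score pairs.
    The common slope b is the pooled within-group regression slope;
    each group's post-test mean is then shifted to where it would be
    if the group had started at the grand pre-test mean.
    """
    sxx = sxy = 0.0
    for pairs in groups.values():
        pre_mean = sum(p for p, _ in pairs) / len(pairs)
        post_mean = sum(q for _, q in pairs) / len(pairs)
        sxx += sum((p - pre_mean) ** 2 for p, _ in pairs)
        sxy += sum((p - pre_mean) * (q - post_mean) for p, q in pairs)
    b = sxy / sxx
    n_total = sum(len(pairs) for pairs in groups.values())
    grand_pre = sum(p for pairs in groups.values() for p, _ in pairs) / n_total
    return {
        name: (sum(q for _, q in pairs) / len(pairs))
        - b * ((sum(p for p, _ in pairs) / len(pairs)) - grand_pre)
        for name, pairs in groups.items()
    }

# Two hypothetical sections whose raw post-test means differ only
# because their pre-test means differ; the adjustment equalizes them.
scores = {
    "redesigned": [(1, 2), (2, 3), (3, 4)],
    "traditional": [(3, 4), (4, 5), (5, 6)],
}
print(adjusted_means(scores))  # both adjusted means are 4.0
```

In the hypothetical data the traditional group's raw post-test mean is higher (5 vs. 3) only because it started higher; after adjustment both groups sit at 4.0, which is the kind of comparison the adjusted means in this report support.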

Reading data indicate no significant differences between traditional and redesigned sections overall, and no significant differences for gender, race, disability or placement. Gains were positive for all groups but were small for both traditional and redesigned sections.

English Skills data pose some questions relative to the redesign. There were no significant differences overall between redesigned and traditional sections. However, there were significant differences for race (p < 0.002), gender (p = 0.03), and prep placement (p < 0.001). Further analysis indicates the following:

  • The scores for black students in the redesigned sections decreased on the post-test;
  • The scores for LD students in the redesigned sections decreased on the post-test. As the numbers were very small (3 in redesigned sections and 10 in the traditional sections), this did not result in an overall effect for disabled students;
  • The scores for female students decreased on the post-test;
  • The scores for students who placed in both prep reading and English decreased on the post-test while those who placed in reading or writing only showed very modest gains. Students who did not place in prep reading or writing also showed a modest gain but had significantly higher scores on both the pre-test and the post-test.

Overall, the means for the traditional sections were slightly higher and in many cases scores decreased on the post-test.

Data gathered from in-class writing samples (impromptu writings completed within one class period) indicate no significant difference between traditional and redesigned sections overall. Significant differences are evident for placement (p = 0.02) and race (p = 0.01). (See tables 5 and 6 for means.) Further analysis indicates the following:

  • White students made greater gains than black students and achieved a higher average score on the final writing sample. While these differences are statistically significant, they are not very large: white students had an adjusted mean of 7.8 compared to 7.2 for black students.
  • Students who did not place into either reading or writing prep courses had a significantly higher average adjusted mean (7.9). The differences were greatest between this group and those who placed into both (7.2) or into writing only (7.2). These differences are not unexpected but point to the need to work closely with the college prep faculty and with the students who are entering from prep courses.

Five final out-of-class essays were selected randomly from each section, giving a sample of 70 essays (80 were planned, but 10 were not turned in by the grading session). The 70 scored essays represented 22% of the students who finished the pilot. The essays were graded by an independent group of faculty using the established holistic scoring rubric. This rubric, which uses two graders each scoring on a 6-point scale, was developed for the College Level Academic Skills Test (CLAST) and has been in use at TCC for many years. The results indicate that the students in the redesigned sections performed significantly better (p = 0.03) than those in the traditional sections. The means were 8.35 for the redesigned group and 7.32 for the traditional group. This is a potentially important finding and goes to the heart of the redesign.
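The comparison of independently scored essays between the two groups is a standard two-sample t-test. A pooled-variance version is sketched below in pure Python; the scores are made up for illustration (each falls on the 2-12 scale that results from summing two graders' 6-point ratings), not drawn from the pilot:

```python
import math

def pooled_t(a, b):
    """Independent-samples t statistic with pooled variance.

    a and b are lists of essay scores from two unrelated groups.
    Returns (t, degrees of freedom).
    """
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ssa = sum((x - ma) ** 2 for x in a)   # within-group sums of squares
    ssb = sum((x - mb) ** 2 for x in b)
    sp2 = (ssa + ssb) / (na + nb - 2)     # pooled variance estimate
    se = math.sqrt(sp2 * (1 / na + 1 / nb))  # standard error of the difference
    return (ma - mb) / se, na + nb - 2

# Illustrative holistic scores (two graders, 6 points each, summed to 2-12).
redesigned = [8, 9, 7, 9, 8]
traditional = [7, 8, 6, 7, 7]
t, df = pooled_t(redesigned, traditional)
print(round(t, 3), df)  # prints: 2.449 8
```

The resulting t is compared against the t distribution on the given degrees of freedom to obtain a p-value such as the p = 0.03 reported above.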

All of the final out-of-class essays for the redesigned group were graded in the independent scoring session, as well as 100 of the first out-of-class essays. This resulted in 86 pairs of first and last out-of-class essays for the redesigned group that were used to examine growth. A dependent t-test showed significant growth over the course of the semester (p = 0.005).
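Because the growth analysis pairs each student's first and final essay, the appropriate test is a dependent (paired) t-test on the per-student score differences rather than a two-sample test. A minimal sketch with hypothetical score pairs (not the pilot's 86 pairs):

```python
import math

def paired_t(first, last):
    """Dependent-samples t statistic on per-student (last - first) gains."""
    diffs = [l - f for f, l in zip(first, last)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)                     # t on n - 1 df

# Hypothetical first/final essay scores for four students.
first_scores = [6, 7, 6, 7]
final_scores = [7, 9, 7, 9]
print(round(paired_t(first_scores, final_scores), 3))  # prints: 5.196
```

Pairing removes between-student variation from the error term, which is why growth can be detected even when the two-sample comparison of the same scores would be noisier.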

Another scoring rubric was piloted and was used to grade earlier out-of-class essays. However, the group felt that the criteria were not well established and were not providing valid data. Hence the earlier scores were not used in the analysis. Another comparison is planned for later this summer when approximately 100 first out-of-class and 100 final essays from traditional classes will be independently scored and compared to the essays from the redesigned course. These essays will also be used to examine growth within the traditional group so that a comparison can be made.

The data on motivation produced no overall effects. However, there were significant differences for LD students (p = 0.04) and for instructor (p = 0.03). Further analysis indicates:

  • The adjusted mean for LD students was significantly lower than for students with other disabilities or with no reported disability;
  • There were significant differences between instructors. This is not unexpected given that 8 different instructors were involved in the pilot. However, the effect that instructors can have on both motivation and confidence could be an important factor when considering training for new and adjunct faculty.

Overall, the adjusted mean for the traditional sections was higher (67.4) than for the redesigned sections (65.3). While this difference is not statistically significant (p = 0.07), it is worth noting; it is likely a reflection of the learning curve of both the students and the faculty as they navigated implementation. All groups made gains in motivation. However, many students had dropped out by the time the post-test was given.

There were no overall significant differences in confidence data between the redesigned and traditional groups. Significant differences were evident for prep students (p = 0.02) and for instructor (p = 0.005). Further analysis indicates:

  • Students who placed into both prep reading and writing made smaller gains (+7.49) than those with no prep courses (+14.31) or those who placed into only reading (+13.28) or writing (+12.68). These students also had a lower adjusted mean (70.8). Interestingly, the students with both prep courses began the semester with a higher confidence score than their counterparts.
  • There were significant differences in the confidence levels of students by instructor. As mentioned earlier, this is not unusual when many different instructors are involved.

Because students are required to earn a grade of C in order to proceed to their second English course, success is defined as a grade of C or better. The success rate for the redesigned sections (56.8%) was lower than for the traditional sections (62.8%). The overall success rate for all traditional sections, including those in the pilot, was 62.3%.

The attrition rate was higher for the redesigned sections (22%) than for the traditional sections (16%). Not all instructors distinguished earned Fs from Fs given to students who stopped coming to class but did not withdraw; for this reason, grades of F and W are combined to determine the attrition rate.
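Under these definitions, success counts grades of C or better and attrition counts F and W together. A small sketch of both calculations over a hypothetical grade list (the grade distribution is invented for illustration):

```python
def course_rates(grades):
    """Return (success_rate, attrition_rate) as percentages.

    Success: grade of C or better. Attrition: F and W combined, since
    earned Fs cannot be distinguished from Fs given to students who
    stopped attending but did not withdraw.
    """
    n = len(grades)
    success = sum(g in ("A", "B", "C") for g in grades)
    attrition = sum(g in ("F", "W") for g in grades)
    return 100 * success / n, 100 * attrition / n

# Hypothetical section of 25 students.
grades = ["A"] * 4 + ["B"] * 6 + ["C"] * 7 + ["D"] * 3 + ["F"] * 2 + ["W"] * 3
print(course_rates(grades))  # prints: (68.0, 20.0)
```

Note that the two rates need not sum to 100%: students earning a D are neither successful nor counted as attrition.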

In terms of quality improvement based on outcomes, the most promising data points to improved quality of writing which the faculty attribute to the integration of reading, engagement with text, and the wide variety of collaborative activities, including extensive use of the discussion board, that focus student attention on critical thinking and thoughtful exposition. Clearly, there are still many issues to be resolved with regard to improving basic skills within the context of the redesign.

December 2002 Update: As the English faculty wished to proceed with implementing the redesign across all sections of Freshman Composition, there was no real comparison group for the fall, 2002 semester. Full implementation was defined as making available, advocating and strongly advising the use of the following course components:

The traditional course of summer, 2002 and earlier semesters no longer exists and, for that reason, no formal study was conducted. However, careful attention has been paid to outcomes from those sections that were taught by the faculty who participated in the original study. These faculty members were deemed to be truly implementing the redesign, whereas other faculty members were struggling with the technology as well as the new curriculum and were implementing only portions of the redesign, with varying degrees of success.

Using data from the summer as well as from the original pilot we were able to establish that there were no significant differences between the writing scores of students in traditional sections and redesigned sections at the outset of the term (p = 0.26). This had not been established in the spring due to incomplete data. Using the traditional data from spring and summer for comparison, the significant differences between the final writing scores evidenced in the pilot (p = 0.03) were maintained (p = 0.04), reinforcing earlier indications that the redesign is having a positive impact on the quality of student writing.

Success data for the fall are also encouraging. Seventeen of the 57 on-campus sections of Freshman Composition offered in the fall were taught by faculty who participated in the pilot. These sections had a success rate of 74.6% compared to 64.1% for those sections that were just beginning implementation. This is particularly important because increased success and retention lead to cost savings for the institution.

On the down side, English Language scores for the 12 sections actually collecting data increased only slightly (2.3%), and reading scores decreased (1%). The latter finding is of interest because the students are required to do a significant amount of reading throughout the course. This may point to a "disconnect" between the test (which is a CLAST facsimile) and the actual skills required to successfully decode, comprehend, and respond to a given piece of writing. The faculty will spend some time this summer revisiting the CLAST facsimiles in both English Language Skills and reading.

The students from the pilot were also tracked into their next English course to see how they fared. The students from the redesign had a 61% success rate compared to 69% for the students in traditional sections. While this is disappointing, it is not unexpected given the incredible learning curve that was taking place during the pilot. At the end of the spring semester, the students from fall 2002 will be tracked and success in the next course will be evaluated.

Students in the sections taught by the pilot faculty members during fall, 2002 and spring, 2003, will be analyzed using the same criteria as during the pilot. The results of this analysis will be used to further inform decisions with regard to the impact of the redesign on different groups of students.
