|The Roadmap to Redesign (R2R)
Louisiana State University
Course Title: College Algebra
Louisiana State University (LSU) plans to redesign College Algebra, a three-credit course currently taught in a traditional format with three lectures per week in sections of 40-44 students. Course enrollment averages 3500 students in the fall, 1200 in the spring, and 200 in the summer. The course has traditionally been taught by a combination of senior instructors (15% of the sections), Rank I and II instructors (35%), and graduate teaching assistants (50%).
Because of an emphasis on hiring more research professors as part of its Flagship Agenda, LSU will no longer employ Rank I and II instructors as of fall 2005. The redesign of College Algebra is therefore essential to serving the same number of students with significantly reduced personnel. The success rate (grades of C or better) in College Algebra over the last five years has averaged 64%, with the fall average significantly higher than the spring average. The goal of the redesign is to maintain this rate and possibly raise it over time.
The course redesign will follow the Emporium Model beginning with a pilot in the spring of 2005. LSU will use MyMathLab supplemented by materials developed at LSU and at the University of Idaho. In the fall of 2005, LSU will begin the full transition by teaching one-third of the course population in the emporium, one-third by senior instructors in the traditional format in sections of 175 students each, and one-third by TAs in traditional classes of 40-44. In the fall of 2006, the emporium will be fully implemented with the entire course population.
The planned redesign will enhance the quality of the College Algebra experience by motivating students to take an active role in learning and to spend time working rather than watching mathematics. Faculty members are constantly frustrated that students in the traditional course are so passive in the classroom and want a cookbook approach to mathematics. Furthermore, with such a large population each semester, grading homework by hand is just not feasible. Therefore, very few students do the work required to master the skills and concepts of the course. Online assessment software will provide a tool to allow for continual assessment and immediate feedback. The Emporium Model will also allow students with varied backgrounds to receive individualized assistance at their own pace in a learning center staffed with instructors and tutors. All students will be required to attend the learning center at least two hours per week; thus, many students are likely to spend more time doing mathematics than they are spending in the current model.
Student learning will be assessed by comparing final exam medians and course grade distributions in the redesigned sections both with data collected from traditional sections over the five previous years and with the traditional sections running in parallel during the pilot semester and transition year. Both versions of the course will follow the same syllabus, test guidelines, number of questions per topic, point-value distribution per topic, and grading scale as in previous semesters.
The redesign plan will produce cost savings because the same number of students will be served using one-half of the current personnel. Section size will stay at 40-44 students, but the number of class meetings per week will be reduced from three to one. The cost of adding tutors in the learning center as well as increasing the amount of time for coordination and systems administration reduces the net savings, but the redesign project will reduce the cost-per-student from $121 to $95. The long-term plan at LSU is to phase in the Emporium Model for all pre-calculus courses, an annual enrollment of approximately 7500 students. The lab infrastructure created as a part of the redesign of College Algebra will scale, thus enabling LSU to achieve further savings.
At Louisiana State University, the redesign of College Algebra is moving along very well. The LSU team profited from visits to both the University of Alabama and the University of Idaho. During fall 2004, the team tested MyMathLab in four large sections and one smaller one and worked with the publisher to incorporate some needed additional features.
During spring 2005, LSU is running a seven-section (200 students) pilot for College Algebra taught by four experienced instructors using a modified emporium model. Students are required to attend the Math Learning Lab for a minimum of two hours per week and meet in focus groups one hour per week. Currently open 30 hours each week, the lab is staffed with four instructors, four TAs, and four undergraduate math majors.
Students are drawn to the lab because they can receive individualized, immediate assistance on homework and quizzes. Students are assigned eight quizzes over the course of the semester with 10 attempts allowed for each quiz. The quizzes can be taken in the lab. They are not proctored and can be done with assistance from tutors or other students. MyMathLab software is used for this as well as for the four proctored tests, which are administered online in the university testing center. The final exam will be a traditional paper test, which will also be taken by the other 21 sections that are following a traditional format. This will provide some initial assessment data, with the primary assessment being carried out in fall 2005.
LSU faculty have learned three valuable lessons from the pilot and will make changes accordingly in the fall: 1) quizzes will be due the night before a focus group is held rather than the night of the focus group so that students can start working on new material right away; 2) because task lists organized by textbook sections, while working well, generate too many pieces of paper for students, the lists will be combined and organized by units; and 3) make-up tests for the four major tests will be administered two days after the test cycle finishes, not one day after, allowing time to communicate with students, the coordinator and the testing center.
With funding from the LSU Student Technology Fee, the Louisiana Board of Regents, and the LSU Office of Academic Affairs, a ballroom is being renovated to house a new 114-seat Math Learning Lab slated to open in August 2005. A second pilot in fall 2005 will involve three different groups of about 1000 students each, one in the new, larger emporium, one in sections of about 40 taught by TAs, and one with larger sections of around 170 taught by experienced instructors. LSU anticipates moving to full implementation in spring 2006.
The fall 2005 semester started smoothly with 880 students enrolled in redesigned sections of College Algebra using LSU’s beautiful new 116-seat computer lab. Eight hundred students were enrolled in five 160-seat large lecture sections of College Algebra taught by career instructors, and another 560 students were enrolled in 14 traditional sections of College Algebra taught by TAs.
Then along came Hurricane Katrina, causing catastrophic death, damage, and destruction to south Louisiana and the Gulf Coast. Classes were cancelled for one week. The math lab was “borrowed” for 12 days and used to register the 3721 displaced students that LSU admitted from New Orleans universities. For six days, the math department shared lab space across campus with students taking music and chemistry tests. Traffic in Baton Rouge doubled; the city is now the largest in Louisiana. Many students shared living arrangements with multiple families and pets, but spirits remained high. Classes were extended another week into December. The lab was returned, and students and teachers tried to get back on track. Over 450 students from New Orleans enrolled in College Algebra alone. In an end-of-semester survey, 55% of College Algebra students said that their performance in the course was negatively affected by Hurricanes Katrina and/or Rita. The team thinks that is a conservative estimate.
All students in the three delivery systems took the same final exam. The median score out of 200 was 154 for the large lectures, 148 for the redesigned sections, and 128 for the traditional sections. The overall median was 139. The team was pleased to see that 80% of the students who participated in at least 70% of the focus-group classes and required two-hour-per-week lab times in the redesigned format earned a grade of A, B, or C. There were many drops, but given the kind of semester LSU experienced, that is not surprising.
In analyzing the data from fall 2005, the team found an interesting side effect of the redesign model. Success rates in this model appear to be less sensitive to the experience and skill level of the teacher than they had been in the traditional model. About half of the 22 redesigned sections were taught by experienced instructors and the other half mostly by first-time-teaching graduate students. The percentages of each grade (A, B, C, D, F, and W) that students earned were almost identical for the two groups.
During spring 2006, all College Algebra students used the redesigned format, and LSU added a five-hour Precalculus course as well. The semester moved along quite smoothly as compared to last fall, and the inclusion of a second course has been relatively painless.
The university has just committed $400,000+ to renovate space for another 136-seat math learning lab to open in fall 2006, bringing LSU’s math lab capacity to 232 seats. LSU plans to accommodate almost 3,000 students using the redesigned delivery format. Discussion has begun about bringing another course into the redesigned model in spring 2007, which would add another 1400 students.
In the redesign, did students learn more, less or the same compared to the traditional format?
Learning outcomes were measured by comparing medians on a common final exam. From fall 2001 through fall 2005, the final exam median for traditional sections ranged from 70% to 76%. In fall 2005, the final exam median for the redesigned sections was 73%. It is worth noting that from fall 2001 through fall 2004, the final exams were graded by individual instructors. In fall 2005, the final exam for both the traditional and the redesigned sections was group-graded, which made grading stricter and more consistent from one section to the next. In fall 2006, the final exam median was 78%, the highest ever achieved.
From spring 2001 through spring 2005, the final exam median for the traditional sections ranged from 68% to 71%. In the spring 2005 pilot semester, the final exam median for the redesigned sections was 61% and the spring 2006 median was 67%. The 67% median is impressive considering the fact that the computer-based final exam did not allow for partial credit whereas previous paper-based exams allowed for partial credit.
DFW rates increased due to a series of unusual circumstances. From fall 2001 through fall 2005, the DFW rate for traditional sections ranged from 29% to 42%. In fall 2005, the DFW rate for the redesigned sections was 52%. Both hurricanes Katrina and Rita occurred early in the fall 2005 semester. After the students from the New Orleans schools who added late after Katrina were removed, the DFW rate for the redesigned sections decreased to 47%. The disruptions produced by the hurricanes (for example, the lab was taken over for two weeks by the university to process displaced hurricane victims) caused many regular LSU students in the redesigned sections to get behind early and never catch up. Obviously, this had a negative impact on retention and success as well.
From spring 2001 through spring 2005, the DFW rate for the traditional three-day-a-week sections ranged from 34% to 50%. In the spring 2005 pilot semester, the DFW rate for the redesigned sections was 53%. The spring 2006 DFW rate was 59%.
In addition to the problems created by the hurricanes in fall 2005, several causes for this increase in the DFW rate for spring 2005, fall 2005, and spring 2006 have been identified. First, the demographics of the students enrolled in the course were changed effective fall 2005 when students who earned a Math ACT score of 25 or greater (or equivalent QSAT score) were given credit for the course and removed from the population. Prior to this time the cutoff score for credit was 27.
Second, computer-based testing was introduced. Each semester brought a steep learning curve as students learned to use the computer-based assessment software. They had to learn to type mathematics precisely and give exact answers, with no partial credit awarded. Until students learned these skills, grades on assessments in the redesigned sections, especially early in the semester, were much lower than they were in the traditional format. In addition, the software did not allow credit to be awarded for individual parts of multi-part problems during those three semesters; this feature will be available in subsequent semesters.
Third, grade inflation was eliminated: computer-based assessments provided control and consistency in grade assignment by removing the subjectivity of instructors assigning grades.
Other Impacts on Students
Student participation grades were calculated and then compared to final course grades. The participation grade was the number of weekly class meetings the student attended, plus the number of weeks the student worked the required minimum hours in the lab, divided by the total number of class meetings plus the total number of weeks of lab participation. The result was that 80% of the students who earned a participation grade of 70% or higher earned a final grade of A, B, or C, which is considered success in the course. This indicates that the redesign method works as long as a student completes at least 70% of the minimum course requirements. Therefore, the key to redesign success is to get students to “buy into the program” early in the semester, before it is too late for them to be successful.
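The participation-grade calculation can be sketched in a few lines of Python. This is only an illustration: the function name and the 14-week semester figures are assumptions, not values from the report; only the formula and the 70% threshold come from the text above.

```python
def participation_grade(meetings_attended, lab_weeks_met,
                        total_meetings, total_lab_weeks):
    """Fraction of required class meetings and lab weeks the student completed."""
    return (meetings_attended + lab_weeks_met) / (total_meetings + total_lab_weeks)

# Hypothetical 14-week semester: 14 class meetings and 14 required lab weeks.
grade = participation_grade(11, 10, 14, 14)   # (11 + 10) / (14 + 14) = 0.75
likely_success = grade >= 0.70                # the 70% threshold cited above
```

Under these assumed numbers, the student clears the 70% participation threshold that the report associates with an 80% chance of earning an A, B, or C.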
At the end of each semester, a student survey was taken. The results indicate that 89% of the students think that using the software helped them learn mathematics, and 88% would recommend this redesigned method to a friend.
Were costs reduced as planned?
The original Course Planning Tool estimated that the cost-per-student would decrease from $121 to $95 per student. Adjustments to the plan were made: using the tutor supervisor (formerly called lab director) only 25% of the time and using TAs as lab managers (formerly called system administrators) to handle the data. These changes further reduced the cost-per-student: the fall 2006 cost-per-student is expected to be about $78, a 36% savings.
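The savings percentage follows directly from the dollar figures above; a quick arithmetic check (only the dollar amounts come from the report, the variable names are illustrative):

```python
baseline_cost = 121.0   # traditional cost-per-student
planned_cost = 95.0     # original Course Planning Tool estimate
actual_cost = 78.0      # expected fall 2006 cost-per-student

planned_savings = (baseline_cost - planned_cost) / baseline_cost  # about 21%
actual_savings = (baseline_cost - actual_cost) / baseline_cost    # about 36%
print(f"planned {planned_savings:.0%}, actual {actual_savings:.0%}")
```

The actual savings of roughly 36% matches the figure quoted in the report, against the roughly 21% originally planned.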
Pedagogical Improvement Techniques
What techniques contributed most to improving the quality of student learning?
Excellent commercial software. The MyMathLab software used is student-friendly and has a tutorial feature that allows students to get help on any homework exercise without having to ask for help. This feature was valuable since some students prefer not to have anyone know what they do not know. MyMathLab also fostered active learning and allowed for individual differences in learning styles.
Multiple attempts at homework exercises and quizzes along with carefully selected deadlines. Assessments were structured so that homework exercises could be attempted an unlimited number of times and quizzes up to ten times. This meant that a student who wanted to learn and was willing to work for a good grade could earn full credit in those two categories. In addition, the team found that homework should be due the day before a topic quiz is due, and that quizzes should be due the day before new material is taught in class, since most students waited until the last minute to do assigned work.
Caring, attentive but not overbearing lab tutors. The math graduate students and the undergraduate math majors who worked as tutors in the lab along with the instructors were taught to be friendly, helpful, and knowledgeable about the math topics and the software. They were always available to give immediate feedback on an individual basis.
Comfortable computer lab open at flexible hours. The new 116-seat lab was housed in a historic building with a high ceiling, exceptional natural lighting, and a courtyard view from the large windows. This environment helped keep students from feeling stressed by their surroundings. The lab was open 60 hours a week in the fall and 48 hours a week in the spring (course enrollment is significantly lower in the spring semester). The ability to choose the day of the week and time of day that worked best for individual schedules supported students’ learning.
Weekly class meetings. Students met weekly for 50 minutes with their instructor-of-record in a traditional classroom setting. This instructor provided the students with a person to identify with, helped keep them abreast of their upcoming deadlines, and pointed out and explained difficult concepts that were included in the math topics for the week. Some teaching in the traditional sense took place here, and students were given a feeling that they were not alone.
Weekly instructor meetings. A one-hour meeting was held each week with the course coordinator, the tutor supervisor, and all instructors, including faculty and graduate students. This meeting allowed everyone to discuss administrative issues, student questions, and content concerns. Though meetings are usually not popular with faculty or graduate students, attendance at these meetings was usually 100%, which was a sign of their effectiveness.
Cost Savings Techniques
What techniques contributed most to reducing costs?
Increase in instructors’ student load with no increase in workload. The redesign format allows one instructor to teach twice as many students as were taught in the traditional format without increasing class size. In the traditional format, each instructor taught one three-day-a-week section with 44 students. In the redesigned format, that same instructor could teach two sections of 44 students plus spend four hours tutoring in the lab. This could be accomplished because the class only met once a week and because no hand-grading was required.
Use of graduate students to handle data. Initially the team thought that a data manager would have to be hired to handle the course data, but four graduate students were identified who showed an interest in this activity. Three were trained to run the time clock and the attendance log, and the fourth handled the manipulation of data during and after the end of the semester.
Use of instructors to supervise tutors. Several instructors were identified to handle tutor scheduling and supervision in the lab. This job has been refined to take up only 25% of one person’s time each semester.
What implementation issues were most important?
Administrative support. The redesign had strong administrative support at all levels starting in the initial planning stages and continuing to this day. This was and is still essential for its success.
An effective leadership team. For redesign to be successful, it is essential to have an effective, determined leader supported by a team of dedicated, enthusiastic individuals all willing to do whatever it takes to make redesign work. Without an organized, committed team, achieving a successful redesign implementation will be difficult.
Teacher buy-in and training. Not everyone reacts well to change, especially when that change takes so much control away from the instructor-of-record. In the redesign, the instructor had little time each week to be “the sage on the stage,” so instructors had to be trained for their new role. Instructor attitude toward the redesign had a tremendous influence on the success of individual sections: instructors must be trained to believe in the new system and to convey this belief to their students. Not surprisingly, there appeared to be a strong correlation between the final grades students earned and their instructor's attitude toward the redesign. In addition, the instructor who supervised the tutors in the lab needed to be heavily invested in the redesign in order to set the correct tone.
Tutor selection and training. Just because a person knows math does not mean that this person knows how to be a good instructor or tutor. Personnel including faculty, graduate students, and undergraduate math majors needed to be carefully selected and trained to have the proper skills and attitude to give one-on-one help in the learning lab.
Syllabus and scheduling. Syllabus preparation had to be well thought out and instructions had to be crystal clear to all because each semester new students began who had no prior understanding of how the redesigned course worked. Careful scheduling helped to avoid congestion in the learning lab and the testing center by staggering class days and staggering deadlines for assessments.
Student engagement. Preliminary data shows that a student who did not participate fully for the first three weeks had little chance of being successful in the course. This was probably the biggest implementation problem. The university is still exploring ways to get students engaged early.
Technology issues. The technology used in the redesign was a wonderful aid: it gave students a virtually unlimited supply of algorithmically generated variations of exercises to practice, and it reduced grading time for teachers. There are, however, two problems in using computer technology for mathematics. First, the current version of the software cannot give partial credit. Students coming from high school think they are entitled to partial credit and are not happy when it is not forthcoming. Second, typing mathematics can be difficult; students must practice the techniques involved and learn to read and follow directions. These skills take some time to develop, but the latter is a life skill worth learning. Despite the difficulties in typing mathematical answers, the large majority of student errors were content errors, not typing errors.
Will the redesign be sustained now that the R2R program has concluded?
There is absolutely no doubt that this redesign will be sustained. The administration strongly supports the redesign and has been instrumental in its creation and early success. The university has already invested substantial resources in the existing 116-seat learning lab, and a second 122-seat learning lab currently under construction will be operational in fall 2006. Preliminary data shows that learning outcomes are close to the level prior to redesign.
Will you apply the redesign methodology to other courses and programs on campus?
The redesign initially focused on College Algebra. In fall 2006, Precalculus (5 credit hours) will be taught entirely using the redesigned format. Plans are being made now to include Trigonometry beginning in the spring 2007 semester.