Improving Learning and Reducing Costs:
Program Outcomes from Changing the Equation

by
Carol A. Twigg

In September 2009, the National Center for Academic Transformation (NCAT) launched a three-year program, Changing the Equation (CTE), funded by the Bill & Melinda Gates Foundation. The purpose of the program was to engage the nation’s community colleges in a successful redesign of their remedial/developmental math sequences to improve student learning and reduce instructional costs. Each institution participating in Changing the Equation redesigned its entire developmental math sequence--all sections of all developmental courses offered at the college--using NCAT's Emporium Model and commercially available instructional software. Each redesign modularized the curriculum, allowing students to progress through the developmental course sequence at a faster pace if possible or at a slower pace if necessary, spending the amount of time needed to master the course content.

Following a national competition, NCAT accepted 38 institutions to participate in the program. CTE paired the new institutions with experienced NCAT Redesign Scholars who had led successful math redesigns of their own; the Scholars mentored the new teams through the process. The projects piloted their redesigns in spring 2011 and fully implemented them in fall 2011.

At the start, we said that if institutions followed our advice—derived from the successes achieved in past course redesign programs in developmental and college-level mathematics—we could guarantee that they would improve student learning, increase completion of the developmental math sequence, produce students well prepared to tackle college-level math, and reduce instructional costs.

And that is exactly what happened.

Not all institutions followed our advice. Despite repeated guidance from both NCAT staff and NCAT Redesign Scholars, a number of projects simply ignored it and failed to do such things as require lab participation, award participation points as an incentive for student engagement, establish deadlines and clear expectations, and monitor student progress and intervene when students were not meeting deadlines. In addition, six of the original 38 institutions withdrew from the program because they were unable to meet its requirements.

What follows are the outcomes for the 32 institutions that fully implemented their redesigns.

Improved Learning

Thirty-two institutions redesigned a total of 86 developmental math courses. Comparative student learning outcomes between the traditional and redesigned formats of the courses were measured by comparing common final examination scores, common exam items and/or gains on pre- and post-tests with the following results:

  • 71 courses (83%): showed significant improvements;
  • 5 courses (6%): showed improvements, but the differences were not significant;
  • 7 courses (8%): showed no significant difference;
  • 1 course (1%): showed decreased learning, but the difference was not significant;
  • 2 courses (2%): insufficient data were collected to make a comparison.

Examples of comparative means on common final examinations:

  • At Pearl River Community College (MS), a comparison of common final exam scores between the traditional developmental math courses and the redesigned courses indicated that mean final exam scores improved in all three of the redesigned courses: 45% vs. 84% in Fundamentals of Math; 51% vs. 74% in Beginning Algebra; and 60% vs. 72% in Intermediate Algebra.                
  • At Robeson Community College (NC), mean scores on common final exams improved from 69% to 85% in Essential Mathematics and 69% to 79% in Introductory Algebra.
  • At Somerset Community College (KY) mean scores on common final exams improved from 75% to 87% in Pre-Algebra and from 72% to 82% in Basic Algebra with Measurement.

Examples of comparative means on common items from examinations:

  • Manchester Community College (CT) compared 15 common test items embedded into course assessments, five early in the semester, five at mid-semester and five in the final examination. A weighted average of correct responses showed an increase from 49 to 57 in Prealgebra and from 34 to 50 in Elementary Algebra.
  • Northern Virginia Community College (VA) compared performance on 30 exam questions given to both groups of students. Means increased from 66% to 84% in Arithmetic, 65% to 91% in Algebra I, and 57% to 87% in Algebra II. Overall, for the traditional sections in spring 2011, 60.9% of the answers were correct; for the redesigned sections in spring 2012, 88.3% of the answers were correct. This amounts to an impressive 45% improvement in performance by redesign students over traditional students.
  • At Northwest-Shoals Community College (AL), means increased from 73% to 82% in Basic Mathematics, 70% to 79% in Elementary Algebra, and 64% to 79% in Intermediate Algebra.
  • At Oakton Community College (IL), means increased from 49% to 77% in Pre-Algebra, 42% to 68% in Elementary Algebra, 60% to 82% in Elementary Plane Geometry, and 33% to 64% in Intermediate Algebra.
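The "45% improvement" in the Northern Virginia example is a relative improvement: the redesign sections' correct-answer rate divided by the traditional sections' rate, minus one. A quick sketch of that arithmetic (figures taken from the example above):

```python
# Relative improvement of redesign over traditional sections,
# using the Northern Virginia Community College figures above.
traditional_pct_correct = 60.9  # spring 2011, traditional sections
redesigned_pct_correct = 88.3   # spring 2012, redesigned sections

relative_improvement = (redesigned_pct_correct / traditional_pct_correct - 1) * 100
print(f"{relative_improvement:.0f}% improvement")  # → 45% improvement
```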

Course-by-Course Completion Rates

NCAT asked each institution to compare course-by-course completion rates (grades of C or better or grades of P in a P/F system) between the traditional and redesigned formats. Thirty-two institutions redesigned a total of 86 developmental math courses with the following course-by-course completion results:

  • 20 courses (23%): had higher completion rates, 6 of which were significantly higher;
  • 5 courses (6%): showed no significant difference in completion rates;
  • 36 courses (42%): had lower completion rates, 21 of which were significantly lower;
  • 23 courses (27%): completion could not be calculated due to collapse of multiple courses into one;*
  • 2 courses (2%): insufficient data were collected to make a comparison.

* In order to compare individual course completion rates, one needs to look at the percentage of students who complete the same amount of material in the same period of time. In the redesign of their developmental math sequence, four institutions collapsed what had been 12 different courses into four modularized courses. Students enrolled in the redesigned courses could begin anywhere from Module 1 to Module 15 and could pick up where they left off in a subsequent semester. Thus, there was no comparative basis on which to calculate completion rates.

In conducting an extended analysis of the discrepancy between increased learning outcomes and decreased course completion rates in Changing the Equation, NCAT has discovered a variety of reasons why course-by-course completion comparisons are not a true measure of the success or lack of success of the program. Among them are:

  • Prior Grade Inflation

    The majority of Changing the Equation teams discovered that pass rates in the traditional format were inflated by prior inconsistencies in grading practices. Unlike redesign students, who were assessed on common outcomes, those in the traditional courses were assessed in a variety of ways, which led to overall grading differences. Contributors to prior grade inflation in the traditional course included 1) having no clear guidelines regarding the award of partial credit, 2) allowing students to fail the final exam yet still pass the course, 3) failing to establish common standards for topic coverage (in some sections, entire topics were not covered, yet students passed), and 4) failing to provide training and oversight of part-time instructors. Thus, the C-or-better rates for the traditional courses were almost universally inflated. (See the July 2011 issue of The Learning MarketSpace for a full discussion of this issue.)
  • Mastery Learning Requirement in the Redesign

    In the redesign, students were required to master all of the content of all of the courses. Redesign students had to pass each module independently, at levels ranging from 75% to 90%, before being able to progress to the next module, showing mastery in homework assignments, practice tests and module exams.

    In the traditional format, students exited the course by simply attaining a total cumulative score of at least 70% or 75%. Because grades were averaged, students were able to earn a C or better by passing enough tests and mastering enough competencies, but not necessarily all of them. In traditional sections, students would often continue on to the next topic without having demonstrated mastery of the previous topic. Raising the mastery level from 70%–75% to 80%–90% essentially raised the cut score for a student to earn a C in the redesigned course.

    Mastery learning thus meant that students were doing more work and learning more, which often took longer. That meant that many students did not complete a particular course by the end of the term. They were able to start where they left off in the subsequent term. Mastery learning, while sometimes taking longer to accomplish, ensured that students were well prepared to take on college-level work.
  • More Difficult Redesigned Courses

    This phenomenon expressed itself in two ways: 1) the redesigned course had more assignments, more quizzes and more tests than the traditional course and consequently took longer to finish, and/or 2) the redesigned course included more content than the traditional course and consequently took longer to finish.

    For example, at Nashville State Community College (TN) the redesigned course was created by combining topics from the three traditional courses into one course, with the expectation that students could master the material in one semester. This resulted in all students, regardless of initial placement scores and level, being required to master the same material in the same timeframe. The redesigned course covered topics from Basic Math through Intermediate Algebra, and a review of data indicated varying completion rates for students based on how they would have placed in the traditional format. At Manchester Community College (CT), students in the redesign were not allowed to use calculators, while those in the traditional classes were able to do so.

Improved Course Completion: Making Progress Grades

Despite these differences in grading policies between the traditional and redesigned formats, and the consequently lower course-by-course completion rates, there are other indications that redesign students, in the majority of instances, completed at a higher rate. Of the 86 developmental math courses that were redesigned, a grade of “making progress” (MP) or the equivalent was awarded in 50 of them. Students receiving an MP grade had to be making substantial progress at a high mastery level. Definitions varied from school to school (for example: completed 86% of modules at 80% mastery, 80% of modules at 70% mastery, 75% of modules at 75% mastery, 75% of modules at 80% mastery, 60% of modules at 80% mastery, 50% of modules at 80% mastery, or 50% of modules at 75% mastery). These definitions are equivalent to a grade of C or better in the traditional courses.

When adding the MP grades to the C or better grades, the completion picture improves significantly:

  • 37 courses (43%): had higher completion rates, 21 of which were significantly higher;
  • 4 courses (5%): showed no significant difference in completion rates;
  • 9 courses (10%): had lower completion rates, 6 of which were significantly lower;
  • 12 courses (14%): did not award an MP grade and did not do a hypothetical calculation;
  • 1 course (1%): insufficient data were collected to make a comparison;
  • 23 courses (27%): completion could not be calculated due to collapse of multiple courses into one.

Thus, of 50 courses that awarded an MP grade, 74% had higher completion rates. These results are extremely encouraging, especially when taking into account the differences between grading policies in the traditional and redesigned formats of the courses.

NCAT has learned that one cannot evaluate the success of Changing the Equation by simply comparing individual course completion rates. Completion of the developmental math sequence and success in subsequent college-level math courses are the two most important data points to use to compare student success rates between the traditional and redesigned formats. If only 20% of students exit the developmental math sequence but 75% pass the college-level course, you still have a problem just as you did when 50% exited the sequence but were unprepared and only 30% passed the college-level course. Unfortunately, the time period of the program did not allow sufficient time to collect those data.

Some of the participating institutions have had sufficient time to collect preliminary data on how well students who emerged from the redesigned sequence performed in college-level courses compared with those who exited from the traditional format. The results are positive and, we believe, will be replicated in the other projects as more time goes by.

  • The two most common college-level entry math courses at Northern Virginia Community College (VA) are Mathematics for the Liberal Arts and Precalculus. The success rate (grade of C or better) in Math for Liberal Arts for all students in spring 2012 was 67.7%; for students who had completed the redesigned developmental math course, the success rate was 72.5%. The success rate (grade of C or better) in Precalculus for all students in spring 2012 was 57.7%; for students who had completed the redesigned developmental math course, the success rate was 72.0%.
  • At Northwest Shoals Community College (AL), the percentage of developmental math students successfully completing a college-level math course increased from 42% before the redesign to 76% after the redesign in 2011. NWSCC course-by-course completion developmental math rates were equivalent (one course) or higher (two courses) in the redesign.
  • Pearl River Community College (MS) redesigned its college-level College Algebra course. Student performance on the final examination increased from a mean of 64.4 in the fall 2009 and 2010 traditional format to 73.8 in the fall 2011 redesign. Completion rates in College Algebra went from 59% prior to the redesign to 76% in the spring 2011 pilot and 67% during the fall 2011 full implementation.
  • At Somerset Community College (KY), the percentage of developmental math students successfully completing college-level Applied Mathematics increased from 56% before the redesign to 67% after the redesign in 2011. The percentage of developmental math students successfully completing college-level Intermediate Algebra increased from 37% before the redesign to 43% after the redesign in 2011.

Cost Savings

With regard to cost savings, the CTE results were very successful. All but one of the 32 completed CTE projects reduced their costs. Some saved more than their projected savings; others saved less.

The average projected percentage reduction in the cost-per-student for the 31 institutions that reduced costs was about 29%.

  • 5 institutions (16.1%) projected a reduction of 40% to 55%;
  • 11 institutions (35.5%) projected a reduction of 30% to 40%;
  • 10 institutions (32.3%) projected a reduction of 15% to 30%;
  • 5 institutions (16.1%) projected a reduction of 15% or less.

The actual percentage reduction in the cost-per-student for the 31 institutions was about 20%.

  • 6 institutions (19%) reduced the cost-per-student between 30% and 55%;
  • 13 institutions (42%) reduced the cost-per-student between 15% and 30%;
  • 12 institutions (39%) reduced the cost-per-student 15% or less.

There were two primary ways that cost reduction was achieved: 1) increasing section size and 2) increasing the number of sections that full-time and adjunct faculty counted toward their load. Both of these strategies were implemented without increasing faculty workload because of the elimination of repetitive tasks such as hand-grading of homework, quizzes and exams.

Examples of increasing section size:

  • The Manchester Community College (CT) team followed their plan and saved more than initially anticipated. Section size was doubled from 25 students in the traditional format to 50 students in the redesigned format. The number of sections offered was reduced from 60 to 31. The cost-per-student decreased from $255 to $165, a 35% savings. Instructors were able to double the number of students because there was a significant reduction in faculty time required to grade homework and prepare assessment materials. In addition, instructors were assisted in each redesigned section by two or three tutors. This allowed ample time to provide the assistance needed for all students. There was almost never a time when students had to wait for help, and most instructors felt an improved engagement with their students.
  • Stark State College (OH) reduced the cost of developmental math by increasing section size from an average of 24 to ~48 on the main campus and ~40 students overall. A significant 81% enrollment increase (from 4,400 to 8,000 students) occurred also, yet the total cost of offering the developmental math sequence increased by only 36%. Stark State also reduced the number of contact hours per developmental math course from four to three. In the traditional format, Stark State had to pay an additional one hour per section and faculty could only teach eight sections annually. In the redesign, faculty could teach nine courses per year as part of their load. Together, these two actions reduced the cost-per-student from $238 in the traditional format to $178 in the redesign, a decline of 25%.
  • At Volunteer State Community College (TN), significant cost savings were accomplished by 1) increasing class size to a weighted average of 40, 2) changing faculty/staff ratios, and 3) streamlining the curriculum to include multiple exits to university-parallel mathematics. Vol State’s new mathematics lab was completed in fall 2012, seats 50 students, and classes on the main campus are larger, as originally planned. The overall cost-per-student after full implementation of the redesign declined about 28%, from $199 in the traditional format to $144 in the redesign. Savings realized will provide administration-endorsed opportunities for 1) expansion of the Emporium Model to university-parallel math courses, 2) support of upper-division, low-enrollment STEM courses, 3) scheduled professional development for faculty in NCAT/other conferences, 4) released-time supporting faculty research, and 5) supplemental instruction. 

Examples of increasing the number of sections counted in load:

  • In the traditional format, Laramie County Community College (WY) offered 51 sections with section sizes that varied from 23 to 30 students, depending upon the course, for a total enrollment of 1,346. Annually eight of these sections were taught by full-time faculty, and the rest were taught by adjuncts. After the redesign, faculty load was increased from ten sections annually to 15. However, the section size was reduced to 17. Twenty of the redesigned sections were taught by full-time faculty. These changes led to a reduction in the cost-per-student from a weighted average in the traditional courses of $161 to $153 in the redesign, a 5% decrease. The resources saved will be put back into departmental funds to be used for faculty professional development, facility expansion and redesign of additional courses.
  • At Lurleen B. Wallace Community College (AL), the primary cost-saving technique was that each faculty member (full-time and adjunct) taught two developmental math redesigned sections of 29 students for one workload credit rather than one section of 24 students as in the traditional format. The availability of tutors and instructors in each class made it possible to increase section size and still provide individualized attention and assistance to all students. In addition, the number of faculty hours spent on developmental math was reduced by eliminating duplication of faculty responsibilities. The cost-per-student decreased from $114 in the traditional format to $53 in the redesign, a 54% savings. Faculty time will be reallocated for other tasks within the mathematics department. A substantial portion of these cost savings was used to purchase additional computers for the Math Center for the redesign of other college-level math courses in spring 2012, including Mathematical Applications, College Mathematics with Applications and Introductory Mathematics I.
  • Pearl River Community College (MS) realized significant cost savings as a result of the redesign project. As part of the redesign, full-time faculty workload changed. Pearl River increased the number of developmental math sections taught by full-time faculty each term from five to nine for the same workload credit and reduced section size from 24 to 20. The student load for each instructor increased on average from 134 students each term to over 160 students. In addition, faculty worked five hours weekly in the lab with no change in the overall hours devoted to developmental math. The redesign format allowed one instructor to teach more students than were taught in the traditional format while decreasing class size. In the traditional format, each instructor taught five three-day-a-week sections with 24 students. In the redesigned format, that same instructor could teach 10 sections of 20 students plus spend five hours tutoring in the lab. This could be accomplished because the class only met once a week and because no hand-grading was required. Overall, faculty productivity rose by 31%, and the cost-per-student decreased from $252 in the traditional format to $168 in the redesign, a 33% reduction.
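The cost-per-student savings quoted in these examples follow a simple percent-reduction formula; a small sketch checking several of the figures above (dollar amounts taken from the text):

```python
# Percent reduction in cost-per-student, computed from the
# before/after figures quoted in the examples above.
def percent_savings(traditional: float, redesigned: float) -> float:
    return (traditional - redesigned) / traditional * 100

# (traditional $, redesigned $) per student, as reported in the text
examples = {
    "Manchester CC":   (255, 165),  # reported: 35% savings
    "Stark State":     (238, 178),  # reported: 25% savings
    "Volunteer State": (199, 144),  # reported: about 28% savings
    "Pearl River CC":  (252, 168),  # reported: 33% savings
}
for college, (before, after) in examples.items():
    print(f"{college}: {percent_savings(before, after):.0f}% reduction")
```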

The One-Room Schoolhouse

Many of the projects also used the “one-room schoolhouse” approach to deal with low-enrollment sections, producing institutional cost savings as well as clear benefits to students. Previously, when small sections did not “fill” (particularly at smaller campuses and sites or during certain class times), they had to either be cancelled (interrupting student progression through the sequence and costing the college lost revenue) or be offered at a relatively high cost. Using the one-room schoolhouse meant that the college offered multiple developmental math courses in the same computer classroom or lab at the same time. Students worked with instructional software, and instructors provided help when needed. Even though students were at different points in the developmental sequence, they could be in the same classroom. This strategy enabled the institution to increase course offerings and avoid cancelling classes, which, in turn, reduced scheduling roadblocks for students and enabled them to complete their degree requirements sooner. Since fewer sections were needed to accommodate the same number of students, the overall cost-per-student was lowered.

Student Savings

Although CTE’s cost savings goal was to reduce the institutional cost of offering developmental math, the program also produced substantial savings to students. Among them were:

  • Saving tuition dollars. The modularization of the developmental math course sequence allowed students to move from one course to the next within the same semester. Students saved tuition dollars because they were allowed to complete as many courses as possible in one semester while only paying tuition for the one for which they registered. A student who worked through all modules could finish the entire program in one semester and pay for one course instead of two or three courses as in the traditional format.

    Several of the CTE institutions calculated how much tuition a student could actually save as a result of the redesigned sequence. At Manchester Community College, the redesign allowed a student to complete up to three courses in one quarter and pay for only one. If taken individually over three quarters, the courses would cost the student $1,095 compared with taking them all in one quarter at a cost of $421.25. This is a 61.5% savings to the student. Even if the student could only complete two of the courses in one quarter and finished the third in the next quarter, s/he would save 30.8%. Iowa Western Community College offers four levels of developmental math. For students who completed the developmental math sequence in one term, the tuition savings was $1,071 per student. For students who completed the sequence in two terms, the tuition savings was $714 per student. At Lurleen B. Wallace Community College, a student who worked through all modules could finish the entire program in one term and pay for one course instead of three courses, a tuition savings of $654. 
  • Reducing the required number of credits. Several of the participating institutions redesigned multiple courses in the developmental math sequence to eliminate topic duplication and to eliminate topics that were beyond the scope of developmental math. This allowed the total number of credit hours for the sequence to be decreased, which represented savings for students by decreasing the number of credit hours for which they needed to pay tuition.
  • Lowering the cost of course materials. Several of the projects were able to lower the cost of materials significantly, creating additional savings for students. Several institutions developed customized textbooks that included the material for all courses in the sequence; students then needed to purchase only one textbook and one software access code, good for two years, instead of the two or three textbooks and access codes the traditional sequence required. Other projects eliminated the need to purchase a textbook entirely, requiring only the purchase of an access code, which included an e-book (electronic textbook) at no additional cost to the student. Paper copies were still available if desired.

  • Accommodating life events. Students in developmental math, and in community colleges in general, are juggling many responsibilities such as jobs and families. As a result, they are often unable to complete a course during the term. Many students may be working diligently to achieve their dreams when a “life event” occurs, preventing them from reaching their educational goals. In the traditional model, when life interferes, students must withdraw, losing tuition and any progress they have made, and start over the following term. In the CTE redesigns, students could adjust their schedules to suit life changes instead of withdrawing and losing the tuition they had paid: they could return to the class and pick up where they left off after resolving the life event, or switch to a section that met at a different time. This decreased the number of terms needed to complete developmental math requirements.
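The Manchester tuition example above works out as stated; a quick check of the 61.5% figure (dollar amounts taken from the text):

```python
# Tuition saved by completing all three developmental courses in one
# quarter rather than over three quarters, using the Manchester
# Community College figures quoted above.
three_quarters_cost = 1095.00  # three courses taken individually
one_quarter_cost = 421.25      # all three completed in one quarter

savings = three_quarters_cost - one_quarter_cost
pct = savings / three_quarters_cost * 100
print(f"${savings:.2f} saved ({pct:.1f}%)")  # → $673.75 saved (61.5%)
```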

Sustainability

We asked the CTE institutions whether their redesigns would be sustained now that the CTE program has concluded. All 32 institutions plan to sustain their redesigns. Here are some examples of their responses:

  • At both Guilford Technical Community College (NC) and Robeson Community College (NC), the sustainability of the redesign is not in question. The administrations at both institutions have supported the redesigns from the beginning and continue to support them. North Carolina has recently redesigned the statewide developmental math curriculum into eight one-credit modules. Students will have multiple exit points based on their program of study, and the Emporium Model will facilitate this implementation.
  • The mathematics department at Manchester Community College (CT) has consistently supported redesign. Although there was initial skepticism and inertia to overcome, the result has been a very collegial process, one which has strengthened the department. The entire college is exceptionally supportive of this initiative. The impact of redesign has been felt and recognized in sister community colleges, the state university system and local school districts. In addition, the process of creating the math redesign has positioned the college in a leadership role in several statewide initiatives to improve college readiness among high school graduates.
  • At Nashville State Community College (TN), there is little doubt that the changes to the developmental math program will be sustained and most likely will extend into other college-level math courses. The program has been changing for the past three years with the full support of the college in an effort to better address the needs of students. The improved retention and success rates indicate that the changes are having a positive impact, and Nashville State expects these trends to continue as the team modifies course content and delivery. Additionally, instructors who were initially hesitant about the program now fully support the changes, and some have begun to take an active role in the decision-making processes related to running the program.
  • Since participating in Changing the Equation, the implementation of the Emporium Model at Northwest Shoals Community College (AL) has shown increased student success. Due to the success of the project, the math department is considering the redesign of Pre-Calculus with Algebra, Introduction to Technical Mathematics and Mathematical Applications. The continued success of the Emporium Model strengthened faculty and administrator commitment to the mathematics redesign project. This commitment now extends throughout the transitional studies division to include redesigns of English and reading courses.
  • At West Virginia University Parkersburg (WV), sustainability of the redesigned developmental mathematics courses is not in question. The math department is totally committed to offering these two courses using the Emporium Model. The team feels very strongly that this is the way to go, and its commitment is unwavering. The team intends to keep working toward improving completion rates. There has also been total support from the administration for the redesign.

Lessons Learned

CTE’s basic objective was to scale a proven innovation: demonstrate the feasibility of redesigning remedial/developmental math courses on a wider scale using a set of redesign tools and methods developed in NCAT’s prior mathematics course redesigns. Course redesign is a proven, data-driven innovation in institutional practice that makes it possible to improve student learning outcomes while reducing costs. NCAT brought ten years of experience in conducting large-scale course redesign programs to create CTE. The essential ingredients of CTE were:

  • A series of course redesign planning resources that incorporated the lessons of prior NCAT redesigns in mathematics to be used on CTE campuses in planning and implementing their redesigns;
  • Experienced advisors in the form of NCAT staff and NCAT Redesign Scholars, faculty experienced in successful mathematics course redesign; and,
  • Program workshops both prior to the submission of grant proposals and during the implementation of campus redesigns in which redesign techniques were presented and results were shared among participating campuses.

What Led to Success?

When CTE began, we said that we could guarantee that new institutions could improve student learning, increase course completion rates and reduce instructional costs if they followed our advice—derived from the successes achieved in prior redesigns in developmental and college-level mathematics using the Emporium Model. Did the CTE redesigns conform to the principles of effective redesign previously established by successful campuses—that is, did participating campuses “follow the rules”?

CTE reaffirmed the lessons learned in prior NCAT redesign programs. The pedagogical techniques (active learning, online tutorials, continuous assessment and feedback and on-demand support) that led to increased student learning in the prior mathematics redesigns also led to increased student learning in CTE. Similarly, the cost reduction techniques (online tutorials, automated assessment, course management systems, shared resources and staffing substitutions) that led to reduced instructional costs in prior redesigns also led to reduced instructional costs in CTE. This “scaling a proven innovation” program in developmental mathematics course redesign validated what was previously learned in prior NCAT projects—if you followed the “rules,” you were successful. If you did not, you were not.

Learning from Experience: Pedagogical Improvement Techniques

The techniques that universally contributed to improved student learning outcomes were those that are characteristic of the Emporium Model.

  • Holding class in a computer lab/computer classroom. Having students work on math during class proved fundamental to the success of the redesign projects. Students worked in computer classrooms/computer labs on a fixed, flexible or combination fixed/flexible schedule each week. The nature of the computer-supported classroom made it impossible for students to adopt a passive strategy in the course, as was often the case with lecture-discussion approaches to teaching mathematics. During the scheduled class meetings, students spent more time working problems and doing math than watching their instructor work examples. The mantra “students learn math by doing math” was the redesign standard.
  • Basing the course on instructional software. The use of effective online instructional software (e.g., Pearson’s MyLabsPlus/MyMathLab, McGraw-Hill’s ALEKS online learning system and Hawkes Learning System) served as a key component in each redesign project. Each course delivery system offered consistent, high-quality, customizable content and created a student-friendly introduction to the math courses. Lecture videos, animated examples, electronic textbook, study plan, homework assignments, quizzes, practice tests and post-tests were all in the same online location and could be accessed anywhere, anytime (although proctored post-tests had to be taken in the lab). Students could work on assignments from any computer with Internet access and had access to step-by-step explanations and video lessons. A major advantage of using interactive software was the immediate feedback provided to students. When working a homework assignment, students learned immediately whether an answer was correct or incorrect, and practice problems could be retaken until mastery of a concept was achieved. The software gave students multiple resources (hints on how to solve problems, videos, animations, worked problems similar to the one missed and links to the e-textbook) to correct their understanding if they had not mastered a skill. The software also allowed instructors to develop pools of questions that increased the variety of questions on a particular topic. Randomizing the order of questions promoted more analytical thinking rather than the blind application of algorithms to a predictable sequence of question types.
  • Providing one-on-one, personalized, on-demand assistance for students. The availability of on-demand individual assistance in the lab/computer classroom ensured that students received immediate help when needed. A variety of resources were available to accommodate students’ varying levels of preparation and anxiety and their diverse learning styles. Students had the option to ask for help online, ask for help from an instructor or tutor, watch a video or attend a mini break-out session. Tutors and instructors in the math lab provided individual attention to address student-specific problems. The varying levels of personnel allowed students to seek help from someone with whom they were most comfortable and whose teaching style was better suited to the individual student’s learning needs. During class meetings, instructors were able to walk around the room for one-on-one direct interaction with each student. Such frequent interaction helped build a relationship with each student that was more difficult to achieve in a traditional lecture format. “Teachable moment” opportunities in the lab allowed instructors and students to build relationships and further foster learning. Students tuned out less when they received timely, targeted assistance that met their perceived needs. The additional in-class opportunities for face-to-face interaction also made it possible to offer encouragement, celebration of successes and exhortation to make more effective progress, as appropriate.
  • Establishing clear expectations for progress with deadlines. Each project divided the content of its developmental math courses into learning modules or mini-modules with weekly expectations for completion. Weekly schedules gave students a guideline on the pace of work necessary to complete the course on time. These schedules were of significant value in helping students see what they had left to accomplish in the course and in ensuring that each course could be finished within one semester. Dividing the course content into modules provided students with manageable units for study. Weekly homework assignments and assessments kept the students on task and engaged in learning throughout the course.
  • Requiring attendance. It was absolutely necessary to have an incentive for attending class and/or a penalty for not attending. Math faculty and tutorial staff quickly realized that, as NCAT had indicated, “Students don’t do optional.” Whenever optional lab time was offered, the vast majority of students failed to take advantage of it. At participating colleges, attendance counted for between 5% and 10% of the final grade, which provided sufficient motivation for students to attend class, where they were required to work on their course. Students were more likely to seek help in the emporium environment than when working at home. Some institutions also penalized students for lack of attendance (e.g., students who missed 12 hours of class were administratively withdrawn from the course).
  • Monitoring student progress via logs, guidebooks, workbooks and score sheets. Instructors who required students to spend hours in an open lab were provided with logs that indicated the dates and times that students visited the open labs. All software packages contained excellent tracking and communication tools that enabled students and instructors to receive feedback on each individual student’s progress. Some instructors used a weekly score sheet that included points for staying up to date with videos, worksheets, homework and quizzes as well as points for class and lab attendance. Still others created a paper workbook or notebook that students were required to maintain, containing class notes, notes from the software’s learning tools and solutions to exercises, which facilitated working through the steps of problems by hand. By recording the progress of each student every week in his or her workbook or notebook, instructors were able to discuss progress in the courses with each student as well as stimulate discussions that allowed students and faculty to become more comfortable with each other. Whatever the method, instructors monitored each student’s progress as well as time-on-task and took appropriate action when needed.
  • Holding weekly discussions with students. In many projects, instructors met with each student individually each week to assess the student’s progress and to help the student develop a course of action for the next week. This face-to-face meeting helped students develop a sense of personal responsibility for their work, a major factor in their development as students and success in the courses.

In addition to the techniques described above that are characteristic of the Emporium Model in both developmental and college-level math, institutions participating in Changing the Equation were required to add a number of other practices based on prior NCAT experience.

  • Modularizing course materials and course structure. Each project divided its developmental math sequence into a series of modules or mini-modules. (See http://www.thencat.org/Mathematics/CTE/SchoolData/Courses_Modules.html for a listing of how many courses were converted to how many modules at each participating institution.) Each module corresponded to a learning objective or competency within the course sequence. Some institutions retained course titles. For example, at Lurleen B. Wallace Community College, modules 1, 2, 3 and 4 equated to Basic Math (module 5 was the final exam); modules 6, 7, 8, 9 and 10 equated to Elementary Algebra (module 11 was the final exam); and modules 12, 13, 14, 15 and 16 equated to Intermediate Algebra (module 17 was the final exam). Students were expected to complete at least one module per week, but they had the option of completing more. Other institutions abandoned the old course structure and simply offered modules in the context of a single developmental math course. For example, Nashville State Community College and Volunteer State Community College eliminated their three traditional developmental math courses. The redesigned curriculum was based on topics corresponding to high school-level math and ACT content, which translated to five competencies, or five modules, that students were required to complete for the course.

    In all cases, progress through the course(s) required completion of one module at mastery level before moving to the next. This included completing the online quizzes, homework problems and notebook assignments that covered the objectives for the week. Students could complete one course early and move into the next course in the same semester; students who completed two courses in one semester paid for only one. Students who did not finish the required modules by the end of the semester could begin work the next semester exactly where they left off. In short, students could progress more quickly or more slowly as needed. Because all sections used the same structure and procedures, students could also change sections during the semester without interrupting their learning.
  • Requiring mastery learning. Within each module, students were required to complete assignments and could not move to the next element within the module until mastering each component, with mastery levels ranging from 75% to 90% of the material. A typical sequence began with a preview quiz, which students could attempt in order to demonstrate mastery and thus bypass the module, or they could move directly to the homework if they felt unfamiliar with the material. Before students could move from one homework assignment to the next, they were required to demonstrate mastery on each assignment. After all homework for a module was completed, students took practice quizzes on which online learning aids were not available. If a student did not demonstrate mastery on the practice quiz, he or she had to remediate on missed concepts before taking it again; students typically were allowed multiple attempts. Though not part of the grade calculation, the practice quiz required mastery before students were permitted to take the module post-test. Once prepared, students took a proctored post-test in the lab. To move to the next module, students had to demonstrate mastery on the post-test. Students who were not able to do so met with the instructor, who reviewed the student’s work on the test and recommended remediation techniques before the student retook it.
  • Establishing greater course consistency. In the traditional format, consistency among different instructors or different campuses within the same institution was lacking. Course objectives, desired learning outcomes, instructional strategies and course materials varied. In the traditional format, students could fail a test and still pass the course, leaving gaps in their learning. Unlike redesign students, who were assessed on common outcomes using common assessment methods, those in traditional courses were assessed in a variety of ways, which led to overall grading differences and grade inflation. Contributors to prior grade inflation in the traditional course included 1) having no clear guidelines regarding the award of partial credit, 2) allowing students to fail the final exam yet still pass the course, 3) failing to establish common standards for topic coverage (in some sections, entire topics were not covered, yet students passed), and 4) failing to provide training and oversight of part-time instructors. Thus, the C or better rates for the traditional courses were almost universally inflated. Furthermore, in the traditional format, students exited the course by simply attaining a total cumulative score of at least 70% or 75%. Because grades were averaged, students were able to earn a C or better by passing enough tests and learning enough competencies, but not necessarily all of them. In traditional sections, students would often continue on to the next topic without having demonstrated mastery of the previous topic.

    The Emporium Model created consistency of course content and course delivery for students. Faculty from all campuses developed the module content and determined course delivery to ensure that all students had the same learning experience regardless of the instructor or campus location. Course redesign allowed alignment of course outcomes for all sections of developmental math courses offered at each institution. This resulted in every student moving forward to credit-bearing math courses having mastered defined learning outcomes for the developmental math sequence. Mastery learning thus meant that students were doing more work and learning more, which ensured that students were well prepared to take on college-level work.
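
The mastery-gating sequence described above (preview quiz, homework, practice quiz, proctored post-test) can be sketched as a simple state machine. The 80% threshold and step names below are illustrative only; actual CTE mastery levels ranged from 75% to 90%, and each software package implemented the gating in its own way.

```python
def next_step(step, score, mastery=0.80):
    """Return the next activity a student may attempt, given a score.

    Hypothetical sketch of the CTE mastery gate; thresholds and step
    names are illustrative, not taken from any particular project.
    """
    if step == "preview_quiz":
        # Mastery on the preview quiz lets a student bypass the module.
        return "next_module" if score >= mastery else "homework"
    order = ["homework", "practice_quiz", "post_test", "next_module"]
    i = order.index(step)
    # Advance only on mastery; otherwise repeat (remediate) the same step.
    return order[i + 1] if score >= mastery else step

print(next_step("homework", 0.72))       # homework (remediate and retry)
print(next_step("practice_quiz", 0.85))  # post_test
```

The key property is that a below-mastery score never advances the student, which is exactly what distinguished the redesigned courses from averaging-based traditional grading.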

Learning from Experience: Cost Reduction

As noted above, there were two primary ways that cost reduction was achieved: 1) by increasing section size, and 2) by increasing the number of sections that full-time and adjunct faculty counted toward their load. Both of these strategies were implemented without increasing faculty workload because of the elimination of repetitive tasks such as hand-grading homework, quizzes and exams. Seven of the 32 institutions met or came close to meeting their cost savings projections, and seven additional institutions exceeded their original cost reduction goals.

But 17 of the 32 institutions failed to fully carry out their cost reduction plans, although all but one of them produced some savings. What were the decisions they made and the actions they took that led to this shortfall?

  • Four institutions significantly increased the cost of lab tutors over their planned expenditures.
  • Three institutions increased the cost of course coordination over their original plans.
  • Five institutions increased the percentage of sections taught by full-time faculty vs. those taught by adjunct faculty in excess of their planned percentages. Since full-time faculty members are more expensive than adjunct faculty, this action reduced the amount of savings generated by the projects.
  • Twelve institutions simply did not carry out their plan to increase section size because they offered too many sections. To arrive at the correct number of sections, the anticipated enrollment (and we understand that one can never be exactly sure what that will be) should be divided by the planned redesign section size, and only that number of sections should be offered. NCAT found that many projects were surprisingly unsophisticated about how to schedule sections for maximum efficiency.
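
The scheduling arithmetic is simple ceiling division, as this short sketch shows. The enrollment and section-size figures here are hypothetical:

```python
import math

def sections_needed(anticipated_enrollment, planned_section_size):
    """Offer only as many sections as projected demand requires."""
    return math.ceil(anticipated_enrollment / planned_section_size)

# Hypothetical figures: 460 projected students, 40 seats per redesigned section.
print(sections_needed(460, 40))  # 12
```

Offering 15 or 16 sections against that demand, as several projects effectively did, dilutes enrollment across sections and forfeits the savings that larger sections were supposed to produce.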

Why were the 25% projected average CTE cost reduction and the 20% actual cost reduction lower than NCAT’s historical 37% average?

The average cost reduction in 120 completed NCAT redesigns has been about 37% whereas the average cost reduction in the Changing the Equation projects was projected to be about 25% and the actual reduction averaged 20%. Aside from the changes the projects made in their plans, there is another important factor that accounts for the difference.

Despite using the cost reduction strategies described above, a large number of the redesigns planned to increase the proportion of full-time faculty teaching developmental math:

  • The total number of traditional sections offered at the original 38 institutions was 4,185. Of these, 1,623 (39%) were taught by full-time faculty and 2,562 (61%) were taught by adjuncts.
  • After redesign, the total number of sections planned to be offered was 3,159. Of these, 1,377 (44%) were taught by full-time faculty, and 1,782 (56%) were taught by adjuncts.

A five percentage point swing in full-time/adjunct ratios may not seem like a lot, but since full-time faculty are generally paid about three to four times as much per section as adjunct faculty, the impact of this shift is quite dramatic.
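
The arithmetic behind that swing can be illustrated with the section counts above. The per-section pay rates in this sketch are hypothetical, chosen only to match the roughly three-to-four-times ratio the text cites:

```python
# Section counts come from the CTE figures above; the pay rates per section
# are assumed values reflecting the cited 3-4x full-time/adjunct ratio.
FT_PAY, ADJ_PAY = 7_000, 2_000

def blended_cost(ft_sections, adj_sections):
    """Average instructional cost per section for a given staffing mix."""
    total = ft_sections * FT_PAY + adj_sections * ADJ_PAY
    return total / (ft_sections + adj_sections)

before = blended_cost(1_623, 2_562)  # traditional: 39% FT / 61% adjunct
after = blended_cost(1_377, 1_782)   # planned redesign: 44% FT / 56% adjunct
print(f"{before:,.0f} -> {after:,.0f} per section ({after / before - 1:+.1%})")
```

With these assumed rates, the five-point shift toward full-time staffing raises the blended cost per section by roughly 6%, quietly eroding the savings generated by larger sections and heavier teaching loads.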

In the traditional format, the proportion of full-time faculty teaching developmental math was less than 50% at the majority of institutions:

  • At 10 institutions (26%), the proportion of FT faculty teaching developmental math was less than 25%.
  • At 17 institutions (45%), the proportion of FT faculty teaching developmental math was between 25% and 50%.
  • At 11 institutions (29%), the proportion of FT faculty teaching developmental math was more than 50%.

In the redesigns, 55% of the institutions (N=21) planned to increase the proportion of full-time faculty teaching developmental math, ranging from a 6% to a 161% increase. At nine institutions (24%), the full-time/part-time ratio was planned to stay the same. Eight institutions (21%) planned to decrease the proportion of full-time faculty, ranging from an 8% to a 71% decrease. And, as we note above, five institutions failed to implement their planned full-time/part-time faculty ratios.

Learning from Experience: Implementation Issues

While we now know what works well—and what we need to do in order to achieve the goals of course redesign—we also know that getting there means a lot of hard work. Putting the good ideas derived from past successes into practice represents a real commitment on the part of participating institutions, especially when implementation challenges rear their ugly heads.

At the outset, NCAT tried to provide an “early alert” to the CTE institutions by identifying the five most important implementation issues encountered by prior projects. NCAT encouraged the new institutions to pay special attention to how they would:

  • prepare students (and their parents) and the campus for changes in the courses;
  • train instructors, graduate teaching assistants and undergraduate peer tutors;
  • ensure an adequate technological infrastructure to support the redesign as planned;
  • achieve initial and ongoing faculty consensus about the redesign; and,
  • avoid backsliding by building ongoing institutional commitment to the redesign.

Indeed, once the redesign implementation phase was launched, participating campuses encountered exactly the same implementation challenges that prior projects had reported.

  • Preparing students and the campus for change. For most projects, the biggest challenge for students, faculty, advisors and parents was making the change from traditional lecture-based, teacher-directed learning to the Emporium Model. NCAT staff and Redesign Scholars cautioned teams that the adjustment period would be difficult but that persistence would win out. The pilot semester was a difficult transition period for both faculty and students as the redesign methodology was introduced. Most common were negative student reactions to the perception that the class was “an online class” that they did not think they had signed up for or that it “had no teacher.” These challenges were addressed by engaging advisors up front to explain what the course would be like and by developing written materials and orientation sessions that explained the new format. Student complaints also diminished as students adjusted to new learning demands, gained more experience with the new format and recognized that it was here to stay. Faculty, meanwhile, had to learn both the virtues and the shortcomings of the selected software. By the subsequent semester, instructors were knowledgeable about the learning software, and students had adjusted to their new roles as active learners.

    A number of participants also reported encountering challenges associated with preparing others on campus for the redesigned format. Most of these were related to advising, where advisors did not provide correct information to students or simply misunderstood what the course was about. Although departments worked closely with their administrations while planning the redesign, more effort should have been spent preparing the entire college community for the changes. Even though a thorough explanation of the redesign’s rationale, benefits and structure was presented to academic advisors and student service personnel, some were not as supportive as needed in encouraging students to accept the change and take advantage of the ability to complete developmental math faster than in the past. Project leaders learned that they needed to constantly and consciously “market” the redesign to key campus constituencies who knew little about the new format and how it differed from more traditional offerings. Taking a pro-active approach helped during the transition period: offering sessions at orientations for students and parents, using the summer to train instructors, visiting advisors’ and coaches’ meetings to describe the benefits of the new approach, and addressing parent/student complaints immediately. By the point of full implementation, these problems had been largely overcome.
  • Initial and ongoing training. About half of the participants reported challenges in training instructors, particularly adjunct instructors, and tutors. As one participant observed, “Full-time faculty needed to spend a considerable amount of time mentoring the part-time faculty and tutors when the redesign was fully implemented. This resulted in enhanced consistency in the redesign but also required more time than anticipated. Going forward, this investment will assure continued success of redesign.” Another commented, “Despite two comprehensive training sessions, there seems to have been insufficient opportunity to 1) explain the Emporium Model, 2) provide MyMathLab instruction, and 3) explain the rationale and benefits sufficiently to part-time instructors. With the instructors having to learn how to use the software while getting used to a different teaching style, it was difficult for some of them to be positive and encouraging to the students who were troubled by the change. Attitudes improved during the first semester, and the second semester went much more smoothly. At the start of the second semester, both instructors and students were given better explanations of the rationale and benefits of the redesigned course, as well as guidelines outlining the responsibilities of each and strategies for success.”

    An important aspect of ongoing training was maintaining consistency in implementing all elements of the redesign. Full-time faculty needed to spend a considerable amount of time mentoring the part-time faculty and tutors when the redesign was fully implemented. Although initial training was provided for all faculty and tutors, it was evident that there were inconsistencies in the application of the redesign. For example, students were required to complete guided lecture notes before taking a quiz; however, some faculty and tutors did not monitor guided lecture note completion. Some instructors allowed students to listen to music with headphones. Some instructors permitted use of notes on proctored exams. The faculty met and formulated firm rules about such matters that will be used in future semesters. Faculty had to adjust to the concept that they could not make decisions based on their individual interpretations; rather, all had to follow the same rules and guidelines. If an instructor had an idea for improving student learning and/or the process, the idea had to be agreed upon and used by all instructors. Since unforeseen issues regularly arose, weekly staff meetings were necessary, with results recorded, published and distributed so that all faculty and staff would consistently implement those decisions. This resulted in enhanced consistency in the redesigned course but also required more time than anticipated. Going forward, this investment will assure the continued success of redesign.
  • Ensuring an adequate technological infrastructure. Sufficient and dedicated technical support is imperative when implementing the Emporium Model since technology is an important component of redesign. Most participants reported encountering some challenges in ensuring an adequate technological infrastructure. Most of these occurred during early implementation and concerned periodic Internet outages (sporadic interruptions in access to the course software or campus network interruptions), late-arriving equipment and software glitches. Internet outages created downtime while students were scheduled in the labs or computer classrooms, so it was important to develop an alternative plan to engage students when the Internet was not available. Faculty teams responded by collecting an assortment of interactive electronic documents demonstrating mathematical concepts that students could turn to so that time in class was not wasted. Students demonstrated maturity and patience during these periodic episodes, which have seldom recurred since that time.

    Students’ lack of computer literacy made the first few weeks of class frustrating. Technical issues with access codes and multiple log-ins created delays. Internet browsing (Facebook) during class time proved to be a distraction and interfered with students’ time on task, and problem-solving websites created academic integrity issues. These problems were resolved by installing software that locked down Internet surfing in computer classrooms. A number of projects encountered problems in finding and training a sufficient number of tutors for the computer labs. Scheduling was sometimes difficult to predict in open labs due to drop-in students who required more attention from tutors. All of these problems were resolved by the time of full implementation thanks to cooperation among faculty, IT staff and the commercial software providers.

    Technological problems unique to Changing the Equation had to do with interfacing with various campus student information systems (registration, financial aid, billing, registrar) due to the “non-traditional” organization of the modularized course(s). Establishing an interface between the instructional software packages, with their built-in course management systems, and the larger campus-wide systems also presented challenges. It is difficult to generalize about this issue since the variety of course structures, software packages and student information systems, and their interactions with one another, created multiple kinds of problems. Suffice it to say, addressing these issues early in the redesign process enabled a smoother transition, and the cooperation of IT and other campus offices was essential.
  • Achieving initial and ongoing faculty consensus. About two thirds of participants reported challenges in achieving faculty consensus within the department about the redesign. Some of this was attributed to leadership issues—for example, interim department chairs who were reluctant to press resisting faculty. Others stressed the need for strong leadership and administrative support to overcome these challenges. Some project leaders thought they had solved the problem of faculty buy-in at the outset but were surprised to find that they had not communicated as effectively as they had thought, which underlines the importance of constant communication to check signals and maintain momentum. That all of the final redesigns will be continued testifies that these challenges were eventually overcome. Failing to achieve faculty consensus was the most important reason why five of the original 38 institutions did not complete their redesign projects. Team leaders thought they had their colleagues’ support, but when the redesign got underway, they discovered that the opposition was stronger than anticipated.
  • Building institutional commitment. About half of the final participants reported progress in building institutional commitment to redesign outside their home department. Participants frequently cited leadership and administrative support as factors in sustaining and expanding interest in redesign. In some cases, redesign is being encouraged by system-level leadership; another project noted support by trustees as a factor. Like building acceptance within the department, however, broadening institutional commitment required continuing attention and support even under favorable circumstances.

NCAT planning resources like the Five Critical Implementation Issues were designed to reduce implementation challenges, and many respondents emphasized that it was helpful to learn that others had encountered a particular problem and how they had overcome it. But sentiment was equally clear that many of these problems could not have been foreseen and that every campus may need to go through the experience of redesign on its own. As one participant put it, “I’m not sure you can change human nature…I can read the Implementation Issues, but I need to have the experience before I really understand the issue.” Another noted, “I do not see a way to avoid implementation issues…it is more a matter of one set of problems versus another set.”

Participants stressed that foresight was helpful--and that they valued examples, advice and interaction with experienced practitioners (NCAT Redesign Scholars and NCAT staff)--but that there were no simple answers to complex organizational challenges. As one of them noted, “I think the Implementation Issues aren’t about the principles of course redesign as much as [they are about] the institutional context.” Materials and shared experiences helped CTE participants know that particular implementation issues were likely to arise and assured them that they could work through them, but they could not provide ready-made “solutions.” Asked to sum up the implementation process as a whole, one participant emphasized, “It just takes hard work and time.”

Building on What We’ve Learned

CTE has provided a number of important lessons to inform next steps in developmental mathematics redesign—lessons that are already being incorporated into NCAT’s ongoing work. Among the most prominent lessons learned were the following:

1. The Emporium Model will improve learning outcomes, increase completion rates and reduce instructional costs in developmental math as long as one “follows the underlying rules.”

CTE sought to scale a proven innovation. Success required course teams to “follow the underlying rules” of the project with respect to proven redesign approaches and readily-available tools and resources. Those that followed the rules achieved success; those that did not struggled and needed to make corrections to bring their projects into line with the “rules.”

2. The Emporium Model has reached a tipping point.

There are now at least 50 ongoing large-scale mathematics redesigns that use the Emporium Model in both developmental and college-level mathematics at both two- and four-year institutions. CTE projects are serving as models for statewide developmental math reform in Connecticut, Kentucky, North Carolina, Tennessee and Virginia. In the past year, we have seen the implementation of the Emporium Model at many institutions that were not participating in a formal NCAT program. There are now sufficient examples, resources and experienced faculty leaders to enable other institutions to move to the Emporium Model without a formal NCAT program.

3. Strong local leadership is critical to staying the course.

Success in course redesign requires strong local project leadership—either at the department or at a higher administrative level. There was no one model of successful leadership in CTE. Some redesigns were managed collegially, others depended upon a core group of tenacious faculty, and still others were implemented in a “top-down” fashion by the administration. An important function of top leadership was a willingness to stand up publicly on many occasions to talk about the project and its benefits, and to back up the project team when it ran into trouble by providing resources or fixing administrative problems. But above all, local leaders had to ensure that the team stuck to the basic redesign plan.

4. Final course grades are not a good comparative measure of success.

Final course grades are not a measure of student learning outcomes unless the content, assignments and assessments of the courses being compared are the same. As reported above, grade inflation and/or inconsistency was near universal among the CTE projects and, we believe, is near universal in all developmental math programs. Anyone engaged in developmental math reform who uses final course grades to "prove" success is on very shaky ground.

5. Measuring comparative completion turned out to be very complex.

As reported above, comparison of course-by-course completion rates turned out not to be a true measure of the program's success for a variety of reasons. Taking into account those students who were “making progress” at the end of a course (i.e., earning the equivalent of a C or better under mastery learning) showed improvement in completion rates for 74% of the 50 courses that could be compared. NCAT has learned that completion of the developmental math sequence and success in subsequent college-level math courses are the two most important data points for comparing student success rates between the traditional and redesigned formats. If only 20% of students exit the developmental math sequence but 75% pass the college-level course, you still have a problem, just as you did when 50% exited the sequence unprepared and only 30% passed the college-level course. Unfortunately, the program period was too short to collect those data.
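The point about pairing the two data points can be made concrete with a little arithmetic: the share of entering students who succeed end-to-end is the product of the sequence exit rate and the subsequent college-level pass rate. A minimal sketch using the hypothetical figures from the paragraph above:

```python
def end_to_end_success(exit_rate: float, pass_rate: float) -> float:
    """Fraction of entering students who complete the developmental
    sequence AND then pass the college-level math course."""
    return exit_rate * pass_rate

# Scenario 1: few students exit the sequence, but those who do are prepared.
low_exit_well_prepared = end_to_end_success(0.20, 0.75)

# Scenario 2: many students exit, but most are unprepared and fail.
high_exit_unprepared = end_to_end_success(0.50, 0.30)

# Both scenarios carry the same 15% of entering students through to success,
# which is why neither rate on its own tells the whole story.
assert abs(low_exit_well_prepared - high_exit_unprepared) < 1e-9
```

Either rate viewed alone looks very different (20% vs. 50% exiting; 75% vs. 30% passing), yet the end-to-end outcome is identical, which is the report's argument for tracking both measures together.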

6. Do you want students to learn math and be well-prepared for college-level work or to complete more rapidly? Both may not be possible.

In the redesigns, students were required to master all of the content of all of the courses at levels ranging from 75% to 90%. In the traditional format, students exited the course simply by attaining a total cumulative score of at least 70% or 75%. Because grades were averaged, students could earn a C or better by learning enough competencies but not necessarily all of them, and they often continued on to the next topic without having demonstrated mastery of the previous one. Mastery learning meant that students did more work and learned more; while that sometimes took longer, it ensured that they were well prepared to take on college-level work.
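The difference between the two grading schemes can be shown in a small sketch (the module scores and the 75% thresholds here are hypothetical, chosen only to mirror the ranges described above):

```python
# Hypothetical module scores (percentages) for one student.
scores = [95, 90, 85, 40]

# Traditional format: exit on the cumulative average alone.
average = sum(scores) / len(scores)   # 77.5
passes_traditional = average >= 75    # True: the weak module is hidden

# Mastery learning: every module must meet the threshold.
MASTERY_THRESHOLD = 75
passes_mastery = all(s >= MASTERY_THRESHOLD for s in scores)  # False: 40 < 75

print(passes_traditional, passes_mastery)  # prints: True False
```

The averaged grade lets three strong modules mask one unmastered topic; under mastery learning the same student would keep working on that topic before moving on.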

7. Only a relatively small number of developmental math students are actually able to accelerate.

Many involved in developmental math reform want to create circumstances in which students can accelerate through the required course sequence. This was certainly true of the CTE participating institutions, and the redesigns were built to allow students to do so. In practice, only a small number of students in each project were able to accelerate. While this provided an excellent opportunity for those particular students, the great majority needed the full term to complete a one-course equivalent, and many needed more than one term; that is, they needed to slow down in order to be successful. Just about all projects believed that many students would be able to test out of a given module and accelerate through the sequence. As most discovered, however, very few students could; projects frequently reported that only one or two were able to do so.

8. The participating community colleges, in general, did not seem to care about cost reduction unless an action solved an immediate problem (the one-room schoolhouse) and the by-product was reduced cost.

Only a minority of projects bought into the cost reduction aspect of the program. This is reflected in the fact that 14 of the 32 projects did not carry out their cost plans; of the 18 that did, three reduced their costs by less than 9%. Yet all projects showed that one can reduce cost while improving quality, and a large minority (15 projects) achieved significant cost reduction and clearly bought into this aspect of the program.

9. The participating community colleges were surprisingly unsophisticated when it came to dealing with data of any kind.

NCAT required CTE participants to collect data on comparative student learning outcomes, comparative completion rates and comparative instructional costs using relatively simple, straightforward forms developed over the past 13 years in working with hundreds of colleges and universities. (See http://www.theNCAT.org/PlanRes/R2R_ModAssess.htm for the assessment and completion forms and http://www.theNCAT.org/Mathematics/CTE/CTE_CSSF.xls for the cost form.) Most participants had a great deal of trouble completing these forms and found it difficult to deal conceptually with the data that were captured on the forms. Given that we were dealing with math faculty, NCAT found this phenomenon to be somewhat astonishing.

10. The successful projects valued and took advantage of the collaborative aspects of the program whereas those who were less successful did not.

CTE participants valued collaborative work throughout the project. Those attending the planning workshops overwhelmingly felt that the opportunity to see real examples and to interact with Emporium Model “veterans,” as well as with program staff, was most valuable. Subsequent workshops were also highly praised for the interaction that took place. Such interaction not only disseminated ideas and techniques but also built solidarity and mutual support through the knowledge that others were encountering (and overcoming) similar obstacles on their own campuses.

Participants also valued collaboration within their own campus teams. The CTE redesign methodology emphasized the use of a team to implement the projects, and all successful CTE redesign projects indicated that they had taken such an approach.

Yet the NCAT Redesign Scholar mentoring program—the most formal collaborative feature of the CTE program design—did not work as expected. Each institution was assigned a Redesign Scholar to mentor it through the redesign process. Funds from the program were made available to support campus visits. While some participants took full advantage of the Redesign Scholars, most did not. Many course teams preferred to “figure out things for themselves,” without consulting experienced partners, even though this sometimes meant “reinventing the wheel.” This has happened repeatedly in prior NCAT programs as well. It is clear, however, that the strongest projects took the most advantage of the Redesign Scholars. Those that are still “in progress” were often content to flounder on their own, ignoring the valuable advice given to them by NCAT Scholars and staff.

Conclusion

Changing the Equation required teams to radically redesign multiple courses in a relatively short period of time. Most projects had significant implementation issues to deal with (faculty training, faculty buy-in, technology problems, registrarial and financial aid issues, facilities issues, and so on). In addition, most of the community colleges were unaccustomed to collecting assessment and cost data to evaluate the efficacy of their redesigns. Yet despite these challenges, they all intend to continue and improve on the initial implementations of their redesigns.

In retrospect, the timeline for Changing the Equation was probably too aggressive, given the amount of change that was required. The fall 2011 full implementation period was, in many respects, more similar to a typical NCAT pilot period. Teams made mistakes that they are now in the process of correcting, and many have acknowledged having multiple difficulties during the transition. It is not coincidental that those institutions with a “head start” on the redesign process (those that began their redesigns before the formal grant period started) have shown the strongest results.

We at NCAT are confident that if those institutions with poor results to date make the necessary changes and “follow the rules” of the Emporium Model, they too will show the strong outcomes demonstrated by prior math redesign projects. The emergence of institutions that have successfully redesigned their developmental math sequences outside of a formal NCAT program by “following the rules” gives us further confidence in the Emporium Model. We congratulate the Changing the Equation participants on their accomplishments thus far and look forward to their continuing achievements in addressing one of higher education’s most vexing problems, increasing student success in developmental mathematics.

Copyright 2013 The National Center for Academic Transformation
