The Roadmap to Redesign (R2R):
September 1, 2003 - August 31, 2006
Table of Contents
1. Establish academic practices in four disciplines (psychology, statistics, Spanish and pre-calculus mathematics) that will pair experienced with inexperienced institutions to diffuse the redesign methodology.
2. Establish initial repositories of research-based disciplinary materials that can be used by other colleges and universities.
3. Develop a streamlined redesign methodology that reduces the cost, the complexity and the length of redesign.
4. Establish a national competition to select new associates that will produce 40 new course redesign plans using the streamlined methodology.
5. Implement 20 expedited course redesigns that improve learning and reduce cost using the academic practice materials and the streamlined methodology.
6. Establish the conditions that will enable further expansion of the practice model and its ongoing stability.
The Center for Academic Transformation has established a solid track record of success and has created an initial proof-of-concept: that technology can be used to improve student learning while reducing instructional costs. Our goal now is to establish a more efficient means of spreading the ideas and practices that have come out of the Pew Grant Program in Course Redesign (PGPCR) to additional institutions.
Like the Pew-funded project, our new project is yet another proof-of-concept. We will test a new model--the academic practice--that partners experienced, successful institutions with new institutions and relies upon best practices and learning materials with a proven track record in particular disciplines. We will test a streamlined redesign methodology that continues to emphasize evaluation of both learning outcomes and institutional costs. If successful, we will lay the foundation for further escalation of redesign diffusion to many more institutions because both the "pieces" and the process will be in place. We anticipate an iterative process for adding "pieces" that can be replicated within the core disciplines and in other disciplines as well. As more "pieces" are added, redesign becomes more streamlined.
This project represents an important next step in getting us to the point that any institution of higher education will be able to draw upon the academic practices for redesign materials and assistance. Our long-term goal is to establish an independent, self-sustaining, subscriber-based entity that can maintain and expand the academic practice model to other disciplines and other institutions.
American colleges and universities are discovering exciting new ways of using information technology to enhance the process of teaching and learning and to extend access to new populations of students. For most institutions, however, new technologies represent a black hole of additional expense. Most campuses have simply bolted new technologies onto a fixed plant, a fixed faculty, and a fixed notion of classroom instruction. Under these circumstances, IT becomes part of the problem of rising costs rather than part of the solution. In addition, comparative research studies show that most technology-based courses produce learning outcomes that are "as good as" their traditional counterparts--what is called the "no significant difference" phenomenon. By and large, colleges and universities have not yet begun to realize the promise of technology to improve the quality of student learning, increase retention, and reduce the costs of instruction.
In contrast, the Center for Academic Transformation at Rensselaer Polytechnic Institute has collaborated with 30 institutions to demonstrate how IT can be used to achieve both quality enhancements and cost savings. In 1998, Dr. Carol Twigg, then vice president of EDUCAUSE, was asked by the Pew Charitable Trusts to develop a grant program focused on how technology could be used to improve student learning and do so cost effectively. She created the Pew Grant Program in Course Redesign (PGPCR) and was awarded a four-year, $8.8 million grant from the Trusts to carry it out. Dr. Twigg chose Rensselaer as the appropriate institution with which to affiliate the program and created the Center in April 1999.
The Center conducted a national competition to select 30 institutions from hundreds of applicants to each receive a grant of $200,000, awarded in three rounds of 10 per year. Each institution redesigned one large-enrollment, introductory course using technology. Why focus on such courses? Because undergraduate enrollments in the United States are concentrated heavily in only a few academic areas. In fact, just 25 courses generate about half of all student enrollments in community colleges and about a third of enrollments in four-year institutions. Successful completion of these courses is key to student progress toward a degree. High failure rates in them (15% at R1s, 30-40% at comprehensive universities, and 50-60% at community colleges) lead to significant drop-out rates between the first and second year.
The program followed a unique three-stage proposal process that required applicants to assess their readiness to participate in the program, develop a plan for improved learning outcomes, and analyze the cost of traditional methods of instruction as well as new methods of instruction utilizing technology. Prospective grant recipients were supported in this process through a series of invitational workshops that taught institutional teams these assessment and planning methodologies and through individual consultations with Center staff.
As part of the PGPCR, each of the 30 institutions conducted a rigorous evaluation focused on learning outcomes as measured by student performance and achievement. Results to date show improved student learning in 20 of the 30 projects, with the remaining 10 showing no significant difference. All 30 institutions reduced their costs by 40% on average (ranging from 20% to 86%) and anticipate a collective annual savings of $3.6 million. The 30 courses impact more than 50,000 students nationwide each year. Other outcomes achieved include increased course completion rates, improved retention, better student attitudes toward the subject matter, and increased student satisfaction with the mode of instruction compared to traditional formats.
The basic assessment question associated with the PGPCR was the degree to which improved learning was achieved at lowered cost. Answering this question required comparisons between the learning outcomes of a given course delivered in its traditional and in its redesigned format. This comparison was accomplished by running parallel sections of the course in traditional and redesigned formats or by comparing baseline information from a traditional course to a later offering of the course in redesigned format, looking at whether there were any differences in costs and outcomes in the "before and after."
The degree to which students actually mastered course content was the bottom line. Techniques used to assess student learning included comparing the results of common final examinations; comparing the results of embedded common questions or items in examinations or assignments; collecting samples of student work (papers, lab assignments, problems) and comparing their outcomes according to agreed-upon common faculty standards for scoring or grading; tracking student records after they completed redesigned courses, looking at a) proportions satisfactorily completing a downstream course, b) proportions going on to a second course in the discipline, and c) grade performances in "post-requisite" courses.
"Before and after" course costs were analyzed and documented using activity-based costing. The Center developed a spreadsheet-based course planning tool (CPT) that supported institutions in this process, which involved the following steps: 1) determine all personnel (faculty, adjuncts, teaching assistants, peer tutors, professional staff) costs expressed as an hourly rate; 2) identify the tasks associated with preparing and offering the course in a traditional format and the personnel involved; 3) determine how much time each person involved in preparing and offering the course in a traditional format spends on each of the tasks; 4) repeat steps 1 through 3 for the redesigned format; 5) enter the data in the CPT. The CPT then automatically calculates the cost of both formats and converts the data to a comparable cost-per-student measure. At the beginning of each project, baseline cost data (traditional course costs and projected redesigned course costs) were collected, and actual redesigned course costs were collected at the end.
In contrast to the contention that only certain kinds of institutions can accomplish these redesign goals, and in only one way, the program has demonstrated that many approaches can achieve positive results. And to counter the belief that only courses in a restricted subset of disciplines--science or math, for instance--can be effectively redesigned, the 30 projects provide successful examples in many disciplines including the humanities (6), math and statistics (13), the social sciences (6), and the natural sciences (5).
To one degree or another, the 30 projects share six best-practice characteristics:
The Center's goal is now the widespread adoption of these new methods of redesign throughout the broader higher education community. While the development process used by the Center has been extremely successful, we cannot simply replicate those methods in the next phase because each project required a considerable investment of money ($200,000), planning time (nine months) and implementation time (two years). It would be too costly to replicate the development model for large numbers of institutions. Furthermore, the institutions involved were leaders in the field with significant expertise, and most institutions are not as experienced. The challenge is to accelerate institutional adoption by simplifying or streamlining the redesign process--making it as close to turn-key as possible--while allowing for institutional individuality in the adoption process.
We propose to create a configuration of discipline-based academic practices that will develop a repository of Web-based learning resources and a streamlined redesign methodology in specific disciplines. Each academic practice will be a virtual collaboration of experienced institutional teams (instead of interested individuals) and will comprise multiple players from each institution: faculty (key to creating high-quality content and sound pedagogy), IT staff (key to creating the technological infrastructure to support redesign), assessment experts (key to establishing reliable and valid measures of student learning), and administrators (key to making it possible for redesigns to be implemented and sustained.) Each practice will serve as a source of redesign expertise in a particular discipline and create a repository of research-based course materials to be used by other institutions in the redesign process.
The core of each academic practice will be three teams from the PGPCR that successfully completed a large-scale redesign, demonstrating that they can collaborate effectively with their peers. We call these teams the core associates. By the end of the project, five additional teams will join each practice after completing a large-scale redesign using a streamlined redesign methodology, thus enlarging the number of core associates in each practice from three to eight. We call these new institutional teams the new associates. Each practice will select a managing partner to coordinate the activities of the practice, play a substantial role in adding new associates, and serve as the primary contact between the Center and each practice. From the 30 PGPCR projects, we have selected four academic areas in which a body of learning materials and excellent redesign models already exists to test the practice model: statistics, pre-calculus mathematics, Spanish and psychology.
Critical to the redesign projects' success in improving student learning has been their use of Web-based learning resources that enable active student engagement with course material. Such resources include interactive tutorials, exercises and quizzes that give students needed practice and automated feedback; digitally recorded presentations and demonstrations; textbooks and other reading materials; examples and exercises in the student's field of interest; links to other online materials; and individual and group laboratory assignments. These Web-based resources also contribute significantly to cost reduction because they allow the transfer of some tasks to technology-assisted activities, reducing the time that faculty and other instructional personnel must spend and thus the numbers needed to teach a course.
In each academic practice, faculty redesign experts will work collaboratively to assemble a set of modularized course components that map to desired learning outcomes in each discipline and that can be used flexibly by different types of institutions. Some materials may be produced by commercial publishers; others may be developed by higher education institutions. New associate institutions will redesign courses using the materials assembled by the practice--evaluating and revising the materials as necessary--and contribute additional materials once proven to increase student learning or contribute to cost reduction. Thus, each practice will build a living compendium of research-based, best-practice resources that can eventually be drawn upon by all higher education institutions. By supplying many of the "pieces" needed in course redesign, the practices will accelerate the course redesign diffusion process.
The Center for Academic Transformation will provide a virtual home for the practices and will manage the development and implementation of a streamlined redesign process. In the process used in the PGPCR, the Center encouraged each institution to develop its own unique redesign. From that process, we learned what works well and what does not. We will model optimal designs based on an analysis of the 30 projects (i.e., in statistics, we would not replicate the designs of Penn State, Ohio State, University of Illinois, or Carnegie Mellon but rather take the best elements of each.) We will create streamlined CPTs for each discipline that will enable novice institutions to move quickly through the planning process.
The Center will support the selection process for new associates in each practice through workshops, planning materials and individualized assistance, replicating the successful process employed in the PGPCR. We will also establish a process whereby each practice can be enlarged year after year as well as a model that can be replicated in other disciplines. Just as the materials development process will integrate research results regarding improving learning and reducing costs with experience in course redesign, so too will the streamlined redesign process be continually evaluated and improved according to what works and what does not.
In summary, the project will achieve the following objectives:
Objective 1. Establish academic practices in four disciplines (psychology, statistics, Spanish and pre-calculus mathematics) that will pair experienced with inexperienced institutions to diffuse the redesign methodology.
a. Select core teams for each practice. The Center will select three core teams for each academic practice from among the institutions listed below, all of which have indicated a willingness to participate.
Psychology : Cal Poly Pomona, University of Dayton, University of New Mexico, University of Southern Maine
Statistics : Carnegie Mellon University, Ohio State University, Penn State University, University of Illinois at Urbana-Champaign
Spanish : Portland State University, University of Tennessee-Knoxville, University of Illinois at Urbana-Champaign
Pre-Calculus Mathematics : Northern Arizona University, Rio Salado College, Riverside Community College, University of Alabama, University of Idaho, Virginia Tech.
The choice of three is determined by the available funding for the project. Selection will favor institutions that have created or used the largest number of widely applicable course materials, those that have implemented the most successful redesign models that can be easily translated to a broader audience, and those that are most willing to offer their expertise to new practice associates. We anticipate that the remaining institutions identified above, particularly in pre-calculus mathematics, will participate in each practice as well.
b. Establish an organizational structure for each practice. Most of the practice activities will be conducted virtually under the leadership and coordination of the managing partners and the Center staff. The practices will meet face-to-face on three occasions: On or about 12/15/03, all four practices will hold an initial organizational meeting in one location with Center staff (Practice Meeting I) with one representative from each core institution. The agenda will be to establish a work plan and timeline for each practice over the life of the project, to select a Managing Partner for each practice, to establish the role of the core associates, and to establish common standards for the materials repositories including common formats, access and storage strategies and resolution of technical issues. A technical consultant will be engaged to advise the practices regarding standards.
Objective 2. Establish initial repositories of research-based disciplinary materials that can be used by other colleges and universities.
a. Aggregate core learning materials . In each academic practice, faculty redesign experts will assemble a set of modularized course components that map to desired learning outcomes in each discipline and that can be used flexibly by different types of institutions.
Step 1. Identify learning objectives for each discipline. Each practice will produce a comprehensive list of learning objectives for the introductory course by aggregating desired learning objectives from all core institutions. While there will be a great deal of overlap among institutions, our idea is not to force consensus but rather to create an expansive list of objectives such that other institutions can select a sub-set from among them to use in redesigning their courses. A practice may collectively identify 25 learning outcomes in a discipline, but a particular institution may want to use only 15 of them in its redesign.
Step 2. Identify usable course components. Components from core associates' redesign projects proven to increase student learning and contribute to cost reduction will be identified. Components will include such things as interactive tutorial modules, interactive exercises with solutions, low-stakes quizzes, and team-based exercises.
Step 3. Map the learning materials to learning objectives. Each practice will map the course components to the established learning objectives, making it easy for new associates to identify appropriate materials to be used in their redesigns and to identify gaps in materials for which new materials need to be identified or created. For each practice, the Center Web site will list learning objectives, materials that support their accomplishment, and the institution that submitted them. This process of materials aggregation will be completed by 4/1/04 in time for potential new associates to develop their applications to join the practice.
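The mapping produced by Steps 1 through 3 can be pictured as a simple objective-to-materials index. The sketch below is purely illustrative: the objectives and components are hypothetical stand-ins for what a practice would actually list.

```python
# Hypothetical sketch of a practice's objective-to-materials map.
# Objective and component names are invented for illustration only.

repository = {
    "interpret a confidence interval": ["CMU interactive tutorial"],
    "run a two-sample t-test": ["OSU low-stakes quiz", "PSU exercise set"],
    "design a simple experiment": [],  # no proven component yet: a gap
}

# A new associate selects only the objectives it wants to cover
selected = ["interpret a confidence interval", "design a simple experiment"]

# Gaps are selected objectives with no supporting materials, flagging
# where new materials must be identified or created
gaps = [obj for obj in selected if not repository.get(obj)]
```

Keyed this way, the same repository serves institutions that choose different subsets of objectives, and the empty entries make the "missing pieces" immediately visible.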
b. Convert institution-specific materials for general use. Each core associate institution has developed or adapted redesign materials for its own students using different formats and different technologies. These materials will be converted from institution-specific to general use.
Step 1. Convert materials into agreed-upon formats. Following the agreed-upon formats established at Practice Meeting I, each institution will convert its own materials. This process will be straightforward for some and more complex for others; it will vary from practice to practice.
Step 2. Develop instructions for use. Following the agreed-upon formats established at Practice Meeting I, each institution will convert its existing instructions for student use into generic formats and develop guidelines for use by faculty members at other institutions. Each practice will consult with the new associates, which will have been selected by 8/15/04, as they carry out this process. Usable drafts of these converted materials will be completed by 1/1/05 in time for the selected new associates to incorporate the materials in their redesign pilots.
Objective 3. Develop a streamlined redesign methodology that reduces the cost, the complexity and the length of redesign.
In the PGPCR, we encouraged each institution to develop its own unique redesign within a specified number of parameters (e.g., the redesign has to produce cost savings.) From that process, we learned what works well and what does not. Our goal is to enable novice institutions to move more quickly through both the planning and the implementation parts of the redesign process. To accomplish this goal, we have a two-fold strategy.
The first part of our strategy is to create the repositories of learning materials described in Objective 2. While we encouraged institutions involved in the PGPCR to use existing materials whenever possible, many of the projects spent a considerable amount of time developing new materials, adapting existing materials for new use (e.g., converting paper-based materials to online materials), creating wraparounds, Web sites, and so on. Our idea is to have these materials ready for the new associates to use so that the cost, the complexity and the length of the redesign process will be reduced.
The second part of our strategy is to create a "Chinese menu" of redesign options: those techniques and models that resulted in optimal designs--that is, those most likely to lead to learning improvements and cost reduction--based on an analysis of the 30 Pew program redesign projects. Rather than beginning with a blank slate, each new associate institution will have a defined list of techniques from which to choose.
a. Create a compendium of critical redesign techniques. Based on the experience of the 30 redesign projects, we know that some techniques consistently led to better student learning and to reduced costs of instruction whereas others that were used showed no significant difference. In addition, some projects encountered a distinct set of implementation problems due to deficiencies in their planning. We will glean from the 30 redesigns descriptive lists of 10 or fewer pedagogical techniques proven to improve the quality of student learning; 10 or fewer techniques proven to reduce the cost of instruction; and 10 or fewer benchmarks for successful redesign implementation.
b. Collapse the 30 Pew program redesigns to five redesign models. While all 30 redesigns shared certain characteristics, each created a model that varied according to the discipline involved, the particular student audience and the preferences of faculty. After examining the similarities and differences in how these common characteristics were expressed, we can identify five distinct redesign models: 1) supplemental, 2) replacement, 3) emporium, 4) fully online and 5) buffet. (Please see http://www.educause.edu/ir/library/pdf/erm0352.pdf , an EDUCAUSE Review article elaborating these models.) We will create cases illustrating each model, CPTs describing how each model can be employed, and narratives explaining why an institution would choose one versus another and how it might mix and match among them.
By supplying many of the "pieces" needed in course redesign, we will accelerate the course redesign diffusion process. The cost of redesign will be reduced from a $200,000 grant to a level that can be supported by each new associate institution. The complexity of the planning process will be reduced by moving from a process of inventing each element from scratch to one of selecting from among proven techniques and models. The planning period will be reduced from nine months to two months. The implementation period will be reduced from two years to 15 months. (The rhythms of the academic calendar, especially the summer hiatus, make it impossible to reduce the implementation period further.) A description of the streamlined redesign methodology will be completed by 2/1/04 for incorporation in the guidelines for new associate participation. Examples illustrating the fully developed methodology will be completed by 5/15/04 for the new associate applicant workshop described below.
Objective 4. Establish a national competition to select new associates that will produce 40 new course redesign plans using the streamlined methodology.
a. Establish a national competition to select 20 new associates. The Center will establish a well-publicized national competition to select 20 new associates by replicating a streamlined version of the successful process employed in the PGPCR. The competition and a schedule of planned events will be announced in November 2003. The national competition process will teach the redesign methodology to twice as many institutions as will eventually participate and will create an awareness of the project in the higher education community reaching far beyond the participating institutions.
Step 1. Announce and publicize the national competition. The target date for announcing the competition is 11/1/03. We will utilize the following vehicles to publicize it: 1) the Center's current listserv of 4,500 members of the higher education community interested in redesign; 2) EDUCAUSE's listserv of 10,000 campus representatives involved in using technology (the Center has a formal affiliation with EDUCAUSE); 3) a targeted mailing to 3,600 campus presidents; 4) announcements via the respective practices' affiliated disciplinary societies. Guidelines for participation will be posted on the Center Web site and all announcements will direct interested institutions to the site.
Step 2. Develop participation guidelines for prospective new associates. We will revise and adapt the guidelines used in the PGPCR to reflect the streamlined redesign methodology. Used to assist new associates in developing redesign plans, these guidelines will be distributed using the vehicles described above on or about 2/1/04.
Step 3. Establish readiness criteria for participation. Not all institutions are ready to engage in large-scale redesign using technology. A set of preconditions--or readiness criteria--must be in place before an institution can successfully implement such an effort. We will revise and collapse the institutional readiness criteria and the course readiness criteria used in the PGPCR to one set of readiness criteria appropriate to the streamlined process with examples of how institutions can meet them. Applicants will be required to submit written responses to the readiness criteria by 4/1/04 as the first step in the application process, and the criteria will be used to select potential new associate applicants.
Step 4. Select 40 new associate applicants. Center staff and the managing partners will review the readiness responses and rank order them according to the institutions' level of readiness. The project director will review the rankings and make the final selection. Ten institutions per practice (40 total) will be selected by 4/15/04 and invited to send teams to a training workshop (Workshop I) in early June.
Step 5. Train 40 institutional teams. Center staff will conduct Workshop I on or about 6/1/04. This workshop will teach participants how to use the program's streamlined tools and techniques as they develop their redesign plans. The four managing partners and Dr. Peter Ewell, the project evaluator, will participate in this workshop as resources and facilitators. Two-person teams from each institution will be invited.
b. Produce 40 new course redesign plans using the streamlined redesign methodology. The Center staff and the academic practices will provide individualized assistance as prospective new associates prepare redesign plans. Institutions will be strongly encouraged to submit drafts of their plans for review and feedback before the final submission.
Step 1. Assess the applicability of the learning outcomes and course materials developed by the academic practices to new applicant institutions. Prospective new associates will review the learning outcomes and course materials developed by the academic practices in comparison to their own set of desired learning objectives. Based on that comparison, they will identify any "missing pieces"--that is, desired learning outcomes for which there are no course components available via the practice. They will then know which materials must be identified from free sources (e.g., NSF, MERLOT), licensed from commercial publishers, or developed themselves.
Step 2. Select the appropriate redesign model. Prospective new associates will select an appropriate redesign model in consultation with Center staff and the academic practices as well as appropriate pedagogical and cost reduction techniques.
Step 3. Develop final redesign plans. Prospective new associates must plan to conduct a pilot during the spring 2005 term and implement the full redesign during the fall 2005 term. Plans will consist of a narrative describing the specific activities required to implement the new course redesign, a plan to assess student-learning outcomes, and a completed CPT. Deadline for plan submission will be 8/1/04.
Step 4. Select 20 new associates (5 per practice). Three institutional representatives from each practice will review the 40 proposed plans and rank order them according to which institutions demonstrate the highest likelihood of success. Three external experts and the project director will then review the rankings and make the final selection. Five institutions per practice will be selected by 8/15/04 and invited to join the practices as new associates.
Step 5. Announce and publicize selection. Using the same outlets described above, the Center will announce and publicize the selection of new associates. In addition, each new associate institution will use its own publicity outlets to publicize its redesign project.
Objective 5. Implement 20 expedited course redesigns that improve learning and reduce cost using the academic practice materials and the streamlined methodology.
a. Complete development of 20 campus redesign projects. This timetable anticipates four months of planning and development to complete redesign preparations prior to campus pilots. Activities during this period include finalizing project teams and completing redesign activities such as working with existing practice materials and modifying them when necessary and incorporating additional content into course materials.
b. Implement 20 course redesigns. During the spring 2005 term, 20 new associates will conduct pilot implementations of their course redesigns. They will collect data on comparative student learning outcomes, and they will offer feedback to the academic practices regarding course materials. Each new associate will develop an interim progress report and submit it to the Center by 6/15/05 for posting on the Center's Web site. Workshop II, to be held in late June or early July 2005, will enable new associates to share and assess their pilot experiences and to provide an initial evaluation of the practice materials and the streamlined redesign methodology. Participants in the workshop will be one representative from each new associate institution, the four managing partners and Center staff.
During summer 2005, the 20 new associates will refine and revise their redesign plans based on the pilot experiences in consultation with Center staff and the practices. Materials will be revised based on feedback from instructors in the primary discipline and in client disciplines. New associates will also prepare to scale up the redesign for full implementation.
During fall 2005, the 20 new associates will fully implement their redesigns in all sections of the course. They will collect data on final student learning outcomes and final instructional costs; they will continue to offer feedback to the practices regarding course materials. By 5/1/06, final reports will be submitted to Center staff. Case-study versions will be presented at Workshop III in early July 2006. (Participants will be the same as in Workshop II.) New associates will assess their experiences and provide a final evaluation of the materials.
Objective 6. Establish the conditions that will enable further expansion of the practice model and its ongoing stability.
A major goal of this project is to test whether the academic practice model can be a viable way to increase the number of institutions engaged in course redesign. Two aspects of the practice model are fundamental: expanding the quality and quantity of available interactive learning materials and expanding the quality and quantity of human resources knowledgeable about redesign and willing to share that knowledge. In addition, some kind of ongoing structural entity is needed to sustain the effort. While the establishment of the practices and their interaction with the new associate institutions are important components of this project, their ongoing stability is key to their value. We anticipate an iterative process that can be replicated within the core disciplines and in other disciplines as well. Our long-term goal is to establish an independent, self-sustaining entity that can maintain and expand the academic practice model to other disciplines and other institutions.
a. Expand and improve the repository of practice materials. Each academic practice will use an iterative process for revising and adding to its repository of materials throughout the life of the project. The core practices will establish the initial frameworks for the repositories and convert core materials for general use as described in Objective 1. Each practice will also establish standards and mechanisms for ongoing quality control and consistency. Ensuring that the established frameworks are used and that all materials have been carefully vetted is key to enabling others who access the repository to find and use high-quality, tested materials.
The new associate applicants will critique and assess the materials in relation to their desired course learning objectives as described in Objective 4. The new associate institutions will redesign courses using the materials developed by the practice--evaluating and revising the materials as necessary--and contribute additional materials that have been proven to increase student learning or contribute to cost reduction as described in Objective 5. These new materials will be converted so that others in higher education can also use them. Thus, each practice will build a living compendium of research-based, best-practice resources that can eventually be drawn upon by all higher education institutions.
b. Expand and consolidate the practices. After the new associates complete their redesigns, they will have developed substantial expertise in large-scale course redesign, which will be valuable to others in the practices as well as to higher education in general. They will continue to exchange ideas with other associates and with other interested institutions of higher education. Thus, the new associates will become core associates, the first step in an ongoing expansion of the practices and their repositories of materials. At a summative Practice Meeting in July 2006, the core associates will review and assess what has been accomplished in light of the full implementation of the new course redesigns. They will add new materials that have been shown to improve learning and reduce costs to the repositories and develop plans for ongoing sustainability and expansion.
Our dissemination strategy has two facets. The first, which is the heart of our proposed project, is to diffuse a proven educational reform and to accelerate the pace of change at other institutions, as described in the project work plan. The second is to continue to increase awareness in the higher education community about how course redesign using technology can improve learning and reduce costs. In addition, as a result of the FIPSE-funded project, we will help the community understand how course redesign can be accomplished without a grant and in a reduced time period.
The Center has established a number of communication vehicles, including a 4,500-person mailing list, print and electronic newsletters, a comprehensive Web site, published articles and interviews, and an active national speaking program, all of which will be employed in disseminating the results of this new initiative. Specifically, we will:
Both formative and summative evaluations will be central activities in this project. They will occur on two levels--at the new associate campus level and at the project level--and will be conducted by a mix of evaluators including the new associate institutions, the practices, the Center staff and an external evaluator.
a. New associate campus level. As mentioned earlier, the Center has four years of experience working with multiple campus efforts to embed formative and summative evaluation into their course redesigns. In collaboration with Dr. Peter Ewell, the Center has established a rigorous evaluation process for gathering both quantitative and qualitative information about the impact of course redesign on improved student learning, increased retention and reduced instructional costs. The new associate campuses will be asked to implement an evaluation plan based on that process during their association with the project. Dr. Ewell will review the data on learning outcomes and retention submitted by the teams and verify the results; Center staff will review and verify the data on cost reduction. The teams will also collect formative data such as student and faculty opinion surveys and self-reports that will be used to improve their redesigns as they move from the pilot stage to full implementation.
Each new associate team will also be asked to assess its redesign as a case study in developing, implementing and sustaining innovative curriculum reform, both at the beginning of and throughout the life of the project. Teams will be asked to focus on pedagogical techniques that improve student learning, techniques that reduce instructional costs, and implementation issues that contributed to the success of the redesign or were barriers to its completion, using the model and format for preparing case studies used successfully in the Pew program. The campus team will gather this information at intervals throughout the project and synthesize it in a case study presentation at the final workshop and practice meeting.
b. Project level. The project itself will undertake a similar evaluation process to determine and describe the effectiveness of the practice model, that is, to evaluate whether the structure succeeds in providing new associates with the repository materials and guidance they need to redesign a large enrollment course using the streamlined methodology, and whether it succeeds at extending the lessons of these redesign efforts to a national audience. Each stage of the project will build in feedback loops to gauge progress and gaps by gathering interim and final reports from each practice and conducting reflective interviews. Each of the six major objectives will be individually measured, as noted below.
c. The external evaluator. Peter Ewell has been involved in thinking through the assessment approach as this proposal was written. He will provide technical assistance in further developing the evaluation plan and strategies to assess the project's activities at each of the six major steps in the project. He will assist in delivering those portions of the workshops that address how participants might incorporate assessment into, and develop evaluations of, their programs, and he will work with participating campuses to refine their plans to document student learning. He also will receive and evaluate the assessment results on student learning supplied by participating campuses and will collaborate with the project director to assemble the final project report, which will synthesize the results of the 20 course redesign projects and the project as a whole.
The Center's proven redesign methodology is unique in higher education. It combines reputable techniques from diverse knowledge bases such as student learning outcome assessment, activity-based costing, and pedagogically sound applications of information technology and applies them to large numbers of students and faculty members in many disciplines and diverse institutions. Its significance is its scalability: the methodology has enabled the faculty involved in these redesigns to incorporate good pedagogical practice into courses with very large numbers of students, a task that would have been impossible without technology, while significantly reducing the cost of instruction. The streamlined methodology will replicate these unique characteristics, with the added innovations described in our workplan.
The need to make Web-based interactive learning resources available in order to accelerate the benefits of technology-based learning has been well recognized in higher education. Indeed, both private foundations and governmental agencies have funded numerous projects. Most recently, efforts such as MERLOT and the MIT OpenCourseWare Project (the latest incarnations of "build it and they will come") have received widespread attention. Most of those involved in developing and funding these efforts have defined the problem merely as a lack of technology-based materials. They seek to create repositories whose contents are vouched for by institutional reputation (MIT) or by individual faculty contributors (MERLOT). The assumption is that providing access to these materials will cause individual faculty to use them in redesigning their courses.
This approach has several drawbacks. Entries are selected and mounted by interested individuals, but the materials are not tied to improved student learning outcomes. Many of the included learning objects are intended for specific (and possibly unique) upper-division courses that are not necessarily part of the curricula at other institutions. Other materials are designed for sophisticated students and may not be relevant to a more diverse student body at other institutions. In addition, these projects tend to assume that more options are always better. MERLOT cites "links to thousands of learning materials" as one of its benefits, yet only a tiny subset has been evaluated by anyone other than the contributors. Most importantly, these projects lack a methodology for transfer to other institutions. Their hope-for-the-best strategy has been tried many times in the past and has failed (e.g., programs supported by Apple and IBM in the 1980s and 1990s, and attempts by national organizations like Educom).
To respond to the need for high quality, interactive learning materials, our strategy is fundamentally different. First, we intend to build upon a base of materials that has been tested with large numbers of students at multiple institutions and has demonstrated statistically significant increases in student learning. Second, establishing repositories of materials that have led to increased learning is a good first step, but it is not enough to ensure that they will be used. One must have a transfer methodology. The academic practices will consist of faculty members and others with substantial experience in how to use the materials. For example, our experience has shown that supplementing classroom experience with low-stakes quizzes (the materials) may lead to increased learning, but by giving points for student participation (how to use them), instructors can increase student learning substantially. Thus, we will also embed a method of transfer by pairing experienced institutions with less experienced institutions to ensure that the materials will be used successfully.