
The Learning MarketSpace, March 1, 2002

Written monthly by Bob Heterick and Carol Twigg, The Learning MarketSpace provides leading-edge assessment of and future-oriented thinking about issues and developments concerning the nexus of higher education and information technology.


The results of the National Association of College and University Business Officers (NACUBO) study to explain college costs are available on their web site. Three years in development, the study describes a uniform methodology for colleges and universities to explain to the public how much they spend to educate undergraduates.

Undertaken in response to a report by the National Commission on the Cost of Higher Education, Straight Talk About College Costs and Prices, which called upon colleges and universities to provide the public with more detailed financial information, the methodology is designed to permit institutions of whatever size, public or private, to show clearly the costs of undergraduate education. The Chronicle of Higher Education quotes James E. Morley, Jr., NACUBO president, as saying, "By increasing the understanding of their finances, colleges and universities can create a clear context in which to explain tuition adjustments and call attention to rising costs and/or changes in government support. . . . Absent such information, students, parents, the media, policy makers and the public will continue to be uninformed, and, even worse, they will make their own assumptions about what's going on."

As the NACUBO report shows, but doesn't amplify, most people overestimate the price of tuition but underestimate the cost of an undergraduate education. I suppose that being reminded that the cost exceeds the price should somehow make us feel better, if not good, about the economics of higher education. As the Chronicle article observes, "In a study of the new methodology, conducted at 150 colleges, the association found that almost all colleges spend more to educate their students than they charge in tuition. Of the colleges that the group examined, the community colleges spent $3,000 to $7,000 per student more than they charged their students; the public four-year colleges spent between $4,000 and $11,000 more than they charged; and the private four-year institutions spent up to $20,000 more than they charged."

The Report doesn't attempt to deal with the fact that the excess of cost over price is covered through taxation in public, and to some extent private, institutions and through private philanthropic gifts. Ultimately, the consumer--student if you prefer--pays something close to the full cost, not just the price. Naturally, institutions of higher education are much more comfortable talking about price rather than cost. Even when confronted only by the price, an increasing number of students require student loans to afford a college education. Colleges and universities rely on the implicit understanding that, however high the tuition expense to the student, the average return in the form of increased lifetime earnings more than justifies the expense. That comparison doesn't look so attractive if it is made to cost rather than price.

The impact of the bursting high-tech bubble in the stock market (still down about 20 percent from its high of over a year ago) and the fallout from the September 11 terrorist attacks have left the economy in recession. The impact on revenue in a number of states is stunning. As they scramble to balance dramatically out-of-balance budgets, the impact on public college and university tuition is predictable. The double-digit tuition increases of the 1980s are likely to be seen again in a number of states.

The Report attempts to downplay the comparisons between institutions and institutional types. That hardly pacifies the average citizen who is always inclined to ask, "Why is the price of a calculus course at one institution two, five, even ten times the cost at another?" "Does the student learn ten times as much calculus at one as at the other?" And, they ask, "In just what way is a calculus course today different from the one taught 20 years ago?"

The study also points out that the increase in the price of tuition has surpassed the inflation rate for the past ten years. They could as well have observed that it has done so for the past twenty years. Clearly, the explanation problem for higher education is difficult. Somehow, the average fellow on the street hasn't observed the price of a personal computer, or the price of a new car, doubling every ten years or so. Quite the contrary. If the price isn't going down, at least the quality is demonstrably improving.
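The arithmetic behind that comparison is easy to make explicit: a price that doubles every ten years implies an average annual increase of about 7.2 percent, and two such decades quadruple the price. A minimal sketch (the doubling period is the only input; the figures are illustrative, not drawn from the NACUBO study):

```python
def annual_rate(doubling_years):
    """Average annual growth rate implied by a given doubling period."""
    return 2 ** (1 / doubling_years) - 1

rate = annual_rate(10)                   # roughly 7.2 percent per year
twenty_year_multiple = (1 + rate) ** 20  # two doublings: a fourfold price
```

Held next to a general inflation rate of two to three percent a year over the same period, the gap the study describes is stark.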

What the NACUBO study doesn't do is address the question of why the price of tuition continues to rise faster than the inflation rate. Nor does it deal with the issue of what higher education should do to reverse this undesirable trend. The study does identify the cause--instruction and student services at community colleges and four-year institutions make up 85 to 87 percent of total costs. The vast majority of those costs are personal services--the salaries and benefits of the faculty and staff involved. Higher education is an extraordinarily personal-services-intensive activity.

But does it need to be? Is there a role for technology in reversing, or at least slowing, the inexorable increase in college tuition? We know one thing for sure: technology, simply "bolted on" to the traditional teaching paradigm, adds cost. The results of the Pew Learning and Technology program have clearly demonstrated that technology, used in conjunction with thoughtful course redesign, can produce significantly reduced instructional costs with no loss in students' retention of the subject material.

Nearly all of the attention and energy of colleges and universities is focused on explaining and rationalizing the increasing cost and price of higher education. Very little attention is devoted to actually doing something about them. The reasons are relatively obvious. Strategies to reduce the cost of the prevailing lecture paradigm are generally considered undesirable by both the institutions and the public. They include the substitution of adjuncts and graduate assistants for full-time faculty and larger class sizes--a direct attempt to reduce labor costs.

Absent serious efforts to deploy an alternative learning paradigm that makes efficient use of information technology, we can expect a new commission to raise the same issues that have been raised over and over again for the past twenty years. Let's hope that NACUBO's next report is titled "Reducing College Costs" rather than "Explaining College Costs."



In the last issue of The Learning MarketSpace, we discussed the fact that many students are not spending sufficient time on task in their college courses. As findings from the National Survey of Student Engagement (NSSE) indicate, many students spend only about half as much time preparing for class as their instructors recommend, about one hour for each class hour instead of two.

Because previous research has demonstrated that time on task may be the single most important predictor of increased learning, an explicit goal of course design should be to identify effective ways to increase students' time on task. We promised to describe some of the ways that faculty from the Pew Grant Program in Course Redesign are achieving the goal of greater student engagement with the help of information technology.

The University of Southern Maine (USM) has moved its introductory psychology course from a lecture-based format to one in which there are many opportunities for active learning. Half of the old lecture format has been replaced with modularized, computer-based activities. USM requires students to complete quizzes online in order to master the material before coming to class. Students are allowed to take quizzes several times, until they receive a satisfactory grade or time runs out. Because items on the quizzes are randomized, students focus on the concepts rather than on the answers to particular questions. Feedback directs students to specific material that they need to review. Repetition and immediate feedback, tools that have repeatedly been documented to facilitate learning, are combined to provide a positive experience for the students and to show them what needs additional study.
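The mechanism USM describes (randomized items, repeated attempts, and feedback keyed to what was missed) can be sketched in a few lines. Everything here is hypothetical: the question bank, the mastery threshold, and the three-attempt limit are illustrative stand-ins, not USM's actual software.

```python
import random

# Hypothetical question bank: each concept has several interchangeable
# variants, so a retake presents different questions about the same idea.
QUESTION_BANK = {
    "memory": [("Define encoding.", "a"), ("Define retrieval.", "b")],
    "conditioning": [("Name Pavlov's paradigm.", "c"),
                     ("Name Skinner's paradigm.", "d")],
}

def draw_quiz(bank, rng):
    """Randomize the quiz: pick one variant per concept."""
    return {concept: rng.choice(variants) for concept, variants in bank.items()}

def grade(quiz, answers):
    """Score the quiz and list the concepts the student should review."""
    missed = [c for c, (_q, key) in quiz.items() if answers.get(c) != key]
    return 1 - len(missed) / len(quiz), missed

def attempt_until_mastery(bank, answer_fn, threshold=0.8, max_attempts=3):
    """Allow retakes until the score reaches the mastery threshold."""
    rng = random.Random(42)  # fixed seed only to make the sketch reproducible
    for attempt in range(1, max_attempts + 1):
        quiz = draw_quiz(bank, rng)
        score, missed = grade(quiz, answer_fn(quiz))
        if score >= threshold:
            break
    return attempt, score, missed
```

A student who knows the material passes on the first draw; one who does not sees fresh variants on each attempt along with a list of concepts to review, which is exactly the feedback loop the paragraph describes.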

Another important element of USM's redesign is the ability to continually monitor student progress. Both the instructor and the students know how they are doing in relation to others in class. Students report that they check their status frequently. Instructors find that this feature helps them identify and work with students who are doing poorly in the class as well as acknowledge the efforts of the best students.

USM's redesign has resulted in a 10% improvement in overall understanding of course content as measured by pre- and post-course assessments. In addition, a smaller percentage (19%) of students in the redesigned course failed, withdrew or received an incomplete compared to those in traditionally taught sections of the course (28%).

Students in redesigned sections report spending more time studying for the course (typically 3-5 hours per week, in contrast to 1-3 hours) than for other introductory classes and traditionally taught sections. This difference was highly significant (t(128) = 12.97, p < .001). Not surprisingly, this increase in effort may well explain why students did better in the redesigned course than in the traditionally taught class.
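For readers unfamiliar with the notation, "t(128) = 12.97" is a two-sample t statistic with 128 degrees of freedom (two groups totaling 130 students under the pooled formula). A minimal computation of that statistic, using made-up weekly study-hour figures rather than USM's data:

```python
import math
from statistics import mean, variance

def two_sample_t(a, b):
    """Pooled two-sample t statistic and its degrees of freedom,
    the quantities behind a report like t(128) = 12.97."""
    n1, n2 = len(a), len(b)
    # statistics.variance uses the n-1 sample formula, as the pooling requires.
    pooled_var = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    t = (mean(a) - mean(b)) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Illustrative weekly study hours (not USM's data): redesigned vs. traditional.
redesigned = [5, 4, 5, 4, 5]
traditional = [2, 1, 2, 2, 1]
t, df = two_sample_t(redesigned, traditional)
```

The larger the t value relative to its degrees of freedom, the less plausible it is that the difference in means arose by chance, which is what the p < .001 conveys.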

In its redesign of Introduction to Statistical Reasoning, Carnegie Mellon University has developed an automated, intelligent tutoring system called SmartLab that monitors students' work as they go through lab exercises. The software provides feedback when students pursue an unproductive path and closely tracks and assesses each individual student's acquisition of skills in statistical inference, in effect providing an individual tutor for each student.

Early assessment results at CMU indicate that students are achieving a level of statistical literacy that was not thought possible prior to the redesign of the course. In the past, students were typically given the statistical method to be applied along with the data and the questions to be answered about the data. Now, students are able to identify the appropriate statistical method to use as well as how to apply it.

A part of the University of Iowa's redesign of its general chemistry course involves the use of a web-based, interactive software package called Mastering Chemistry as a component of student work done outside the classroom. The software allows students any number of attempts and provides feedback during the practice portion. Early assessment results indicate a direct correlation between completion of the homework and higher course grades. An added benefit of the software is that the homework grading is fully automated, thus eliminating the need for TA graders.

Penn State reports that in its redesign of introductory statistics the redesigned class outperformed the traditional class on a test of content mastery by 13% (60% in the traditional class versus 68% in the redesigned classes). In the redesigned classes, students also demonstrated a greater understanding of a number of critical statistical concepts. In addition, the Drop-Failure-Withdrawal rate decreased by 18%. The Penn State team attributes a significant part of their success to the introduction of Readiness Assessment Tests (RATs). Students are regularly tested on assigned readings and homework using RATs, which are short quizzes that probe student preparedness and conceptual understanding.

RATs are given 5-7 times during the course and are graded using machine scanners, so no manual labor is involved. RATs constitute approximately 30% of the student's grade. In addition to motivating students to keep on top of the course material, RATs provide powerful formative feedback to both the students and the faculty member. Assessment of student understanding using RATs has proven to be very effective in preparing students for higher-level activities in computer labs and in detecting areas in which students are not grasping the concepts, thereby enabling corrective actions to be taken in a timely manner.

In each of these instances, thoughtful course design is producing greater student learning. What do these techniques have in common? First, each relies on research-based pedagogical techniques that are known to contribute to better learning. Second, each requires the student to engage in the activity as part of the course grade rather than leaving to chance whether or not students are actually engaged. Third, actual student engagement is tracked so the faculty know whether or not students are meeting the requirement. Fourth, because these techniques are computerized, the improvements do not add to instructor workload. Finally, in each case, these techniques have very little to do with teaching and everything to do with, in NSSE's words, "getting students to participate in activities that are linked to student learning."

Some of you may have read a recent Chronicle of Higher Education article headlined "Online Students Don't Fare as Well as Classroom Counterparts, Study Finds." The article states, "Professors at Michigan State University have found that students who took an economics course online didn't do as well as students who took the same course in a traditional classroom." Buried in the article is a comment that half of the online students reported that they spent 0-3 hours a week studying for class. Perhaps this accounts for their relatively poor performance. We suggest our colleagues at Michigan State follow the lead of the faculty at Carnegie Mellon, Iowa, Penn State and USM and find ways to encourage students to spend more time on task.

There is a saying among aficionados of thoroughbred racing: "It's not how fast you run; it's how you run fast." If your goal is to improve student learning, it's not putting courses online; it's how you put courses online. As we design online learning environments, we need to ask ourselves continually whether we are simply migrating our on-ground teaching approaches online or whether we are taking advantage of the capabilities of information technology to improve student learning.



To subscribe to The Learning MarketSpace, click here.

Archives of The Learning MarketSpace, written by Bob Heterick and Carol Twigg and published from July 1999 – February 2003, are available here.

You are welcome to re-post The Learning MarketSpace on your intranet without charge. Material contained in The Learning MarketSpace may be reprinted with attribution for non-commercial purposes.

Copyright 2002 by Bob Heterick and Carol Twigg.