Cracking the Credit Hour

The basic currency of higher education, the credit hour, lies at the root of many of the problems plaguing America's higher education system: it measures time rather than learning.

Cracking the Credit Hour traces the history of the credit hour, whose origins lie in the Carnegie Unit, a standard developed by the Carnegie Foundation for the Advancement of Teaching at the turn of the 20th century. A credit hour typically represents one hour of faculty-student contact time per week over a fifteen-week semester, and most bachelor's degrees require 120 credit hours.
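
To put those figures in perspective, the short Python sketch below works out the seat time the standard definition implies. The numbers (one contact hour per week, a fifteen-week semester, 120 credits) come from the paragraph above; the variable names are illustrative only, not drawn from the report.

```python
# Back-of-the-envelope arithmetic for the time a credit-hour-based degree certifies.
# Figures follow the standard definition cited above; names are illustrative only.

CONTACT_HOURS_PER_WEEK = 1    # one hour of faculty-student contact per week, per credit
WEEKS_PER_SEMESTER = 15       # a typical fifteen-week semester
CREDITS_PER_BACHELORS = 120   # credits required by most bachelor's degrees

hours_per_credit = CONTACT_HOURS_PER_WEEK * WEEKS_PER_SEMESTER   # 15 contact hours
total_contact_hours = CREDITS_PER_BACHELORS * hours_per_credit   # 1,800 contact hours

print(f"One credit hour represents about {hours_per_credit} contact hours")
print(f"A 120-credit degree represents about {total_contact_hours:,} contact hours of seat time")
```

Nothing in that calculation says anything about what a student actually learned during those hours, which is precisely the report's point.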

As the report notes, the credit hour “was never intended to be a measure of, or proxy for student learning.” Over time, however, the credit hour has taken on enormous importance in everything from setting faculty workloads to determining state and federal funding and an institution’s eligibility for federal student aid.

Even though the federal government has signaled a willingness to move away from the credit hour, “many in the industry still believe that their safest bet, if they want to keep access to federal financial aid, is to do what they have always done: use time to determine credits.”

The report recommends a variety of policy solutions that could help move the U.S. from a time-based higher education system to one based on learning. “If the U.S. is to reclaim its position as the most-educated nation in the world, federal policy needs to shift from paying for and valuing time to paying for and valuing learning,” the report concludes. “In an era when college degrees are simultaneously becoming more important and more expensive, students and taxpayers can no longer afford to pay for time and little or no evidence of learning.”