[Note: Dr. David Becher, VP of Institutional Research at APUS, collaborated with me on this article. He joined APUS in 2006 as a member of the IT department, where he was responsible for institutional reporting, and in 2011 helped create the Office of Institutional Research, where he now oversees institutional metrics, analytics, external reporting, and university research. His research focuses on student persistence, progression, and completion, as well as student engagement and mentoring.]
On September 12, 2015, the White House released its long-awaited College Scorecard and, much like other ranking and comparison tools available to students, the Scorecard came up short in representing all institutions fairly. While it may have been built with the latest mobile technology for easier access, its data do not accurately portray many institutions, including those serving non-traditional students or those where most students do not use federal student aid (FSA) to cover tuition. This is particularly distressing given that the stated intention was to increase transparency for prospective students by providing “clear, reliable data.” In some cases the data are woefully misleading or simply absent. If an institution presented data this carelessly, the Department of Education (ED) could penalize it, so why should ED hold itself to a lower standard?
Starting with graduation rates, we looked up the Scorecard data for American Public University System (APUS). The specified rate is 31 percent, which is based on the percentage of first-time, full-time students using FSA who graduated from APUS. What the Scorecard doesn’t say is that there was a total of 35 first-time, full-time students at APUS during the two reporting periods used by ED for compiling the data. Those 35 students represent roughly 0.1 percent (yes, the decimal place is correct) of the approximately 30,000 students included in our 12-month enrollment average reported to the Integrated Postsecondary Education Data System (IPEDS) for the same two reporting periods.
According to IPEDS¹, in Fall 2013, only about 48 percent of entering degree-seeking undergraduate students across all institutions (a population of almost five million) were classified as both first-time and full-time. Stated differently, of the 3,710 institutions listed in IPEDS with at least one entering student in fall 2013, only 63 (fewer than two percent) enrolled exclusively first-time, full-time students. Thus, whether the posted graduation rate is 31% or 100% is irrelevant; the rate is unlikely to accurately represent the quality of the institution, because a first-time, full-time student is atypical at an institution like APUS. Working adults make up 89% of APUS’s enrollment, and our students generally do not have the time to attend college full-time. IPEDS excludes most, if not all, of this population from the graduation rate it calculates, and that flawed rate is then used by the Scorecard and other comparison and ranking tools meant to “inform” all consumers.
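The cohort arithmetic behind this complaint is easy to verify; here is a minimal sketch in Python, using the approximate figures stated in the text:

```python
# Approximate figures from the text above
first_time_full_time = 35   # first-time, full-time students over the two reporting periods
avg_enrollment = 30_000     # approximate 12-month enrollment average reported to IPEDS

cohort_share = first_time_full_time / avg_enrollment * 100
print(f"Graduation-rate cohort: {cohort_share:.1f}% of enrollment")  # ≈ 0.1%
```

A graduation rate built on a tenth of one percent of the student body says essentially nothing about the other 99.9 percent.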
Russ Poulin, director of policy and analysis for the WICHE Cooperative for Educational Technologies, has raised a similar point. He looked at the Scorecard graduation rates for the following adult-serving institutions: Empire State (28%); Western Governors (26%); University of Maryland University College (4%); Charter Oak (no data); and Excelsior (no data). His conclusion in each case: the data do not reflect the composition of the institution’s student body. Beyond that, the line drawn on the Scorecard to indicate whether an institution’s graduation rate is above or below the national average is misleading as well, because it represents an average across all institutions regardless of student body. APUS’s 31 percent rate is above that of all the institutions listed previously but below the national average for all institutions, a comparison that is irrelevant because it does not reflect whom we serve.
If those who created the Scorecard have a true interest in correctly representing the graduation (or completion, to use another IPEDS term) rates for adult-serving institutions, they should read the white paper published by the Servicemembers Opportunity Colleges (SOC) in 2012. One of us wrote a blog post on the topic in February 2015. APUS publishes a graduation rate based on the white paper’s recommendations, and other institutions do so as well. The paper’s recommendations for measuring student outcomes are more representative of our working adult population than the Scorecard’s approach. There may be similar papers that address approaches to measuring graduation rates for community colleges.
The Scorecard also misleads prospective students by stating “Data not available” for some measures at some institutions without explanation. One such area for APUS is the “Salary After Attending” metric. Perhaps ED did this because the number of first-time, full-time graduates it measures is low. Perhaps it did so because it measures only FSA students, and APUS has not yet participated in the Title IV program for the full 10 years the metric covers. However, because APUS is obligated to participate in ED’s mandatory Gainful Employment reporting, related data should be available for APUS graduates who received federal student financial aid. In addition, more than 28,000 students have completed our undergraduate programs over the past five years, over 80% of them without ever utilizing FSA. By failing to present complete data, ED is misleading prospective students.
The College Scorecard contains data for several other areas as well:
- Costs
The Scorecard shows that at APUS the average annual net price for financial aid students was $9,095, well below the national average of $16,789. Net price is the cost of attendance minus grant and scholarship aid from the school and from state and federal governments. Between 2000 and 2015, APUS’s undergraduate tuition was $250 per credit hour. It increased to $270 per credit hour in July 2015, with a grant offered to military and veteran students so that tuition assistance from the Departments of Defense and Veterans Affairs fully covers their remaining tuition costs. Keeping our tuition at or below the reimbursement rates for active-duty military and veterans means that many of them are able to graduate debt-free.
- Financial Aid & Debt
The Scorecard reports that 27 percent of APUS students receive federal student loans. This statistic is accurate, and it accounts for one of the few compliments we can offer the College Scorecard. The Scorecard notes that at schools where few students use federal loans, the typical undergraduate may leave school with no debt, which accurately describes most current APUS students. Nevertheless, the Scorecard reports $17,311 as the typical total debt for APUS. The problem is that this data point represents only undergraduate borrowers who complete APUS programs; it does not represent the total debt of an average graduate, because so many of our students do not need to borrow. The Scorecard also reports a typical monthly loan payment of $192 for APUS, calculated on the assumption that our completers repay their loans over 10 years at six percent interest. Again, the data are derived from only the 27% of our graduates who received federal student loans.
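The $192 figure is consistent with the standard fixed-rate amortization formula applied to the debt, term, and interest rate cited above; this is a minimal sketch, and we are assuming (not confirming) that this is the formula ED uses:

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Fixed-rate amortized monthly payment: P * r / (1 - (1 + r)**-n)."""
    r = annual_rate / 12           # monthly interest rate
    n = years * 12                 # total number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# Scorecard figures cited above: $17,311 typical debt, 10-year term, 6% interest
payment = monthly_payment(17_311, 0.06, 10)
print(f"${payment:.0f}/month")     # ≈ $192/month, matching the Scorecard figure
```

The arithmetic checks out, but it still describes only the minority of graduates who borrowed at all.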
- Graduation & Retention
The Scorecard calculates our retention rate for first-time, full-time students who returned after their first year: 84%, versus the national average of 67%. While this is a good rate, we are not confident it represents our overall retention rate, because it is based on only 224 first-time, full-time students, less than one percent of the fall enrollment we reported to IPEDS.
- Earnings after School
The “Percentage Earning above High School Grad” metric shows that 82% of our former students (not graduates) earn more than $25,000 six years after they first enroll. This number is misleading because it represents only students who ever received FSA and leaves out everyone else who attended APUS (in our case, most current and former students). Once again, reporting a number based on data from less than 30 percent of our students is misleading. Many of our graduates and former students who are veterans or active-duty military are older and likely to earn more than $25,000 six years after enrollment.
In addition, the metric includes every student who ever received FSA for at least a 30-day period of enrollment. This “throw everyone into the barrel” approach is misguided. Our preference would be for the Scorecard to adopt the recommendations from the previously cited SOC white paper and include only students who have successfully completed 15 credit hours, a single semester for a full-time undergraduate student. Online programs are much more likely to enroll students who transfer from one institution to another before they decide upon their “home” institution where they ultimately earn a degree. Online education is not a good fit for everyone, and including those students who are uncomfortable with the online format and do not complete at least 15 credit hours does not fairly reflect potential earnings based on program completion at APUS.
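The SOC-style cohort filter we recommend could be sketched as follows; the student records and field layout here are hypothetical, purely for illustration:

```python
# Hypothetical records: (credit_hours_completed, earns_above_25k)
former_students = [(3, False), (15, True), (45, True), (6, False), (30, True), (24, True)]

MIN_CREDITS = 15  # one full-time semester, per the SOC white paper recommendation

# "Throw everyone into the barrel": every FSA recipient counts, however briefly enrolled
all_rate = sum(earns for _, earns in former_students) / len(former_students)

# SOC-style cohort: count only students who completed at least 15 credit hours
cohort = [earns for credits, earns in former_students if credits >= MIN_CREDITS]
soc_rate = sum(cohort) / len(cohort)

print(f"All former students: {all_rate:.0%}; 15+ credit cohort: {soc_rate:.0%}")
```

Even on toy data, the two inclusion rules produce different pictures of the same institution, which is exactly the point: who counts in the denominator determines what the metric says.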
The Scorecard’s “Salary After Attending” metric is based on data for FSA recipients 10 years after enrollment, and it shows “Data not Available” for APUS. The missing piece of information in this section of the Scorecard is that APUS has not yet participated in the FSA program for a full 10 years. Once again, relying solely on data from FSA recipients is misleading, because prospective students viewing the Scorecard are given no context for why “data [are] not available.”
The Scorecard classifies APUS as a large institution with a total of 42,460 undergraduate students, based on our fall 2013 enrollment report to IPEDS. Of these students, approximately eight percent were full-time. This count is much larger than the 35 students (0.1 percent) used to calculate graduation rates because, for this metric, IPEDS requests data on all full-time students, not only those who are both first-time and full-time. The Scorecard also shows that 92% of APUS students were part-time for this period.
The Scorecard’s Race/Ethnicity breakdown for APUS is set out below. It reports “Socio-Economic Diversity” of 31 percent for APUS, representing the percentage of our students with a family income of less than $40,000 who receive Pell Grants. This percentage is based on the same IPEDS cohort of 42,460 degree/certificate-seeking undergraduate students enrolled in fall 2013.
- 54% White
- 24% Black
- 10% Hispanic
- 4% Unknown
- 3% Two or more races
- 2% Asian
- 1% Native Hawaiian/Pacific Islander
- 1% Non-resident alien
- 1% American Indian/Alaska Native
SAT/ACT scores – This Scorecard metric is not applicable to APUS because we do not require applicants to submit such information to us, so we do not have it to report to IPEDS. As indicated above, many of our students are working adults who attend APUS 10 or more years after graduating from high school.
Finally, the Scorecard lists our “Most Popular Programs” based on the share of degrees awarded.
- Homeland Security, Law Enforcement, Firefighting and Related Protective Services (20%)
- Business, Management, Marketing and Related Support Services (19%)
- Liberal Arts and Sciences, General Studies and Humanities (16%)
- Multi/Interdisciplinary Studies (10%)
- Computer and Information Sciences and Support Services (9%)
The Scorecard also shows the list of Available Areas of Study, which include:
- Area, Ethnic, Cultural, Gender and Group Studies
- Business, Management, Marketing and Related Support Services
- Communication, Journalism and Related Programs
- Computer and Information Sciences and Support Services
- Education (Note: APUS does not offer an undergraduate education degree)
- Engineering Technologies and Engineering-Related Fields
- English Language and Literature/Letters
- Family and Consumer Sciences/Human Sciences
- Health Professions and Related Programs
- Homeland Security, Law Enforcement, Firefighting and Related Protective Services
- Legal Professions and Studies
- Liberal Arts and Sciences, General Studies and Humanities
- Military Technologies and Applied Sciences
- Multi/Interdisciplinary Studies
- Natural Resources and Conservation
- Parks, Recreation, Leisure and Fitness Studies
- Philosophy and Religious Studies
- Social Sciences
- Transportation and Materials Moving
For the reasons described above, anyone who uses the Scorecard to review data about APUS will not obtain data that accurately reflect costs and outcomes for most of our current students and alumni. Our internally calculated graduation rate is higher than the 31 percent published in the Scorecard because we include all of our students, many of whom are not full-time and many of whom transfer in credits from other institutions (meaning they are not first-time either).
In sum, schools that operate totally or mostly online and serve working adults are poorly represented by IPEDS and the Scorecard. The Scorecard’s salary data are limited to FSA recipients and do not reflect our typical graduate, who is not an FSA recipient.
Unfortunately, many consumers assume that government data are accurate and representative. Even if you read the fine print for the Scorecard, it’s almost impossible for the typical consumer to understand why the data may or may not be complete or accurate for someone like them. We encourage institutions whose data are not presented fairly to provide similar examples. While we may not see changes in the Scorecard any time soon, hopefully more people will realize that its design is based on assumptions that do not accurately represent all institutions of higher education.
¹ U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics.