Educational Accountability: Using Practitioners’ Expertise

In my recent post about Opportunity America’s paper recommending higher education accountability measurements, I noted that the framework proposed by its research panel needed a lot more substance before I could support it.

I was particularly troubled by the notion of using “completion” as a required accountability tool without any definition of completion. At a minimum, the framework should specify which students are counted in the denominator and which students are counted in the numerator.
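To see why those definitions matter, here is a minimal sketch in Python. The records and the two definitions are hypothetical – nothing here comes from the Opportunity America paper – but they show how the same data yields different “completion” rates depending on who counts in the numerator and the denominator:

```python
# Hypothetical enrollment records for illustration only.
records = [
    {"completed_here": True,  "transferred_out": False},
    {"completed_here": False, "transferred_out": True},   # may have finished elsewhere
    {"completed_here": False, "transferred_out": False},
    {"completed_here": True,  "transferred_out": False},
]

# Definition 1: numerator = completed at this institution,
# denominator = everyone who ever enrolled.
rate_1 = sum(r["completed_here"] for r in records) / len(records)

# Definition 2: same numerator, but transfers are removed from the
# denominator on the theory that the institution should not be
# penalized for students who left in good standing to finish elsewhere.
stayers = [r for r in records if not r["transferred_out"]]
rate_2 = sum(r["completed_here"] for r in stayers) / len(stayers)

print(f"Definition 1: {rate_1:.0%}")  # 50%
print(f"Definition 2: {rate_2:.0%}")  # 67%
```

Same institution, same students, a 17-point swing – which is exactly why an accountability framework has to pin these definitions down.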

Thinking about completion measurements reminded me of a blog post I wrote in February 2015 about a white paper produced by an Education Working Group convened by Servicemembers Opportunity Colleges (SOC). SOC is a Department of Defense contract managed for the department by the Defense Activity for Non-Traditional Education Support (DANTES).

What’s obvious from reading the white paper, as well as my synopsis of it, is that the working group consulted extensively with other accountability organizations and with the stakeholders and constituents of active-duty military voluntary education.

In a section titled “Introduction to the Military Student,” the group noted that it is rare for a servicemember to be both active-duty military and a full-time, first-time student. That observation sets the stage: the data traditionally collected from higher education institutions by the Department of Education was unlikely to report accurately on the activities of students fitting this profile.

The group wrote that military students behave differently than other non-traditional adult populations. Because of deployments and the rapid pace of operations in theater (engagement with the enemy), it is difficult for military students to predict when it will be a good time to start a course or whether they will be able to complete it on time.

In addition, some military students are under-prepared for college because they did not complete a college preparatory track in high school. Data from larger institutions serving the military indicates that the average military student attends three or more colleges before earning an undergraduate degree. Military students often stop out and continue college later.

The group wrote that the key issue is how to appropriately define the cohort for military-serving institutions, because the Integrated Postsecondary Education Data System (IPEDS) only tracks first-time, full-time degree-seeking freshmen. They further noted that military students do not typically start their college education as full-time freshmen, or necessarily with the goal of pursuing a degree.
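As a rough illustration of why that matters – the field names and the sample profile below are my own simplifications, not IPEDS specifications – an IPEDS-style cohort filter drops the typical military student before any outcome is ever measured:

```python
def in_ipeds_cohort(student: dict) -> bool:
    """Simplified IPEDS-style graduation-rate cohort: first-time,
    full-time, degree-seeking students only."""
    return (student["first_time"]
            and student["full_time"]
            and student["degree_seeking"])

# A typical active-duty profile as the white paper describes it:
# prior credit from military training, part-time enrollment around
# deployments, and no declared degree goal at entry.
military_student = {
    "first_time": False,      # ACE-evaluated military training credit
    "full_time": False,       # enrolled part-time around deployments
    "degree_seeking": False,  # trying out a course, not yet a degree
}

print(in_ipeds_cohort(military_student))  # False -> invisible in the rates
```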

Consequently, defining a cohort appropriate to the measurement of persistence and graduation of military students must consider several factors unique to military students:

  • There is a fundamental difference between the persistence and graduation rates of online/distance education programs and those of traditional delivery methods (this point was made in 2012, so I am not sure how large the gap is today), paralleling the differences among all types of institutions.
  • Military training and service school credit may be accepted as college credits based on the American Council on Education’s “Guide to the Evaluation of Educational Experiences in the Armed Services.”
  • Like adult students in general, many military students enroll in a course offered through a distance education institution “to try out” online education. Some later find that they prefer to take their early courses face-to-face at a nearby institution.
  • Military deployments throughout the nation and the world expose servicemembers to many military-serving institutions, increasing the likelihood of their attendance at multiple institutions en route to graduation.
  • The increased use of government-sponsored online portals that facilitate enrollment, registration, tuition assistance disbursement, and degree planning, such as the GoArmyEd portal (recently replaced by the Army’s ArmyIgnitED portal), allows students to estimate the time it will take to earn a degree and allows the military services to maximize tuition assistance.
  • A good number of students enrolled in non-selective colleges and universities face significant educational challenges derived from inadequate primary and secondary educational preparation.
  • The outcome of these and other factors unique to military students is that, by the time they graduate, military students are likely to have attended more than five institutions.

This swirl of students, explained in part by the factors above, is not necessarily bad, the group wrote. However, it raises a key issue: at what point is it “reasonable to expect that it is the intention of the student to complete a degree at a given institution?”

The group arrived at a logically derived recommendation that defined committed students as those completing 15 credit hours and transferring 9 credit hours. Both the paper and my blog article provide an excellent summary of the working group’s recommendations. I won’t repeat them here, but I recommend that you read both.
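Reading the recommendation’s “and” literally, the committed-student test translates into something like the following sketch (the thresholds come from the recommendation as stated above; the function and the sample numbers are my own illustration):

```python
def is_committed(completed_hours: int, transfer_hours: int) -> bool:
    """Committed-student test per the SOC working group's recommendation
    as summarized above: 15 credit hours completed and 9 transferred in."""
    return completed_hours >= 15 and transfer_hours >= 9

# A servicemember with 18 hours earned locally and 12 hours transferred
# in from a previous duty station enters the measurement cohort; a
# student sampling a single trial course does not.
print(is_committed(completed_hours=18, transfer_hours=12))  # True
print(is_committed(completed_hours=3,  transfer_hours=0))   # False
```

Only once a student crosses that threshold is it reasonable to hold the institution accountable for that student’s completion.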

The point I want to make is this: the SOC working group, in analyzing and assessing appropriate measurements of the persistence and graduation success of institutions serving active-duty servicemembers, thought harder about the difficulties of producing data comparable across institutions than the Opportunity America working group did.

When I examined the working group membership list for the SOC white paper, I noticed that:

  • Six members worked for colleges whose students were largely servicemembers.
  • Three worked for the Department of Defense.
  • Three worked for academic organizations.
  • One was a lawyer specializing in education regulations.

When I looked at the Opportunity America working group on for-profit colleges, I noted that:

  • Two worked at for-profit institutions.
  • One worked for a higher education consulting firm.
  • Two worked at research institutes.
  • Two were professors at universities with some knowledge of higher education regulations.
  • Two were from higher ed think tanks.

The introduction noted that some members dropped out of the project before it was completed. Even so, it’s clear that the SOC working group drew on more observations from experienced professionals – people serving in or working for the military – than the Opportunity America group did. The Opportunity America group consisted more of academics than of individuals from the for-profit sector or administrators at other colleges.

Fluff is not better than substance. Rhetoric is the fluff of politicians, not researchers. People who intend to create bipartisan recommendations should engage with experienced practitioners rather than rely on researchers whose biases may favor norms and averages over the distinct clusters of students within a specific group.

The SOC white paper provides an excellent example of how measurement differs for a sub-segment of working adult students, specifically active-duty servicemembers. If our goal is accountability, we need less of a one-size-fits-all solution and more solutions tailored to students whose attributes differ from those of the 18- to 22-year-old full-time student that policymakers love to cite as the benchmark for all.
