The Carnegie Foundation for the Advancement of Teaching and the American Council on Education (ACE) have partnered to propose two methodologies for achieving a universal measurement of socioeconomic mobility across American higher education institutions. The importance of accurate data reporting cannot be overstated, as discussions of socioeconomic mobility, workforce development, performance-based funding, and student debt all revolve around one central concern: student performance.
Practitioners, legislators, students, and their families must be able to forecast student outcomes after attending a postsecondary institution. Universities must be held accountable for developing holistic matriculation structures that yield a return on investment. These discussions have gained importance alongside efforts to make colleges and universities more transparent to students deciding where to attend, such as the College Scorecard created during the Obama administration.
The report's introduction details how data are reported to the federal government through two distinct channels at the U.S. Department of Education (ED): the Integrated Postsecondary Education Data System (IPEDS) and the National Student Loan Data System (NSLDS). These two systems collect information in an aggregated format (NSLDS) and a disaggregated format (IPEDS). For instance, IPEDS collects campus-level data such as enrollment, net cost, cohort trends, retention rates, and graduation rates, even when one institution operates multiple campuses; each campus self-reports its data to IPEDS.
NSLDS reports average salary outcomes alongside student loan data accumulated during enrollment. The difference is that this system combines the information for all of an institution's campuses, even when those campuses have different enrollment sizes and net costs of attendance. The report illustrates this with two universities: Rutgers University, with three campuses, and Kent State University, with eight. The self-reported information for each university varies depending on the data system.
This presents a challenge and can generate inconsistent, misleading data when determining average salary outcomes for students, because NSLDS's current method assigns the same salary figure to every campus of an institution. Thus, the Carnegie Foundation and ACE offer two additional approaches.
The first method uses data from NSLDS alone as the sole determinant of socioeconomic outcomes and average salary, from which an institution's potential return on investment for students is derived; IPEDS data would be disregarded in favor of the aggregate. The second method applies a weighted average to the campus-level metrics reported in IPEDS, averaging across an institution's campuses and combining those averages to determine salary outcomes, socioeconomic mobility, and return on investment.
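To make the contrast concrete, here is a minimal sketch of an enrollment-weighted average. The campus names, enrollments, and salary figures below are hypothetical illustrations, not data from the report; the point is simply that a weighted average lets larger campuses count proportionally more, rather than assigning one aggregate figure to every campus.

```python
# Hypothetical campus-level figures (illustration only; not report data).
# Each tuple: (campus name, enrollment, average reported graduate salary).
campuses = [
    ("Campus A", 40_000, 62_000),
    ("Campus B", 12_000, 51_000),
    ("Campus C", 8_000, 48_000),
]

total_enrollment = sum(enrollment for _, enrollment, _ in campuses)

# Enrollment-weighted average: each campus contributes in proportion
# to how many students it actually enrolls.
weighted_avg = (
    sum(enrollment * salary for _, enrollment, salary in campuses)
    / total_enrollment
)

# Simple unweighted average of the campus figures, for contrast.
unweighted_avg = sum(salary for _, _, salary in campuses) / len(campuses)

print(f"Weighted average salary:   ${weighted_avg:,.0f}")
print(f"Unweighted average salary: ${unweighted_avg:,.0f}")
```

In this sketch, the flagship campus's larger enrollment pulls the weighted figure above the simple average of the three campus numbers, which is exactly the distortion a single institution-wide figure hides.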
With either approach, because there are two different reporting systems, there must be a shared methodology that all schools can follow for socioeconomic outcomes. Given the value of the IPEDS data, it makes sense to combine the campuses' self-reporting efforts, which the weighted-average approach accounts for: it considers the individual campuses as well as the institution as a whole. This yields a more complete picture that reflects institutional type, geographic location, enrollment patterns, and student profiles. Such data are valuable to a student choosing a college, rather than relying on the fragility of peer-to-peer ranking systems, which only recently began to account for social mobility.
Every college and university must carefully consider how it demonstrates these outcomes, which depends largely on financial aid packaging, career development outcomes, and the application of college student development theory. Implementing these data measurements will soon be crucial to any institution's longitudinal story and admissions projections.
Thank you for reading and sharing this summary article. Be sure to subscribe to felderofficial.com for weekly insights, released every Friday.
Rebuttals are always welcome,
Jade M. Felder, Ed.D.