The Voluntary Framework of Accountability (VFA) is an accountability system created for (and by) community colleges so that colleges can more accurately tell the story of how they serve their students and how they perform as institutions in the context of their communities. Here at EMSI, we applaud such work and would like to offer our support — as well as a few ideas on how we might add some labor market metrics to the assessment.
The demographic, economic, and workforce realities that surround community colleges are often drastically different from those of four-year institutions. Community colleges work in hundreds of distinct regional economies formed by the vast, complex, rapidly changing U.S. economy, and much of their focus is driven by local economic and workforce needs. The community college system is thus a highly varied assembly of postsecondary institutions working to support and develop students based on the unique environments in which they operate.
It is therefore quite difficult to measure these schools with large-scale, national accountability metrics. Not surprisingly, such metrics (the College Scorecard, for example) aren’t widely popular, because they can be inaccurate when applied to a particular community college.
To illustrate — one key criterion in higher-ed accountability metrics is graduation rate. Graduating is, of course, a good thing. However, judging community colleges based on graduation rates is potentially problematic. Why? Because by their very nature, community colleges are open-access, serving significant numbers of non-credit, part-time, and short-term technical students who are targeting specific jobs or careers. Very often the employer at the other end of this training is far more interested in seeing a specific skill than a particular credential.
If non-traditional students (who are increasingly becoming “traditional,” since half of all undergrads are enrolled in community colleges) are meeting their personal or educational goals at the college — transferring, finding a job, or getting a specific skill — then the college should be able to put a check in the “good” column.
Over the years, we have worked with many community colleges that use our data to better understand what programs they should offer to help meet the demands of local industry or specific employers. As those programs take root and flourish, many colleges tell us that students don’t actually complete the programs — because the employers are so pleased with the students’ work inside the programs that they hire the students before they can finish. To us, this sounds like success! For this reason, students are stopping and starting their education in ways that make measurement of completion rates over a fixed number of years quite difficult, not to mention potentially misleading.
The Voluntary Framework of Accountability seems to be just what the doctor ordered, enabling community colleges to tell their story in the context of local needs. These colleges should be judged on their ability to help their students in the context of the region.
If you haven’t read about the VFA yet, please do. Here we have selected a few notable quotes, perspectives, and articles you should check out.
In this first article, Karen Stout, President of Montgomery County Community College (PA), member of the VFA Planning Advisory Committee, and co-chair of the 21st-Century Commission on the Future of Community Colleges, provides key perspective on the potential value of the VFA program:
A few weeks ago, a local reporter called to ask me if I had seen my college’s metrics in the recently released White House scorecard. I had and the metrics were not flattering. But, I also had my college’s data from the Voluntary Framework of Accountability (VFA) because we were one of the 40 original pilot colleges to test the American Association of Community Colleges’ VFA metrics and data definitions. I was able to use our VFA data on student progression to paint a much more comprehensive and inclusive picture of student progression and attainment.
The VFA helped me tell our story—of our full-time and part-time students based on a sufficient cohort timeframe to determine outcomes. It was a story that included the good and the not so good. But, I was able to tell the story in a way that was mission-based and in a way that the reporter was able to understand.
As author Jim Collins states in his now-famous book Good to Great and the Social Sectors, ‘What matters is not finding the perfect indicator, but settling upon a consistent and intelligent method of assessing your output results, and then tracking them with rigor.’ Collins’ words capture the spirit of the emerging metrics in the VFA, especially those that measure workforce and economic development and student learning outcomes, and where additional research and development are required to find the “perfect” indicators.
What the VFA offers right now is a consistent methodology and an intelligent method for all community colleges to demonstrate their value. Full-scale adoption will give community colleges a new and powerful common voice to speak to student success—a voice that is perfectly aligned with our unique missions.
More than any other sector of higher education, community colleges have embraced the use of data, a “culture of evidence,” to provide greater transparency, increase institutional accountability, and, most importantly, improve completion rates for the most diverse student body in academe. With support from Lumina Foundation, close to 200 colleges in 34 states have now committed to what many consider the most significant reform movement in the history of community colleges through participation in Achieving the Dream.
This fall, the American Association of Community Colleges (AACC) will formally launch a national Voluntary Framework of Accountability (VFA), a program that at long last provides metrics reflective of and appropriate to the community college mission. More than 140 colleges are already testing VFA with promising results, and the entire state system in Pennsylvania has adopted the program.
It’s true: community colleges are the leaders in incorporating data into their planning processes so that they can provide the most relevant, labor-market-driven education. Community colleges also occupy a specialized niche that addresses many of the occupations where employers commonly complain of skills gaps — sectors like health care, technology, manufacturing, and energy.
Here at EMSI, we’ve been working with community colleges for more than 10 years, and have found that they are extremely powerful engines for jobs, workforce development, and economic vitality in the regions they serve. This is evidenced by many case studies:
- Karen Stout’s Montgomery County Community College (PA)
- Walla Walla Community College (winner of the 2013 Aspen Prize)
- Monroe Community College (NY)
- Miami Dade College (FL)
- Lakeshore Technical College (WI)
- Montgomery College (MD)
- North Central State College (OH)
Many of the colleges that EMSI serves have been using data for in-depth program planning and research, as well as outreach and communication to help students create the right career vision. In fact, this is very much in line with EMSI’s mission for community colleges — to use data to respond to labor market realities and to help students understand how a college’s programs connect to good jobs.
And so, to better align our work with what is being done with the VFA, we have two additional thoughts related to how external labor market analysis might support and propel the VFA even further.
1. Labor Market Context
First, supporting analysis of the regional labor market could provide colleges with critical economic insight and context. Every region and every college service territory is different, but their uniqueness can be measured using the labor market characteristics that define the particular economy in which the college operates. Imagine, for instance, that a college in suburban Chicago, with its specific mix of programs created to meet the needs of its region, were suddenly picked up and dropped in rural Washington State. Would it perform as well? Probably not, because it wouldn’t suit the specific needs of the businesses and employers in that community. Using a program-to-occupation map to paint a regional picture for each college, and then relating that picture to the students who complete their education and enter the workforce (sometimes ahead of time!), could provide this valuable context. Further, projected regional demand for certain occupations could be compared to the completers a college has produced.
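To make the idea concrete, here is a minimal sketch of that kind of program-to-occupation comparison. Everything in it — the program names, SOC codes, completer counts, and projected openings — is hypothetical placeholder data, not EMSI data or the VFA’s methodology; it simply shows the shape of the calculation.

```python
# Illustrative sketch: compare a college's program completers with projected
# regional occupational demand. All program names, SOC codes, and numbers
# below are hypothetical placeholders.

# Hypothetical map from college programs to the occupations they feed (SOC codes)
program_to_occupations = {
    "Practical Nursing": ["29-2061"],        # Licensed Practical Nurses
    "Welding Technology": ["51-4121"],       # Welders, Cutters, Solderers
    "Network Administration": ["15-1244"],   # Network & Computer Systems Admins
}

# Hypothetical completers by program for one academic year
completers = {
    "Practical Nursing": 45,
    "Welding Technology": 30,
    "Network Administration": 20,
}

# Hypothetical projected annual regional openings by occupation (SOC code)
projected_openings = {
    "29-2061": 60,
    "51-4121": 25,
    "15-1244": 35,
}

def demand_gap_report(programs, completions, openings):
    """Return, per program, projected regional openings minus completers.

    A positive gap suggests regional demand exceeds the supply the college
    is producing; a negative gap suggests possible oversupply.
    """
    report = {}
    for program, socs in programs.items():
        demand = sum(openings.get(soc, 0) for soc in socs)
        supply = completions.get(program, 0)
        report[program] = demand - supply
    return report

print(demand_gap_report(program_to_occupations, completers, projected_openings))
# → {'Practical Nursing': 15, 'Welding Technology': -5, 'Network Administration': 15}
```

In practice the program-to-occupation map would come from something like a CIP-to-SOC crosswalk, and the demand figures from regional projections — but even this toy version shows how regional context could reframe completion numbers.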
2. Growth in Earnings
Next, external labor market analysis could be helpful in examining the increased earnings gained by students who participate in a college’s programs, highlighting the potential earnings growth that the college helps generate. Breaking this data out by program and occupation can help colleges better convey their unique value to their region, and placing it back in the context of the regional economy would clarify the college’s performance. This is potentially the biggest and most valuable perspective that labor market analysis could bring to a community college.
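A sketch of that earnings-growth breakdown by program might look like the following. The wage figures are invented for illustration only; a real analysis would draw on actual pre-enrollment and post-completion earnings data.

```python
# Illustrative sketch: earnings gains attributable to program completion,
# broken out by program. All wage figures are hypothetical placeholders.

# Hypothetical median annual earnings before enrollment and after completion
earnings_by_program = {
    "Practical Nursing":      {"before": 24000, "after": 44000},
    "Welding Technology":     {"before": 26000, "after": 41000},
    "Network Administration": {"before": 28000, "after": 52000},
}

def earnings_gain(earnings):
    """Compute absolute and percentage earnings growth per program."""
    gains = {}
    for program, e in earnings.items():
        delta = e["after"] - e["before"]
        gains[program] = {
            "gain": delta,
            "pct_growth": round(100.0 * delta / e["before"], 1),
        }
    return gains

for program, g in earnings_gain(earnings_by_program).items():
    print(f"{program}: +${g['gain']:,} ({g['pct_growth']}% growth)")
```

Summed across programs and completers, figures like these are what would let a college express its contribution to the regional economy in dollar terms.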
We will continue to think about this and see how labor market analysis can complement this work. For now, please feel free to reach out if you have thoughts, questions, or comments.