Friday, June 10, 2011

US DOE Committee defines community college success

Redefining Community College Success
June 6, 2011
WASHINGTON — An Education Department committee last week further honed its recommendations for how to overhaul the way the government measures the success of community colleges.

Though there was general agreement among the panel's members on crafting completion measures that, for example, count students who successfully transfer to four-year institutions as well as those who earn associate degrees, there was strong disagreement about whether the government should require community colleges to report their students' employment outcomes.
More: http://www.insidehighered.com/news/2011/06/06/u_s_panel_drafts_and_debates_measures_to_gauge_community_college_success

In its February 2011 meeting, the committee looked at the following:

FOR DISCUSSION ONLY
Working Group on Progression & Completion Measures
Draft Report
January 27, 2011
Note: The charter of the U.S. Department of Education’s Committee on Measures of Student Success (Committee) provides for working groups to assist the Committee in carrying out its duties. The working groups are responsible for developing materials to be provided to the entire Committee for full deliberation and discussion during its meetings. This draft document has been prepared by a Committee working group. This document does not represent the final recommendations of the Committee. The information and opinions included are the products of working group discussions and do not necessarily represent the views of the entire Committee or the policies of the U.S. Department of Education.
Tasks:
The CMSS Working Group was given the directive to:
• Prioritize major issues related to progress & completion measures
• Identify areas for potential recommendations
Working Group Members:
• Patrick Perry, Chair
• Wayne Burton
• Margarita Benitez
Domain of the Issue:
• Federal data collection instruments (primarily focused on IPEDS instruments: Graduation Rate Survey, Fall Enrollment Survey, Completions)
• All two-year institutions (public, private, for-profit)
Issues Identified by the Working Group (with Suggested Options for Consideration):
Significant issues exist with the current completion rates methodology. In examining the literature on these topics, the definitions currently being weighed and adopted by external evaluators, and the recent NPEC focused study, "Suggestions for Improving the IPEDS GRS," some common themes and recommendations have emerged and are worthy of discussion. The priority areas are listed below, along with potential recommendations.
• Graduation Rate Survey (GRS): Defining first-time. This is generally clear from the instructions (the student should be first-time to higher education), but needs clarity in practice (how long of a stop-out should be valid, use of the National Student Clearinghouse (NSC) to determine first-time freshman (FTF) status). While there seems to be little issue with the current definition of "first-time anywhere in higher education," there is ambiguity, and an uneven playing field, in how this status is captured and derived. Generally speaking, matching your first-time cohort locally with the NSC to find prior enrollments elsewhere allows for the greatest ability to eliminate non-first-time students. Local campuses that do not do this NSC match are at a great disadvantage; state systems that have the capability to cross-check enrollments system-wide are advantaged.
  o Potential Suggestions:
    • Promulgate a best practice of performing an NSC match to eliminate previously enrolled students (could the Feds contract with NSC on behalf of all institutions for this service? Could AACC?). A sketch of this matching step follows this list.
    • Change the definition of first-time to "first-time at your institution only" (however, how do you then consider transfers-in?).
    • Place a "stop-out" time limit on "first-time" status (meaning the student can be counted as first-time at your institution if he/she has not been enrolled anywhere for X number of years: 3? 5? 10?).
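As a rough illustration of the NSC-match suggestion above, the sketch below filters a locally built first-time cohort against prior-enrollment records found elsewhere. The record layout, names, and dates are invented for illustration and do not reflect any actual NSC file format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Student:
    student_id: str
    first_term: date  # first term enrolled at the reporting institution

# Hypothetical result of a National Student Clearinghouse match:
# student_id -> dates of enrollments found at other institutions.
nsc_prior_enrollments = {
    "A001": [date(2008, 9, 1)],  # enrolled elsewhere earlier: not first-time
    "A002": [],                  # no prior record found: keep as first-time
}

def filter_first_time(cohort, prior_enrollments):
    """Drop students with any enrollment anywhere before their first term here."""
    return [
        s for s in cohort
        if not any(d < s.first_term for d in prior_enrollments.get(s.student_id, []))
    ]

cohort = [Student("A001", date(2010, 8, 23)), Student("A002", date(2010, 8, 23))]
print([s.student_id for s in filter_first_time(cohort, nsc_prior_enrollments)])
# ['A002']
```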
• GRS: Defining start term. For most "traditional" calendar institutions (semesters/quarters), one uses a Fall starting term. Should/could this be expanded to a full-year cohort of FTF students, or is a Fall start term an adequate sample from which to calculate rates? Making a full-year window the start point might add complexity (multiple end points for tracking), but might also add the entire universe of students to the rates (thus helping solve one of the perceived issues surrounding how not all students are included in GRS rates). If a full-year cohort is used, it certainly adds to the total percentage of students tracked, but complicates either the multiple end points or the "normal time to completion" determination.
  o Potential Suggestions:
    • Conduct a study of representative institutions to see if there are significant differences between Fall term starters and starters in other terms, whether in student demographics or in outcomes. If differences are negligible, it is possible to conclude that a Fall term cohort is a representative sample, and thus viable to continue to use.
    • Include all terms in a year, and track each start term to its respective normal time to completion. This adds inclusion and precision, but also institutional burden.
    • Include all terms in a year, but keep a single end point (for instance, all students who began in academic year 2000-01 will be tracked through June 30, 200X). This keeps institutional burden minimal, but will commingle normal times to completion (Fall starters will have one extra term to achieve outcomes). The two end-point options are sketched after this list.
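A minimal sketch of the two end-point options for a full-year cohort, assuming a 3-year (150% normal-time) tracking window and invented term start dates:

```python
from datetime import date

# Hypothetical term start dates for a 2000-01 full-year cohort.
term_starts = {
    "Fall 2000": date(2000, 8, 28),
    "Spring 2001": date(2001, 1, 16),
    "Summer 2001": date(2001, 6, 11),
}

def per_term_endpoints(starts, years=3):
    """Option 1: track each start term to its own end date (more precise,
    more institutional burden)."""
    return {term: start.replace(year=start.year + years) for term, start in starts.items()}

def single_endpoint(years=3):
    """Option 2: track every starter to one fixed June 30 date (minimal
    burden, but Fall starters get roughly one extra term)."""
    return date(2000 + years + 1, 6, 30)

print(per_term_endpoints(term_starts))
print(single_endpoint())  # 2004-06-30
```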
• GRS: Defining degree-seeking. This is a huge issue in GRS calculations for two-year institutions with multiple missions. To a certain extent, being full-time in a Fall term has become the "default" definition of "degree-seeking" (because the GRS only looks at full-time starting cohorts). The current definition in the GRS states that a student need only be enrolled in courses creditable toward a degree, but this can also include many recreational students and courses (such as PE, which is both degree-applicable and transferable). If the GRS is modified to include part-time students, clearly delineating a common definition of "degree-/certificate-/transfer-seeking" will be paramount.
  o Potential Suggestions:
    • Degree-seeking should be defined as what the student claims he/she intends to do (self-stated intent). The Feds would come up with a uniform coding of this, integrate it into the EF report, and count in the GRS only those who self-identified as degree-, certificate-, or transfer-seeking.
    • Degree-seeking should be identified by the attempt/completion of some unit threshold (commonly 12-18 units) over the course of the tracking period.
    • Degree-seeking should be identified behaviorally by "gateway course" attempt: most commonly, whether the student ever attempted a collegiate/degree-applicable math or English course, a program "gateway" course, or a clearly vocational/occupational course that signifies behavioral intent. (These three definitions are sketched in code after this list.)
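A minimal sketch combining the three suggested degree-seeking definitions into one test. The field names, unit threshold, and gateway course codes are illustrative assumptions, not a proposed federal coding:

```python
def is_degree_seeking(student, unit_threshold=12):
    """A student counts as degree-seeking under any of the three suggested
    definitions: self-stated intent, a units-attempted threshold, or a
    'gateway course' attempt."""
    gateway_prefixes = ("MATH", "ENGL", "WELD")  # invented course-code prefixes
    return (
        student.get("stated_intent") in {"degree", "certificate", "transfer"}
        or student.get("units_attempted", 0) >= unit_threshold
        or any(c.startswith(gateway_prefixes) for c in student.get("courses", []))
    )

print(is_degree_seeking({"stated_intent": "recreational",
                         "units_attempted": 3,
                         "courses": ["MATH101"]}))  # True, via gateway course
```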
• GRS: Defining the cohort to be tracked. Currently, the GRS only tracks cohorts of first-time, full-time students at 2-year institutions. For most 2-year colleges, this is also only for Fall starters. The GRS rates have been criticized for tracking, and placing a great public accountability emphasis on, a cohort that encompasses only a small portion of an entire incoming class. On the other hand, given that this is such a "refined" cohort, the resultant rates are likely to look somewhat inflated (compared to full-year, full-time and part-time cohorts).
  o Potential Suggestions:
    • Include all students, regardless of units attempted in the first term.
    • Set a lower units-attempted threshold on the starting cohort (such as 6 or more units in the first term).
    • If a full-year cohort is being tracked, set a minimum threshold of units attempted in the first year.
    • Eliminate any designation of full-time/part-time in the cohort, as many students move between these statuses during their academic history. Increase the tracking period (below) to accommodate all students' progress.
• GRS: Defining the tracking term/period. Currently, the GRS tracks a cohort to 150% of "normal time to completion" (NTC); for most students at 2-year institutions, this equates to a maximum 3-year tracking period. Recently, this was extended to 200% NTC, or 4 years. If the GRS is recommended to change to include part-time students, the tracking period for these slower-progressing students needs to be examined. Furthermore, a student's full-time/part-time enrollment status is not permanent; students frequently move in and out of FT/PT status throughout their academic history, so the concept of placing students in a fixed FT/PT "bucket" based solely upon their enrollment pattern in their very first term is flawed.
  o Potential Suggestions:
    • Lengthen the tracking period to 6 years from cohort inception, especially if the cohort is defined as anything less than a first-time, full-time Fall cohort.
    • Add to the GRS the tracking of multiple cohorts at 3, 6 and 10 years (each GRS report would then report on 3 cohorts). The arithmetic of these windows is sketched below.
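The window arithmetic here is simple; a short sketch, assuming 2-year programs, of the current 150%/200% windows and of which cohort start years a single report would carry under the multiple-cohort suggestion:

```python
def tracking_years(program_years, pct_ntc):
    """Years of tracking at a given percentage of normal time to completion."""
    return program_years * pct_ntc / 100

print(tracking_years(2, 150))  # 3.0 years (the traditional GRS window)
print(tracking_years(2, 200))  # 4.0 years (the recent extension)

# The multiple-cohort suggestion: each report carries three windows.
multi_cohort_windows = [3, 6, 10]  # years from cohort inception
report_year = 2011                 # hypothetical reporting year
print({w: report_year - w for w in multi_cohort_windows})
# {3: 2008, 6: 2005, 10: 2001} -- cohort start years reported on
```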
• GRS: Outcomes hierarchy. The GRS has been criticized for its adherence to an outcomes hierarchy more aligned with the mission of 4-year institutions, insofar as degrees are given top value and transfers are subordinate (the counting methodology forces colleges to count degrees awarded first, then transfers only for non-degree recipients). In many states, transfers are at least equal in value to degrees as an outcome. The GRS also allows lateral transfers (transfers from a 2-yr college to another 2-yr or lower college) to be counted with the same weight as an upward transfer (2-yr to 4-yr), which most colleges do not recognize equally. What is needed is a revised outcomes counting hierarchy for those colleges with missions to produce both awards and transfers.
  o Potential Suggestions:
    • After removing exemptions, enumerate separately AA/AS (2-year) degrees, certificates (<2 year), "transfer-prepared" (equivalent to a 2-yr degree, fully prepared to transfer), transfers to 4-yr institutions, and "lateral" transfers. From the original cohort, each of these outcomes would have its own separate line of reporting. As in the Awards submission, students earning multiples of these would be counted separately in each.
    • Create one single grouping for "higher-order" outcomes ("earned AA/AS OR certificate OR transfer-prepared OR transferred to a 4-yr institution") whereby a student earning any of these is counted, but only counted once. Eliminate separate "grad rate" and "transfer rate" calculations, and call this single calculation the "Achievement Rate" (sketched in code after this list).
    • Create a separate grouping for "lower-order" outcomes: "lateral transfer, still enrolled".
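A minimal sketch of the proposed count-once "Achievement Rate," using invented outcome labels:

```python
HIGHER_ORDER = {"AA/AS", "certificate", "transfer_prepared", "transfer_4yr"}

def achievement_rate(cohort_outcomes):
    """cohort_outcomes: one set of earned outcomes per student. A student
    counts once if he/she earned ANY higher-order outcome, replacing the
    separate grad and transfer rates with a single calculation."""
    achievers = sum(1 for outcomes in cohort_outcomes if outcomes & HIGHER_ORDER)
    return achievers / len(cohort_outcomes)

cohort = [
    {"AA/AS", "transfer_4yr"},  # counted once, not twice
    {"lateral_transfer"},       # lower-order outcome: not counted here
    set(),                      # no outcome
]
print(f"{achievement_rate(cohort):.0%}")  # 33%
```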
• Crosstabs identified for reporting. With grad/transfer rates being very highly scrutinized nationally, many desired "cuts" of these rates have been requested. To expand upon the current gender/ethnicity cuts, other crosstabs of these rates include financial aid status (Pell), FT/PT status, remedial/collegiate status, socioeconomic status, first-generation status, student age group upon entry, and distance education program status. Currently, the GRS cuts rates by gender and ethnicity. Adding more is possible, but needs to be weighed against form limitations and institutional burden. Also, would each crosstab be reported as a singular crosstab, or would all be crosstabbed (so you could get a rate for Hispanic female 35-40 year olds on financial aid)? The latter would possibly create the need for more unitary-level reporting and might also produce many more blank or suppressed cells (see the sketch after this list).
  o Potential Suggestions:
    • Add age group to gender/ethnicity (broadly, so as to not create too many cells): <24/25+ (basically young/old) or <20, 21-39, 40+ (or some other broad grouping); OR just add age group (in detail) as a separate reporting (not crosstabbed with race/ethnicity).
    • Remedial status: separates the cohort and all outcome rates into two categories: those that needed remediation, and those that did not.
    • Financial aid status: Pell/No Pell, or other locally defined "need-based" financial aid.
    • Socioeconomic status: locally defined, or first-generation status (not likely to be collected everywhere, and would require a Federal mandate).
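To see why fully crossing every dimension produces blank or suppressed cells, here is a small sketch with an invented cohort and a hypothetical minimum cell size of 10:

```python
from collections import Counter

# Invented cohort rows: (gender, ethnicity, age_group, completed?)
rows = [
    ("F", "Hispanic", "25+", True),
    ("F", "Hispanic", "25+", False),
    ("M", "White", "<24", True),
]

def crosstab_rates(rows, min_cell=10):
    """Fully crossed completion rates with small-cell suppression: crossing
    every dimension multiplies the number of cells, so many come back
    blank or suppressed."""
    counts, completions = Counter(), Counter()
    for gender, ethnicity, age, completed in rows:
        counts[(gender, ethnicity, age)] += 1
        completions[(gender, ethnicity, age)] += completed
    return {cell: (completions[cell] / n if n >= min_cell else "suppressed")
            for cell, n in counts.items()}

print(crosstab_rates(rows))  # every cell suppressed at n < 10
```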
• IPEDS Reporting: Intermediate measures of progress. The only true "intermediate" measure of student progress outside of the current outcomes listed in the GRS is the Retention Rate metric in the Fall Enrollment Survey (EF). There is a great deal of national movement currently that seeks to view student progress in two-year institutions in terms of "momentum points," many of which are measures of intermediate progress. Some of these might include:
  o Potential Suggestions:
    • Retained until end of first term enrolled
    • Unit threshold achievement: completed 12, 30 or some other level of units
    • Completed remedial thresholds (completed sequence)
    • Wage outcomes studies or employment studies
• IPEDS Reporting: Institutional comparisons and peer grouping. With grad/transfer rates being highly scrutinized nationally, institutional comparisons for accountability and other purposes always ensue. In general, grad/transfer rates tend to have a high correlation with exogenous factors outside the campus's control (such as the academic preparedness of the student body and the socioeconomic/first-generation status of the student body and surrounding service area). To ensure proper comparisons, these exogenous factors need to be weighed and used to create peer groups (which currently use size, urban/rural location status and ethnicity distributions as very rough proxies).
  o Potential Suggestions:
    • In the EF report, collect a headcount by student zip code and use this as the basis for creating service-area indices for each campus (crossed with census data to look at service-area socioeconomic status, first-generation status, and other factors) to create peer groups. This would also allow for a more granular study of participation rates. (A sketch of such an index follows this list.)
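A minimal sketch of one way such a service-area index could work: an enrollment-weighted census metric over a campus's student ZIP codes. The ZIP codes, headcounts, and income figures are invented, and weighting by headcount is an assumption rather than anything the working group specified:

```python
# Hypothetical inputs: campus headcounts by student ZIP code (from the EF
# report, as suggested) and census figures keyed by the same ZIP codes.
headcount_by_zip = {"95814": 400, "95630": 100}
census_median_income = {"95814": 45_000, "95630": 90_000}

def service_area_index(headcounts, zip_metric):
    """Enrollment-weighted average of a census metric over a campus's
    service area; one possible input for grouping campuses with true peers."""
    total = sum(headcounts.values())
    return sum(n * zip_metric[z] for z, n in headcounts.items()) / total

print(service_area_index(headcount_by_zip, census_median_income))  # 54000.0
```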
And:

FOR DISCUSSION ONLY
Working Group on Alternative Measures
Draft Report
January 24, 2011
Note: The charter of the U.S. Department of Education's Committee on Measures of Student Success (Committee) provides for working groups to assist the Committee in carrying out its duties. The working groups are responsible for developing materials to be provided to the entire Committee for full deliberation and discussion during its meetings. This draft document has been prepared by a Committee working group. This document does not represent the final recommendations of the Committee. The information and opinions included are the products of working group discussions and do not necessarily represent the views of the entire Committee or the policies of the U.S. Department of Education.

The purpose of the alternative measures working group was to examine issues related to measures of student success other than progression and completion measures such as retention, transfer, completion, and graduation rates. The working group was charged with exploring student learning outcomes, employment outcomes, and other outcomes associated with the multifaceted missions of two-year institutions. The working group was asked to consider which measures could be used by the Department in its disclosure and/or reporting requirements and to speak to the viability of using such measures.

The working group (Kevin Carey, Jacob Fraire, Sharon Kristovich, Geri Palast, and Linda Thor, along with Tom Bailey and Archie Cubarrubia) met by conference call on January 13, 2011 to examine issues related to alternative measures of student success. Overall issues were identified, and three specific areas of alternative measures were discussed: student learning, developmental progression, and employment. The group acknowledged that there may also be other measures of student success that could be considered. Prior to the meeting, the group also compiled examples of metrics either already in use or being proposed by external student success initiatives. What follows is a summary of the issues discussed and metrics compiled.

Overall Considerations
Issues
• "Success" may mean different things to different students at different types of institutions. Some key questions to address when looking at alternative measures:
  o Whose domain is a particular measure?
    • What's the appropriate role of the federal government?
    • What's the appropriate role of states?
    • What's the appropriate role of institutions?
  o How will outcomes be measured?
  o How will acceptable levels of performance be determined?
  o If appropriate, how should data be disclosed or reported?
• What is the purpose of disclosing or reporting alternative measures of success?
  o Is it to improve consumer information?
  o Is it to improve accountability?
  o We need to consider the consumers' viewpoint when considering this issue.

Potential Areas for Recommendations
• The working group recommends reporting alternative measures of student success to make them more available to the public.
  o Some members suggest voluntary reporting:
    • It gives institutions flexibility to report items that reflect their mission.
    • It may help reduce burden by dovetailing with other reporting initiatives.
  o Other members suggest mandatory reporting if institutions are already publicly reporting the data elsewhere.
  o Regardless of whether reporting should be voluntary or mandatory, the group agreed that some universal (i.e., common to all institutions) measures should be considered as well.
    • This common information should be reported in a consistent and accessible format to make it easy to understand and compare.
    • Measures should include student populations that have typically been excluded (e.g., part-time students, students in developmental education, etc.).
  o In addition, the "first-time" undergraduate cohort needs to distinguish between "first-time in college" and "first-time at the reporting institution."
• For any outcomes measured, the Department needs to weigh the additional administrative burden to institutions against the benefits of transparency, informing students, and policy formation.
• The Department should encourage institutions to provide/disclose alternative measures of success.
  o Encourage institutions to provide information in a user-friendly and accessible format, and to highlight important information instead of just making the information available.

Student Learning
Issues
• The desired level of student learning outcomes for two-year institutions appropriate for disclosure or reporting needs to be more clearly defined. For instance, what level are we looking at: course, program, or degree?
• The area of student learning outcomes is very diverse, with many potential items for institutions to report.
• The group suggested exploring learning outcomes across the different domains of learning:
  o General Education
  o Occupational/Certifications
  o Developmental Education/Introductory level

Potential Areas for Recommendations
• The federal government has a role in helping to make data more easily available to the public, even if such data are already being reported to other groups such as states and accrediting agencies. There was also some discussion of the Department serving as a venue for comprehensive information about institutional performance, not just a data repository.
• The working group agreed to recommend that the federal government find ways to encourage colleges to disclose or report some form of student learning data in a reasonable format.

Examples of Student Learning Metrics
Note: The metrics presented here are for illustrative purposes and are not an endorsement by the working group.
• Pass rates: annual percent of graduates passing licensure examinations. Source: VFA [1]. Also listed in employment metrics.
• Course completion: percentage of credit hours completed out of those attempted during an academic year. Source: NGA [2]. Other initiatives look at attempted and earned credit hours through a variety of methods. (A sketch of this calculation follows this list.)
• External instruments/surveys: results from the Community College Learning Assessment (CCLA). Source: CCLA [3]. The CLA measures critical thinking, analytic reasoning, problem solving, and written communication. Fee to colleges to participate.
• External instruments/surveys: results from the Community College Survey of Student Engagement (CCSSE). Source: CCSSE [4]. CCSSE measures student engagement; research suggests a correlation between student engagement and learning. Fee to colleges to participate.
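The course-completion metric above is a straightforward ratio; a minimal sketch, with invented hour counts:

```python
def course_completion_rate(attempted_hours, earned_hours):
    """NGA-style course completion: percentage of credit hours completed
    out of those attempted during an academic year."""
    return earned_hours / attempted_hours * 100

# e.g., a student attempting 24 credit hours and earning 18:
print(f"{course_completion_rate(24, 18):.1f}%")  # 75.0%
```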
Developmental Education Progression
Issues
• The group acknowledges that this area overlaps with the Progression and Completion Working Group and therefore does not want to spend too much time duplicating efforts.
• The role developmental education plays in student success at many two-year institutions is increasing.
• Developmental education measures for two-year institutions appropriate for disclosure or reporting need to be clearly defined. For instance, should we look at the institution level or the subject level?

Potential Areas for Recommendations
• Perhaps developmental education should be looked at as a separate cohort.
• The group might want to consider including introductory course success with developmental progression.

Examples of Developmental Progression Metrics
Note: The metrics presented here are for illustrative purposes and are not an endorsement by the working group.
• Enrollment in developmental education: number and percentage of entering first-time undergraduates who place into and enroll in developmental math or English/reading. Sources: NGA [2], VFA [1].
• Success beyond developmental education: number and percentage of first-time undergraduates who complete a developmental education course in math or English/reading; also, the number and percentage of students who complete a developmental course and then complete a college-level course in the same subject. Sources: NGA [2], VFA [1]. (A sketch of this calculation follows this list.)
• Success in first-year college courses: annual number and percentage of entering first-time undergraduates who complete entry college-level math and English/reading courses within the first two consecutive academic years. Source: NGA [2]. Complete College America (CCA) also has a similar metric.
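A minimal sketch of the "success beyond developmental education" metric for math, with an invented three-student dataset and hypothetical field names:

```python
def dev_ed_progression(students):
    """Of entering students who placed into developmental math, the share who
    completed the developmental course and then a college-level course in the
    same subject (the NGA/VFA 'success beyond developmental education' idea)."""
    placed = [s for s in students if s["placed_dev_math"]]
    succeeded = [s for s in placed
                 if s["completed_dev_math"] and s["completed_college_math"]]
    return len(succeeded) / len(placed)

students = [
    {"placed_dev_math": True,  "completed_dev_math": True,  "completed_college_math": True},
    {"placed_dev_math": True,  "completed_dev_math": True,  "completed_college_math": False},
    {"placed_dev_math": False, "completed_dev_math": False, "completed_college_math": True},
]
print(f"{dev_ed_progression(students):.0%}")  # 50%
```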
Employment
Issues
• Updates on the Department's proposed regulations on gainful employment:
  o Final regulations to ensure program integrity in federal financial aid programs were published on October 29, 2010, and will go into effect on July 1, 2011. These regulations address sections of the Department's gainful employment proposal as well as 13 other issues in an effort to protect students.
  o The Department plans to publish final regulations in early 2011 on the remaining portions of the gainful employment proposals dealing with a program's eligibility to receive federal student aid.
• What measures of employment outcomes, particularly those explicitly tied to the mission of two-year institutions (e.g., certifications, licensing), could potentially be used?

Potential Areas for Recommendations
• The group agreed that measures of employment outcomes being examined by the Committee should not be tied to the gainful employment reporting that is being proposed in the Department's regulations.
  o However, there is merit in requiring institutions to report average student debt so that future students may compare institutions based on how much they might expect to borrow, based on the average debt level for the same or similar degree/certificate program. Reporting average student debt also assists policy makers in monitoring loan growth and dependency relative to credential attainment.
• Service/market areas will need to be well-defined.

Examples of Employment Metrics
Note: The metrics presented here are for illustrative purposes and are not an endorsement by the working group.
• Overall workforce measures: total workforce enrollment/retention. Source: VFA [1].
• Market penetration: annual ratio of undergraduate degrees and certificates (of at least one year in program length) awarded relative to the state's population aged 18-24 with a high school diploma. Sources: NGA [2], CCA [5].
• Preparation for employment: annual percent of graduates passing licensure examinations. Source: VFA [1]. Also listed in student learning outcomes.
• CTE degree and certificate graduates either employed with a livable wage or enrolled in further education. Source: VFA [1].
• Wage growth of graduates (median incomes). Source: VFA [1].
• Workforce development: annual course enrollments in non-credit workforce development courses. Source: VFA [1].
• Transition from non-credit to credit: the percentage of students enrolled in non-credit courses who, within one year of completing the course, enroll in credit course work. Source: VFA [1].
• Non-credit programs (not courses): if colleges have programs, report the outcomes of students who complete the program. Source: VFA [1].
• Annual granting of a state- or industry-recognized credential (volume). Source: VFA [1].

References:
[1] Voluntary Framework of Accountability: http://www.aacc.nche.edu/Resources/aaccprograms/vfa/Pages/ProposedMeasures.aspx
[2] National Governors Association's Complete to Compete: http://www.nga.org/Files/pdf/1007COMMONCOLLEGEMETRICS.PDF
[3] Community College Learning Assessment (CCLA): http://www.cae.org/content/pro_communcollege.htm
[4] Community College Survey of Student Engagement (CCSSE): http://www.ccsse.org/
[5] Complete College America: Metrics Technical Guide 10/10: http://www.completecollege.org/path_forward/commonmetrics/
