CNM Process for CIPR Assessment

The CIPR process described below is current as of February 2022. It is documented in the official CNM Way process format, and the CIPR team must follow this process for an academic program to be considered for development, growth, revision, transition, or sunset.

Process Name: Integrated College-Wide Program Review

Goal

In keeping with the College’s vision of “Changing Lives, Building Community,” the College-wide Integrated CNM Program Review (CIPR) process provides a unique opportunity to:

  • Reflect on our work and continually improve our program offerings
  • Engage in dialogue with colleagues regarding a program’s goals, strengths, challenges and opportunities
  • Assess program effectiveness and quality using academic standards
  • Evaluate the economics of the program in terms of costs, revenues and margins
  • Ensure that all programs lead to completion, transfer and/or positive employment outcomes
  • Align program offerings with the College’s mission and strategic directions

Beginning of the process

The Office of Data Strategy (ODS) prepares program review data reports and creates the College-wide Integrated Program Review (CIPR) Summary spreadsheet.

End of process

The College-wide Integrated Program Review (CIPR) Summary is fully completed and archived with the Office of Data Strategy.

Related documents and forms

Process and timeline

Data reports

Generation and distribution

End of October

1. Gray Associates updates the Program Evaluation System (PES) dashboard with the most recent data from ODS.

2. ODS creates the CIPR Summary spreadsheet and sends the SharePoint link to Academic Affairs (AA).

College review period

November to early February

All credit and non-credit programs perform a review based on the identified target metrics.

1. Academic schools complete the CIPR Summary.
Timeline: November, December, and January

2. Once the academic schools have completed their initial review, the other divisions (Workforce and Community Success, Finance and Operations, Enrollment Management and Student Success, MCO, and Ingenuity) provide comments on the CIPR Summary.
Timeline: February

Academic Affairs Recommendations

Early March

Once the divisions have completed their review and comments, AA reviews the CIPR Summary and makes final recommendations (Fix, Grow, Sunset, Transition, Sustain) to the CIPR team.

CIPR Team and Leadership Team Review

Mid-March

1. The CIPR team reviews the AA recommendations and makes a final recommendation to the leadership team.

2. The CIPR team forwards the recommendations to the leadership team for approval and informs AA of the final result approved by the leadership team.

Board level review

End of March

1. AA identifies credit programs that need to be brought to the Governing Board for further consideration, especially programs that have been identified for Sunset.

2. Academic schools prepare action plans (if necessary) for the identified programs and present them to the Planning Committee. All credit programs recommended for Sunset must be submitted to the Planning Committee and the full Governing Board.

Full CIPR Report to the Board

April

1. The CIPR team reviews the results of the annual program review for both credit and non-credit programs.

2. The Workforce and Community Success Division compiles a comprehensive report and presents it to the Planning Committee and then to the full Governing Board.

Archiving of reports and compilation of recommendations

End of April

1. ODS compiles and maintains the CIPR Summary from all academic schools and Ingenuity.

2. The CIPR Summary is fully completed. This marks the end of the annual program review process.

Development of the program support plan

April to June

1. All programs identified for Sunset, Transition, Fix, or Grow will have a small team responsible for developing a program support plan from the CIPR action plan and the suggestions provided by the program.

2. Program support plans will consider personnel, equipment, facilities, marketing, recruitment, schedules, terms, program redesign, and employer and industry engagement.

Data used for program review

The following data is intended to provide term-by-term program data over three years, as well as additional data collected on an annual basis. Terms are defined as summer, fall, and spring. (An illustrative calculation for two of the derived measures follows the list.)

  • Enrollment
  • Completions
  • Declared majors
  • Student demographics (internal data, Census, and Bureau of Labor Statistics data)
  • Markets (student demand, employment, competitive intensity)
  • Margins (gross income, state appropriation, rebate, net income, teaching cost, contribution per student credit hour)
  • Average class size
  • Class fill rate
  • Sections taught by full-time instructors
  • C-pass rate
  • Withdrawal rate
  • DL and non-DL sections
  • Average course satisfaction rating
  • Number of graduates
  • Transfer to a four-year school (data from the National Student Clearinghouse)
  • Employed in New Mexico
  • Employed in the field
  • Average earnings (data from the Bureau of Labor Statistics and the Department of Workforce Solutions)
  • Outcomes for non-completers
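
As an illustration of how two of the derived measures above can be computed, here is a minimal sketch in Python. The record layout, field names, and figures are assumptions made for illustration, not the actual ODS report schema:

```python
# Illustrative only: the record layout and figures below are assumptions,
# not the actual ODS report schema.

def class_fill_rate(sections):
    """Overall fill rate for a program's sections in one term:
    total enrolled seats divided by total seat capacity."""
    enrolled = sum(s["enrolled"] for s in sections)
    capacity = sum(s["capacity"] for s in sections)
    return enrolled / capacity if capacity else 0.0

def contribution_per_sch(gross_income, teaching_cost, student_credit_hours):
    """Contribution per student credit hour: the margin left after direct
    teaching cost, spread over the credit hours the program generated."""
    return (gross_income - teaching_cost) / student_credit_hours

# Example: two sections of a hypothetical program in one term.
sections = [{"enrolled": 18, "capacity": 24},
            {"enrolled": 11, "capacity": 24}]
print(f"Fill rate: {class_fill_rate(sections):.0%}")  # Fill rate: 60%
print(f"Contribution/SCH: ${contribution_per_sch(250_000, 140_000, 1_800):,.2f}")
# Contribution/SCH: $61.11
```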

Target measures

Target metrics are minimum thresholds that each credit program must meet to be considered a healthy program. When a target metric is not met, the program must provide an explanation. (A consolidated sketch of these threshold checks follows the targets below.)

Target Metrics for Credit Programs

Placement

Annual graduate placement is greater than 75% (for CTE programs)

Market

The market evaluation is rated satisfactory or better (market score at the 70th percentile or higher)

The annual number of graduates is more than 10

Transfer to a four-year school (for AA and AS degrees) is 25% or higher

Note: Programs must provide an explanation in each of these three situations:

– The transfer rate drops below 25%

– The transfer rate continuously declines over three years (even if it remains above 25%)

– The transfer rate shows a significant drop (even if it remains above 25%)

Employed in the field (for career education programs) is 25% or higher

Note: Programs must provide an explanation in each of these three situations (a sketch of this trigger logic follows the list):

– Employment in the field falls below 25%

– Employment in the field continuously declines over three years (even if it remains above 25%)

– Employment in the field shows a significant decline (even if it remains above 25%)
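
The two notes above (for the transfer rate and for employment in the field) apply the same three triggers to a three-year rate series. A minimal sketch of that logic in Python follows; the numeric cutoff for a “significant” drop is an assumed placeholder, since the process leaves “significant” to reviewer judgment:

```python
def explanation_required(rates, floor=0.25, significant_drop=0.05):
    """Return the reasons a program must explain its transfer or
    employed-in-the-field rate; `rates` is ordered oldest to newest.
    The 5-point significant-drop cutoff is an assumption, not a
    CIPR-defined value."""
    reasons = []
    if rates[-1] < floor:
        reasons.append("rate below the 25% floor")
    # Continuous decline over three years: each year lower than the last.
    last3 = rates[-3:]
    if len(last3) == 3 and last3[0] > last3[1] > last3[2]:
        reasons.append("continuous decline over 3 years")
    if len(rates) >= 2 and rates[-2] - rates[-1] >= significant_drop:
        reasons.append("significant year-over-year drop")
    return reasons

# Example: a program still above the floor but declining for three years.
print(explanation_required([0.41, 0.36, 0.33]))
# -> ['continuous decline over 3 years']
```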

Margin

The margin assessment is rated satisfactory or better (contribution at the 50th percentile or higher)

The class fill rate is 60% or more

The number of enrolled declared majors (for degrees and stand-alone certificates) is 30 or more

Note: For integrated certificates, refer to the number of majors reported for the associate degrees in which the certificates are integrated.

Academics

The average annual retention rate for the discipline is 70% or more (SAGE 65%)

The annual C-Pass rate for courses is 60% or more
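
Taken together, the credit-program targets above form a checklist of minimum thresholds. The following minimal Python sketch consolidates them; the metric keys and the sample record are hypothetical, since the actual review is performed in the CIPR Summary spreadsheet:

```python
# Hypothetical encoding of the credit-program target metrics; the keys
# and the sample record are assumptions made for illustration only.
TARGETS = {
    "placement_rate":    lambda v: v > 0.75,   # CTE programs
    "market_percentile": lambda v: v >= 70,    # market score percentile
    "annual_graduates":  lambda v: v > 10,
    "transfer_rate":     lambda v: v >= 0.25,  # AA and AS degrees
    "employed_in_field": lambda v: v >= 0.25,  # career education programs
    "margin_percentile": lambda v: v >= 50,    # contribution percentile
    "class_fill_rate":   lambda v: v >= 0.60,
    "declared_majors":   lambda v: v >= 30,    # degrees, stand-alone certs
    "retention_rate":    lambda v: v >= 0.70,  # 0.65 for SAGE
    "c_pass_rate":       lambda v: v >= 0.60,
}

def unmet_targets(program):
    """List the target metrics a program misses; each miss requires a
    written explanation in the CIPR Summary."""
    return [name for name, meets in TARGETS.items()
            if name in program and not meets(program[name])]

# Example: one missed target out of the three metrics reported.
print(unmet_targets({"class_fill_rate": 0.55,
                     "c_pass_rate": 0.62,
                     "declared_majors": 41}))
# -> ['class_fill_rate']
```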

Activity tracked/results

All program review activities and results are tracked in the College-Wide Integrated Program Review Summary, a comprehensive Excel document managed by the Office of Data Strategy.

Latest revision and improvements made

This process was first written in September 2021 and revised in February 2022.
