Academic Program Assessment

Purpose: to determine the overall status of a program (defined here as a curricular sequence whose completion confers a credential on the student, whether a certificate or an associate degree) in relation to its goals and stated learning outcomes

Periodic documentation (on a schedule determined by the Division Dean, DCAF, and Department) of the analysis and review of assessment data, and of response planning, by program coordinators and faculty.

Primarily focused on PLOs, but interrelated with course assessment, project assessment, co-curricular assessment, and other assessment activities.

Mostly done by collecting specific data or artifacts from ongoing course assessment activities and re-analyzing them at the program level; some programs additionally use capstones, portfolio reviews, or 7-point assessment.

  • While most program assessment uses data collected in the classroom, it differs from course assessment in that the level of analysis is different, and consequently the potential follow-up actions (data driving) are also different
  • At the program level, one can see the effects of co-curricular activities, of program revisions or lack thereof, the effects that one course may have on other courses, the effects of College/division/department policies, and can also see how all of these things affect student actions and success over a longer timeframe
  • People who teach in the classroom should be an integral part of developing action plans from program assessment data because they have the most granular context and experience, which helps inform more practical and effective data driving
  • Program assessment activities beyond/outside/in addition to course-based assessment include:
    • Capstone Projects
      • Program capstone projects are designed to give students the opportunity to integrate and apply what they have learned over the course of an entire program. They frequently interact with real-world problems, experiences, communities, and/or programs, and incorporate teamwork, critical thinking, and student reflection. When used as part of Program assessment, they can be a rich source of summative student outcome data and can bring insight to questions of curricular cohesion and applicability.
    • Portfolio Review
      • While portfolio assembly and critique are traditionally thought of as the domain of programs in the creative arts fields, collections of related artifacts created over the course of a student’s program of study can benefit both the student and a program’s assessment efforts in nearly any field, most notably professional areas like business and education. Analyzing the collective output of a graduating cohort of students can generate rich sets of data related to program cohesion, career readiness, critical thinking and reflection skills, as well as how those outcomes are impacted by program-wide changes over time.
    • 7-Point Assessment
      • Program assessment is typically the most straightforward to analyze when PLOs are assessed within a program’s core required courses. When a program is deliberately designed to be very flexible, such as the Health Care Studies program at the College, the small number of courses required of all students in the program can present challenges in assessing the program as a whole. To address these challenges, the Health Care Studies program has developed an annual assessment reporting process that combines traditional course-based assessment with data collected from surveys of students, course performance reports, and student outcomes data provided by the Office of Institutional Research. The additional data points help to augment assessment data collected in-class to create a more robust understanding of the program overall.
    • Exit Interviews
      • A program or department may choose to invite students who are preparing for graduation to participate in a focused interview, in order to gather consistent qualitative data about their experiences in the program. Analyzing this data regularly and rigorously can augment student learning outcomes assessment data with context and explanatory detail.
  • Results/analysis/documentation/use of results for improvements also inform department and division assessment
  • Robust documentation of consistent program assessment is one of the most valuable sources of evidence for accreditation, both for the College and for programs that seek specialized accreditation

Assessment Planning and Curriculum Development

  • Assessment is central to the curriculum development process.
  • The curriculum development process includes the creation/revision of program learning outcomes (PLOs), the identification of methods of assessment specific to the PLOs, a timeline for PLO assessment, and the curriculum map.
  • Program learning outcomes and curriculum maps are part of every program document.
  • Best Practices in Program Learning Outcomes Development:
    • PLOs should be clear, descriptive, and student friendly. Students will see these outcomes in the catalog. They should be able to discern what they can expect to know and do when they finish the program.
    • PLOs should include action verbs like “describe” or “explain” rather than vague constructions like “demonstrate knowledge” or “understand.” Review Bloom’s Taxonomy action verbs for ideas.
    • PLOs should reflect a higher level of learning than the individual course learning outcomes, e.g., “analyze” rather than “describe.”  
    • Program learning outcomes should reflect knowledge/skills gained at the completion of identified course learning outcomes and, when appropriate, general education measures.
    • PLOs should focus on what we can assess, measure, or track. “Prepare to take the XYZ certification exam” is fine, but “pass the XYZ certification exam and transfer to a four-year degree program” is more difficult to assess.
    • PLOs should be distinct from one another. It is difficult to assess outcomes that are vague, layered, or too similar.
    • PLOs should be streamlined wherever possible. More is not better. The number should be what the program/department can reasonably expect to assess at least once every five years.
    • PLOs should reflect our world today. They should use respectful, inclusive, accessible language.
  • The curriculum map typically includes all the program learning outcomes (PLOs) on one axis and the courses in the program on the other to show how the sequence of courses helps students achieve the program learning outcomes. Courses may include major courses, courses in other disciplines, or general education courses, if they help students achieve the PLOs. The letters I, R, M, and A (the “Irmas”) are positioned to indicate the alignment of the PLOs and the courses; a minimal illustrative sketch follows this list.
    • “I” stands for “Introduced.”
    • “R” is for “Reinforced.”
    • “M” is for “Mastered.”
    • “A” designates the courses in which the PLO is assessed.
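
As a purely illustrative sketch (not a College tool), a curriculum map can be represented as a small data structure and checked for completeness. The course numbers and PLO labels below are invented for the example:

```python
# Illustrative only: a hypothetical curriculum map as {course: {PLO: markers}},
# using the I/R/M/A convention described above. Course numbers and PLO labels
# are invented; PLO 2 is deliberately left without an assessment point ("A").
curriculum_map = {
    "ABC 101": {"PLO 1": {"I"}, "PLO 2": {"I"}},
    "ABC 150": {"PLO 1": {"R"}, "PLO 3": {"I", "A"}},
    "ABC 201": {"PLO 2": {"R"}, "PLO 3": {"R"}},
    "ABC 290": {"PLO 1": {"M", "A"}, "PLO 2": {"M"}, "PLO 3": {"M", "A"}},
}

def check_map(cmap: dict) -> None:
    """Flag PLOs missing any of the I, R, M, A markers across the program."""
    coverage: dict[str, set] = {}
    for plos in cmap.values():
        for plo, markers in plos.items():
            coverage.setdefault(plo, set()).update(markers)
    for plo in sorted(coverage):
        missing = {"I", "R", "M", "A"} - coverage[plo]
        if missing:
            print(f"{plo}: missing {sorted(missing)}")
        else:
            print(f"{plo}: fully mapped")

check_map(curriculum_map)  # reports: PLO 2: missing ['A']
```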

Certificate Assessment

There are three types of certificates at the College. Most are proficiency certificates, which tend to be 12-16 credits and allow students to earn a credential that will assist them in advancing in the workforce. There are also a handful of academic certificates, which also include some general education courses. The third type is the micro-certificate; this is not a formal designation, but micro-certificates work like proficiency certificates while including only two or three courses.

Certificate assessment is usually inseparable from program assessment, as they are “stackable” with degree programs and thus often use the same program learning outcomes and encompass the same courses and the same student populations. The “standalone” certificates, which either do not “stack” with degree programs (uncommon) or represent a variable set of credits within a degree program (i.e., the program is designed for students to take different sets of courses specifically for the certificate, as in Health Care Studies or Business Leadership), are assessed as one would assess a degree program.

Equity in Program Assessment

Equity in the assessment of course- and program-level learning outcomes involves a combination of approaches (the first of which is illustrated in a sketch after this list):

  • Data disaggregated by race
  • A commitment to anti-bias training for all faculty and staff
  • An unflinching focus on racial equity (e.g., NOT “shouldn’t we be looking at poverty instead?”)
  • Listening to and acting on student feedback
  • Eschewing a deficit-minded approach that blames students for inequitable outcomes
  • Acting upon the conviction that success is a combination of student engagement and creating an inclusive student learning experience
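
A minimal sketch of the first approach above, assuming a hypothetical results file and column names ("race", "plo", "met_benchmark"), might disaggregate PLO attainment rates as follows; this is an illustration, not a College system:

```python
# Illustrative sketch: disaggregating PLO attainment rates by race with pandas.
# The file name and column names are assumptions made for this example.
import pandas as pd

results = pd.read_csv("plo_results.csv")  # one row per student per PLO measure

# Attainment rate for each PLO, disaggregated by race.
rates = (
    results.groupby(["plo", "race"])["met_benchmark"]
    .mean()
    .unstack("race")
)
print(rates.round(2))

# Flag PLOs where any group's rate trails the overall rate by 10+ points.
overall = results.groupby("plo")["met_benchmark"].mean()
gaps = rates.sub(overall, axis=0)
print(gaps[gaps.lt(-0.10).any(axis=1)])
```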

[Program Assessment by Academic Division]

  • Assistant Deans, Department Heads, Program Coordinators, DCAF, OAE

Workforce Program Assessment

Like certificate programs, workforce (non-credit) programs should be assessed in much the same way as traditional academic programs, with a few additional considerations:

  • The distinction between a “course” and a “program” can be less straightforward in workforce, requiring programs to adopt a hybrid model of assessment.
  • Programs whose students may be eligible for college credit should ensure that their outcomes and assessment are aligned with relevant credit-bearing courses or programs.
  • Workforce development programs will need to pay special attention to external data, market research, and advisory board recommendations in order to remain aligned with industry needs, including the development and fostering of a diverse, equitable, and inclusive labor market.

Program Assessment Timeline

Data Collection for Program Assessment

  • Design and selection of measures prior to beginning of semester
    • Alignment with CLOs in curriculum micro-mapping
  • Data collected each time a course marked with an “A” on the curriculum map runs
  • Data collected on examination pass rates and certifications, in accordance with external requirements

Data Analysis for Program Assessment

  • Analysis of rolling trends over a multi-semester period (e.g., analyze results from Fall 2022, Spring 2023, and Fall 2023 during Spring 2024); a minimal sketch of this analysis follows the list
  • Analysis of data from a specific timeframe to track the success of data-driven continuous improvement strategies at the program level
  • Department meetings, Professional Development Week, etc. are opportunities for reviewing data and making meaning
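
As a minimal, purely illustrative sketch, assuming a hypothetical CSV of course-embedded results with "semester", "plo", and "met_benchmark" columns, a rolling multi-semester analysis like the one described above might look like this:

```python
# Illustrative only: rolling trend analysis over a multi-semester window.
# File name, column names, and semester labels are assumptions for the example.
import pandas as pd

SEMESTER_ORDER = ["Fall 2022", "Spring 2023", "Fall 2023"]  # the rolling window

df = pd.read_csv("program_assessment.csv")
window = df[df["semester"].isin(SEMESTER_ORDER)]

# Attainment rate per PLO per semester, ordered chronologically.
trend = (
    window.groupby(["plo", "semester"])["met_benchmark"]
    .mean()
    .unstack("semester")
    .reindex(columns=SEMESTER_ORDER)
)
print(trend.round(2))

# Simple trend indicator: change from the first to the last semester in window.
trend["change"] = trend[SEMESTER_ORDER[-1]] - trend[SEMESTER_ORDER[0]]
print(trend.sort_values("change"))
```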

Data Driving for Program Assessment

  • Decide on and implement recommendations, reassess, and document results from the previous semester, group of semesters, or year
    • New teaching strategies (e.g., more hands-on learning, capstone projects)
    • Curriculum changes (e.g., revising PLOs, changing course sequence)
    • Professional development (e.g., FCTL workshops, attending professional conferences)
    • Collaboration with other areas of the College (e.g., Academic Advising, Center on Disability, Learning Lab)
  • Dependent upon whether courses are offered and run
  • Spotlight on links to improving the student experience:
    • The College’s Strategic Plan includes the following two Strategic Directions as part of its operationalization of the first pillar, The Student Experience:
      • “Ensure students are challenged by intellectually rigorous teaching and learning techniques inside and outside the classroom leading to high academic standards through a culture focused on assessment of student learning outcomes and continuous professional development of faculty.”
      • “Increase academic and student development opportunities by integrating student life, community service, co-curricular activities, internships, experiential learning opportunities, civic engagement and service learning into programs of study.”
    • Student learning is arguably the most fundamental aspect of the student experience in any educational setting. As such, assessment of student learning and of student experiences both within and alongside academic programs is essential to successfully supporting the first pillar of the Strategic Plan.
    • Using outcomes assessment data gathered from students to drive program-wide decisions and priorities combines attention to academic rigor with the ultimate goal of students achieving success in their desired aims.

Academic Program Review (APR)

Purpose: The Academic Program Review (APR) at Community College of Philadelphia assists academic programs in identifying strengths, goals, and areas for improving student success, including using the results of assessment to improve teaching and learning. In addition, the APR helps programs to articulate successes and challenges of the past five years and to meet external accreditation standards, where needed. The APR also assesses the program’s role in the achievement of institutional goals.

  • The APR is a periodic deep dive into the current state of each credentialing academic program.
  • The Commonwealth of Pennsylvania’s state code defines the requirements for an “academic audit” for community colleges within the state in two places:
    • “(b)  A college shall conduct a thorough academic audit of programs as necessary, but not less than every 5 years to determine whether each program should be continued, revised or discontinued on the basis of local and student needs of the area served.” (22 Pa. Code § 35.21)
    • “(a)  Each community college shall conduct course evaluations, which for credit courses shall be part of the academic audit specified in §  35.21(b) (relating to curricula). The college shall develop a written program audit and course evaluation policy that specifies the position of the person responsible for program audits. The policy shall also include provisions which require a review of the program’s courses to ensure that:
      • (1) Course materials and content reflect current knowledge in the program’s field of study.
      • (2) Course content is appropriate for both the objectives of the course and the goals of the program.
      • (3) The catalog description of the course is accurate.
      • (4) Each required course’s stated learning goals are necessary to enable students to attain the essential knowledge and skills embodied in the program’s educational objectives.
      • (5) The content of each course designed for transfer is similar to courses which are generally accepted for transfer of credit to accredited 4-year colleges and universities.
    • (b) Each community college shall establish an onsite depository of reports on the results of each program audit and course evaluation. The reports shall, at a minimum, demonstrate that the program audit addressed each of the provisions in subsection (a) and shall be signed by the incumbent in the position responsible for program audits to indicate that the program audit was performed and accepted by the college’s administration. Each college shall maintain the results of each program audit and course evaluation in accordance with §  35.66 (relating to retention of records).” (22 Pa. Code § 335.44)
    • The Academic Program Review process at the College is designed to meet all of the requirements of the academic audit as set forth by the state, and copies of Academic Program Reviews are retained by the Office of Assessment and Evaluation.
  • MSCHE’s Standards for Accreditation and Requirements of Affiliation (Thirteenth Edition) refers to program assessment in several areas:
    • “8. The institution systematically evaluates its educational and other programs and makes public how well and in what ways it is accomplishing its purposes.
    • 9. The institution’s student learning programs and opportunities are characterized by rigor, coherence, and appropriate assessment of student achievement throughout the educational offerings, regardless of certificate or degree level or delivery and instructional modality.
    • 10. Institutional planning integrates goals for academic and institutional effectiveness and improvement, student achievement of educational goals, student learning, and the results of academic and institutional assessments.” (Requirements of Affiliation)
    • “An institution provides students with learning experiences that are characterized by rigor and coherence at all program, certificate, and degree levels, regardless of instructional modality. All learning experiences, regardless of modality, program pace/schedule, level, and setting are consistent with higher education expectations.” (Standard III)
    • The APR is designed to serve as evidence of the College’s compliance with these requirements and standards.
  • The audience for the APR consists of the program faculty, the department head/program coordinator, the Academic and Student Success Council (ASSC), the Student Outcomes Committee of the Board (SOC), and other stakeholders as identified by the program (e.g., advisory committees, students).
  • The Office of Assessment and Evaluation (OAE) compiles and shares program data and data analysis, collaborates with faculty in documenting their assessment practices, results, and work toward continuous improvement, collaborates with program faculty in writing the sections of the report for which they are responsible, edits the document in response to feedback, and shepherds the document through the various points in the review process. The OAE also maintains an archive of completed and approved APRs.
  • The APR document is divided into three main sections: the executive summary, the body of the document, and the appendices. They are generally completed in reverse order, though work on each main section will overlap with the others. The information contained in the appendices includes the program’s catalog description, course sequence, and curriculum map, as well as the data tables and major analysis from the Academic Performance Measures data. The body of the document, which includes everything from the program analysis to the program cost analysis, is informed both by the appendices and additional data, research, and context provided by Program faculty. The executive summary includes bullet points summarizing the contents of the other two sections, the recommendations from the prior audit or APR and the program’s response to them, and recommendations for the program to consider over the next five years based on key findings from the program review process. The final presentation of the APR to the Student Outcomes Committee of the Board of Trustees is based only on the contents of the executive summary, making it important for that part of the document to be an accurate and complete representation of the other two sections.

APR Assessment and Equity Check-Ins

APRs cover five years of the life of a program and can include a great deal of material regarding assessment, enrollment, and demographics. APRs also show trends in these areas over time. Unless a program is new enough to be undergoing its first APR, it must also address the recommendations from the previous APR (or audit), many of which require months, if not years, of attention to specific strategies and innovations, as well as tracking. A few months before an APR is due is not the right time to discover that there are missing documents, gaps in a program’s assessment process, technological difficulties, or a disproportionate number of students of color not persisting in the program! Assessment and equity check-ins are a recent addition to the APR process that help to 1) familiarize programs with the process and the Office of Assessment and Evaluation well ahead of time, 2) identify areas related to assessment and equity in need of improvement (and work with the OAE on improvement strategies), and 3) minimize surprises.

  • Assessment and Equity Check-In #1: Three years prior to APR
    • Review of prior APR/audit action items
    • Review of available program assessment results since prior APR
    • Identify and remedy gaps in documentation
    • Update the assessment repository
    • Review of current enrollment and retention data
    • Review of current demographic data
    • Discussion of program equity goals (aligned with divisional goals)
  • Assessment and Equity Check-In #2: Two years prior to APR
    • Progress checks on prior APR/audit action items
    • Review of available program assessment results since first check-in
    • Identify and remedy gaps in documentation
    • Update the assessment repository
    • Review of updated enrollment and retention data through an equity lens
    • Progress checks on program equity goals
  • These two check-ins are followed by a general kickoff meeting with that year’s APR cohort one year prior to the APR, and then the creation of the APR document that will go to the ASSC and the SOC begins a few weeks after that. A simple sketch of this cadence appears below.
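
As a purely illustrative sketch (not a College tool), the cadence above can be expressed as simple date arithmetic; the function name and milestone labels are invented for the example:

```python
# Illustrative only: derive check-in and kickoff years from a scheduled APR
# year, per the cadence described above (check-ins 3 and 2 years prior,
# kickoff 1 year prior). Names here are hypothetical, not College systems.
def apr_milestones(apr_year: int) -> dict[str, int]:
    return {
        "Assessment and Equity Check-In #1": apr_year - 3,
        "Assessment and Equity Check-In #2": apr_year - 2,
        "General Kick-Off Meeting": apr_year - 1,
        "APR presented to ASSC and SOC": apr_year,
    }

for milestone, year in apr_milestones(2028).items():
    print(f"{year}: {milestone}")
```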

Long Term Timeline 

APR Process & Production Schedule 


ACADEMIC PROGRAM REVIEW TIMELINE

DATE

ACTIVITIES AND DELIVERABLES

Each spring semester

OAE and Provost/VPASS collaborate on annual APR schedule

Each spring and fall semester

APR Information Session for new department heads and program coordinators during PD Week (built into department head onboarding)

  • APR Schedule
  • APR Task List and Timeline
  • Sample APRs / meetings with program faculty who have recently completed APRs

Three years prior to scheduled APR

Assessment and Equity Check-In #1

  • Confidential
  • OAE collaborates with DCAF, department heads, program coordinators
  • OAE reviews program assessment results and plans, along with enrollment, retention, and graduation data
  • OAE assists program with assessment needs
  • OAE assists program in identifying gateway courses and equity measures

Two years prior to scheduled APR

Assessment and Equity Check-in #2

  • OAE advises the department head and dean
  • OAE collaborates with DCAF, department heads, program coordinators
  • Review of Prior APRs/Audits
  • OAE assists program with assessment needs
  • OAE assists program in identifying gateway courses and refining equity measures

One year prior to scheduled APR

General Kick-Off Meeting with cohort

Six to nine months prior to ASSC presentation

APR Kick-Off Meeting

  • OAE provides program faculty with APR template
    • Academic performance measures (data)
    • Recommendations from previous APR/audit

Four to seven months prior to ASSC presentation

Enrollment projections discussed with Enrollment Management

Program faculty provide documents

  • Growth and retention benchmarks
  • Course and program revisions/curriculum map
  • Assessment documents
  • Advisory committee information
  • Job titles

Two to four months prior to ASSC presentation

Program faculty review updated draft and write the following sections, based on the prompts in the guidelines and the data:

  • Responses to recommendations from previous APR/audit
  • Program analysis
  • Statement of mission alignment
  • Future directions of program and field
  • Cost analysis
OAE updates the draft with faculty additions and returns draft to program faculty for rewrites

Four to six weeks prior to ASSC presentation

OAE meets with the program rep and dean to review the updated APR draft

Two to four weeks prior to ASSC presentation

OAE and Program faculty make requested changes

Two weeks prior to ASSC presentation

Completed draft sent to dean for review

One week prior to ASSC presentation

OAE sends draft APR to Provost/VPASS

  • Updates as needed

Scheduled ASSC date (Wednesday mornings)

Present APR to the Academic and Student Success Council (ASSC)

One week before SOC

  • Make changes requested by ASSC
  • Post-ASSC meeting to review changes
  • Develop/Share talking points

Scheduled date (first Thursday of the month, September through February, April, and June)

Present APR Executive Summary to Student Outcomes Committee of the Board of Trustees (SOC)

One to two weeks following presentation of APR to SOC

Debrief

  • Program faculty and the ASSC process feedback and plan implementation of recommendations
  • Schedule next assessment check-in for two years in the future

Academic Program Review and Improving the Student Experience

Every Academic Program Review begins with an executive summary, which includes recommendations from the prior audit or APR and the program’s response to them as well as a set of recommendations for the program to consider over the following five years.

  • Recommendations from prior audit/APR and program response
    • This is an excellent opportunity to demonstrate continuous improvement in response to assessment data. The program is required to write approximately one paragraph responding to each recommendation from the last APR or audit, explaining the actions taken toward completing the recommendation or, alternatively, giving a rationale for why the program chose not to pursue the recommended course of action and describing what it did instead. The response will very frequently detail changes made to various aspects of the program as a result of the previous program review, providing continuity and accountability from one APR cycle to the next.
  • Recommendations for the program
    • The Office of Assessment and Evaluation will work with the program’s coordinator and Dean to develop a set of recommendations for the program to pursue over the next five years. Each recommendation will be based on one or more “key findings” cited elsewhere in the executive summary.