Shared Definitions

Although assessment involves the work we do every day, such as designing assignments, developing goals, evaluating progress toward those goals, and asking for feedback, the materials associated with assessing the work we do at the College may seem unfamiliar, or like familiar words in an unfamiliar context. This section explains how we use these words and terms in the context of the assessment of learning outcomes, AES support outcomes, general education measures, and projects, as well as the assessment of the assessment process itself.

Glossary

335 or Chapter 335:
Pennsylvania regulation (22 Pa. Code Chapter 335 – Community College Courses) ensures the accuracy and appropriateness of certain aspects of community college course offerings, including the catalog description, the course learning outcomes (CLOs), the number of credits, and alignment with programs, where applicable. Chapter 335 mandates that every course offered be evaluated at least once every five years. At given points in time, we are required to show evidence that our courses are in compliance (meaning that they have been evaluated within the allotted timeframe). Sometimes called “Act 335.”
Artifact:
A piece of student work (e.g., a lab report, research paper, or photograph) that demonstrates how well the student has learned course or program content and that faculty assess using established rubrics.
Assessment:
Assessment is a recurring process of inquiry and improvement in which clearly articulated outcomes are measured against pre-established performance criteria. Results may meet, exceed, or fall short of expectations; these findings lead to analysis, evaluation, and proposed action to continuously improve student outcomes.
Benchmark:
An agreed-upon standard of performance, e.g., “80% of research papers will show proficiency in CLO #1” or “100 people will attend a professional development session.” Benchmarks should be aligned directly with the relevant measure and should be ambitious but achievable. Benchmarks are foundational to assessment, but assessment goes beyond benchmarking to analyze success and gaps in achievement or feedback regarding how well the event met its goals.
Closing the Loop:
Describes the process of analyzing assessment results to see where improvements might be made, taking action to make those improvements, and reassessing to determine the results of those actions.
Completion:
In assessment, this usually refers to finishing degree or program requirements.
Curriculum:
Although often used interchangeably with “program,” in assessment, curriculum refers to a broad array of academic and workforce (non-credit) courses, programs, and certificates as well as the relationships between them.
Curriculum Development:
The (only) process by which changes can be made to curriculum, including changes to course and program learning outcomes, handled by the Office of Curriculum Development.
Curriculum Map:
Curriculum maps typically include all the program learning outcomes (PLOs) on one axis and the courses in the program on the other. Courses may include major courses, courses in other disciplines, or general education courses, if they help students achieve the PLOs. The letters I, R, M, and A (the “Irmas”) are positioned to indicate the alignment of the PLOs and the courses: I stands for “Introduced,” R for “Reinforced,” M for “Mastered,” and A for “Assessed.” The curriculum map demonstrates the program’s internal coherence. It shows how the sequence of courses helps students attain the knowledge and skills needed to achieve the PLOs, earn an associate degree, and enter the workforce or transfer to a four-year institution. The curriculum map also indicates which courses introduce and reinforce the PLOs, when and if mastery of the PLOs occurs, and when and where the PLOs are assessed. Curriculum maps are a key part of program assessment.
Curriculum Micro-Map:
A micro-map is a more granular curriculum map that demonstrates how CLOs map to PLOs. Micro-mapping begins with the curriculum map and indicates exactly which CLOs map to the PLO everywhere there is an “A” for “Assessed.”
Data:
Data are a collection of facts, figures, and/or observations (both numerical and non-numerical) that represent information. Assessment data can come in the form of direct measures, such as scores on student artifacts; academic performance measures, such as retention, class standing, and time to degree completion; reflections, surveys, focus groups, and other indirect measures.
Data Aggregator:
This is any means by which we compile sets of data, such as Excel, AEFIS, scantrons, or the Institutional Research dashboards, which are available to all.
Data Analysis:
The process of making meaning from data. Analysis is often concerned with identifying areas where we are exceeding, meeting, or falling short of benchmarks and goals, as well as taking note of trends. This may be accomplished by reviewing collected data in a department meeting, with an advisory board, or as part of a unit meeting, or by collecting reflections or feedback on a process or event. Documentation of how and when data analysis occurs is important for consistency and reporting.
Data Collection:
The process of gathering evidence of proficiency in learning outcomes, progress toward goals, or other performance indicators. In learning outcomes assessment, data collection generally refers to rating and documenting students’ achievement via assessment measures such as lab reports, capstone projects, speeches, or clinical evaluations. In the Academic Program Review process, data collection often refers to academic performance measures, such as retention and completion. In AES and other types of non-academic assessment, data collection can take the form of survey responses, pass rates, retention rates, graduation rates, etc. Data collection is only one part of assessment.
Data Driving:
Once data are collected and analyzed, departments or units should decide how to “drive” the data, i.e., make “data-driven decisions” about what to do to close gaps or sustain successes. Examples of data driving might include trying a new teaching method, updating a rubric, revising learning outcomes, offering professional development that addresses a particular issue, revising a course or program, aligning artifacts and outcomes, or improving the assessment process.
Direct Measure:
A direct measure requires students or participants to demonstrate their achievement of a specific learning outcome or goal. At the program level, direct measures might include capstone projects, portfolios, lab reports, presentations, performances; pass rates or scores on licensure, certification, or subject-area tests; student publications or conference presentations. Course-level direct measures might include course and homework assignments, exams and quizzes, research papers and reports, presentations, case study analysis, or course grades—but only if those course grades are based solely on explicit criteria clearly related to specific learning outcomes (embedded assessment).
Documentation:
If it’s not written down, including details like date, time, and method, it didn’t happen! Documentation provides a blueprint for future assessment, demonstrates what has and has not worked in the past, assists with onboarding new people, makes it easier to follow trends and look at historical data, and helps us avoid gaps in activities or consistency when there is turnover or institutional change. Finally, when our accreditor, the Middle States Commission on Higher Education, evaluates our self-study in 2031, they will ask for an evidence repository that consists of documentation for all the assessment (and equity) work we’ve been doing for the last eight years. Clearly labeled and archived assessment reporting helps to make this possible and can minimize stress.
Essential Skills:
Since Fall 2021, the College’s general education requirements have been grounded in six Essential Skills, which are foundational skills that students learn in their general education courses and develop in their program coursework at the College. These skills, based on Middle States Standard III requirements for general education, are Cultural Analysis and Interpretation (CAI), Oral Communication/Creative Expression (OCCE), Quantitative Reasoning (QURE), Scientific Reasoning (SCRE), Technological Competency (TEC), and Writing, Research, and Information Literacy (WRI).
Equity:
The sixth pillar of the College’s Strategic Plan provides this definition: “Equity is the process of ensuring that processes and programs are impartial and fair. To ensure equitable outcomes, the College will work towards ensuring equal opportunities and access for all persons in the College.” This institutional definition of equity requires large-scale tracking of demographic data regarding retention and time to completion, as well as a critical examination of the barriers faced by students at Community College of Philadelphia, 60% of whom identify as Black or Hispanic. Some of the data used for this kind of tracking can be found on the Institutional Research dashboards, which are available to all.
Follow-Up:
In assessment, this refers to all the things that happen after data collection, i.e., data analysis, data driving, and reassessment. Follow-up ensures that identified challenges, gaps, and issues are addressed using assessment results.
Formative Assessment:
Describes assessment that occurs mid-cycle and gauges the achievement of learning outcomes or goals attained in a single assignment or group of related assignments. In academic assessment, formative assessment describes how instructors support student learning with ongoing feedback on lower-stakes assignments, e.g., outlines, short-answer quizzes, etc.
General Education:
All degree students, regardless of program, must complete courses that meet the College's general education requirements to gain a breadth of experience outside as well as within their academic field and build interdisciplinary skills essential to academic, career, and personal development and success. Since Fall 2021, those requirements have been grounded in six Essential Skills.
Goals:
A goal is a broad statement about what the College, division, project, or unit would like to achieve, as opposed to objectives or outcomes, which are more discrete and measurable. Goals are used in strategic planning, equity assessment, project assessment, and AES assessment.
Indirect Measure:
An indirect measure relies on reflection on or perceptions of students’ or participants’ achievement of skills, behaviors, knowledge, outcomes or goals. At the program level, indirect measures might include surveys, focus groups, interviews, registration or enrollment information, job placement information, or transfer rates. At the course level, indirect measures might include course evaluations, assessments of time spent on a particular activity, employer/internship supervisor ratings, or course grades not based solely on explicit criteria clearly related to specific learning outcomes (e.g., course grades that reflect things like attendance, participation, or timeliness in addition to student learning).
Learning Outcome:
The College uses learning outcomes at the course and program levels, called course learning outcomes (CLOs) and program learning outcomes (PLOs). Learning outcomes define what a student should know or be able to do upon completion of the course or program. CLOs should be clear, descriptive, and student-friendly; should include action verbs like “describe” or “explain” rather than vague phrases like “demonstrate knowledge” or “understand”; and should reflect the level of learning that they describe. Learning outcomes may also appear in events related to professional development. Learning outcomes are different from support outcomes.
Measures:
A measure, also called the means or method of assessment, is the information used to assess a specific outcome. Measures should be aligned with learning outcomes prior to data collection. Examples of measures are practically endless: lab reports, clinical evaluations, performances, writing assignments, event feedback, survey responses, survey response rates, speeches, selected questions tied to CLOs and PLOs in quizzes, tests, and exams, discussion boards, statistics, focus groups, homework, capstone projects, case study analyses, presentations, academic plans, peer critiques, training sessions, evaluation forms, participation/population participation tracking, etc.
Meta-Assessment:
This refers to assessment of the assessment process and happens in many areas, including reflections on course and program outcome assessment, the APR process, divisional assessment, general education assessment reflections and reporting, the institutional effectiveness plan, and the Middle States Self-Study.
Mission Statement:
A mission statement provides context for why the College, its divisions, departments, and units exist, how they serve students, and how the College intends to support students in accomplishing their educational goals. The mission statement is a formal summary of the core contributions and services of the division or unit to the College’s efforts at student success. The College has its own mission statement, and the strategic plan, divisional plans, and new degree program proposals must align with the College mission. Each AES unit has its own mission statement that must also align with the College mission statement.
Objective:
Objectives are unique to AES unit assessment, derived from the goals to which they align, and focused on how exactly the goal is accomplished by the unit. They communicate the delivery of services, processes, activities, or functions to students, faculty, or staff, and include an indication of desirable value, quality, or evaluation.
Outcome:
An “outcome” describes what happens to students or participants as a result of teaching, learning, or the work of an AES unit. At the College, we use learning outcomes and support outcomes.
Persistence:
A measure of the proportion of a cohort of students who remain at the College from term to term (semester to semester).
Racial Equity:
In higher education, racial equity is the process of ensuring that students from historically underserved racial groups have access to the resources and opportunities they need to be successful in college and that existing policies and programs do not unfairly impact students in these groups. Racial equity is a critical focus at the College, as 60% of students at Community College of Philadelphia identify as Black or Hispanic. Demographic data can be found on the Institutional Research dashboards, which are available to all.
Reassessment:
Reassessment is part of the process of closing the loop: using the results of assessment to drive the data and make changes for continuous improvement, then reassessing the same outcome after the change takes place to gauge its effectiveness.
Retention:
A measure of the proportion of a cohort of students who remain at the College from year to year.
Strategic Plan:
A strategic plan is a dynamic document that helps the College to define its tactical priorities and directions over an eight-year period. The Strategic Plan typically consists of broad goals, pillars that demonstrate areas of focus for those goals, and strategic directions that consist of specific actions that will result in the achievement of the goals.
Summative Assessment:
Assessment that occurs at the end of a cycle, e.g., a semester or fiscal year, that evaluates the achievement of learning outcomes or goals attained throughout the cycle. In academic assessment, this occurs when an instructor evaluates student learning in a more comprehensive, higher-stakes assignment, usually at the end of the semester, e.g., a final exam, final project, or final paper.
Support Outcome:
In AES assessment, the terms “objective,” “outcome,” and “support outcome” are interchangeable. Objectives are unique to the AES unit, derived from the goals to which they align, and focused on how exactly the goal is accomplished by the unit. They communicate the delivery of services, processes, activities, or functions to students, faculty, or staff, and include an indication of desirable value, quality, or evaluation.
Trends:
A trend is a pattern observable over time. Examples might include three semesters’ worth of collected data showing that students are not proficient in a particular outcome, or enrollment in a degree program falling or rising over the course of several semesters or years.

Assessment Acronyms

AA: Associate of Arts
AAS: Associate of Applied Science
AES: Administrative and Educational Support
APR: Academic Program Review
AS: Associate of Science
ASSC: Academic and Student Success Council
CATF: College Assessment Task Force
CLO: Course Learning Outcomes (formerly student learning outcomes or SLOs)
DCAF: Divisional Curriculum Assessment Facilitator
DEI: Diversity, Equity, and Inclusion
FCTL: Faculty Center for Teaching and Learning
GEES: General Education Essential Skills
GEMs: General Education Measures
IEP: Institutional Effectiveness Plan
IR: Institutional Research
MSCHE: Middle States Commission on Higher Education (also Middle States)
OAE: Office of Assessment and Evaluation
OIE: Office of Institutional Effectiveness
PD: Professional Development
PLO: Program Learning Outcomes
SOC: Student Outcomes Committee (of the Board of Trustees)

Terms Crosswalk

Given all of the varied assessment activities and assorted stakeholders involved in assessment at Community College of Philadelphia, it’s no wonder that the terms used to describe components of assessment vary as well. Below is a list of terms that may frequently be used as synonyms, near-synonyms, or even false synonyms in different contexts. This list is not intended to be perfectly comprehensive or exhaustive, but to help clarify some of the ways we communicate about assessment both at the College and outside of it.

Objective / Outcome / Support Outcome / Student (Course, Program) Learning Outcome
  • In assessment plans that use pillars/goals, these terms are all used to describe the second level of organization.
  • These all describe concrete statements of specific areas of knowledge or delivery of function.
  • An "outcome" describes what happens to students or participants as a result of teaching, learning, or the work of an AES unit. At the College, we use learning outcomes (in academic assessment) and support outcomes (in administrative/AES assessment).
  • At the College, "Student Learning Outcome" (SLO) is an umbrella term that has largely been replaced in documentation and usage by the more specific terms "Course Learning Outcome (CLO)" and "Program Learning Outcome (PLO)."
Measure / Assignment / Tool / Data Collection / Method / Means of Assessment / Assessment / Rubric
  • A measure, also called the means or method of assessment, is the information used to assess a specific outcome. Measures should be aligned with learning outcomes prior to data collection. Examples of measures are practically endless: lab reports, clinical evaluations, performances, writing assignments, event feedback, survey responses, survey response rates, speeches, selected questions tied to CLOs and PLOs in quizzes, tests, and exams, discussion boards, statistics, focus groups, homework, capstone projects, case study analyses, presentations, academic plans, peer critiques, training sessions, evaluation forms, participation/population participation tracking, etc.
Benchmark / Target / Goal
  • An agreed-upon standard of performance, e.g., "80% of research papers will show proficiency in CLO #1" or "100 people will attend a professional development session." Benchmarks should be aligned directly with the relevant measure and should be ambitious but achievable. Benchmarks are foundational to assessment, but assessment goes beyond benchmarking to analyze success and gaps in achievement or feedback regarding how well the event met its goals.
  • The term "goal" is often used in everyday parlance to refer to the kind of specific target or aim that, in assessment, is more accurately termed a "benchmark."
Findings / Results / Data / Scores / Grades / Ratings
  • The information gathered in the course of the activity or using the assessment method.
  • Data are a collection of facts, figures, and/or observations (both numerical and non-numerical) that represent information. Assessment data can come in the form of direct measures, such as scores on student artifacts; academic performance measures, such as retention, class standing, and time to degree completion; reflections, surveys, focus groups, and other indirect measures.
Analysis / Data Analysis / Trend Analysis / Discussion / Context
  • The process of making meaning from data. Analysis is often concerned with identifying areas where we are exceeding, meeting, or falling short of benchmarks and goals, as well as taking note of trends. This may be accomplished by reviewing collected data in a department meeting, with an advisory board, or as part of a unit meeting, or by collecting reflections or feedback on a process or event. Documentation of how and when data analysis occurs is important for consistency and reporting.
Action Plan / Follow-up / Data Driving / Intervention / Strategies / Closing the Loop
  • Once data are collected and analyzed, departments or units should decide how to "drive" the data, i.e., make "data-driven decisions" about what to do to close gaps or sustain successes. Examples of data driving might include trying a new teaching method, updating a rubric, revising learning outcomes, offering professional development that addresses a particular issue, revising a course or program, aligning artifacts and outcomes, or improving the assessment process.
  • "Closing the Loop" describes the process of analyzing assessment results to see where improvements might be made and then taking actions to make those improvements.