Choosing an Implementation Partner for Corequisite Reform: Turn Policy Decisions Into Predictable Results at Scale

How to select an implementation partner for corequisite reform that drives measurable impact and ROI. Learn to turn policy decisions into predictable results at scale.

Executive Summary

An implementation partner for corequisite reform bridges the gap between policy decisions and daily operational practice, transforming mandates into cross-functional execution that produces measurable student success at scale. Almy Education’s “done with you” methodology creates stakeholder alignment and momentum through structured, hands-on implementation support with cross-functional teams. The focus is on systems and operating routines that make outcomes predictable at scale.

Who needs one: Coreq 1.0 institutions launching or scaling corequisite models for the first time, and Coreq 2.0 institutions already at scale but seeing plateaued or inconsistent gateway course completion rates across terms.

Key evaluation criteria: Proof of execution through tangible artifacts (heat maps, decision logs, operationalization audits, curriculum maps that integrate support); cross-functional expertise spanning placement, advising, scheduling, communications, and faculty coordination; and structured week-to-week operating routines, not frameworks alone.

Expected outcomes: Within one semester, institutions should see baseline metrics established, operational gaps closed, and leading indicators of student outcomes moving in the right direction, supporting narrowed equity gaps and improved student engagement.

How to evaluate partners: Focus on deliverables, cadence, and accountability measures. Ask for evidence of decision-to-practice translation at comparable community colleges and public colleges, emphasizing experience turning developmental education reform decisions into operational practice and measurable gateway course results.

Red flags to avoid: Casemaking-only approaches, “advise and leave” models, proposals lacking institution-specific diagnostics, and partners who stay at a high level rather than addressing the specific systems and processes that get students into college-level courses effectively.

What an Implementation Partner Does for Corequisite Reform (and What They Don’t)

An implementation partner turns executive decisions into an end-to-end student experience that works before, during, and after the gateway course. That requires coordinated execution across placement, advising, scheduling, registration, communications, curriculum, and support structures.

What implementation partners do:

  • Cross-functional execution: Work across placement, scheduling, advising, registration, web communications, and faculty coordination simultaneously, not in silos
  • Operational translation: Convert policy decisions into specific intake scripts, registration defaults, web language, and staff routines that support college-level learning and college-level coursework
  • Structured project management: Run operating routines with clear owners, timelines, and dependencies, plus regular executive readouts to educational leadership
  • Continuous improvement: Establish iteration protocols and leading indicators so corequisite courses improve term over term and help close achievement gaps
  • Artifact creation: Deliver tangible outputs, not just recommendations: heat maps of offerings, operationalization audits, decision logs, curriculum maps, and dashboards

What effective implementation partners don’t do:

  • One-off training sessions that expire when the consultant leaves
  • Generic recommendations without follow-through on execution
  • Framework delivery focusing on casemaking without operational translation
  • Classroom-only approaches that ignore the systems surrounding instruction and faculty buy-in
  • “Advise and leave” engagements that create strategic plans but no change in daily practice

The distinction matters because corequisite implementation requires institutional transformation across multiple functions. Community colleges and universities undertaking developmental education reform must coordinate placement, scheduling, intake, advising, registration defaults, classroom practices, and communications simultaneously to effectively support both students and faculty.

When community college students enrolled in corequisite support experience inconsistent messaging from advisors, registration systems that default to traditional remediation, or schedules that make concurrent enrollment impossible, pass rates and credit-hour completion suffer regardless of instructional quality.

The #1 Partner-Selection Mistake (Especially for Coreq 2.0 Institutions)

The most critical misstep in strategic partner selection: choosing a collaborator that delivers surface-level solutions in isolated areas rather than comprehensive transformation across the complete student journey from pre-enrollment through course completion.

This mistake is particularly costly for Coreq 2.0 institutions already operating corequisite models at scale but seeing inconsistent results. These institutions often hire partners who specialize in high-level casemaking, faculty professional development, or tutoring support, but miss the cross-functional coordination that determines whether students and faculty experience consistent support and whether students access college-level content effectively.

Why casemaking-only approaches fail at scale:

Casemaking consultants deliver information, examples, and research. They may run convenings, facilitate stakeholder discussions, and produce comprehensive reports. But when the engagement ends, institutions face the same gap: decisions may or may not exist, and daily practice hasn’t changed. The move from talking to doing hasn’t occurred.

Consider placement methods. A casemaking partner might recommend multiple measures assessment as a concept to place students into corequisite courses more accurately. But that recommendation requires operational translation:

  • Placement policies and protocols must be updated to incorporate multiple measures
  • Intake scripts must change so staff explain the new placement approach
  • Web language must be updated so incoming students understand what to expect
  • Registration defaults must be reconfigured so the system routes students correctly
  • Advising workflows must be revised so advisors reinforce (not contradict) the new approach
  • Staff training must happen and stick so the new process persists across terms
  • Faculty training must occur so that messaging is consistent across campus

Example: If you decide that high school GPA is the default placement method, operationalizing that decision requires more than a memo. Advising teams need new scripts, intake steps need updating, and every student-facing web page needs to remove language that sends students to placement testing “just in case.” Registration pathways must reflect the new default so students are routed correctly without manual workarounds. If those changes do not happen, the policy exists but students still experience the old system.
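
To illustrate what “the system routes students correctly” means in practice, here is a minimal sketch of a default-placement routing rule as it might be encoded in a registration or advising tool. The cut points, field names, and course labels are hypothetical assumptions for illustration only; your placement policy and student information system define the real ones.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Student:
    hs_gpa: Optional[float]    # high school GPA, when available
    test_score: Optional[int]  # secondary measure (e.g., a placement test), if any

def route_gateway_math(student: Student) -> str:
    """Return a hypothetical registration routing decision for gateway math."""
    # Default measure: high school GPA (cut points are illustrative, not policy).
    if student.hs_gpa is not None:
        if student.hs_gpa >= 3.0:
            return "GATEWAY_MATH"             # college-level course, no corequisite
        return "GATEWAY_MATH_WITH_COREQ"      # college-level course plus corequisite support
    # Secondary measure is used only when GPA is unavailable (no testing "just in case").
    if student.test_score is not None and student.test_score >= 250:
        return "GATEWAY_MATH"
    # The corequisite pairing, not standalone remediation, is the default route.
    return "GATEWAY_MATH_WITH_COREQ"

# A 2.7 GPA student is routed into the corequisite pairing automatically,
# with no placement test and no manual workaround.
print(route_gateway_math(Student(hs_gpa=2.7, test_score=None)))
```

The point is not the code itself: the same rule has to show up consistently in registration defaults, advisor scripts, and student-facing web language, or the policy exists only on paper.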

Without that translation, the decision sits in a report while students continue experiencing the old process, perpetuating equity gaps and inconsistent student outcomes.

The implementation gap is the missing bridge between talking and doing. It’s the distance between “we’ve decided to do corequisites” and “corequisites are working predictably across every section, cohort, and student population in our system.”

Partial solutions create inconsistent student experiences. One advisor explains corequisite support accurately; another steers students toward prerequisite alternatives. One registration pathway defaults to concurrent enrollment; another requires manual intervention. One section’s faculty members coordinate effectively; others operate in isolation.

These inconsistencies compound across terms, creating the plateau effect that frustrates Coreq 2.0 institutions and hinders closing achievement gaps.

Coreq 1.0 vs Coreq 2.0: How Needs and Scope Change

Institutions pursuing corequisite implementation fall into two categories with different needs:

Coreq 1.0 Institutions: Early-Stage Implementation

These institutions are launching corequisite models for the first time or scaling from limited pilots to broader adoption, moving toward full-scale implementation.

What Coreq 1.0 needs:

  • Enablement and build: constructing the operational infrastructure for corequisites from scratch
  • Schedule design: creating course blocks that make concurrent enrollment possible
  • Placement alignment: establishing cut-scores and routing logic for incoming students using multiple measures assessment
  • Advisor training: equipping staff to explain and champion the corequisite approach and improve student engagement
  • Governance setup: clarifying decision rights for who decides what, by when

Coreq 2.0 Institutions: At Scale but Inconsistent

These institutions already operate corequisite courses across gateway math and English (and sometimes biology or chemistry), but results have plateaued or vary widely by section, term, or student population. Typically, these institutions achieved initial success but haven’t been able to sustain it.

What Coreq 2.0 needs:

  • Reset and operationalization: diagnosing where the system broke down and rebuilding operational routines
  • Legacy option removal: auditing the schedule to identify traditional models running in parallel (institutions often “added but didn’t delete”)
  • Iteration protocols: establishing feedback loops so sections that struggle get targeted assistance
  • Consistency mechanisms: ensuring student experience doesn’t vary by which advisor they see or which section they enroll in
  • Measurement infrastructure: building dashboards that track leading indicators, not just end-of-term outcomes, to support data-driven educational leadership

Early diagnostic importance: For both Coreq 1.0 and 2.0 institutions, the first step is examining what’s actually being offered and where students get stuck. A schedule audit often reveals that legacy standalone developmental course options persist alongside corequisite offerings, creating parallel pathways that undermine reform and increase challenges for students and faculty.

Tailoring partner selection to institutional context matters. A partner effective at Coreq 1.0 build may lack the diagnostic depth for Coreq 2.0 reset. A partner experienced in Coreq 2.0 iteration may over-engineer solutions for institutions just beginning.

Comparison Table: Advice Partner vs Implementation Partner

Dimension | Advice/Casemaking Partner | Implementation Partner
Primary Deliverable | High-level recommendations, frameworks, reports | Operational artifacts: heat maps, blueprints, decision logs, dashboards
Engagement Cadence | Periodic check-ins, convenings, milestone meetings | Monthly operating routines with specific tasks, owners, and deadlines
Artifact Examples | Best-practice guides, research summaries, stakeholder presentations | Operationalization audits, registration default configurations, intake scripts, course heat maps
Accountability Structure | Broad recommendations handed off; institution responsible for application and execution | Shared accountability with clear decision rights and escalation paths
Measurement Approach | End-of-engagement report on activities | Leading indicators tracked; monthly executive readouts on student outcomes
Sustainment Model | Knowledge transfer at end of engagement | Embedded routines, decision logs, and iteration protocols that persist after the engagement ends
Cross-Functional Scope | Often focused on one function (curriculum, PD, advising) | Explicitly spans placement, scheduling, advising, registration, communications, curriculum, and governance
Faculty Engagement | Professional development sessions or workshops | Ongoing coordination routines between gateway instructors and corequisite support instructors

The difference shows up in results. Casemaking partners operate at a high level with broad suggestions. Implementation partners help institutions not only make specific decisions but also execute on those choices in daily practice across every function that touches the student experience.

A provost or VPAA should feel “held” during implementation. The work should move on the timeline you expect, with the right stakeholders included, and with fewer surprises. If leadership feels like they are carrying the daily labor of worry, the support model is not truly implementation.

What Should Be Included in a Scope of Work for Corequisite Implementation Support

A scope of work (SOW) for corequisite implementation support must specify deliverables, cadence, artifacts, accountability, and timelines. Generic scopes produce generic results. A strong implementation partner acts like a house builder. They help you make decisions in the right order, with the right dependencies in place, so the work does not collapse later. The value is not only what you decide, but how those decisions are sequenced and operationalized across the institution.

Essential Deliverables:

  • Heat map of offerings: Visual representation of where corequisite courses exist, where gaps remain, and where students get stuck in the registration process
  • Blueprint/project plan: Tasks, owners, timelines, and dependencies for the full implementation arc
  • Decision log and decision rights: Documentation of who decides what, by when, with escalation paths for unresolved issues
  • Leading indicators dashboard: Defined metrics for student success, tracked regularly, with clear thresholds for intervention and focus on closing equity gaps

Cadence Requirements:

  • Monthly operating routines: Standing meetings with task owners to review progress, surface blockers, and assign action items
  • Regular executive readouts: Summaries for provost/VPAA on progress against milestones, student outcomes, and upcoming decisions
  • Term-over-term reviews: Analysis of outcomes by section and student population to identify iteration opportunities and system-level fixes.

Artifact Specifications:

Artifacts must be tangible and usable, not just slide decks that sit in SharePoint. Examples:

  • Registration default configuration documentation showing exactly how the system routes students
  • Intake script templates that advisors can use in conversations with underprepared students
  • Advisor quick-reference guides explaining corequisite course structure and how to address common questions
  • Dashboard views showing gateway course completion rates by cohort, section, and term
  • Curriculum maps showing how and when corequisite support is integrated into college-level content

Accountability Measures:

  • Named task owners for every deliverable
  • Clear decision deadlines with rationale included
  • Escalation protocols when cross-functional coordination breaks down
  • Explicit documentation of dependencies between departments

Timeline Expectations:

Implementation engagements should specify first-year milestones with concrete deliverables at each stage. Vague timelines (“Phase 1: Assessment; Phase 2: Implementation”) signal framework-only thinking.

What the First Year Should Look Like

The first year with an implementation partner should produce visible progress, not just planning.

Implementation work is constrained by the academic calendar. Institutions are “on the clock” during the term, and it is hard to convene the right people outside those windows. A real implementation model plans around these constraints and uses cadence and clear deliverables to keep momentum when time is limited.

Month 1: Diagnostic and Alignment

  • Stakeholder mapping: Identify who must be at the table across placement, advising, scheduling, registration, communications, and instruction
  • Baseline establishment: Document current gateway course completion rates, placement processes, and student flows
  • Schedule audit: Examine what’s actually being offered and identify legacy options running in parallel with corequisite courses
  • Quick wins identification: Surface immediate fixes (removing conflicting web language, correcting registration defaults) that demonstrate momentum

Deliverables by Day 30:

  • Heat map of current offerings and student routing
  • Baseline metrics document
  • Stakeholder map with roles and decision rights
  • Quick wins implementation plan

Months 2-4: Operationalization and Process Redesign

  • Gap closure: Address issues identified in the diagnostic, such as updating intake messaging, reconfiguring registration defaults, and revising advisor scripts
  • Cross-functional coordination: Establish routines between departments that must work together (e.g., advising and scheduling, faculty and tutoring)
  • Decision log setup: Begin documenting decisions, decision-makers, and rationale for future reference
  • Pilot iteration: If sections are already running, identify underperforming sections and implement targeted interventions

Deliverables by end of semester 1:

  • Updated intake scripts and advisor guides
  • Registration default reconfiguration documentation
  • Regular operating routine established with standing meeting schedule
  • Decision log with first entries

Months 4-12: Routine Establishment and Dashboard Deployment

  • Dashboard launch: Deploy leading indicators dashboard tracking enrollment, registration completion, early attendance, and support utilization
  • Iteration protocols: Establish feedback loops so section-level issues get surfaced and addressed within the term
  • Executive readout cadence: Begin regular readouts for institutional leadership
  • Sustainment planning: Document routines and artifacts so the work persists beyond the engagement

Deliverables by end of year 1:

  • Leading indicators dashboard operational
  • First monthly executive readout delivered
  • Iteration protocol documented and in use
  • Sustainment plan drafted

Regular touchpoints throughout the year ensure blockers surface quickly. Monthly milestone reviews keep leadership informed and accountable.

Ready to see what the first 30 days could look like at your institution?

If you’re at scale but not getting the results you expected, or preparing to scale and want it to work the first time, schedule an implementation assessment to identify your highest-impact opportunities.

Non-Negotiable Partner Criteria

Use this checklist when evaluating partners for corequisite implementation support:

Cross-Functional Expertise:

  • Demonstrated experience across placement, advising, scheduling, registration, communications, curriculum, and instruction
  • Track record of coordinating multiple departments simultaneously
  • Understanding of shared governance dynamics in higher education and the needs of college instructors

Operational Artifacts:

  • Can provide examples of heat maps, decision logs, and dashboards from prior engagements
  • Delivers tangible outputs and specific recommendations
  • Creates documentation that persists beyond the engagement

Execution Model:

  • Structured operating routines with named task owners
  • Clear cadence: regular meetings, executive readouts, term-over-term reviews
  • Escalation protocols for cross-functional blockers

Institutional Experience:

  • Track record with both Coreq 1.0 (build) and Coreq 2.0 (reset/iteration) contexts
  • Experience at comparable institution types (community colleges, regional universities, large systems)
  • Understanding of state policy contexts and compliance requirements 

Measurement and Accountability:

  • Defines leading indicators, not just lagging outcomes
  • Establishes baselines and tracks progress regularly
  • Holds themselves accountable for results, not just activities

Sustainment Focus:

  • Plans for how routines persist after the engagement ends
  • Builds institutional capacity, not dependency
  • Documents decision rights and escalation paths for ongoing use

8 Buyer Questions Every Provost Should Ask

Before hiring an implementation partner, use these questions to evaluate execution capability:

  1. What artifacts will you deliver in the first 30 days, and who will own each one?
    • Look for: specific deliverables (heat maps, decision logs) with named owners
  2. How do you translate a policy decision into daily operational practice?
    • Look for: concrete examples involving intake scripts, registration defaults, and staff routines
  3. What does your operating routine look like, and who participates?
    • Look for: standing meetings with clear agendas and cross-functional representation
  4. Can you show me examples of decision logs from prior engagements?
    • Look for: actual artifacts, not descriptions of what they would create
  5. How do you handle situations where departments disagree or block progress?
    • Look for: escalation protocols and experience navigating shared governance
  6. What leading indicators will you track, and how often?
    • Look for: regular tracking of enrollment, registration completion, and early-term signals
  7. What happens when the engagement ends? How do routines persist?
    • Look for: sustainment planning, documentation, and institutional capacity building
  8. Can you describe a Coreq 2.0 situation where results had plateaued and how you addressed it?
    • Look for: diagnostic depth, root cause identification, and iterative improvement

Partners who struggle to answer these questions with specifics are likely operating at the casemaking level rather than the implementation level.

Red Flags When Selecting a Partner

Watch for these warning signs during partner evaluation:

Casemaking-Only Indicators:

  • Proposals emphasize “best practices” and “research-based models” without operational specifics
  • Deliverables are reports, presentations, and broad recommendations rather than artifacts and specific suggestions
  • No clear operating routine described
  • Engagement ends with “knowledge transfer” rather than sustained routines
  • They say they can share research or facilitate conversation, but they cannot give concrete suggestions or recommendations

Lack of Cross-Functional Expertise:

  • Focus on curriculum or instruction without addressing placement, scheduling, or advising
  • No experience coordinating multiple departments simultaneously
  • Unfamiliarity with registration systems, intake processes, or web communications
  • Proposals that treat developmental education reform as a classroom problem only
  • Proposals that neglect classroom-level work in favor of systemic work only

Generic Approach:

  • Proposals lack institution-specific diagnostics
  • Same scope regardless of Coreq 1.0 vs. Coreq 2.0 context
  • No mention of schedule audits or legacy option analysis
  • Timeline lacks 1-month, 1-semester, and 1-year milestones with concrete deliverables

Accountability Gaps:

  • No named task owners or decision rights documentation
  • Vague measurement approach (“we’ll track student success”)
  • No escalation protocols for cross-functional blockers
  • Focus on activities delivered rather than outcomes achieved

Pedagogy-First Framing:

  • Primary emphasis on instructional quality or faculty professional development
  • Limited attention to the systems that get students into the right course with the right support
  • Language that implies faculty members are the problem rather than part of the solution
  • No acknowledgment of the role of advising, scheduling, and registration in student outcomes

Partners exhibiting these flags may produce good strategy but are unlikely to close the implementation gap that separates policy from practice.

Get Started: Schedule Your Assessment

If your institution is operating corequisite courses at scale without consistent results, or is preparing to scale and wants it to work the first time, schedule an implementation assessment to identify your highest-impact opportunities.

What to expect in the initial consultation:

  • Discussion of your current state: Coreq 1.0 (early-stage) or Coreq 2.0 (at scale, inconsistent results)
  • Preliminary identification of cross-functional gaps
  • Overview of what a one-year engagement could accomplish
  • Candid assessment of fit and readiness

Use the criteria checklist above in your next partner conversation—whether with us or anyone else. The questions matter more than who’s asking them.

For institutions pursuing gateway course redesign alongside corequisite implementation, the diagnostic can address both simultaneously.

Gateway Success Starts Here

Almy Education is the only gateway course consultancy that combines proven methodology with hands-on “done with you” implementation support, led by current and former faculty who understand both systemic change management and classroom realities.

Get Started