Appendices

Appendix A

Structure and Process of Strategic Planning 2010

Structure

The structure of this planning effort involved two types of groups: First, there was an eight-person faculty group at the center of the strategic planning effort, the Strategic Planning Advisory Council (SPAC). That council was charged with developing and writing the strategic plan. All ideas and input were processed through and by that group. Second, there were four working groups consisting of a total of 54 faculty, students, and staff, each focused on one of the following areas: Education; Research, Scholarship, and Creativity; Public Engagement; and Organizational Stewardship.

The SPAC provided questions and issues for these working groups (WGs), and members of the WGs "drilled down" more deeply into the goal areas of the plan and provided detailed input and suggestions to the SPAC.

Membership of Strategic Planning Advisory Council

  • Lance Collins, Mechanical and Aerospace Engineering
  • Jonathan Culler, English
  • Sandra Greene, History
  • Martha Haynes, Astronomy
  • Katherine Hajjar, Cell and Developmental Biology, Weill Cornell Medical College
  • Edward Lawler, Organizational Behavior, ILR School (Advisory Council Chair)
  • Susan McCouch, Plant Breeding and Genetics
  • Michael Waldman, Economics, Johnson Graduate School of Management

Steps in Planning Process

The major steps in the planning process were as follows:

  1. The SPAC developed a set of questions and issues that the WGs analyzed in more depth.
  2. The WGs provided reports to the SPAC from which the SPAC developed draft objectives and actions for several goal areas.
  3. The SPAC sent these drafts to the WGs, the president, provost, deans, vice presidents, and vice provosts for comment and suggestions.
  4. A draft plan outline was made available to the university community for comment on January 25, 2010. February and early March were devoted to gathering feedback across campus through meetings in every college, with student groups, and with staff.
  5. During that period, the SPAC set priorities among the objectives in each section, formulated seven strategic initiatives for 2010-2015, and developed metrics for assessing progress.
  6. On March 11, 2010, a complete draft of the plan outline was made available to the university community for further comments and suggestions.
  7. On April 7, 2010, the trustees spent half a day at a retreat discussing the plan.
  8. Breakout groups of trustees, based on the goal areas of the plan, provided valuable input to the SPAC.
  9. From April to May, the SPAC drafted the final version of the Strategic Plan.

Appendix B

Open Doors, Open Hearts, and Open Minds
http://www.cornell.edu/diversity/history/statement.cfm

Open Doors

"I would found an institution where any person can find instruction in any study." This statement, made by Ezra Cornell in 1865, proclaims Cornell University's enduring commitment to inclusion and opportunity, which is rooted in the shared democratic values envisioned by its founders. We honor this legacy of diversity and inclusion and welcome all individuals, including those from groups that have been historically marginalized and previously excluded from equal access to opportunity.

Open Hearts

Cornell's mission is to foster personal discovery and growth, nurture scholarship and creativity across a broad range of common knowledge, and affirm the value to individuals and society of the cultivation of the human mind and spirit. Our legacy is reflected in the diverse composition of our community, the breadth of our curriculum, the strength of our public service, and the depth of our commitment to freedom, equity, and reason. Each member of the Cornell community has a responsibility to honor this legacy and to support a more diverse and inclusive campus in which to work, study, teach, research, and serve.

Open Minds

Free expression is essential to this mission, and provocative ideas lawfully presented are an expected result. An enlightened academic community, however, connects freedom with responsibility. Cornell stands for civil discourse, reasoned thought, sustained discussion, and constructive engagement without degrading, abusing, harassing, or silencing others. Cornell is committed to act responsibly and forthrightly to maintain an environment that opens doors, opens hearts, and opens minds.

Appendix C

Student Learning Outcomes
http://www.cornell.edu/provost/assessment/

Purpose

In fall 2009 the provost and vice provost for undergraduate education formed a Core Assessment Committee to develop mechanisms for assessing the outcomes of educational programs, in particular the impact on students. To accomplish this, the committee pulled together "assessment agents" designated by each college. The educational goals and competencies that were first developed in each college became the basis for the competencies defined below.

Draft Core Competencies for Cornell Students

  1. Courses of study at Cornell should generate proficiency in the following core academic competencies:
    1. Disciplinary Knowledge: demonstrate a systematic or coherent understanding of an academic field of study.
    2. Critical Thinking: apply analytic thought to a body of knowledge; evaluate arguments, identifying relevant assumptions or implications; formulate coherent arguments.
    3. Communication Skills: express ideas clearly in writing; speak articulately; communicate with others using media as appropriate; work effectively with others.
    4. Scientific and Quantitative Reasoning: demonstrate the ability to understand cause-and-effect relationships; define problems; use symbolic thought; apply scientific principles; solve problems with no single correct answer.
    5. Self-Directed Learning: work independently; identify appropriate resources; take initiative; manage a project through to completion.
    6. Information Literacy: access, evaluate, and use a variety of relevant information sources.
    7. Engagement in the Process of Discovery or Creation: demonstrate the ability to work productively in a laboratory setting, studio, or field environment.
  2. In addition, the Cornell environment strives to foster collegiality, civility, and responsible stewardship. Through academic studies and broader experiences on and off campus, Cornell graduates should attain proficiency in the following:
    1. Multicultural Competence: have knowledge of the values and beliefs of multiple cultures; effectively engage in a multicultural society; interact respectfully with diverse others; develop a global perspective.
    2. Moral and Ethical Awareness: embrace moral/ethical values in conducting one's life; formulate a position/argument about an ethical issue from multiple perspectives; use ethical practices in all work.
    3. Self-Management: care for one's self responsibly; demonstrate awareness of one's self in relation to others.
    4. Community Engagement: demonstrate responsible behavior; engage in the intellectual life of the university outside the classroom; participate in community and civic affairs.

Assessment Implementation Plan and Benchmarks

Stage 1 - Initial Implementation

  • Each college establishes an assessment process (utilizing standing committees, e.g., curriculum committees or educational policy committees, or setting up a new committee).
  • Each college surveys already-existing assessment activities (including those generated by external review requirements and those embedded in ongoing programs and courses), in order to incorporate these into the college process.
  • Each college generates an educational goals/outcomes statement for the college.
  • These statements are added to a specifically designated assessment site on the college web site.
  • Each college targets 2-3 majors or programs for which an assessment plan will be generated and incorporated into curricular materials. These should be stable programs, ideally representing a range of fields/areas.
  • For each major/program, 2-3 program goals should be provided and learning outcomes should be collected, using both direct and indirect measures (at least 2 measures per major/program).
  • Statements of these goals/outcomes should be posted on the college web site and on other sites (to be determined).
  • These activities of the colleges are coordinated and facilitated by the Core Assessment Committee.

Stage 2 - Full Implementation

  • College assessment sites are maintained and expanded to include department and program goals.
  • Each college continues the process of generating assessment plans for the majority of its majors/programs, with this process to be completed by the end of spring 2011.
  • Progress is reviewed; process is revised as necessary; further need for resources is reviewed; wider inclusion of the campus is reviewed.
  • The Core Assessment Committee becomes a standing committee, overseen by the provost's office, and given the task of coordinating and facilitating college assessment processes.

Stage 3 - Institutionalization

  • College committees with designated responsibility regularly review educational goals, according to a timetable.
  • Colleges maintain and update their assessment web sites regularly (according to a timetable designated within each college).
  • Departments/programs review their assessment outcomes (according to a regular process established internally).
  • New programs are included in the assessment process (according to the established process within each college).
  • The Core Assessment Committee facilitates the assessment process, identifies needs as they arise, addresses challenges, and provides an annual report to the provost.

Appendix D

Assessing Progress
2010-2015

Introduction

This section proposes a general approach and set of assumptions that should guide the development and use of metrics and qualitative indicators for assessing progress on plan goals, objectives, and strategic initiatives. It includes an initial list of core metrics as well as an elaboration of qualitative and quantitative indicators for each objective in the plan. The Strategic Planning Advisory Council offers this broad framework for using existing data and institutional capacities to assess and track progress on plan priorities and objectives. It is a starting point from which appropriate groups of administrators, faculty, and staff can further develop and refine the metrics and qualitative indicators.

  1. General Approach:
    1. Focus on university-wide (aggregated) metrics and qualitative indicators but include, where appropriate, unit-level ones.
    2. Organize metrics and qualitative indicators around goals and priorities.
    3. Include both quantitative and qualitative indicators.
    4. Have multiple indicators for each goal, given the complexity of the assessment, but as few as possible to enhance focus.
    5. Make the metrics flexible and adaptable to be useful across a wide range of academic areas or units.
    6. Minimize the staff time and additional staffing needed to implement the metrics.
    7. Insofar as possible, use existing sources of data and information.
  2. Assumptions
    1. It is exceedingly difficult to develop fully adequate measures of progress toward greater excellence in a research university.
    2. No particular metrics or qualitative indicators will be sufficient, but some sets or combinations of them will track progress significantly better than others, and far better than none.
    3. Metrics and qualitative indicators need to be developed in consultation with those people in the areas being measured (faculty, students, and staff). The metrics developed in this plan, therefore, must be considered a draft for further consultation and development.
    4. Metrics help to promote progress and improvement by holding the institution or units accountable for working toward goals or objectives, but they can also do harm if action focuses on moving particular numbers or indicators rather than on the larger purposes for which they were created.
    5. Any set of metrics will have unintended consequences that are important to analyze and anticipate.
    6. Any set of metrics or indicators should be viewed as a whole and be part of an overall qualitative assessment and judgment.
  3. Core Metrics

    With the above approach and assumptions as context, listed below is a provisional set of core metrics that are important to track over the next five years. These relate to strategic priorities and initiatives in previous sections of this plan. This list should be modified and developed further over time, with assumptions 4 and 5 above in mind.
    • Faculty and staff compensation
      Compare salaries and fringe benefits to peer institutions (faculty) or appropriate markets (staff).
    • Amount and nature of faculty hiring and retention
      Number of hires/year; rank distribution of hires; tracking of changes in faculty size; yearly assessment of faculty exits.
    • Age distribution of the faculty
      Percent of faculty aged 55 and above, and aged 60 and above (university-wide and by unit).
    • Diversity of faculty, students, and staff
      Percent women and underrepresented minorities. For faculty, comparison to specific goals of 20% or pipeline percent (whichever is higher). Set comparable goals for students and staff.
    • Number of top-ranked departments and programs
      Select appropriate NRC criteria; discipline-specific rankings; regular program reviews.
    • Sponsored research
      Total expenditures; expenditures per faculty member.
    • Student learning outcomes and health
      College assessments of learning outcomes based on core competencies (see Appendix C); data from Gannett on student mental and physical health.
    • Student access
      Cost of Cornell education by family income quintile.
    • Student surveys (undergraduate, graduate, and professional)
      Satisfaction with teaching; satisfaction with research opportunities and training; perceptions of international and public engagement opportunities; ease of taking courses across boundaries and administrative/bureaucratic barriers; perceptions of living-learning environment at Cornell.
    • Library rankings
      Compare to research university libraries, using appropriate measures from the ARL (Association of Research Libraries).
    • Faculty and staff surveys
      Conduct surveys on a regular schedule.
    • Ithaca-Weill interactions
      Joint research grants; collaborative teaching programs; and cross-usage of core facilities.
    • Stature of university as a whole
      Institutional reputation based on appropriate high-quality rankings of research universities (e.g., based on NRC data and criteria); use of select metrics from the above list (e.g., faculty quality, student quality, external research funding, and library rankings, including collections).

Indicators for University-Wide Excellence

  1. Institutional Reputation and Stature
    1. Select reputational rankings of research universities.
    2. Aggregate data on the university and academic units (e.g., indicators of faculty excellence, student quality, and excellence of research, scholarship, and creativity).
    3. Choose a set of metrics from forthcoming National Research Council evaluations to track institutional progress toward the university's aspiration.

Indicators for Faculty Excellence

  1. Faculty Recruiting and Size (Objective 1)
    1. Amount and nature of faculty hiring.
    2. Have there been pre-fills where future faculty quality warrants it?
    3. Funding for new faculty positions in strategically important departments or programs (from fund-raising, internal reallocation, or other sources).
    4. Has Cornell's dual-career program expanded the window for commitments beyond three years? Are there new elements that distinguish Cornell from its competition?
  2. Faculty Diversity (Objective 2)
    1. Comparison of proportion of women in departments to a 20% absolute standard or the pipeline level (whichever is higher), with the federal standard as a reference. This extends the CU ADVANCE standard across departments and colleges.
    2. Comparison of the proportion of underrepresented minorities to the appropriate pipeline and federal standards. Develop an absolute standard that serves the same purpose as the CU ADVANCE standard does for women in science.
    3. How many departments have reached the 20% or pipeline goals for women and underrepresented minorities? How many have reached the federal standards?
    4. Have funding mechanisms to promote diversity improved?
    5. Is there an efficient and effective monitoring mechanism in place for each hiring unit?
  3. Competitive Faculty and Staff Compensation (Objective 3)
    1. Define peer groups appropriate to given disciplines, fields, departments, professional schools, and staff job categories.
    2. Track faculty and staff salaries and fringe benefits against appropriate peer institutions.
  4. Faculty Retention (Objective 4)
    1. Have efforts to prevent exits by highly valued faculty increased?
    2. A qualitative assessment each year of cases in which highly valued faculty have left (to determine how responses can be improved).
    3. Have dual-career and work-life issues (e.g., childcare) been given appropriate attention in retention efforts?
    4. Track percent of faculty exits per year across faculty career stages (pre-tenure, tenure to mid-career, and more senior).
  5. Rewarding Outstanding Faculty (Objective 5)
    1. Have new forms of recognition and reward for outstanding performance among faculty (in teaching, research, and public engagement) been implemented in departments and colleges?
    2. Faculty awards and honors; leadership positions in field.
    3. Number of faculty in distinguished national academies (e.g., the National Academy of Sciences).
    4. Do all departments have systems for reviewing the teaching, research, and public engagement of faculty after tenure?
  6. Intellectual Environment (Objective 6)
    1. What new cross-college or cross-department interdisciplinary collaborations have formed? What is the potential impact of these?
    2. Has a sustainable "faculty club" been developed and/or other measures to promote informal conversation and dialogue?
    3. Have concerted efforts been made to improve or maintain a strong culture of collaboration in departments? Have these efforts had an impact?

Indicators for Excellence in Education

  1. Shared Educational Student Experience (Objective 3)
    1. Have academic or administrative barriers to students in one college taking courses in another been reduced?
    2. Have additional shared educational components for Cornell undergraduates (living-learning programs, courses, and so forth) that address core competencies been added? What impact have these had?
    3. Have course or credit hours out-of-college for Cornell undergraduates increased?
    4. Senior survey results on students' perceptions of how easy or difficult it is to take courses outside their department or college.
  2. Student Learning Outcomes
    1. Assessments by colleges and other units of how they contribute to students' acquisition of core competencies.
    2. Are mechanisms for measuring learning outcomes, indicated by these core competencies, in place? [See Appendix C.]
  3. Student Psychological Health and Well-Being (Objective 5)
    1. Staffing levels directed at quick, effective response to students experiencing excessive stress.
    2. Waiting or lag times between contact and appointment or intervention.
    3. Regularity of communications and outreach to encourage students to seek help and ensure that they know how to do so.
    4. Surveys of student psychological health and well-being.
    5. Programs to enhance faculty and staff knowledge of and capacity to detect students undergoing excessive stress and encourage them to seek help.
  4. International and Public Engagement Opportunities for Students (Objective 4 in Education and Objective 1 in Public Engagement Section)
    1. Student participation rates (for credit and noncredit).
    2. Are more programs offering public engagement opportunities to students under faculty supervision and leadership?
    3. Have costs to students for study abroad decreased? Have administrative obstacles been reduced for international and public engagement experiences?
    4. Have we examined and affirmed the quality of current international and public engagement options for students?
    5. Student survey data about these experiences.
  5. Culture in Support of Teaching (Objective 1)
    1. Develop Cornell benchmarks for rigorous assessment, by identifying good models within Cornell, and use these to compare departments' attention to and support for teaching excellence.
    2. Have forms or ways to recognize excellence in teaching increased?
    3. Student survey data on teaching environment.
    4. Per capita credit hours (or courses) taught by senior faculty (full professors).
  6. Supporting Pedagogical Innovation (Objective 2)
    1. Usage rate of technological support.
    2. Increase in number of teaching projects supported by the office of the provost.
    3. Has team teaching across colleges or disciplines increased?
  7. Undergraduate Student Quality and Diversity (Objective 6)
    1. Standard student quality measures for enrolled students.
    2. Competitive position of financial award packages.
    3. Acceptance rate, retention rate.
    4. Costs of a Cornell education by income quintile.
    5. Has the percent of underrepresented minorities in each entering class grown? Has it reached 20% or greater?
    6. Quality of URM educational experience: percent of URMs who are deans' scholars or whose GPA is in the upper quartile.
    7. Senior student survey data on minority experience and perceived educational benefits from a diverse community.
    8. Student survey data on perceptions of academic and personal dimensions of student life at Cornell.
  8. Graduate Student Diversity and Quality (Objective 7)
    1. Standard student quality measures.
    2. Competitive position of stipend and benefit levels.
    3. Graduate student surveys (overall satisfaction; perceptions of support for research and of opportunities to develop as teachers).
    4. Have graduate fields been reduced in number or consolidated?
    5. Do more fields have teaching components in their graduate programs?
    6. Have fellowships for entering graduate students increased?
    7. Quality of job placements (graduate and professional students).
    8. Has the percentage of women in graduate fields or professional programs reached the pipeline or 20% (whichever is higher)? Has the percentage of underrepresented minorities reached the pipeline, 20%, or the federal standard (whichever is higher)?

Indicators for Research, Scholarship, and Creativity

  1. Leadership Position/Department or Program Stature (Objectives 1 and 2)
    1. Have dimensions for comparing departments to peers been defined, and are they being used by departments and colleges to track changes?
    2. Number and quality of faculty publications, appropriate to discipline or field (e.g., citation data, journal publications, book publications, qualitative assessments).
    3. Scholarship evaluations in regular external reviews of departments.
    4. Placements of Ph.D. graduates and of postdoctoral associates.
    5. Metrics on grant support (where appropriate), e.g., percent of faculty who are PIs on external grants; total sponsored research per FTE faculty; proposals submitted and success rates.
  2. Support for Interdisciplinary Initiatives (Objective 3)
    1. Inventory new interdisciplinary initiatives emerging from the faculty, how they were nurtured, and the impact to date (actual or potential).
    2. Annual review of faculty search results, assessing cross-departmental and cross-disciplinary impact.
    3. Have academic units reviewed tenure and promotion policies to determine if interdisciplinary work of untenured faculty is evaluated in appropriate and effective ways?
    4. Assess the success of seed funding programs on a regular basis.
  3. Administration and Support for Research Grants (Objective 4)
    Central and Units:
    1. Administrative and support services: annual customer satisfaction surveys, both centrally (OSP, ORIA, SFA) and at the unit level.
    2. Ratio of cost of research administration to sponsored funding (expenditures).
    3. Number of proposals submitted per research administrator.
    Central:
    1. OSP: average number of days to execute awards, by sponsor category.
    2. ORIA (IACUC, IRB, IBC): average number of days to review protocols.
    Units:
    1. Quality of quarterly financial management reports.
    2. Number of proposals submitted error-free.
    3. Number and dollar value of awards per unit research administrator.
    4. Number of overdue sponsor reports.
      [Any metrics for improving the administration of external research grants need to be vetted through focus-group discussions with faculty PIs and administrators.]
  4. University Library (Objective 5A)
    1. Annually publish the library's acquisition budget by area of research and scholarship.
    2. Publish comparisons to other major research universities with respect to collections, subscriptions, etc.
    3. [Focus groups on library services every year or two with faculty and students from different disciplinary groupings.]
  5. Shared Research Facilities (Objective 5B)
    1. Report annually on each university-supported shared research facility: number of users served, user fees generated, and the dollar amount of externally funded research enabled by the facility.
    2. Yearly number (and dollar value) of instrumentation grants submitted by and on behalf of the shared facilities, and the success rate.
  6. Ithaca-Weill Collaborations (Objective 6)
    1. Have new synergies or collaborations developed between faculty and graduate students on the Cornell campus in Ithaca and faculty and graduate students at Weill Cornell Medical College and Graduate School? Joint research grants? Collaborative teaching programs? Cross-usage of core facilities?
    2. Has there been improvement in administrative issues or barriers? Is it easier for graduate students to take courses at both locations?

Indicators for Excellence in Public Engagement

For public engagement, the appropriate metrics and indicators should come from the proposed university-wide assessment of public engagement programs and activities. The action items within each objective suggest some things to consider tracking in the interim. For example:

  1. Unified Concept (Objective 2)
    1. Has there been an increase in the use of electronic media and the internet to deliver public engagement (including extension) programs?
    2. Have new innovative connections among disparate outreach or public engagement programs been developed? What impact have these had?
  2. Research Foundation (Objective 4)
    1. Have new or deeper ties to research been developed?
  3. Partnerships with Stakeholders (Objective 5)
    1. Have new partnerships with stakeholders been developed or existing ones strengthened?
    2. Data on technology transfer (patents, licensing).
    3. Have we capitalized on Cornell's base in New York City in new ways?

Indicators for Staff Excellence and Organizational Stewardship

Metrics and qualitative indicators for assessing the objectives for staff excellence and excellence in organizational stewardship should be developed by the professionals in those areas.

Concluding Comment

This framework is provisional. There are many objectives in this strategic plan. While we believe the institution should move on all of these fronts, some objectives are more important than others, and progress on some is easier to track than on others. Some may receive greater priority now, while others are deferred for later. This appendix does not distinguish between the most and least important objectives, but it does offer methods for tracking the university's movement along them. The purpose is to provide a general framework for the implementation stage of the university strategic plan.