Standard 2 Institutional Report
The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the performance of candidates, the unit, and its programs.
How does the unit use its assessment system to improve candidate performance, program quality, and unit operations?
The RSA SOE Assessment System Guide (Guide, Exhibit 2.3.a) is integral to unit evaluation. The Guide was designed to communicate the goals of the assessment system and to ensure adherence to unit-wide assessment policies and practices. This comprehensive manual details the system’s principles and procedures, data flow and utilization, and unit and program evaluation timelines and accreditation cycles, and includes unit-wide assessment documents and a glossary of terms. The RSA SOE assessment system meets the NCATE acceptable standard through its alignment with the unit’s Conceptual Framework and professional standards and through its use of multiple evaluation methods to monitor candidate performance and to manage and improve unit operations and programs.
The RSA SOE’s assessment system was developed as a collaborative effort that included the RSA SOE Office of Assessment and Research, the University Office of Research, Assessment and Planning (ORAP), and the RSA SOE Assessment, Standards, and Policies Committee (Assessment Committee). The Assessment Committee, composed of faculty and administrative representatives from all unit programs, collaborates with other committees to oversee the assessment system. The Guide (page 25, Exhibit 2.3.a) describes the function of each committee in relation to the assessment system. As detailed in the Guide (pages 4-7, Exhibit 2.3.a), the assessment system reflects the strategic alignment of RSA SOE programs with the unit’s Core Values and Conceptual Framework, the University’s goals, and professional, state, and institutional standards.
The assessment system is regularly evaluated through input from educators representing our partner school districts and from the RSA SOE Advisory Council, which brings together representatives from the professional communities, faculty, and administration in the RSA SOE and the College of Arts and Sciences. A sub-committee of the Advisory Council, the Community School Liaison Committee, focuses on P-12 partnerships. This sub-committee makes recommendations concerning the design of teacher candidate assessments, analyzes survey data from graduates and Teacher Mentors, and provides other pertinent feedback for the improvement of our programs. Committee members include representatives from the P-12 professional community, clinical supervisors, faculty, candidates, and administrators, including the director of the Office of School and Community Partnerships (OSCP).
The Guide (page 15, Table 2, Exhibit 2.3.a) shows the integrated qualitative and quantitative data used to evaluate candidate performance against a range of criteria, including the Conceptual Framework, the RSA SOE Core Values, and professional standards. Decisions about candidate performance are based on multiple assessments at admission into programs, at appropriate transition points, and at program completion. This process is summarized on page 22 of Exhibit 2.3.a. At the micro level, reports are forwarded to program directors to facilitate tracking candidates at specific transition points in each of the programs: Induction, Exploration, Synthesis, Reflective Practice, and Professional Practice. Induction is the transition point used to evaluate admission or entry into the program. All of the programs within the RSA SOE have established admissions criteria for teacher candidates at the Induction transition point that are aligned with both University and program-specific criteria (see Exhibit 2.3.b for publications of admissions criteria). Exhibit 2.3.b also provides a summary of admission data for 2009-2010 and 2010-2011, including the total number of applicants and the numbers accepted and enrolled. For the past two years, approximately 25% of applicants were rejected, and of those accepted, approximately 40% enrolled. Exploration involves coursework that is often foundational in building candidates’ content knowledge and nascent pedagogical knowledge. Synthesis is the point at which candidates apply their growing knowledge in pedagogical courses and fieldwork.
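To make these reported rates concrete, the following is an illustrative calculation only; the actual applicant counts appear in Exhibit 2.3.b:

\[
\text{acceptance rate} \approx 100\% - 25\% = 75\%, \qquad \text{yield} = \frac{\text{enrolled}}{\text{accepted}} \approx 40\%
\]

Under these rates, for every 100 applicants, roughly 75 would be accepted and about \(0.40 \times 75 = 30\) would enroll.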
Reflective Practice, for our initial programs, is the transition point to student teaching at the end of the candidates’ studies. Prior to student teaching in the initial programs, the RSA SOE Assessment Office generates reports on candidates’ progress, grades, key assessments (fully described on pages 21-24 of the Guide, Exhibit 2.3.a) and, as appropriate, dispositions and field evaluations. The data are used in making decisions about candidates’ status, e.g., clearance for student teaching, probation, and the need for tutoring or student services. Professional Practice is the transition point at which candidates exit the programs. Exit surveys are conducted prior to Professional Practice (Exhibit 1.3.i). For the advanced programs, Exploration and Synthesis are combined in coursework, leading to clinical experience during the Reflective Practice transition point.
The unit has taken effective steps to eliminate bias in assessments and is working to establish fairness, accuracy, and consistency in its assessment procedures and unit operations. The principles used to ensure fairness are fully detailed in the Guide (page 8, Exhibit 2.3.a) and are summarized as follows: 1) transparent processes (protocols and reports are available to all faculty, staff, and administrators, with select reports available to external partners in P-12 schools); 2) safe and secure data storage (adhering to student privacy and FERPA standards, with no identifying information reported and data aggregated in reports); 3) data integrity (data are accurately collected, processed, analyzed, and reported in coordination with other offices); 4) ease of use (data collection and reporting purposes are clear and concise); 5) reliable and valid data used for evaluation and analyses; and 6) updated and compatible reporting technologies. In support of these principles, the RSA SOE Assessment Committee and the Office of Assessment and Research routinely examine validity and utility by reviewing rubrics, assessments, and the process of data collection and analysis to ensure consistency, accuracy, fairness, and avoidance of bias.
The RSA SOE maintains an assessment system that provides regular and comprehensive information on applicant qualifications, candidate proficiencies, graduate competency, unit operations, and program quality. Key assessment data, in particular, are collected on Moodle (the University’s course management and intranet system) by program and location, with detailed scores on individual elements of the related rubrics (see Guide, page 29, Exhibit 2.3.a). Each program has a specific cycle and timeline for reviewing these program-wide data and deciding on actions concerning current operations, policy changes, or modifications to coursework, assessments, or field requirements (generally an annual review, with greater frequency when needed). The data and summaries of key assessment (KA) results by program, track, and location are given in Exhibit 2.3.e for KAs 4 and 5, which are consistent across all initial programs (also offered in Exhibit 1.3.d). Please see AIMS for additional detailed KA data by program area. The unit disaggregates candidate assessment data when candidates are in alternate route, off-campus, and distance learning programs. See the SPA reports for TESOL and Early Childhood Special Education for examples of data disaggregated by the Garden City and Manhattan Center locations.
The unit maintains records of formal candidate complaints and documentation of their resolution. The RSA SOE Review Board supports an organized process of responding to candidate complaints, hearing and mediating complaints and seeking a positive resolution for all parties. The Board, as part of the governance structure of the RSA SOE, is addressed in Standard 6 of this report. The policies, procedures, and practices for managing candidate complaints are offered in Exhibit 2.3.f. A file of candidate complaints and unit responses, in aggregated form, is offered in Exhibit 2.3.g (available in the RSA SOE Dean’s office).
The unit maintains its assessment system through the use of information technologies appropriate to the size of the unit and institution. Course evaluations are conducted online via the University’s system for course registration, which expedites the collection of candidate disposition data from faculty. The unit uses Survey Monkey for general surveys among candidates and faculty. Moodle, LiveText, and University databases are other applications used in gathering assessment data.
The unit regularly and systematically uses data, including candidate and graduate performance information, to evaluate the efficacy of its courses, programs, and clinical experiences. The unit has fully developed its evaluations and continuously searches for more relevant comparisons, revising both the underlying systems and the analytic techniques as necessary. Since the first accreditation review, the Assessment Committee has been proactive in implementing transition points for the review of candidate progress. Additionally, review of the disposition data has led to further refinement of that assessment tool, described more fully below. The Committee has been strengthened by greater and more consistent representation of faculty from all RSA SOE programs.
In 2010, a new full-time position of Director of Research and Evaluation was created to oversee the RSA SOE Assessment Office, provide administrative support for the assessment system, coordinate the RSA SOE system with other University offices and assessment systems (including writing reports to outside governing bodies), and provide much-needed support for the use and analysis of data to determine direction for programs and faculty committees (particularly the Assessment, Diversity, Fieldwork, and Technology Committees).
The appointment of the RSA SOE Director of Research and Evaluation and the enhancement of resources for the Office of Assessment and Research have improved the effectiveness and efficiency of the assessment system, supported the increased use of technology-based assessment data sets and reports (on Moodle, for example), and given faculty, administrators, and school partners greater access to data. Evidence-based decision-making within the RSA SOE has increased, enhancing the culture of assessment. Minutes from the Assessment, Fieldwork, and Technology Committees (available at the RSA SOE office), for example, demonstrate how data results inform the Committees’ planning and actions. The revised work of the Office of Assessment and Research has provided faculty, staff, and administrators with more complete, reliable, and valid data to inform strategic and practical goals and decisions. The Guide, newly drafted since the unit’s previous NCATE visit in 2006, reflects updated policies, procedures, and practices for ensuring continuous improvement.
The Assessment Committee and the Office of Assessment and Research have undertaken several important projects, including identifying the need for multiple data points to increase the validity, reliability, and fairness of assessments and to streamline the reporting process. The number of assessment-related surveys conducted by the various faculty committees and given to relevant groups (employers of our candidates and alumni) has increased, while redundancy in survey content has been eliminated. The feedback process for assessment instruments has also been strengthened, allowing for further refinement of the assessment process (the data validity and utilization studies described in this report are examples, addressing program assessments as well as the system for monitoring candidates’ progress). There is greater consistency and frequency in monitoring and securing missing data from required assessments, resulting in more complete data sets and higher return rates.
An example of how data have been collected and utilized to improve the unit’s operations is the creation of the RSA SOE Continuous Improvement Advisory (CIA) team, composed of department chairs, program directors, and directors of faculty committees. Information is gathered, summarized, and synthesized to improve candidate performance, program quality, and unit operations, and the team serves as an advisory review board and provides feedback on the data. In Spring 2011, the CIA participated in a data utilization study, which included responding to a survey about data-informed changes across the RSA SOE, captured in Exhibit 2.3.h. The exhibit’s table includes program responses to questions about the data that were most valuable for implementing change, specific examples of changes resulting from those data, and suggestions for improving data collection and analysis. The feedback from this study has been used as a source for planning future system and data improvements.
The RSA SOE plans continuous improvement at various levels. There are efforts to expand and enhance field experiences, further integrate technology into both the collection and reporting of data, increase collaboration among faculty and community partners, and streamline the unit’s administrative procedures.
The Assessment Committee will also continue its pilot work on the dispositions assessment, which involved data collection in Spring 2011. Focus will be placed on the review and further development of the Pathwise assessment as it pertains to our teacher candidates during student teaching in areas such as diversity and technology.
In June 2011, the RSA SOE undertook a unit-wide validation study that entailed faculty review of key assessments to ensure consistency across programs on assessments that measure the same outcome, such as the ability to plan instruction (detailed in Exhibit 2.3.c). The study’s goals included validating that rubrics assess the appropriate key assessment topic across programs and selecting the samples to be submitted within the Institutional Report. In total, 219 samples across all key assessments (with candidate names omitted) were submitted by faculty in each program and reviewed by 11 faculty members from across the unit over a two-day period. Forty-three samples were removed because of problems such as use of an incorrect rubric, outdated work, or an excess of samples for a specific assessment. Faculty members evaluated each rubric and assessed each teacher candidate sample against the appropriate rubric criteria. Inter-rater reliability in the study was initially 85%; faculty reviewers then discussed discrepancies until 100% agreement was reached. Three important results were the establishment of a process for reviewing rubrics for consistent quality across the RSA SOE; the development of a feedback process to program directors and faculty on rubrics; and, ultimately, the provision of professional development on rubric design.
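For context, here is a minimal sketch of how the inter-rater reliability figure may have been computed, assuming it reflects simple percent agreement (the report does not specify the formula, so this is an assumption):

\[
\text{percent agreement} = \frac{\text{samples on which reviewers' initial ratings agreed}}{\text{total samples reviewed}} \times 100\%
\]

If the 85% figure was computed over the 176 retained samples (219 submitted minus 43 removed), it would correspond to initial agreement on roughly \(0.85 \times 176 \approx 150\) samples, with discrepancies on the remainder resolved through discussion until full agreement was reached.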
Future plans also involve data validation studies as well as evaluation of systematic changes at the program and unit levels. Recognizing the importance of multiple data collection points, the unit will undertake additional validation studies in future summers to identify performance indicators and to address whether the data collected at transition points are adequate predictors of candidate success. Two potential studies are: 1) the relationship between the admissions status of incoming candidates at the Induction transition point and their performance in coursework and field experiences during the Exploration and Synthesis transition points (using data from the key assessments, grades, field reports, and dispositions); and 2) the relationship between candidate performance during the Exploration and Synthesis transition points and successful entry into the Professional Practice transition point (completion of the program, passing of NYSED certification tests, and subsequent certification).
There are plans to increase the participation of the professional community at two levels: 1) how the RSA SOE school partners and employers view our candidates’ preparation for the teaching profession and 2) how these partners and employers evaluate the assessment system used to support continuous improvement. With regard to the former, a more extensive survey of partners and employers is planned for the 2011-2012 academic year, including focus groups to address some of the trends identified in our 2010-2011 surveys. The evaluation of the assessment system will also involve regular meetings with P-12 school personnel about continuous improvement, the assessment instruments used in the field, and how assessment processes can be collaboratively strengthened. These conversations began on an informal basis over the past year and will continue in a more formal capacity moving forward.
Further movement toward strengthening the assessment system involves collaboration among the various faculty committees and programs, supported through increased coordination of data-sharing across committees. It became clear during this past academic year that Form A, the assessment tool redesigned by the Fieldwork Committee for evaluating candidates in the field prior to student teaching, needs to be piloted to determine how it will work in combination with other assessment tools, such as the dispositions assessment. The Technology Committee’s survey results on student teachers’ use of technology will significantly shape the Fieldwork Committee’s recommendations during the 2011-2012 academic year, further strengthening candidates’ use of technology in clinical and school settings.
At the level of unit operations, the RSA SOE plans to broaden its assessment of administrative office support, including candidate input, to streamline procedures and improve efficiency in record-keeping and data coordination among offices. The expanded use of information technologies, such as digitization of documents to expedite document sharing, is also integral to the assessment system’s continuous improvement.
Plans for the Spring 2012 faculty retreat include a review of the status of the assessment system and of program changes. As we move forward, we continue to explore ways to ensure data integrity and to further refine our assessment system to improve candidate performance and enhance RSA SOE programs.