STANDARD 2. ASSESSMENT SYSTEM AND UNIT EVALUATION

The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.

2.1 Assessment System and Unit Evaluation

Unit Assessment System

The unit assessment system comprises a comprehensive, integrated set of metrics that informs data-driven decisions about candidate and program performance in the preparation of educators. The collection and analysis of data on applicant qualifications, candidate and graduate performance, and the unit's conceptual framework (CF), together with professional and state standards, support those decisions. At the initial level, the unit aligns assessments with the CF, state and Interstate Teacher Assessment and Support Consortium (InTASC) standards, and the VCSU Abilities; at the advanced level, it aligns assessments with the CF, state standards and National Board for Professional Teaching Standards (NBPTS) core propositions, and the VCSU Core Values. At both levels, the CF focuses on candidate proficiencies to plan, implement, evaluate, and reflect on lessons for learning. Assessment data gathered at key transition points heighten the unit's awareness of candidate and program strengths and weaknesses.

The unit's professional community is regularly involved in developing and using system data: unit data reports are shared with unit faculty each August, and program-level data are shared with representatives from each academic area in September and February. Advanced program faculty members meet each August to share data and discuss program improvements. The School of Education and Graduate Studies (SEGS) and the Teacher Education Committee (TEC) receive data updates at their regular meetings. This routine sharing of data opens the door for ongoing discussions about changes in field experiences, course offerings, curriculum, and policies.

The unit shares data with stakeholders from P-12 partner schools each August. These work sessions provide open communication, build P-12 partners' awareness of the unit's assessments and data, strengthen P-20 relationships, and yield insights from P-12 educators for program improvement. Based on the initial program's experiences, the advanced program decided to host a P-20 data sharing and focus group session in 2014.

The unit's comprehensive assessment system includes a variety of internal and external evaluations of candidate performance completed at multiple decision points. Unit faculty members use grades, disposition forms, and portfolio evaluations to assess initial program candidates internally. Candidates are assessed externally through cooperating teacher evaluations, surveys of first-year teachers and employers, and external testing services such as the Educational Testing Service (ETS) Praxis tests, which measure basic skills as well as content and pedagogy. Grade point average (GPA), dispositions, field experiences, action research, and the capstone portfolio provide data on candidates in the advanced program. Candidates complete self-assessments at key transition points in the advanced program and an alumni survey after completion. Individual performance data inform decisions about each candidate's progression through the program, while the aggregated and disaggregated data inform unit decisions for improvement.

The unit has collaborated with 13 other Bush Grant Educator Preparation Providers (EPPs) since 2010 to develop common assessments. Input from multiple assessment coordinators and the reliability and validity testing of the assessment instruments have helped the unit address concerns for fairness, accuracy, and consistency. Gathering data from multiple sources helps the unit reduce bias. Multiple faculty members at both the initial and advanced levels assess portfolios for graduation, and unit faculty members compare their evaluation ratings to monitor inter-rater reliability and guard against bias.
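As a concrete illustration of the kind of rating comparison described above, the sketch below computes Cohen's kappa for two hypothetical faculty portfolio raters. The report does not name a specific agreement statistic, so both the statistic and the rubric scores here are illustrative assumptions, not the unit's documented method.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance (1.0 = perfect)."""
    n = len(rater_a)
    # Observed agreement: fraction of portfolios rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal rating frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 1-4 rubric scores from two faculty members on ten portfolios.
faculty_1 = [4, 3, 3, 2, 4, 3, 1, 2, 3, 4]
faculty_2 = [4, 3, 2, 2, 4, 3, 2, 2, 3, 4]
print(f"Cohen's kappa: {cohens_kappa(faculty_1, faculty_2):.2f}")
```

A low kappa on a shared set of portfolios would flag the kind of inter-rater reliability concern these comparisons are designed to catch.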

The assessment system aligns closely with the unit's assessment needs, and the voices of VCSU faculty, division assistants, P-12 educators, web application developers, the dean, and the assessment coordinator enhance it. The web application developer, who originally worked with the assessment coordinator to design the system in 2007, has kept the unit's technology current and tailored the assessment system to the needs of the unit.

Data collection, analysis, and evaluation

The unit's assessment system provides regular and comprehensive data on candidates and on the program at both the initial and advanced levels. The unit's division assistants play a major role at both levels in maintaining data entry and tracking candidate progress. The unit has prioritized release time for the assessment coordinator, and in 2010 it increased administrative assistant support from a half-time to a full-time position so that significantly more time and effort could go into maintaining the data in the unit's Central Assessment System (CAS). Continuity among the personnel working closely with the system over the past four years has enabled the unit to move from a mode of system maintenance to an environment of continuous technological and process improvement in data collection and data sharing with stakeholders.

The unit systematically collects data from candidates, graduates, faculty, cooperating teachers, and administrators. Candidates and graduates complete surveys, while faculty members assess portfolios and enter candidate disposition information at both the initial and advanced levels. Cooperating teachers assess candidate professional dispositions and pedagogical skills during field experiences, and employers provide feedback on the performance of graduates. The use of multiple measures from both internal and external sources enables the unit to assess candidate progress and determine program strengths and weaknesses.

All of the institution's academic areas with education majors use the data that are systematically collected, aggregated, and disaggregated in the CAS. Program reports also include disaggregated data for off-campus, distance learning, and online programs. The assessment coordinator is responsible for sharing summarized data through regularly scheduled unit and program reports and for making data available for specific decision-making discussions related to candidates or the program. Data analyses and summaries are made available for public viewing for the benefit of academic programs and stakeholders involved in improving candidate performance, program quality, and overall operations.
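To make the aggregation/disaggregation distinction concrete, the sketch below shows one way such a breakout could be computed from a CAS export. The table, column names, and values are hypothetical; the report does not describe the CAS's internal data layout.

```python
import pandas as pd

# Hypothetical CAS export; all column names and values are illustrative.
records = pd.DataFrame({
    "program":  ["Elementary Ed", "Elementary Ed", "Math Ed", "Math Ed"],
    "delivery": ["on-campus", "online", "on-campus", "online"],
    "exit_survey_mean": [3.6, 3.4, 3.8, 3.5],
})

# Aggregated: a single unit-level summary across all programs.
print(f"Unit mean: {records['exit_survey_mean'].mean():.2f}")

# Disaggregated: the same measure broken out by program and delivery mode,
# mirroring the report's separate on-campus / distance / online reporting.
print(records.groupby(["program", "delivery"])["exit_survey_mean"].mean())
```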

Each August, faculty review compiled, summarized, and analyzed unit data during welcome week workshops. In September and February, representatives from each academic area receive disaggregated program data. Programs use the data in developing state reports as well as the Annual Program Update reports required by the institution. The unit shares data with stakeholders from partner schools each August; the purpose is to provide open communication, build relationships, and collaborate to gain insight for program improvement.

Candidate surveys at the initial and advanced levels have provided an avenue for candidates to express positive feedback and for the unit to discover areas of concern. The unit has a process in place for formal candidate complaints and their resolutions. The dean's office keeps confidential records of formal candidate complaints, appeals, and resolutions. Candidates in the advanced program can appeal decisions by submitting a petition to the director of graduate studies and research.

The unit has made a commitment to build, support, and maintain its own CAS. The robust system manages unit data on dispositions, field experiences, standardized test results, entry surveys, exit surveys, advanced program self-assessments, and action research, as well as portfolio data at the initial and advanced levels. The web application developer works closely with the unit to make changes to the CAS, adding new assessments or administrative features to improve efficiency. Unit faculty and candidates have "24/7" portal access to data related to candidate status in the program and the unit's primary assessments.

Use of data for program improvement

The unit continuously assesses its program and makes changes based on data, research in the field, collaborative ideas shared with other EPPs, and insights gained from experiences with P-12 partners. Research-based and data-driven decisions have become standard practice and part of unit culture. Data available in the system are vital to the unit's efforts to improve candidate performance, program quality, and unit operations.

Appropriate faculty members consider unit data each August and program data each September and February. Unit faculty members also have opportunities to be involved in the annual data sharing work session with stakeholders. The unit's sharing of data with faculty, program representatives, other EPPs, candidates, and P-12 partners has led to informed decisions for program improvement. Faculty members participate in data-driven decisions through program meetings, unit meetings, and representation on the TEC. The unit has completed follow-up studies to ensure its major changes have strengthened the program without adverse effects.

Advanced program data indicate that candidates are experiencing confidence gains in the unit's CF and Core Values as well as in all areas of the NBPTS core propositions. The analysis of data indicates a number of strengths in the program. In addition, the advanced program data have led to some important changes noted in the continuous improvement section.

Continuous Improvement

Summarize activities and changes based on data that have led to continuous improvement of candidate performance and program quality.

The SEGS unit continuously assesses its program and makes changes based on data, research in the field, collaborative ideas shared with other EPPs, and insights gained from experiences with P-12 partners. The unit's assessment system was in its infancy in 2008; now the Central Assessment System is a regular part of everyday operations. The unit gathers, analyzes, summarizes, and shares data with its faculty, P-12 partners, stakeholders, and other EPPs more extensively than it might have thought possible in 2008.

Research-based and data-driven decisions have become standard practice and part of the unit's culture. Examples of data regularly gathered for analysis in the initial program include:

  1. Praxis I or Pre-Professional Skills Test (PPST); the PPST has transitioned to the Core Academic Skills for Educators (Core) test, a new basic skills test covering reading, writing, and mathematics
  2. GPA
  3. Praxis II (Content and PLT)
  4. Pre-student teaching field experiences
    1. Introduction to Education practicums
    2. Diversity practicums
    3. Methods practicums
  5. Dispositions
  6. Student teaching
  7. Teaching for Learning Capstone (TLC) unit
  8. Senior portfolio
  9. Exit (student teacher) surveys
  10. Transition to teaching (first-year teacher) surveys
  11. Supervisor (employer) surveys

Data gathered from multiple sources help the unit reduce bias. The unit has been fortunate to work with 13 other EPPs in the Network for Excellence in Teaching (NExT) initiative since 2010 to develop common assessments. The Bush Foundation-funded effort did not originally set out to include common assessments, but that project emerged from the collaboration. The unit's assessment coordinator was involved in numerous meetings with the 13 other assessment coordinators to develop the four instruments that all 14 institutions use for gathering and sharing data.

The common metrics include four surveys:

  1. An entry survey administered at the time of the candidate's first education course.
  2. An exit survey administered at the end of student teaching, shortly before graduation.
  3. A transition to teaching survey distributed to first-year teachers.
  4. A supervisor survey administered to the employers of those first-year teachers.
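The report does not detail how the reliability testing of these instruments was carried out; as one hedged illustration, the sketch below computes Cronbach's alpha, a common internal-consistency statistic for multi-item survey scales such as these. The five-item scale and the Likert responses are hypothetical.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Internal-consistency reliability of a multi-item survey scale.

    item_scores: rows = respondents, columns = survey items.
    """
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-4 Likert responses to a five-item exit-survey scale.
responses = [
    [4, 4, 3, 4, 4],
    [3, 3, 3, 2, 3],
    [2, 3, 2, 2, 2],
    [4, 3, 4, 4, 3],
    [3, 2, 3, 3, 3],
]
# Values near 1 suggest the items consistently measure one construct.
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```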

The advanced program has continuously assessed its concentrations and made changes based on data, research in the field, and feedback from concentration faculty and chairs through the Graduate Council. The advanced program receives data updates and opportunities for discussion each August. Research-based and data-driven decisions have become standard practice and part of the culture of the advanced program.

Examples of data gathered regularly for analysis:

  1. Portfolio assessments
  2. Research assessments
  3. Conceptual framework assessments
  4. Self-assessments
  5. Grade point average
  6. Alumni surveys
  7. Employer and educator focus meetings

The initial program uses assessment data from candidates, faculty, graduates, and employers to manage and improve the operations and programs of the unit. The analysis of data and the unit's active engagement with other universities have led to numerous program improvements.

Unit research and analysis of data have led to the following significant changes:

  1. Increased GPA requirements for program admission
  2. Follow-up research on GPA requirement changes and student teacher data
  3. Increased length of the student teaching experience (from 10 to 12 weeks)
  4. Follow-up research on cooperating teacher evaluations for student teachers with increased length of experience

Analysis of unit data and NExT institution common metric data have impacted School of Education decisions to strengthen curriculum in the following areas:

  1. Differentiated instruction
  2. Strategies for teaching English language learners
  3. Effective application of technology
  4. Formative assessment
  5. Engaging students in higher level thinking skills
  6. Classroom management
  7. Implementing TLC component pieces into the curriculum

Analysis of current trends, research, and collaborative efforts with other educator preparation providers has led the unit to do more in its curriculum with:

  1. Response to Intervention (RTI)
  2. Common Core standards
  3. Co-teaching
  4. The recruitment and retention of teacher candidates
  5. Data sharing among partner institutions
  6. Follow-up with graduates as they enter the teaching profession
  7. Factor analysis of reliable and valid assessment instruments
  8. The unit's version of the edTPA, called the Teaching for Learning Capstone (TLC) unit, is completed by all teacher candidates with the purpose of strengthening their:
    1. Unit and lesson planning skills.
    2. Attention to specific elements of lesson implementation.
    3. Development of strategies to formatively assess student learning and apply timely, descriptive feedback.
    4. Ability to reflect on what they have done well and how they can improve teaching and learning in the future.

Advanced program data indicate positive overall growth in the unit's CF and Core Values ratings from candidates, as well as in areas of NBPTS core propositions.

The analysis of data indicates a number of strengths in the program and has also led to the following improvements:

  1. Development of two additional concentrations
  2. Revision to action research template and chapters
  3. Strategies to increase retention and completion rates
  4. Program changes resulting from alumni survey feedback
  5. Data sharing within concentrations

Analysis of the candidate assessments and alumni surveys indicates a need to strengthen curriculum in the following areas:

  1. Student assessment (formative and summative)
  2. Strategies for working with parents and community
  3. Supervisor/leader skills
  4. Curriculum development and planning

The assessment system was developed for data collection and analysis of applicant qualifications, candidate and graduate performance, and the evaluation of the unit's conceptual framework. The assessment system aligns with the university Abilities and Core Values, as well as professional and state standards, to gather data that measure program quality and support the unit's data-driven decisions. At both the initial and advanced levels, the CF focuses on candidate proficiencies to plan, implement, evaluate, and reflect on lessons for learning. Assessment data gathered at key transition points heighten the unit's awareness of candidate and program strengths and weaknesses.

Discuss plans for sustaining and enhancing performance through continuous improvement as articulated in this standard.

Institutional leadership and the unit's dean have raised expectations for programs to assess learner outcomes and to provide evidence for the Annual Program Update (APU) reports required of each academic area. Institutional and unit trends point toward an increased commitment to assessment, which bodes well for the unit's ability to sustain and enhance assessment performance through continuous improvement based on data-driven decisions.

The unit has increased its support for assessment and developed a team of personnel who have far more experience in realizing the potential uses of its data. Continuity of the unit's personnel working closely with the system over the past four years has enabled the unit to move from a mode of system maintenance to an environment of continuous technological and process improvement in data collection and data sharing with stakeholders.

The North Dakota Education Standards and Practices Board (ESPB) selected the assessment coordinator to serve on two state Board of Examiners (BOE) teams charged with reviewing NCATE Standard 2 for other institutions. The assessment coordinator has also benefitted greatly from working with the assessment coordinators of 13 other EPPs on the Bush Grant common metric assessments. Serving on committees and subcommittees with consultants and assessment coordinators from 13 other institutions has been meaningful professional development for the unit's assessment coordinator, who has been engaged in establishing learning targets, standards, and assessment items while weighing input from multiple assessment coordinators. The assessment coordinator has gained experience with the psychometric analysis of the common metric assessments, including reliability and validity testing of the instruments, and has been able to use the common metric assessments to help the unit address concerns for fairness, accuracy, bias, and consistency.

The unit is committed to using data to improve teacher preparation. The unit's practice of gathering, analyzing, summarizing, and sharing data with unit faculty, P-12 partners, stakeholders, and other EPPs on a regular basis has enabled the unit to improve its program since 2008. The unit will stay committed to using data to make informed decisions concerning its candidates, the program, and unit operations.

Areas for Improvement Cited in the Action Report from the Previous Accreditation Review

The unit does not regularly and systematically collect, analyze, summarize, and share data to make program and unit improvements.

The unit regularly and systematically collects, analyzes, summarizes, and shares data to make program and unit improvements. The unit's and institution's commitment to building a culture of assessment and to using evidence to inform decision-making has become increasingly apparent since the 2008 NCATE visit. The unit has increased its support for assessment and developed a team of personnel who have far more experience in realizing the potential uses of its data. Continuity of unit personnel working closely with the system over the past four years has enabled the unit to move from system maintenance to continuous technological and process improvement in data collection and data sharing with stakeholders.

Faculty members have continuous access to their advisees' candidate summary information through the unit's Central Assessment System (CAS). The data help candidates identify their status in the program and open the door for advisor/advisee discussions about improvement or future action to advance in the program. The goal of the assessment system is to use candidate and program data to improve teacher preparation.

Each teacher education program has at least one representative at the unit's annual data sharing session in mid-August, prior to the start of the academic year. Faculty members teaching advanced program courses take part in a parallel session during the same week. The allotted workshop time provides the assessment coordinator a regular opportunity to share data before the academic year begins. The unit shares data analyses and responses to findings with representative faculty and advisory groups to identify areas for curriculum or field experience improvement.

Program level data sharing happens regularly with representatives from each teacher education program in September and February. Each program examines its own data to find causes for celebration and/or improvement. The data are also helpful as programs write their institutional Annual Program Updates (APUs) and state reports.

Regular data sharing occurs at School of Education and Graduate Studies (SEGS) meetings and Teacher Education Committee (TEC) meetings. The unit also shares data with stakeholders from partner schools each August; the process strengthens P-12 partner relationships and yields insights from P-12 educators for program improvement. Data sharing occurs with advanced program faculty each August and, as requested, with the Graduate Council. The advanced program hosted a P-20 data sharing and focus group session in November 2014.

The SEGS dean has provided leadership by prioritizing the gathering and use of evidence to make informed decisions. The unit has increased credit-load release time for the assessment coordinator position from three to nine credits since 2008. The change from one-quarter to three-quarters release time has enabled the assessment coordinator to share data and communicate more frequently with members of the professional community, serve on multiple state BOE teams assessing Standard 2 reports at other EPPs, and work with other EPPs to improve assessments. The increased release time has also allowed the coordinator to engage in activities that support the unit's data-driven decisions for change, including attending conferences and Bush Grant common metrics retreats, as well as writing and executing a grant for improving campus assessment. The grant funding enabled the unit to strengthen its assessment system practices and also benefitted other areas on campus that assist in the unit's assessment efforts. Most importantly to the unit, the additional credit-load release time has enabled the coordinator to spend more time analyzing data, writing summaries, and organizing data for assessment sharing events.

Involvement with Bush Grant Common Metrics and other EPPs

The assessment coordinator has benefitted greatly from working with assessment coordinators from 13 other EPPs on Bush Grant Common Metric assessments. The benefits of working with the Bush Grant NExT initiative include funds to:

  • hire an additional division assessment assistant who works on overall SEGS efforts but spends a significant amount of time on assessment. The institution recognizes the importance of assessment work and has made a commitment to sustaining the position beyond the grant funding.
  • pay stipends to each of the P-12 teachers from several partner schools in the area who attended three field experience work sessions. Field experience representatives from VCSU, North Dakota State University (NDSU), and Minnesota State University - Moorhead (MSUM) collaborated to organize a P-20 work group of 20 P-12 cooperating teachers to develop a common student teacher final evaluation. After piloting and testing the evaluation form for reliability in the spring of 2011, VCSU, NDSU, and MSUM implemented the form for student teacher evaluations beginning in the fall of 2011. VCSU shares its data with these partners to provide opportunities for discussion and improvement among EPPs.
  • provide stipends for P-20 annual data sharing events. P-20 partners examine unit data from cooperating teachers, student teachers, first-year teachers, and supervisors of first-year teachers to make recommendations for unit improvement.
  • provide stipends for P-20 follow-up work groups. Assessment data revealed the need for curriculum improvements. The unit created P-20 work groups focused on the areas of differentiated instruction, strategies for working with English language learners, and formative assessment. Even though data were favorable, an additional technology work group was established to keep the unit's pulse in rhythm with technology activities in partner schools.
  • provide stipends for summer Teaching for Learning Capstone (TLC) development and assessment efforts. Both P-12 educators and unit faculty have been involved in the work sessions involving the development of the TLC unit at VCSU.
  • allow travel for the assessment coordinator to Bush Grant common metrics meetings. Fourteen institutions use the same entry, exit, first-year teacher (titled the Transition to Teaching Survey), and supervisor surveys. The coordinator travels to St. Paul several times each year, as well as to other locations, to engage in discussions about the development, administration, reliability testing, validity testing, data aggregation, and data sharing of the common assessment instruments. Gathering with 13 other assessment coordinators and the Bush Grant consultants has provided tremendous professional development opportunities.
  • hire independent consultants, Hezel Associates of New York, to administer the first-year teacher and supervisor surveys and aggregate statistics for all 14 institutions. Each institution received an individual report and an overview of the aggregate data for all 14 institutions. The Bush Foundation paid for the reliability, validity, and factor analysis of the common metric survey instruments.
  • receive reports and compare data strengths and weaknesses with those of the 13 other institutions. The Bush Grant institutions established data governance rules so that institutions do not rank, compare, or release the results of other institutions; the results support formative program improvement rather than publication.

For sustainability purposes, the unit was among five pilot institutions to administer its own surveys and send results to Hezel Associates for data sharing and aggregation. Eventually, funding for the aggregation work with Hezel will end, and the collaborating institutions will need to share data without one entity hosting the aggregation. In the spring of 2014, the unit administered the common metric surveys through VCSU resources, and it will continue to do so. The unit's assessment coordinator is working with the assessment coordinator at North Dakota State University in Fargo, ND to give other North Dakota teacher education institutions the opportunity to use the common metric assessments.

2011 Bush Grant Proposal: Request for Campus Level Data and Assessment Resources

In addition to the unit's collaborative group assessment efforts, the unit submitted a proposal for a one-time grant of $52,340.50 to enhance the university's capacity to collect, compile, share, and analyze data by adding an assistant in the registrar's office who was proficient at writing queries. The unit also requested funding to hire a student assistant to help the university's web application developer, who writes the code for the unit's CAS. The grant supported query training for the registrar's office staff and the division assessment assistant, who has since used the training frequently and efficiently to assist the unit with various assessment requests.

The grant provided funding for university offices that strongly support the unit's assessment data work and helped the unit build stronger bonds across campus. The one-time grant offered money for assessment with the stipulation that the institution commit matching funds and make sustainable improvements in assessment. The unit was thereby able to improve its assessment efforts in ways that directly and indirectly assist its collaborative assessment partners within the university. The institutional matching-fund commitments increased the unit's potential for continuous improvement and sustained assessment efforts.

The unit's division assessment assistant position became a reality in 2010. The newest division administrative assistant has extensive technology skills and is able to commit the time needed to help the unit regularly and systematically collect data. The overall continuity of the three division assistants (two in the initial program and one in the advanced program), the assessment coordinator, and the web application developer has made a positive impact on the unit's ability to regularly and systematically track candidate and graduate performance and use data effectively to improve teacher preparation.

The most recent Higher Learning Commission (HLC) report cited the unit's Central Assessment System multiple times as a model for the university to centralize more of its institutional data. The institution developed an academic assessment committee and hired a director of institutional research and assessment. The leadership of the institution's vice president of academic affairs has raised expectations for programs to assess their learner outcomes and to provide evidence for the APU reports required in each academic area. The unit's regular gathering, analyzing, summarizing, and sharing of aggregated and disaggregated data with the various teacher education programs on campus has proven beneficial to educator preparation and to program assessment efforts across the institution.

The unit's system of assessment has assisted the institution, and the institution's support of assessment has benefited the unit. The respect and appreciation for sharing meaningful assessment data for informed decision-making has grown substantially in the unit and within the institution as a whole since 2008. The unit understands that, following its 2015 NCATE visit, some of its data collection items will change to gather data aligned more closely with new CAEP assessments. The unit has an attitude of continuous improvement and is committed to making changes in its assessment plans for the future.

Exhibits for Standard 2

2.4.a

Description of the unit's assessment system including the requirements and key assessments used at transition points.

exhibit 2.4.a.1 Unit Assessment Handbook

exhibit 2.4.a.2 Assessments at Key Transition Points

exhibit 2.4.a.3 Mapping of Initial Program Courses and Curriculum

exhibit 2.4.a.4 Mapping of Advanced Program Courses and Curriculum

exhibit 2.4.a.5 Example of Initial Program Assessment Aligned with Standards and Conceptual Framework

exhibit 2.4.a.6 Example of Advanced Program Assessment Aligned with Standards and Conceptual Framework

2.4.b

Admission criteria and data from key assessments used for entry to program

exhibit 2.4.b.1 Admission Criteria for Entry to Program

exhibit 2.4.b.2 Action Research on Teacher Education Admission Communication

exhibit 2.4.b.3 Action Research Follow-Up

exhibit 2.4.b.4 Dissertation Research Related to Significant Relationships Among PPST scores, GPA, and Student Teacher Final Evaluations

exhibit 2.4.b.5 GPA Research Follow-Up

exhibit 2.4.b.5b Example of Data used for Student Teaching Length Decision and Follow-Up Research

exhibit 2.4.b.6 Admission Criteria for Advanced Program

exhibit 2.4.b.7 Advanced Program Progression Toward Completion

2.4.c

Policies, procedures, and practices for ensuring that key assessments of candidate performance and evaluations of program quality and unit operations are fair, accurate, consistent, and free of bias 

exhibit 2.4.c.1 Fair Assessments

exhibit 2.4.c.2 Entry Survey Analysis (Candidates beginning the program)

exhibit 2.4.c.3 Exit Survey Analysis (Student teachers)

exhibit 2.4.c.4 Transition to Teaching Survey Analysis (1st year teachers)

exhibit 2.4.c.5 Supervisor Survey Analysis (Employers of 1st year teachers)

exhibit 2.4.c.6 Advanced Program Alumni Survey Reliability Analysis

exhibit 2.4.c.7 Advanced Program Capstone Survey Reliability Analysis

exhibit 2.4.c.8 Student Teacher Final Evaluation Analysis

exhibit 2.4.c.9 Contribution to Candidate Fairness at Local, State, and National Level through Multi-State Representation on setting ETS Core Passing Scores

2.4.d

Policies, procedures, and practices for ensuring that data are regularly collected, compiled, aggregated, summarized, analyzed, and used for continuous improvement

exhibit 2.4.d.1 Assessment Overview for School of Education

exhibit 2.4.d.2 Central Assessment System (CAS) User Guides

exhibit 2.4.d.3 Assessment System Booklet

exhibit 2.4.d.4 Usage of CAS Candidate Summary Page

exhibit 2.4.d.5 Candidate Summary Help Document

exhibit 2.4.d.6 CAS Update Request that did not Initially Move Forward into Action

exhibit 2.4.d.7 CAS Update Request put into Action

exhibit 2.4.d.8 Another Example of CAS Update Request put into Action

exhibit 2.4.d.9 2014 Example of Regular Sharing of Annual Unit Report

exhibit 2.4.d.10 2014 Example of Regular Sharing of September Program Report and February

exhibit 2.4.d.11 Example of Email Messages sent to Unit Faculty Communicating about Unit and Program Reports

exhibit 2.4.d.12 2014 Example of Regular Sharing of Annual Advanced Program Data

exhibit 2.4.d.13 Example of Email Messages to Communicate about Advanced Program Reports

Examples of Data that are Regularly Collected, Aggregated, Summarized, Analyzed, and Used for Decision-Making and Continuous Improvement - These reports are utilized on an annual basis. VCSU works with 13 other institutions on these reports, and the unit will continue using these assessments and practices after the grant funding ends.

exhibit 2.4.d.14 Example of Exit Survey Comparison Report

exhibit 2.4.d.14a   Entry Survey

exhibit 2.4.d.14b   Exit Survey

exhibit 2.4.d.14c   Transition to Teaching Survey

exhibit 2.4.d.14d   Supervisor Survey

exhibit 2.4.d.15 Calendar for Data Submissions and Reports

2.4.e

Policies, procedures and practices for managing candidate complaints

exhibit 2.4.e.1 Direct Link to Page with Teacher Education Appeals Procedures

exhibit 2.4.e.2 Direct Link to Page with Student Teacher Appeals Procedures

exhibit 2.4.e.3 Master of Education Student Handbook pages 17-18

exhibit 2.4.e.4 Graduate Policy and Procedures

exhibit 2.4.e.5 Petition for Appeal of Graduate Policy Form

exhibit 2.4.e.6 Petition for Appeal of VCSU Policy Form for Graduate Students

exhibit 2.4.e.7 Direct Link to VCSU Student Grievance Policy

2.4.f

Files of candidate complaints and the unit's responses and resolutions (This information should be available during the onsite visit.)

exhibit 2.4.f.1 Outline of Candidate Complaint Summary Response

2.4.g

Examples of significant changes made to courses, programs, and the unit in response to data gathered from the assessment system 

exhibit 2.4.g.1 List of Unit Changes Based on Data

exhibit 2.4.g.2 Examples of Data Driven Decisions

exhibit 2.4.g.3 Example of School of Education Minutes

exhibit 2.4.g.4 Example of Ad Hoc Committee Feedback

exhibit 2.4.g.5a Example of Initial Program Meeting Minutes with Assessment

exhibit 2.4.g.5b Example of Graduate Council Meeting with Assessment

exhibit 2.4.g.6 2014 Example of Assessment Sharing and P-12 Feedback

exhibit 2.4.g.7 2013 Example of Assessment Sharing and P-12 Feedback

exhibit 2.4.g.8 Example of Math Education Program Change

exhibit 2.4.g.9 Collaboration with other EPPs on Common Metrics Assessments

exhibit 2.4.g.10 Collaboration with other EPPs and School Partners on Student Teacher Final Evaluation

exhibit 2.4.g.11 Collaborative Partnerships with other EPPs: Published article and AACTE conference presentation

exhibit 2.4.g.12 Bush Assessment Grant

exhibit 2.4.g.13 VCSU Higher Learning Commission Report References SEGS Assessment Leadership on Campus

exhibit 2.4.g.14 Teaching for Learning Capstone (TLC) Timeline for Changes and Development

exhibit 2.4.g.15 TLC Rubric Session and Follow-Up

exhibit 2.4.g.16 TLC Unit Assessment Session

exhibit 2.4.g.17 TLC Unit Assessment Session for Identifying Candidate Performance and Advising Curriculum Conversations for Improvement

exhibit 2.4.g.18 Data Driven Assessment Discussions for Curriculum Improvement

exhibit 2.4.g.19 Data Shared with VCSU Teacher Candidates Prior to Student Teaching

exhibit 2.4.g.20 Data Shared with Representatives from State Education Boards Regarding Potential Changes in Praxis II Content Tests

exhibit 2.4.g.21 Sharing Teacher Preparation Changes with Legislators

exhibit 2.4.g.22 Data Shared with each Academic Area for Annual Program Update (APU): Elementary Education Example

exhibit 2.4.g.23 Collaborative Data Sharing and Value Added Research Discussions with Area P-12 Administrators

exhibit 2.4.g.24 Data Driven Decision to make laptop switch to Mac for Elementary Education

exhibit 2.4.g.25 Feedback for Advanced Program from P-12 Focus Group Data Sharing Session

exhibit 2.4.g.26 Changes and Data Driven Decisions
