
Educator Preparation Advisory Council (EPAC) Subcommittee Reports and 2015 Action Plan December 5, 2014 Slide 1
Welcome and Introductions
- Sarah Barzee, Chief Talent Officer, CSDE
- Elsa Nunez, President, Eastern CT State University
Slide 2
Purpose of the meeting is to report progress to date and seek your feedback on:
- EPAC Progress Report to the State Board on Nov 5
- EPAC subcommittee preliminary recommendations
- CEEDAR IHE Team progress report
- Action plan for 2015
Slide 3
Non-Purpose: approving final recommendations, because:
- All three subcommittees are not yet at the final recommendation stage
- Other factors will impact the final design of the data and accountability system and the program review process:
  - New accreditation procedures have yet to be released by the Council for the Accreditation of Educator Preparation (CAEP)
  - Proposed Title II Higher Education Act regulations were just released by the US Dept of Education
Slide 4
EPAC Principles for Transformation of Teacher and School Leader Preparation
1. Program Entry Standards
2. Staffing & Support of Clinical Experiences
3. Clinical Experience Requirements
4. District-Program Partnerships & Shared Responsibility
5. Program Completion & Candidate Assessment Standards
6. Program Effectiveness & Accountability
Slide 5
Principles were based on EPAC recommendations, which identified the need for more rigorous and relevant:
- Preparation of teachers and school leaders aligned with the needs of students, schools and districts
- Standards for entry through exit from preparation programs
- Data/accountability systems
- Reform of the state program approval system for continuing and new programs
Slide 6
Beliefs The EPAC Principles were developed and synthesized within the context of three beliefs:
- Do No Harm. The policies governing teacher preparation programs, and the requirements for them, must be based on practices that are demonstrated to have a positive impact on teacher effectiveness and student learning.
- Encourage Innovation. Where there is a reasonable expectation of positive outcomes but limited evidence or data exists, Connecticut should further explore the practice and encourage innovation by preparation programs, districts and other stakeholders.
- Be Aspirational. The CSDE should lead with high aspirations for the state’s educator preparation programs, setting rigorous standards and expectations for all educators to ensure every student has an excellent teacher.
Slide 7
2014 Activities Three working subcommittees in three areas:
- Program Review
- Data and Accountability System
- Assessment Development
Subcommittees met between Feb and Nov 2014 and included EPAC members and additional K-12/IHE representatives:
- Data – 7 meetings
- Program Review – 4 meetings
- Assessment Development – 2 meetings
Slide 8
Presenters and Roles
- Irv Richardson, Consultant – Facilitation of Q & A following each subcommittee report-out
- Ken Di Pietro, Superintendent, Plainfield – Report-out on Program Review
- Colleen Palmer, Superintendent, Weston – Report-out on Data & Accountability
- Nancy Hoffman, CCSU Professor – Report-out on Assessment Development
- Suzanne Robinson, CEEDAR Center Facilitator, University of Kansas – Report-out on CEEDAR
- Irv Richardson – Action Plan Development
Slide 9
Program Review Subcommittee Report Slide 10
Model Guiding Work of EPAC Subcommittees
SBE Program Approval Decision: based on evaluation of qualitative and quantitative data.
- Qualitative Review: review of programs based on qualitative criteria and evidence (curricula/syllabi, program reports, interview data, etc.), with focused review of individual programs if accountability data indicates performance issues.
- Quantitative Review: collection, analysis and reporting of multiple data sources to monitor individual program quality and improve program effectiveness.
Slide 11
Program Review Proposed recommendations:
- Standards for Program Review: adopt the five broad standards of the Council for the Accreditation of Educator Preparation (CAEP).
- Review Process: adopt some or most of the CAEP visit process, with state teams participating in a joint process with national team members. Awaiting December 2014 release of CAEP accreditation and visit process details.
- Additional State Process: a state team will conduct a “focused review” of programs identified as at-risk or low-performing based on accountability data and issue an addendum to the CAEP report with its findings.
Slide 12
CAEP Standards
Standard 1: Content and Pedagogical Knowledge
Standard 2: Clinical Partnerships and Practice
Standard 3: Candidate Quality, Recruitment, and Selectivity
Standard 4: Program Impact
Standard 5: Provider Quality Assurance and Continuous Improvement
Slide 13
Program Review Proposed recommendations: commence drafting new program approval regulations outlining:
- Definitions
- Minimum requirements such as admission and exit, clinical experiences, etc.
- Approval cycle
- Decision rules that combine qualitative & quantitative data and the recommendation to the State Review Committee
- Approval decisions by the Board and procedural requirements by level of approval
- Just cause to conduct an off-cycle review
- Other policies concerning new program approval, low-enrollment programs, etc.
Slide 14
Data and Accountability Subcommittee Report Slide 15
Design of Data System To serve three purposes:
- Public Profile Data
- Program Improvement Data
- Accountability Data
Slide 16
Profile Data https://title2.ed.gov/Public/Report/StateHome.aspx Slide 17
Profile Data
- Profile of completers/graduates by program
- Completers in shortage areas
- Completers by race/ethnicity
- Employment data
Slide 18
2013 Profile Data (2011-12 Academic Year)
- Total Enrollment: 5716
- Female/Male: 75%/25%
- Total Completers: 2092
Race/Ethnicity Enrollments:
- Hispanic: 290
- Am Indian/Alaska Native: 15
- Asian: 88
- Black/Af Am: 222
- White: 4606
- Other: 54
Clinical Experience:
- N FTE Faculty Supervising Clinical Experience: 249.56
- N FTE Adjuncts Supervising Clinical Experience: 956
- N Candidates in Supervised Clinical Experience: 2811
Slide 19
Profile Data
- Annual employment data
- Supply and demand data
Slide 20
Program Improvement Data Slide 21
Program Improvement Data Slide 22
Accountability System 4 categories and 12 indicators of accountability:
- Program selectivity, entry and completion
- Candidate pre-service performance
- Candidate employment, persistence, in-service performance
- District Partnership Leadership (institutional-level data)
NOTE: Some indicators require measures yet to be developed, piloted and implemented (see Assessment Development subcommittee). Accountability decision rules will result in:
- Program designation as Effective, At-Risk or Low-performing
- Focused qualitative program review by a state team
Slide 23
Accountability System Decision rules ultimately will lead to identifying a program (not the institution) as:
- Effective
- At-Risk
- Low-performing
Recommendations:
- Include a high-level designation; proposed federal Title II regulations require 4 rating levels: the three listed above plus Exceptional
- Link with the qualitative review of educator preparation programs (EPPs) through the state program review process
- Link indicators with the new Title II Higher Education Act requirements
Slide 24
*Proposed Title II Teacher Prep Regs Key indicators to be reported annually by states must minimally include:
- Employment outcomes: new teacher placement and three-year retention rates, including in high-need schools
- Teacher and employer feedback: surveys on the effectiveness of prep
- Student learning outcomes: effectiveness of new teachers as demonstrated through measures of student growth, performance on state or local teacher evaluation measures that include data on student growth, or both, during their first three teaching years
- Assurance of specialized accreditation, or evidence that a program produces candidates with content and pedagogical knowledge and quality clinical preparation, who have met rigorous entry and exit requirements
*60-day public comment period, with the final rules to be published in mid-2015.
Slide 25
Consider CAEP Requirements Slide 26
Consider CAEP Requirements Slide 27
Accountability Categories, Indicators, Weights and Decision Rules
Category 1: Program selectivity, entry and completion
- Completer/graduation rates (CAEP)
- Admission selectivity criteria and goals
Category 2: Candidate pre-service performance
- Pass rates by program for external assessments (including Praxis II, ACTFL, Foundations of Reading, CAT, etc.) (CAEP)
- Pre-service performance assessments (CAEP)
Category 3: Candidate employment, persistence and in-service performance
- Numbers initially employed in CT schools (of those candidates residing in CT, using Dept of Labor data and occupational code) (CAEP)
- Employment of completers in hard-to-staff or high-need schools
- Persistence rate: years in field after 1st and 3rd year of teaching or school leadership/admin/special service (CAEP)
- Surveys of candidates 1-3 years from program completion, with feedback on readiness for service (identify how many years out of preparation/distance from completion date, and how many/what percentage stay in CT) (CAEP)
- Surveys of employers about candidates’ readiness 1-3 years from program completion (the superintendent will identify who receives these surveys: superintendent or designees) (CAEP)
- Summative teacher-level educator evaluation data (CAEP)
Category 4: District Partnership Leadership (institutional-level data)
- Surveys of superintendents regarding shared responsibility and shared accountability with preparing institution partners
- Surveys of deans of education regarding shared responsibility and shared accountability with district partners
Slide 28
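The categories above name the quantitative inputs, but the subcommittee has not yet set indicator weights, cut scores, or a scoring scale. Purely as a hypothetical sketch of how decision rules could combine category-level scores into the four rating levels named in the proposed Title II regulations (every weight, threshold, and the 0-100 scale below is invented for illustration and is not EPAC's design):

# Hypothetical illustration only: EPAC has not finalized weights, cut scores, or a scoring scale.
WEIGHTS = {  # assumed relative weights for the four accountability categories
    "selectivity_entry_completion": 0.25,
    "preservice_performance": 0.25,
    "employment_persistence_inservice": 0.35,
    "district_partnership_leadership": 0.15,
}

CUT_SCORES = [  # assumed thresholds on the weighted composite (0-100 scale)
    (85, "Exceptional"),
    (70, "Effective"),
    (55, "At-Risk"),
    (0, "Low-performing"),
]

def rate_program(scores):
    """Combine per-category scores (0-100) into a single program rating."""
    composite = sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)
    for threshold, label in CUT_SCORES:
        if composite >= threshold:
            return label

# Example: strong pre-service results, weaker employment/persistence data.
example = {
    "selectivity_entry_completion": 80,
    "preservice_performance": 90,
    "employment_persistence_inservice": 60,
    "district_partnership_leadership": 70,
}
print(rate_program(example))  # composite 74.0 -> "Effective" under these assumed cut scores

Whatever form the real decision rules take, a program falling into the At-Risk or Low-performing designations would also trigger the focused qualitative review by a state team described on Slide 23.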
Assessment Development Subcommittee Report Slide 29
Assessment Development Proposed recommendations:
1. Develop/consider the following assessment measures:
- New Teacher and Employer Feedback Surveys (CSDE administered)
- New School Leader and Employer Feedback Surveys (CSDE administered)
- Pre-Service Performance Assessment
- Measure of IHE/District Partnership Quality
2. Do not develop a statewide student teaching instrument with required training for cooperating teachers and university supervisors, due to time, cost and capacity issues.
Slide 30
Assessment Development On Nov 20, the subcommittee heard presentations on the following pre-service performance assessments:
- National Observational Teacher Exam (NOTE), which includes a performance assessment component and content knowledge for teaching items
- edTPA, a portfolio assessment that includes 5 parts assessed against 15 rubrics
- Pre-Service Performance Assessment (PPAT), which includes 5 parts but only 3 assessed against rubrics
Slide 31
Assessment Development Proposed recommendations:
3. At this point, the subcommittee recommends further review of edTPA and other supporting policies:
- Review of the 15 rubrics
- Review of assessment handbooks and training outlines
- Consider adoption for a set period of time, including a pilot study
- Develop a fund to pay for this assessment for low-income candidates (20-25%)
- Reduce other testing requirements once adopted
Note: Data from edTPA will meet requirements for CAEP accreditation and the Accountability System.
Slide 32
Collaboration for Effective Educator Development, Accountability, and Reform (CEEDAR) Report Slide 33
Collaboration for Effective Educator Development, Accountability, and Reform (CEEDAR) Funded by the US Office of Special Education Programs (OSEP), with the goal of improving outcomes for students with disabilities, as a technical assistance grant to support states engaged in:
- Reforming teacher and leadership preparation programs to embed evidence-based practices
- Revising licensure standards to align with reforms in teacher and leader preparation
- Refining personnel evaluation systems in teacher and leader preparation programs
- Realigning policy structures and professional learning systems
Slide 34
CEEDAR
• Connecticut’s state goal is to focus on the design and implementation of pre-service curricula for all TEACHER candidates (special education and non-special education) in order to provide opportunities to learn and demonstrate competency in evidence-based practices that improve core and specialized instruction and support SWD, ELLs and struggling learners in reaching college- and career-ready (CCR) standards in reading, writing and comprehension skills in argumentation.
• Faculty teams collaborate to evaluate and revise syllabi based on national research and identified essential elements contained in the innovation configurations in literacy, writing and culturally responsive practice.
• Revised curricula are evaluated by external experts, who provide feedback.
• Scale up with other Connecticut IHEs.
Slide 35
CEEDAR Syllabi reviews require identification of gaps, redundancies and priorities in program curricula relative to the CEEDAR Innovation Configurations (ICs) and evidence-based practices (EBPs). Faculty must review the content taught relative to ICs and the level of practice/competency expected of candidates in current courses. Program syllabi are revised based on goals for preparing candidates to teach students to achieve core standards in literacy and argumentation.
Levels of practice/competency:
- Level 1: reading, lecture/presentation, discussion, modeling/demonstration, observation, test, or quiz
- Level 2: projects/activity, case study, lesson plan study
- Level 3: tutoring, small group, whole group, student teaching, internship
Slide 36
CEEDAR Excerpt from IC for Literacy Grades 6-12: Content Area Literacy & Disciplinary Literacy
- RS 5.1 English/Language Arts: Author’s purpose, point of view, theme
- RS 5.2 English/Language Arts: Literal & implied meaning of text
- RS 5.3 Social Studies/History: Sourcing of primary documents
- RS 5.4 Social Studies/History: Contextualization
- RS 5.5 Social Studies/History: Summarization
- RS 5.6 Social Studies/History: Corroboration
- RS 5.7 Science: Scientific meaning of vocabulary
- RS 5.8 Science: Relationships among concepts
- RS 5.9 Science: Interpretation of graphs, charts, & formulas
- RS 5.10 Mathematics: Vocabulary of mathematics, Greek symbols
- RS 5.11 Mathematics: Mathematical communication
- RS 5.12 Mathematics: Alignment of mathematical representations with text explanations
Slide 37
2015-2016 Action Plan Slide 38
Timeline Benchmarks Work within and towards the implementation dates of:
- CAEP: implementation of the new data-based accreditation process – Fall 2016
- Federal Title II report requirements:
  - Starting in 2016-17, reporting of all new data on 2015-16 completers, for an earliest first year of teaching in 2016-17, in the April 2018 pilot Title II report by the U.S. Secretary of Education
  - Begin reporting 4-level program performance ratings in 2017-18
Slide 39
2015 NTEP Implementation Plan Proposed plan for Nov – Oct 2015 for:
- Stakeholder Engagement
- Program Approval Development
- Data Collection, Analysis and Reporting
- Licensure/Certification
Slide 40
Action Plan Development (planning table: Policy Area | Pilot/Review Date | Implementation Date)
Policy areas:
- Data & Accountability System Design
- Assessment Development
- Program Review/Approval and Related Regulations
- Certification Regulations
- Supply & Demand Report
- Meeting Dates (virtual or in-person)
Slide 41
Follow-up and Thank You
- We will follow up with you about our final action plan, including future convenings of the full EPAC
- Subcommittees will continue to meet in spring 2015
- We will focus on strategies to increase stakeholder engagement and communication
- THANKS to EPAC, subcommittee members, and national and state colleagues for your support and contributions to the work of educator preparation reform
Slide 42