
The Missing Link: Development of Programmatic Outcomes
Christopher Meseke, Ph.D., Park University

What is Assessment?
“As a whole, assessment is a framework for focusing faculty attention on student learning and for provoking meaningful discussions of program objectives, curricular organization, pedagogy, and student development” (Allen, 2004). The assessment process is the quality control of the educational process.

Assessment is, first and foremost, about student learning.

Assessment Levels
• Student
• Classroom
• Course
• Program
• College
• Division
• Institution

Common Reactions to Assessment Initiatives
• Ignoring it
• Bribing someone else to do it
• Complaining about it
• Losing sleep over it
• Sitting down and writing it

Levels of assessment: quality control and assurance
• Institutional/Programmatic
• National/State licensure exams
• Certain academic programs (Nursing, Engineering, Physical Therapy, Social Work, etc.)
• Accreditation Council for Graduate Medical Education (ACGME)
• Higher Learning Commission of the North Central Association of Colleges and Schools

Assessment is important to the accreditation process
The Higher Learning Commission of the North Central Association of Colleges and Schools (HLC-NCA) requires it.
• Assessment of student learning provides evidence at multiple levels: course, program, and institutional.
• Assessment of student learning includes multiple direct and indirect measures of student learning.
• Assessment results inform improvements in curriculum, pedagogy, instructional resources, and student services.

Interpretation of the HLC-NCA Statement on Assessment of Student Learning, 2003
It is the faculty members who must take ownership of the assessment process. Their buy-in, ownership, and implementation are directly related to both the mission of the institution and the attitudes, knowledge, and skills required for students to successfully complete the program requirements.

Goals of Program Assessment Planning
• Measure student learning, not teaching, curricular content, processes, or resources
• Measure things that are important to us
• Involve faculty in development of the assessment program, processes, and instruments
• Use multiple measures to produce valid and reliable data (triangulation)
• Make it manageable and affordable

Goals of Program Assessment Planning (continued)
• Map assessment data back to the curriculum for improvement
• Produce annual reports of assessment outcomes that show the data, interpretation of the data, and improvement plans where indicated
• Use the same assessment procedures for all campuses and sister programs

Modified Hatfield Assessment Model
Meta-Competencies / Institutional Goals → Competencies → Learning Outcomes (the degree to which a competency is achieved) → Objective indicators → Learning events

Programmatic Competencies
A measurable, complex behavioral statement may be written for each Key Component (meta-competency) and competency. Statements reflect what we think we can actually measure in an educational environment, considering time and resources.

Program Assessment Learning Competencies
Develop a set of Learning Goals (Meta-competencies) which represent the attributes of the graduate:
• General Education (Competencies)
• Discipline (Competencies)
• Course (Outcomes)

Bloom's Verbs by Taxonomy Level
• KNOWLEDGE: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
• COMPREHENSION: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
• APPLICATION: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
• ANALYSIS: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Identify, Inspect, Inventory, Question, Separate, Summarize, Test
• SYNTHESIS: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
• EVALUATION: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate
Lower-order Bloom's verbs: Knowledge, Comprehension, Application

Higher-Order Bloom's Verbs / Upper-Division Course and Program Outcomes
(The same verb table as the previous slide, with emphasis on the higher-order levels: Analysis, Synthesis, and Evaluation.)

An Example of Departmental/Programmatic Competencies
Students will be able to:
• Demonstrate biological knowledge appropriate for the course level.
• Demonstrate a working knowledge of the scientific method.
• Demonstrate the ability to communicate scientific concepts and findings in both oral and written formats.
• Apply interdisciplinary knowledge to the biological sciences.
• Demonstrate an awareness of ethical issues in the life sciences.

Mapping Competencies to Courses

Mapping Outcomes to Courses
(Example curriculum map: program competencies listed as rows, Course 1 through Course 5 as columns, with an “x” marking each course in which a competency is addressed.)

Mapping Outcomes to Courses
(A second example curriculum map in the same format, showing a different distribution of “x” marks across Course 1 through Course 5.)

Triangulation of Assessment Strategies: Direct Measures (Pick 2)
• Licensure Scores
• GRE/GMAT/MCAT/other
• Departmental Exit Exams
• Portfolio
• Core Assessment
• Capstone Experience
• Others

Triangulation of Assessment Strategies: Indirect Measures (Pick 1)
• Advisory Boards
• Senior Survey
• Employer/Professional School Survey
• Focus Groups/Interviews
• Other

Three-Year Assessment Plan
• All departments/programs should be assessed on three-year cycles
• Not all department/program competencies should or can be assessed
• Focus on 2-3 competencies at most
• Establish meaningful criteria
• At the end of the three-year cycle, examine the data for meaningful trends
• Changes are made only after the three-year cycle, and only if needed

An example of a three-year assessment cycle
Three-Year Schedule of Program Assessment Plan Submissions to the Assessment Committee, 2013-2016. The schedule is a table listing each program (Accounting, Adult Education, Athletic Training, Biology, Business Administration, Business Core (CPC), Chemistry, Communication Arts, Communication and Leadership, Computer Information Systems, Criminal Justice, Economics, Early Childhood Education (Educational Studies), Educational Leadership, Elementary Education, Engineering, English, Finance, Fine Arts, Geography, Graphic Design, Healthcare Leadership), whether it is graduate and/or undergraduate (G/UG), and an X in the academic year (2013-14, 2014-15, or 2015-16) in which its assessment plan is due to the committee.

Summary
• Choose an assessment model
• Identify (develop) programmatic competencies
• Write program-level competencies based upon the institutional meta-competencies
• Establish key performance outcomes (KPOs) based upon the competencies
• Identify 3 measures for each competency (2 direct, 1 indirect) and set success criteria (these may differ across university units)
• Develop data-gathering and reporting mechanisms and templates
• Develop close-the-loop mechanisms

Big Mistakes in Assessment
• Assuming that it will go away
• Trying to do too much, too soon
• Expecting to get it right the first time
• Not considering implementation issues when creating plans
• Borrowing plans and methods without acculturation

Big Mistakes in Assessment (continued)
• Demanding statistical research standards
• Doing it for accreditation instead of improvement
• Confusing assessment with student learning
• Making assessment the responsibility of one individual
• Assuming collecting data is doing assessment

Thank You
A special thanks to Dr. Susan Hatfield for her gracious input.
Questions?