
Setting and Communicating Assessment Vision and Expectations: Discussing Barriers and Solutions
Joni E. Spurlin, Ph.D., University Director of Assessment, NC State University
NC State Undergraduate Assessment Symposium, April 2008
Session Outcomes
The goal of this session is to engage participants in a thoughtful dialogue. By the end of this session, participants will be able to:
- Define why a vision for assessment is useful;
- Define assessment expectations; and
- Discuss barriers, and solutions to barriers, related to communicating the vision and expectations of assessment.
What is New or Different about this Session?
- Probably not much.
- From a psychological viewpoint, if there are “problems,” we first have to identify the issues, then talk about them and find solutions together.
- Use this for your institution, from an angle you might not have considered.
- Help synthesize some of the ideas you may have heard over the past two days.
Doing Assessment As If Learning Matters Most by Thomas A. Angelo
“Thus, in order to move beyond piecemeal and superficial change and toward transformation, we need to develop a learning community-like culture among the faculty and administrators involved in assessment. Four basic preconditions are key to this collective personal mastery. First, we need to develop shared trust; second, shared visions and goals; and third, shared language and concepts. Fourth, we need to identify research-based guidelines that can orient our assessment efforts toward the goal of creating productive learning communities.”
AAHE, May 1999: http://education.gsu.edu/ctl/outcomes/Doing%20Assessment%20As%20If%20Learning%20Matters%20Most.htm
Topics
Vision
- Of the assessment process
- Barriers
- Some solutions
- Examples
- Exercises
Expectations
- Of the assessment process
- Of student learning
- Barriers
- Some solutions
- Examples
- Exercises
Vision is…
- A picture of the future
- Inspirational
- A framework for planning
- Dreams and hopes balanced with reality: feasible, attainable
- “If the strategic plan is a ‘blueprint’ for an organization’s work, the vision is the ‘artist’s rendering’ of the achievement of that plan.”
- …scary
- …intimidating
Group Discussion
Why develop a “SHARED VISION” of the assessment process within your institution?
Why Develop a “SHARED VISION” of the Assessment Process?
- Increases the sense of shared responsibility for student learning
- Focuses direction for the next few years
- Pulls our sight above the day-to-day aspects of our work
- Stretches us beyond the status quo: stretches expectations, aspirations, and performance
- Provides an “outcome” against which to plan and assess
Group Discussion
In your institution, what are the barriers to developing a VISION?
Barriers
- Differences in how assessment is valued.
- Lack of understanding, and lack of clear and agreed-upon definitions, of accreditation, assessment, and accountability, including the differences and relationships among them.
- Differing attitudes and knowledge, and competing needs for resources.
- Colliding expectations among different parties: administration, faculty, students, assessment professionals, internal and external needs.
Exercise
Q: How do we overcome barriers?
Worksheet: examine the list of barriers on the previous slide, pick one, and list a way to overcome it.
5 minutes
Overcoming Barriers
- Define what you are talking about.
- Define and develop consensus about the purposes of assessment, assessment outcomes, and the use of assessment findings for informing:
  - Individual students
  - Academic or student affairs programs
  - The college or division
  - The institution
  - External agencies
Examples of Vision Statements
- Continuous, ongoing assessment that is fully integrated into all units of the university
- A campus culture that fully participates in, and uses information from, assessment initiatives to make decisions
- A process that matures and grows: assessment aligns with what faculty/staff are doing and why they are doing it
- A process that defines measurable learning outcomes which help reframe departments’ thinking
- Annual celebration of the process, successes, and failures
Other Vision Examples
The handout has examples from other institutions:
- American Association for Higher Education
- NC State
- University of Wyoming
- Millersville University
- Towson University
- Indiana State University
Examples of Definitions
- NC State University, Common Language: http://www.ncsu.edu/uap/academic-standards/uapr/process/language.html
- James Madison University, Dictionary of Student Outcome Assessment: http://people.jmu.edu/yangsx/
- Western Kentucky University: http://www.wku.edu/sacs/assessmentmanual.htm
- “Glossaries” available on the Internet Resources list: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
Adding to our vocabulary:
- Transparency
- Evidence of student learning
Exercises for Your Institution
See “Exercises” in the handout.
Topic 2: Expectations
CHEA’s View on Expectations
- “Institutions and programs are responsible for establishing clear statements of student learning outcomes and for collecting, interpreting, and using evidence of student achievement.”
- “Institutions and programs share responsibility with accrediting organizations for providing clear and credible information to constituents about what students learn.”
From: Statement of Mutual Responsibilities for Student Learning Outcomes: Accreditation, Institutions, and Programs: http://www.chea.org/pdf/stmntstudentlearningoutcomes9-03.pdf
Q: What are YOUR Expectations?
- List the expectations you have about the assessment process and student learning at your institution.
- What do we expect to accomplish through assessment?
- “What” and “how well” do you expect students to learn?
Re-Visiting Expectations
Why revisit? When, and how often, should we revisit?
- When something changes
- When processes are failing
Barriers: Multiple Focuses
[Diagram, repeated on the exercise slide below: “What and how well are our students learning?” at the center, surrounded by program & curricular decision-making; institutional decision-making; programmatic accreditation standards/criteria; regional accreditation standards/criteria; and accountability pressures from federal and state.]
- The assessment process is designed for a specific purpose or level; then changes occur, such as in motivations, internal needs, and external pressures.
- Lack of trust of the motives for, and use of, assessment.
- Fear of resistance and hostility.
- Miscommunication, especially because of differences in perspectives, expectations, and values.
- Fatigue: we all get tired of conversations about assessment, accreditation, accountability.
- Fatigue: does one process fit all?
- Despite communication, some don’t or won’t hear.
- Lack of assessment momentum after an accrediting body leaves.
Statement of Good Practice* (Suskie)
- Expectations are discussed at the program and institutional levels so that expectations are congruent and clear to all engaged in assessment processes.
- Faculty, assessment professionals, and leadership discuss the ramifications of programmatic accreditation, regional accreditation, and accountability issues.
- The institution promotes an atmosphere of critical reflection about teaching, learning, research, and services.
- Assessment reflects what stakeholders really care about.
- Assessment evidence is publicly available, visible, and consistent.
*For a summary, see Linda Suskie’s compilation “What is good assessment: A synthesis of principles of good practice,” from “What is ‘good’ assessment? A new model for fulfilling accreditation expectations,” presented at the First Annual International Assessment & Retention Conference, Phoenix, AZ, June 2006. See the Internet Resources for Higher Education Outcomes Assessment website: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
Exercise: What Issues Lead to Barriers?
[Diagram: “What and how well are our students learning?” at the center, surrounded by program & curricular decision-making; institutional decision-making; programmatic accreditation standards/criteria; regional accreditation standards/criteria; and accountability pressures from federal and state.]
Issues: Program Level vs. Institutional Level
Purpose
- Program level: formative, for program improvement.
- Institutional level: summative: what can we say about student learning for the entire institution?
Fears
- Program level: leads to fear of sharing results with decision makers.
- Institutional level: fears related to the inability to meet external pressures (accreditation, accountability). Are there needs for multiple processes because of multiple audiences?
Outcomes: relationship of program outcomes to institutional “outcomes,” goals, mission
- Program level: isn’t having program outcomes for every program enough? Do we need institutional outcomes?
- Institutional level: institutional outcomes: should we have them? How are they defined, and by whom? Are these just “general” outcomes? Do they map to program outcomes? (If having institutional outcomes is enough, why have program outcomes?)
Design of assessment
- Program level: the design of the assessment process may NOT be applicable to “roll up” for institutional reporting; it is more difficult to use institutional results for program improvement.
- Institutional level: the design of assessment is more difficult: how do we obtain data applicable for institutional summaries? CLA, NSSE, portfolio.
Cost and time
- Program level: defining the balance of cost/time against the value of evidence for the program. Faculty and students are more willing to participate for their own program.
- Institutional level: institutional evidence gathering is very costly. CLA: $6,500 plus time; NSSE: $6,000-7,000 plus time; student motivation is lower; portfolios require more faculty time.
Issue: Purpose of Assessment
- Program level: formative, for program improvement.
- Institutional level: summative: what can we say about student learning for the entire institution?
Solutions:
- Discuss faculty expectations, including workload and the purposes of the processes.
- Review programmatic and regional accreditation expectations.
- Identify what is needed for making program and curricular decisions.
- Set the expectation that program assessment results will be used at the institutional level (and/or vice versa).
Examples:
- The University of Montana: http://www.umt.edu/provost/pdf/academicassessmentplan2006.pdf
- Frostburg State University: http://www.frostburg.edu/admin/apie/ModelPlanningGroupAssessmentPlan08Mar06.pdf
- University of Nevada, Reno: http://www.unr.edu/assess/model/index.html
Workload:
- NCSU faculty survey results: 25% said they spent time on academic program assessment activities.
- California faculty workload study: 7% of faculty spent time on assessment.
Issue: Fears
- Program level: leads to fear of sharing results with decision makers.
- Institutional level: fears related to the inability to meet external pressures (accreditation, accountability). Are there needs for multiple processes because of multiple audiences?
Solutions:
- Fears have been raised because of the Spellings Commission Report and the Voluntary System of Accountability (VSA). Under the VSA, results of student learning outcomes are transparent, which increases discussion about the use of evidence, reporting integrity, and audience.
- Form a collective group that takes responsibility for student learning, develops the data-collection methodology, and works with leadership on HOW the data will be used, including building confidence in the validity of the data-collection methods.
- Reporting and documentation also include the limits of the methodology. Clearly state the appropriate uses of the data and the methodology’s limits.
- The institutional methodology should be as useful for as many audiences as possible, with clear and timely documentation of results, appropriate uses of results, and issues surrounding data collection.
Examples:
- VSA: as of 4/19/2008, 235 institutions had signed up: http://www.voluntarysystem.org/index.cfm?page=templates
- Case Western: http://www.case.edu/provost/uplan/Documents/Assessment%20Task%20Force_final2.pdf
Issue: Outcomes
- Program level: relationship of program outcomes to institutional outcomes. Isn’t having program outcomes for every program enough? Do we need institutional outcomes?
- Institutional level: institutional outcomes: should we have them? How are they defined, and by whom? Are these just “general” outcomes? Do they map to program outcomes? (If having institutional outcomes is enough, why have program outcomes?)
Solutions (in addition to issues already discussed):
- The institution decides what type and level of outcomes to develop besides program outcomes, e.g., for general education/core curriculum, college-wide, institutional.
- Focus groups with students and faculty: What do students expect to learn? What do faculty expect students to learn? What do faculty expect ALL students to learn?
- Alumni (one to three years after graduation) have great perspective on what they did learn and what they wish they had learned. Use their insight when reviewing or developing program-level and institutional-level outcomes.
- Treat the student experience as a comprehensive view that includes experiences both inside and outside the classroom (combining curricular and co-curricular assessment processes).
- Electronic assessment management tools can help map program and institutional outcomes and goals.
Examples of linked program and institutional outcomes:
- Dickinson State University: http://www.dsu.nodak.edu/Catalog/fine_arts/art_majors_minors.htm
- St. Olaf College: http://www.stolaf.edu/committees/curriculum/media/PendingMatters/CallForProgramILOs.pdf
- Pierce College: http://www.lapc.ca.us/offices/preview/ILOs.doc
Examples of institutional outcomes:
- University of Nebraska Omaha: http://www.unomaha.edu/assessment/index.php
- Worcester Polytechnic Institute: http://www.wpi.edu/Academics/Outcomes/UOAC%20assess%20plan%20April%202005%20rev%20100405.pdf
Issue: Design of Assessment
- Program level: the design of the assessment process may NOT be applicable to “roll up” for institutional reporting; it is more difficult to use institutional results for program improvement.
- Institutional level: the design of assessment is more difficult: how do we obtain data applicable for institutional summaries? CLA, NSSE, portfolio.
Solutions:
- Institutional level: identify what administrators need, how often, and why.
- Discuss how program assessment results can be used (and should not be used) for decision making at the institutional level; discuss how institutional results can be used (and should not be used) for decision making at the program level.
- Key words at NCSU: meaningful, manageable, effective, efficient.
- Effective assessment = “assessment-as-learning”: the assessment process itself increases student learning, e.g., portfolio feedback, reflections, self-assessment, peer-assessment, etc.
- An institutional-level assessment committee (comprising key faculty members, senior academic and student affairs administrators, students, and staff) coordinates, prioritizes, and uses assessment data to improve student learning and for institutional improvement purposes.
- Consider the pros and cons of centralized vs. decentralized assessment processes.
Examples:
- University of Colorado at Boulder: http://www.colorado.edu/pba/outcomes/handbook.htm
- University of Maine: http://www.umaine.edu/provost/committees/SLOA/index.htm
Issue: Resources
- Program level: cost and time: defining the balance of cost/time against the value of evidence for the program. Faculty and students are more willing to participate for their own program.
- Institutional level: institutional evidence gathering is very costly. CLA: $6,500 plus time; NSSE: $6,000-7,000 plus time; student motivation is lower; portfolios require more faculty time.
Solutions:
- Embrace the concept that assessment takes time. Time to develop or review program outcomes: up to one year. Time to develop or review institutional outcomes: 3-5 years.
- Determine the costs of each assessment design, then set priorities for the use of resources: balance the value of the assessment evidence against the effort or resources involved in gathering it. Determine what is required.
- Weigh the importance of, and requirements for, benchmarking with other institutions using nationally developed instruments.
Examples:
- Assessment, Accountability, and Student Learning Outcomes by Richard Frye: http://www.ac.wwu.edu/~dialogue/issue2.html
- New Directions for Community Colleges, Volume 2004, Issue 126, pages 1-109 (Summer 2004): http://www3.interscience.wiley.com/journal/109580268/issue
- Assessment Practice in Student Affairs: An Applications Manual by John H. Schuh and M. Lee Upcraft: http://www.josseybass.com/WileyCDA/WileyTitle/productCd-078795053X.html
Exercises for Your Institution
See “Exercises” in the handout.
Final Slide!
Communicate, communicate:
- Vision and expectations about assessment processes
- Vision and expectations about what students should be learning
- How well students are learning (increase the transparency of evidence across programs, internal and external to the institution)
Contact Information
Joni E. Spurlin, Ph.D.
University Director of Assessment
University Planning and Analysis
Campus Box 7002
NC State University
Raleigh, NC 27695
919-515-6209
[email protected]
http://www2.acs.ncsu.edu/UPA/assmt/index.htm