

  • Number of slides: 20

Responsive Evaluation in the Community College: An Alternative Approach to Evaluating Programs
Nathan R. Durdella, Ph.D.
Monterey, California
April 10, 2006
Building an Information Community: IT and Research Working Together

Presentation Overview
• Background, Design, & Methods
• Results: Project HOPE & MESA
• Findings & Conclusions

Background, Design, & Methods

Research Context and Problem
• Increasing institutional and accreditation requirements to document student outcomes
• Dominant model: systematic evaluation (Rossi, 1993)
  – Program objectives, outcomes
• Alternative evaluation models
  – Recently used successfully (Shapiro, 1988)
  – Responsive evaluation

Evaluation Models: Systematic vs. Responsive Evaluation
• Stake’s problem with systematic evaluation:
  – Systematic evaluation’s narrow focus on assessing a program’s goals, measurements, and standards (Shadish et al., 1991)
  – Systematic evaluations are best suited for summative evaluations
• Responsive evaluation’s focus:
  – The primary purpose should be “to respond to audience requirements for information” (Guba, 1978, p. 34)
  – Process-oriented issues
    • Program implementation
  – Stakeholder-based
    • Locally generated criteria

Stake’s Responsive Evaluation
• Responsive evaluation’s prescriptive steps:
  1. Program staff/participants “are identified and solicited for those claims” (Guba & Lincoln, 1989, p. 42)
  2. Issues of program staff and participants are organized and brought to staff members for comment
  3. Issues not resolved are used as “organizers for information collection” (Guba & Lincoln, 1989, p. 42)
  4. The evaluator approaches each audience member with the evaluation results to resolve all issues

Research Questions
• Two research questions:
  1. How effectively does responsive evaluation theory work as a way to evaluate instructional support programs?
  2. How does responsive evaluation articulate with systematic evaluation approaches?

Research Design and Methods
• Design: Comparative, qualitative case study
• Case selection
  – Institutions: Cerritos College & Santa Ana College = HSIs
  – Programs: Project HOPE & MESA
• Data sources and sampling:
  – Interviews and journals
  – 2-step procedure: purposeful and random
• Data collection
  – Interviews: 19 total subjects, 23 total interviews
    • Per program: 3 students, 2 staff, 2 faculty, and 2-3 administrators
    • Program directors were interviewed 3 times

Results: Project HOPE & MESA

Results: Project HOPE
1. Faculty resisted cultural pedagogy
2. Campus did not value Project HOPE
Faculty: “It’s a method of learning where you would approach your teaching looking at culture.” “They don’t feel like it would have any impact on their students.”
Faculty and administrators: “We need to serve all of our students equitably.” “Well, we’re not really a minority anymore.”
Project HOPE staff: “There are issues of, I’d say, with respect to this program and the college in general about the value of it, the need for it, because I think there’s a prevailing thought that we already do all we can for students of color just by default, because we have such a diverse student population, to have programs like these.”

Results: Project HOPE
• Guidance counseling: “Well, now I know exactly what I am supposed to be taking for every, every semester and everything.”
• Parent, family participation: “[My mom] was telling my dad, ‘We have to do our taxes because they have to file.’ So now she knows what we’re talking about when we have to do our financial aid paperwork.”
• Health Occupations 100 as central: “I definitely know I want to stay in L.A. and really serve those communities in need.”
• Program communication, coordination: “There was nothing said or nothing exchanged.”
• Lack of faculty buy-in, participation: “The only thing I ever hear is why aren’t we part of this.”

Results: MESA Program
• Major issue: Program impact
  – In general, MESA students outperform other math/science students at SAC
• MESA staff: central to students
  “I know you really want to go, call me. If you can’t make it, call me. If you can’t come to class, tell me why. If you think you’re doing bad in class, just talk to me. We can work something out.”
• Successful program coordination
  “We have an organized system.”

Results: MESA Program
• Other emerging themes:
  – Student finances: book loans & more
    “I then use the money I saved to attend events sponsored by the Transfer Center.”
  – MESA Study Center
    “The MESA Study Center is a good place if one wants to share a friend’s company and eat lunch while one studies.”
  – Program focus: no parent participation
    “A big obstacle for me as well was the lack of information available to my parents.”
  – Course scheduling, engineering
    “These classes are not offered every semester.”

Findings & Conclusions

Findings: Responsive Evaluation
• Ongoing programs, categorically funded or institutionalized
• Program staff: cooperation, participation
• Programs: challenges, underlying problems
• Program processes, improvement
• Programmatic or institutional need
  – Not solely program impact

Further Findings: Responsive Evaluation
• Politically charged context
• Personality and power conflicts
  – Project HOPE: preexisting
  – UC, well-established MESA programs
• Responsiveness: no assurance the model responds to all stakeholders
  – Identification, development of issues

Findings: Responsive & Systematic Models
• Models articulate well
  – Project HOPE: prior evaluations vs. responsive evaluation
  – MESA: program impact
• Results meaningful
  – Project HOPE: new “face”
    • But reinforce perceptions
  – MESA: few surprises but useful
    • Student voices

Findings: Responsive Evaluator
• Initial phases: conditions present to conduct evaluation
• Balance between encouraging participation and maintaining control
  – Stakeholder-based models
• Key: understanding programs as an insider while maintaining checks
• Presentation of results: critical

Conclusions: Responsive Evaluation in the Community College
• Institutional charge: respond to students, faculty, staff, stakeholders
• Responsive evaluation: powerful tool for community college programs
• Community colleges: limited resources
• Research offices: overburdened

Thank you for attending… Questions or comments?
Nathan R. Durdella, Ph.D.
Cerritos College
ndurdella@cerritos.edu
Building an Information Community: IT and Research Working Together