
Federal Education Software Study: Findings & Implications
SIIA Webcast, May 15, 2007

Announcements
• NECC
  – Register at http://www.siia.net/events/
  – SIIA/COSN Leadership Feedback Forum Breakfast: Web 2010 in Education, June 26, 7:15-8:30 am
  – SIIA Member Breakfast & Launch of Vision K-20 Website, June 27, 7:00-8:30 am
• SIIA Vision K-20
  – Outreach campaign to education stakeholders to ensure our educational system is relevant/competitive in the 21st century.
  – Web launch at NECC; PDF at http://www.siia.net/estore/
  – SIIA members: submit evidence and case studies to kbillings@siia.net by May 30

AGENDA
• Welcome & Introductions
• Study Briefing
  – Mark Dynarski
  – Phoebe Cottingham
• Q&A / Discussion
• Impact, Implications & Next Steps
  – Debbie Stirling
  – Denis Newman

Webcast Speakers
• Phoebe Cottingham, Commissioner, National Center for Education Evaluation and Regional Assistance, U.S. Department of Education
• Audrey Pendleton, Project Officer, U.S. Department of Education
• Mark Dynarski, Senior Fellow, Mathematica Policy Research, Inc., and Senior Director, National Study of the Effectiveness of Educational Technology Interventions
• Deborah L. Stirling, Research Science Director, Pearson Digital Learning
• Denis Newman, President, Empirical Education, Inc., and Co-Chair, SIIA Evaluation Research Working Group
• Mark Schneiderman, Director of Education Policy, SIIA (Moderator)

Effectiveness of Reading and Math Software Products: Findings From the First Student Cohort
Mark Dynarski, May 2007

Study Synopsis
• 16 reading and math software products implemented in 132 volunteer schools
• Treatment teachers could use products; control teachers could not
• Companies provided trainers and other types of support
• Study purchased various upgrades and hardware components
• Test scores at the end of the school year were not statistically different

Study Size
[Chart]

Implementation Framework
• Teacher training [O, R]
• Amount of use [I, R]
• Technical difficulties and teacher support [I]
• Student and teacher roles [O]
• Student on-task behavior [O]
• Use of performance reports [I]
Key: O indicates observations, R records, I interviews

General Implementation Findings
• Nearly all teachers received training and believed it prepared them to use products
• Difficulties using hardware were mostly minor
• Logged-in time was about 10 percent of instructional time
• When products are used:
  – teachers are more likely to be "facilitators"
  – students are more likely to work on their own
  – math: more students on task

Logged-In Time: Product Records
[Chart]

Difference in Technology Use in Treatment and Control Classrooms: First Grade
[Chart]

Effects on Classroom Practices (Percent Difference: Teacher as Facilitator)
[Chart comparing First Grade, Fourth Grade, Sixth Grade, and Algebra]
Note: * Significantly different from zero at the 0.05 level

Effects on Classroom Practices (Percent Difference: Students On Task)
[Chart comparing First Grade, Fourth Grade, Sixth Grade, and Algebra]
Note: * Significantly different from zero at the 0.05 level

Estimating Effects
• Outcome: spring test score
• 3-level model (students, classrooms, schools); a model sketch follows below
• 3-level model extended to estimate effects of conditions and practices (implementation)
  – interactions of the treatment effect with classroom and school characteristics
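To make the 3-level setup above concrete, here is a minimal sketch of such a model in Python using statsmodels, with random intercepts for schools and a variance component for classrooms nested within schools. This is not the study's actual analysis code: the input file and the column names (spring_score, fall_score, treatment, classroom, school) are hypothetical placeholders.

```python
# Minimal sketch of a 3-level model (students in classrooms in schools).
# Hypothetical column names and input file; not the study's analysis code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_scores.csv")  # hypothetical student-level data

model = smf.mixedlm(
    # Fixed effects: treatment indicator (average product effect)
    # plus a fall pretest score as a covariate.
    "spring_score ~ treatment + fall_score",
    data=df,
    groups="school",                               # level-3 random intercepts (schools)
    re_formula="1",
    vc_formula={"classroom": "0 + C(classroom)"},  # level-2 variance component (classrooms)
)
result = model.fit()
print(result.summary())

# The "extended" model on the slide adds interactions of the treatment
# effect with classroom/school characteristics, e.g.:
#   "spring_score ~ treatment * teacher_experience + fall_score"
```

The treatment coefficient, scaled by the outcome's standard deviation, corresponds to the kind of effect sizes shown in the charts that follow.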

Test Scores: First Grade, SAT-9 Reading Score
[Chart of effect sizes for Overall Score, Sounds and Letters, Word Reading, and Sentence Reading]
Note: * Significantly different from zero at the 0.05 level

Test Scores: First Grade, Test of Word Reading Efficiency Score
[Chart of effect sizes for Overall Score, Phonemic Decoding Efficiency, and Sight Word Efficiency]
Note: None of the effect sizes is significantly different from zero at the 0.05 level

Effect Sizes By School: First Grade
[Chart]

Interactions: First Grade
• Larger effects
  – More experienced teachers
  – Smaller student-teacher ratio
• No relationship
  – Product usage
  – Problems getting access
  – Technical difficulties
  – Computer specialist in school
  – Professional development last year on using technology
  – Poverty, urban area, African-American students, Hispanic students, special education students

Test Scores: Fourth Grade, SAT-10 Reading Score
[Chart of effect sizes for Overall Score, Vocabulary, Word Study Skills, and Comprehension]
Note: None of the effect sizes is significantly different from zero at the 0.05 level

Interactions: Fourth Grade
• Larger effects
  – Product usage
• No relationship
  – Problems getting access
  – Technical difficulties
  – Computer specialist in school
  – Professional development last year on using technology
  – Poverty, urban area

Test Scores: Sixth Grade, SAT-10 Math Score
[Chart of effect sizes for Overall Score, Procedures, and Problem Solving]
Note: None of the effect sizes is significantly different from zero at the 0.05 level

Interactions: Sixth Grade
• No statistically significant relationships

Test Scores: Algebra, ETS Algebra Exam
[Chart of effect sizes for Overall Score, Concepts, Processes, and Skills]
Note: None of the effect sizes is significantly different from zero at the 0.05 level

Interactions: Algebra
• Smaller effects when teachers had technical difficulties

Study Tradeoffs
• 16 reading and math products
  – Many products and types of educational technology are not in the study
• Precision to detect small effect sizes
  – Average product effect reported
• Experimental design
  – Teachers had not used these products in their current classrooms

Second Year Study
• 10 products; data from 47 schools; treatment and control teachers with a new cohort of students
• Test whether effects are related to teacher experience
• Effects reported for products

Mark Schneiderman
Director, Education Policy, SIIA

Congressional Request
A study: "(A) on the conditions and practices under which educational technology is effective in increasing student academic achievement; and (B) on the conditions and practices that increase the ability of teachers to integrate technology effectively into curricula and instruction, that enhance the learning environment and opportunities, and that increase student academic achievement, including technology literacy."

SIIA Involvement
• Congressional Authorization
• Study Design
• Concept Paper – Identifying Software & Selecting Schools
• Ad-Hoc Study Participants Group
• Statement & Press Outreach
• Talking Points

Talking Points
• The question today is not "if" technology is useful, but "how and when" technology can best be used to improve education.
• The study indicates that the studied software worked at least as well as, and in some classrooms better than, other instructional methods and curriculum, such as textbooks.
• The study confirmed what we already know: effective implementation is critical to success, and the teacher is still the most important variable.
• The study reports on the average effect of a few select software applications, not all educational software nor other technologies used in education.
• The study is not done.

SIIA Survey
http://www.keysurvey.com/survey/152392/134d/
• What share of your customers (current or potential) have mentioned the study?
• How concerned are you with the study's potential negative impact on sales?
• To what extent has the study had any preliminary impact on your product evaluation research agenda?
• Based on the conduct and outcome of this study, I am less likely to participate in future third-party (e.g., government) studies.
• Have you used the SIIA talking points to help respond to customer questions about the study?

SIIA Software Implementation Toolkit: Guidelines for K-12 Educators (April 2007)
• Purposes:
  – Help K-12 educational institutions make better use of software products through the use of effective implementation practices.
  – Highlight the importance and impact of implementation practices on obtaining results from software use.
  – Provide K-12 educators and administrators with practical tools to use in the implementation process.
• Software Implementation Components:
  1. Determine objectives and obtain buy-in
  2. Integration planning
  3. Logistics planning
  4. Delivery and installation of software
  5. Professional development
  6. Implementation monitoring and software support
  7. Program evaluation

SIIA Software Implementation Toolkit: Guidelines for K-12 Educators (April 2007)
• Toolkit Elements:
  – 7 Implementation Components: Descriptions & Scenarios
  – Timeline Model for Software Implementation Planning
  – The Software Implementation Checklist for Educators
  – Software Implementation Planning Forms
  – Resources
• Toolkit Notes:
  – Software broadly defined
  – 56 pages
  – Share with customers
  – http://www.siia.net/education/foreducators/toolkit_0407.pdf

SIIA Research & Evaluation Working Group
• Objective: Contribute to the improvement of Ed Tech research and dissemination efforts by government, industry, and the nonprofit sector.
• Co-Chairs:
  – Rob Foshay, Texas Instruments
  – Denis Newman, Empirical Education
• Activities aimed at:
  – enhancing the quality, usability, and timeliness of research and dissemination;
  – enhancing collaboration in R&E and dissemination activities among stakeholders; and
  – improving the knowledge base about the implementation and effectiveness of technology-supported educational interventions.
• Join: Contact marks@siia.net

SIIA Resources
• SIIA Statement on the National Study (April 5, 2007)
  http://www.siia.net/press/releases/SIIAStatement_USEDStudy4-5-07.pdf
• SIIA Software Implementation Toolkit: Guidelines for K-12 Educators (April 2007)
  http://www.siia.net/education/foreducators/toolkit_0407.pdf
• SIIA Talking Points on Federal Education Software Study (April 26, 2007)
  http://www.siia.net/govt/docs/members/SIIAsoftwarestudy.TPfinal.pdf
• SIIA "Guide to Conducting Scientifically Based Research" and Related Resources
  http://www.siia.net/govt/issue.asp?issue=EDTK#9

The Good, Bad, and Ugly of Vendor Participation
Deborah L. Stirling, Ph.D., Research Science Director, Pearson Digital Learning

The Good, Bad, and Ugly of Vendor Participation
• Met with SRI prior to recruitment
• Recruitment
  – Sampling
  – Not enough time (Feb to June)
  – Supplied list of school districts
  – Participated in site recruitment visits

The Good, Bad, and Ugly of Vendor Participation
• Data collection
  – School districts voiced concerns about the quality of the pre- and post-testing and the personnel doing the observations
  – Fall pre-testing still going on in December
  – 4th graders took the SAT-9 on a gym floor
  – Mathematica responsive to inquiry

The Good, Bad, and Ugly of Vendor Participation
• Incentive to participate
  – Promised access to product-specific findings and unrestricted use of those findings
  – To date, vendors have not received data
• SRI's development of observation and interview protocols
  – Conceived as a collaborative effort
  – Mathematica said the instruments were public information
  – SRI said the fidelity index and developed instruments were proprietary and would not share them

Reporting to Congress vs. Reporting to Schools
Denis Newman, President, Empirical Education Inc.
SIIA Webcast concerning "Effectiveness of Reading and Mathematics Software Products: Findings from the First Cohort"
May 15, 2007

Differences in conducting research
• For Congress
  – Abstract concept, e.g., "fourth grade reading software products," rather than concrete examples
  – What's the average effect across all products?
  – Broad, maximally diverse sample
• For school districts
  – Want to know about a specific product
  – Will it work in my district?
  – Need good exemplars that can illustrate realistic implementation

Think of an RCT as a case study
• A case study with a quantitative outcome
• Don't spread your sample of teachers too thin
  – Conduct 3 or 4 "case studies" in representative districts
  – Combine these into a single study to obtain the "statistical power" needed (a rough power sketch follows below)
• Shows what can happen in a typical instance of implementation
• The RCT should tell a story that a district decision-maker can understand
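As a rough illustration of the "combine 3 or 4 case studies for statistical power" point above, the sketch below shows how pooling sites raises power. Every number in it (effect size, classrooms per site, class size, intraclass correlation) is an illustrative assumption, not a figure from the federal study, and the clustering adjustment is a simple design-effect approximation rather than a full multilevel power calculation.

```python
# Illustrative power sketch: pooling several district "case studies"
# into one RCT. All numbers are assumptions, not study values.
from statsmodels.stats.power import TTestIndPower

effect_size = 0.30         # assumed standardized effect we want to detect
classrooms_per_site = 10   # assumed classrooms per district, split across arms
students_per_class = 20    # assumed class size
icc = 0.10                 # assumed intraclass correlation within classrooms

# Clustering of students within classrooms inflates variance; approximate
# the loss of information with a design effect.
design_effect = 1 + (students_per_class - 1) * icc
calc = TTestIndPower()

for n_sites in (1, 2, 3, 4):
    students_per_arm = n_sites * classrooms_per_site * students_per_class / 2
    effective_n = students_per_arm / design_effect
    power = calc.power(effect_size=effect_size, nobs1=effective_n, alpha=0.05)
    print(f"{n_sites} site(s): effective n per arm ≈ {effective_n:.0f}, power ≈ {power:.2f}")
```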

Measure impact on teachers, not just students
• The most consistent finding in the study:
  – Leader changed to facilitator
  – Lecture changed to individual work
• Document teaching practices found more in program classrooms than in control classrooms
  – Do these practices correlate with student outcomes?
  – These "mediators" can be a very useful way to show your product works (a sketch of this check follows below)
• Don't expect to always find simple correlations of outcomes with implementation measures
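The mediator check suggested above can be sketched in two regression steps: does the program shift the practice, and does the practice predict outcomes with treatment held constant? The input file and column names (facilitator_share, treatment, spring_score) are hypothetical, and this is a simplified regression version of the idea rather than a formal mediation analysis.

```python
# Minimal sketch of a mediator ("practice") check on hypothetical
# classroom-level data; a simple regression version, not full mediation.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("classroom_observations.csv")  # hypothetical classroom-level data

# Step 1: does the program shift the practice (e.g., share of observed
# time the teacher acts as a facilitator)?
practice_model = smf.ols("facilitator_share ~ treatment", data=df).fit()

# Step 2: does the practice predict student outcomes once treatment is held constant?
outcome_model = smf.ols("spring_score ~ treatment + facilitator_share", data=df).fit()

print("treatment -> practice:", practice_model.params["treatment"])
print("practice -> outcome (adjusted):", outcome_model.params["facilitator_share"])
```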

IES Grant Programs
• National Center for Education Research
• This year includes a program specifically for education technology
• Calls for:
  – RCTs to show effectiveness
  – Strong collaboration with school systems
• Requires some prior research suggesting your program is effective
• Deadlines: end of July and end of October

Denis Newman
Empirical Education Inc.
425 Sherman Ave., Suite 210
Palo Alto, CA 94306
(650) 328-1734 x112
dn@empiricaleducation.com
www.empiricaleducation.com