Engaging Practitioners in Program Evaluation

Author: Sue Bainter
Publisher:
ISBN:
Category:
Languages: en
Pages: 15

Book Description
Background: Many early intervention teams are shifting their service delivery for children with disabilities from a child-focused model to one that strengthens the competence and confidence of the child's caregivers (parents, childcare providers, preschool teachers). The use of coaching strategies in a primary-coach model of service delivery continues to evolve as a means of supporting young children who have disabilities in the natural home and community environments where they and their families interact every day (Shelden & Rush, 2006). The early intervention team in this study felt their coaching strategies were making a difference; however, they had no systematic way to evaluate or document their efforts.

Purpose: To engage early intervention practitioners in a program evaluation of their efforts toward implementation of intended coaching practices with caregivers of young children with disabilities; to examine the immediate effects of those coaching practices on caregivers' behaviors; and to gather evidence of practitioner-caregiver partnerships and desired caregiver competence and confidence.

Setting: Homes and preschool classrooms of subjects associated with a midsize public school district early intervention/special education program that provides home/community visits as a state-mandated service for children who have disabilities, ages birth to 5 years.

Study Sample: Five early intervention practitioners (two speech therapists and three early childhood teachers) and six caregivers. Data collection focused on six dyads and the interactions between the practitioner and the caregiver of a child enrolled in the public school early intervention/special education program.

Intervention: Early intervention practitioners meet weekly in the homes or classrooms of children's caregivers. Visits are jointly designed in terms of purpose, scheduling, and frequency, and practitioners aim to focus caregivers on the priority identified for their child/family. Coaching strategies are used to build a caregiver's ability to involve their child effectively in everyday learning opportunities and child/family interests so as to enhance the child's participation and development. All coaching interactions involve some amount of observation, reflection, and joint planning, but also include discussion, demonstration, practice, problem solving, questions, and feedback. The practitioner and caregiver routinely assess the effectiveness of the coaching partnership and activities in light of their intentions and the child's progress (Hanft, Rush & Shelden, 2004).

Research Design: Descriptive; Other Quantitative; Control or Comparison Condition.

Data Collection and Analysis: A meeting of all district-employed early intervention practitioners (n = 21) was used to solicit descriptions of perceived roles and program outcomes. Themes were identified and cross-referenced with principles of coaching generated from a statewide training that most of the practitioners had completed. The evaluators then reframed the themes using the literature on evidence-based practices in early intervention to develop a core set of behavioral "success" indicators for both the practitioner and the caregiver. Six home/community visits/meetings were videotaped; each involved the interactions between one practitioner and one caregiver.
Follow-up telephone interviews were conducted with each practitioner and caregiver independently to determine each participant's satisfaction with the visit, their perception of their role, and the actions taken. Two evaluators independently watched the videotapes, using two-minute intervals to note (+/-) every behavioral indicator evidenced at least once in that interval for both the practitioner and the caregiver. Mean rate-per-minute calculations were used to compare the behaviors of practitioners to those of caregivers, and to compare the behaviors of both in home visits with those observed in the community visits (2), which were both intervention planning meetings. Responses to post-visit yes/no interview questions were noted as (+/-), and the total number of positive responses was divided by the total number of questions asked to compute a percentage of positive responses per partner. A mean percentage was computed independently for the group of practitioners and the group of caregivers observed across the six visits.

Findings: Practitioners and caregivers were engaged, and perceived themselves to be engaged, during the visits, with the greatest share of practitioner and caregiver time in each dyad spent in collaboration/partnership building. The smallest share of the practitioners' time was spent engaging caregivers in planned or spontaneous learning opportunities with the child or in planning next steps. Despite their eager engagement in developing a partnership with practitioners in each visit (sharing, asking questions), the caregivers' rates of active reflection, problem solving, and generation of new ideas for their children's targets and learning opportunities were notably low compared with the rate of practitioner behaviors aimed at promoting those caregiver behaviors (twice as frequent). The practitioners engaged least in behaviors that encouraged caregiver engagement with the children during or after the visit; not surprisingly, the caregivers engaged in these actions very seldom during the visits. Practitioners and caregivers did not differ dramatically across the six dyads. Overall, practitioners promoted reflection more in home visits than in meetings, and parent-caregivers were more likely than teacher-caregivers to practice new strategies or describe possible learning opportunities for the children.

Conclusion: Because the EI team was engaged in developing the behavioral "success" indicators used for coding the observed practitioner-caregiver interactions, the study provided the team with meaningful information in light of their intentions. The results indicated that the five practitioners participating in this study were working to engage the caregivers as partners and encouraging them to reflect, problem-solve, and identify child-learning opportunities. The caregivers involved showed a willingness to partner in planning the focus for discussion or observation of their child, but demonstrated notably lower rates of action and participatory activity. Both the practitioners and the caregivers perceived contributions to a partnership in their visits. Practitioners might benefit from greater focus and time spent on promoting caregiver-child interactions during or after the visits. However, it is not known from this study whether these or current efforts are enough to effect change in the caregivers' behaviors or those of their children.

References: Hanft, B., Rush, D., & Shelden, M. (2004). Coaching families and colleagues in early childhood. Baltimore, MD: Paul H. Brookes. Shelden, M., & Rush, D. (2006, June).
Personal Development: Facilitators' Summer Institute, Nebraska Department of Education, Lincoln.

Citation: Bainter, S., & Marvin, C. (2006). Engaging practitioners in program evaluation: A preliminary report of perceptions and observations of practitioner-caregiver partnerships in early intervention. University of Nebraska-Lincoln.

Appended are: (1) Practitioner Follow-Up Survey and Caregiver Follow-Up Survey; (2) raw data resulting from the team meeting of January 2005; and (3) Practitioner Behavioral Indicators (to be observed in the practitioner) and Caregiver Behavioral Indicators (to be observed in the caregiver). (Contains 2 tables.)
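
A minimal sketch can make the scoring arithmetic described in the abstract above concrete: partial-interval coding of videotaped visits reduced to a mean rate per minute, and yes/no interview responses reduced to a percentage of positive responses. The function names and sample data below are hypothetical; this illustrates the calculations as described, not the authors' actual analysis.

```python
# Hypothetical illustration of the scoring described in Bainter & Marvin (2006):
# two-minute partial-interval coding -> mean rate per minute, and
# yes/no interview answers -> percentage of positive responses.

def rate_per_minute(interval_hits, interval_length_min=2.0):
    """interval_hits: list of 0/1 flags, one per two-minute interval,
    marking whether a behavioral indicator was seen at least once."""
    total_minutes = len(interval_hits) * interval_length_min
    return sum(interval_hits) / total_minutes

def percent_positive(answers):
    """answers: list of True/False responses to post-visit yes/no questions."""
    return 100.0 * sum(answers) / len(answers)

# Example: one dyad observed across a 20-minute visit (10 intervals).
practitioner_hits = [1, 1, 0, 1, 1, 1, 0, 1, 1, 0]   # indicator seen in 7 intervals
caregiver_hits    = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # indicator seen in 4 intervals

print(rate_per_minute(practitioner_hits))  # 0.35 occurrences per minute
print(rate_per_minute(caregiver_hits))     # 0.20 occurrences per minute

# Post-visit interview: 5 of 6 yes/no questions answered positively.
print(percent_positive([True, True, True, False, True, True]))  # ~83.3
```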

Program Evaluation

Author: Robert O. Brinkerhoff
Publisher: Springer Science & Business Media
ISBN: 9401176302
Category: Education
Languages: en
Pages: 408

Book Description
Please glance over the questions that follow and read the answers to those that are of interest.

Q: What does this manual do?
A: This manual guides the user through designing an evaluation.

Q: Who can use it?
A: Anyone interested or involved in evaluating professional training or inservice education programs. The primary users will be staff members who are doing their own program evaluation, maybe for the first time. (Experienced evaluators or other professional educators can find useful guides and worksheets in it.)

Q: If I work through this manual, what will I accomplish?
A: You will develop one or more evaluation designs, and perhaps you'll also use the designs to evaluate something, either to make it better or to document its current value.

Q: What is an evaluation design?
A: An evaluation design is a conceptual and procedural map for getting important information about training efforts to people who can use it.
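
The final answer above describes an evaluation design as a conceptual and procedural map. A small data structure can make that idea concrete; the sketch below is a hypothetical illustration (the class and field names are my own, not the manual's) of the elements such a map ties together: the questions to be answered, the procedures for answering them, and the audiences who will use the information.

```python
# A hypothetical sketch of an "evaluation design" as a structured record.
# Field names and example values are illustrative, not taken from the manual.
from dataclasses import dataclass

@dataclass
class EvaluationDesign:
    purpose: str                      # why the evaluation is being done
    questions: list[str]              # what the evaluation must answer
    data_sources: dict[str, str]      # question -> how it will be answered
    audiences: list[str]              # who receives and uses the findings
    schedule: str = "end of program"  # when information is collected/reported

design = EvaluationDesign(
    purpose="Improve an inservice training program",
    questions=["Did participants apply the new skills on the job?"],
    data_sources={"Did participants apply the new skills on the job?":
                  "Follow-up observations and supervisor interviews"},
    audiences=["Program staff", "District administrators"],
)
print(design.purpose)
```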

Handbook of Practical Program Evaluation

Author: Kathryn E. Newcomer
Publisher: John Wiley & Sons
ISBN: 1118893611
Category: Business & Economics
Languages: en
Pages: 912

Book Description
The leading program evaluation reference, updated with the latest tools and techniques. The Handbook of Practical Program Evaluation provides tools for managers and evaluators to address questions about the performance of public and nonprofit programs. Neatly integrating authoritative, high-level information with practicality and readability, this guide gives you the tools and processes you need to analyze your program's operations and outcomes more accurately. This new fourth edition has been thoroughly updated and revised, with new coverage of the latest evaluation methods, including:

- Culturally responsive evaluation
- Adopting designs and tools to evaluate multi-service community change programs
- Using role playing to collect data
- Using cognitive interviewing to pre-test surveys
- Coding qualitative data

You'll discover robust analysis methods that produce a more accurate picture of program results, and learn how to trace causality back to the source to see how much of the outcome can be directly attributed to the program. Written by award-winning experts at the top of the field, this book also contains contributions from the leading evaluation authorities among academics and practitioners to provide the most comprehensive, up-to-date reference on the topic. Valid and reliable data constitute the bedrock of accurate analysis, and since funding relies more heavily on program analysis than ever before, you cannot afford to rely on weak or outdated methods. This book gives you expert insight and leading-edge tools that help you paint a more accurate picture of your program's processes and results, including:

- Obtaining valid, reliable, and credible performance data
- Engaging and working with stakeholders to design valuable evaluations and performance monitoring systems
- Assessing program outcomes and tracing desired outcomes to program activities
- Providing robust analyses of both quantitative and qualitative data

Governmental bodies, foundations, individual donors, and other funding bodies are increasingly demanding information on the use of program funds and program results. The Handbook of Practical Program Evaluation shows you how to collect and present valid and reliable data about programs.

Program Evaluation

Author: David Daniel Royse
Publisher: Brooks Cole
ISBN:
Category: Education
Languages: en
Pages: 344

Book Description
Well-known in the field, Royse and Thyer present and simplify all the essentials needed for a critical appreciation of evaluation issues and methodology. From this text, students will learn how to gather evidence and demonstrate that their interventions and programs are effective in improving clients' lives. This text is known for its student-friendly writing style and clear presentation of concepts, as well as its hands-on and applied focus.

Program Evaluation and Performance Measurement

Author: James C. McDavid
Publisher: SAGE Publications
ISBN: 145228959X
Category: Social Science
Languages: en
Pages: 561

Book Description
Program Evaluation and Performance Measurement: An Introduction to Practice, Second Edition offers an accessible, practical introduction to program evaluation and performance measurement for public and non-profit organizations, and has been extensively updated since the first edition. Using examples, it covers topics in a detailed fashion, making it a useful guide for students as well as practitioners who are participating in program evaluations or constructing and implementing performance measurement systems. Authors James C. McDavid, Irene Huse, and Laura R. L. Hawthorn guide readers through conducting quantitative and qualitative program evaluations, needs assessments, cost-benefit and cost-effectiveness analyses, as well as constructing, implementing and using performance measurement systems. The importance of professional judgment is highlighted throughout the book as an intrinsic feature of evaluation practice.
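
Among the analyses the book introduces, cost-effectiveness and cost-benefit comparisons come down to simple ratios, which a short sketch can illustrate. The figures below are invented for illustration only and are not drawn from the book; the distinction shown is the standard one: cost-effectiveness expresses cost per unit of outcome, while cost-benefit expresses both sides in dollars.

```python
# Hypothetical illustration of two analyses mentioned above.
# Cost-effectiveness: dollars per unit of outcome (outcome is not monetized).
# Cost-benefit: benefits and costs both expressed in dollars.

program_cost = 250_000.0        # total program cost (invented)
outcome_units = 125             # e.g., participants placed in jobs (invented)
monetized_benefits = 400_000.0  # dollar value of program benefits (invented)

cost_effectiveness = program_cost / outcome_units       # $ per placement
benefit_cost_ratio = monetized_benefits / program_cost  # benefit per $1 of cost
net_benefit = monetized_benefits - program_cost

print(f"Cost-effectiveness: ${cost_effectiveness:,.0f} per placement")  # $2,000
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")                  # 1.60
print(f"Net benefit: ${net_benefit:,.0f}")                              # $150,000
```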

Advancing Evidence-Based Practice Through Program Evaluation

Author: Julie Q. Morrison
Publisher: Oxford University Press
ISBN: 019067170X
Category: Psychology
Languages: en
Pages: 216

Book Description
Given the current climate of results-driven accountability, school-based professionals have a significant contribution to make in improving the impact of programs and initiatives through the application of program evaluation methods and tools to inform decision making within a multi-tier system of supports framework. Yet there is currently a dearth of practical resources dedicated to developing school psychologists' competencies in program evaluation.

Advancing Evidence-Based Practice through Program Evaluation will meet the needs of school psychologists and other school-based professionals seeking to use program evaluation approaches to enhance data-based decision making and accountability at the program and systems levels. This practical guide provides the most cutting-edge evaluation frameworks, methods, and tools available, with particular emphasis on the rapidly developing areas of implementation research, evidence-based professional learning, and innovative approaches to communicating evaluation findings.

The book will support school professionals in daily practice by enhancing and extending their knowledge and skills in measurement, assessment, consultation for systems change, and the use of evidence-based interventions for academic and social/behavioral concerns, with a focus on evaluating the implementation and outcomes of school-based programs. It will also facilitate the professional development of those currently engaged in graduate preparation programs in education, educational leadership, school counseling, and school social work, as well as the university faculty who guide their professional preparation. Finally, school professionals may also use Advancing Evidence-Based Practice through Program Evaluation to develop their professional competencies in implementing new initiatives funded by grants with clear expectations for program evaluation.

Program Evaluation for Social Workers

Author: Richard M. Grinnell
Publisher: Oxford University Press, USA
ISBN: 0190227303
Category: Social Science
Languages: en
Pages: 553

Book Description
First published in 1994, this text is designed to be used by graduate-level social work students in courses on evaluation and program design. Over the course of 20 years and six editions, the goals of the book have remained the same: to prepare students to participate in evaluative activities within their organizations; to prepare students to become critical producers and consumers of professional evaluative literature; and to prepare students for more advanced evaluation courses and texts. Grinnell, Gabor, and Unrau aim to meet these objectives by presenting a unique approach that is realistic, practical, applied, and user-friendly. While the majority of textbooks focus on program-level evaluation, some recent books present case-level evaluation methods but rely on inferentially powerful, yet difficult-to-implement, experimental baseline designs. This text assumes that neither approach adequately reflects the realities of the field or the needs of students and beginning practitioners. Instead, Program Evaluation for Social Workers offers a blend of the two that demonstrates how they can complement one another. The integration of case-level and program-level approaches provides an accessible, adaptable, and realistic framework that students can more easily grasp and implement in the real world.

Handbook of Practical Program Evaluation

Author: Joseph S. Wholey
Publisher: John Wiley & Sons
ISBN: 047087340X
Category: Business & Economics
Languages: en
Pages: 754

Book Description
Praise for the third edition of the Handbook of Practical Program Evaluation:

"Mix three of the most highly regarded evaluators with a team of talented contributors, and you end up with an exceedingly practical and useful handbook that belongs on the reference shelf of every evaluator as well as program and policy officials." (Jonathan D. Breul, executive director, IBM Center for The Business of Government)

"Joe Wholey and his colleagues have done it again: a remarkably comprehensive, thoughtful, and interesting guide to the evaluation process and its context that should be useful to sponsors, users, and practitioners alike." (Eleanor Chelimsky, former U.S. Assistant Comptroller General for Program Evaluation and Methodology)

"Students and practitioners of public policy and administration are fortunate that the leading scholars on evaluation have updated their outstanding book. This third edition of the Handbook of Practical Program Evaluation will prove once again to be an invaluable resource in the classroom and on the front lines for a public service under increasing pressure to do more with less." (Paul L. Posner, director, public administration, George Mason University, and immediate former president, the American Society for Public Administration)

"The third edition of the Handbook of Practical Program Evaluation reflects the evolving nature of the field, while maintaining its value as a guide to the foundational skills needed for evaluation." (Leslie J. Cooksy, current president, the American Evaluation Association)

"This third edition is even more of a must-have book than its earlier incarnations: for academics to give their students a comprehensive overview of the field, for practitioners to use as a reference to the best minds on each topic, and for evaluation funders and consumers to learn what is possible and what they should expect. I've been in evaluation for 35 years, and I used the first and second editions all the time." (Michael Hendricks, Ph.D., independent evaluation consultant)