Evaluation in the Face of Uncertainty

Anticipating Surprise and Responding to the Inevitable

Jonathan A. Morell

Hardcover
August 11, 2010
ISBN 9781606238585
Price: $74.00
303 Pages
Size: 6" x 9"

Paperback
August 12, 2010
ISBN 9781606238578
Price: $49.00
303 Pages
Size: 6" x 9"

e-book (ePub)
June 10, 2011
Price: $49.00
303 Pages

Print + e-book (Paperback + ePub)
Price: $58.80 (list $98.00)
303 Pages

Unexpected events during an evaluation all too often send evaluators into crisis mode. This insightful book provides a systematic framework for diagnosing, anticipating, and accommodating evaluation surprises, and for reining in their costs. The result is evaluation that is methodologically stronger and more responsive to stakeholders. Jonathan A. Morell identifies the types of surprises that arise at different stages of a program's life cycle and that may affect different aspects of the evaluation, from stakeholder relationships to data quality, methodology, funding, deadlines, information use, and program outcomes. His analysis draws on 18 concise cases from well-known researchers in a variety of evaluation settings. Morell offers guidelines for responding effectively to surprises and for determining the risks and benefits of potential solutions.

“The book provides a useful presentation of the tactics that can be implemented to prevent surprises from weakening an evaluation plan....Aside from the more theoretical discussion provided by Morell, the book presents and makes extensive use of 18 cases drawn from the experiences of practicing evaluators around the world....These cases...would make excellent teaching examples for newer evaluators: not only will they feel as though they are not alone in encountering unanticipated situations in their work, but they will learn important lessons from the experience of others....Meets a very real need and undoubtedly advances our thinking about the issue of surprise in evaluation....Morell provides good, practical advice on dealing with unanticipated events throughout the text, based on his own experiences and those of the evaluators who provided the cases for the book. The key insights found in the book will surely enable evaluators to better plan their work, identify surprises before they crop up, and, in the end, enable them to produce higher-quality evaluations.”

—Canadian Journal of Program Evaluation


“This clearly written, well-organized book presents a lexicon of the surprises that occur in evaluation practice, both in the program—how it unfolds between planning and completion of the evaluation—and in the process of data collection and analysis. Morell outlines a structure for understanding what these surprises are, where they occur in the programming and evaluation process, why they are inevitable, and how they can (or sometimes cannot) be foreseen. The book provides practitioners with a systematic way of diagnosing and possibly even anticipating surprises, and explains how to accommodate them.”

—Deborah Wasserman, PhD, Principal Consultant, PERSolutions: Program Evaluation and Research, Columbus, Ohio


“If your world, including your evaluation work, is often complex, uncertain, and unpredictable, you have a fellow traveler and real-world guide in Morell. He applies more than three decades of experience to the challenges of distinguishing what can and cannot be foreseen, anticipating the unexpected, and dealing with the unforeseeable. This book draws on concrete cases, expert wisdom, practitioner experiences, scholarly knowledge, and organizational theory to explore evaluation approaches and methods that are agile, flexible, emergent, and responsive. Morell's voice is personable, his guidance realistic, and his insights important. You'll be surprised how much better you can get at anticipating and learning from surprises.”

—Michael Quinn Patton, PhD, Director, Utilization-Focused Evaluation, St. Paul, Minnesota


“Morell offers descriptions and prescriptions to help evaluators develop agile methodologies. This book is a valuable addition to available instructional resources for both seasoned practitioners and students just entering the evaluation profession. Writing in an accessible, cogent style, Morell effectively demonstrates how to navigate the challenges of complex systems. The brief cases he presents to illustrate his points will be especially useful for stimulating discussion in graduate classes as well as professional development settings.”

—Kathryn E. Newcomer, PhD, Director, Trachtenberg School of Public Policy and Public Administration, and Co-Director, Midge Smith Center for Evaluation Effectiveness, The George Washington University


“The use of real-life examples with all their warts adds considerably to the usefulness of the book, especially because the examples come from around the world and reflect a wide variety of evaluation contexts.”

—David L. Streiner, PhD, Department of Psychiatry, University of Toronto and Department of Psychiatry and Behavioural Neurosciences, McMaster University


“Morell offers an original, plain-spoken, perspicacious, wise discourse on a relatively neglected yet highly significant aspect of evaluation. This is among the first and best full-on analyses of the primary sources of evaluation surprise. The book puts a strong intellectual foundation under proposed remedies. There are gems in almost every chapter, such as the discussion of agile evaluation.”

—Lois-ellin Datta, PhD, President, Datta Analysis, Waikoloa, Hawaii


“This book fills a vastly neglected void in the evaluation literature. Morell provides a theoretical framework for anticipating and minimizing the unexpected by means of agile, responsive evaluation methodologies. He illustrates a variety of pragmatic strategies for dealing with the inevitable (and sometimes unforeseeable) things that can go wrong when planning and executing evaluations. Ironically, the lessons exemplified in the book have great potential for propelling the field forward in both anticipated and unanticipated ways. This is an essential, invaluable resource for any serious student, practitioner, or scholar of evaluation.”

—Chris L. S. Coryn, PhD, Director, Interdisciplinary PhD in Evaluation, Western Michigan University


“Insightful and provocative. Though Morell writes from the stance of an evaluator, his descriptions of 'things that go awry' apply to a wide swath of research methodologies. The idea that all research projects encounter unanticipated or unintended outcomes is aptly illustrated through a variety of case studies—for example, No Child Left Behind evaluation studies, health impacts of central heating, and outcomes of abolishing user fees in health clinics in Niger. The cases provide ample evidence of why things went awry and how unanticipated or unintended outcomes may be predicted and controlled. This book would be ideal for graduate-level courses on research design or program evaluation, either as a textbook or a supplement.”

—James E. Gruber, PhD, Department of Behavioral Sciences, University of Michigan-Dearborn

Table of Contents

1. From Firefighting to Systematic Action

Adding “Surprise” to the Mix

Historical Roots: Evaluation, Planning, and System Behavior

From Explaining Surprise to Dealing with It

Development Path of This Book

Guiding Principles

How to Read This Book

In Sum

2. Structure of the Unexpected

Where Does Surprise Come From?

Beyond Simple Distinctions

In Sum

3. Placing Surprise in the Evaluation Landscape

When Is the Probability of Surprise High?

When Is Surprise Disruptive to Evaluation?

In Sum

4. Minimizing Foreseeable Surprise

Theory: Using Explanatory Power and Simplified Relationships

Exploiting Past Experience: Capitalizing on What We Already Know

Limiting Time Frames to Minimize the Opportunity for Surprise

In Sum

5. Shifting from Advance Planning to Early Detection

Leading Indicators

System-Based Logic Modeling

In Sum

6. Agile Evaluation

Data

Agile Methodology

Retooling Program Theory

Agility and Stakeholder Needs

In Sum

7. How Much Is Too Much?: Appreciating Trade-Offs and Managing the Balance

A Framework for Appreciating Design Trade-Offs

Maximizing Choice, Minimizing Risk

Evaluation Design

In Sum

8. Applying the Examples to Categories of Cases: The Life Cycle View

“Unintended Consequences”: Unity across Programs and Their Evaluations

Interpreting Cases through a Life Cycle Perspective

In Sum

9. Applying the Examples to Categories of Cases: The Social/Organizational View

Navigating through the Cases

Placement of Cases on the Social/Organizational Map

Categorizations Derived from the Data

In Sum

10. Lessons from Individual Cases: Tactics for Anticipating Surprise

In Sum

11. Lessons from Individual Cases: Responding to Surprise

The Middle

Leading Indicators and Agile Evaluation

In Sum

12. Unanticipated Program Outcomes

Case Descriptions

Applying the Cases to Unintended Program Outcomes

Comparing the Cases

Predicting the Need for Agile Evaluation

In Sum

13. Concluding Thoughts

Cases

Case 1. Grasping at Straws and Discovering a Different Program Theory: An Exercise in Reengineering Analysis Logic in a Child Care Evaluation Setting, Dennis P. Affholter

Case 2. Shifting Sands in a Training Evaluation Context, James W. Altschuld and Phyllis M. Thomas

Case 3. Evaluating Programs Aimed at Promoting Child Well-Being: The Case of Local Social Welfare Agencies in Jerusalem, Anat Zeira

Case 4. Assessing the Impact of Providing Laptop Computers to Students, J. Dan Strahl, Deborah L. Lowther, and Steven M. Ross

Case 5. Quasi-Experimental Strategies When Randomization Fails: Propensity Score Matching and Sensitivity Analysis in Whole-School Reform, Gary L. Bowen, Roderick A. Rose, and Shenyang Guo

Case 6. Unexpected Changes in Program Delivery: The Perils of Overlooking Process Data When Evaluating HIV Prevention, Bryce D. Smith

Case 7. Evaluating Costs and Benefits of Consumer-Operated Services: Unexpected Resistance, Unanticipated Insights, and Déjà Vu All Over Again, Brian T. Yates

Case 8. Keep Up with the Program!: Adapting the Evaluation Focus to Align with a College Transition Program’s Changing Goals, Kristine L. Chadwick and Jennifer Conner Blatz

Case 9. Assumptions about School Staff’s Competencies and Likely Program Impacts, Laura Hassler Lang, Christine E. Johnson, and Shana Goldwyn

Case 10. Mixed Method Evaluation of a Support Project for Nonprofit Organizations, Riki Savaya and Mark Waysman

Case 11. Evaluating the Health Impacts of Central Heating, Jeremy Walker, Richard Mitchell, Stephen Platt, and Mark Petticrew

Case 12. Recruiting Target Audience: When All Else Fails, Use the Indirect Approach for Evaluating Substance Abuse Prevention, Molly Engle

Case 13. Unintended Consequences of Changing Funder Requirements Midproject on Outcome Evaluation Design and Results in HIV Outreach Services, Lena Lundgren, Therese Fitzgerald, and Deborah Chassler

Case 14. Generating and Using Evaluation Feedback for Providing Countywide Family Support Services, Deborah L. Wasserman

Case 15. Trauma and Posttraumatic Stress Disorder among Female Clients in Methadone Maintenance Treatment in Israel: From Simple Assessment to Complex Intervention, Miriam Schiff and Shabtay Levit

Case 16. From Unintended to Undesirable Effects of Health Intervention: The Case of User Fees Abolition in Niger, West Africa, Valéry Ridde and Aissa Diarra

Case 17. Unintended Consequences and Adapting Evaluation: Katrina Aid Today National Case Management Consortium, Amanda Janis and Kelly M. Stiefel

Case 18. Evaluation of the Integrated Services Pilot Program from Western Australia, Peter Hancock, Trudi Cooper, and Susanne Therese Bahn


About the Author

Jonathan A. Morell is Director of Evaluation at The Fulcrum Corporation and Editor of Evaluation and Program Planning. Formerly he was Senior Policy Analyst at the Vector Research Center, a division of Jacobs Engineering. He is active in the American Evaluation Association (AEA), where he has been instrumental in founding two topical interest groups: Systems, and Business and Industry. He is a recipient of the Marcus Ingle Distinguished Service Award and the Paul F. Lazarsfeld Evaluation Theory Award from the AEA. His professional life has integrated his role as an evaluation practitioner with his theoretical interests. As a practitioner, he evaluates organizational change, R&D, and safety programs. He is also deeply involved in organizational design. His theoretical interests include the nature and use of logic models, the role of Lean Six Sigma methodologies in evaluation, complex system behavior, and the nature of practical action. He maintains a blog on issues related to evaluation and evaluation surprises.

Contributors

Dennis P. Affholter, Independent Consultant, Paducah, Kentucky

James W. Altschuld, Professor Emeritus, School of Educational Policy and Leadership, The Ohio State University, Columbus, Ohio

Susanne Therese Bahn, Social Justice Research Centre, School of Psychology and Social Science, Edith Cowan University, Joondalup, Australia

Jennifer Conner Blatz, Knowledge Works Foundation, Cincinnati, Ohio

Gary L. Bowen, School of Social Work, The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina

Kristine L. Chadwick, Center for Research and Evaluation Services, Edvantia, Inc., Charleston, West Virginia

Deborah Chassler, Center for Addictions Research and Services, Boston University School of Social Work, Boston, Massachusetts

Trudi Cooper, School of Psychology and Social Science, Edith Cowan University, Joondalup, Australia

Aissa Diarra, Laboratoire d'Etudes et de Recherches sur les Dynamiques Sociales et le Développement Local (LASDEL), Niamey, Niger

Molly Engle, Department of Adult and Higher Education Leadership and Extension Service, Oregon State University, Corvallis, Oregon

Therese Fitzgerald, Massachusetts Medical Society, Waltham, Massachusetts

Shana Goldwyn, School of Education, Educational Leadership Program, University of Cincinnati, Cincinnati, Ohio

Shenyang Guo, School of Social Work, The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina

Peter Hancock, School of Psychology and Social Science, Edith Cowan University, Joondalup, Australia

Amanda Janis, Catholic Charities USA, Alexandria, Virginia

Christine E. Johnson, The Learning Systems Institute, Florida State University, Tallahassee, Florida

Laura Hassler Lang, The Learning Systems Institute, Florida State University, Tallahassee, Florida

Shabtay Levit, Jerusalem and Ashdod Methadone Maintenance Treatment Programs, Jerusalem, Israel

Deborah L. Lowther, College of Education, The University of Memphis, Memphis, Tennessee

Lena Lundgren, Center for Addictions Research and Services, Boston University School of Social Work, Boston, Massachusetts

Richard Mitchell, Section of Public Health and Health Policy, Faculty of Medicine, University of Glasgow, Glasgow, United Kingdom

Jonathan A. Morell, Vector Research Center, TechTeam Government Solutions, Ann Arbor, Michigan

Mark Petticrew, Public and Environmental Health Research Unit, London School of Hygiene and Tropical Medicine, London, United Kingdom

Stephen Platt, Centre for Population Health Sciences, School of Clinical Science and Community Health, The University of Edinburgh, Edinburgh, United Kingdom

Valéry Ridde, Département de Médecine Sociale et Préventive, Université de Montréal, Montréal, Quebec, Canada

Roderick A. Rose, School of Social Work, The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina

Steven M. Ross, Center for Research and Reform in Education, Johns Hopkins University, Baltimore, Maryland

Riki Savaya, Shapell School of Social Work, Tel Aviv University, Tel Aviv, Israel

Miriam Schiff, School of Social Work and Social Welfare, Hebrew University, Jerusalem, Israel

Bryce D. Smith, School of Social Work, The University of Georgia, Athens, Georgia

Kelly M. Stiefel, Carr's Human Services Solutions LLC, Tenafly, New Jersey

J. Dan Strahl, Center for Research in Educational Policy, The University of Memphis, Memphis, Tennessee

Phyllis M. Thomas, Evaluation Consultant, Louisville, Colorado

Jeremy Walker, Centre for Population Health Sciences, School of Clinical Science and Community Health, The University of Edinburgh, Edinburgh, United Kingdom

Deborah L. Wasserman, Center for Family Research at COSI, The Ohio State University, Columbus, Ohio

Mark Waysman, Independent Consultant, Rishon Lezion, Israel

Brian T. Yates, Department of Psychology, American University, Washington, DC

Anat Zeira, School of Social Work and Social Welfare, Hebrew University, Jerusalem, Israel

Audience

Applied researchers who do evaluations; instructors and graduate students in education, psychology, sociology, management, social work, nursing, and public policy.

Course Use

May serve as a supplemental text in graduate-level courses in evaluation, program planning, and management consulting.