Qualitative Research and Evaluation Methods, by Michael Quinn Patton (Utilization-Focused Evaluation). Related: Michael Quinn Patton and Michael Cochran, A Guide to Using Qualitative Research Methodology: How to Develop Qualitative Research Designs. Library of Congress Cataloging-in-Publication Data: Patton, Michael Quinn. Qualitative research and evaluation methods / by Michael Quinn Patton. 3rd ed.

Qualitative Research and Evaluation Methods, Patton 2002 (PDF)


Drawing on more than 40 years of experience conducting applied social science research and program evaluation, author Michael Quinn Patton has crafted this book. This edition has been completely revised, with hundreds of new examples. Cited as: Patton, M. Q. Qualitative Research and Evaluation Methods. Sage.

As a result, conducting QE becomes a relevant process of social intervention. For this to occur, however, it is essential to transform the knowledge acquired into practical action: that is, to turn evaluation results into tools for designing and implementing interventions that promote change.

Value of this approach

The debate over whether evaluation is a different activity from research has been undertaken by several authors, e.g., Fournier and Shaw, who tried specifically to differentiate between the two. Shaw mentioned that evaluation seeks to address practical problems and short-term issues and calls for action, in contrast with research, which addresses theoretical topics, strives for description, and looks at long-term issues.

Hammersley tried to summarize the difference by stating that while the former refers to practical research, the latter deals with scientific research. In any case, it is not possible to conduct a meaningful evaluation without using research methods.

Therefore, perhaps the most important distinction lies in purpose and products rather than in methodological differences. In contrast with qualitative research, QE deals with practical issues, has the potential to promote change in the course of programs, and leads to judgments of merit and value linked to a specific context and population. Yet QE faces the same criticisms regarding trustworthiness, reliability, and functionality that affect any qualitative investigation.

In this context, subjectivity remains an important concern, especially among those using conventional quantitative evaluation methods. So far, there are signs suggesting that QE will become more widely used in the coming years as an alternative or complementary approach to evaluating health programs. This implies recognizing that the approach can use findings effectively to illustrate and reflect upon difficulties in the implementation of programs, with the ultimate goal of improving the processes involved from the perspective of the stakeholders.

The participation of social actors in the assessment of programs, and the complexity of the evaluated phenomena, are key elements that limit the ability of quantitative methods alone to capture a reality filled with inter-subjectivity, experiences, meanings, and interpretations, as several researchers have sought to illustrate in their studies.

As to the range of programs that can be assessed using QE, the list of examples is broad. Some authors, for instance, have qualitatively assessed specific health programs, such as those to detect, treat, and control cancer (Collie et al.).

In general, the results derived from these studies have provided useful information not only for design and planning purposes but also for implementing actions to improve the effectiveness of the programs. Even though they used a critical approach, they all proposed concrete strategies for the stakeholders based on perceptions, experiences, needs, and expectations, facilitating the decision-making process for planning and executing specific actions. They have been able to gain insight into what is really happening in the programs: their strengths and weaknesses, their contradictions and conflicts, and the gap between the intended and the experienced implementation.

These attempts are noteworthy, as they challenge and encourage other researchers and evaluators to value this approach as a means of helping policy makers improve existing health programs. It is also important to keep in mind that diversity rather than uniformity is the norm when designing and carrying out QEs.

Aims, methods, techniques, types, and potential uses of the results, as well as the intended role of the evaluator, can vary considerably, making it undesirable to judge one approach as better than another. Presenting the frame of reference, and assessing the proposed methods and techniques in the context of the purpose of the evaluation, is what actually determines the relevance of the specific approach used. Moreover, evaluative approaches can indeed be complementary. The fact that we generally evaluate complex and dynamic systems, often involving problematic individuals with intricate interactions inside and outside the system, makes it advisable to combine approaches to better unveil such complexity.

Final considerations

We would like to encourage academic evaluators in the field of public health to become more directly involved with the reality they investigate, so that their efforts are more conducive to its transformation; the very purpose of evaluating is to integrate knowledge and action to promote health improvements.

We believe that researchers can learn to move comfortably from investigating phenomena in order to understand them and generate knowledge, to transforming society, which in this context means taking an active role in promoting changes that result in better health for the population. This implies that beneficiaries become actively and proactively involved in the evaluated phenomena, with the aim of improving the very health programs or services they receive.

Qualitative and participative models can complement each other. Although participative evaluation combines discovery and activism to transform society, this practice depends on the way in which people live and conceive of the world, which can be unveiled through qualitative and interpretative methods. This reflection points to the importance of treating quantitative, qualitative, and participative approaches as complementary rather than conflicting, in order to generate more complete, meaningful, and transformative evaluations.

The fact that there is a plurality of methods, techniques, results, effects, and topics of evaluation, along with the various functions that evaluation can serve and the roles that the evaluator can play, makes it clear that there is no single model to be used, nor is it possible to typify approaches as better or worse.

All this points to the importance of contextualizing the topic with the purpose of the evaluation, in order to select the combination of methods that best achieves the evaluation's goals. Such combinations are capable of assessing intricate and dynamic systems involving individuals with complex interactions inside and outside the system in which they operate, so as to effectively integrate knowledge and action (Potvin et al.).

Conflict of interest and funding

The authors did not receive any funding or benefits from industry or elsewhere to conduct this study.

References

Bosi M. L, Mercado F. In: Bosi M. J, editors. Enfoques emergentes [Emerging approaches].
Qualitative knowing in action research. In: Brenner M, Marsh P, editors. The social context of methods. London: Croom Helm.
Action research: Explaining the diversity. Human Relations.
Participatory rural appraisal: Challenges, potentials and paradigm. World Development.
Qualitative evaluation of care plans for Canadian breast and head-and-neck cancer survivors. Current Oncology.
Towards a critical approach to evaluation.
Debates and dilemmas in promoting health. London: Palgrave.
Foundations of empowerment evaluation. Thousand Oaks, CA: Sage.
Establishing evaluative conclusions: A distinction between general and working logic. New Directions for Evaluation.
Participatory evaluation. Studies in Educational Evaluation.
The creation of performance evaluation indicators through a focus group. Revista Latino-Americana de Enfermagem.
Understanding social programs through evaluation. In: Denzin N, Lincoln Y, editors. Handbook of qualitative research.
Guba E. G, Lincoln Y. Fourth generation evaluation. Newbury Park, CA: Sage.
Educational research, policy making and practice. London: Paul Chapman.
Can and should educational research be educative? Oxford Review of Education.
Evaluating with validity. Beverly Hills, CA: Sage.
Madrid: Morata.
Greene J. A, Choudhry N. K, Kilabuk E, Shrank W. H, et al. Online social networking by patients with diabetes: a qualitative evaluation of communication with Facebook. Journal of General Internal Medicine.
Cezar-Vaz M. R, Silveira R. Right of the citizen and evaluation of health services: Theoretical-practical approaches.
Research, evaluation and policy analysis: Heuristics and disciplined inquiry.

Module 17 Systems Theory and Complexity Theory.
Module 19 Patterns and themes across inquiry frameworks: Chapter summary and conclusions.
Practical purposes, concrete questions, and actionable answers: Illuminating and enhancing quality.
Program evaluation applications: Focus on outcomes.

Module 22 Specialized qualitative evaluation applications.
Module 23 Evaluating program models and theories of change, and evaluation models especially aligned with qualitative methods.

Module 24 Interactive and participatory qualitative applications.
Module 25 Democratic evaluation, indigenous research and evaluation, capacity building, and cultural competence.
Module 26 Special methodological applications.
Module 27 A vision of the utility of qualitative methods.
Module 28 Design thinking: Questions derive from purpose, design answers questions.
Module 29 Data Collection Decisions.
Module 30 Purposeful sampling and case selection: Overview of strategies and options.

Module 31 Single-significant-case sampling as a design strategy.
Module 32 Comparison-focused sampling options.
Module 33 Group characteristics sampling strategies and options.

Module 34 Concept and theoretical sampling strategies and options.
Instrumental-use multiple-case sampling.
Module 36 Sequential and emergence-driven sampling strategies and options.

Module 37 Analytically focused sampling.
Module 38 Mixed, stratified, and nested purposeful sampling strategies.
Module 39 Information-rich cases.
Module 40 Sample size for qualitative designs.
Module 41 Mixed methods designs.
Module 42 Qualitative design chapter summary and conclusion: Methods choices and decisions.
Module 43 The Power of direct observation.
Variations in observational methods.
Variations in duration of observations and site visits: From rapid reconnaissance to longitudinal studies over years.

Variations in observational focus and summary of dimensions along which fieldwork varies.
What to observe: Sensitizing concepts.
Integrating what to observe with how to observe.

Unobtrusive observations and indicators, and documents and archival fieldwork.
Observing oneself: Reflexivity and Creativity, and Review of Fieldwork Dimensions.
Doing Fieldwork: The Data Gathering Process.
Stages of fieldwork: Entry into the field.

Routinization of fieldwork: The dynamics of the second stage.
Bringing fieldwork to a close.
The observer and what is observed: Unity, separation, and reactivity.
Chapter summary and conclusion: Guidelines for fieldwork.
Module 57 The Interview Society: Diversity of applications.


Module 58 Distinguishing interview approaches and types of interviews.
Module 59 Question options and skilled question formulation.
Module 60 Rapport, neutrality, and the interview relationship.

Module 61 Interviewing groups and cross-cultural interviewing.
Creative modes of qualitative inquiry.
Ethical issues and challenges in qualitative interviewing.
Personal reflections on interviewing, and chapter summary and conclusion.
Setting the Context for Qualitative Analysis: Challenge, Purpose, and Focus.
Thick description and case studies: The bedrock of qualitative analysis.

Qualitative Analysis Approaches: Identifying Patterns and Themes.
The intellectual and operational work of analysis.
Logical and matrix analyses, and synthesizing qualitative studies.
Interpreting findings, determining substantive significance, phenomenological essence, and hermeneutic interpretation.
Causal explanation through qualitative analysis.
New analysis directions: Contribution analysis, participatory analysis, and qualitative counterfactuals.

Writing up and reporting findings, including using visuals.
Special analysis and reporting issues: Mixed methods, focused communications, and principles-focused report exemplar.
Module 75 Chapter summary and conclusion, plus case study exhibits.
Analytical processes for enhancing credibility: Four triangulation processes for enhancing credibility.
Alternative and competing criteria for judging the quality of qualitative inquiries: Part 1, universal criteria, and traditional scientific research versus constructivist criteria.

Alternative and competing criteria, Part 2.
Module 80 Credibility of the inquirer.
Module 82 Enhancing the credibility and utility of qualitative inquiry by addressing philosophy of science issues.

Access to certain full-text SAGE journal articles that have been carefully selected for each chapter. Each article supports and expands on the concepts presented in the chapter. This feature also provides questions to focus and guide your interpretation.

Kari O'Grady, Ph.D., Loyola University Maryland. Kathleen A. Bolland, The University of Alabama.

Susan S. Manning, University of Denver. Michael P. Dr John Donohue, Psychology, University of Ottawa.

From a review in Evaluation Practice (Wilson, University of Sheffield, UK): treatment of a number of topics not covered in the first edition, as well as interpretation, and enhancing the quality and credibility of qualitative analysis.

