However, not only the manifest content of the material is analyzed, but also its so-called latent content as well as formal aspects of the material (MAYRING, b). The strength of qualitative content analysis is that it is strictly controlled methodologically and that the material is analyzed step by step. Central to it is a category system, developed directly on the material in a theory-guided procedure. Categories are understood as more or less operational definitions of variables.
Above, we said that qualitative content analysis aims to preserve the advantages of quantitative content analysis while at the same time applying a more qualitative text interpretation (see Section 4). Fitting the material into a model of communication: it should be determined on which part of the communication inferences shall be made, whether on aspects of the communicator (his experiences, opinions, feelings), on the situation of text production, on the socio-cultural background, on the text itself, or on the effect of the message.
The material is to be analyzed step by step, following rules of procedure and dividing the material into content-analytical units. Categories at the center of analysis: the aspects of text interpretation, following the research questions, are put into categories, which are carefully founded and revised within the process of analysis (feedback loops). Subject-reference instead of technique: this implies that the procedures of content analysis cannot be fixed but have to be adapted to the subject and its context.
Verification of the specific instruments through pilot studies: due to the subject-reference, fully standardized instruments are dispensed with. That is why the procedures need to be tested in a pilot study; inter-subjective verifiability is a case in point here. The technical fuzziness of qualitatively oriented research needs to be balanced by theoretical stringency: the state of the field of the respective research subject, as well as of closely related subjects, has to be taken into account and integrated into the analysis.
Inclusion of quantitative steps of analysis: quantitative analyses are especially important when trying to generalize results. In fact, this notion of triangulation, arguing in favor of an integration of qualitative and quantitative methods, is not limited to content analysis but has been raised by many researchers.
Quality criteria of reliability and validity (see also Section 4): the procedure claims to be inter-subjectively comprehensible, to allow comparison of the results with other studies in the sense of triangulation, and to include checks for reliability.
As a matter of fact, it is this kind of systematics that distinguishes content analysis from more interpretive, hermeneutic processing of text material (MAYRING, p.). The seven components of content analysis listed above (see Section 4). Consequently, MAYRING has developed a sequential model of qualitative content analysis and puts forward three distinct analytical procedures which may be carried out either independently or in combination, depending on the particular research question (MAYRING, p.).
Summary: for this, the text is paraphrased, generalized or abstracted, and reduced. Explication: as a first step, a lexico-grammatical definition is attempted; then the material for explication is determined, followed by a narrow context analysis and a broad context analysis. Finally, an "explicatory paraphrase" is made of the particular portion of text, and the explication is examined with reference to the total context. Structuring: here the text can be structured according to content, form and scaling.
The first stage is the determination of the units of analysis, after which the dimensions of the structuring are established on some theoretical basis and the features of the system of categories are fixed.
Subsequently, definitions are formulated, and key examples, with rules for coding in separate categories, are agreed upon. In the course of a first appraisal of the material the data locations are marked, and in a second scrutiny these are processed and extracted. If necessary, the system of categories is re-examined and revised, which necessitates a reappraisal of the material. As a final stage the results are processed. Obviously, the central part of the process—structuring—is derived from classical content analysis, because here, too, units of coding and evaluation are set up and arranged in a schema of categories (TITSCHER et al.).
However, the basic difference between classical content analysis and structuring within qualitative content analysis is the development and use of the coding agenda. "Extraction," in turn, seems to be closely related to MAYRING's structuring, since it literally means extracting the relevant information from the text by means of a category system.
Thus, the material is reduced and a new basis of information, separate from the original text, comes into existence (ibid.). Therefore, they argue in favor of a theory-based category system which is more open and can be changed during extraction when relevant information turns up that does not fit into the category system: both the dimensions of existing categories can be modified and new categories can be designed.
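This open extraction step can be sketched in a few lines of Python. The categories and passages below are invented for illustration, and real extraction is of course an interpretive act performed by the researcher, not a mechanical lookup:

```python
# Minimal sketch of extraction with an open, theory-based category system.
# Categories and passages are invented; real extraction is an interpretive act.

extraction_base = {          # theory-based starting categories
    "motives": [],
    "attitudes": [],
}

def extract(passage: str, category: str) -> None:
    """File a relevant passage under a category; open a new category if needed."""
    extraction_base.setdefault(category, []).append(passage)

extract("Staff describe promotion chances as their main driver.", "motives")
extract("Managers view the reform with skepticism.", "attitudes")
# Relevant information that does not fit: the category system is extended.
extract("The merger was announced in a staff meeting.", "context")

print(sorted(extraction_base))   # ['attitudes', 'context', 'motives']
```

The `setdefault` call mirrors the openness of the category system: information that does not fit simply opens a new category instead of being discarded, so the information base grows separately from the original text.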
It is actually a package of techniques from which the analyst can choose and which he then adapts to his research question. Figure 1 shows the basic proceeding of qualitative content analysis from the initial theory to the final analysis and interpretation.
Figure 1: Basic proceeding of qualitative content analysis.

Among the procedures of qualitative content analysis, MAYRING (a) singles out the following two approaches as central to developing a category system and finding the appropriate text components: inductive category development and deductive category application. Quantitative content analysis does not provide satisfactory answers to the question of where the categories are derived from and how the system of categories is developed.
But within the framework of qualitative approaches it is essential to develop the aspects of interpretation—the categories—as closely as possible to the material, and to formulate them in terms of the material. The steps of inductive category development are displayed in Figure 2. The main idea of the procedure is to formulate a criterion of definition, derived from the theoretical background and the research question, which determines the aspects of the textual material taken into account.
Following this criterion, the material is worked through, and categories are tentatively deduced step by step. Within a feedback loop the categories are revised, eventually reduced to main categories and checked with respect to their reliability (MAYRING, a). Deductive category application works the other way round: it starts from previously formulated, theoretically derived aspects of analysis, which are brought into connection with the text. The qualitative step of analysis consists of a methodologically controlled assignment of the category to a passage of text (MAYRING, a).
Figure 3 shows the steps of deductive category application. According to MAYRING (a), the main idea here is to give explicit definitions, examples and coding rules for each deductive category, determining exactly under what circumstances a text passage can be coded with a category.
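As a rough illustration, such a coding agenda can be represented as a set of category records holding a definition, an anchor example, and a coding rule. In the Python sketch below, all category names, rules, and passages are invented, and the keyword sets are a heavily simplified stand-in for real coding rules, which are natural-language instructions applied by trained coders:

```python
# Illustrative sketch of a coding agenda for deductive category application.
# Categories, definitions, anchor examples and keyword rules are invented.

coding_agenda = {
    "self-confidence": {
        "definition": "Statements expressing trust in one's own abilities.",
        "anchor_example": "I am sure I can handle difficult situations on my own.",
        "rule_keywords": {"sure", "confident", "capable"},  # simplified coding rule
    },
    "uncertainty": {
        "definition": "Statements expressing doubt about one's own abilities.",
        "anchor_example": "I often doubt whether I can cope with my tasks.",
        "rule_keywords": {"doubt", "unsure", "afraid"},
    },
}

def code_passage(passage: str) -> list[str]:
    """Assign every category whose (simplified) coding rule matches the passage."""
    words = set(passage.lower().split())
    return [cat for cat, spec in coding_agenda.items()
            if spec["rule_keywords"] & words]

print(code_passage("I am confident and sure about my decision."))
# assigns the category "self-confidence"
```

The point of the agenda is that each assignment of a category to a text passage is governed by an explicit rule rather than by the coder's intuition alone, which is what makes the coding inter-subjectively verifiable.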
Finally, those category definitions are put together within a coding agenda. Any kind of social research asserts its claim to fulfill certain quality criteria for measuring and collecting data: it is widely accepted that measurement, or the methods of measurement, should be as objective, reliable and valid as possible. In fact, the research strategy regularly pursued in content analysis is governed by these traditional criteria of validity and reliability, where the latter is a precondition for the former but not vice versa (TITSCHER et al.).
Since arguments concerning the content are judged to be more important than methodical issues in qualitative analysis, validity takes priority over reliability MAYRING, , p.
Two specific problems of content analysis that are often discussed in this context are problems of inference and problems of reliability TITSCHER et al. Problems of inference relate to the possibility of drawing conclusions, on the one hand, about the whole text on the basis of the text sample and, on the other hand, about the underlying theoretical constructs such as motives, attitudes, norms, etc. As a result, inference in content analysis confines itself only to specific features of external and internal validity.
So-called inter-coder reliability shows to what extent different coders agree in the coding of the same text; intra-coder reliability shows how stable the coding of a single coder is.
Because of the problems of reliability, the coding of texts is usually assigned to multiple coders so that the researcher can see whether the constructs being investigated are shared and whether multiple coders can reliably apply the same codes MAYRING, , p.
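For two coders, agreement can be quantified, for instance, as simple percentage agreement or as Cohen's kappa, a standard inter-coder reliability index that corrects for agreement expected by chance. The following self-contained sketch uses invented example codings:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of coding units on which two coders assigned the same category."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented example: two coders assign categories to ten text units.
a = ["pos", "pos", "neg", "neu", "pos", "neg", "neg", "pos", "neu", "pos"]
b = ["pos", "neg", "neg", "neu", "pos", "neg", "pos", "pos", "neu", "pos"]

print(percent_agreement(a, b))  # 0.8
print(cohens_kappa(a, b))       # ≈ 0.677
```

A kappa noticeably lower than the raw agreement, as here, indicates how much of the agreement could have arisen by chance alone, which is why kappa rather than raw agreement is usually reported.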
Semantic validity relates to the meaning reconstruction of the material, and is expressed in the appropriateness of the category definitions, the key examples and the rules for coders.
Sampling validity refers to the usual criteria for precise sampling, and correlative validity refers to the correlation with some external criterion. Predictive validity can only be used as a quality criterion if predictions can reasonably be made from the material (in this case, verification is usually easy and significant).
Construct validity relates, for instance, to previous success with similar constructs, established models and theories, and representative interpretations. Stability refers to whether the same results are obtained in a renewed application of the analytical tool to the same text and reproducibility is the extent to which the analysis achieves the same results under different circumstances, for instance with different coders.
It can be measured through inter-coder reliability, for which a range of measures and indices have been developed. The main idea behind this concept is to achieve, discursively, mutual consent about the results of the analysis between the researchers and the researched. Thus, it draws on the degree to which the original data are representative of a larger population (ibid.).
Features of the units of evaluation: it will be examined whether the problem locations, where there is some disagreement about coding, differ systematically from the rest of the material. Properties of individual categories: the question is whether instances of disagreement are particularly common with particular categories. Properties of the category system: it will be checked whether the distinctions between categories are too fine. Properties of the coders: if the lack of reliability cannot be attributed to (a), (b) or (c), then the problem usually lies with the coders and may perhaps be solved by more careful selection, more thorough training, shorter operation periods, etc.
The further development of new quality criteria calls for an analysis of where and what kind of other errors can be made or occur in conducting content analysis ibid. This section explores and discusses the possibilities of applying qualitative content analysis as a text interpretation method in case study research and thus tries to find an answer to the research question initially posed see Section 2.
The rising popularity of mixed methods approaches and the use of triangulation have already been mentioned briefly in the introduction of this paper. Having its origin in navigation, military strategy and geodetic surveying, the term triangulation is used in social research in a less literal sense to describe the use of multiple methods and measures of an empirical phenomenon. Data accumulated by different methods but bearing on the same issue are part of what is called the "multi-method approach." In fact, the "effectiveness of triangulation rests on the premise that the weaknesses in each single method will be compensated by the counter-balancing strengths of another" (JICK, p.).
Therefore, triangulation "can potentially generate what anthropologists call 'holistic work' or 'thick description'" (JICK, p.). In the case of using qualitative content analysis in case study research, triangulation actually takes place on two different levels. On the first and more obvious level, data are triangulated by integrating different material and evidence (see Section 5). On the second level, triangulation takes place by applying a method of analysis (qualitative content analysis) that was not particularly developed for this purpose to a different research design (case study research).
As was already shown in Section 3. Besides, we also saw that case study research has a major function in generating hypotheses and building theory. In fact, a theory or theoretical framework first emerges through the inductive approach of studying an empirical case or object, not through a deductive process. As the author tried to demonstrate in Section 4. Hence, qualitative content analysis might be an appropriate analysis and interpretation method for case study research.
As a matter of fact, its quantitative counterpart—classical content analysis—is repeatedly mentioned as a method of analyzing data in the context of conducting case study research. Even though REMENYI et al. concede that this is "not a particularly satisfactory approach," they claim that "it is not infrequently used."
Besides, it preserves the advantages of quantitative content analysis while at the same time applying a more qualitative text interpretation. Therefore, it can be argued that qualitative content analysis could prove a useful tool for analyzing data material in case study research. In fact, the contribution of using qualitative content analysis in case study research will be demonstrated on the basis of the following points. One of the strengths of qualitative content analysis is the way it tries to synthesize openness—as claimed by the qualitative research paradigm—and theory-guided investigation—usually demanded by the hypothetical-deductive paradigm.
In fact, despite this openness, qualitative content analysis is strictly controlled methodologically, and the material is analyzed in a step-by-step process (see Section 4). It is this combination that fosters its strong ability to deal with complexity: qualitative content analysis takes a holistic and comprehensive approach towards analyzing data material and thus manages to grasp and cover almost completely the complexity of the social situations examined and of the social data derived from them.
At the same time, qualitative content analysis uses a rule-based and methodologically controlled approach in order to deal with the complexity and gradually reduce it. The procedures of summary, explication and structuring step-by-step reduce complexity and filter out the main points of analysis in an iterative process.
Therefore, qualitative content analysis fits the credo of case study research perfectly. We just mentioned theory-guided analysis as one of the special strengths of qualitative content analysis (see Section 5). The important point here is the same as with case study research: theory-guided analysis also offers the chance to compare and complement the primary data collected within the research project with secondary data.
In fact, experts in social research recommend conducting interpretations of results on two levels.

Techniques and Procedures for Developing Grounded Theory.
Burden, Johann & Roodt, Gert: Grounded theory and its application in a recent study on organizational redesign. Some reflections and guidelines. Journal of Human Resource Management, 5(3), 11.
Qualitative Social Research, 1(1), Art.
The grounded theory method and case study data in IS research: The Creation of Theory.

As a theory of interpretation, the hermeneutic tradition stretches all the way back to ancient Greek philosophy.
In the Middle Ages and the Renaissance, hermeneutics emerged as a method to identify the meaning and intent of Biblical scripture. Today, hermeneutics is also used as a strategy to address a broad range of research questions, such as interpreting human practices, events, and situations. Researchers bring their personal convictions to the analysis, but they need to be open to revision. In the process of collecting data, a tentative understanding is developed, which is then tested against reality.
Further understanding is gained if discrepancies between the current interpretation and the new data are recognized.
Thus, the process of understanding is characterized by constant revision. This is referred to as the hermeneutic cycle.

Research Methods for Political Science: Quantitative and Qualitative Methods.
Wallach, Harald: Psychologie — Wissenschaftstheorie, philosophische Grundlagen und Geschichte.

The term "lifeworld" originally comes from phenomenological sociology, where it refers to the familiar world of everyday life. In analyzing lifeworlds, one attempts to draw out the individual structures within them.
A lifeworld can be understood as a physical environment, even though its various inhabitants do not necessarily attribute the same meaning to the same space. Cats and people, for example, may inhabit the same physical environment but live in different lifeworlds, since cupboards, window sills, and spaces underneath chairs have different significance for each of them.
The aim is the reconstruction of the various subjective perspectives. In order to achieve this, a number of data types are employed, such as document analysis, interviews, standardized surveys or participant observation.
Not surprisingly, a hermeneutic approach to analysis is chosen. When the need arises, this is combined with codification procedures; thus, CAQDAS is a possible choice to support the process of data analysis.

Hitzler, Roland & Eberle, Thomas S. Qualitative Social Research, 6(3), Art.
Oberkircher, Lisa & Hornidge, Anna-Katharina. Rural Sociology, 76(3), pp.

Narrative research is about stories of life experiences. Study participants are asked in long interviews to give a detailed account of their experiences and their story, rather than to answer a predetermined list of questions.
Other forms of data include life histories, journals, diaries, memoirs, autobiographies and biographies. After transcription, narratives may be coded according to categories deemed theoretically important by the researcher (Riesman). Another approach is a formal sequential analysis with the purpose of identifying recurrent and regular forms, which are then related to specific modes of biographical experience.
Narrative analysis can, however, also be conducted using quantitative methods (QNA). The aim of QNA is to turn words into numbers: word frequencies of the coding categories are computed (cf. Sage, 83).

A Review of Narrative Methodology.
Narrative Configuration in Qualitative Analysis. Qualitative Studies in Education.
Contexts and accounts on deviant actions.

Objective Hermeneutics was developed by Oevermann, a German scholar and former student of Habermas. It is a method of interpreting textual data that provides an explicit, rule-governed procedure. The aim is to go beyond subjective meanings and detect the objective connotation, the so-called latent sense structure behind the data.
As in ethnomethodology, personal motives and intentions are not important. Beginning with a first sequence, story lines are developed; these can be viewed as preliminary hypotheses that can be falsified in the process of analysis when inspecting more of the empirical data. The method is very time-consuming and thus only feasible with small amounts of text. Coding procedures are explicitly banned. Proponents of this tradition argue that the development of a coding system cannot represent social reality appropriately.
The process of coding would even deplete the theoretical appraisal of empirical phenomena. Thus, objective hermeneutics is clearly not a methodological approach that can or should be supported by ATLAS.ti.

Steinke (eds.), A Companion to Qualitative Research.
Hermeneutics and Objective Hermeneutics.

Phenomenology is a research methodology which has its roots in philosophy, focusing on the lived experience of individuals.
Phenomenological researchers are interested in the nature or meaning of something; their questions are about essence, not about appearance.
A constant question is: how does this affect me as a researcher? Data are collected through a variety of means. During the process of analysis, the researcher reflects upon his or her own preconceptions about the data and tries to grasp the experiential world of the research participant.
An introduction to Interpretative Phenomenological Analysis.
Writing in the dark: Phenomenological studies in interpretive inquiry.
Cheng, Fung Kei.
Rosedale, Mary; Lisanby, Sarah H.

Phenomenography is a fairly new qualitative research method, developed in the mid to late s. It has primarily been a tool for educational research. Its roots are in Sweden, at the University of Gothenburg. Today there are also strong communities in Britain and Australia.
The focus is on the experience of a phenomenon rather than on the phenomenon per se. The aim is to investigate the differing ways in which people experience, perceive, apprehend, understand, and conceptualize various phenomena.

Variation and commonality in phenomenographic research methods.
Phenomenography — describing conceptions of the world around us. Instructional Science, 10.
Phenomenography — A research approach investigating different understandings of reality. Journal of Thought, 21(2).
On the philosophical foundation of phenomenography. BMC Medical Education.
Journal of Documentation, 63(2).

About method and methodology: according to the academic literature, it should be your research question that guides this decision.
Conclusions are reached through discursive validation. An analysis of embodied lived experience is carried out before empirical data are collected, via self-inspection and reflection on one's own experience. This is considered necessary as all empirical data are regarded as reductions and objectifications.
Document analysis is a form of qualitative research in which documents are interpreted by the researcher to give voice and meaning to an assessment topic (Bowen). Analyzing documents involves coding content into themes, similar to how focus group or interview transcripts are analyzed.
This article examines the function of documents as a data source in qualitative research and discusses document analysis procedure in the context of actual research experiences. Targeted at research novices, the article takes a nuts-and-bolts approach to document analysis: it describes the nature and forms of documents and outlines the advantages and limitations of document analysis.
Document Analysis as a Qualitative Research Method. Glenn A. Bowen, Western Carolina University.