Journal article
From text to codings: intercoder reliability assessment in qualitative content analysis.
English
BACKGROUND
High intercoder reliability (ICR) is required in qualitative content analysis to assure quality when more than one coder is involved in data analysis. However, the literature lacks standardized procedures for ICR assessment in qualitative content analysis.
OBJECTIVE
To illustrate how ICR assessment can be used to improve codings in qualitative content analysis.
METHODS
Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain.
RESULTS
First, a coding scheme was developed using a combined inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of 0.67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme. Low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results.
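The kappa statistic reported above is Cohen's kappa for two coders assigning nominal codes to the same text segments. A minimal sketch of its computation follows; the code labels and the two coding sequences are hypothetical illustrations, not data from the study:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal codings of the same segments.

    kappa = (observed agreement - expected agreement) / (1 - expected agreement),
    where expected agreement is chance agreement given each coder's code frequencies.
    """
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codings of ten interview segments by two independent coders
a = ["pain", "coping", "pain", "work", "coping", "pain", "work", "pain", "coping", "work"]
b = ["pain", "coping", "work", "work", "coping", "pain", "pain", "pain", "coping", "work"]
print(round(cohens_kappa(a, b), 2))  # → 0.7
```

Because kappa corrects for chance agreement, it is lower than the raw agreement rate (here 8 of 10 segments, i.e. 0.8).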
DISCUSSION
The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good practice in coding and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
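The per-code inspection of agreement rates described above can be sketched as follows; this is one simple way to operationalize it (agreement tallied per code as assigned by the first coder), using hypothetical codes rather than the study's coding scheme:

```python
from collections import defaultdict

def per_code_agreement(coder_a, coder_b):
    """Proportion of agreeing segments for each code assigned by coder A.

    Low rates flag codes that may be defined too broadly and need clarification.
    """
    counts = defaultdict(lambda: [0, 0])  # code -> [agreements, total]
    for a, b in zip(coder_a, coder_b):
        counts[a][1] += 1
        if a == b:
            counts[a][0] += 1
    return {code: agree / total for code, (agree, total) in counts.items()}

# Same hypothetical codings as the kappa illustration
a = ["pain", "coping", "pain", "work", "coping", "pain", "work", "pain", "coping", "work"]
b = ["pain", "coping", "work", "work", "coping", "pain", "pain", "pain", "coping", "work"]
rates = per_code_agreement(a, b)
```

Here "coping" would show perfect agreement while "pain" and "work" fall below it, pointing the researchers toward the code definitions that need revision.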
Persistent URL: https://sonar.ch/global/documents/192255