Can data analysis using software provide rigor?

António Pedro Costa, University of Aveiro (Portugal)

Numerous technological solutions have emerged to support researchers in almost all phases of research projects. In general, these tools bring agility and completeness, making it possible to work with extensive volumes of data in an organized and coherent way. Qualitative research has benefited from enormous progress in methods and techniques that make intensive use of technology. The current demands of the investigative context compel more and more researchers to equip themselves with digital tools that bring speed and efficiency to their research processes. Regardless of the nature of the research, researchers who use digital tools seek to ensure that the data collected are analyzed in a careful, rigorous, and systematic way, managing the time spent more effectively and increasing the reliability of the results obtained (Baugh, Hallcom and Harris, 2010).

Different application fields explore tools to support qualitative data analysis, the so-called Computer-Assisted Qualitative Data Analysis Software (CAQDAS) or Qualitative Data Analysis Software (QDAS), such as the webQDA software (2019) – webqda.net. The mere use of a CAQDAS does not guarantee the rigor that a researcher should advocate. Regardless of the software used, the researcher must define the research questions and objectives and develop the method, as well as the techniques for collecting and analyzing data. Only afterward should they look for the CAQDAS that best suits their research project. Visualizing the structure, reading about the available features, and experimenting with a CAQDAS allow the less experienced researcher to understand the different phases of qualitative data analysis. The concern with choosing "the best tool" is a decision that will be reflected in the result of the data analysis.

In this context, webQDA allows researchers to explore, directly or indirectly, validation techniques such as:

  • Delphi Technique: based on the principle that the predictions of a structured group of experts are more accurate than those of unstructured groups or individuals. Each member (who must be an expert) is thus isolated from the influence of the others;
  • Krippendorff’s alpha: a statistical measure of the agreement achieved when coding a set of units of analysis in terms of the values of a variable or attribute;
  • Cohen’s kappa: a statistic used to measure inter-rater reliability for qualitative items. It is generally considered more robust than a simple percent-agreement calculation, because κ accounts for the possibility of agreement occurring by chance.
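As an illustration of how the last two coefficients quantify inter-coder agreement, here is a minimal Python sketch for the simplest case: two coders, nominal codes, no missing data. The function names and the sample codings are hypothetical, and production studies would normally rely on a dedicated tool (such as a CAQDAS) rather than hand-rolled formulas.

```python
from collections import Counter
from itertools import product

def percent_agreement(a, b):
    """Share of units that both coders labelled identically."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_o = percent_agreement(a, b)
    # Expected chance agreement, from each coder's marginal code frequencies.
    fa, fb = Counter(a), Counter(b)
    p_e = sum(fa[c] * fb.get(c, 0) for c in fa) / (n * n)
    return (p_o - p_e) / (1 - p_e)

def krippendorff_alpha_nominal(a, b):
    """Krippendorff's alpha for two coders, nominal data, no missing values."""
    n = 2 * len(a)  # number of pairable values (2 coders x N units)
    # Coincidence matrix: each unit contributes both ordered pairs of its codes.
    coincidence = Counter()
    for x, y in zip(a, b):
        coincidence[(x, y)] += 1
        coincidence[(y, x)] += 1
    marginals = Counter()
    for (c, _), v in coincidence.items():
        marginals[c] += v
    # Observed vs expected disagreement (nominal metric: 1 if codes differ).
    d_o = sum(v for (c, k), v in coincidence.items() if c != k) / n
    d_e = sum(marginals[c] * marginals[k]
              for c, k in product(marginals, repeat=2) if c != k) / (n * (n - 1))
    return 1 - d_o / d_e

# Two coders applying the codes "pos"/"neg" to the same six text segments.
coder_a = ["pos", "pos", "neg", "neg", "pos", "neg"]
coder_b = ["pos", "neg", "neg", "neg", "pos", "pos"]
print(percent_agreement(coder_a, coder_b))        # ~0.667
print(cohens_kappa(coder_a, coder_b))             # ~0.333
print(krippendorff_alpha_nominal(coder_a, coder_b))  # ~0.389
```

Note how the chance-corrected coefficients fall well below the raw 66.7% agreement: with only two equally frequent codes, half of the agreement would be expected by chance alone, which is exactly the point made about κ above.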

Regarding agreement levels, which are relevant to all three techniques described, Creswell and Miller (2000) state that when starting a coding task one should not initially expect more than 70% agreement. Ideally, both intra- and inter-coder agreement should approach 90%, but this is difficult to achieve. Lessard-Hébert and colleagues (1990) note that the search for synchronous validation can be very stimulating for the researcher, as it forces reflection on the fact that slightly different results can be simultaneously true. Although this inter-coding process may be considered reductive, leaving part of the richness of the information in the shadows, this type of analysis is irreplaceable at the validation level (Bardin, 2004). As Vala argued back in the 1980s, any content is open to different interpretations depending on the analyst: two coders analyzing the same material will almost certainly arrive at different results (Vala, 1989). This is part of the very nature of qualitative research.

In this framework, when we involved experts to validate the category model, our experience did not corroborate the claim by Gibbs, Friese, and Mangabeira (2002) that CAQDAS are less useful for addressing issues of validity and reliability in the thematic ideas that emerge during data analysis.

Another dimension of the rigor of work conducted online is the researcher's integrity. Researchers collect, validate, and analyze data online; they transfer and share information and involve different stakeholders at different points in their study. This flexibility and ease of action often lead them to depart from the defined methodological steps, which they can always adjust in the face of constraints and limitations that emerge throughout the study. This implies an ethical pact with themselves and with the others involved.

