Introduction
One of the most widely used and controversial qualitative research methods among researchers is Grounded Theory (GT). The defining element that differentiates GT from other types of qualitative methods of analysis is its focus on the generation of theories or theoretical models that explain, confirm and/or develop the social phenomena under study (Rodríguez et al., 1999). These theories are the result of systematically collecting and analysing data taken from real life, and subsequently interpreting the content of the data (Strauss & Corbin, 2002). In this way, GT allows us to truly understand “the meanings and actions of research participants, offer abstract interpretations of empirical relationships and generate conditional statements about the implications of their analysis” (Charmaz, 2005, p. 508), aspects that other methods of analysis do not allow us to do.
However, this interpretation of reality is not the result of a process of intuitive discovery, but rather GT providing a systematic process of theory building and decision making throughout the study (Charmaz, 2014). Thus, GT makes qualitative analysis procedures explicit and helps researchers to develop useful conceptualizations of the data (Cuesta, 2006). In this way, GT itself is both a theory-building and a theory-elaborating technique (Murphy et al., 2017).
Far from being a simple induction exercise, GT is based on four basic principles that make it possible to generate a theoretical model, understand the phenomenon in depth, and obtain a meaningful guide for action (Charmaz, 2014; Murphy et al., 2017; Watling & Lingard, 2012): emergence, theoretical sampling, constant comparison, and theoretical saturation. The principle of
Other key and fundamental elements of the model are:
However, despite the fact that the basic concepts of GT are widely known by qualitative researchers, its practical use in studies presents some difficulties (Murphy et al., 2017). There are multiple ways of understanding and applying Grounded Theory, and within each approach, different types of interpretation are used. Since the publication of Glaser and Strauss's book in 1967, three main versions of Grounded Theory have emerged (Charmaz, 2011; 2014; Glaser, 1993; Kelle, 2019; Matteucci & Gnoth, 2017; Strauss & Corbin, 1990). First, Glaser, the more orthodox, insists on the generation of theories from the primary data obtained by the researcher. Second, Strauss adds new instruments of analysis, such as the interpretive description of data, axial coding, diagrams, the matrix, or the use of computer programs (Abela et al., 2007). Finally, Charmaz, working from a constructivist theoretical basis, focuses on allowing topics to emerge organically from the data. Her proposal is a redesign of the model through a systematic approach that prioritizes the integration of the researcher's subjective experience and the social conditions specific to the subject of study (Bonilla-García & López-Suárez, 2016; Charmaz, 2011). Therefore, although this qualitative methodology has undergone important development, its complexity does not allow researchers to understand it in sufficient depth (Contreras et al., 2020) and, in addition, it exhibits some ambivalence or vagueness (Prigol & Behrens, 2019).
The aim of this article is to describe and model the process of applying GT in educational technology research by presenting the different components of the procedure alongside the methodological practices applied in an educational technology study. A further aim is to assess the applicability of GT in educational research and to provide a roadmap to help other researchers use GT.
This contribution is not developed from the orthodoxy of the method, as we consider GT to be an instrument of data analysis that leads to theory elaboration. Consequently, the strategy described is directed towards Strauss' interpretive approach to GT, which is more open and flexible. As such, those tools and strategies that have been deemed suitable have been selected and used, and it is hoped that this knowledge will provide an aid or resource for researchers using GT. However, the intention is not to provide a “recipe” for how to automatically “build theories”. In fact, one of the strengths of GT is its versatility, i.e. its adaptability to the most appropriate decisions for each study. For this reason, it is advised that the proposed strategies should be seen as useful tools rather than a single path for all GT studies.
Method
Context of the Research
This study is supported by the methodological process applied in a qualitative study on educational technology, entitled “
Objectives and Research Questions
The general purpose of GT research should be to analyse and understand processes, thus generating a formal model or theory of the phenomenon (Flick, 2019). The general objective of the PolEx-ICT study, in which GT was used, was to delve into the processes of ICT integration in primary and early childhood education schools, in such a way that the current state of the process of ICT integration in general, and the phenomena that are occurring around it, can be understood. A series of specific objectives were also defined:
For GT, it is not enough to merely describe reality and facts; one must understand the phenomena in order to extract meaning from the observed processes. Researchers commonly preserve the open and emergent approach of GT by setting two or three specific research questions that initially guide the attention of the research process. Their purpose is to “explore” in order to understand the phenomenon rather than to “confirm” any preconceived ideas (Murphy et al., 2017). These questions are not only the point of reference during data collection and report writing, but they also provide a guide for making sense of previous results and clarify the usefulness of potential findings. Therefore, the questions elaborated in GT are usually along the lines of: “What is happening in this context?” and “What phenomena are taking place?”. As answers are obtained, decisions about theoretical sampling are made.
Research Questions by Grounded Theory typologies in PolEx-Information and Communication Technologies.
ICT = information and communication technologies
At the same time as analysing the data from the PolEx-ICT case studies and building a theory, the research questions were reformulated based on these results and new questions were posed that required different types of data. For this reason, the study required a period of 3 years (Abela et al., 2007).
Literature Review
It is recommended that research questions are categorized by dimensions or themes obtained from a literature review (Stake, 1998). In this way, the researcher has a conceptual structure with which to focus on the relevant aspects. This review can be approached in two ways (Murphy et al., 2017): the
According to these three phases, in the research design of the PolEx-ICT study, based on Grounded Theory, a literature review was conducted before the start of each theoretical sampling/research study (Figure 1). First, in order to become familiar with the topic, the researcher looked for the latest scientific manuals on educational technology and ICT integration in education. Second, the researcher searched databases (Eric, Scopus, and WOS) for scientific articles on empirical studies whose main objective was the study of organizational aspects within the integration of ICT in primary schools. And third, the researcher searched databases (Eric, Scopus, and WOS) for Systematic Literature Reviews (SLR) on the topic under study, which would allow the data and the emerging theory to be compared and verified. This provided the state of the art that contributed to the decision-making process in the selection of cases, the development of research instruments, and the formulation of new research questions during the process of analysis of the three research studies (Strauss & Corbin, 2002).
PolEx-Information and communication technologies research design based on the theoretical samplings.
Theoretical Sampling
Theoretical sampling is the starting point for GT research. It establishes the criteria for selecting people, cases, or situations from which to collect and analyse data (Prigol & Behrens, 2019), as well as for making appropriate decisions about expanding the sample when more information is needed (Glaser & Strauss, 1967). Theoretical sampling is integrated and interconnected with the research process and occurs simultaneously with data analysis, making it the most complex and important component of the study (Morse & Clark, 2019). The data collection process generates the emergent theory from the moment that information collection, coding, and analysis determine what data will be needed next to answer the research questions and from which samples they should be drawn. As key concepts emerge and categories and properties take shape, the researcher requires more data from both new subjects and different contexts in order to understand the phenomenon in depth and continue theory development. Therefore, as the analysis matures, sample selection strategies and criteria will change accordingly (Morse & Clark, 2019).
There are several types of sampling depending on the timing of the research process and the objectives set (Urquhart, 2019). The basic theoretical sampling types are (Strauss & Corbin, 2002):
Theoretical sampling requires researchers to access the field repeatedly at different times. Prolonging the research process in this way is a limitation that must be taken into account as researchers, in order to exploit the full potential of theoretical sampling, must analyse the data before returning to the context of the study (Matteucci & Gnoth, 2017). Consequently, in the study design, each theoretical sampling should be associated with a time or phase of the process in order to establish a purpose for each sampling. However, we must not forget that GT implies a flexible design that is adapted to the evolution of concepts in the process of theory building and confirmation (Morse & Clark, 2019).
Therefore, taking into account the different phases of theoretical sampling, the PolEx-ICT GT model conceives study design as a spiral (Figure 1), through which the three types of sampling (open, relational and variational, and discriminate) are developed, and in which the type of groups, informants, or new settings to be explored in order to shape the theory are determined (Trinidad et al., 2006).
Decisions Concerning the Sample
Types of Case Studies Developed in PolEx-Information and Communication Technologies.
Data Collection Instruments
In order to meet the objectives set out in each of the studies, according to each type of theoretical sampling, different data collection instruments were used. The first study, following open sampling, had the widest variety of data collection instruments (institutional documents, teaching materials, classroom observations, individual semi-structured interviews, and group discussions). In the second and third studies, considering relational and variational, and discriminate sampling, the study focused on conducting individual semi-structured interviews with different educational agents, which are described below.
Data Collection Instruments and Informants According to The Type of Sampling and Study in PolEx-Information and Communication Technologies.
a Four primary schools participated in study 1.
b One primary school participated in study 2.
c Four primary schools participated in study 3.
Informants
Therefore, according to the theoretical sampling developed at each point in time, those people who meet certain criteria or requirements will be selected. In PolEx-ICT, teachers who frequently use ICT, teachers belonging to the school’s organizational structure (management team or ICT coordinators), students, families belonging to the Parents' Association (PA), and other external agents such as advisors from the Centre for Teachers and Resources (CTR), which is a body for the continuous training of teachers, were selected.
PolEx-ICT used a variety of data collection processes, including techniques and instruments used in the different phases of the study and its informants (people, documentation, and settings).
Results
Constant Comparative Method
GT is, in essence, a comparative method. Glaser and Strauss integrate coding and theory generation in a systematic way through an analytical procedure of constant comparison, developing categories and properties (Abela et al., 2007). Thus, the constant comparative method consists of the search for similarities and differences by analysing the incidents contained in the data, through the simultaneous processes of coding and analysis, with the purpose of systematically generating a theory (Trinidad et al., 2006).
It is difficult to separate coding from other elements of analysis in GT, let alone establish an order in the elements, especially as, in some cases, they are carried out simultaneously and iteratively (Belgrave & Seide, 2019). However, there must be a starting point, which is why it is useful to establish phases that define the method as steps in a process that is by no means linear. The constant comparative method is developed in four stages:
Phase 1: Open Coding
As data are collected, researchers normally transcribe and code them using the principle of constant comparison. Coding is the phase in which each piece of text is assigned an indicator or code specific to the category in which it is considered to fall, and is classified by concepts and meanings. Charmaz defines the process of coding as “attaching labels to data segments that represent what each segment is about” (2006, p. 3).
Thus, the analyst’s initial task is to code each event to form as many categories of analysis as emerge from the data (Abela et al., 2007; Charmaz, 2006). This procedure consists of fragmenting, examining, comparing, and conceptualizing information and concrete data from institutional documents, observations, interviews, and group discussions, and is carried out by reading each transcript and attaching “codes” (essentially just labels) to fragments of text to reveal meaning. Codes can be applied to text segments of any length, as long as the segment conveys a coherent idea. However, to begin with open coding, it is recommended to conduct a microanalysis, i.e. a detailed, line-by-line examination of the data.
This whole process of microanalysis and coding the different segments of meaning should be guided by analytical questions such as: “What does the data suggest?”, “What is the central idea of this paragraph or sentence?”, “What is the text talking about?”, “What is happening in the text?”, and “What concept does this text extract suggest?” (Charmaz, 2017). Following this, a name or code is assigned to it; as the codes begin to accumulate within the dimensions determined at the beginning of the study, the study proceeds to classification or categorization under more abstract explanatory terms, i.e. into categories (Strauss & Corbin, 2002). Subcategories then usually emerge from the categories, giving greater specificity to the category and forming what we can call a
It should be noted that the researcher can code not only text fragments, but also emotional states, attitudes, and settings, as well as the participants’ silence (Prigol & Behrens, 2019). Furthermore, several codes or labels can be assigned to the same text segment if the researcher considers that it can contribute information to different categories (Kelle, 2019). When using the Tabula Geminus approach (Kreiner, 2015), code names can be designated with vocabulary from existing literature. In contrast, if the researcher follows the Tabula Rasa approach (Gioia et al., 2012), coding avoids literature-based terms (Murphy et al., 2017). Especially useful during the coding process are in vivo codes: terms used by informants during the interview that become part of the coding structure (Belgrave & Seide, 2019; Murphy et al., 2017; Vegas, 2016).
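For researchers who manage their codes programmatically rather than (or in addition to) using CAQDAS software, the open-coding bookkeeping described above can be sketched as a simple data structure: segments carry one or more codes (including in vivo codes), and codes are later grouped under more abstract categories. This is an illustrative sketch only; the segment texts, codes, and category names below are invented for the example and do not come from PolEx-ICT.

```python
from collections import defaultdict

# Open coding: each text segment may receive several codes (labels),
# which are later classified under more abstract categories.
segments = {
    "s1": "We share the tablets between classrooms on a weekly rota.",
    "s2": "The ICT coordinator decides which apps we can install.",
}

# A segment can carry more than one code (Kelle, 2019);
# "weekly rota" is an in vivo code taken from the informant's own words.
codes_by_segment = defaultdict(list)
codes_by_segment["s1"] += ["resource sharing", "weekly rota"]
codes_by_segment["s2"] += ["ICT coordination", "software control"]

# Categorization: codes are grouped under explanatory categories.
category_of = {
    "resource sharing": "Organization of ICT resources",
    "weekly rota": "Organization of ICT resources",
    "ICT coordination": "School organizational structure",
    "software control": "School organizational structure",
}

def segments_for_category(category):
    """Return the segments whose codes fall under a given category."""
    return sorted(
        seg for seg, codes in codes_by_segment.items()
        if any(category_of.get(c) == category for c in codes)
    )

print(segments_for_category("Organization of ICT resources"))  # ['s1']
```

Keeping the segment-to-code and code-to-category mappings separate mirrors the two moves of this phase: attaching labels to data, then classifying the labels.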
Example of Content Analysis in Memorandums.
ICT = information and communication technologies
Tools to Facilitate Microanalysis and Coding
Each researcher must find their own style of memo writing, as long as these notes are made in an orderly, progressive, systematic, and easily retrievable manner, thus providing the researcher with a bank of analytical ideas classified and grouped according to the evolving theoretical framework.
Example of Classification of Memos by Dimension and Research Study in PolEx-Information and Communication Technologies.
In each of the memos, the whole evolutionary process of the constant comparative method of one dimension and, therefore, of the different coding procedures (open, axial, and selective) is comprehended. Therefore, all the memos contain the following elements:
Phase 2: Axial Coding
Axial coding requires an in-depth analysis of a category in order to uncover interactions and relationships with other categories, subcategories, and properties (Strauss & Corbin, 2002). Axial coding enables data to be recomposed and gives coherence to the analysis and to the emerging theory, pointing out sub-dimensions and properties within a context, allowing for more precise explanations that respond to the phenomenon with questions such as: when, where, why, who, how, and with what consequences (Prigol & Behrens, 2019).
Procedurally, Strauss and Corbin (2002) point out four basic tasks in axial coding:
Example of an Organizational Table of Ideas in PolEx-Information and Communication Technologies.
ICT = information and communication technologies
The construction of matrices is a very useful method for creating and validating hypotheses (Conde, 2009). In this procedure, the most common data, and from whom they originate, are made explicit. It makes it possible to answer the questions of axial coding: “Why does it happen?”, “Where/when does it happen?” and, “With what/whom?” (Strauss & Corbin, 2002).
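As a minimal sketch, such a matrix can be stored as a table crossing categories with informant groups, with cells counting coded incidents, which makes explicit which data are most common and from whom they originate. The category names, informant groups, and counts below are hypothetical, invented only to illustrate the structure.

```python
# Axial-coding matrix: rows = categories, columns = informant groups,
# cells = number of coded incidents attributed to that group.
matrix = {
    "ICT coordination": {"teachers": 14, "management team": 9, "families": 1},
    "resource sharing": {"teachers": 6, "management team": 2, "families": 4},
}

def main_source(category):
    """Informant group contributing the most incidents to a category,
    i.e. an answer to the axial question 'With what/whom?'."""
    return max(matrix[category], key=matrix[category].get)

print(main_source("ICT coordination"))  # teachers
```

Reading the matrix row-wise answers "with whom does it happen?"; comparing rows shows which categories are thinly supported and may need further sampling.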
Once conclusions have been drawn from these data, it is necessary to arrange and present them in an orderly fashion. Thus, axial coding concludes with the outline of a diagram or model called a
Concept maps can be used as the instrument to represent the diagrams, as they are the best method to visualize the central concepts, illustrate the relationships established between them (Conde, 2009), and even show the mobility process observed throughout the different stages of the research. These concept maps can display the concepts (inside “boxes”) that express the themes and categories found, the relationships (represented by lines) between themes and categories, the type of relationship (represented by the word-link or connector) that is established between themes and categories, as well as the time of analysis (research study) in which a theme or category appears, represented by the colours of the concepts (Figure 2).
Example of definitive concept map in PolEx-Information and communication technologies.
Thus, throughout the axial coding process, concept maps can be progressively constructed for each dimension. As the analyses and the different studies progress, the concept maps are modified, shaping the analysis, improving the understanding of the data, and facilitating the subsequent drawing of conclusions.
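Outside dedicated concept-mapping software, such a diagram reduces to a labelled graph: concepts as nodes, (concept, word-link, concept) triples as edges, and a per-concept attribute recording the research study in which it emerged. The concepts, connectors, and study numbers below are invented examples, not PolEx-ICT results.

```python
# A concept map as labelled triples: (concept, word-link, concept).
edges = [
    ("ICT integration", "depends on", "teacher training"),
    ("ICT integration", "is coordinated by", "ICT coordinator"),
    ("teacher training", "is provided by", "CTR advisors"),
]

# The research study in which each concept first appeared
# (the role played by colour in the drawn map).
study_of_concept = {"ICT integration": 1, "teacher training": 1,
                    "ICT coordinator": 2, "CTR advisors": 3}

def related_to(concept):
    """Concepts directly linked to `concept`, with their connectors."""
    out = [(link, target) for source, link, target in edges if source == concept]
    out += [(link, source) for source, link, target in edges if target == concept]
    return out

print(related_to("teacher training"))
# [('is provided by', 'CTR advisors'), ('depends on', 'ICT integration')]
```

Because the map is just data, it can be re-generated after each study, making the progressive modification described above reproducible rather than purely graphical.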
Phase 3: Delimitation of the Emerging Theory That is Beginning to Develop
The analyst ends up with coded data, categories, memos, and a possible theoretical postulate shown in a concept map. As a result of the constant comparison of the categories, the emerging theory is modified and becomes more consistent. The ideas that have been reflected in the memos concretize the meanings of the categories and their relationships. The annotations in the memos are the main support for theoretical coding or selective coding (Abela et al., 2007; Glazer et al., 2005; Strauss & Corbin, 2002), whose main objective is to “reweave the fractured story” (Glaser & Strauss, 1967).
Thus, once the theoretical scheme has been generated from the conceptual map and the interpretations made in its memos, the researcher returns to the units or segments and compares them with the emerging scheme in order to substantiate it, delimit the categories, and outline the emerging theory (Hernández et al., 2006). Selective coding then begins, in order to integrate and refine the theory (Strauss & Corbin, 2002). To carry out the selective coding process, it is recommended to produce a theoretical write-up that describes the relationships between the categories from a central concept, as well as the process or phenomenon, in the light of our theory (Strauss & Corbin, 2002). It can be affirmed that the theory constructed thus far is of medium scope but has a high explanatory capacity for the whole of the data collected (Hernández et al., 2006).
The theoretical writing of PolEx-ICT was developed in different parts, each of which constitutes a research study and each of which developed different dimensions and categories that emerged throughout the research process. In each of these documents, the way in which the concepts emerge and how the data are developed, as well as the way in which the central categories are generated, can be seen. In the end, a final report that articulates all the results of the research is written.
Example of Central Ideas Extracted from an Organizational Table in PolEx-Information and Communication Technologies.
Phase 4: Saturation of the Incidents Specific to Each Category and Construction of the Theory
The researcher has to apply constant comparison at the end of the analysis process for each of the research units. Therefore, it is necessary to determine whether or not the set of categories has become saturated and thus guide the researcher on how to proceed in the next steps of the study. As a result of the constant comparison, the researcher can see the need to initiate a new study, address what happens in certain settings, or identify which individuals can provide new information.
Theoretical saturation occurs when no new properties of the category emerge from the data, i.e. more information does not add anything new or relevant (Glaser & Strauss, 1967), and when no additional data are found to identify new dimensions, codings, actions/interactions, or consequences (Strauss & Corbin, 2002). As noted above, at the end of the memorandum for each of the categories, a process of formulating and identifying new research questions to guide further phases of the study is undertaken. The researchers involved brainstorm research questions on the concepts emerging from the conceptual map elaborated during the axial coding phase, pointing out those categories that are considered to be “saturated” and on which no further information needs to be collected (Figure 3). With this strategy, the researcher asks questions about the latest data analyses in order to determine whether or not they bring new meanings to the phenomenon. If it is assessed that there is theoretical saturation, then the study is completed.
Example of a concept map with saturated subcategory and questions for study 3 in PolEx-Information and communication technologies.
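In operational terms, the saturation check is a comparison between the codes already gathered in a category and those produced by the newest round of analysis: a category is provisionally saturated when a new batch contributes no unseen codes. A minimal sketch, with hypothetical category and code names (the substantive judgement of whether a code is genuinely "new" remains the researcher's):

```python
def is_saturated(existing_codes, new_batch_codes):
    """A category is provisionally saturated when the newest round of
    analysis contributes no codes not already present in the category."""
    return set(new_batch_codes) <= set(existing_codes)

category = {"resource sharing", "ICT coordination", "software control"}

# A batch with a genuinely new code -> not saturated; sampling continues.
print(is_saturated(category, {"resource sharing", "family involvement"}))  # False

# A batch that only repeats known codes -> provisionally saturated.
print(is_saturated(category, {"ICT coordination", "software control"}))  # True
```

A non-saturated result signals, as described above, the need for a further theoretical sampling round rather than the end of the study.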
Conclusion
As has been observed throughout this study, GT has a long history of theoretical and research development, and it is one of the most widely used qualitative research methods. However, it is also one of the most controversial among researchers. In the practical application of the method, researchers encounter an approach that is not always clear, that has some contradictions, and that can be vague (Contreras et al., 2020; Murphy et al., 2017; Prigol & Behrens, 2019). The objective of this article has been to develop a detailed study design and data analysis strategy based on GT, drawing on a study on educational technology (PolEx-ICT), in such a way that this knowledge provides an aid or resource to both novice and experienced researchers. To this end, a conceptual overview of the field and the methodological route taken in the study have been presented, highlighting the use of analytical tools that help the researcher make decisions and implement GT. A broad and updated theoretical base has been used to describe the conceptual and structural aspects of GT and the steps to follow when applying it, using its application in the PolEx-ICT study as an example.
If interpreting the method for use is not easy, and is not always clear, teaching its application is even more difficult. Moreover, there are currently no reference documents with a tangible, practical application and concrete examples that could serve as a guide or complement for teachers training novice researchers. This is why the present study may represent a good didactic guide to help teachers in their work with novice researchers.
In addition, the aim is to contribute to the field of educational research, especially educational technology, which requires rigorous qualitative research methods, and to highlight GT as a very appropriate approach for research in educational technology. In this field, GT is usually an unknown method, and some who are aware of it consider it unscientific (García-Yepes & Rodríguez-Roja, 2018), so studies in educational technology that use GT as the main method are scarce. Therefore, the process has been exemplified with the intention of showing the systematicity of the GT research process and giving it the scientific value the methodology deserves (Charmaz, 2014), as well as highlighting that it allows for a greater understanding of phenomena and offers theoretical explanations of great value in the field of educational research that other methods, both qualitative and quantitative, do not (Charmaz, 2005).
GT leads to the generation of new theories that, from the realities of the context and its participants, provide organizations with new paradigms and ways of perceiving phenomena that could not have been evidenced before (Contreras et al., 2020). It is undeniable that GT research has a creative component, which is why the tools presented here are flexible and open to modification to suit the needs and objectives of the researchers. In short, the work presented here is an application (interpretation) of the methodology in the educational technology field, and therefore a particular development adapted to the needs of the PolEx-ICT study, which can always be improved. The researcher must therefore be a creative person who interprets and analyses the data in their own way, provided this is done with sufficient data collection, order, and rigour until saturation is reached, which, in general terms, is what actually drives new concepts.
