Abstract
Introduction
Learning analytics is gaining popularity among researchers because of its emphasis on measuring, collecting, analyzing, and reporting data about learners and their environments in order to understand and optimize learning and the environments in which it occurs (Long & Siemens, 2011; Siemens & Baker, 2012). In practice, learning analytics utilizes data sources, analytical methods, and predictive modeling to inform interventions that aim to improve students’ learning experiences (Kew & Tasir, 2017; Sønderlund et al., 2018; Williams, 2014). Learning analytics encompasses five main elements: data collection, analysis, student learning, audience, and interventions (Brown, 2011). Interventions are thus vital in “closing the loop” within the cyclic learning analytics process (Clow, 2012; Knobbout & Van Der Stappen, 2020). Data-driven methodologies have guided learning analytics-based interventions in identifying student difficulties and providing timely, personalized assistance, leading to beneficial outcomes for both learners and the entire learning environment (B. T. M. Wong & Li, 2020). This study examines the impact of learning analytics-based interventions on student learning outcomes, focusing on knowledge acquisition, cognitive skills, and social emotion, using the framework proposed by Darling-Hammond et al. (2019) as the basis for the analysis.
While several studies have comprehensively reviewed the literature on learning analytics-based interventions from different perspectives, there is a scarcity of empirical research specifically investigating the effects of these interventions on learning outcomes (Kew & Tasir, 2017; Tepgeç & Ifenthaler, 2022; Zheng et al., 2023). The question of how learning analytics-based interventions enhance student learning outcomes therefore remains unanswered, which implies that educators have yet to implement such interventions effectively to improve students’ learning outcomes (Rienties, Cross & Zdrahal, 2016). For example, Zheng et al. (2023) conducted a meta-analysis of 33 empirical studies, published between 2012 and 2021, on learning analytics-based interventions and their impact on learning achievement. The authors discovered that interventions based on learning analytics had a significant and favorable effect on learning achievement, and that the impact of these interventions was greatly influenced by various factors, including the composition of the sample, the areas of learning, the methods of learning, the technology used for learning analytics, and the metrics used for learning analytics. Nevertheless, that study assessed only the overall influence of learning analytics-based interventions on learning outcomes, without revealing the precise effects of these interventions or determining whether notable distinctions existed among these factors. Sønderlund et al. (2018) conducted a comprehensive analysis of 11 studies that examined the impact of learning analytics-based interventions on the effectiveness of higher education. The authors found that interventions based on learning analytics had the potential to improve student success and retention.
Nevertheless, the study did not reveal the magnitudes of the effects of learning analytics-based interventions on students’ learning outcomes or identify the most effective intervention methods for improving them. Thus, it remains unclear how interventions based on learning analytics can better encourage changes in learning outcomes. Furthermore, prior empirical research has produced mixed or contradictory findings on the efficacy of learning analytics-based interventions in promoting gains in learning outcomes. To examine how self-directed meta-cognitive prompts affected students’ navigation behavior and learning performance, Bannert et al. (2015) used a pre-post experimental design. They found that the group that used self-directed meta-cognitive prompts demonstrated superior transfer performance compared with those who did not. Lonn et al. (2014) examined student motivation within a learning analytics-based intervention implemented during a summer bridge program. Their results indicated that self-report interventions launched by teachers had a detrimental effect on the course mastery scores of remedial students.
The studies above indicate the inconsistent effectiveness of learning analytics-based interventions in promoting students’ learning. Hence, an extensive review is crucial to assess whether, and to what extent, learning analytics-based interventions increase or decrease learning outcomes. Meta-analysis is a systematic research method that statistically analyzes and synthesizes many individual study outcomes using effect sizes (ES; Glass, 1976). This method can reduce the uncertainty associated with individual studies and produce more decisive research outcomes (Lipsey & Wilson, 2000).
This meta-analysis aimed to examine the efficacy of interventions based on learning analytics in improving learning outcomes and to provide guidance for future efforts to improve learning environments and interventions. The study addressed the following research questions:
What is the total effect size of learning analytics-based interventions in enhancing students’ learning outcomes and their impact on the three dimensions of learning outcomes (i.e., knowledge acquisition, cognitive skills, and social emotion)?
If there are variations in the effects of different experimental studies included in a meta-analysis, how do various moderating variables impact the disparities in research conclusions?
The structure of this paper is as follows: Section “Literature Review” examines the definition of learning analytics-based interventions, the dimensions that determine their effectiveness, and the factors that influence their efficacy. In the following section, we detail our methodology, including the literature search strategy, criteria for sample data inclusion and exclusion, data coding and analysis, and the assessment of publication bias and heterogeneity. In the “Results” and “Discussion” sections, we present and discuss the findings of our study. Finally, in the “Implications, Limitations, and Future Directions” section, we summarize the conclusions, discuss the limitations of this study, and suggest directions for future research.
Literature Review
Learning Analytics-Based Interventions Definition and Effectiveness Dimensions
Learning analytics-based interventions have recently become widely used in the education sector. Nevertheless, a precise and unambiguous definition of learning analytics-based interventions is currently lacking. After a comprehensive review and analysis of previous research, definitions of learning analytics-based interventions can be broadly categorized into two approaches. The first perspective highlights the comprehensive structure of interventions based on learning analytics. For instance, Wise (2014) defines learning analytics-based interventions as the surrounding frame of activity through which analytic tools, data, and reports are taken up and utilized. Khalil and Ebner (2015) mentioned that “Intervention, as the final stage of the learning analytics cycle, depends on the input from the previous stage to determine the specific issues to be addressed.” The Society for Learning Analytics Research (SoLAR) asserts that learning analytics-based interventions are designed to facilitate the optimization of learning and learning environments (Society of Learning Analytics Research, n.d.). The second perspective, by contrast, focuses on the specific operations and applications of learning analytics-based interventions. For example, Brown (2011) noted that learning analytics-based interventions involve analyzing digital data to help educators understand student behavior patterns, predict which students might face difficulties, and provide timely support; they can also assist in optimizing course design, improving teaching methods, and evaluating the effectiveness of educational programs. B. T. M. Wong and Li (2020) pointed out that learning analytics-based interventions are measures taken to enhance student outcomes by addressing challenges such as reduced retention and inadequate pass rates, utilizing data on student learning progress. Zheng et al.
(2023) consider learning analytics-based interventions to refer to interventions based on learning analytics dashboards and interventions informed by learning analytics (e.g., providing feedback, prompts, or suggestions based on the outcomes of learning analytics). There is, however, a distinction between the two categories of definitions. The first emphasizes the overall framework of learning analytics-based interventions and is rather broad, whereas the second focuses on their specific operations and applications, neglecting the importance of learning analytics-based interventions as an overall framework. This study holds that the definition of learning analytics-based interventions should emphasize the overall framework while also attending to specific operations and applications. In addition, Zheng et al. (2023) pointed out that learning analytics-based interventions involve two learning settings: face-to-face and online learning. This study focuses on online learning interventions. Therefore, this study considers learning analytics-based interventions to be the measures taken by employing learning analytics technology to collect and analyze data from learners and their contexts (e.g., providing automated feedback, personalizing course materials, supporting learners, and sending information to learners or messages from teachers to learners).
In the field of practical teaching, several researchers have examined the effectiveness of learning analytics-based interventions based on this definition. For instance, Rienties, Boroowa, et al. (2016) showed that the proactive dispatch of warning emails to students identified as at risk positively influences their attitudes, behaviors, and cognitive processes. Kim et al. (2016) carried out an experimental study demonstrating that the use of learning analytics dashboards can enhance students’ academic performance. Lu et al. (2017) found that teachers sending messages to students based on learning analytics reports can enhance students’ self-regulation abilities. Y. Lee and Specht (2023) found that providing automated feedback in Human-Robot Interaction (HRI) can enhance learners’ self-efficacy. In examining the effectiveness of learning analytics-based interventions, some researchers have concentrated on learners’ knowledge acquisition, as evidenced by academic performance, examination scores, course grades, and grade point average (GPA; Kim et al., 2016; Knobbout & Van Der Stappen, 2020; Viberg et al., 2018; Zheng et al., 2023). Others have emphasized the enhancement of cognitive skills, including abilities such as self-regulated learning and collaborative learning (Heikkinen et al., 2023; Lu et al., 2017; Nussbaumer et al., 2015; J. Wong et al., 2019). Still others have concentrated on the social-emotional dimensions, including learning attitudes, motivation, and satisfaction (D. Lee et al., 2019; Lonn et al., 2014; Rienties, Boroowa, et al., 2016; Smith et al., 2012). This aligns with the broader understanding of learning outcomes described by Darling-Hammond et al. (2019), who state that learning outcomes are primarily academic, cognitive, and social-emotional. Building on the classification of learning outcomes by Darling-Hammond et al.
(2019), and synthesizing existing research on the effectiveness of learning analytics-based interventions, this study categorizes the outcomes of learning analytics-based interventions into knowledge acquisition, cognitive skill, and social emotion.
Factors in the Effectiveness of Learning Analytics-Based Interventions
The effectiveness of learning analytics-based interventions may vary across different factors (B. T. M. Wong, 2017; B. T. M. Wong & Li, 2020). For example, regarding subject area, Rienties, Boroowa, et al. (2016) found that the intervention effect in Technology was better than expected, while Psychology performed relatively poorly, although with a minor impact. From the perspective of the learning stage, Zhang et al. (2023) mentioned in their meta-analysis that intervention effects in higher education are superior to those in primary, middle, and high schools. In terms of intervention duration, Jayaprakash et al. (2014) mentioned that sending academic alerts to at-risk students through learning analytics tools within the first 5 weeks of a course allows both teachers and students enough time to identify and address existing academic issues, thereby improving student academic performance, whereas Rienties, Boroowa, et al. (2016) adopted a quasi-experimental method to analyze the impact of sending early warning emails to potentially at-risk students; from the 12th week onward, the experimental group showed considerably greater involvement in the virtual learning environment than the control group. From the perspective of the learning environment, Kew and Tasir (2021) noted that the online learning environments most commonly used for learning analytics-based interventions mainly comprise four types, namely the Learning Management System (LMS; such as Moodle and Blackboard), computer-based environments (such as CSCL and desktop applications), the virtual learning environment (VLE), and web-based environments (other than LMS, VLE, and MOOC), along with others. Kim et al. (2016) conducted an experimental study on groups using a dashboard, comparing the experimental group with a control group.
These authors found that students in the experimental group who used the dashboard had higher final scores than those in the control group who did not, indicating that learning analytics dashboards (LAD) can improve student academic achievement. Kim et al. (2016) also indicated that high-achieving learners show lower satisfaction with using the dashboard. Knobbout and Van Der Stappen (2020) used open coding to synthesize categories of intervention types based on a systematic literature review of 62 articles about learning analytics-based interventions. Eight distinct types of learning analytics-based interventions have been identified in the literature so far: automated feedback (AF), dashboards (D), information for learners (IL), information for teachers (IT), messages from teachers to learners (MT), course material personalization (PC), learner support (LS), and visualization (V). The effects of learning analytics-based feedback in massive online courses were investigated experimentally by Lim et al. (2019). These authors found that students in the experimental group who received feedback academically outperformed those in the control group who did not, suggesting that learning analytics can help students learn. Conversely, Bannert and Reimann (2012) interviewed students exposed to self-regulation prompts; half of these students felt the prompts hindered and interrupted their learning. Regarding diagnostic assessment tools, Xu et al. (2023a) systematically coded the experimental literature included in their meta-analysis and classified the assessment tools into several categories. Kew and Tasir (2017) pointed out that learning analytics technology can capture, record, and store learners’ login frequency, page views, number of posts, replies, academic performance, and test scores, which lends itself to standardized test assessments. Lonn et al.
(2014) found it difficult to integrate psychological scales with existing learning analytics-based intervention measures.
In summary, several variables affect how well interventions based on learning analytics work. The present study investigated the effectiveness of learning analytics-based interventions in enhancing student learning outcomes, particularly the impact of subject area, learning stage, intervention duration, learning environment, intervention type, and diagnostic assessment tool on student learning improvement. It aims to provide evidence on whether these factors enhance or reduce student learning outcomes and to what extent.
Method
This research followed Cooper’s (2010) rigorous meta-analysis approach for analyzing quantitative data from multiple studies on the same topic, including database searching, identification, screening, eligibility assessment, merging, duplicate removal, and study analysis. RevMan 5.4 was used to conduct the meta-analysis. Cohen’s kappa coefficient was used to test the consistency of the three researchers’ coding, and publication bias and heterogeneity tests were performed on the sample data to evaluate this meta-analysis.
Literature Search Strategy
A comprehensive literature search was carried out to gain insight into the present state of research regarding the efficacy of learning analytics interventions in enhancing learning outcomes. The search encompassed several electronic databases, including the Web of Science Core Collection, the Chinese Social Science Citation Index (CSSCI), and journal papers indexed in the China National Knowledge Infrastructure (CNKI). The LAK conference and these databases were chosen because they maintain stringent quality standards for the literature they compile, ensuring authenticity and reliability through mechanisms such as peer review. In Web of Science, the study employed the following Boolean search query: TS=(“learning analytics” or “learning analysis” and “experiment” or “empirical study” or “control group” or “quasi-experiment”) or (“learning analytics” or “learning analysis” and “pretest” or “post-test”) and (“intervention” or “instructional intervention” or “educational intervention”) and (“learning effect” or “learning outcome” or “learning achievement” or “learning performance”). The search was restricted to the category “Education and Educational Research” and the period from 2010-01-31 to 2023-12-31; a total of 681 items were retained. In the CNKI, the search string SU=(“learning analytics”*“intervention”+“learning analytics”*“instructional intervention”+“learning analytics”*“educational intervention”+“learning analytics”*“learning effect”+“learning analytics”*“learning outcome”+“learning analytics”*“learning achievement”+“learning analytics”*“learning performance”) was used, combined with AND terms covering experiment, empirical study, control group, quasi-experiment, pretest, and post-test, for the period from January 1, 2010, to December 31, 2023; 143 items were retained.
Based on these search strategies, 1,002 relevant studies were initially identified. After eliminating 47 duplicates, 955 studies remained. Two researchers then reviewed the titles and abstracts of the 955 collected studies, and 34 documents were selected according to the inclusion criteria.
Inclusion Criteria and Exclusion Criteria
This meta-analysis incorporated literature matching the following criteria: (1) the topic had to be learning analytics-based interventions on students’ learning outcomes; (2) the study had to use an experimental or quasi-experimental design with pre- and post-test or comparative experimental data; and (3) the study had to report specific statistics, such as sample size, mean, standard deviation, and other descriptive measures, for assessing efficacy in augmenting students’ learning outcomes. Additionally, to ensure the generalizability of the findings, we excluded studies that (1) were written in languages other than English and Chinese; (2) were conference abstracts or review articles; (3) did not provide usable effect sizes; or (4) published the same data more than once.
Two researchers meticulously examined the complete texts of each item, strictly adhering to the inclusion and exclusion criteria. They also employed a snowball sampling method, leveraging the references and citations of the included publications to achieve thorough coverage. In the end, 34 items were retained.
Coding of Study Characteristics and Moderator Variables
Two researchers independently reviewed the literature to identify relevant studies (Figure 1), achieved satisfactory agreement (Cronbach’s α = .847), and resolved disagreements through consensus discussions. The two researchers gathered descriptive data from each study, encompassing the paper title, authors’ names, and publication year. Furthermore, they retrieved data on the three metrics used to measure the learning impact of learning analytics-based interventions: sample size, mean value, and standard deviation.
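Inter-rater agreement of this kind can be computed directly. The following is a minimal illustrative sketch of Cohen's kappa for two coders; the labels are hypothetical and do not reproduce the study's actual coding data:

```python
def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(coder_a)
    # observed proportion of items the coders labeled identically
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    categories = set(coder_a) | set(coder_b)
    # expected agreement if each coder labeled at random according to
    # their own marginal category frequencies
    p_expected = sum((coder_a.count(c) / n) * (coder_b.count(c) / n)
                     for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

# two coders labeling four studies as "include"/"exclude" (hypothetical)
kappa = cohens_kappa(["include", "include", "exclude", "exclude"],
                     ["include", "include", "exclude", "include"])
print(round(kappa, 2))  # 0.5
```

Values above roughly .80 are conventionally read as near-perfect agreement, which is why coding disagreements below that level are usually resolved by discussion, as in this study.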

Figure 1. Flowchart illustrating the process of acquiring literature for the meta-analysis.
The variable information from the 34 experimental studies is primarily concentrated on two types of variables: the dependent variable (learning effect) and the moderator variables (subject area, learning stage, intervention duration, learning environment, intervention type, and diagnostic assessment tools). As the dependent variable, the learning effect included knowledge acquisition, cognitive skill, and social emotion. The literature identified six factors as moderator variables for subgroup analysis: subject area, learning stage, intervention duration, learning environment, intervention type, and diagnostic assessment tools. Table 1 presents the specific coding scheme, Appendix 1 lists the references included in the study, and Appendix 2 provides the coding information for the dependent and moderating variables.
The Process of Coding Characteristic Values in the Literature.
In this study, the sample size, mean, standard deviation, and other parameters of the 34 studies were extracted, and the standardized mean difference (SMD) was selected as the effect size to characterize the effect of learning analytics-based interventions on students’ learning outcomes. The SMD is the difference between the group means divided by the pooled standard deviation (Hedges, 1981), and its functional relationship is as follows:

SMD = (M_E − M_C) / S_pooled, where S_pooled = √[((N_E − 1)S_E² + (N_C − 1)S_C²) / (N_E + N_C − 2)]

where M_E and M_C denote the means of the experimental and control groups, respectively; N_E and N_C denote the sample sizes of the experimental and control groups, respectively; and S_E and S_C denote the standard deviations of the experimental and control groups, respectively.
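As an illustrative sketch of this computation, assuming the pooled-standard-deviation form of the SMD (Hedges, 1981) and hypothetical group statistics:

```python
import math

def smd(m_e, s_e, n_e, m_c, s_c, n_c):
    """Standardized mean difference between experimental (E) and
    control (C) groups, using the pooled standard deviation."""
    s_pooled = math.sqrt(((n_e - 1) * s_e ** 2 + (n_c - 1) * s_c ** 2)
                         / (n_e + n_c - 2))
    return (m_e - m_c) / s_pooled

# hypothetical example: experimental group M=80, SD=10, N=30;
# control group M=75, SD=10, N=30
print(smd(80, 10, 30, 75, 10, 30))  # 0.5
```

With equal group standard deviations the pooled SD reduces to that common value, so the SMD here is simply (80 − 75) / 10 = 0.5.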
Analysis of Data
Using RevMan 5.4, we conducted a comprehensive meta-analysis of the information in the relevant literature, calculated the overall effect of learning analytics-based interventions on students’ learning, and performed a homogeneity test to explore the influence of the moderating variables (subject area, learning stage, intervention duration, learning environment, intervention type, and diagnostic assessment tools) on students’ learning outcomes.
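RevMan performs the pooling internally; as a minimal sketch of the underlying DerSimonian-Laird random-effects computation (with hypothetical effect sizes and variances, not the study's data):

```python
import math

def random_effects_pool(effects, variances):
    """Pool effect sizes with a DerSimonian-Laird random-effects model."""
    w = [1.0 / v for v in variances]                   # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]     # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2, q

# hypothetical: three studies, SMDs 0.1/0.5/0.9, each with variance 0.04
pooled, se, tau2, q = random_effects_pool([0.1, 0.5, 0.9], [0.04, 0.04, 0.04])
print(round(pooled, 2), round(tau2, 2))  # 0.5 0.12
```

A positive between-study variance (tau² > 0), as in this toy example, is what motivates the random-effects rather than fixed-effect model used in the Results section.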
Assessment of Publication Bias and Heterogeneity
Publication Bias Test
Publication bias leads to an overestimation of effect sizes, because studies reporting statistically significant findings are more likely to be published and cited (Rothstein, 2008). The validity and reliability of meta-analysis findings hinge on an impartial and thorough examination of publication bias throughout the research process, so a publication bias assessment of the sample data is essential. This study used the most common methods for assessing publication bias: funnel plots and Egger’s test. In a funnel plot, the standard error is plotted vertically and the effect size horizontally; a symmetrical, even distribution of the sample data around the mean effect size suggests a low probability of publication bias. Of the 76 effect sizes examined in this work, most cluster in the upper-middle area of the funnel plot, which corresponds to the reliable region of the dataset, as shown in Figure 2. Further quantitative testing revealed that, with the exception of the knowledge acquisition variables, all variables exhibited good funnel plot symmetry in Egger’s test, indicating no signs of publication bias (Egger’s test:
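Egger's test itself is a linear regression: the standardized effect (effect size divided by its standard error) is regressed on precision (the reciprocal of the standard error), and an intercept far from zero signals funnel-plot asymmetry. A minimal sketch with hypothetical data:

```python
def eggers_intercept(effects, std_errors):
    """Egger's regression test: regress effect/SE on 1/SE and return
    the intercept; values far from zero suggest funnel asymmetry."""
    z = [e / s for e, s in zip(effects, std_errors)]   # standardized effects
    precision = [1.0 / s for s in std_errors]
    n = len(z)
    mx = sum(precision) / n
    my = sum(z) / n
    sxx = sum((x - mx) ** 2 for x in precision)
    sxy = sum((x - mx) * (y - my) for x, y in zip(precision, z))
    slope = sxy / sxx
    return my - slope * mx                             # regression intercept

# hypothetical, noise-free data constructed so the intercept is 1.0
# (up to floating-point rounding)
print(eggers_intercept([0.6, 0.7, 0.9], [0.1, 0.2, 0.4]))
```

In practice the intercept's standard error and a t-test against zero are also reported; statistical packages provide those alongside the intercept shown here.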

Figure 2. Funnel plot analyzing 76 effect sizes concerning publication bias across 34 studies.
Results of the Egger’s Test for Publication Bias.
Knowledge acquisition variables (Egger’s test:

Funnel plot of knowledge acquisition variables after trim and fill.
Heterogeneity Test
Homogeneity analysis examines whether all effect sizes estimate the same population mean; through this type of analysis, other factors that influence the strength of the relationship can be uncovered (Kyndt et al., 2013). Meta-analyses usually employ Cochran’s Q statistic for this purpose.
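Concretely, Cochran's Q sums the weighted squared deviations of each effect size from the inverse-variance-weighted mean, and the I² index expresses the share of that variation exceeding what chance alone would produce. A minimal sketch with hypothetical values:

```python
def heterogeneity(effects, variances):
    """Cochran's Q statistic and I² index for a set of effect sizes."""
    weights = [1.0 / v for v in variances]        # inverse-variance weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    # share of variability beyond chance; floored at 0%
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# hypothetical: three effect sizes 0.1/0.5/0.9, each with variance 0.04
q, i2 = heterogeneity([0.1, 0.5, 0.9], [0.04, 0.04, 0.04])
print(round(q, 1), round(i2, 1))  # 8.0 75.0
```

By the usual rule of thumb, I² around 25%, 50%, and 75% indicates low, moderate, and high heterogeneity, respectively; high heterogeneity is what justifies the subgroup (moderator) analyses reported later.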
Heterogeneity Test Results.
Results
An Examination of the Total Effect Size
A total of 34 empirical investigations comprising 76 effect sizes were included in the random-effects model. Figure 4 shows the forest plot of the 76 effect sizes produced in this study after the heterogeneity test. The forest plot indicates a statistically significant overall effect size of 0.46 (

Figure 4. Forest plot of the overall effect sizes.
This study analyzed knowledge acquisition, cognitive skill, and social emotion to better understand how learning analytics-based interventions improve learning outcomes (see Table 4). Learning analytics-based interventions improved knowledge acquisition (ES = 0.55), social emotion (ES = 0.39), and cognitive skills (ES = 0.35), with no significant differences between groups (
Test of Intervention Strategies Concerning the Three Learning Outcomes.
Analysis of Moderator Effect Size
The heterogeneity test results demonstrated substantial heterogeneity among the 76 effect sizes (
Test of Moderating Effects.
Subject Area
Interventions in different subject areas promoted the improvement of learning outcomes, and differences between groups were significant (
Learning Stage
Various learning stages impacted learning outcomes, and their effect sizes were significant at all grade levels (
Intervention Duration
Different intervention durations positively impacted learning outcomes, and no significant differences were found between groups (
Learning Environments
Different learning environments significantly impacted learning outcomes, with significant differences between groups (
Intervention Type
Different intervention types influenced learning outcomes, and differences between groups were significant (
Diagnostic Assessment Tool
Different diagnostic assessment tools influenced learning outcomes, and differences between groups were significant (
Discussion
The Impact of Learning Analytics-Based Interventions on Students' Learning Outcomes
This meta-analysis of 34 empirical studies indicates that learning analytics-based interventions can improve student learning outcomes and positively promote all three dimensions of learning outcomes. The results showed a large and significant effect on knowledge acquisition, while the corresponding enhancements of cognitive skills and social emotion were relatively smaller. These findings support the idea that learning analytics-based interventions can enhance academic performance, consistent with previous research (Kew & Tasir, 2017; Gong & Liu, 2019; Karaoglan Yilmaz & Yilmaz, 2022). They indicate that learning analytics can identify potential learning issues early by collecting and analyzing data on students’ learning behaviors; implementing targeted interventions can then help students better understand and master course content, encourage active participation, and significantly enhance academic performance. However, the results indicate that learning analytics-based interventions have a relatively small effect on the enhancement of cognitive skills, which is consistent with the findings of Ye and Zhou (2022). Cognitive skill development is a complex process that involves the active construction of new behaviors affecting learning, and students are unlikely to improve their cognitive skills through short-term courses (Nussbaumer et al., 2015). Learning analytics-based interventions therefore have limitations in enhancing students’ cognitive skills, and future research should extend the intervention period to enhance these skills effectively. Additionally, while the meta-analysis confirms that learning analytics-based interventions positively promote social emotion, the effect does not reach the anticipated level, which is consistent with the results of Arguedas et al. (2016) and Y. Lee and Specht (2023).
Compared to traditional face-to-face learning, students are more likely to experience negative emotional states (e.g., social anxiety) in online learning settings (Ifenthaler et al., 2023). Negative emotions can adversely affect students’ learning (Rienties, Boroowa, et al., 2016). For example, improper intervention measures can distract students, disrupt their learning, cause anxiety, reduce engagement, and hinder the learning process (Bannert & Reimann, 2012; Clarebout et al., 2013). Given the complexity and diversity of human emotions, automated tools face challenges in accurately capturing and interpreting all emotional states, especially subtle and complex ones (Ekman, 1999). This challenge is particularly evident when students’ emotional states change dynamically and real-time data and feedback are lacking, making it difficult for learning analytics tools to identify and respond to students’ emotional changes in a timely and accurate manner (Baker & Inventado, 2014; Davies et al., 2017; Kort et al., 2001; Rienties, Boroowa, et al., 2016; B. T. M. Wong & Li, 2020). For example, if learning analytics fail to recognize students’ negative emotions at critical moments of rapid emotional change, they may not provide timely support, leading to increased frustration and disengagement (Mayer, 2019; Reis et al., 2018). Learning analytics-based interventions thus have limitations in enhancing students’ social-emotional outcomes. To enhance these outcomes, future research should employ multi-modal data for affective computing to discern learners’ implicit emotional states, thereby identifying and responding to learners’ negative emotions in a timely manner. Such interventions can help maintain a positive emotional state in students, which can, in turn, improve learning outcomes by addressing the emotional aspects of the learning process more directly.
The Moderating Impact of Variables on Learning Analytics Interventions
A subgroup analysis of potential moderating variables in the 34 experimental studies was conducted to further investigate the key factors in learning analytics-based interventions that affect the learning effect. The analysis revealed that learning analytics-based interventions can improve learning outcomes, moderated by variables such as subject area, learning stage, intervention duration, learning environment, intervention type, and diagnostic assessment tool.
Regarding subject areas, intervention effects differed significantly between subjects. Among them, the impact of interventions in natural science fields (such as mathematics, physics, and science) is the most significant, consistent with the findings of other researchers. For example, learning analytics-based interventions can make invisible processes visible; through visual representations, they help learners establish connections between the micro and macro levels, deepening students’ understanding of the micro level and thereby promoting learning in the natural science disciplines (Zhang & Linn, 2011).
From the perspective of the learning stage, the effectiveness of interventions varies significantly across stages. Among them, interventions had the largest overall effect on elementary school students. This finding is consistent with Mohd Syah et al.’s (2016) conclusion that early intervention should begin at school enrollment for children at academic risk, in order to address issues hindering the enhancement of cognitive ability. Such early intervention is crucial for preventing long-term cognitive deficiencies and psychological impacts, thereby promoting improved learning outcomes and explaining why this stage shows the most significant intervention effect.
Different intervention durations all positively affect the improvement of learning outcomes, but the between-group differences are not statistically significant. This is consistent with Zheng et al.’s (2023) finding that intervention duration cannot be considered a decisive factor affecting learning outcomes. Notably, the design of intervention measures should nonetheless consider a sufficiently long intervention period to allow students enough time to master cognitive skills (Zimmerman, 2000). In teaching practice, however, interventions are usually conducted over the short term (Heikkinen et al., 2023).
Regarding the learning environment, the effectiveness of interventions varies significantly across environments. Among them, game-based learning tools yielded the largest overall effect. For example, game-based learning environments aim to enhance student interest in learning through real-time adaptive support while improving performance or knowledge in specific subjects (Emerson et al., 2023; Kuo, 2007; Troussas et al., 2020), and they can provide students with an engaging and effective learning experience (Emerson et al., 2023). This likely explains why their impact on intervention effectiveness is the most significant.
From the perspective of intervention type, different intervention types showed significant variation in effectiveness. Among them, personalizing course materials yielded the greatest combined effect size. For example, Kew and Tasir (2017) conducted a systematic review of 13 relevant intervention studies and found that the most commonly used intervention measure is assisting students at academic risk by providing personalized materials. Furthermore, providing relevant information to students at academic risk can offer them additional assistance based on their learning needs, thereby increasing their likelihood of learning success and improving academic performance (Corbi & Burgos, 2014; Kew & Tasir, 2017; Van Barneveld et al., 2012). Therefore, personalization of course materials has the most significant impact on intervention effectiveness.
Concerning diagnostic assessment tools, the effectiveness of interventions varies significantly across tools. Among them, standardized tests yielded the largest overall effect. Researchers unanimously consider standardized testing the most effective and reliable diagnostic assessment tool (Xu et al., 2023a, 2023b). However, only 23.68% of the included studies utilized this type of tool, while 76.32% used a self-compiled test or scale to assess intervention effects. This result indicates that the effects of learning analytics-based interventions are not always assessed with standardized measurement tools. Lonn et al. (2014) argued that, in the current context, it is difficult to integrate scale-based measurement tools and data sources (e.g., grades) with learning analytics-based interventions. Therefore, future research should focus on developing learning analytics techniques that effectively combine measurement tools with data sources such as academic performance, and on enriching the available measurement tools to meet learners’ needs.
The Practical Implications of Learning Analytics-Based Interventions
The meta-analysis results show that learning analytics-based interventions can effectively improve student learning outcomes. These findings also carry implications for teaching practice.
First, although teachers can strengthen teacher–student relationships, reflect on their teaching performance, and adapt their curriculum content by employing learning analytics-based interventions (Rienties, Boroowa, et al., 2016), they should not ignore the issue of time costs, especially with large class sizes (Corrin et al., 2016; B. T. M. Wong & Li, 2020). When teachers have insufficient time and resources, they may be constrained from taking proactive intervention measures with students. To maximize the benefits of learning analytics, teachers should prioritize tasks, utilize automation, manage their time effectively, and ensure adequate resource allocation to address the time costs associated with learning analytics-based interventions.
Second, researchers implement learning analytics-based interventions as a predictive tool to identify students at risk of dropping out, increasing those students’ likelihood of continuing in the program (Arroway et al., 2016; Li et al., 2018). However, researchers encounter technical challenges in obtaining certain data types for learning analytics practices, such as students’ perceptions, emotions, and teachers’ observations (Gaševic et al., 2017; B. T. M. Wong & Li, 2020). To overcome these challenges, interdisciplinary collaboration with data scientists, psychologists, and computer scientists is essential for developing innovative data collection and analysis methods. Such cooperation can enhance the effectiveness of data utilization, improving the quality and impact of learning analytics-based interventions.
Finally, school administrators implement learning analytics-based interventions to improve student attendance, strengthen teacher–student interactions, and boost student retention rates (Li et al., 2018). Nevertheless, they face challenges in evaluating the effectiveness of these interventions (Li et al., 2018; B. T. M. Wong & Li, 2020). Therefore, it is essential to establish an evidence-based framework for assessing learning analytics-based interventions, enabling school administrators to manage, evaluate, and determine which types of interventions are effective under specific conditions.
Limitations and Directions for Future Research
This meta-analysis has limitations, and future studies could improve on the following three aspects. Firstly, due to language constraints, this study only coded literature written in English and Chinese on the effects of learning analytics-based interventions on learning outcomes, excluding literature in other languages. This may limit the generalizability of the findings. Future research could compare studies from different linguistic and cultural backgrounds to gain a more comprehensive understanding. Secondly, although empirical studies on the impact of learning analytics-based interventions on learning outcomes remain scarce, this research indicates that such interventions have a positive effect on cognitive skills and social emotions, even though these effects did not reach the expected higher levels. Future research should concentrate on investigating how learning analytics-based interventions can measure and enhance students’ cognitive skills and social emotions. Lastly, this study examined the moderating variables of intervention duration and intervention measures without analyzing other potential boundary conditions. Future research can further explore moderating variables such as intervention targets, intervention scale, and challenges encountered during the intervention process.
Validity Threats
Validity threats include internal, external, construct, and conclusion validity (Shadish et al., 2002). In conducting this meta-analysis, we minimized these risks by adhering to the mitigation actions recommended by Cooper (2015). Inappropriate or incomplete search terms can undermine construct and internal validity. Because we used subject terms in the literature search, we may have overlooked studies that are actually related to learning analytics interventions and learning outcomes but did not use these keywords. We therefore focused not only on keywords but also paid close attention to content to ensure that no relevant research was missed. Studies with statistically significant results are more likely to be published than those with non-significant findings, which can bias the estimated overall effect size in a meta-analysis, leading to an overestimation of the true treatment effect and threatening external validity. In this study, we conducted a publication bias test on the included studies to identify research that might affect the reliability of the results, which helps reduce the threat to external validity caused by varying study quality. To minimize the risk to conclusion validity, we discarded duplicate studies. Meanwhile, to reach a consensus that minimizes subjective bias and reduces threats to both internal and conclusion validity, the authors engaged in extensive discussions.
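Publication bias checks of the kind mentioned above are commonly operationalized with Egger's regression test, which regresses each study's standardized effect on its precision; an intercept far from zero signals funnel-plot asymmetry. The sketch below is purely illustrative: the article does not specify which test was used, and the effect sizes and standard errors are invented.

```python
import math

def eggers_test(effects, ses):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses standardized effects (y/se) on precisions (1/se) by
    ordinary least squares; an intercept far from zero suggests
    publication bias. Returns (intercept, t statistic, n-2 df implied).
    """
    z = [y / s for y, s in zip(effects, ses)]   # standardized effects
    x = [1.0 / s for s in ses]                  # precisions
    n = len(z)
    mx, mz = sum(x) / n, sum(z) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxz = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z))
    slope = sxz / sxx
    intercept = mz - slope * mx
    resid = [zi - (intercept + slope * xi) for zi, xi in zip(z, x)]
    s2 = sum(r ** 2 for r in resid) / (n - 2)   # residual variance
    se_int = math.sqrt(s2 * (1.0 / n + mx ** 2 / sxx))
    t = intercept / se_int if se_int > 0 else float("inf")
    return intercept, t

# Invented per-study effect sizes and standard errors for illustration
b0, t_stat = eggers_test([0.45, 0.52, 0.38, 0.60, 0.41],
                         [0.10, 0.15, 0.12, 0.20, 0.11])
```

The resulting t statistic is compared against a t-distribution with n-2 degrees of freedom; a significant intercept would prompt sensitivity analyses such as trim-and-fill.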
Conclusion
The primary goal of this meta-analysis was to address how interventions effectively enhance student learning outcomes, grounded in the analysis of quantified data. The results showed that learning analytics-based interventions can effectively enhance students’ learning outcomes, with an overall effect size at a moderate level. In terms of specific dimensions of learning outcomes, interventions significantly improved knowledge acquisition, while the improvements in social emotions and cognitive skills were relatively small. The results also indicated that five moderating variables across the 34 experimental studies significantly and positively shaped intervention effects: subject area, learning stage, learning environment, intervention type, and diagnostic assessment tool are all important moderating factors influencing the impact of interventions on student learning outcomes. Therefore, stakeholders implementing learning analytics-based interventions are advised to attend to these factors. Furthermore, the present study recommends that future empirical research pay closer attention to cognitive skills and social emotions in order to enhance students’ overall learning outcomes. Additionally, stakeholders designing learning analytics-based interventions should consider the long-term nature of the intervention period to give students sufficient time to develop cognitive skills.
