Abstract
Keywords
Introduction
Currently, autogenous bone grafting remains the gold standard for bone defect treatment, encompassing transplantation of cancellous bone, cortical bone, and bone marrow.1 However, autograft harvesting requires an additional surgical intervention, which inevitably prolongs operative duration, extends hospital stays, and increases the risk of donor-site complications, including infection, hemorrhage, and hematoma formation. More critically, the bone volume available for autologous transplantation is inherently limited, particularly its most biologically active components (cancellous bone rich in stem cells and bone marrow growth factors), and becomes markedly scarce in cases requiring substantial graft volume, exacerbating the challenge of insufficient graft availability.2,3 Allogeneic bone has become the most widely used alternative grafting material in clinical practice owing to its accessibility and convenience. However, as a natural bone substitute, allografts have inherent limitations, including restricted supply sources and a lack of osteoinductive potential. Furthermore, these materials demonstrate rapid post-implantation absorption and carry potential risks of blood-borne disease transmission and immune-mediated interference with bone healing.4
Advances in bioengineering have enabled the development of structured calcium phosphate bone grafting materials, specifically biological artificial material (BAM)-induced artificial bone. As a synthetic graft material containing no exogenous growth factors or living cells, BAM-induced artificial bone not only avoids immune rejection and infection transmission but is also readily accessible and convenient to use; owing to its unique three-dimensional porous structure, it additionally offers osteoinductive bone formation and is gradually becoming a new choice among clinical bone grafting materials.5 However, clinical evidence regarding its efficacy remains limited.
Accordingly, this study utilized digital radiography (DR) imaging to evaluate the osteogenic efficacy of BAM-induced artificial bone versus conventional allogeneic bone grafts in patients with tibial infected bone defects, both pre- and post-operatively. Through radiographic assessments combined with ImageJ analysis, we explored the differential performance between this emerging bone substitute and traditional allografts and investigated potential mechanisms underlying these disparities. These findings aim to provide evidence-based references for optimizing bone graft selection in clinical practice.
Methods
Participants
From January 2021 to December 2023, 56 subjects with tibial osteomyelitis who underwent first-stage treatment at the Department of Orthopaedics, Sichuan Orthopedic Hospital, were included in this study. These subjects required bone grafting during the second stage of treatment due to tibial infectious bone defects. Following clinical guidelines, eligible subjects who met the diagnostic and inclusion criteria were enrolled sequentially based on their admission time. Preoperative consultations were conducted, and subjects were grouped based on their choice of grafting material. The subjects were divided into an observation group (BAM-induced artificial bone mixed with autologous bone, 28 cases) and a control group (allogeneic bone mixed with autologous bone, 28 cases). Inclusion criteria were as follows: (1) age >18 years; (2) a clear diagnosis of tibial osteomyelitis and tibial infectious bone defects; (3) tibial bone defect length ≤5 cm; (4) clinical manifestations and laboratory results indicating controlled infection; (5) good condition of the calf soft tissues with exclusion of neurovascular injury; (6) subjects provided informed consent for the treatment and study protocol, signed the informed consent form, and had complete follow-up data. Exclusion criteria were as follows: (1) rheumatic diseases, connective tissue disorders, pregnancy or lactation, coagulation disorders, or severe dysfunction of the heart, brain, liver, or kidneys; (2) poor mental condition, unable to cooperate with the corresponding treatment process. The study was approved by the Ethics Committee of Sichuan Orthopedic Hospital (Ethics Approval Number: KY2022-011-01) and conducted in compliance with GCP guidelines and the requirements of the Declaration of Helsinki.
Sample size and randomization
Currently, there are no explicit requirements regarding sample size in clinical studies on surgical treatment for infectious bone defects. Based on similar published studies,6–8 a sample size of 20 subjects per group is sufficient to achieve relatively stable research outcomes. To allow for data unavailability due to loss to follow-up or dropout, the final sample size was inflated to accommodate a 20% dropout rate, resulting in 28 subjects per group and a total of 56 subjects across both groups.
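As a quick arithmetic sketch (illustrative only, not part of the original protocol), the enrollment figure can be checked against the attrition assumption: with 28 subjects enrolled per group and 20% dropout, the expected number of completers still meets the 20-per-group target drawn from the comparable studies.

```python
import math

TARGET_COMPLETERS = 20   # per group, from comparable published studies
ENROLLED = 28            # per group, as enrolled in this study
DROPOUT_RATE = 0.20

# Worst-case completers if exactly 20% of enrolled subjects drop out
completers = math.floor(ENROLLED * (1 - DROPOUT_RATE))
print(f"expected completers per group: {completers}")  # 22
assert completers >= TARGET_COMPLETERS
```

Under this assumption, at least 22 subjects per group would complete the trial, above the 20-subject threshold.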
Preoperative consultations were conducted, and subjects were assigned to the observation group or the control group based on their choice of grafting material. To mitigate selection bias related to cost or efficacy expectations, preoperative counseling strictly avoided emphasizing price or efficacy differences between graft materials; only objective distinctions in material characteristics were communicated. Patients were informed that price variations were minimal, with any remaining cost differences covered by research funding. Given the nature of the interventions in this study, double-blinding of operators and subjects was not feasible. The study therefore adhered to the principle of three-way separation among the trial investigators, surgical operators, and statisticians. Measures such as separating patient wards and staggering consultation and treatment times were implemented to minimize communication between subjects and reduce bias as much as possible. Both intervention groups underwent surgical treatment under generally consistent surgical protocols, differing only in the type of grafting material used: the observation group received BAM-induced artificial bone mixed with autologous bone, while the control group received allogeneic bone mixed with autologous bone. Apart from the grafting material, all preoperative and postoperative standard treatments were identical for both groups.
Materials
(1) Allogeneic bone: Aorui Biomaterials Co., Ltd (Shanxi, China); surface-demineralized cancellous bone strips. Specifications: (20–60) mm length × (2–8) mm width × (2–8) mm height; volume: 5 ± 0.2 cm³ per unit. (2) Bone-induced calcium phosphate biomaterial (BAM-induced artificial bone): Bayamon Bioactive Materials Co., Ltd (Sichuan, China); product model: TH/P (osteoinductive) 2040; unit specification: 3 g (sterile).
Interventions
Preoperative Treatment
Routine preoperative examinations were completed upon admission. The baseline characteristics and medical history of the subjects were recorded, and the extent of bone defects was assessed. Clinical laboratory tests were conducted to confirm that infection and inflammatory markers had returned to normal. The treatment and study protocols were thoroughly explained to the subjects, and informed consent for both surgery and the study was obtained.
Intraoperative treatment
All surgical procedures for this study were performed by the same surgical team from the same department, with the lead surgeon being a senior physician. First-stage surgery: (1) The subject was placed in a supine position, followed by routine disinfection and draping; debridement was then performed under general anesthesia via endotracheal intubation. (2) Inflammatory granulation tissue was removed using rongeurs and curettes soaked in normal saline, and pathological bone tissue was removed using a high-speed drill until healthy bone tissue was exposed. (3) The debrided pathological tissue was collected for pathological biopsy and bacterial culture testing, and the wound was irrigated with hydrogen peroxide. (4) Bone cement mixed with an appropriate amount of vancomycin was prepared, molded, and used to fill the bone defect. Second-stage surgery: Based on the bacterial testing results after the first-stage surgery, antibiotic therapy was administered. After 4–6 weeks, infection and inflammatory markers were re-evaluated, and surgery was initiated once normal levels were confirmed. The subject was placed in a supine position, followed by routine disinfection and draping. During the procedure, the lesion was re-cleaned. The membrane induced by the bone cement was longitudinally incised, the bone cement was completely removed, and the status of the subject's induced membrane was assessed. Autologous iliac bone was harvested, shaped into bone granules or strips, mixed with grafting material, and measured using a 10 mL syringe. Approximately two-thirds of the volume consisted of autologous bone and one-third of allogeneic bone or other grafting material. To highlight the radiographic changes in the grafting material, approximately half of the autologous bone was placed in the innermost layer of the defect to cover the induced membrane.
In the middle region, bone graft material granules (BAM-induced artificial bone or allogeneic bone) were placed (Figure 1(c)). Finally, the outermost layer was smoothed with autologous bone to align with the cortical bone edge (Figure 1(d)). Biological protein sponges were used to cover the grafting area to prevent graft material leakage. The fascial tissue flap was closed over the graft area, and a high-pressure suction drainage bottle was inserted.
Figure 1. Surgical procedures. (a) Autologous bone. (b) Induction membrane. (c) Graft material placement in middle layer. (d) Autologous bone levelling at outermost layer.

Postoperative treatment
Effective antibiotics used preoperatively were continued for anti-infection therapy within 14 days after surgery. Drainage tubes were removed 24-72 hours postoperatively, based on drainage volume (<30 mL/24h). Symptomatic treatments, including anticoagulation, anti-swelling therapy, and fluid supplementation, were initiated on the day of surgery. Active functional training of the hip, knee, and ankle joints, as well as muscle contraction exercises, were encouraged 24 hours after surgery.
DR imaging data collection and analysis
DR images were acquired with a Shimadzu SONIALVISION C200 system. To ensure the reliability of the DR data, all images were acquired by the same professional technician in the imaging department. At each acquisition, study team members worked with the imaging team to adjust the X-ray tube position to maintain the same height and to image the subject in a neutral position whenever possible. Routine examinations and imaging were conducted postoperatively (2–5 days), at 3 months, and at 6 months, with results recorded accordingly.
The DR imaging data of each subject, including preoperative, postoperative, 3-month postoperative, and 6-month postoperative images, were imported into the ImageJ software. The image type was set to “8-bit” under the “Image” > “Type” toolbar. The grafting region images were selected, and the threshold was adjusted under the “Image” > “Adjust” > “Threshold” menu to encompass all relevant areas. Low-density areas were delineated, and the area fraction was calculated using “Analyze” > “Measure.” Each measurement was performed three times, and the average value was used. DR grayscale values from images taken at different time points for the same subject were proportionally corrected using the reference grayscale value of a standardized region.
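The thresholding and area-fraction steps above can be sketched in code. The following is a minimal illustrative equivalent in Python/NumPy (the study itself used ImageJ; the function names, toy image, and threshold value here are assumptions for demonstration): low-density pixels inside the graft region of interest are counted against the total ROI area, and a proportional correction rescales gray values between timepoints using a reference region.

```python
import numpy as np

def area_fraction(image_8bit, roi_mask, threshold):
    """Fraction of low-density pixels (gray value <= threshold) inside the ROI.

    Mirrors ImageJ's Image > Adjust > Threshold followed by
    Analyze > Measure (area fraction) on an 8-bit image.
    """
    roi = image_8bit[roi_mask]
    return float(np.count_nonzero(roi <= threshold)) / roi.size

def correct_gray(image, ref_measured, ref_standard):
    """Proportionally rescale gray values so the reference region matches
    its standardized value, making different timepoints comparable."""
    return image * (ref_standard / ref_measured)

# Toy 4x4 "graft region" with a full-frame ROI and a threshold of 100
img = np.array([[ 50, 120, 200,  90],
                [ 80, 210, 130,  60],
                [140,  70, 220, 110],
                [ 95, 180,  40, 160]], dtype=np.uint8)
mask = np.ones_like(img, dtype=bool)
frac = area_fraction(img, mask, threshold=100)
print(f"void (low-density) area fraction: {frac:.2%}")  # 43.75%
```

In practice, each measurement would be repeated three times and averaged, as described above.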
Outcome measures
(1) General Information: demographic characteristics such as gender, age, height, weight, etc. and disease-related information such as disease duration, current medical history, past medical history, medication history, and surgical history. (2) Void Ratio: The void ratio is defined as the ratio of the low-density grafted area to the total grafted area. (3) Absorption Rate: The absorption rate reflects changes in the void ratio over time. The 3-month postoperative absorption rate is calculated as the difference between the void ratio at 3 months postoperatively and the immediate postoperative void ratio. Similarly, the 6-month postoperative absorption rate is calculated as the difference between the void ratio at 6 months postoperatively and the immediate postoperative void ratio.
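The void ratio and absorption rate definitions above reduce to simple arithmetic; the sketch below (with hypothetical numbers, not study data) makes the relationship explicit.

```python
def void_ratio(low_density_area, total_graft_area):
    """Void ratio: low-density grafted area / total grafted area."""
    return low_density_area / total_graft_area

def absorption_rate(void_ratio_followup, void_ratio_postop):
    """Absorption rate at a follow-up timepoint: change in void ratio
    relative to the immediate postoperative value."""
    return void_ratio_followup - void_ratio_postop

# Illustrative (hypothetical) areas in cm^2:
v0 = void_ratio(1.8, 10.0)   # immediately postoperative -> 0.18
v3 = void_ratio(2.3, 10.0)   # 3 months postoperative    -> 0.23
print(f"3-month absorption rate: {absorption_rate(v3, v0):.2%}")  # 5.00%
```

A positive absorption rate thus indicates net enlargement of the low-density (resorbed) area relative to the immediate postoperative state.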
Statistical analysis
Statistical analyses were performed using SPSS 22.0 software. Data that followed a normal distribution were analyzed using the
Results
A total of 56 subjects were included in this study, of which 6 subjects were excluded from the statistical analysis due to incomplete trial participation. In the observation group, 2 subjects dropped out (both lost to follow-up), while in the control group, 4 subjects dropped out (2 lost to follow-up and 2 excluded due to recurrent infection). A total of 50 subjects completed the trial and were included in the statistical analysis, resulting in a dropout rate of 10.7%.
Comparison of baseline conditions between the two groups postoperatively
Comparison of study participants’ general condition and postoperative baseline condition.
Comparison of void ratios and absorption rates between the two groups at 3 and 6 Months postoperatively
Comparison of void ratios and absorption rates between the two groups at 3 and 6 months postoperatively.
Optimize reconstruction of the weight-bearing zone
Postoperative imaging revealed that the observation group demonstrated abundant osseous support within the weight-bearing zone, whereas the control group exhibited a higher proportion of low-density cavitary lesions. These findings suggest that BAM-induced artificial bone holds a distinct advantage in osseous reconstruction of the weight-bearing zone, forming a more robust osseous support structure. This optimization enhances the mechanical properties of the weight-bearing zone and facilitates the restoration of postoperative weight-bearing function (Figure 2).
Figure 2. Comparison of radiographic analysis between the two groups. (a) BAM postoperative. (b) BAM 3 months postoperative. (c) BAM 6 months postoperative. (d) Allogeneic bone postoperative. (e) Allogeneic bone 3 months postoperative. (f) Allogeneic bone 6 months postoperative.
Reduce cavitation formation
Radiographic assessment revealed comparable immediate postoperative void ratios between the two graft materials, with no statistically significant difference (observation group: 17.09% ± 3.84 vs control group: 18.12% ± 2.63).
Void ratios and absorption rates in the allogeneic bone group. (a) Allogeneic bone postoperative. (b) Allogeneic bone postoperative void area. (c) 3 months postoperative. (d) 3-month postoperative void area. (e) 6 months postoperative. (f) 6-month postoperative void area.
Void ratios and absorption rates in the BAM group. (a) BAM postoperative. (b) BAM postoperative void area. (c) 3 months postoperative. (d) 3-month postoperative void area. (e) 6 months postoperative. (f) 6-month postoperative void area.

Discussion
As a novel clinical bone graft material, BAM-induced artificial bone demonstrates unique regenerative properties. Its compositional and structural similarity to natural bone mineral enables in vivo enrichment and adsorption of endogenous growth factors, thereby effectively inducing and promoting osteogenesis while serving as a scaffold with both osteoconductive and osteoinductive capabilities. Wang C et al.9 investigated the combination of devitalized autologous bone flaps with BAM-induced artificial bone for repairing rat calvarial defects. The study revealed that both the BAM-induced artificial bone group and the BAM with devitalized autologous bone flap group exhibited significantly higher bone mineral density than the devitalized autologous bone flap group and the non-treatment group. These findings suggest that BAM-induced artificial bone effectively induces vascular and fibrous tissue regeneration at defect sites, promoting osteogenic differentiation and demonstrating remarkable bone induction capacity. Furthermore, clinical studies on comminuted tibial fractures and avascular necrosis of the femoral head have consistently documented favorable bone repair outcomes and clinical safety profiles with BAM-induced artificial bone applications.10–12 These results demonstrate the material's advantages in de novo bone formation and neovascularization.
BAM-induced artificial bone, chemically synthesized from calcium-phosphate compounds to achieve a biomimetic porous architecture, fulfills the tripartite requirements of specific material composition, three-dimensional porosity, and surface micro-nano structures.13 Nevertheless, its inherent mechanical limitations, particularly limited compressive strength and elevated brittleness, predispose the porous architecture to structural collapse under tight compaction or excessive pressure,14,15 thereby compromising osteoinductive functionality. During clinical application, our research team observed that BAM-induced artificial bone demands greater stability within the grafted area. When employed as a standalone graft material, BAM-induced artificial bone is best suited to contained defects such as cavitary bone defects, where minimal load-bearing demand is placed on the graft material. Unlike allogeneic bone grafts (which can be compacted into particulate form for dense packing), BAM-induced artificial bone requires both rigid fixation and primary structural support from autologous bone through external contouring. The material achieves a significant reduction in site-specific bone resorption only after osteoprogenitor cells progressively infiltrate its porous and micro-nano structures through osteoinductive mechanisms, ultimately establishing a stable bone scaffold.
In this study, some subjects underwent total osteotomy followed by bone grafting enclosed solely by an induced membrane. Radiographic evaluation revealed weaker peripheral osteogenesis and irregular bone formation patterns in these cases. Similarly, graft consolidation in non-cortical regions demonstrated inferior outcomes compared to cortical-supported areas. Notably, even under these suboptimal conditions, BAM-induced artificial bone exhibited significantly less bone resorption than allogeneic bone grafts. Comparative radiographic analysis (Figure 2) demonstrated distinct patterns: allograft sites showed extensive irregular callus bridging at graft margins and persistent segmental bone defects, with substantial resorption of bone tissue originally observed outside weight-bearing remodeling zones. In contrast, BAM-induced artificial bone maintained well-defined high-density callus signals at graft peripheries and retained abundant callus formation beyond weight-bearing regions. Experimental observations further demonstrated significantly reduced bone resorption in the grafted area of the BAM group. This phenomenon may be primarily attributed to the material's low immunogenicity,16 enhanced local osteoblast activation, calcium-phosphate precipitation, and optimized scaffold functionality.17–20 Typically, bone defect sites exhibit excessive adipocyte infiltration and fibrous connective tissue proliferation, biological processes that physically impede osteoblast migration and substantially compromise neo-ossification. The BAM scaffold effectively counteracts this through dual mechanisms: suppressing soft tissue hyperplasia while enhancing angiogenesis and osteogenic stimulation, thereby significantly reducing osteolytic cavity formation.10
These findings highlight the material-structural superiority of BAM-induced artificial bone, confirming its enhanced scaffold functionality, osteoconductivity, and osteoinductive capacity over allogeneic grafts in osseous reconstruction. Clinically, while BAM grafts pose technical challenges due to their limited load-bearing capacity and strict requirement for meticulous surgical fixation (prohibiting tight compaction), our results suggest their viability for large-volume bone defects. This material shows promise as a supplemental grafting solution when autologous bone reserves are insufficient, potentially replacing allografts in routine clinical applications.
The selection of DR imaging as the observational modality was based on comprehensive considerations. The fundamental principle of DR imaging involves utilizing X-rays to penetrate human tissues and organs, thereby generating differential imaging contrast between anatomical structures.21 This technology has been extensively applied in fracture-related research and clinical practice.22,23 Compared to conventional X-ray imaging, DR offers advantages in rapid image acquisition, enhanced image quality, reduced radiation exposure, and operational flexibility. Compared to CT imaging, it offers lower cost and simpler operational procedures. Furthermore, bone resorption of graft materials manifests as an osteoporosis-like density reduction on radiographic images, with a definitive correlation between imaging density and bone mineral content.24 Through comparative analysis of density variations within identical graft regions across different timepoints, combined with visual mapping via ImageJ software, this approach demonstrates high applicability, convenience, reproducibility, and observability. Therefore, DR imaging was selected as the primary observational method in this study.
However, limitations such as the restricted sample size, short observational duration, and single-modality radiographic evaluation may compromise the accuracy and comprehensiveness of the findings. Thus, future large-scale, long-term, multicenter clinical trials will be conducted to validate the clinical effectiveness of BAM-induced artificial bone. Concurrently, we will investigate the material's load-bearing capacity and moldability to comprehensively evaluate its therapeutic potential as a novel bone graft substitute. These efforts aim to expand the proportion of induced bone grafts in the total transplantation volume, reduce reliance on autologous bone harvesting, and ultimately address bone deficits in defect reconstruction.
Conclusions
BAM-induced artificial bone demonstrated greater advantages in osteogenesis within the grafting area compared to allogeneic bone. Its absorption rate, and consequently the void ratio in later stages, was lower than that of allogeneic bone. Furthermore, BAM-induced artificial bone exhibited superior osteogenic quality, bony support in load-bearing areas, and effective bone callus formation compared to allogeneic bone.
