"Russian Journal of Transplantology and Artificial Organs" (Abbreviated key title of "Vestnik Transplantologii i Iskusstvennykh Organov") is a scientific and practical journal which publishes original articles and reviews covering basic and clinical transplantology, regenerative medicine, development and clinical studies of artificial and bioartificial organs, and cell technology. It is the only official journal of the Russian Transplant Society, an all-Russian public organization of transplantologists. The journal has been published since 1999 and is released quarterly. All research articles published in the journal undergo peer review. The journal's website provides open access to the current issue and to full-text archives.
The editorial board and the editors of the journal are renowned scientists from Russia and other countries, experts in transplantation, regenerative medicine, and related fields.
The Editor-in-Chief is Professor S.V. Gautier, a leading Russian transplantologist and President of the Russian Transplant Society.
The journal is included in the list of leading peer-reviewed scientific publications produced in the Russian Federation and is recommended for publication of primary results of dissertation research (Ph.D. and Sc.D.).
Current issue
EDITORIAL
New Developments in Clinical Transplantology
The loss of an arm (or both arms) not only causes severe physical trauma but also leads to long-term psychological consequences and to occupational, domestic, and social maladjustment, substantially impairing quality of life. This complex problem is not always addressed effectively with modern high-tech bionic prostheses, which, while enabling the performance of many functional tasks, have inherent limitations. In selected cases, allograft transplantation of an upper limb from a deceased donor may be considered. Until recently, this treatment option was unavailable within the Russian healthcare system. This paper presents the key stages in establishing an upper limb transplant program, including its organizational structure, the roles of its individual components, and the outcomes of practical implementation. The efficacy and safety of the treatment method are illustrated through the first two clinical cases. The experience gained and the outcomes of the first clinical cases are intended to demonstrate the practical feasibility of developing a structured, effective upper limb transplant program that can address the growing demand for this type of high-tech medical care.
Clinical Transplantology
Hepatic hemangioma is the most common benign focal lesion of the liver and is usually small and asymptomatic. However, giant hemangiomas may require surgical intervention, while diffuse hepatic hemangiomatosis with multiple giant hemangiomas is exceedingly rare. We report a rare case of diffuse hepatic hemangiomatosis in a 59-year-old patient complicated by portal hypertension, inferior vena cava syndrome, liver failure, and Kasabach–Merritt syndrome, who successfully underwent orthotopic liver transplantation (LT) from a deceased donor. This case demonstrates the malignant course of a relatively benign disease that required LT. Although giant liver hemangiomas are a rare indication for LT, in selected cases where conservative and surgical treatments are ineffective or contraindicated, transplantation may be a safe and effective therapeutic option.
The steady annual increase in the number of liver transplants (LT) in Russia, together with the expansion of the liver recipient population, underscores the need for large-scale studies on the actual clinical practice of prescribing and managing immunosuppressive therapy.
Objective: to analyze the structure of maintenance immunosuppressive therapy (MIT) at various time points after liver transplantation and to assess the evolution of approaches to selecting initial immunosuppressive regimens between 2010 and 2024.
Materials and methods. This single-center, retrospective registry study included data from 568 consecutive LT performed between 2010 and 2024, using grafts from living-related (72%) and deceased (28%) donors. The MIT composition was evaluated at six time points: at hospital discharge and at 1, 3, 5, 7, and 10 years after transplantation.
Results. At hospital discharge and at 1, 5, and 10 years after LT, calcineurin inhibitors (CNIs) were prescribed in 99%, 96%, 94%, and 90% of patients, respectively. The use of glucocorticoids (St) decreased over time, accounting for 51%, 17%, 10%, and 14%, respectively. Proliferation signal inhibitors (mTOR inhibitors) were prescribed in 13%, 17%, 14%, and 14% of cases, while antimetabolites (A/M) were used in 6%, 7%, 7%, and 14% of patients, respectively. At intervals ranging from one to ten years after transplantation, CNI monotherapy was used in approximately 70% of recipients. Combination regimens included CNI + St in 7%, CNI + A/M ± St in 8%, CNI + mTOR ± St in 8%, and mTOR ± St in 7% of patients. At discharge and at 5 years after transplantation, mTOR-based regimens were more commonly prescribed in patients who underwent surgery for liver tumors (48% and 57%, respectively), A/M-based regimens in patients with immune-mediated liver diseases (22% and 18%), and CNI monotherapy in recipients with viral cirrhosis (52% and 89%). Immunosuppression regimens changed with a frequency ranging from 9% (in the 5–7 year interval) to 49% (in the interval from discharge to 1 year). Over the period from 2010 to 2024, the most notable trends in initial immunosuppression included a shift from immediate-release to prolonged-release tacrolimus, as well as increased use of mycophenolates and mTOR inhibitors, rising from 2% and 4% in 2010–2015 to 9% and 21% in 2019–2024, respectively.
Conclusion. The underlying etiology of the disease remains the primary determinant in selecting MIT. The high prevalence of CNI monotherapy from 1 to 10 years post-transplant provides a strong rationale for initiating clinical trials to evaluate the safety and efficacy of minimizing or discontinuing immunosuppressive therapy in liver transplant recipients.
Background. Kidney graft failure due to recurrence of previously undiagnosed multiple myeloma (MM) is a rare event. This report presents a clinical case of kidney transplantation complicated by graft dysfunction one year after surgery caused by recurrence of undiagnosed MM.
Clinical observation. A 75-year-old man with a history of arterial hypertension and diabetes mellitus was admitted to the dialysis unit with symptoms of uremia. He had not previously been followed by a nephrologist. Renal ultrasonography revealed diffuse parenchymal changes. Considering the two diseases, chronic kidney disease (CKD C5) secondary to hypertensive and diabetic nephropathy was diagnosed, and maintenance hemodialysis was initiated. After 11 months, the patient underwent deceased-donor kidney transplantation. One year post-transplantation, graft dysfunction developed, prompting a transplant kidney biopsy. Histopathological examination revealed changes characteristic of MM-associated kidney injury, including light-chain cast nephropathy (LCCN, κ type) combined with light-chain proximal tubulopathy (LCPT, κ type), focal segmental glomerulosclerosis, tubulointerstitial nephritis, and acute tubular necrosis. Further evaluation confirmed MM with Bence–Jones proteinuria, stage III B, and myeloma nephropathy of the renal allograft. The patient was transferred to the hematology department for chemotherapy, resulting in partial hematologic remission. However, renal graft function was not restored, and the patient remained on maintenance hemodialysis.
Conclusion. MM-associated kidney injury is a rare clinical event. In routine clinical practice, thorough pre-transplant evaluation and accurate determination of the etiology of CKD are essential.
Background. Liver transplantation (LT) significantly improves survival in patients with end-stage liver disease; however, it is associated with an increased risk of developing de novo malignancies. Long-term immunosuppression, viral infections, and unhealthy lifestyle choices increase the risk of post-transplant oncological complications.
Objective: to summarize current evidence on the prevalence, risk factors, diagnosis, prevention, and treatment of de novo malignancies in liver transplant recipients.
Materials and methods. This paper presents a literature review, including retrospective and prospective studies, meta-analyses, and clinical guidelines published over the past two decades.
Results. The most common post-transplant malignancies include non-melanoma skin cancer, lymphoproliferative disorders, and solid organ tumors. Major risk factors are prolonged immunosuppression, viral infections, smoking, alcohol use, advanced recipient age, and the underlying liver disease. Current management strategies involve immunosuppression reduction, surgical resection, chemotherapy, and targeted therapy.
Notably, mammalian target of rapamycin (mTOR) inhibitors have demonstrated antitumor efficacy in selected patients, particularly those with Kaposi’s sarcoma and lymphoproliferative disorders.
Conclusion. Given the high oncological risk, stratified screening programs and individualized patient management are necessary after LT. Immunosuppression reduction, lifestyle modification, and early detection of malignancies are key factors in improving long-term outcomes.
We present an overview of the current status and development of renal replacement therapy and kidney transplantation in Kyrgyzstan. The paper outlines the key stages in the evolution of transplant care in Kyrgyzstan and references the regulatory and legal frameworks governing the provision of medical services in this field. In addition, it analyzes transplant activity and the demographic characteristics of kidney donors and recipients.
Background. The technique of redirecting blood flow from the inferior vena cava (IVC) and portal vein during liver transplantation (LT) offers several advantages, the most important of which is the prevention of intraoperative hypoperfusion-related complications.
Materials and methods. A literature search was performed in the Scopus, PubMed, and Russian Science Citation Index (RSCI) databases using the keywords "liver bypass", "liver transplantation with venovenous bypass/shunting", "assisted circulation during liver surgery", "assisted circulation during liver transplantation", and "history of transplantology". Publications from the early 1960s through 2025 were considered. In total, 162 articles from Russian and foreign journals were analyzed, of which 44 met the inclusion criteria and were included in the review.
Results. An analysis of Russian and foreign literature reveals a unified concept regarding the use of venovenous bypass (VVB) systems, as well as the general advantages and limitations of existing techniques. VVB remains a relevant method for maintaining hemodynamic stability and improving postoperative outcomes in liver transplant recipients. Particular attention is given to a novel system and technique for performing VVB that incorporates an oxygenator/heat exchanger and a venous reservoir, allowing the simultaneous use of continuous renal replacement therapy (CRRT). Modern cardiopulmonary bypass techniques are characterized by the creation of optimal conditions for surgical intervention, ensuring a high level of patient safety throughout the procedure.
Heart Transplantation and Assisted Circulation
Background. Coronary artery disease (CAD) is one of the leading causes of graft loss after heart transplantation (HT). Owing to cardiac denervation, myocardial ischemia in transplanted hearts is typically clinically silent, necessitating regular screening of recipients to detect transplant vasculopathy. Routine annual invasive coronary angiography (iCAG), however, is associated with potentially life-threatening complications, prompting the search for safe and equally effective non-invasive diagnostic alternatives. Multislice computed tomography coronary angiography (MSCT-CAG) has been widely and successfully used for many years in CAD diagnosis, with a high class and level of evidence, and has long been an alternative to iCAG. This underscores the relevance of evaluating its applicability in heart transplant recipients.
Objective: to assess the diagnostic effectiveness of MSCT-CAG in detecting cardiac allograft vasculopathy in comparison with iCAG.
Materials and methods. The study included 46 heart transplant recipients (36 men, 78%) aged 29–68 years (mean age 51.1 ± 10.9 years) who underwent HT between 2012 and 2023. The interval from transplantation to CAG ranged from 201 to 4,285 days (mean 1,097 days). All patients underwent scheduled iCAG and MSCT-CAG. Coronary arteries were evaluated using a 16-segment model. Segments that could not be reliably assessed on MSCT-CAG images were excluded from the analysis.
Results. Heart rate during MSCT-CAG ranged from 65 to 105 beats per minute (median 90 bpm) and was not adjusted with medication prior to scanning. Invasive CAG allowed assessment of 690 coronary segments, while 683 segments were of diagnostic quality on MSCT-CAG. According to iCAG, coronary lesions were identified in 25 segments. MSCT-CAG detected lesions in 15 segments, yielded false-positive findings in 14 segments, and failed to identify stenoses detected by invasive CAG in 10 segments. X-ray dose was significantly higher during MSCT-CAG (22.6 mSv) compared with iCAG (10 mSv; p = 0.001). MSCT-CAG also required a larger volume of contrast medium (90 mL vs. 60 mL; p = 0.001). Serum creatinine levels before and after MSCT-CAG were 91.35 ± 18.09 and 95.17 ± 18.53 μmol/L, respectively, while glomerular filtration rate (GFR) values were 86.28 ± 17.79 and 82.96 ± 17.72 mL/min/1.73 m2. Despite the higher contrast load and compromised renal function due to the nephrotoxicity of immunosuppressive therapy, no cases of contrast-induced nephropathy were observed following MSCT-CAG. Comparative analysis demonstrated that MSCT-CAG had a sensitivity of 60%, specificity of 97%, positive predictive value of 52%, and negative predictive value of 98% relative to iCAG.
Conclusion. In the diagnosis of cardiac allograft vasculopathy, MSCT-CAG can be used to rule out coronary artery stenosis, demonstrating high specificity (97%) and negative predictive value (98%). The use of MSCT-CAG for the detection of stenosis/restenosis of the coronary vasculature in transplanted hearts requires further study.
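As a rough check, the reported diagnostic metrics can be reconstructed from the segment counts given in the abstract. This is a sketch under one assumption: the true-negative count is inferred as the remaining assessable MSCT-CAG segments, a figure not stated explicitly in the study.

```python
# Diagnostic metrics for MSCT-CAG vs. iCAG, reconstructed from the segment
# counts reported in the abstract. The true-negative count is an assumption:
# all remaining assessable segments without lesions on either modality.
tp = 15           # lesions detected by both MSCT-CAG and iCAG
fp = 14           # false-positive MSCT-CAG findings
fn = 10           # iCAG-detected stenoses missed by MSCT-CAG
assessable = 683  # segments of diagnostic quality on MSCT-CAG
tn = assessable - tp - fp - fn  # inferred true negatives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)

print(f"sensitivity={sensitivity:.0%} specificity={specificity:.0%} "
      f"PPV={ppv:.0%} NPV={npv:.0%}")
```

The inferred denominator yields a specificity of about 98%, marginally above the reported 97%, which suggests the study used a slightly different segment accounting; sensitivity, PPV, and NPV match the reported values.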
Hypertrophic cardiomyopathy (HCM) is the most common inherited heart disease and is characterized by a variety of manifestations: from morphological features and disease progression to clinical presentation and hemodynamic parameters. Management of HCM should be strictly individualized, considering not only hemodynamic parameters but also anatomical features. Classification based on the presence or absence of left ventricular outflow tract obstruction, as well as the location of interventricular septal hypertrophy (basal, midventricular, and apical), largely determines the optimal management strategy. Drug therapy, surgical myectomy, and alcohol septal ablation are the main treatment options for obstructive HCM, whereas management of the non-obstructive form focuses on symptom control and prevention of complications. Given the risk of sudden cardiac death, timely implantation of an implantable cardioverter-defibrillator in high-risk patients is of paramount importance.
Objective: to evaluate the initial experience of performing simultaneous heart transplant (HT) with coronary artery bypass grafting (CABG) at Almazov National Medical Research Centre in St. Petersburg.
Materials and methods. Outcomes of 196 HT performed between January 1, 2016, and January 1, 2025, were analyzed. Patients were divided into two groups: 16 recipients who underwent combined HT + CABG (Group 1) and 180 recipients who underwent standard HT (Group 2). The groups were compared using the following parameters: duration of surgery, graft ischemic time, duration of cardiopulmonary bypass (CPB), duration of inotropic therapy, length of stay in the intensive care unit (ICU), 30-day mortality, and one-year survival.
Results. Operative time was longer in Group 1, at 312 (286–415); 2020; 1130 min, than in Group 2, at 268 (225–320); 150; 2150 min (values are given as median (IQR); minimum; maximum; p = 0.007). Group 1 also demonstrated a significant increase in graft ischemic time and CPB duration compared with Group 2. Median graft ischemic time in Group 1 was 156 (146–180); 120; 240 min, versus 140 (120–160); 40; 240 min in Group 2 (p = 0.021). CPB duration was 159 (133–180); 101; 214 min in Group 1 and 128 (101–162); 71; 350 min in Group 2 (p = 0.015). Despite the more complex nature of the intervention in Group 1, no statistically significant differences were observed between the groups in the duration of inotropic therapy or length of ICU stay. Although the 30-day mortality rate was higher in Group 1 than in Group 2 (18.7% vs. 7.8%), this difference did not reach statistical significance (p = 0.136). One-year survival rates were comparable between the two groups.
Conclusion. Simultaneous HT and CABG procedures are associated with longer operative time, prolonged graft ischemia, and longer CPB duration compared with standard HT. However, no differences were observed in the duration of inotropic therapy or early postoperative period. Combined HT and CABG procedures are technically feasible, with 30-day mortality and one-year survival rates comparable to those of standard HT. The use of donor hearts with underlying coronary artery pathology may expand donor selection criteria and increase the number of HT procedures performed.
Objective: to evaluate the clinical efficacy of a comprehensive approach to the diagnosis and surgical treatment of driveline infections (DLIs) in patients with long-term mechanical circulatory support (MCS).
Materials and methods. A single-center retrospective observational study was conducted at Shumakov National Medical Research Center of Transplantology and Artificial Organs. The analysis included 56 patients with implanted devices for long-term MCS of the left ventricle: AVK-N (2012–2018), n = 17; Stream Cardio (2022–2025), n = 32; HeartMate 3 (2022–2025), n = 7.
Results. DLIs were identified in 18 of 56 patients (32.1%) with long-term left ventricular MCS. In the group managed without the diagnostic and therapeutic DLI algorithm, infections occurred in 8 of 17 patients (47.1%), whereas in the group managed with the algorithm, DLIs were detected in 10 of 39 patients (25.6%). Most DLI episodes developed within the first six months after device implantation. Implementation of the diagnostic and treatment algorithm resulted in a reduction in relapse frequency in patients with extensive forms of DLI (from 2.25 to 1.17 episodes per patient; p = 0.050) and a significant decrease in the duration of hospitalization for both local and subcutaneous forms (p = 0.022 and p = 0.014, respectively).
Conclusion. The use of the developed algorithm for the diagnosis and treatment of DLIs in patients receiving long-term MCS enabled a systematic approach to patient management, improved the effectiveness of local infection control, reduced hospital length of stay, and decreased the frequency of infectious complications.
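For readers who want to reproduce this kind of 2×2 group comparison, the sketch below implements a two-sided Fisher exact test from scratch on the DLI counts above (8 of 17 patients without the algorithm vs. 10 of 39 with it). The abstract does not report a p-value for this particular comparison, so the computed value is illustrative only.

```python
import math

# From-scratch two-sided Fisher exact test for a 2x2 contingency table.
# Applied here to the DLI counts from the abstract; the resulting p-value
# is illustrative, as the study does not report one for this comparison.
def fisher_exact(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    row1 = a + b
    col1, n = a + c, a + b + c + d

    def table_prob(x):
        # Hypergeometric probability of observing x in the top-left cell.
        return (math.comb(col1, x) * math.comb(n - col1, row1 - x)
                / math.comb(n, row1))

    p_obs = table_prob(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    # Sum probabilities of all tables at least as extreme as the observed one.
    return sum(table_prob(x) for x in range(lo, hi + 1)
               if table_prob(x) <= p_obs + 1e-12)

# DLI vs. no DLI: 8/17 without the algorithm, 10/39 with it.
p = fisher_exact(8, 9, 10, 29)
print(f"p = {p:.3f}")
```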
There is a growing need for tools that enable objective assessment of donor heart quality. One such approach is the use of regression models incorporating donor and recipient risk factors to predict surgical outcomes and potentially expand the donor pool by increasing the number of transplantations.
Objective: to develop a model for estimating the total risk of one-year mortality in recipients using different categories of expanded-criteria donors (ECDs).
Materials and methods. The study included 1,500 recipients who underwent orthotopic heart transplantation (OHT) at Shumakov National Medical Research Center of Transplantology and Artificial Organs over an 11-year period, from January 1, 2011, to December 31, 2021. The cohort comprised 1,281 men (85.4%) and 219 women (14.6%), aged 9 to 78 years (median age 49.0 [38.0–56.0] years). The heart transplants performed (n = 1,500) were divided into two clinical groups: group 1 (main group) comprised recipients who underwent OHT from ECDs (n = 1,060; 70.6%); group 2 (control group) included recipients who underwent OHT from standard-criteria donors (n = 440; 29.4%). Donor heart suitability for transplantation was assessed according to the 2023 criteria of the International Society for Heart and Lung Transplantation (ISHLT).
Results. Donor- and recipient-related indicators were initially evaluated using univariate regression analysis. The final multivariate regression model included five donor-related factors – donor–recipient weight mismatch, donor age, high-dose cardiotonic therapy, coronary stenosis, and prolonged graft ischemia – and four recipient-related factors – total bilirubin >40 μmol/L, creatinine >110 μmol/L, international normalized ratio (INR) >1.4, and pre-transplant peripheral veno-arterial extracorporeal membrane oxygenation (pVA-ECMO). The highest odds ratios were observed for donor age, coronary stenosis, graft ischemia time exceeding 6 hours, and pre-transplant pVA-ECMO support. The predicted one-year mortality rate calculated using regression analysis showed a strong correlation (R = 0.827; p < 0.001) with the observed one-year mortality rate. Long-term survival was also analyzed across risk groups, with the worst outcomes observed in the high-risk group.
Conclusion. The proposed statistical model provides a reliable prognostic accuracy for both early and long-term post-transplant survival. Its application at the stages of donor heart evaluation and donor–recipient matching may facilitate the use of a broader donor pool while enabling an objective assessment of recipient prognosis.
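The structure of such a multivariate model can be sketched as a logistic combination of binary donor and recipient factors. The factor names follow the abstract, but the coefficients and intercept below are hypothetical placeholders, not the published model parameters.

```python
import math

# Hypothetical sketch of a multivariate logistic risk model of the kind
# described above: each factor is binary (present/absent), and the
# coefficients and intercept are illustrative placeholders, NOT the
# published values.
COEFFICIENTS = {
    "weight_mismatch": 0.4,        # donor-recipient weight mismatch
    "donor_age_high": 0.7,
    "high_dose_cardiotonics": 0.5,
    "coronary_stenosis": 0.8,
    "ischemia_over_6h": 0.9,
    "bilirubin_over_40": 0.3,      # total bilirubin > 40 umol/L
    "creatinine_over_110": 0.3,
    "inr_over_1_4": 0.4,
    "pva_ecmo": 1.0,               # pre-transplant pVA-ECMO support
}
INTERCEPT = -2.5  # baseline log-odds, also illustrative

def one_year_mortality_risk(factors):
    """Predicted probability of one-year mortality from binary risk factors."""
    logit = INTERCEPT + sum(
        coef for name, coef in COEFFICIENTS.items() if factors.get(name)
    )
    return 1.0 / (1.0 + math.exp(-logit))

# Standard-criteria donor, low-risk recipient: baseline risk only.
print(f"{one_year_mortality_risk({}):.3f}")
# ECD with coronary stenosis plus a recipient on pVA-ECMO before transplant.
print(f"{one_year_mortality_risk({'coronary_stenosis': True, 'pva_ecmo': True}):.3f}")
```

In a real model the coefficients would come from fitting the regression to outcome data, after which the same sigmoid transformation converts each recipient's summed log-odds into a predicted one-year mortality probability.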
Objective: to identify risk factors for antibody-mediated myocardial rejection (AMR) during the first year following heart transplantation (HT).
Materials and methods. This retrospective study included 162 patients who underwent HT at Almazov National Medical Research Centre between 2010 and 2024. Clinical, laboratory, immunological, and morphological data were analyzed. Sensitization was assessed using Luminex multiplex assays and solid-phase ELISA. AMR was diagnosed based on endomyocardial biopsy with morphological and immunohistochemical evaluation in accordance with the 2013 ISHLT classification. Hemodynamically significant rejection was defined as AMR II–III in combination with graft dysfunction and/or the presence of donor-specific antibodies (DSA). Patients were stratified into three groups: (1) no signs of AMR (control); (2) isolated morphological signs of AMR (AMR II); and (3) hemodynamically significant AMR (AMR III, or AMR II with DSA and/or echocardiographic evidence of graft dysfunction). Discriminant analysis was used to evaluate the factors associated with the development of AMR.
Results. Morphological signs of AMR were identified in 36.2% of patients, including 9.9% with hemodynamically significant rejection requiring specific therapy. Patients with AMR before HT showed significantly higher sensitization to HLA class I and II antigens (MFI >5000; p < 0.05), and following HT, a statistically significant excess of de novo antibodies persisted for 6–12 months compared with the control group (p < 0.05). In addition, AMR was associated with lower tacrolimus levels during the first two weeks after HT (p = 0.002) and reduced antimetabolite doses in the first month post-transplant (p < 0.001). Discriminant analysis of 15 variables yielded a model incorporating 9 significant predictors, enabling statistically reliable differentiation of patients with hemodynamically significant rejection from other groups (F(18,222) = 12.463; p < 0.00001).
Conclusion. High levels of HLA sensitization before transplantation, suboptimal immunosuppression in the early postoperative period, and predictors identified in the discriminant analysis model are associated with an increased risk of developing hemodynamically significant AMR. The proposed model may be used for risk stratification and to individualize post-transplant follow-up.
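For illustration, a two-class Fisher linear discriminant, the core idea behind the discriminant analysis mentioned above, can be implemented from scratch. The two features and all data points below are synthetic toy values, not patient data.

```python
# Minimal from-scratch sketch of a two-class Fisher linear discriminant,
# the kind of analysis used above to separate rejection groups. The toy
# features could stand for, e.g., anti-HLA MFI/1000 and tacrolimus level;
# all numbers are synthetic.
def mean(vs):
    return [sum(col) / len(vs) for col in zip(*vs)]

def within_scatter(vs, mu):
    # 2x2 within-class scatter contribution for one class.
    s = [[0.0, 0.0], [0.0, 0.0]]
    for x in vs:
        d = [x[0] - mu[0], x[1] - mu[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_direction(class0, class1):
    # w = Sw^-1 (mu1 - mu0), inverting the pooled 2x2 scatter directly.
    mu0, mu1 = mean(class0), mean(class1)
    s0, s1 = within_scatter(class0, mu0), within_scatter(class1, mu1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    dm = [mu1[0] - mu0[0], mu1[1] - mu0[1]]
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]

# Toy data: group 0 = no AMR, group 1 = hemodynamically significant AMR.
no_amr = [[1.0, 9.0], [2.0, 8.5], [1.5, 9.5], [2.5, 8.0]]
amr    = [[6.0, 5.0], [7.0, 4.5], [6.5, 5.5], [7.5, 4.0]]
w = fisher_direction(no_amr, amr)
# Projecting onto w yields scores that separate the two groups.
score = lambda x: w[0] * x[0] + w[1] * x[1]
print(max(score(x) for x in no_amr) < min(score(x) for x in amr))  # prints True
```

Published software would additionally report significance statistics (such as the F-statistic cited in the abstract) for the fitted discriminant function.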
Objective: to study the relationship between circulating anti-HLA antibodies and the incidence of adverse events (death and retransplantation) and to evaluate the effectiveness of targeted therapies for graft dysfunction.
Materials and methods. A retrospective study was conducted among heart transplant recipients hospitalized with signs of circulatory failure (graft dysfunction). All patients underwent coronary angiography, endomyocardial biopsy with histological and immunohistochemical examination of six tissue samples, and serological testing for anti-HLA IgG antibodies at baseline and, when applicable, after treatment. Anti-HLA IgG levels in serum were measured using a Luminex device. Follow-up anti-HLA IgG testing was performed only in patients with initially detectable antibodies and after administration of specific therapies for presumed graft rejection.
Results. The study included 362 heart transplant recipients observed at Shumakov National Medical Research Center of Transplantology and Artificial Organs between January 2018 and November 2024. Participants were aged 18–72 years (mean 48.1 ± 1.3 years) with a mean post-transplant follow-up of 1343.6 ± 125.1 days (95% CI 1218.5–1408.6), comprising 69 females and 293 males. Anti-HLA IgG antibodies were detected in 111 recipients (30.7%). Univariate analysis identified significant associations between adverse events and: repeat heart transplantation (p = 0.005), perioperative use of mechanical circulatory support (p < 0.003), age under 46 years at hospitalization (p = 0.023), and anti-HLA II maxMFI above 5000 (p = 0.042). Regression analysis adjusted for anti-HLA II levels showed that only initially elevated anti-HLA II maxMFI levels (>5000) and the persistence of any anti-HLA II levels after etiotropic treatment were associated with the risk of adverse events.
Conclusion. In heart recipients hospitalized with signs of circulatory failure due to graft dysfunction, the presence of anti-HLA II maxMFI titers above 5000 at baseline, as well as residual anti-HLA II titers after etiotropic treatment for antibody-mediated rejection, are independent predictors of adverse events, including retransplantation and death.
Heart transplantation (HT) remains the only definitive surgical treatment for end-stage chronic heart failure (CHF).
Objective: to investigate factors associated with adverse immediate outcomes following HT in children.
Materials and methods. Between January 1, 2012, and December 31, 2024, 91 HTs were performed in recipients under 18 years of age at Shumakov National Medical Research Center of Transplantology and Artificial Organs, Moscow. The patients were divided into two groups based on early postoperative outcomes: survivors (n = 79; 86.8%) and non-survivors (n = 12; 13.2%).
Results. Between 2012 and 2024, a total of 2,190 HTs were performed, including 91 (4.2%) in children. Severe early graft dysfunction occurred in 14 pediatric heart recipients (15.4%), and in-hospital mortality was 13.2%. A high recipient urgency status (UNOS), the use of short-term mechanical circulatory support, and clinical manifestations of multiple organ failure necessitated the expansion of donor heart selection criteria. Receiver operating characteristic (ROC) analysis demonstrated that baseline laboratory parameters influenced transplant outcomes. Serum sodium, lactate, and urea levels, as well as hemoglobin levels, red blood cell and platelet counts, showed statistically significant predictive value, as confirmed by area under the curve (AUC) analysis. Donors in the non-survivor group were significantly older than those in the survivor group. The donor-to-recipient weight ratio was higher among recipients who died in the early post-transplant period. In the non-survivor cohort, significantly higher values were observed for the donor-to-recipient height ratio, donor-to-recipient body surface area ratio, and durations of graft ischemia, anesthesia, surgery, and cardiopulmonary bypass.
Conclusion. The effectiveness of pediatric HT (hospital survival rate 86.8%) is influenced primarily by recipient urgency status (UNOS). Additional contributing factors include severity of multiple organ dysfunction, donor age, significant donor–recipient anthropometric mismatch, operative time, and intraoperative blood loss.
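The AUC figures behind the ROC analysis mentioned above have a simple probabilistic reading: the chance that a randomly chosen non-survivor shows a higher marker value than a randomly chosen survivor. A minimal sketch of that Mann-Whitney formulation, using synthetic lactate values rather than study data:

```python
# Minimal sketch of the AUC statistic behind the ROC analysis above:
# the probability that a randomly chosen non-survivor has a higher marker
# value than a randomly chosen survivor (Mann-Whitney formulation).
# The lactate values below are synthetic toy numbers, not study data.
def auc(positive_scores, negative_scores):
    wins = 0.0
    for p in positive_scores:
        for n in negative_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count as half a win
    return wins / (len(positive_scores) * len(negative_scores))

# Toy baseline lactate (mmol/L): non-survivors vs. survivors.
non_survivors = [4.1, 5.3, 3.8, 6.0]
survivors = [1.2, 2.0, 1.8, 2.5, 3.9]
print(round(auc(non_survivors, survivors), 3))  # prints 0.95
```

An AUC of 0.5 means the marker carries no discriminative information, while values approaching 1.0 indicate strong separation between outcome groups.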
Organ Donation
Introduction. This paper evaluates the key structural characteristics of the turkey cornea and critically analyzes both the obtained results and the existing literature to assess the feasibility of using this xenogeneic material for keratoplasty. Currently, more than 12 million patients worldwide are awaiting keratoplasty. These prolonged waiting times are largely driven by a severe shortage of the cadaveric human corneas required for the procedure. Recent studies have generated growing optimism regarding the potential of xenogeneic keratoplasty; however, the number of animal donors studied to date remains quite limited. In light of this gap, the objective of this study was formulated.
Objective: to assess the main structural characteristics of turkey (Meleagris gallopavo) cornea and to evaluate its potential use as a xenogeneic material for selective keratoplasty.
Materials and methods. Corneoscleral buttons (n = 24) were isolated from enucleated turkey eyeballs. In the first stage, the corneal microstructure was examined using scanning electron microscopy and confocal microscopy. In the second stage, the feasibility of cutting a corneal xenograft suitable for deep anterior lamellar keratoplasty was evaluated. In the third stage, the feasibility of preserving the obtained turkey corneal xenografts was assessed.
Results. The turkey cornea was shown to possess all the principal layers characteristic of the human cornea. Its mean thickness was 508 ± 33.5 μm, and the presence of Bowman’s membrane was confirmed. Preservation of the turkey cornea for up to five days was feasible while maintaining a high endothelial cell density. In addition, the preparation of a xenograft suitable for deep anterior lamellar keratoplasty from turkey corneal tissue was successfully demonstrated.
Conclusion. Analysis using confocal microscopy and scanning electron microscopy, together with assessment of xenograft integrity following hypothermic preservation, indicates that turkey cornea represents a promising xenogeneic material for selective keratoplasty. Further studies are warranted to assess its potential application in reconstructive corneal surgery.
Transplantomics
Epigenetics is the study of changes in gene expression that occur without alterations in the primary DNA sequence. These changes are mediated by chemical modifications of DNA, histones, and non-coding RNAs, collectively forming the epigenome, which determines the functional activity of the genome. Epigenetic mechanisms play a fundamental role in cellular differentiation, organismal development, and adaptation to external conditions. In medicine, they have attracted considerable attention due to their involvement in the pathogenesis of oncological, autoimmune, and neurodegenerative diseases. MicroRNAs (miRNAs), as key components of epigenetic mechanisms, play a critical role in controlling immune responses, including those occurring after organ transplantation. This has opened new opportunities for a personalized approach to the management of transplant recipients. Accumulating evidence on the role of miRNAs in solid organ transplantation suggests that integration of omics technologies may expand the existing arsenal of diagnostic criteria, serving as an auxiliary diagnostic tool for monitoring graft function. This systematic review presents a comprehensive analysis of the current literature on the clinical significance of miRNAs in modern transplantology. It highlights the diagnostic and predictive potential of specific miRNAs in relation to the development of complications in recipients of heart, lung, kidney, and liver transplants, and examines current approaches to the use of miRNAs as therapeutic targets.
Renal ischemia–reperfusion injury (IRI), which develops during organ-preserving kidney surgery and particularly during kidney transplantation (KT), remains a major challenge in urology and transplantology, as it can lead to progression of acute kidney injury and chronic graft dysfunction. Conservative strategies aimed at minimizing oxidative stress are especially important in situations where surgical options are limited. In transplantology, IRI is of particular relevance, as KT is the treatment of choice for patients with end-stage renal disease, significantly improving both quality of life and survival compared with renal replacement therapy. A critical stage of the transplantation procedure involves donor organ ischemia (warm and cold), followed by reperfusion after restoration of blood flow in the recipient. The severity of IRI directly influences graft function and is a key risk factor for delayed graft function and acute rejection [1, 2]. Therefore, the search for effective approaches to prevent and correct IRI is critical to improving kidney transplant outcomes.
Objective: to systematize current knowledge on the potential of conservative methods for correcting renal IRI caused by excessive reactive oxygen species (ROS) during organ-preserving kidney surgery and KT under conditions of warm ischemia.
Methods. A systematic analysis of literature published over the past 10 years was conducted using the PubMed search engine, the Cochrane Library database of evidence-based medicine, and the Scopus unified bibliographic and abstract database of peer-reviewed scientific literature. Particular emphasis was placed on randomized studies evaluating drugs or newly synthesized compounds that suppress ROS formation and restore or enhance the body’s antioxidant capacity.
Conclusion. At the current stage of medical science, considerable attention is focused on substances capable of blocking the molecular mechanisms involved in mitochondrial membrane pore opening, as well as on agents that suppress ROS formation through inhibition of NADPH oxidase and xanthine oxidase. The therapeutic potential of exogenous enzyme preparations (such as superoxide dismutase and catalase), low-molecular-weight catalytic ROS scavengers, and non-enzymatic antioxidants – including supraphysiological doses of ascorbic acid and mitochondria-targeted agents such as mitoquinone and elamipretide – is actively being investigated. In the future, the results of these studies may form the basis for the development of effective antioxidant strategies for the prevention and treatment of renal IRI during organ-preserving kidney surgery and transplantation.
Regenerative Medicine and Cell Technologies
Islet transplantation is a modern method of treating severe type 1 diabetes mellitus (T1D). However, multiple factors can negatively affect both the efficiency of islet isolation and subsequent transplantation outcomes. Consequently, ongoing efforts to optimize isolation techniques have led to the emergence of new unique protocols.
The objective of this study is to evaluate the effectiveness of a modified method for isolating pancreatic islets (islets of Langerhans) from deceased donors for clinical transplantation in recipients with T1D.
Materials and methods. Pancreatic islets were isolated using a modified technique that excluded the use of the Ricordi islet isolator during the enzymatic digestion stage, as well as density gradient centrifugation during the purification stage. Islet identification was performed using dithizone staining. Islet viability was assessed by fluorescent staining with acridine orange and propidium iodide. Functional activity was evaluated by determining the stimulation index using enzyme-linked immunosorbent assay (ELISA). Histological examination of the native pancreas (n = 3) included routine staining, as well as immunohistochemical analysis targeting the main types of islet cells.
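The stimulation index mentioned above can be illustrated with a short sketch. As a hedged illustration only (the abstract does not state the exact assay conditions), the index is commonly computed as the ratio of insulin released under high-glucose stimulation to basal insulin release, both measured by ELISA; the concentrations below are hypothetical:

```python
# Hedged sketch: the stimulation index (SI) is commonly computed as the
# ratio of glucose-stimulated insulin release to basal release, both
# quantified by ELISA. The exact conditions used in the study are not
# given in the abstract; the values below are illustrative only.

def stimulation_index(basal_insulin: float, stimulated_insulin: float) -> float:
    """SI = stimulated / basal insulin concentration (same units, e.g. mU/L)."""
    if basal_insulin <= 0:
        raise ValueError("basal insulin must be positive")
    return stimulated_insulin / basal_insulin

# Hypothetical readings consistent with the reported SI of ~1.41
print(round(stimulation_index(100.0, 141.0), 2))  # 1.41
```

An SI above 1 indicates that β-cells increase insulin secretion in response to glucose, which is the functional property the study assessed.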
Results. Pancreatic islets isolated from the pancreas of deceased donors using the modified technique retained their structural integrity and demonstrated variability in size and shape. The total islet yield was 630,000 ± 30,000, with a viability rate of 90 ± 3% and a stimulation index of 1.41 ± 0.01. These findings indicate the high quality of the isolated biomaterial and confirm the ability of β-cells to respond to changes in glucose content. Morphological assessment of the donor pancreas demonstrated preserved structural integrity of the islet apparatus, supporting the potential for obtaining viable and functionally active islets. The efficiency of islet isolation during enzymatic pancreatic processing was also confirmed.
Conclusion. Optimization of the methodological approach enabled the development and validation of a modified technique for processing the pancreas of deceased donors, resulting in the isolation of a substantial number of viable and functional islets of Langerhans. The characteristics of the obtained islet graft support its suitability for clinical transplantation in T1D patients.
This article addresses strategies for reducing corneal graft rejection and explores alternatives to full-thickness corneal transplantation. The objective of the study is to develop a technique for removing cellular material from human corneal lenticules obtained during refractive surgery for use as a biomaterial in keratoplasty. A decellularization procedure was developed in which lenticules were immersed in a 1% aqueous solution of sodium lauryl sulfate for 24 hours, followed by detergent removal using a buffered 0.9% NaCl solution for 5 days. Decellularization effectiveness was confirmed by the absence of genetic material using Hoechst staining and hematoxylin & eosin staining. Histological analysis of the biomaterial after preservation for various periods demonstrated no signs of biodegradation of the acellular lenticules, which remained sterile for at least 12 weeks. A comparative experimental study in rabbits demonstrated the efficacy and safety of the proposed technique.
Modern medical research identifies type 1 diabetes mellitus (T1D) as one of the most pressing global health challenges, with the incidence steadily increasing by up to 500,000 new cases annually. Current treatment approaches do not provide optimal glycemic control, as they are mainly aimed at compensating for endogenous insulin deficiency, which can significantly compromise patients' quality of life. For a long time, researchers have sought a treatment strategy capable of achieving a rapid, safe, and reliable cure for T1D. Yet, due to the insufficient understanding of the mechanisms driving the autoimmune response underlying this condition, modern medicine has not been able to develop an etiotropic therapy that could ensure complete recovery. A wide range of therapeutic developments targeting different pathogenetic mechanisms of T1D are currently underway, all aimed at supporting optimal glucose metabolism. Their objectives include preventing or slowing disease progression (delaying the onset of the clinical stage) and facilitating glycemic control in affected patients. Despite substantial progress, none of these innovative strategies has yet reached widespread clinical application. The primary obstacles remain the lengthy timelines required to complete full cycles of clinical trials and the need to address limitations revealed during the research process. This review presents the leading modern approaches to T1D therapy, with a focus on insulin therapy, immunotherapy, and cell-based strategies. Clinical trial data are analyzed, highlighting both advantages and limitations from practical and economic perspectives. The approaches discussed represent the most promising avenues and are expected to play a central role in the future treatment of T1D.
Experimental Studies
Objective: to study the hydraulic characteristics of an axial pump under conditions where the external back pressure at the outlet exceeds the maximum pressure generated by the pump, and to determine the corresponding head-capacity curves (HCC) in this operating mode.
Materials and methods. A hydraulic test bench was designed and assembled for the experiments, capable of generating outlet back pressure of up to 200 mmHg. Measurements were performed using the Stream Cardio axial blood pump (Russia). A series of tests was conducted at rotational speeds ranging from 7,500 to 10,000 rpm. Initial excess pressures of 85, 100, and 120 mmHg were applied.
Results. The experimental data obtained on the hydraulic test bench made it possible to determine the complete HCC of the axial pump, including the reverse-flow region corresponding to negative flow rates. At a flow rate of 0 L/min, the pressure drop across the pump was 85 mmHg. At a pressure drop of 100 mmHg, the flow rate reached –2.5 L/min.
Implants and Artificial Organs
Tricuspid valve (TV) replacement for primary regurgitation is a relatively rare procedure. Bioprosthetic valves are generally preferred because they do not require anticoagulant therapy and tend to degenerate more slowly in the tricuspid position than in the mitral or aortic positions; however, their durability remains limited, particularly in young patients. In contrast, mechanical prosthetic valves require rigorous anticoagulant therapy due to low blood flow in the right heart chambers. Prosthesis selection is especially challenging in cases of infective endocarditis with fibrous annular abscess. The aim of this study is to analyze experimental data on cylinder valves and to review the first clinical experiences reported worldwide. The development of new prosthesis models using inert and more durable materials may address current limitations of TV implants, optimize surgical techniques, and improve patient quality of life in the long term.
Objective: to identify characteristic patterns of calcium distribution in explanted bioprosthetic heart valves and evaluate their influence on the biomechanics of the device.
Materials and methods. Thirty-three bioprosthetic mitral valve leaflets explanted due to structural valve degeneration were analyzed. Multislice computed tomography (MSCT) images were used to identify pathological calcification within each leaflet. Calcified regions were segmented from top-view projections using a radiographic density threshold of 130 HU. The resulting dataset was clustered according to the number of pixels representing calcified areas, yielding three distinct classes: no calcification, mild calcification, and severe calcification. For each class, a three-dimensional computational model of the bioprosthesis was constructed. Biomechanical behavior was evaluated numerically in a series of computer simulation experiments using the finite element method. Each model included the supporting frame and three valve leaflets, with physiologically relevant boundary conditions simulating pressures in the left atrium and left ventricle. The analysis assessed maximum principal stress, strain, and their spatial distribution across the prosthesis.
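The segmentation and classification step described above can be sketched in a few lines. This is a hedged, simplified illustration: the 130 HU threshold is from the study, but the pixel-count cutoff used below to separate mild from severe calcification is a hypothetical placeholder, since the study derived its three classes by clustering.

```python
import numpy as np

# Hedged sketch of the segmentation/classification step: threshold a
# top-view CT projection at 130 HU and classify the leaflet by the
# number of calcified pixels. The class cutoff (500 pixels) is a
# hypothetical placeholder; the study obtained classes by clustering.

def classify_leaflet(hu_image: np.ndarray, cutoff: int = 500) -> str:
    calcified = hu_image >= 130        # radiographic density threshold (HU)
    n_pixels = int(calcified.sum())    # size of the calcified region
    if n_pixels == 0:
        return "no calcification"
    return "mild calcification" if n_pixels < cutoff else "severe calcification"

clean = np.full((64, 64), 40)            # soft tissue only (~40 HU)
mild = clean.copy()
mild[:10, :10] = 400                     # small calcified patch (100 pixels)
print(classify_leaflet(clean))           # no calcification
print(classify_leaflet(mild))            # mild calcification
```

Each class then serves as input for building the corresponding three-dimensional finite element model of the bioprosthesis.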
Results. Calcification of one or two valve leaflets resulted in a slight reduction in the average stress and strain values of the intact leaflet – from 0.319 to 0.303 MPa and from 0.134 to 0.130 mm/mm, respectively. Increased calcium content also lowered the peak stress and strain values, from 2.884 to 2.117 MPa and from 0.384 to 0.333 mm/mm. A clear relationship was observed between calcification pattern and local stress concentrations, which exceeded the leaflet's mean stress values by 40–50%. Co-localization of mild or severe calcification clusters on one or two leaflets produced qualitative alterations in the closure mechanism, including "overlap" of mineralized leaflets over adjacent intact ones.
Conclusion. The findings demonstrate a relationship between the stress–strain behavior of bioprosthetic valve leaflets and the spatial pattern of calcification. While an increase in calcium volume up to 28% does not substantially affect mean stress or strain values, it significantly reduces their peak values.
Objective: to evaluate changes in the immunogenicity of epoxy-treated xenogeneic bioprosthetic heart valves (BHVs) during in vivo function by detecting human IgG antibodies in the biomaterial of explanted samples.
Materials and methods. Fourteen BHVs explanted during repeat valve replacement were examined. Of these, three valves were removed at 1, 20, and 42 days after implantation, while 11 valves had functioned for periods ranging from 3 to 25 years. Histological sections were prepared from the valve leaflets and analyzed by immunohistochemistry using antibodies against human immunoglobulin G (IgG), as well as Russell–Movat pentachrome staining. Selected leaflet fragments were additionally examined by scanning electron microscopy.
Results. BHVs removed due to early dysfunction after 1, 20, and 42 days of implantation showed no signs of biomaterial degeneration; however, small thrombi were detected on their leaflet surfaces. In contrast, valves that had functioned for 3 to 25 years exhibited clear features of structural degeneration of the biological tissue, including leaflet tears and extensive calcification. These BHVs were also characterized by moderate macrophage infiltration and a slight increase in pannus on the valve surface. Immunohistochemical staining of histological sections for human IgG revealed intense antibody deposition within the biomaterial of all examined BHVs, irrespective of implantation duration. Positive IgG staining was localized along the fibers of the xenogeneic tissue and was absent in recipient tissues, including thrombi and pannus.
Conclusion. The biological component of BHVs retains its immunogenic properties even after long-term (up to 25 years) function in the recipient’s body.
The centerpiece of this analytical review is the metabolism of hydroxyapatite in its natural, bone, and synthetic forms, in which the mitochondria-mediated mechanism may serve as the leading mechanism. The possibility that osteoblast mitochondria play an important role in the initial stages of bone mineralization is discussed. Furthermore, the paper highlights the key role of mitochondria in the metabolism of synthetic hydroxyapatite. Differences between the results of in vivo and in vitro studies using synthetic hydroxyapatite of different morphologies are also detailed. It is noted that long-term observation of immune cell infiltration and in vivo studies are necessary to adequately evaluate hydroxyapatite as a bone-plastic material. Particular attention is given to the interaction of hydroxyapatite with immune cells and its ability to affect the ribosomes and mitochondria of cells. Due to its mechanical properties, scalability, and potential use for the treatment of extensive bone defects of tumor origin, hydroxyapatite is a promising material. This study also highlights the importance of further development of in vitro research methods in the context of their biomimeticity. Overall, this work offers a theoretical direction for future studies of hydroxyapatite as a bone grafting material and emphasizes the value of in vivo studies.