Clinical Transplantology
Background. Liver cirrhosis occurring before 1 year of age can affect a child's development. Liver transplantation is the only radical treatment for decompensated cirrhosis. In biliary atresia, cirrhosis develops during the first months of life, and the time during which it persists, from palliative Kasai portoenterostomy (PE) to liver transplantation, varies between patients. Developmental abnormalities in children with biliary atresia have been shown to occur both before and after liver transplantation. However, the association between the duration of liver cirrhosis and children's psychomotor development has been underestimated.
Objective: to determine the chances of developmental delay in children depending on the cirrhosis persistence duration.
Materials and methods. The study enrolled 83 children with biliary atresia: 47 children had undergone palliative Kasai PE, and 36 children who received liver transplants had not. Psychomotor development was assessed in all children before liver transplantation and 12 months after it, using the Griffiths psychomotor development scale (translated and adapted by E.S. Keshishian) for children up to 24 months of age. Statistical analysis was performed by calculating odds ratios (OR) with 95% confidence intervals (CI).
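As a hedged illustration of this statistical approach (not the study's data), the sketch below computes an odds ratio and a 95% Wald-type confidence interval from a 2x2 table; the Wald method is an assumption, and all counts are placeholders.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # a, b: delayed / normal development in one group;
    # c, d: the same in the comparison group (hypothetical counts)
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return (or_,
            math.exp(math.log(or_) - z * se_log),      # lower 95% bound
            math.exp(math.log(or_) + z * se_log))      # upper 95% bound

print(odds_ratio_ci(30, 17, 12, 24))  # OR with 95% CI, placeholder data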
Results. Comparative analysis showed that in the subgroup of children who underwent Kasai PE, cirrhosis persisted 2.6 months longer before transplantation than in children without Kasai PE (p = 0.011). The odds of developmental delay during preparation for liver transplantation were 3.3 times higher in the subgroup of children who underwent palliative Kasai PE than in children without the palliative procedure (95% CI: 1.35–8.31). The odds of developmental delay 12 months after liver transplantation were 4.4 times higher in the subgroup of children who underwent palliative Kasai PE than in children without it (95% CI: 1.54–12.5).
Conclusion. Children who underwent liver transplantation after palliative surgical treatment had lower levels of psychomotor development than children without palliative Kasai PE, both before and 12 months after liver transplantation (p = 0.0018 and p = 0.01, respectively).
Cytomegalovirus (CMV) infection is the most severe viral infection arising in the post-transplant period in both adult and pediatric renal transplant recipients. Developing and applying an effective prevention and treatment strategy for pediatric renal graft recipients is a priority. Objective: to compare the effectiveness of the protocols used for CMV infection prophylaxis in pediatric kidney transplant recipients.
Materials and methods. The study enrolled 118 patients who underwent primary kidney transplantation at Shumakov National Medical Research Center of Transplantology and Artificial Organs. Based on a retrospective analysis, all recipients were divided into two groups depending on the prophylactic strategy used after kidney transplantation. The follow-up period for pediatric kidney recipients ranged from 108 to 1803 (623.5 ± 379.5) days. CMV infection activity was monitored by polymerase chain reaction.
Results. The frequency of CMV infection activation episodes at 3 and 6 months was independent of the prophylaxis strategy used. The recurrence rate of CMV infection one year after surgery was significantly lower (p = 0.037) with Strategy 2. No cases of CMV syndrome or CMV disease, graft dysfunction, or chronic rejection associated with CMV infection were reported. The higher doses of antiviral drugs used in Strategy 1 did not increase the risk of cytotoxicity or nephrotoxicity, both of which were reversible (creatinine levels did not differ significantly between the study groups at 3, 6, and 12 months; p = 0.542, p = 0.287, and p = 0.535, respectively). The incidence of kidney graft rejection did not increase in patients receiving lower doses of immunosuppressants under Strategy 2.
Conclusion. Both prophylactic strategies are effective in pediatric kidney recipients. However, the choice of a strategy depends on the individual characteristics of the patient and requires a personalized approach.
Lung cancer remains the leading cause of cancer mortality worldwide. Solid organ transplant recipients are at risk of developing malignant tumors, including lung cancer, due to long-term use of immunosuppressive drugs. Cancer development, including that of lung cancer, has a number of peculiarities in this patient cohort. Moreover, malignant tumors in these patients are difficult to treat and carry a poorer prognosis. This review examines the mechanisms of lung cancer development, screening methods, and treatment in solid organ transplant recipients.
Background. Chronic graft rejection (CR) represents an increasing concern in pediatric liver transplantation (LT). Risk factors for CR in this population are uncertain. In the present study, we aimed to ascertain whether clinical parameters could predict the occurrence of CR in LT children.
Methods. We retrospectively analyzed the results of 47 children who had experienced acute hepatic rejection at Namazee Hospital, Shiraz, Iran, during 2007–2017.
Results. Of the 47 children, 22 (46.8%) were boys and 25 (53.2%) were girls. Ascites, gastrointestinal bleeding, and spontaneous bacterial peritonitis were observed in 20 (44.4%), 14 (31.1%), and 4 (9.1%) cases, respectively. Post-transplant vascular and biliary complications were observed in 3 (7%) and 4 (9.3%) cases, respectively. The mean time from LT to normalization of liver enzymes was 14.2 ± 7.5 days. The mean number of acute rejection episodes was 1.4 ± 0.6 (median 1 (22 cases, 46.8%); range 1–3). Six (12.7%) patients experienced CR. The mean time from LT to CR was 75 ± 28.4 days. A significant association was found between CR and the patient's condition (inpatient or outpatient) before surgery (P = 0.03). No significant relationship was found between CR and post-transplant parameters except for biliary complications (P = 0.01). Both biliary complications (RR = 33.7, 95% CI: 2.2–511, P = 0.01) and inpatient status (RR = 10.9, 95% CI: 1.1–102.5, P = 0.03) significantly increased the risk of CR.
Conclusion. Being hospitalized at the time of LT and developing biliary complications might be predictive risk factors for CR in LT children.
Objective: to compare changes in estimated glomerular filtration rate (eGFR) in liver recipients with initially normal and impaired eGFR within the first year after immunosuppression conversion.
Materials and methods. The study enrolled 215 recipients of deceased-donor livers, transplanted from February 2009 to February 2020, who received everolimus (EVR) with dose reduction or complete withdrawal of calcineurin inhibitors (immunosuppression conversion, ISxC) at varying times after transplantation. eGFR was calculated using the MDRD-4 formula immediately before ISxC and then 3, 6, and 12 months after orthotopic liver transplantation (LTx). A deviation of up to one month from each time point was considered acceptable.
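For reference, a minimal sketch of the 4-variable MDRD equation follows; the coefficient 175 assumes IDMS-standardized serum creatinine (the original formulation used 186), and the input values in the usage line are hypothetical.

def egfr_mdrd4(scr_mg_dl, age_years, female=False, black=False):
    # 4-variable MDRD estimate, mL/min/1.73 m2
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

print(round(egfr_mdrd4(1.4, 52), 1))  # hypothetical recipient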
Results. At the time of ISxC, 32 (15%) of the 215 recipients had normal renal function. Chronic kidney disease (CKD) developed in 60% of the recipients with initially normal eGFR by the end of the first year following ISxC; the fall in eGFR was particularly pronounced in older recipients. In the group with a baseline eGFR of 60–89 mL/min/1.73 m2, eGFR normalized in 62% of cases within 12 months; in 28% of cases renal function did not change. In the subgroup with a pronounced decrease in eGFR at the time of ISxC, eGFR increased as early as 1 month after ISxC, with the maximum recorded after 3–6 months. The mean increase in eGFR relative to baseline by month 3 after ISxC was greater when ISxC was performed within the first 2 months after LTx (19.7 ± 15.7 mL/min/1.73 m2) than when it was performed in the long-term period after LTx (10.1 ± 8.7 mL/min/1.73 m2, p < 0.05).
Conclusion. Changes in eGFR in liver recipients receiving EVR plus a low-dose calcineurin inhibitor (CNI) depend on baseline eGFR and are multidirectional. ISxC in the early post-LTx period led to a more pronounced improvement in eGFR. Maximal changes in eGFR were observed 3–6 months after ISxC.
Background. In women of reproductive age, kidney transplantation disturbs the menstrual cycle pattern. The two conditions most commonly encountered are amenorrhea and menorrhagia.
The objective of the study was to assess menstrual cycle patterns after kidney transplantation in women of reproductive age.
Materials and methods. This cross-sectional study was carried out in a public sector hospital in Karachi, Pakistan. A total of 69 patients of reproductive age who had undergone living-donor kidney transplantation more than a year earlier were included. Women with a genital tract infection, hormonal treatment, an organic genital tract lesion, a clotting disorder, or severe cardiac and/or peripheral vascular disease were excluded. Frequencies and percentages were calculated for demographic characteristics. Correlation and association analyses were performed for type of menstruation against menstrual cycle pattern. A P-value of less than 0.05 was considered statistically significant.
Results. The majority of women included in the study were aged 35–39 years (36, 52.2%). The most frequent menstrual disturbances observed were heavy menstrual bleeding (22, 31.9%) and amenorrhea (21, 30.4%). Only 2.9% of cases showed a normal menstrual pattern. Cross-tabulation indicated that 26.1% of patients had amenorrhea, 24.6% had oligomenorrhea, and 31.9% had menorrhagia. A Durbin–Watson value of 0.656 indicated a strong positive relationship between menstrual cycle pattern (dependent variable) and type of menstruation, marital status, donor age, number of children, and place of residence (independent variables).
Conclusion. The present study shows that women of reproductive age have a disturbed menstrual cycle pattern after kidney transplantation. The principal finding was that such patients reported amenorrhea, menorrhagia, oligomenorrhea, and hypomenorrhea.
Heart Transplantation and Assisted Circulation
The role of antibody-mediated rejection in predicting survival among heart recipients has been studied in clinical transplantology for over 20 years. This condition is a significant risk factor for heart failure and graft vasculopathy. Antibody-mediated rejection results from activation of the humoral immune system and production of donor-specific antibodies, which cause myocardial injury through the complement system. The presence of donor-specific antibodies is associated with lower allograft survival. Treatment of antibody-mediated rejection should take into account the rejection category and the presence or absence of graft dysfunction. The main principle of treatment is to suppress humoral immunity at different levels. Significant progress has been made in studying this issue in clinical practice worldwide. However, further research is required to identify and develop optimal treatment regimens for patients with humoral rejection after cardiac transplantation.
Kidney injury in cardiac transplant recipients is one of the most severe complications, affecting both short- and long-term transplant outcomes. The need for renal replacement therapy (RRT) is determined not so much by the degree of renal dysfunction as by the need to correct fluid balance and metabolic disorders. These circumstances shape the specifics of extracorporeal RRT in donor heart recipients. In this review, we discuss the problems of early versus delayed initiation of RRT, anticoagulation and vascular access, and the advantages and disadvantages of continuous and intermittent techniques. Special attention is paid to chronic kidney injury and the peculiarities of kidney transplantation in heart recipients.
Objective: to study the effect of a pulsatile flow-generation (PFG) device on the basic hemodynamic parameters of the circulatory system using a mathematical model.
Results. Modeling and simulation showed that the use of PFG significantly increases aortic pulse pressure (by 76%). The proposed mathematical model adequately describes the dynamics of the circulatory system and metabolism (oxygen debt) during physical activity, both in normal conditions and in heart failure, and with the use of non-pulsatile and pulsatile circulatory-assist systems. The model also shows that the PFG device prevents the development of rarefaction (negative pressure) in the left ventricular cavity caused by a mismatch between blood inflow and outflow in the diastolic phase when systemic blood flow must be increased by raising the rotary pump speed.
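The abstract does not specify the model equations. As a purely illustrative sketch (not the authors' model), a two-element Windkessel with assumed, textbook-order parameters reproduces the qualitative effect: the same mean pump flow delivered pulsatilely instead of continuously restores a nonzero aortic pulse pressure.

import math

R, C = 1.0, 1.3                  # systemic resistance (mmHg*s/mL), compliance (mL/mmHg), assumed
DT, T, T_SYS = 0.001, 0.8, 0.3   # time step, cardiac period, systole duration (s)

def pulse_pressure(pulsatile, beats=20):
    # two-element Windkessel: C * dP/dt = Q_in(t) - P / R
    p, trace = 80.0, []
    q_mean = 450.0 * T_SYS * (2 / math.pi) / T   # mean flow of the pulsatile profile
    for n in range(int(beats * T / DT)):
        t = (n * DT) % T
        if pulsatile:
            q = 450.0 * math.sin(math.pi * t / T_SYS) if t < T_SYS else 0.0  # half-sine ejection
        else:
            q = q_mean                           # continuous rotary-pump flow, same mean
        p += DT * (q - p / R) / C
        if n >= int((beats - 1) * T / DT):       # keep the last beat only
            trace.append(p)
    return max(trace) - min(trace)               # aortic pulse pressure, mmHg

print(pulse_pressure(True), pulse_pressure(False))  # pulsatile vs. continuous flow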
The use of extracorporeal circulation systems (cardiopulmonary bypass (CPB) pumps, extracorporeal membrane oxygenation (ECMO)) can lead to microembolism of the brain and coronary arteries, which significantly impedes postoperative rehabilitation and often leads to severe complications. Microembolism occurs when oxygen or air microbubbles (MBs) enter the patient's arterial system. Existing CPB pumps come with built-in bubble trap systems but cannot remove bubbles already in the circuit. ECMO devices have arterial filters but cannot reliably filter out bubbles smaller than 40 μm over a wide flow range. We have proposed an alternative method that uses an efficient dynamic bubble trap (DBT) for both large and small bubbles. The design includes two DBT variants for the hemodynamic conditions of adult and pediatric patients. The device is installed in the outlet lines of the CPB pump and the ECMO circuit. It provides sufficient bubble separation at blood flows of 3.0–5.0 L/min for adults and 0.5–2.0 L/min for children. The developed computer models have shown that MBs smaller than 10 μm can be filtered out. This device should greatly reduce the likelihood of air embolism and provide an opportunity to reconsider the concept of expensive arterial filters.
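As a back-of-envelope illustration with assumed blood properties (not figures from the paper), Stokes' law shows why gravity alone separates microbubbles far too slowly, and why a dynamic trap, which subjects the flow to a much larger effective acceleration, can capture them.

def rise_velocity(d_m, accel=9.81, rho_blood=1060.0, mu=3.5e-3):
    # buoyancy-driven Stokes terminal velocity of a gas bubble in blood;
    # bubble density is neglected relative to blood density (assumption)
    return rho_blood * accel * d_m ** 2 / (18.0 * mu)

print(rise_velocity(10e-6))              # ~1.7e-5 m/s for a 10 um bubble under gravity
print(rise_velocity(10e-6, 500 * 9.81))  # same bubble under a large swirl-induced acceleration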
Regenerative Medicine and Cell Technologies
Objective: to determine the optimal method for long-term wet storage of donor material (50 days after collection) that maximally preserves its original mechanical characteristics.
Materials and methods. Porcine aortic wall fragments were used as the objects of study. Half of the original material underwent detergent-based decellularization. All material (native and processed) was placed for 50 days in one of three biocidal solutions: a complex alcohol solution, an ethanol-glycerol mixture, or an antibiotic mixture. The mechanical strength of the native and decellularized samples was then tested under uniaxial longitudinal and circumferential tension.
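A minimal sketch of how the reported quantities are derived from such a uniaxial test follows; the specimen dimensions and force-elongation readings are placeholders, not the study's measurements.

# engineering stress = F / A0, strain = dL / L0; Young's modulus from the
# initial linear region; tensile strength = maximum stress before rupture
A0, L0 = 2.0e-6, 0.01                    # cross-section (m^2), gauge length (m), assumed
force = [0.0, 0.5, 1.0, 1.5, 2.0]        # N (placeholder readings)
elong = [0.0, 2e-4, 4e-4, 6e-4, 8e-4]    # m (placeholder readings)

stress = [f / A0 for f in force]         # Pa
strain = [e / L0 for e in elong]
tensile_strength = max(stress)           # Pa
young_modulus = (stress[2] - stress[1]) / (strain[2] - strain[1])  # Pa
print(tensile_strength, young_modulus)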
Results. Storage of the native material in all media resulted in a significant increase in tensile strength. In the "complex alcohol solution", "ethanol-glycerol mixture", and "antibiotic mixture" groups, tensile strength under circumferential tension increased 1.38-, 1.72- and 1.62-fold, respectively, compared to the native control. In the "complex alcohol solution" group, the decellularized material was also 1.57-fold stronger than the native under circumferential tension. In the "antibiotic mixture" group, the decellularized material was 1.33-fold weaker than the native under longitudinal tension. According to elongation-at-rupture data, significantly greater plasticity (1.5-fold) was noted for the decellularized aortic wall in the "ethanol-glycerol" storage group compared to the control group. Young's modulus did not differ significantly from control values in any experimental group, regardless of the stress direction. Notably, decellularized specimens showed a clear tendency toward greater stiffness under circumferential stress.
Conclusion. Detergent-based decellularization of the porcine aortic wall and subsequent storage of the samples in our chosen experimental solutions for 50 days do not significantly affect the elastic properties of the material. The proposed treatment methods somewhat increase the stiffness of the material after storage in alcohol-containing solutions.
This review presents the results of an analytical study on the application of pancreatic islet encapsulation technologies for compensating type 1 diabetes. We review modern encapsulation technologies, approaches to encapsulation strategies, and insulin replacement technologies (auto-, allo- and xenotransplantation); prospects for cell therapy of insulin-dependent conditions; modern approaches to β-cell encapsulation; and possibilities for optimizing encapsulation biomaterials to increase the survival of transplanted cells and reduce adverse effects on the recipient. The main problems that must be solved for effective transplantation of encapsulated islets of Langerhans are identified, and the main strategies for translating islet encapsulation technology into medical practice are outlined.
Objective: to use an adoptive transfer model to study the cellular mechanisms involved in the initial stage of liver regeneration when a healthy recipient receives an intraperitoneal injection of apoptotic bone marrow-derived mononuclear cells (BM-MNCs) from a donor after extended liver resection.
Materials and methods. Male Wistar rats (n = 40) were used to create a model of adoptive transfer of apoptotic BM-MNCs (a-BM-MNCs), taken from a donor after extended liver resection, to a healthy recipient. The animals were divided into five groups. Four experimental groups received intraperitoneal injections of equal doses: freshly isolated BM-MNCs (group 1); BM-MNCs subjected to apoptosis by 48-hour storage at 4–6 °C in phosphate-buffered saline (PBS) (group 2) or in Custodiol HTK solution (group 3); and, in group 4, the PBS in which BM-MNCs had been stored. Control animals received saline (group 5). To select effective modes of apoptosis induction, BM-MNCs stained with 7-AAD after incubation in the solutions were analyzed by flow cytometry. Targeted transfer of regenerative signals to the recipient was assessed by the mitotic activity of hepatocytes in the liver and of the tubular epithelium in the kidneys, as well as by the intensity of microstructural changes in the liver 24, 48, and 72 hours after injection of the studied material.
Results. BM-MNC incubation in PBS or HTK for 48 hours at 4–6 °C provided the most effective accumulation of a-BM-MNCs in early apoptosis. It was shown that, during adoptive transfer, a-BM-MNCs retain the ability for targeted transmission of regulatory signals to the liver, supported by autophagy. Compared with native BM-MNCs (group 1), adoptively transferred a-BM-MNCs (groups 2 and 3) increased the regenerative potential of the liver through a pronounced increase in autophagy activity and directed infiltration of immunomodulatory mononuclear cells into the liver.
Conclusion. a-BM-MNCs create a stronger basis for the development and implementation of a targeted and effective regeneration program by enhancing autophagy and the immunomodulatory effect of mononuclear cells, which act as carriers of regenerative signals.
Objective: to develop a method for modifying composite small-diameter porous tubular biopolymer scaffolds, based on the bacterial copolymer poly(3-hydroxybutyrate-co-3-hydroxyvalerate) and gelatin, with a double-layered bioactive coating based on heparin (Hp) and platelet lysate (PL) that promotes the adhesion and proliferation of cell cultures.
Materials and methods. Composite porous tubular biopolymer scaffolds with a 4 mm internal diameter were made by electrospinning from a 1 : 2 (by volume) mixture of a 10% solution of poly(3-hydroxybutyrate-co-3-hydroxyvalerate) copolymer (PHBV) and a 10% solution of gelatin in hexafluoro-2-propanol. The structure of the scaffolds was stabilized with glutaraldehyde vapor. The scaffolds were modified with a bioactive Hp + PL-based coating. The surface morphology of the samples was analyzed by scanning electron microscopy. The biological safety of the modified scaffolds in vitro (hemolysis, cytotoxicity) was evaluated according to the GOST ISO 10993 standard. Interaction with cultures of a human endothelial cell line (EA.hy926) and human adipose-derived mesenchymal stem cells (hADMSCs) was studied using vital dyes.
Results. We developed a method for modifying small-diameter composite porous tubular biopolymer scaffolds, obtained by electrospinning from a mixture of PHBV and gelatin, with a double-layered bioactive coating based on covalently immobilized Hp and human PL. The modified scaffold showed no cytotoxicity or hemolytic activity in vitro. It was also demonstrated that the developed coating promotes the adhesion and proliferation of hADMSCs on the external surface and of EA.hy926 cells on the internal surface of the scaffolds in vitro.
Conclusion. The developed coating can be used for the in vivo formation of tissue-engineered small-diameter vascular grafts.
Implants and Artificial Organs
Objective: to construct geometric models of the carotid bifurcation and to perform computer modeling of carotid endarterectomy (CEA) operations with patches of various configurations.
Materials and methods. The method uses reconstructed models of a healthy blood vessel obtained from a preoperative computed tomography (CT) study of the affected vessel of a particular patient. Flow in the vessel is simulated by computational fluid dynamics using data from the patient's ultrasonic Doppler velocimetry and CT angiography. Risk factors are assessed by hemodynamic indices at the vessel wall associated with wall shear stress (WSS).
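The abstract does not name the specific indices; two indices commonly derived from WSS in such studies are the time-averaged WSS (TAWSS) and the oscillatory shear index (OSI), sketched below for a hypothetical WSS vector time series at one wall point.

import numpy as np

def tawss_osi(tau, dt):
    # tau: (n_steps, 3) array of WSS vectors over one cardiac cycle, Pa;
    # rectangle-rule integrals over the cycle
    mean_vec = tau.sum(axis=0) * dt                          # integral of the vector
    mag_int = np.linalg.norm(tau, axis=1).sum() * dt         # integral of |tau|
    period = len(tau) * dt
    tawss = mag_int / period                                 # time-averaged WSS, Pa
    osi = 0.5 * (1.0 - np.linalg.norm(mean_vec) / mag_int)   # 0 (steady) .. 0.5 (fully oscillatory)
    return tawss, osi

t = np.linspace(0.0, 0.8, 200)                               # hypothetical cardiac cycle, s
tau = np.stack([1.5 + np.sin(2 * np.pi * t / 0.8),
                0.3 * np.cos(2 * np.pi * t / 0.8),
                np.zeros_like(t)], axis=1)
print(tawss_osi(tau, t[1] - t[0]))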
Results. We used the proposed method to study the hemodynamic results of 10 virtual CEA operations with patches of various shapes on the reconstructed healthy artery of a particular patient. The rationale for patch implantation is to ensure that the vessel lumen is not narrowed as a result of the surgery, since closing the incision without a patch can reduce the circumference of the vessel lumen by 4–5 mm, which adversely affects blood flow. On the other hand, too wide a patch creates an aneurysm-like deformation of the internal carotid artery (ICA) ostium, which is suboptimal because a large recirculation zone forms. In the case studied, an implanted patch width of about 3 mm provided the optimal hemodynamic outcome. Deviations from this value, both upward and downward, impaired hemodynamics. The absence of a patch gave the worst of the results considered.
Conclusion. The proposed computer modeling technique can provide personalized patch selection for classical CEA with a low risk of restenosis at long-term follow-up.
Clinical Cases
Cardiac myxoma is a primary tumor histologically formed by multipotent subendocardial mesenchymal cells. Myxomas account for approximately 50% of all cardiac tumors in adults and are most commonly located in the left atrium. Very rarely, myxomas occur in several heart chambers; only about 100 cases of myxomatous lesions of both atria have been described in the literature. In this paper, we present a successful clinical case of a young patient with biatrial myxomas.