Search results
The Influence of Surgeon Caseload and Usage on the Long-Term Outcomes of Mobile-Bearing Unicompartmental Knee Arthroplasty: An Analysis of Data From the National Joint Registry for England, Wales, Northern Ireland, and the Isle of Man.
BACKGROUND: Unicompartmental knee arthroplasty (UKA) revision rates are variable and known to be influenced by a surgeon's caseload (number of UKAs performed annually) and usage (UKA as a proportion of overall knee arthroplasty practice). It is not known which is more important. We explored the influence of caseload and usage on cemented and cementless UKA. METHODS: A total of 34,277 medial Oxford UKAs (23,707 cemented and 10,570 cementless) from the National Joint Registry were analyzed. UKAs were subdivided by the following: (1) surgeon caseload, into low (<10 UKAs/y) and high (≥10 UKAs/y) categories; and (2) usage, into low (<20%) and high (≥20%) categories. The 10-year revision rates were compared. RESULTS: The 10-year survival of the low-caseload/low-usage cemented and cementless UKA was 82.8% (CI 81.6-83.9) and 86.2% (CI 72.1-93.4), respectively. The 10-year survival of the high-caseload/high-usage cemented and cementless UKA was 90.0% (CI 89.2-90.6) and 93.3% (CI 91.3-94.8), respectively. For cemented UKA, the high-caseload/high-usage group had lower revision rates (hazard ratio [HR] 0.57, CI 0.52-0.63, P < .001) compared to the low-caseload/low-usage group. The high-caseload/low-usage (HR 0.74, CI 0.66-0.83, P < .001) and the low-caseload/high-usage (HR 0.86, CI 0.74-0.99, P = .04) groups also had lower revision rates than the low-caseload/low-usage group. CONCLUSION: Mobile-bearing UKA revision rates improve with both increasing surgeon UKA caseload and usage. Surgeons using cemented UKA who have usage ≥20% and caseload ≥10/year had a 10-year survival of 90%. Higher survivorship was associated with higher caseload, higher usage, and cementless fixation. LEVELS OF EVIDENCE: III.
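The survival percentages quoted throughout these registry studies are Kaplan-Meier estimates. A minimal sketch of the estimator on toy data (illustrative only, not the registry's analysis code; tied event/censoring times are ignored for brevity):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve for one group.

    times  -- follow-up time for each implant (years)
    events -- 1 if revised at that time, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e:  # revision observed: multiply survival by (1 - 1/n_at_risk)
            surv *= 1 - 1 / n_at_risk
            curve.append((t, surv))
        n_at_risk -= 1  # revised or censored, either way leaves the risk set
    return curve

# Toy cohort of 5 implants: revisions at years 2 and 6, the rest censored.
curve = kaplan_meier([2, 4, 6, 8, 10], [1, 0, 1, 0, 0])
# -> [(2, 0.8), (6, ~0.533)]
```

The key point is that censored implants still contribute to the risk set until they drop out, which is why registry survival figures differ from a naive revised/total proportion.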
The Effect of Age on the Relative Outcomes of Cemented and Cementless Mobile-Bearing Unicompartmental Knee Arthroplasty, Based on Data From National Databases.
BACKGROUND: Unicompartmental knee arthroplasty (UKA) is an effective treatment for medial compartment arthritis. A challenge is that patients requiring knee arthroplasty are becoming younger. The relative performance of cemented and cementless UKAs in different age groups is currently unknown. METHODS: A total of 12,882 cemented and cementless UKAs from the National Joint Registry and Hospital Episodes Statistics databases were matched on patient and surgical factors. Patients were stratified into 3 groups: (1) <60 years; (2) 60-69 years; and (3) ≥70 years. Revision and reoperation rates were compared using Cox regression analyses. RESULTS: The 10-year implant survival rates for the matched cemented and cementless UKAs were: (1) <60 years, 81.4% (CI 73.6-87.0) and 86.7% (CI 80.7-90.9) (hazard ratio [HR] 0.73, CI 0.56-0.94, P = .02); (2) 60-69 years, 91.8% (CI 88.9-94.0) and 94.5% (CI 92.9-95.7) (HR 0.90, CI 0.67-1.22, P = .51); and (3) ≥70 years, 93.5% (CI 91.1-95.3) and 94.2% (CI 92.0-95.8) (HR 1.0, CI 0.71-1.40, P = .99). The same trend was observed for reoperations. In the <60 years and 60-69 years groups there were significantly fewer revisions for aseptic loosening in the cementless group (0.5% versus 1.6% [P < .001] and 0.4% versus 1.3% [P = .002], respectively). CONCLUSION: Younger age was associated with higher revision rates in both cemented and cementless UKA groups. Cementless fixation was associated with lower long-term revision rates than cemented fixation in the <60 years group, with aseptic loosening rates 3 times lower. LEVEL OF EVIDENCE: III.
Risk factors associated with poor pain outcomes following primary knee replacement surgery: Analysis of data from the clinical practice research datalink, hospital episode statistics and patient reported outcomes as part of the STAR research programme.
OBJECTIVE: Identify risk factors for poor pain outcomes six months after primary knee replacement surgery. METHODS: Observational cohort study on patients receiving primary knee replacement from the UK Clinical Practice Research Datalink, Hospital Episode Statistics and Patient Reported Outcomes. A wide range of variables routinely collected in primary and secondary care were identified as potential predictors of worsening or only minor improvement in pain, based on the Oxford Knee Score pain subscale. Results are presented as relative risk ratios and adjusted risk differences (ARD) by fitting a generalized linear model with a binomial error structure and log link function. RESULTS: Information was available for 4,750 patients from 2009 to 2016, with a mean age of 69, of whom 56.1% were female. 10.4% of patients had poor pain outcomes. The strongest effects were seen for pre-operative factors: mild knee pain symptoms at the time of surgery (ARD 18.2% (95% Confidence Interval 13.6, 22.8), smoking 12.0% (95% CI:7.3, 16.6), living in the most deprived areas 5.6% (95% CI:2.3, 9.0) and obesity class II 6.3% (95% CI:3.0, 9.7). Important risk factors with more moderate effects included a history of previous knee arthroscopy surgery 4.6% (95% CI:2.5, 6.6), and use of opioids 3.4% (95% CI:1.4, 5.3) within three months after surgery. Those patients with worsening pain state change had more complications by 3 months (11.8% among those in a worse pain state vs. 2.7% with the same pain state). CONCLUSIONS: We quantified the relative importance of individual risk factors including mild pre-operative pain, smoking, deprivation, obesity and opioid use in terms of the absolute proportions of patients achieving poor pain outcomes. These findings will support development of interventions to reduce the numbers of patients who have poor pain outcomes.
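The effect measures reported above follow from risks in exposed versus unexposed groups: the relative risk is their ratio and the risk difference their difference (the study estimates adjusted versions via a binomial GLM with log link). A minimal unadjusted sketch with hypothetical counts:

```python
def risk_measures(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Unadjusted relative risk and risk difference from a 2x2 table."""
    r1 = events_exposed / n_exposed        # risk in the exposed group
    r0 = events_unexposed / n_unexposed    # risk in the unexposed group
    return r1 / r0, r1 - r0

# Hypothetical counts: 20/100 smokers vs 10/100 non-smokers with a poor
# pain outcome (not the study's data).
rr, rd = risk_measures(20, 100, 10, 100)
# rr ≈ 2.0 (twice the risk); rd ≈ 0.10 (10 percentage points higher)
```

Risk differences like the paper's ARDs are often more useful than ratios for planning interventions, because they translate directly into numbers of patients affected.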
A systematic review of the association between early comprehensive geriatric assessment and outcomes in hip fracture care for older people.
AIMS: Performance indicators are increasingly used to improve the quality of healthcare provided to hip fracture patients. Joint care, under orthopaedic surgeons and physicians with an interest in older patients, is one of the more common indicators of high-quality care. In this systematic review, we investigated the association between 'comprehensive geriatric assessment' and patient outcomes following hip fracture injury. METHODS: In total, 12 electronic databases and other sources were searched for evidence, and the methodological quality of studies meeting the inclusion criteria was assessed. The protocol for this suite of related systematic reviews was registered with PROSPERO (ID: CRD42023417515). RESULTS: A total of 24,591 articles were reviewed, and 39 studies met the inclusion criteria for the review, involving a total of 25,363 patients aged over 60 years with a hip fracture. There were five randomized clinical trials, three quasi-experimental studies, two non-randomized parallel group control trials, 22 pre-/post-intervention studies, and seven retrospective cohort studies, conducted between January 1992 and December 2021. The timing and content of a comprehensive geriatric assessment was ill-defined in many studies and care pathways were heterogeneous, which precluded meta-analysis of the data. Early comprehensive geriatric assessment was associated with improved outcomes in 31 of the 36 (86%) patient-reported outcomes, including improved mobility (acute/long-term), functional status, and better quality of life. In total, 155 out of 219 (70.78%) clinical outcomes derived from hospital records showed a positive association with early comprehensive geriatric review, including reduced preoperative time and length of hospital stay, reduced incidence of postoperative complications, fewer hospital readmissions, and lower mortality. 
CONCLUSION: Early comprehensive geriatric assessment after hip fracture in older people is associated with improved patient-reported outcomes and better clinical outcomes, such as reduced incidence of complications, length of hospital stay, preoperative waiting time, and mortality. Standardization of the definitions of 'early' and 'comprehensive' geriatric assessment and consistent reporting of care pathway models would improve future evidence synthesis.
TBK1 and IKKε act like an OFF switch to limit NLRP3 inflammasome pathway activation.
NACHT, LRR, and PYD domains-containing protein 3 (NLRP3) inflammasome activation is beneficial during infection and vaccination but, when uncontrolled, is detrimental and contributes to inflammation-driven pathologies. Hence, discovering endogenous mechanisms that regulate NLRP3 activation is important for disease interventions. Activation of NLRP3 is regulated at the transcriptional level and by posttranslational modifications. Here, we describe a posttranslational phospho-switch that licenses NLRP3 activation in macrophages. The ON switch is controlled by the protein phosphatase 2A (PP2A) downstream of a variety of NLRP3 activators in vitro and in lipopolysaccharide-induced peritonitis in vivo. The OFF switch is regulated by two closely related kinases, TANK-binding kinase 1 (TBK1) and I-kappa-B kinase epsilon (IKKε). Pharmacological inhibition of TBK1 and IKKε, as well as simultaneous deletion of TBK1 and IKKε, but not of either kinase alone, increases NLRP3 activation. In addition, TBK1/IKKε inhibitors counteract the effects of PP2A inhibition on inflammasome activity. We find that, mechanistically, TBK1 interacts with NLRP3 and controls the pathway activity at a site distinct from NLRP3-serine 3, previously reported to be under PP2A control. Mutagenesis of NLRP3 confirms serine 3 as an important phospho-switch site but, surprisingly, reveals that this is not the sole site regulated by either TBK1/IKKε or PP2A, because all retain the control over the NLRP3 pathway even when serine 3 is mutated. Altogether, a model emerges whereby TLR-activated TBK1 and IKKε act like a "parking brake" for NLRP3 activation at the time of priming, while PP2A helps remove this parking brake in the presence of NLRP3 activating signals, such as bacterial pore-forming toxins or endogenous danger signals.
A Comparison of the Periprosthetic Fracture Rate of Cemented and Cementless Total Knee Arthroplasties: An Analysis of Data From the National Joint Registry.
BACKGROUND: Periprosthetic fractures are serious complications of knee arthroplasty, often requiring complex surgery. There is concern of increased periprosthetic fracture risk with cementless components given the reliance on interference fit for primary stability. It is unknown how the periprosthetic fracture risk compares between cemented and cementless total knee arthroplasties (TKAs). METHODS: A total of 22,477 cemented and 22,477 cementless TKAs from the National Joint Registry and Hospital Episodes Statistics database were propensity score matched on patient and surgical factors. Cumulative periprosthetic fracture rates were calculated using Kaplan-Meier analyses and compared with Cox regressions. Subgroup analyses were performed in different age, body mass index, and sex groups. RESULTS: The 3-month fracture rates in the cemented and cementless TKA groups were 0.02% and 0.04%, respectively. At 10 years, the cumulative fracture rate after cemented TKA was 1.2%, and after cementless TKA it was 1.4%. During the study period, there were no significant differences in fracture rates between cemented and cementless TKAs, with a hazard ratio of 1.14 (confidence interval 0.94 to 1.37, P = .20) at 10 years postoperatively. There were no significant differences in fracture rates between fixation types on subgroup analyses of sex, body mass index, and age groups. Female sex was a risk factor for fracture in both cemented (odds ratio 2.35, P < .001) and cementless TKAs (odds ratio 2.97, P < .001). CONCLUSIONS: The periprosthetic fracture rates following cemented and cementless TKA surgery are low, being approximately 1.2% and 1.4%, respectively, at 10 years. There were no significant differences in periprosthetic fracture rates requiring readmission between cemented and cementless TKAs. LEVEL OF EVIDENCE: III.
A Matched Comparison of Long-Term Outcomes of Total and Unicompartmental Knee Replacements in Different Ages Based on National Databases: Analysis of Data From the National Joint Registry for England, Wales, Northern Ireland, and the Isle of Man.
BACKGROUND: The 2 main treatment options for end-stage single-compartment knee arthritis are unicompartmental (UKR) or total knee replacement (TKR). We compared the long-term outcomes in different age groups. METHODS: In total, 54,215 UKRs and 54,215 TKRs from the National Joint Registry and Hospital Episode Statistics database were propensity score matched, and Kaplan-Meier and regression analyses were used to compare revision, reoperation, mortality, and 3-month complications. RESULTS: UKR had higher 10-year revision rates (12% vs 5%, hazard ratio [HR] 2.31, P < .001) and 10-year reoperation rates (25% vs 21%, HR 1.12, P < .001). UKR had lower 10-year mortality rates (13.6% vs 15.5%, HR 0.86, P < .001). UKR had lower rates of medical (P < .001) and procedure-related (P < .001) complications and deaths (HR 0.61, P = .02). If 100 patients had a UKR instead of a TKR, then over 10 years: if they were <55 years old there would be 7 more reoperations and 1 fewer death; if they were 55-64 years old there would be 6 more reoperations and 2 more deaths; if they were 65-74 years old there would be 4 more reoperations and 2 fewer deaths; and if they were ≥75 years old there would be 3 more reoperations and 4 fewer deaths. CONCLUSION: UKR has higher revision and slightly higher reoperation rates but lower mortality rates than matched TKR. The decision to do a UKR should, in part, be based on the balance of these risks, which are influenced by patient age. In the elderly group (≥75 years) the data suggest that UKR compared to TKR has a greater absolute reduction in mortality than the increase in reoperation rate. LEVELS OF EVIDENCE: III.
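The per-100 framing in the conclusion is simple arithmetic on absolute rate differences. A sketch using the overall cohort rates from the abstract (the paper's own per-100 figures are age-stratified, so these overall numbers are illustrative):

```python
def per_100(rate_ukr, rate_tkr):
    """Absolute event difference per 100 patients if UKR is chosen over TKR."""
    return round((rate_ukr - rate_tkr) * 100)

# Overall 10-year rates from the abstract: reoperation 25% vs 21%,
# mortality 13.6% vs 15.5%.
extra_reoperations = per_100(0.25, 0.21)    # 4 more reoperations per 100
fewer_deaths = -per_100(0.136, 0.155)       # about 2 fewer deaths per 100
```

This is why the trade-off shifts with age: the reoperation excess shrinks and the mortality benefit grows in older patients.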
A Comparison of the Periprosthetic Fracture Rate of Cemented and Cementless Mobile Bearing Unicompartmental Knee Arthroplasties: An Analysis of Data From the National Joint Registry for England, Wales, Northern Ireland, and the Isle of Man.
BACKGROUND: Periprosthetic fractures are rare but serious complications of unicompartmental knee arthroplasty (UKA). Although cementless UKA has a lower risk of loosening than cemented, there are concerns that tibial fracture risk may be higher given the reliance on interference fit for primary stability. The risk of fracture and the effect of surgical fixation are currently unknown. We compared the periprosthetic fracture rate following cemented and cementless UKA surgery. METHODS: A total of 14,122 medial mobile-bearing UKAs (7,061 cemented and 7,061 cementless) from the National Joint Registry and Hospital Episodes Statistics database were propensity score-matched. Cumulative fracture rates were calculated and Cox regressions were used to compare fixation groups. RESULTS: The three-month periprosthetic fracture rates were similar (P = .80), being 0.10% in the cemented group and 0.11% in the cementless group. The fracture rates were highest during the first three months postoperatively, but then decreased and remained constant between one and 10 years after surgery. The one-year cumulative fracture rates were 0.2% (confidence interval [CI]: 0.1 to 0.3) for cemented and 0.2% (CI: 0.1 to 0.3) for cementless cases. The 10-year cumulative fracture rates were 0.8% (CI: 0.2 to 1.3) and 0.8% (CI: 0.3 to 1.3), respectively. The hazard ratio during the whole study period was 1.06 (CI: 0.64 to 1.77; P = .79). CONCLUSIONS: The periprosthetic fracture rate following mobile bearing UKA surgery is low, being about 1% at 10 years. There were no significant differences in fracture rates between cemented and cementless implants after matching. We surmise that surgeons are aware of the higher theoretical risk of early fracture with cementless components and take care with tibial preparation. LEVELS OF EVIDENCE: III.
A Matched Comparison of Implant and Functional Outcomes of Cemented and Cementless Unicompartmental Knee Replacements: A Study from the National Joint Registry for England, Wales, Northern Ireland and the Isle of Man and the Hospital Episode Statistics Patient Reported Outcome Measures Database.
BACKGROUND: Unicompartmental knee replacement (UKR) is an effective treatment for end-stage medial compartment osteoarthritis, but there can be problems with fixation. The cementless UKR was introduced to address this issue. It is unknown how its functional outcomes compare with those of the cemented version on a national scale. We performed a matched comparison of the clinical and functional outcomes of cementless and cemented UKRs. METHODS: Using the National Joint Registry for England, Wales, Northern Ireland and the Isle of Man (NJR), 14,764 Oxford UKRs with linked data regarding patient-reported outcomes were identified. A total of 6,906 UKRs (3,453 cemented and 3,453 cementless) were propensity score matched on the basis of patient, surgical, and implant factors. RESULTS: The 10-year cumulative implant survival rate was 93.0% (95% confidence interval [CI], 90.0% to 95.1%) for cementless UKRs and 91.3% (95% CI, 89.0% to 93.0%) for cemented UKRs. The cementless UKR group had a significantly lower revision risk (hazard ratio [HR], 0.74; p = 0.02). Subgroup analyses showed a stronger effect size (HR, 0.66) among UKRs performed by high-caseload surgeons (i.e., surgeons performing ≥30 UKRs/year). In the overall cohort, the postoperative Oxford Knee Score (OKS) in the cementless group (mean and standard deviation, 39.1 ± 8.7) was significantly higher (p = 0.001) than that in the cemented group (38.5 ± 8.6). The cementless group gained a mean of 17.6 ± 9.3 points in the OKS postoperatively and the cemented group gained 16.5 ± 9.6 points, with a difference of 1.1 points between the groups (p < 0.001). The difference in OKS points gained postoperatively was highest among UKRs performed by high-caseload surgeons, with the cementless group gaining 1.8 points more (p < 0.001) than the cemented group. CONCLUSIONS: The cementless UKR demonstrated better 10-year implant survival and postoperative functional outcomes than the cemented UKR. 
The difference was largest among UKRs performed by high-caseload surgeons, with the cementless fixation group having an HR for revision of 0.66 and an approximately 2-point greater improvement in the OKS compared with the cemented fixation group. LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.
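Several of the registry studies above rely on propensity score matching to balance the comparison groups. As an illustrative sketch only (not the studies' actual algorithm), greedy 1:1 nearest-neighbour matching on precomputed propensity scores with a caliper might look like:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, controls -- lists of (id, propensity score) pairs
    Returns a list of (treated_id, control_id) matches; each control
    is used at most once, and matches beyond the caliper are rejected.
    """
    pairs = []
    available = dict(controls)  # control id -> propensity score
    for tid, ps in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ps))
        if abs(available[cid] - ps) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs

# Hypothetical scores: each treated patient pairs with its closest control.
pairs = greedy_match([("t1", 0.30), ("t2", 0.70)],
                     [("c1", 0.32), ("c2", 0.69), ("c3", 0.10)])
# -> [("t1", "c1"), ("t2", "c2")]
```

In practice the propensity score itself comes from a logistic regression on patient, surgical, and implant factors, and match quality is checked with balance diagnostics before outcomes are compared.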
How long does an elbow replacement last? A systematic review and meta-analysis of case-series and national registry reports with more than 10 years of follow-up.
BACKGROUND AND PURPOSE: This study aims to determine, for the first time, generalizable data on the longevity and long-term function of elbow replacements. METHODS: In this systematic review and meta-analysis, we searched MEDLINE and Embase for articles reporting 10-year or greater survival of total elbow replacements (TERs) and distal humeral hemiarthroplasty. Implant survival and patient reported outcome measures (PROMs) data were extracted. National joint replacement registries were also analyzed. We weighted each series and calculated a pooled survival estimate at 10, 15, and 20 years. For PROMs we pooled the standardized mean difference (SMD) at 10 years. FINDINGS: Despite its widespread use, we identified only 9 series reporting all-cause survival of 628 linked TERs and 610 unlinked TERs and no series for distal humeral hemiarthroplasty. The studied population was treated for rheumatoid arthritis in over 90% of cases. The estimated 10-year survival for linked TERs was 92% (95% CI 90-95) and unlinked TERs 84% (CI 81-88). 2 independent registries contributed 32 linked TERs and 530 unlinked TERs. The pooled registry 10-year survival for unlinked TERs was 86% (CI 83-89). Pooled 10-year PROMs from 164 TERs (33 linked and 131 unlinked), revealed a substantial improvement from baseline scores (SMD 2.7 [CI 1.6-3.8]). INTERPRETATION: Over 80% of all elbow replacements and over 90% of linked elbow replacements can last more than 10 years with sustained patient-reported benefits. This information is long overdue and will be particularly useful to patients as well as healthcare providers.
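The pooled survival estimates above come from weighting each series. A common approach is fixed-effect inverse-variance pooling, sketched here with hypothetical numbers (real analyses often pool on a transformed scale, and this paper's exact weighting may differ):

```python
import math

def pool_fixed_effect(estimates, std_errors):
    """Fixed-effect inverse-variance pooling of study estimates.

    Each study is weighted by 1/SE^2, so precise studies dominate.
    Returns (pooled estimate, pooled standard error).
    """
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Hypothetical 10-year survival proportions and SEs from three series.
est, se = pool_fixed_effect([0.92, 0.90, 0.95], [0.02, 0.03, 0.04])
# est ≈ 0.92; pooled SE is smaller than any single study's SE
```

The pooled SE shrinking below every individual study's SE is the whole point of meta-analysis: combining series gives a more precise longevity estimate than any one cohort.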
Comparison of five-year clinical outcomes of 524 cemented and cementless medial unicompartmental knee replacements.
AIM: To compare the outcomes of cemented and cementless Unicompartmental Knee Replacements (UKR) at 5 years after surgery. METHODS: 262 cemented and 262 cementless medial mobile-bearing UKR, implanted by four high-volume surgeons using identical indications and surgical techniques, were reviewed by independent physiotherapists at 5 years. Survival, Oxford Knee Score (OKS), American Knee Society Score (AKSS), and EQ-5D-5L were assessed. The cementless cohort was mainly implanted after the cemented. Each cohort was divided into early and late sub-groups and compared, to assess if any differences were due to progressive improvement in surgical practice over time. RESULTS: There were no significant differences between the cohorts for demographics, pre-operative scores, and 5-year revision (0.8%), re-operation (1.5%), and complication rates (5%). The cementless cohort had significantly better 5-year OKS (43 vs 41, p = 0.008), AKSS-Objective (94 vs 90, p = 0.049) and EQ-5D-5L (0.81 vs 0.87, p = 0.0001). Pain sub-scores within OKS, AKSS, and EQ-5D-5L were also significantly better in the cementless cohort, and the differences were proportionally much greater and more significant than differences in their respective overall scores. There was no significant improvement in scores between the early and late subgroups of the cohorts, whereas the 'early-cementless' cohort had significantly better scores than the contemporaneously implanted 'late-cemented' cohort. This suggests that the differences found were due to implant type rather than improved surgical practice over time. CONCLUSION: Cementless UKR is associated with better clinical outcomes than cemented UKR, primarily due to improved pain relief. Both cemented and cementless UKR are safe, with low reoperation and complication rates and a 5-year survival of 99%.
Extended sample size calculations for evaluation of prediction models using a threshold for classification.
When evaluating the performance of a model for individualised risk prediction, the sample size needs to be large enough to precisely estimate the performance measures of interest. Current sample size guidance is based on precisely estimating calibration, discrimination, and net benefit, which should be the first stage of calculating the minimum required sample size. However, when a clinically important threshold is used for classification, other performance measures are also often reported. We extend the previously published guidance to precisely estimate threshold-based performance measures. We have reported closed-form solutions to estimate the sample size required to target sufficiently precise estimates of accuracy, specificity, sensitivity, positive predictive value (PPV), negative predictive value (NPV), and an iterative method to estimate the sample size required to target a sufficiently precise estimate of the F1-score, in an external evaluation study of a prediction model with a binary outcome. This approach requires the user to pre-specify the target standard error and the expected value for each performance measure alongside the outcome prevalence. We describe how the sample size formulae were derived and demonstrate their use in an example. Extension to time-to-event outcomes is also considered. In our examples, the minimum sample size required was lower than that required to precisely estimate the calibration slope, and we expect this would most often be the case. Our formulae, along with corresponding Python code and updated R, Stata and Python commands (pmvalsampsize), enable researchers to calculate the minimum sample size needed to precisely estimate threshold-based performance measures in an external evaluation study. These criteria should be used alongside previously published criteria to precisely estimate the calibration, discrimination, and net-benefit.
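The closed-form logic for proportion-type measures is straightforward: the standard error of a proportion p estimated from m observations is sqrt(p(1-p)/m), and measures such as sensitivity are estimated only among cases, so the required total sample size is inflated by 1/prevalence. A sketch with illustrative inputs (the published pmvalsampsize commands implement the full, validated set of criteria):

```python
import math

def n_for_proportion(p, target_se):
    """n such that SE(p_hat) = sqrt(p*(1-p)/n) meets the target SE."""
    return math.ceil(p * (1 - p) / target_se ** 2)

def n_for_sensitivity(sens, prevalence, target_se):
    """Sensitivity is estimated among cases only, so inflate n by 1/prevalence."""
    return math.ceil(sens * (1 - sens) / (target_se ** 2 * prevalence))

# Illustrative inputs: expected accuracy 0.80, sensitivity 0.85,
# outcome prevalence 0.20, target standard error 0.03.
n_acc = n_for_proportion(0.80, 0.03)            # 178
n_sens = n_for_sensitivity(0.85, 0.20, 0.03)    # 709
```

As the paper notes, the largest of all such per-measure minima (alongside the calibration, discrimination, and net-benefit criteria) determines the study's required sample size; low-prevalence outcomes drive up the sensitivity and PPV requirements in particular.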
Longitudinal trajectories of health indicators using real world data
Older people with complex health needs are often excluded from clinical trials, primarily due to factors such as age, multimorbidity, and polypharmacy. However, they represent a significant portion of healthcare resource consumption and the use of newly authorised medications. Existing guidelines for identifying and treating this population often rely on using cross-sectional values and providing guidance on treating individual conditions rather than addressing the complexities of multimorbidity and treatment combinations. In this thesis, I propose applying novel approaches for identifying and characterising older people with complex health needs, using different health indicators and study designs on real world data. The definitions and cohorts established in this work have the potential to inform decisions for identifying, managing and treating older people with complex health needs. In the first project, I conducted a cross-sectional analysis to identify three cohorts of older people with high levels of frailty, polypharmacy, or unplanned hospital admissions. Patients in any of these cohorts had high comorbidity burden and preventive therapy use. Although there was considerable overlap between these cohorts, many patients only belonged to one of the three cohorts. This indicates that these health markers are intersectional and complementary to each other. Frailty and polypharmacy are cumulative conditions that take years to develop, making cross-sectional cohorts unable to describe their progression over time. In projects two and three, I modelled frailty and polypharmacy in older people over 4-5 years of follow-up. I identified subgroups with distinct frailty or polypharmacy trajectories, which demonstrated different association levels with the risk of mortality. Most of the population belonged to the low-steady/slow (healthy) subgroup. 
However, important subgroups emerging from these studies started from a seemingly healthy state, deteriorated rapidly over the study follow-up and had the highest mortality risks, indicating their need for more healthcare resources and monitoring. The subgroups were identified in a UK primary care database and then externally validated in two independent national and international databases. They demonstrated generalisability with good external validity, similar trajectories and clinical characteristics. Previous evidence reported that frailty and polypharmacy could start from middle age, and some of the identified subgroups of older people started from elevated or intermediate levels of frailty and polypharmacy. To understand these health markers’ progression from early on, I modelled polypharmacy over time in middle-aged people in the fourth project. I identified three subgroups with distinct polypharmacy trajectories and associated mortality risks. I found that those with the fastest polypharmacy trajectory had the highest mortality risk, followed by those starting at the highest polypharmacy baseline values. Those patients are likely to continue progressing and end up in one of the non-healthy subgroups at older age. My research demonstrated that monitoring trajectories of frailty and polypharmacy predicts mortality better than cross-sectional values. The identified subgroups were generalisable and had distinctive clinical characteristics. Future research can focus on further generalisability of the identified subgroups, and investigate how polypharmacy and frailty progress over longer periods, together and individually.
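The thesis fits formal latent-class trajectory models. As a toy illustration of why a trajectory carries information that a single cross-sectional value does not, each patient's series can be summarised by a least-squares slope (illustrative data and method only, not the thesis's models):

```python
def slope(values):
    """Least-squares slope of a series observed at times 0, 1, 2, ..."""
    n = len(values)
    t_mean = (n - 1) / 2
    v_mean = sum(values) / n
    num = sum((t - t_mean) * (v - v_mean) for t, v in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# Two patients end at the same medication count (8), but their
# trajectories differ: one accumulates slowly, one rapidly.
steady = slope([6, 6, 7, 7, 8])   # ≈ 0.5 medications/year
rapid = slope([2, 4, 5, 7, 8])    # ≈ 1.5 medications/year
```

A cross-sectional snapshot at the final visit cannot distinguish these two patients, whereas the slope (and, in the thesis, the full latent-class trajectory) separates the fast-deteriorating subgroup that carried the highest mortality risk.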
Stretch-induced microstructural evolution of electrospun polycaprolactone microfibers for biomedical applications
Abstract: The performance and degradation of polymeric medical yarns depend strongly on their microstructure, which can evolve significantly during fabrication. This work investigates and models how the microstructure of microfibrous electrospun (ES) filaments changes during the critical post-processing step of uniaxial stretching. Specifically, we studied filaments designed for use in a knee ligament regeneration implant, made from biodegradable, semicrystalline polycaprolactone (PCL). Structural changes were characterized at both the fiber and molecular scales. Stretching led to fiber alignment, thinning, and coalescence, as revealed by micro-computed tomography (µCT) and scanning electron microscopy (SEM). At the molecular scale, the crystalline microarchitecture transformed profoundly, as shown by differential scanning calorimetry (DSC), 1D and 2D X-ray diffraction (XRD), and dynamic mechanical thermal analysis (DMTA). Based on these findings, we propose a conceptual model for stretch-induced microstructural evolution: at low strains, chain-folded crystals (CFC) fragment while amorphous chains extend; at higher strains, CFC unfold and recrystallize with extended chains into more thermodynamically stable chain-extended crystals (CEC) aligned with the stretch axis. This mechanism clarifies how uniaxial strain reorganizes semicrystalline domains in PCL, with important implications for the thermomechanical and degradative properties relevant to implant performance. Understanding how microstructure responds to stretching enables the future development of more accurate simulations of complex fibrous materials under physiological conditions and informs the optimization of fabrication and design parameters for next-generation medical yarns.
Characterisation and modelling of continuous electrospun poly(ɛ- caprolactone) filaments for biological tissue repair.
This study investigates the mechanical behaviour of poly(ɛ-caprolactone) (PCL) continuous filaments produced by a novel electrospinning (ES) method. These filaments can be processed into woven or braided structures, showing great promise as scaffolds for ligament and tendon repair. Mechanical characterisation of the filaments using DMA and uniaxial tensile tests shows that the filament response is viscoelastic-viscoplastic. Filaments tested using bollard grips present an initially linear elastic response, followed by plastic yielding with two-stage hardening. The filaments are highly stretchable, reaching more than 1000% strain. The different deformation stages are correlated with the evolution of the micro-fibre network observed using SEM, involving the untangling, alignment and stretching of the fibres. A large deformation viscoelastic-viscoplastic model is proposed, which successfully captures the mechanical response of the filaments under non-monotonic loading conditions. Our study also highlights the sensitivity of the measured mechanical response to the type of mechanical grips, namely bollard or screw-side grips.
The experiences of clinical staff approaching families for organ donation consent: A systematic review and thematic synthesis of qualitative studies.
Healthcare professionals (HCPs) play an essential role in organ donation (OD) particularly when approaching families to discuss consent to OD. We synthesized the evidence on experiences of HCPs when approaching potential organ donor families. Fourteen electronic databases were searched to identify studies describing HCP experiences or associations between HCP experiences and consent rates. Methodological quality was assessed by independent reviewers using the Mixed Methods Appraisal Tool. Qualitative data were analysed using thematic synthesis, while quantitative data were summarized by narrative review. Ninety-two studies were included. HCP experiences were conceptualised as a paradox due to the challenges to negotiate the boundaries between life and death. Organisational and personal aspects broadly shape the experiences of professionals. Studies suggest that staff experiences can be improved by training and education, however, quantitative studies did not show a strong association between OD training and improved consent rates. The complexities of the family approach were evident in the variety of interactions between HCPs and the donor family, which may explain why there is no uniform approach across settings and countries. The review highlights the challenges faced by professionals when negotiating policy and practice and informs recommendations to support staff involved in the OD process worldwide.
The association between early-life gut microbiota and childhood respiratory diseases: a systematic review.
Data from animal models suggest a role of early-life gut microbiota in lung immune development, and in establishing susceptibility to respiratory infections and asthma in humans. This systematic review summarises the association between infant (ages 0-12 months) gut microbiota composition measured by genomic sequencing, and childhood (ages 0-18 years) respiratory diseases (ie, respiratory infections, wheezing, or asthma). Overall, there was evidence that low α-diversity and relative abundance of particular gut-commensal bacteria genera (Bifidobacterium, Faecalibacterium, Ruminococcus, and Roseburia) are associated with childhood respiratory diseases. However, results were inconsistent and studies had important limitations, including insufficient characterisation of bacterial taxa to species level, heterogeneous outcome definitions, residual confounding, and small sample sizes. Large longitudinal studies with stool sampling during the first month of life and shotgun metagenomic approaches to improve bacterial and fungal taxa resolution are needed. Standardising follow-up times and respiratory disease definitions and optimising causal statistical approaches might identify targets for primary prevention of childhood respiratory diseases.
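The α-diversity measure mentioned above is typically a summary such as the Shannon index over taxa proportions. As a minimal illustration only (not code from the review; the genus names and read counts below are made-up), low Shannon diversity corresponds to a community dominated by few taxa:

```python
# Minimal sketch of an α-diversity measure (Shannon index) applied to
# hypothetical genus-level read counts; illustrative only, not from the review.
import math

def shannon_index(counts):
    """Shannon α-diversity: H = -sum(p_i * ln p_i) over nonzero taxa proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical infant stool sample: read counts per bacterial genus
sample = {"Bifidobacterium": 520, "Faecalibacterium": 130,
          "Ruminococcus": 90, "Roseburia": 60, "Escherichia": 200}
print(round(shannon_index(sample.values()), 3))
```

A sample dominated by a single genus would score close to 0, whereas a perfectly even community of five genera would score ln(5) ≈ 1.61.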
Factors predicting poor glycemic control in the first two years of childhood onset type 1 diabetes in a cohort from East London, UK: Analyses using mixed effects fractional polynomial models.
BACKGROUND/OBJECTIVE: Poor early glycemic control in childhood onset type 1 diabetes (T1D) is associated with future risk of acute and chronic complications. Our aim was to identify the predictors of higher glycated hemoglobin (HbA1c) within 24 months of T1D diagnosis in children and adolescents. METHODS: Mixed effects models with fractional polynomials were used to analyze longitudinal data of patients <19 years of age, followed from T1D diagnosis for up to 2 years, at three diabetes clinics in East London, United Kingdom. RESULTS: A total of 2209 HbA1c observations were available for 356 patients (52.5% female; 64.4% non-white), followed from within 3 months of diagnosis during years 2005 to 2015, with a mean ± SD of 6.2 ± 2.5 HbA1c observations/participant. The mean ± SD age and HbA1c at diagnosis were 8.9 ± 4.3 years and 10.7 ± 4.3% (92.9 ± 23.1 mmol/mol), respectively. Over the 2 years following T1D diagnosis, HbA1c levels were mostly above the National Institute for Health and Care Excellence (NICE), UK recommendation of <7.5% (58 mmol/mol). Significant (P < .05) predictors of poor glycemic control were high HbA1c at diagnosis (>9.5%, ie, >80 mmol/mol), clinic site, non-white ethnicity, and period (pre-year 2011) of diagnosis. Additionally, in univariable analyses, frequency of clinic visits, HbA1c at diagnosis, and type of insulin treatment regimen showed association with poor glycemic control (P < .05).
Effect of early glycemic control on HbA1c tracking and development of vascular complications after 5 years of childhood onset type 1 diabetes: Systematic review and meta-analysis.
OBJECTIVE: A systematic review and meta-analysis was conducted to investigate whether glycemic control measured by glycated hemoglobin (HbA1c) levels near diagnosis is predictive of future glycemic outcomes and vascular complications in childhood onset type 1 diabetes (T1D). METHODS: Evidence was gathered using electronic databases (MEDLINE, EMBASE, Web of Science, CINAHL, Scopus, and Cochrane Library up to February 2017) and snowballing techniques. Studies investigating the association between the exposure "early glycemic control" and the main outcome "tracking of early control" and secondary outcome "risk of future complications", in children and young people aged 0 to 19 years at baseline, were systematically double-reviewed, quality assessed, and outcome data extracted for synthesis and meta-analysis. FINDINGS: Five studies (N = 4227 participants) were eligible. HbA1c levels were sub-optimal throughout the study period but tended to stabilize in a "track" by 6 months after T1D diagnosis. The group with low HbA1c <53 mmol/mol (<7%) at baseline had lower long-term HbA1c levels than the higher HbA1c group. The estimated standardized mean difference between the subgroups showed a reduction of HbA1c levels on average by 1.6% (range -0.95% to -2.28%) from baseline. Only one study investigated the association between early glycemic control and development of vascular complications in childhood onset T1D. INTERPRETATIONS: Glycemic control after the first few months of childhood onset T1D remains stable but sub-optimal for a decade. The low and high HbA1c levels at baseline seem to "track" in their respective tracks during the 10-year follow-up; however, the initial difference between groups narrows over time. PROSPERO: CRD42015024546 http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42015024546.
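The standardized mean difference (SMD) pooling behind a meta-analysis like the one above can be sketched in a few lines. This is a hedged illustration only: the per-study group summaries below are made-up numbers, and a fixed-effect inverse-variance pool is assumed (reviews often use a random-effects model instead):

```python
# Hedged illustration (not the review's code): computing per-study standardized
# mean differences (Cohen's d) and pooling them with inverse-variance weights.
import math

def smd(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d with pooled SD, plus its approximate sampling variance."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return d, var

def fixed_effect_pool(effects):
    """Inverse-variance weighted mean of (effect, variance) pairs."""
    weights = [1 / v for _, v in effects]
    pooled = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, se

# Hypothetical per-study HbA1c summaries: low- vs high-baseline-HbA1c groups
studies = [smd(7.8, 1.1, 120, 9.2, 1.3, 110),
           smd(8.0, 1.0, 80, 9.0, 1.2, 95),
           smd(7.5, 0.9, 60, 9.4, 1.4, 70)]
pooled_d, pooled_se = fixed_effect_pool(studies)
print(round(pooled_d, 2), round(pooled_se, 2))
```

A negative pooled SMD here corresponds to the low-baseline group maintaining lower HbA1c over follow-up, the direction of effect reported in the abstract.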
Predictors of glycemic control in the first year of diagnosis of childhood onset type 1 diabetes: A systematic review of quantitative evidence.
Early glycemic control is associated with reduced future vascular complications risk in type 1 diabetes (T1D). The aim of this study was to systematically review evidence on the predictors of glycemic control within 12 months of diagnosis of childhood onset T1D. Inclusion criteria for the electronic search were: interventional and observational studies that assessed and quantified an association between the predictor and glycemic control within 12 months of diagnosis of childhood onset T1D. A total of 17 915 articles were identified from 6 databases and 20 studies were finally included in the analysis. Harvest plots and narrative synthesis were used to summarize data from intervention (n = 0), prospective/retrospective cohort (n = 15), and cross-sectional (n = 5) studies. Significant predictors of poorer glycemic control 0 to 3 months after diagnosis were older age and female gender. Non-white ethnicity, diabetes autoantibody positivity, measures of deprivation, and non-private health insurance were potential predictors. Predictors of poorer glycemic control 4 to 12 months after diagnosis were: older age, non-white ethnicity, a single parent family, high hemoglobin A1c (HbA1c) levels at diagnosis, longer T1D duration, and non-intensive insulin therapy. Potential predictors included: family with health issues, clinical factors, and comorbidities at diagnosis. Most significant predictors of poor glycemic control within 12 months of diagnosis of childhood onset T1D are non-modifiable. These factors need to be recognized and addressed through individualized and multidisciplinary diabetes care. Further research is required to confirm the association of potential predictors with early glycemic control.