Results 1 - 20 of 314
1.
Eye (Lond) ; 2024 May 17.
Article in English | MEDLINE | ID: mdl-38760462

ABSTRACT

The design and development of a sustained-release drug delivery system targeting the administration of active pharmaceutical ingredients (APIs) to the eye could overcome the limitations of topically administered eye drops. Understanding how to modify or design new materials with specific functional properties that promote the attachment and release of specific drugs over longer time periods, alongside understanding clinical needs, can lead to new strategic opportunities to improve treatment options. In this paper we discuss two approaches to the design or modification of materials to produce a sustained therapeutic effect. Firstly, we discuss how the synthesis of a peptide hydrogel from a naturally derived antimicrobial material led to the design of a bandage contact lens that could be used prophylactically to reduce post-surgery infection. Secondly, we discuss how silicone oil tamponade agents used to treat retinal detachments can act adjunctively to enhance the solubility of the anti-proliferative drug retinoic acid and produce a sustained release over several weeks. These studies are the result of close partnerships between clinical ophthalmologists, materials scientists, and chemists, and illustrate how such partnerships can lead to a comprehensive understanding that has the potential to change patient outcomes.

2.
BMC Prim Care ; 25(1): 168, 2024 May 17.
Article in English | MEDLINE | ID: mdl-38760733

ABSTRACT

BACKGROUND: The PaRIS survey, an initiative of the Organisation for Economic Co-operation and Development (OECD), aims to assess health systems performance in delivering primary care by measuring the care experiences and outcomes of people over 45 who used primary care services in the past six months. In addition, linked data from primary care practices are collected to analyse how the organisation of primary care practices and their care processes impact care experiences and outcomes. This article describes the development and validation of the primary care practice questionnaire for the PaRIS survey, the PaRIS-PCPQ. METHOD: The PaRIS-PCPQ was developed based on domains of primary care practice and professional characteristics included in the PaRIS conceptual framework. Questionnaire development was conducted in four phases: (1) a multi-step consensus-based development of the source questionnaire, (2) translation of the English source questionnaire into 17 languages, (3) cross-national cognitive testing with primary care professionals in participating countries, and (4) cross-national field-testing. RESULTS: 70 items were selected from 7 existing questionnaires on primary care characteristics, of which 49 were included in a first draft. Feedback from stakeholders resulted in a modified 34-item version (practice profile, care coordination, chronic care management, patient follow-up, and respondent characteristics) designed to be completed online by medical or non-medical staff working in a primary care practice. Cognitive testing led to changes in the source questionnaire as well as to country-specific localisations. The resulting 32-item questionnaire was piloted in an online survey and field test. Data from 540 primary care practices from 17 countries were collected and analysed. Final revision resulted in a 34-item questionnaire.
CONCLUSIONS: The cross-national development of a primary care practice questionnaire is challenging due to the differences in care delivery systems. Rigorous translation and cognitive testing as well as stakeholder engagement helped to overcome most challenges. The PaRIS-PCPQ will be used to assess how key characteristics of primary care practices relate to the care experiences and outcomes of people living with chronic conditions. As such, policymakers and care providers will be informed about the performance of primary care from the patient's perspective.


Subject(s)
Primary Health Care , Humans , Surveys and Questionnaires , Cross-Cultural Comparison , Reproducibility of Results , Female , Health Care Surveys , Middle Aged
3.
Water Res ; 256: 121612, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38642537

ABSTRACT

Genomic surveillance of SARS-CoV-2 has given insight into the evolution and epidemiology of the virus and its variant lineages during the COVID-19 pandemic. Expanding this approach to include a range of respiratory pathogens can better inform public health preparedness for potential outbreaks and epidemics. Here, we simultaneously sequenced 38 pathogens including influenza viruses, coronaviruses and bocaviruses, to examine the abundance and seasonality of respiratory pathogens in urban wastewater. We deployed a targeted bait capture method and short-read sequencing (Illumina Respiratory Virus Oligos Panel; RVOP) on composite wastewater samples from 8 wastewater treatment plants (WWTPs) and one associated hospital site. By combining seasonal sampling with whole genome sequencing, we were able to concurrently detect and characterise a range of common respiratory pathogens, including SARS-CoV-2, adenovirus and parainfluenza virus. We demonstrated that 38 respiratory pathogens can be detected at low abundances year-round, that hospital pathogen diversity is higher in winter vs. summer sampling events, and that significantly more viruses are detected in raw influent compared to treated effluent samples. Finally, we compared detection sensitivity of RT-qPCR vs. next generation sequencing for SARS-CoV-2, enteroviruses, influenza A/B, and respiratory syncytial viruses. We conclude that both should be used in combination; RT-qPCR allowed accurate quantification, whilst genomic sequencing detected pathogens at lower abundance. We demonstrate the valuable role of wastewater genomic surveillance and its contribution to the field of wastewater-based epidemiology, gaining rapid understanding of the seasonal presence and persistence for common respiratory pathogens. By simultaneously monitoring seasonal trends and early warning signs of many viruses circulating in communities, public health agencies can implement targeted prevention and rapid response plans.
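RT-qPCR quantification, compared against sequencing above, typically relies on a log-linear standard curve relating quantification cycle (Cq) to target copy number. A minimal sketch of that arithmetic; the slope and intercept values are hypothetical, not taken from this study:

```python
def copies_from_cq(cq, slope=-3.32, intercept=37.0):
    """Copies per reaction from a qPCR standard curve:
    Cq = slope * log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def amplification_efficiency(slope=-3.32):
    """Efficiency implied by the slope; -3.32 corresponds to ~100%
    efficiency (perfect doubling each cycle)."""
    return 10 ** (-1 / slope) - 1

# A hypothetical sample at Cq = 30.36 on this curve holds ~100 copies/reaction.
print(round(copies_from_cq(30.36)))          # → 100
print(round(amplification_efficiency(), 3))  # → 1.001, i.e. ~100%
```

Sequencing detects lower-abundance targets than this curve resolves, which is why the abstract recommends combining both methods.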


Subject(s)
Wastewater , Wastewater/virology , SARS-CoV-2/genetics , SARS-CoV-2/isolation & purification , Humans , High-Throughput Nucleotide Sequencing/methods , COVID-19/virology , COVID-19/epidemiology , Seasons
4.
FEMS Microbes ; 5: xtae007, 2024.
Article in English | MEDLINE | ID: mdl-38544682

ABSTRACT

Wastewater-based epidemiology is now widely used in many countries for the routine monitoring of SARS-CoV-2 and other viruses at a community level. However, efficient sample processing technologies are still under investigation. In this study, we compared the performance of the novel Nanotrap® Microbiome Particles (NMP) concentration method to the commonly used polyethylene glycol (PEG) precipitation method for concentrating viruses from wastewater and their subsequent quantification and sequencing. For this, we first spiked wastewater with SARS-CoV-2, influenza and measles viruses, and norovirus and found that the NMP method recovered 0.4%-21% of them depending on virus type, providing consistent and reproducible results. Using the NMP and PEG methods, we monitored SARS-CoV-2, influenza A and B viruses, RSV, enteroviruses, norovirus GI and GII, and crAssphage in wastewater using quantitative PCR (qPCR)-based methods and next-generation sequencing. Good viral recoveries were observed for highly abundant viruses using both methods; however, PEG precipitation was more successful in the recovery of low-abundance viruses present in wastewater. Furthermore, samples processed with PEG precipitation were more successfully sequenced for SARS-CoV-2 than those processed with the NMP method. Virus recoveries were enhanced by high sample volumes when PEG precipitation was applied. Overall, our results suggest that the NMP concentration method is a rapid and easy virus concentration method for viral targets that are abundant in wastewater, whereas PEG precipitation may be better suited to the recovery and analysis of low-abundance viruses and to next-generation sequencing.
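The recovery figures quoted above (0.4%-21% of spiked virus) are a simple ratio of recovered to spiked material; a minimal sketch with hypothetical copy numbers:

```python
def percent_recovery(recovered_copies, spiked_copies):
    """Recovery efficiency of a concentration step, as a percentage of
    the known amount of virus spiked into the wastewater sample."""
    if spiked_copies <= 0:
        raise ValueError("spiked copy number must be positive")
    return 100.0 * recovered_copies / spiked_copies

# Hypothetical spike: 1e6 genome copies added, 2.1e4 measured after concentration.
print(percent_recovery(2.1e4, 1e6))  # → 2.1 (% recovery)
```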

5.
Syst Biol ; 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38490727

ABSTRACT

Across the Tree of Life, most studies of phenotypic disparity and diversification have been restricted to adult organisms. However, many lineages have distinct ontogenetic phases that differ from their adult forms in morphology and ecology. Focusing disproportionately on the evolution of adult forms unnecessarily hinders our understanding of the pressures shaping evolution over time. Non-adult disparity patterns are particularly important to consider for coastal ray-finned fishes, which often have juvenile phases with distinct phenotypes. These juvenile forms are often associated with sheltered nursery environments, with phenotypic shifts between adults and juvenile stages that are readily apparent in locomotor morphology. Whether this ontogenetic variation in locomotor morphology reflects a decoupling of diversification dynamics between life stages remains unknown. Here we investigate the evolutionary dynamics of locomotor morphology between adult and juvenile triggerfishes. We integrate a time-calibrated phylogenetic framework with geometric morphometric approaches and measurement data of fin aspect ratio and incidence, and reveal a mismatch between morphospace occupancy, the evolution of morphological disparity, and the tempo of trait evolution between life stages. Collectively, our results illuminate how the heterogeneity of morpho-functional adaptations can decouple the mode and tempo of morphological diversification between ontogenetic stages.
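Fin aspect ratio, one of the locomotor traits measured above, is conventionally computed as fin span squared divided by fin planform area; a minimal sketch (the example values are hypothetical):

```python
def fin_aspect_ratio(span, area):
    """Aspect ratio AR = span**2 / area; high-AR fins are long and narrow
    (suited to efficient cruising), low-AR fins are broad and rounded
    (suited to manoeuvring)."""
    if area <= 0:
        raise ValueError("fin area must be positive")
    return span ** 2 / area

# Hypothetical fins of equal area but different span (cm, cm^2).
print(fin_aspect_ratio(4.0, 4.0))  # → 4.0
print(fin_aspect_ratio(2.0, 4.0))  # → 1.0
```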

6.
Indian J Orthop ; 58(3): 250-256, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38425831

ABSTRACT

Aim: To facilitate the posterolateral approach to the posterior malleolus, patients are often positioned prone initially, then turned supine to complete fixation at the medial malleolus. We sought to define observed differences in the radiographic appearance of implants relative to the joint line in the prone and supine positions. Methods: A 3.5 mm tubular plate and a 3.5 mm posterior distal tibial periarticular plate were applied sequentially to 3 individual cadaveric legs via a posterolateral approach. The tubular plate was positioned to simulate buttress fixation and the posterolateral plate was placed more distally. Each limb was secured on a custom jig and radiographs were taken on a mobile C-arm fluoroscopy machine with a calibration ball. A series of prone AP, supine PA and mortise radiographs were taken. Prone radiographs were also taken at different degrees of caudal tilt to simulate the knee flexion which occurs in practice during intraoperative positioning. Plate tip-joint line distances were measured and Mann-Whitney U tests performed. Results: There was no statistically significant difference in plate tip-joint line distance when comparing equivalent prone and supine views (PA/AP or mortise). However, significant differences in apparent implant position were noted with alterations in caudal tilt. When taking a prone image with the knee flexed to 20 degrees, the plate tip will appear 6.5-8.5 mm more proximal than in the equivalent supine image where the knee is extended and the fluoroscopy beam is orthogonal to the anatomic axis of the tibia. Conclusion: Observed differences in the radiographic appearance of metalwork in the prone and supine positions are most likely due to knee flexion and the resulting variation in the angle of the fluoroscopy beam, rather than projectional differences between supine and prone views. Surgeons should be alert to this when analysing intraoperative images.

7.
Nat Commun ; 15(1): 1652, 2024 Feb 23.
Article in English | MEDLINE | ID: mdl-38396069

ABSTRACT

Viral clearance, antibody response and the mutagenic effect of molnupiravir have not been elucidated in at-risk populations. Non-hospitalised participants within 5 days of SARS-CoV-2 symptoms randomised to receive molnupiravir (n = 253) or Usual Care (n = 324) were recruited to study viral and antibody dynamics and the effect of molnupiravir on viral whole genome sequence from 1437 viral genomes. Molnupiravir accelerates viral load decline, but virus is detectable by Day 5 in most cases. At Day 14 (9 days post-treatment), molnupiravir is associated with significantly higher viral persistence and significantly lower anti-SARS-CoV-2 spike antibody titres compared to Usual Care. Serial sequencing reveals increased mutagenesis with molnupiravir treatment. Persistence of detectable viral RNA at Day 14 in the molnupiravir group is associated with higher transition mutations following treatment cessation. Viral viability at Day 14 is similar in both groups, with post-molnupiravir treated samples cultured up to 9 days post cessation of treatment. The current 5-day molnupiravir course is too short. Longer courses should be tested to reduce the risk of potentially transmissible molnupiravir-mutated variants being generated. Trial registration: ISRCTN30448031.


Subject(s)
COVID-19 , Cytidine/analogs & derivatives , Hydroxylamines , SARS-CoV-2 , Adult , Humans , SARS-CoV-2/genetics , Outpatients , Antibody Formation , Antibodies, Viral , Antiviral Agents/therapeutic use
8.
NPJ Digit Med ; 7(1): 33, 2024 Feb 12.
Article in English | MEDLINE | ID: mdl-38347090

ABSTRACT

Digital measures of health status captured during daily life could greatly augment current in-clinic assessments for rheumatoid arthritis (RA), to enable better assessment of disease progression and impact. This work presents results from weaRAble-PRO, a 14-day observational study, which aimed to investigate how digital health technologies (DHT), such as smartphones and wearables, could augment patient reported outcomes (PRO) to determine RA status and severity in a study of 30 moderate-to-severe RA patients, compared to 30 matched healthy controls (HC). Sensor-based measures of health status, mobility, dexterity, fatigue, and other RA-specific symptoms were extracted from daily iPhone guided tests (GT), as well as from actigraphy and heart rate sensor data, which were passively recorded from patients' Apple smartwatches continuously over the study duration. We subsequently developed a machine learning (ML) framework to distinguish RA status and to estimate RA severity. It was found that daily wearable sensor-outcomes robustly distinguished RA from HC participants (F1, 0.807). Furthermore, by day 7 of the study (half-way), a sufficient volume of data had been collected to reliably capture the characteristics of RA participants. In addition, we observed that the detection of RA severity levels could be improved by augmenting standard patient reported outcomes with sensor-based features (F1, 0.833) in comparison to using PRO assessments alone (F1, 0.759), and that the combination of modalities could reliably measure continuous RA severity, as determined by the clinician-assessed RAPID-3 score at baseline (r2, 0.692; RMSE, 1.33). The ability to measure the impact of the disease during daily life, through objective and remote digital outcomes, paves the way forward to enable the development of more patient-centric and personalised measurements for use in RA clinical trials.
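The F1 scores quoted above are the harmonic mean of precision and recall; a self-contained sketch on hypothetical labels (1 = RA participant, 0 = healthy control):

```python
def f1_score(y_true, y_pred):
    """Binary F1: harmonic mean of precision and recall for the positive class."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical classifier output for 8 participants.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 1, 0]
print(f1_score(y_true, y_pred))  # → 0.75
```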

9.
Cereb Cortex ; 34(2)2024 01 31.
Article in English | MEDLINE | ID: mdl-38300181

ABSTRACT

Humans are often tasked with determining the degree to which a given situation poses threat. Salient cues present during prior events help bring online memories for context, which plays an informative role in this process. However, it is relatively unknown whether and how individuals use features of the environment to retrieve context memories for threat, enabling accurate inferences about the current level of danger/threat (i.e. retrieve appropriate memory) when there is a degree of ambiguity surrounding the present context. We leveraged computational neuroscience approaches (i.e. independent component analysis and multivariate pattern analyses) to decode large-scale neural network activity patterns engaged during learning and inferring threat context during a novel functional magnetic resonance imaging task. Here, we report that individuals accurately infer threat contexts under ambiguous conditions through neural reinstatement of large-scale network activity patterns (specifically striatum, salience, and frontoparietal networks) that track the signal value of environmental cues, which, in turn, allows reinstatement of a mental representation, primarily within a ventral visual network, of the previously learned threat context. These results provide novel insight into distinct, but overlapping, neural mechanisms by which individuals may utilize prior learning to effectively make decisions about ambiguous threat-related contexts as they navigate the environment.


Subject(s)
Cues , Learning , Humans , Multivariate Analysis , Magnetic Resonance Imaging , Neural Networks, Computer
10.
Epidemiol Infect ; 152: e31, 2024 Feb 08.
Article in English | MEDLINE | ID: mdl-38329110

ABSTRACT

Wastewater-based epidemiology (WBE) has proven to be a powerful tool for the population-level monitoring of pathogens, particularly severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). For assessment, several wastewater sampling regimes and methods of viral concentration have been investigated, mainly targeting SARS-CoV-2. However, the use of passive samplers in near-source environments for a range of viruses in wastewater is still under-investigated. To address this, near-source passive samples were taken at four locations targeting student halls of residence. These were chosen as an exemplar due to their high population density and perceived risk of disease transmission. Viruses investigated were SARS-CoV-2 and its variants of concern (VOCs), influenza viruses, and enteroviruses. Sampling was conducted either in the morning, with passive samplers in place overnight (17 h), or during the day, with an exposure of 7 h. We demonstrated the usefulness of near-source passive sampling for the detection of VOCs using quantitative polymerase chain reaction (qPCR) and next-generation sequencing (NGS). Furthermore, several outbreaks of influenza A and sporadic outbreaks of enteroviruses (some associated with enterovirus D68 and coxsackieviruses) were identified among the resident student population, providing evidence of the usefulness of near-source, in-sewer sampling for monitoring the health of high population density communities.


Subject(s)
Enterovirus Infections , Wastewater , Humans , Universities , Disease Outbreaks , Antigens, Viral , SARS-CoV-2 , RNA, Viral
11.
JCO Precis Oncol ; 8: e2300453, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38412388

ABSTRACT

PURPOSE: Establishing accurate age-related penetrance figures for the broad range of cancer types that occur in individuals harboring a pathogenic germline variant in the TP53 gene is essential to determine the most effective clinical management strategies. These figures also permit optimal use of cosegregation data for classification of TP53 variants of unknown significance. Penetrance estimation can easily be affected by bias from ascertainment criteria, an issue not commonly addressed by previous studies. MATERIALS AND METHODS: We performed a maximum likelihood penetrance estimation using full pedigree data from a multicenter study of 146 TP53-positive families, incorporating adjustment for the effect of ascertainment and population-specific background cancer risks. The analysis included pedigrees from Australia, Spain, and the United States, with phenotypic information for 4,028 individuals. RESULTS: Core Li-Fraumeni syndrome (LFS) cancers (breast cancer, adrenocortical carcinoma, brain cancer, osteosarcoma, and soft tissue sarcoma) had the highest hazard ratios of all cancers analyzed in this study. The analysis also detected a significantly increased lifetime risk for a range of cancers not previously formally associated with TP53 pathogenic variant status, including colorectal, gastric, lung, pancreatic, and ovarian cancers. The cumulative risk of any cancer type by age 50 years was 92.4% (95% CI, 82.2 to 98.3) for females and 59.7% (95% CI, 39.9 to 81.3) for males. Females had a 63.3% (95% CI, 35.6 to 90.1) cumulative risk of developing breast cancer by age 50 years. CONCLUSION: The results from maximum likelihood analysis confirm the known high lifetime risk for the core LFS-associated cancer types, providing new risk estimates, and indicate significantly increased lifetime risks for several additional cancer types. Accurate cancer risk estimates will help refine clinical recommendations for TP53 pathogenic variant carriers and improve TP53 variant classification.
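For intuition on how hazard estimates translate into cumulative risks like those reported above, the survival relationship for a piecewise-constant hazard is risk = 1 - exp(-sum of hazard x interval length). A sketch with hypothetical hazards, not the study's estimates:

```python
import math

def cumulative_risk(piecewise_hazards):
    """Cumulative risk by the end of the covered age range, from
    (hazard per person-year, interval length in years) pairs:
    risk = 1 - exp(-sum(h_i * d_i))."""
    cumulative_hazard = sum(h * d for h, d in piecewise_hazards)
    return 1.0 - math.exp(-cumulative_hazard)

# Hypothetical hazards for ages 0-20, 20-40 and 40-50 years.
hazards = [(0.005, 20), (0.02, 20), (0.05, 10)]
print(round(cumulative_risk(hazards), 3))  # → 0.632
```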


Subject(s)
Breast Neoplasms , Li-Fraumeni Syndrome , Male , Female , Humans , United States , Middle Aged , Li-Fraumeni Syndrome/diagnosis , Li-Fraumeni Syndrome/genetics , Genes, p53/genetics , Pedigree , Tumor Suppressor Protein p53/genetics , Genetic Predisposition to Disease/genetics , Breast Neoplasms/genetics , Risk Factors
12.
Clin Immunol ; 259: 109901, 2024 02.
Article in English | MEDLINE | ID: mdl-38218209

ABSTRACT

Chronic human norovirus (HuNoV) infections in immunocompromised patients result in severe disease, yet approved antivirals are lacking. RNA-dependent RNA polymerase (RdRp) inhibitors inducing viral mutagenesis display broad-spectrum in vitro antiviral activity, but clinical efficacy in HuNoV infections is anecdotal and the potential emergence of drug-resistant variants is concerning. Upon favipiravir (and nitazoxanide) treatment of four immunocompromised patients with life-threatening HuNoV infections, viral whole-genome sequencing showed accumulation of favipiravir-induced mutations which coincided with clinical improvement although treatment failed to clear HuNoV. Infection of zebrafish larvae demonstrated drug-associated loss of viral infectivity and favipiravir treatment showed efficacy despite occurrence of RdRp variants potentially causing favipiravir resistance. This indicates that within-host resistance evolution did not reverse loss of viral fitness caused by genome-wide accumulation of sequence changes. This off-label approach supports the use of mutagenic antivirals for treating prolonged RNA viral infections and further informs the debate surrounding their impact on virus evolution.


Subject(s)
Amides , Norovirus , Pyrazines , Viruses , Animals , Humans , Norovirus/genetics , Antiviral Agents/pharmacology , Antiviral Agents/therapeutic use , Zebrafish , Mutagenesis , RNA-Dependent RNA Polymerase/genetics , Immunocompromised Host
13.
Emerg Infect Dis ; 30(1): 163-167, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38063078

ABSTRACT

We detected a novel GII.4 variant with an amino acid insertion at the start of epitope A in viral protein 1 of noroviruses from the United States, Gabon, South Africa, and the United Kingdom collected during 2017-2022. Early identification of GII.4 variants is crucial for assessing pandemic potential and informing vaccine development.


Subject(s)
Caliciviridae Infections , Gastroenteritis , Norovirus , Humans , Gastroenteritis/epidemiology , Norovirus/genetics , Caliciviridae Infections/epidemiology , Genotype , Pandemics , Phylogeny
14.
J Am Vet Med Assoc ; 262(1): 1-6, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37918106

ABSTRACT

OBJECTIVE: To report the rate of surgical site infections (SSIs) after clean-contaminated and dirty gastrointestinal surgery in dogs and cats that did and did not receive incisional infiltration of Nocita and report the bacteria isolated. ANIMALS: Client-owned dogs (n = 211) and cats (78). METHODS: Records of dogs and cats that underwent gastrointestinal surgery at the Matthew J. Ryan Veterinary Hospital of the University of Pennsylvania and the University of Florida Small Animal Hospital between July 1, 2020, and April 1, 2023, were reviewed for surgical procedures, presence of preoperative septic peritonitis, use of Nocita, perioperative antibiotics administered, postoperative antibiotic use, SSI development postoperatively, and aerobic bacteria isolated. RESULTS: 7 of 124 (5.6%) dogs that received Nocita and 9 of 87 (10.2%) that did not receive Nocita developed an SSI. No dogs presenting with septic peritonitis and given Nocita (n = 5) developed an SSI. Two of 55 (3.6%) cats that received Nocita and 1 of 23 (4%) that did not receive Nocita developed an SSI. Multidrug-resistant (MDR) Escherichia coli was the most common aerobic bacteria isolated from SSIs (n = 3), and MDR bacteria were isolated commonly from both groups (4). CLINICAL RELEVANCE: Use of Nocita for gastrointestinal surgery in dogs and cats is not associated with higher rates of SSI than published rates of SSI after gastrointestinal surgery. Use of Nocita in dogs with preoperative septic peritonitis is not associated with the development of SSI. MDR bacteria are commonly isolated via culture from both dogs that received Nocita and those that did not.


Subject(s)
Anesthetics , Cat Diseases , Digestive System Surgical Procedures , Dog Diseases , Peritonitis , Humans , Cats , Dogs , Animals , Surgical Wound Infection/epidemiology , Surgical Wound Infection/veterinary , Digestive System Surgical Procedures/veterinary , Cat Diseases/surgery , Dog Diseases/surgery , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Bupivacaine , Peritonitis/veterinary , Retrospective Studies
15.
Health Technol Assess ; 27(33): 1-97, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38149666

ABSTRACT

Background: Lumbar puncture is an essential tool for diagnosing meningitis. Neonatal lumbar puncture, although frequently performed, has low success rates (50-60%). Standard technique includes lying infants on their side and removing the stylet 'late', that is, after the needle is thought to have entered the cerebrospinal fluid. Modifications to this technique include holding infants in the sitting position and removing the stylet 'early', that is, following transection of the skin. To the best of our knowledge, modified techniques have not previously been tested in adequately powered trials. Objectives: The aim of the Neonatal Champagne Lumbar punctures Every time - An RCT (NeoCLEAR) trial was to compare two modifications to standard lumbar puncture technique, that is, use of the lying position rather than the sitting position and of 'early' rather than 'late' stylet removal, in terms of success rates and short-term clinical, resource and safety outcomes. Methods: This was a multicentre 2 × 2 factorial pragmatic non-blinded randomised controlled trial. Infants requiring lumbar puncture (with a working weight ≥ 1000 g and corrected gestational age from 27+0 to 44+0 weeks), and whose parents provided written consent, were randomised by web-based allocation to lumbar puncture (1) in the sitting or lying position and (2) with early or late stylet removal. The trial was powered to detect a 10% absolute risk difference in the primary outcome, that is, the percentage of infants with a successful lumbar puncture (cerebrospinal fluid containing < 10,000 red cells/mm3). The primary outcome was analysed by modified intention to treat. Results: Of 1082 infants randomised (sitting with early stylet removal, n = 275; sitting with late stylet removal, n = 271; lying with early stylet removal, n = 274; lying with late stylet removal, n = 262), 1076 were followed up until discharge. 
Most infants were term born (950/1076, 88.3%) and were aged < 3 days (936/1076, 87.0%) with a working weight > 2.5 kg (971/1076, 90.2%). Baseline characteristics were balanced across groups. In terms of the primary outcome, the sitting position was significantly more successful than lying [346/543 (63.7%) vs. 307/533 (57.6%), adjusted risk ratio 1.10 (95% confidence interval 1.01 to 1.21); p = 0.029; number needed to treat = 16 (95% confidence interval 9 to 134)]. There was no significant difference in the primary outcome between early stylet removal and late stylet removal [338/545 (62.0%) vs. 315/531 (59.3%), adjusted risk ratio 1.04 (95% confidence interval 0.94 to 1.15); p = 0.447]. Resource consumption was similar in all groups, and all techniques were well tolerated and safe. Limitations: This trial predominantly recruited term-born infants who were < 3 days old, with working weights > 2.5 kg. The impact of practitioners' seniority and previous experience of different lumbar puncture techniques was not investigated. Limited data on resource use were captured, and parent/practitioner preferences were not assessed. Conclusion: Lumbar puncture success rate was higher with infants in the sitting position but was not affected by timing of stylet removal. Lumbar puncture is a safe, well-tolerated and simple technique without additional cost, and is easily learned and applied. The results support a paradigm shift towards sitting technique as the standard position for neonatal lumbar puncture, especially for term-born infants during the first 3 days of life. Future work: The superiority of the sitting lumbar puncture technique should be tested in larger populations of premature infants, in those aged > 3 days and outside neonatal care settings. The effect of operators' previous practice and the impact on family experience also require further investigation, alongside in-depth analyses of healthcare resource utilisation. 
Future studies should also investigate other factors affecting lumbar puncture success, including further modifications to standard technique. Trial registration: This trial is registered as ISRCTN14040914 and as Integrated Research Application System registration 223737. Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: 15/188/106) and is published in full in Health Technology Assessment; Vol. 27, No. 33. See the NIHR Funding and Awards website for further award information.
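As a check on the headline result, the unadjusted risk ratio and number needed to treat follow directly from the reported counts (the published risk ratio of 1.10 is covariate-adjusted, so the raw figure differs slightly):

```python
def risk_ratio_and_nnt(events_a, n_a, events_b, n_b):
    """Unadjusted risk ratio (group a vs. group b) and number needed
    to treat, NNT = 1 / absolute risk difference."""
    risk_a = events_a / n_a
    risk_b = events_b / n_b
    return risk_a / risk_b, 1.0 / (risk_a - risk_b)

# Successful lumbar punctures: sitting 346/543 vs. lying 307/533.
rr, nnt = risk_ratio_and_nnt(346, 543, 307, 533)
print(round(rr, 2), round(nnt))  # → 1.11 16
```

The rounded NNT of 16 matches the trial's reported figure.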


Newborn babies are more susceptible to meningitis, which can be fatal or lead to lifelong complications. A lumbar puncture is an essential test for diagnosing meningitis. Lumbar puncture involves taking a small amount of spinal fluid from the lower back using a needle. Analysing the fluid confirms or excludes meningitis, allowing the right treatment to be given. Lumbar punctures are commonly performed in newborns, but are technically difficult. In 50-60% of lumbar punctures in newborns, either no fluid is obtained or the sample is mixed with blood, making analysis less reliable. No-one knows which is the best technique, and so practice varies. The baby can be held lying on their side or sat up, and the 'stylet', which is a thin piece of metal that sits inside (and aids insertion of) the needle, can be removed either soon after passing through the skin (i.e. 'early stylet removal') or once the tip is thought to have reached the spinal fluid (i.e. 'late stylet removal'). We wanted to find the best technique for lumbar puncture in newborns. Therefore, we compared the sitting with the lying position, and 'early' with 'late' stylet removal. We carried out a large trial in newborn care and maternity wards in 21 UK hospitals. With parental consent, we recruited 1082 full-term and premature babies who needed a lumbar puncture. Our results demonstrated that the sitting position was more successful than the lying position, but the timing of stylet removal did not affect success. In summary, the sitting position is an inexpensive, safe, well-tolerated and easily learned way to improve lumbar puncture success rates in newborns. Our results strongly support using this technique in newborn babies worldwide.


Subject(s)
Infant, Premature , Spinal Puncture , Humans , Infant , Infant, Newborn , Intention , Spinal Puncture/adverse effects , Technology Assessment, Biomedical
16.
Nat Commun ; 14(1): 7295, 2023 11 13.
Article in English | MEDLINE | ID: mdl-37957154

ABSTRACT

Mutations in SNCA, the gene encoding α-synuclein (αSyn), cause familial Parkinson's disease (PD) and aberrant αSyn is a key pathological hallmark of idiopathic PD. This α-synucleinopathy leads to mitochondrial dysfunction, which may drive dopaminergic neurodegeneration. PARKIN and PINK1, mutated in autosomal recessive PD, regulate the preferential autophagic clearance of dysfunctional mitochondria ("mitophagy") by inducing ubiquitylation of mitochondrial proteins, a process counteracted by deubiquitylation via USP30. Here we show that loss of USP30 in Usp30 knockout mice protects against behavioral deficits and leads to increased mitophagy, decreased phospho-S129 αSyn, and attenuation of substantia nigra (SN) dopaminergic neuronal loss induced by αSyn. These observations were recapitulated with a potent, selective, brain-penetrant USP30 inhibitor, MTX115325, with good drug-like properties. These data strongly support further study of USP30 inhibition as a potential disease-modifying therapy for PD.


Subject(s)
Parkinson Disease , Thiolester Hydrolases , Animals , Mice , alpha-Synuclein/genetics , alpha-Synuclein/metabolism , Dopaminergic Neurons/metabolism , Mice, Knockout , Mitochondria/metabolism , Parkinson Disease/metabolism , Ubiquitin-Protein Ligases/genetics , Ubiquitin-Protein Ligases/metabolism , Ubiquitination , Thiolester Hydrolases/genetics
17.
Orphanet J Rare Dis ; 18(1): 360, 2023 Nov 16.
Article in English | MEDLINE | ID: mdl-37974153

ABSTRACT

BACKGROUND: Hypoketotic hypoglycaemia with suppressed plasma fatty acids and detectable insulin suggests congenital hyperinsulinism (CHI). Severe hypoketotic hypoglycaemia mimicking hyperinsulinism but without detectable insulin has recently been described in syndromic individuals with mosaic genetic activation of post-receptor insulin signalling. We set out to expand understanding of this entity, focusing on metabolic phenotypes. METHODS: Metabolic profiling, candidate gene and exome sequencing were performed in six infants with hypoketotic, hypoinsulinaemic hypoglycaemia, with or without syndromic features. Additional signalling studies were carried out in dermal fibroblasts from two individuals. RESULTS: Two infants had no syndromic features. One was mistakenly diagnosed with CHI. One had mild features of megalencephaly-capillary malformation-polymicrogyria (MCAP) syndrome, one had non-specific macrosomia, and two had complex syndromes. All required intensive treatment to maintain euglycaemia, with CHI-directed therapies being ineffective. Pathogenic PIK3CA variants were found in two individuals - de novo germline c.323G>A (p.Arg108His) in one non-syndromic infant and postzygotic mosaic c.2740G>A (p.Gly914Arg) in the infant with MCAP. No causal variants were proven in the other individuals despite extensive investigation, although rare variants in mTORC components were identified in one. No increase in PI3K signalling was seen in fibroblasts from the two individuals studied. CONCLUSIONS: We expand the spectrum of PI3K-related hypoinsulinaemic hypoketotic hypoglycaemia. We demonstrate that pathogenic germline variants activating post-insulin-receptor signalling may cause non-syndromic hypoinsulinaemic hypoketotic hypoglycaemia closely resembling CHI. This distinct biochemical footprint should be sought and differentiated from CHI in infantile hypoglycaemia. To facilitate adoption of this differential diagnosis, we propose the term "pseudohyperinsulinism".


Subject(s)
Congenital Hyperinsulinism , Proto-Oncogene Proteins c-akt , Infant , Humans , Proto-Oncogene Proteins c-akt/genetics , Insulin , Congenital Hyperinsulinism/genetics , Phosphatidylinositol 3-Kinases/metabolism
18.
Environ Sci Pollut Res Int ; 30(59): 123785-123795, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37989946

ABSTRACT

Wastewater-based epidemiology (WBE) has been commonly used for monitoring SARS-CoV-2 outbreaks. As sampling times and methods (i.e. grab vs composite) may vary, diurnal changes of viral concentrations in sewage should be better understood. In this study, we collected untreated wastewater samples hourly for 4 days at two wastewater treatment plants in Wales to establish diurnal patterns in virus concentrations and the physico-chemical properties of the water. Simultaneously, we also trialled three absorbent materials as passive samplers, a simple and cost-efficient alternative to the collection of composite samples. Ninety-six percent of all liquid samples (n = 74) and 88% of the passive samplers (n = 59) were positive for SARS-CoV-2, whereas 87% and 97% of the liquid and passive samples were positive for the faecal indicator virus crAssphage, respectively. We found no significant daily variations in the concentration of the target viruses, ammonium and orthophosphate, and the pH and electrical conductivity levels were also stable. Weak positive correlations were found between some physico-chemical properties and viral concentrations. More variation was observed in samples taken from the influent stream as opposed to those taken from the influent tank. Of the absorbent materials trialled as passive samplers, we found that tampons provided higher viral recoveries than electronegative filter paper and cotton gauze swabs. For all materials tested, viral recovery was dependent on the virus type. Our results indicate that grab samples may provide representative alternatives to 24-h composite samples if taken from the influent tank, hence reducing the costs of sampling for WBE programmes. Tampons are also viable alternatives for cost-efficient sampling; however, viral recovery should be optimised prior to use.


Subject(s)
Viruses , Wastewater , SARS-CoV-2 , Sewage , Wastewater-Based Epidemiological Monitoring
19.
Sci Rep ; 13(1): 18311, 2023 10 25.
Article in English | MEDLINE | ID: mdl-37880288

ABSTRACT

Rheumatoid arthritis (RA) is a fluctuating progressive disease requiring frequent symptom assessment for appropriate management. Continuous tracking using digital technologies may provide greater insight into a patient's experience. This prospective study assessed the feasibility, reliability, and clinical utility of using novel digital technologies to remotely monitor participants with RA. Participants with moderate to severe RA and non-RA controls were monitored continuously for 14 days using an iPhone with an integrated bespoke application and an Apple Watch. Participants completed patient-reported outcome measures and objective guided tests designed to assess disease-related impact on physical function. The study was completed by 28 participants with RA, 28 matched controls, and 2 unmatched controls. Completion rates for all assessments were > 97% and were reproducible over time. Several guided tests distinguished between RA and control cohorts (e.g., mean lie-to-stand time [seconds]: RA: 4.77, control: 3.25; P < 0.001). Participants with RA reporting greater stiffness, pain, and fatigue had worse guided test performances (e.g., wrist movement [P < 0.001] and sit-to-stand transition time [P = 0.009]) compared with those reporting lower stiffness, pain, and fatigue. This study demonstrates that digital technologies can be used in a well-controlled, remote clinical setting to assess the daily impact of RA.


Subject(s)
Arthritis, Rheumatoid , Mobile Applications , Humans , Prospective Studies , Reproducibility of Results , Arthritis, Rheumatoid/diagnosis , Pain , Fatigue/diagnosis , Patient-Centered Care
20.
Elife ; 12, 2023 09 21.
Article in English | MEDLINE | ID: mdl-37732733

ABSTRACT

Accurate inference of who infected whom in an infectious disease outbreak is critical for the delivery of effective infection prevention and control. The increased resolution of pathogen whole-genome sequencing has significantly improved our ability to infer transmission events. Despite this, transmission inference often remains limited by the lack of genomic variation between the source case and infected contacts. Although within-host genetic diversity is common among a wide variety of pathogens, conventional whole-genome sequencing phylogenetic approaches exclusively use consensus sequences, which consider only the most prevalent nucleotide at each position and therefore fail to capture low-frequency variation within samples. We hypothesized that including within-sample variation in a phylogenetic model would help to identify who infected whom in instances in which this was previously impossible. Using whole-genome sequences from SARS-CoV-2 multi-institutional outbreaks as an example, we show how within-sample diversity is partially maintained among repeated serial samples from the same host, how it can be transmitted between cases with known epidemiological links, and how this improves phylogenetic inference and our understanding of who infected whom. Our technique is applicable to other infectious diseases and has immediate clinical utility in infection prevention and control.


During an infectious disease outbreak, tracing who infected whom allows public health scientists to see how a pathogen is spreading and to establish effective control measures. Traditionally, this involves identifying the individuals an infected person comes into contact with and monitoring whether they also become unwell. However, this information is not always available and can be inaccurate. One alternative is to track the genetic data of pathogens as they spread. Over time, pathogens accumulate mutations in their genes that can be used to distinguish them from one another. Genetically similar pathogens are more likely to have spread during the same outbreak, while genetically dissimilar pathogens may have come from different outbreaks. However, there are limitations to this approach. For example, some pathogens accumulate genetic mutations very slowly and may not change enough during an outbreak to be distinguishable from one another. Additionally, some pathogens can spread rapidly, leaving less time for mutations to occur between transmission events. To overcome these challenges, Torres Ortiz et al. developed a more sensitive approach to pathogen genetic testing that took advantage of the multiple pathogen populations that often coexist in an infected patient. Rather than tracking only the most dominant genetic version of the pathogen, this method also looked at the less dominant ones. Torres Ortiz et al. performed genome sequencing of SARS-CoV-2 (the virus that causes COVID-19) samples from 451 healthcare workers, patients, and patient contacts at participating London hospitals. Analysis showed that it was possible to detect multiple genetic populations of the virus within individual patients. These subpopulations were often more similar in patients that had been in contact with one another than in those that had not. Tracking the genetic data of all viral populations enabled Torres Ortiz et al. to trace transmission more accurately than if only the dominant population was used. More accurate genetic tracing could help public health scientists better track pathogen transmission and control outbreaks. This may be especially beneficial in hospital settings where outbreaks can be smaller, and it is important to understand if transmission is occurring within the hospital or if the pathogen is imported from the community. Further research will help scientists understand how pathogen population genetics evolve during outbreaks and may improve the detection of subpopulations present at very low frequencies.


Subject(s)
COVID-19 , Communicable Diseases , Humans , SARS-CoV-2/genetics , Phylogeny , COVID-19/epidemiology , Disease Outbreaks , Communicable Diseases/epidemiology