Results 1 - 20 of 99
1.
J Crit Care ; 82: 154809, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38609773

ABSTRACT

PURPOSE: A positive fluid balance (FB) is associated with harm in intensive care unit (ICU) patients with acute kidney injury (AKI). We aimed to understand how a positive balance develops in such patients. METHODS: Multinational, retrospective cohort study of critically ill patients with AKI not requiring renal replacement therapy. RESULTS: AKI occurred at a median of two days after admission in 7894 (17.3%) patients. Cumulative FB became progressively positive, peaking on day three despite only 848 (10.7%) patients receiving fluid resuscitation in the ICU. In those three days, persistent crystalloid use (median: 60.0 mL/h; IQR 28.9-89.2), nutritional intake (median: 18.2 mL/h; IQR 0.0-45.9) and limited urine output (UO) (median: 70.8 mL/h; IQR 49.0-96.7) contributed to a positive FB. Although UO increased each day, it failed to match input, with only 797 (10.1%) patients receiving diuretics in the ICU. After adjustment, a positive FB four days after AKI diagnosis was associated with an increased risk of hospital mortality (OR 1.12; 95% confidence interval 1.05-1.19; p < 0.001). CONCLUSION: Among ICU patients with AKI, cumulative FB increased after diagnosis and was associated with an increased risk of mortality. Continued crystalloid administration, increased nutritional intake, limited UO, and minimal use of diuretics all contributed to a positive FB. KEY POINTS: Question: How does a positive fluid balance develop in critically ill patients with acute kidney injury? Findings: Cumulative FB increased after AKI diagnosis, secondary to persistent crystalloid fluid administration, increasing nutritional fluid intake, and insufficient urine output. Despite the absence of resuscitation fluid and an increasing cumulative FB, there was persistently low diuretic use, ongoing crystalloid use, and a progressive escalation of nutritional fluid therapy.
Meaning: Current management results in fluid accumulation after diagnosis of AKI, as a result of ongoing crystalloid administration, increasing nutritional fluid, limited urine output and minimal diuretic use.
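The fluid-balance arithmetic described above (crystalloid and nutritional inputs minus urine output, accumulated hour by hour) can be sketched as follows. The rates are illustrative values echoing the abstract's medians, not patient data:

```python
# Illustrative sketch: cumulative fluid balance (FB) from hourly inputs and
# outputs. Rates loosely echo the medians reported in the abstract
# (crystalloids ~60 mL/h, nutrition ~18 mL/h, urine output ~71 mL/h);
# all values here are hypothetical, not patient data.

def cumulative_fb(crystalloid_ml_h, nutrition_ml_h, urine_ml_h, hours):
    """Return the cumulative FB (mL) hour by hour: inputs minus outputs."""
    fb, trajectory = 0.0, []
    for _ in range(hours):
        fb += crystalloid_ml_h + nutrition_ml_h - urine_ml_h
        trajectory.append(fb)
    return trajectory

traj = cumulative_fb(60.0, 18.2, 70.8, 72)  # first three days
print(round(traj[-1], 1))  # net positive balance after 72 h
```

Even without any resuscitation fluid, modest ongoing inputs that slightly exceed urine output accumulate to a clearly positive balance over three days, which is the mechanism the abstract describes.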


Subject(s)
Acute Kidney Injury , Critical Illness , Fluid Therapy , Intensive Care Units , Water-Electrolyte Balance , Humans , Acute Kidney Injury/therapy , Acute Kidney Injury/physiopathology , Retrospective Studies , Female , Male , Middle Aged , Fluid Therapy/methods , Aged , Hospital Mortality , Crystalloid Solutions/administration & dosage , Crystalloid Solutions/therapeutic use , Diuretics/therapeutic use
2.
Int J Infect Dis ; 144: 107061, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38631508

ABSTRACT

OBJECTIVES: The accuracy of malaria rapid diagnostic tests is threatened by Plasmodium falciparum with pfhrp2/3 deletions. This study compares gene deletion prevalence determined by multiplex real-time polymerase chain reaction (qPCR) and conventional polymerase chain reaction (cPCR) using existing samples with clonality previously determined by microsatellite genotyping. METHODS: Multiplex qPCR was used to estimate the prevalence of pfhrp2/3 deletions in three sets of previously collected patient samples from Eritrea and Peru. The qPCR was validated by multiplex digital polymerase chain reaction. Sample classification was compared with cPCR, and receiver operating characteristic curve analysis was used to determine the optimal ΔCq threshold that aligned the results of the two assays. RESULTS: qPCR classified 75% (637 of 849) of samples as single, and 212 as mixed-pfhrp2/3 genotypes, with a positive association between clonality and the proportion of mixed-pfhrp2/3 genotype samples. The sample classification agreement between cPCR and qPCR was 75.1% (95% confidence interval [CI] 68.6-80.7%) for monoclonal infections and 47.8% (95% CI 38.9-56.9%) for polyclonal infections. The qPCR prevalence estimates of pfhrp2/3 deletions showed almost perfect (κ = 0.804, 95% CI 0.714-0.895) and substantial agreement (κ = 0.717, 95% CI 0.562-0.872) with cPCR for Peru and 2016 Eritrean samples, respectively. For 2019 Eritrean samples, the prevalence of double pfhrp2/3 deletions was approximately two-fold higher using qPCR. The optimal threshold for matching the assay results was ΔCq = 3. CONCLUSIONS: Multiplex qPCR and cPCR produce comparable estimates of gene deletion prevalence when monoclonal infections dominate; however, qPCR provides higher estimates where multi-clonal infections are common.
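The agreement statistics quoted above (κ) are Cohen's kappa, chance-corrected agreement between two classifiers. A minimal sketch of the calculation, using a made-up 2×2 qPCR-versus-cPCR table rather than the study's counts:

```python
# Sketch of Cohen's kappa, the agreement statistic quoted in the abstract
# (e.g. kappa = 0.804 for Peru). The 2x2 table below is invented purely to
# illustrate the calculation; it is not the study's data.

def cohens_kappa(table):
    """Kappa for a square agreement table: (p_obs - p_exp) / (1 - p_exp)."""
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    p_exp = sum(
        sum(table[i]) * sum(row[i] for row in table)
        for i in range(len(table))
    ) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical deletion calls: rows = qPCR (deleted/intact), cols = cPCR.
table = [[40, 5], [5, 50]]
print(round(cohens_kappa(table), 3))  # 0.798
```

By convention, κ above 0.8 is often described as "almost perfect" and 0.61-0.80 as "substantial", matching the abstract's wording.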


Subject(s)
Antigens, Protozoan , Malaria, Falciparum , Multiplex Polymerase Chain Reaction , Plasmodium falciparum , Protozoan Proteins , Plasmodium falciparum/genetics , Humans , Malaria, Falciparum/epidemiology , Malaria, Falciparum/diagnosis , Malaria, Falciparum/parasitology , Protozoan Proteins/genetics , Multiplex Polymerase Chain Reaction/methods , Prevalence , Antigens, Protozoan/genetics , Gene Deletion , Real-Time Polymerase Chain Reaction/methods , Peru/epidemiology , Genotype
3.
Blood Purif ; : 1-10, 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38626729

ABSTRACT

INTRODUCTION: In critically ill patients undergoing continuous renal replacement therapy (CRRT), a positive fluid balance (FB) is associated with adverse outcomes. However, current FB management practices in CRRT patients are poorly understood. We aimed to study FB and its components in British and Australian CRRT patients to inform future trials. METHODS: We obtained detailed electronic health record data on all fluid-related variables during CRRT and hourly FB for the first 7 days of treatment. RESULTS: We studied 1,616 patients from three tertiary intensive care units (ICUs) in two countries. After the start of CRRT, the mean cumulative FB became negative at 31 h and remained negative over 7 days to a mean nadir of -4.1 L (95% confidence interval (CI) of -4.6 to -3.5). The net ultrafiltration (NUF) rate was the dominant fluid variable (-67.7 mL/h; standard deviation (SD): 75.7); however, residual urine output (-34.7 mL/h; SD: 54.5), crystalloid administration (48.1 mL/h; SD: 44.6), and nutritional input (36.4 mL/h; SD: 29.7) significantly contributed to FB. Patients with a positive FB after 72 h of CRRT were more severely ill, required high-dose vasopressors, and had high lactate concentrations (5.0 mmol/L; interquartile range: 2.3-10.5). A positive FB was independently associated with increased hospital mortality (odds ratio: 1.70; 95% CI; p = 0.004). CONCLUSION: In the study ICUs, most CRRT patients achieved a predominantly NUF-dependent negative FB. Patients with a positive FB at 72 h had greater illness severity and haemodynamic instability. Achieving equipoise for conducting trials that target a negative early FB in such patients may be difficult.

4.
Antimicrob Agents Chemother ; 67(12): e0101423, 2023 12 14.
Article in English | MEDLINE | ID: mdl-37971260

ABSTRACT

Plasmodium vivax infections and relapses remain a major health problem for malaria-endemic countries, deployed military personnel, and travelers. Presumptive anti-relapse therapy and radical cure using the 8-aminoquinoline drugs primaquine and tafenoquine are necessary to prevent relapses. Although the efficacy of primaquine has been shown to be associated with cytochrome P450 2D6 (CYP2D6) activity, there are insufficient data on the role of CYP2D6 in the anti-relapse efficacy of tafenoquine. We retrospectively investigated the relationship between CYP2D6 activity status and tafenoquine efficacy in preventing P. vivax relapses using plasma samples collected from Australian Defence Force personnel deployed to Papua New Guinea and Timor-Leste who participated in clinical trials of tafenoquine during 1999-2001. The CYP2D6 gene was amplified and fully sequenced from 92 participant samples, comprising relapse (n = 31) and non-relapse (n = 61) samples, revealing 14 different alleles. CYP2D6 phenotypes deduced from combinations of CYP2D6 alleles predicted that, of the 92 participants, 67, 15, and 10 were normal, intermediate, and poor metabolizers, respectively. The deduced CYP2D6 phenotype did not correlate with the corresponding participant's plasma tafenoquine concentrations, which had been determined in the early 2000s by high-performance liquid chromatography or liquid chromatography-mass spectrometry. Furthermore, the deduced CYP2D6 phenotype was not associated with P. vivax relapse outcomes. Our results indicate that CYP2D6 does not affect plasma tafenoquine concentrations or the efficacy of tafenoquine in preventing P. vivax relapses in the assessed Australian Defence Force personnel.
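Deducing a metabolizer phenotype from a pair of CYP2D6 alleles is commonly done by summing per-allele activity scores (a CPIC-style convention). A sketch, where the small allele table and cut-offs are illustrative assumptions rather than values taken from this study:

```python
# Sketch of CYP2D6 phenotype deduction from a diplotype via allele activity
# scores (CPIC-style convention). The allele scores and cut-offs below are
# illustrative assumptions, not figures from the paper.

ACTIVITY = {"*1": 1.0, "*2": 1.0, "*4": 0.0, "*5": 0.0, "*10": 0.25, "*41": 0.5}

def phenotype(allele1, allele2):
    """Classify a metabolizer phenotype from the summed activity score."""
    score = ACTIVITY[allele1] + ACTIVITY[allele2]
    if score == 0:
        return "poor"
    if score <= 1:
        return "intermediate"
    if score <= 2.25:
        return "normal"
    return "ultrarapid"

print(phenotype("*1", "*1"))   # normal
print(phenotype("*4", "*41"))  # intermediate
print(phenotype("*4", "*5"))   # poor
```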


Subject(s)
Antimalarials , Malaria, Vivax , Humans , Primaquine/therapeutic use , Plasmodium vivax/genetics , Antimalarials/therapeutic use , Cytochrome P-450 CYP2D6/genetics , Retrospective Studies , Australia , Aminoquinolines/therapeutic use , Malaria, Vivax/drug therapy , Malaria, Vivax/prevention & control , Recurrence
5.
BMC Public Health ; 23(1): 2215, 2023 11 09.
Article in English | MEDLINE | ID: mdl-37946172

ABSTRACT

BACKGROUND: Because relatively few households in high-income countries experience food insecurity, most studies conflate the levels of severity, which masks between- and within-country differences. This study aims to describe the characteristics of individuals living in high-income countries who were moderately or severely food insecure and investigates temporal trends in prevalence. It assesses these characteristics in comparison to those who were food secure. METHODS: This is a secondary analysis of data collected by the FAO Voices of the Hungry between 2014 and 2018. The data were collected during the annual Gallup World Polls of nationally representative samples using the Food Insecurity Experience Scale. Data from 34 highly developed, wealthy countries were analysed. The age, gender, income, education, area of residence and household structure of individuals experiencing moderate/severe food insecurity (FI), and severe FI, were compared using ANOVA, Welch's F, Pearson's chi-square, and linear-by-linear association tests, depending on the variable of interest. Hierarchical cluster analysis was used to group countries according to their prevalence of moderate/severe FI and severe FI. RESULTS: Overall, 6.5% of the weighted sample were moderately/severely food insecure (M-SFI), while 1.6% were severely food insecure. M-SFI individuals were present in all 34 countries, in all years, and across all education levels and income quintiles. The proportion of individuals experiencing moderate/severe FI varied between years and countries. Fifteen countries showed a significant downward temporal trend in the prevalence of moderate/severe FI (p < 0.001), while three countries demonstrated an increasing temporal trend driven by increasing prevalence in those aged 65 years or younger (p < 0.001). Comparing individuals experiencing moderate versus severe FI showed an over-representation of males, single-adult households and lower household income in the severe FI group.
CONCLUSIONS: Individuals across all income, education and age categories living in high-income countries are experiencing moderate/severe food insecurity, but with higher prevalence among those experiencing more disadvantage. Over the study period, moderate/severe FI trends escalated in some countries and decreased in others. This comparison of countries with similar economic and human development indices highlights an opportunity to investigate subtle variations in social, economic and education policy that could have profound impacts on food insecurity.
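As a rough illustration of grouping countries by prevalence, a single-linkage pass over a one-dimensional measure can stand in for the hierarchical cluster analysis mentioned above. Country labels and prevalences here are hypothetical:

```python
# Sketch of grouping countries by food-insecurity prevalence, in the spirit
# of the hierarchical cluster analysis described in the abstract. For a 1-D
# measure, a simple single-linkage pass suffices: sort prevalences and start
# a new cluster wherever the gap to the previous value exceeds a threshold.
# Country names and prevalence values are hypothetical.

def cluster_1d(items, gap=2.0):
    """items: (name, prevalence %) pairs; returns clusters of names."""
    ordered = sorted(items, key=lambda kv: kv[1])
    clusters, prev = [], None
    for name, value in ordered:
        if prev is None or value - prev > gap:
            clusters.append([])  # gap too wide: start a new cluster
        clusters[-1].append(name)
        prev = value
    return clusters

data = [("A", 2.1), ("B", 2.9), ("C", 7.8), ("D", 8.4), ("E", 15.0)]
print(cluster_1d(data))  # [['A', 'B'], ['C', 'D'], ['E']]
```

A full analysis would use a standard agglomerative routine (e.g. Ward linkage) on both the moderate/severe and severe prevalence dimensions, but the grouping logic is the same.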


Subject(s)
Food Insecurity , Food Supply , Adult , Male , Humans , Prevalence , Developed Countries , Cross-Sectional Studies
6.
Crit Care Resusc ; 25(3): 126-135, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37876369

ABSTRACT

Objective: The overall objective of this scoping review is to assess the extent of the literature on the fluid management of critically ill patients with acute kidney injury (AKI). Introduction: AKI is common in critically ill patients, in whom fluid therapy is a mainstay of treatment. An association between fluid balance (FB) and adverse patient-centred outcomes has been demonstrated in critically ill patients with AKI regardless of severity. The evidence for prospective intervention on FB and its impact on outcomes is unknown. Inclusion criteria: All studies investigating FB in patients with AKI admitted to an intensive care unit were included. Literature not related to FB in critically ill patients with AKI was excluded. Methods: We searched MEDLINE, EMBASE, and CINAHL from January 1st, 2012, onwards. We included primary research studies, experimental and observational, recruiting adult participants admitted to an intensive care unit who had an AKI. We extracted data on study and patient characteristics, as well as FB, renal-based outcomes, and patient-centred outcomes. Two reviewers independently screened citations for eligible studies and performed data extraction. Results: Of the 13,767 studies reviewed, 22 met the inclusion criteria. Two studies examined manipulation of fluid input, 18 studies assessed enhancing fluid removal, and two studies applied a restrictive fluid protocol. Sixteen studies examined patients receiving renal replacement therapy, five studies included non-renal replacement therapy patients, and one study included both. Current evidence is broad, with varied approaches to managing fluid input and fluid removal. The studies did not demonstrate a consensus approach for any aspect of the fluid management of critically ill patients. There was limited application of a restrictive fluid protocol, with no conclusions possible.
Conclusions: The current body of evidence for the management of FB in critically ill patients with AKI is limited. The current quality of evidence is unable to guide clinical practice. The key outcome of this review is to highlight areas for future research.

7.
BMC Public Health ; 23(1): 1400, 2023 07 20.
Article in English | MEDLINE | ID: mdl-37474891

ABSTRACT

BACKGROUND: Acute respiratory infections (ARI) in Cúcuta, Colombia, carry a comparatively high burden of disease associated with high public health costs. However, little is known about the epidemiology of these diseases in the city and their distribution within suburban areas. This study addresses this gap by estimating and mapping the risk of ARI in Cúcuta and identifying the most relevant risk factors. METHODS: A spatial epidemiological analysis was designed to investigate the association of sociodemographic and environmental risk factors with the rate of ambulatory consultations for ARI in urban sections of Cúcuta in 2018. The ARI rate was calculated using a method for spatial estimation of disease rates. A Bayesian spatial model was implemented using the Integrated Nested Laplace Approximation approach and the Besag-York-Mollié specification. The risk of ARI per urban section and the hotspots of higher risk were also estimated and mapped. RESULTS: A higher risk of ARI was found in central, south, north and west areas of Cúcuta after adjusting for sociodemographic and environmental factors and taking into consideration the spatial distribution of the city's urban sections. A one-unit increase in the percentage of the population younger than 15 years, in the Index of Multidimensional Poverty, and in the rate of ARI in the migrant population was associated with a 1.08 (1.06-1.10), 1.04 (1.01-1.08) and 1.25 (1.22-1.27) fold increase in the ARI rate, respectively. Twenty-four urban sections were identified as hotspots of risk in central, south, north and west areas of Cúcuta. CONCLUSION: Sociodemographic factors and their spatial patterns are determinants of acute respiratory infections in Cúcuta. Bayesian spatial hierarchical models can be used to estimate and map the risk of these infections in suburban areas of large cities in Colombia.
The methods of this study can be used globally to identify suburban areas and/or specific communities at risk, to support the implementation of prevention strategies and decision-making in the public and private health sectors.
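In a Poisson model like the one described, a fitted coefficient β maps to a rate ratio exp(β): the multiplicative change in the ARI rate per one-unit covariate increase. A sketch, with coefficients back-derived from the abstract's reported ratios purely for illustration:

```python
# Interpreting Poisson regression coefficients as rate ratios. The betas
# below are back-derived from the rate ratios reported in the abstract
# (1.08 for % population under 15; 1.25 for migrant ARI rate), purely to
# illustrate the exp(beta) relationship; they are not fitted values.

import math

def rate_ratio(beta):
    """Multiplicative change in the rate per one-unit covariate increase."""
    return math.exp(beta)

def pct_change(beta):
    """Same quantity expressed as a percentage change."""
    return (math.exp(beta) - 1) * 100

beta_young = math.log(1.08)    # % of population younger than 15 years
beta_migrant = math.log(1.25)  # rate of ARI in the migrant population
print(round(rate_ratio(beta_young), 2))    # 1.08
print(round(pct_change(beta_migrant), 1))  # 25.0 (% increase per unit)
```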


Subject(s)
Respiratory Tract Infections , Humans , Cities , Colombia/epidemiology , Bayes Theorem , Respiratory Tract Infections/epidemiology , Risk Factors
8.
Asia Pac J Public Health ; 35(4): 276-283, 2023 05.
Article in English | MEDLINE | ID: mdl-37070630

ABSTRACT

Healthy, diverse diets are vital for life. In low- and middle-income countries, however, the focus is more on food quantity than diet quality. This study assessed household diet diversity (HDD) in the Vietnamese Mekong Delta and its associations with household food insecurity (HFI) and household food availability (HFA), controlling for socioeconomic factors. Primary food-preparers in 552 randomly selected households in two rural provinces were interviewed about socioeconomic factors, HDD, HFI, and HFA. More than 80% of households predominantly consumed energy-dense foods, whereas less than 20% consumed nutrient-dense foods. Lower HDD was associated with HFI, lower HFA, Khmer ethnic minority status, low livelihood capitals (landlessness, low expenditure, debt) and low utensil scores. The study highlights the need for improved food and nutrition policies that increase availability of and access to diverse, healthy foods, as well as reduce poverty and increase incomes for at-risk rural and ethnic minority groups.


Subject(s)
Ethnicity , Southeast Asian People , Humans , Cross-Sectional Studies , Food Supply , Minority Groups , Diet , Food Insecurity
9.
Emerg Med Australas ; 35(3): 442-449, 2023 06.
Article in English | MEDLINE | ID: mdl-36410371

ABSTRACT

OBJECTIVES: To describe the demographics, presentation characteristics, clinical features and cardiac outcomes for Aboriginal and Torres Strait Islander patients who present to a regional cardiac referral centre ED with suspected acute coronary syndrome (ACS). METHODS: This was a single-centre observational study conducted at a regional referral hospital in Far North Queensland, Australia from November 2017 to September 2018 and January 2019 to December 2019. Study participants were 278 Aboriginal and Torres Strait Islander people presenting to an ED and investigated for suspected ACS. The main outcome measure was the proportion of patients with ACS at index presentation and differences in characteristics between those with and without ACS. RESULTS: ACS at presentation was diagnosed in 38.1% of patients (n = 106). The mean age of patients with ACS was 53.5 years (SD 9.5) compared with 48.7 years (SD 12.1) in those without ACS (P = 0.001). Patients with ACS were more likely to be male (63.2% vs 39.0%, P < 0.001), smokers (70.6% vs 52.3%, P = 0.002), have diabetes (56.6% vs 38.4%, P = 0.003) and have renal impairment (24.5% vs 10.5%, P = 0.002). CONCLUSIONS: Aboriginal and Torres Strait Islander patients with suspected ACS have a high burden of traditional cardiac risk factors, regardless of whether they are eventually diagnosed with ACS. These patients may benefit from assessment for coronary artery disease regardless of age at presentation.


Subject(s)
Acute Coronary Syndrome , Australian Aboriginal and Torres Strait Islander Peoples , Humans , Male , Middle Aged , Female , Acute Coronary Syndrome/diagnosis , Australia , Queensland/epidemiology , Referral and Consultation
10.
Malar J ; 21(1): 233, 2022 Aug 03.
Article in English | MEDLINE | ID: mdl-35922803

ABSTRACT

BACKGROUND: Rapid diagnostic tests (RDTs) that rely on the detection of Plasmodium falciparum histidine-rich protein 2 (PfHRP2) have become key tools for diagnosing P. falciparum infection. The utility of RDTs can be limited by PfHRP2 persistence; however, persistence can be a benefit in low transmission settings, where detection of persistent PfHRP2 using newer ultra-sensitive PfHRP2-based RDTs can serve as a surveillance tool to identify recent exposure. A better understanding of the dynamics of PfHRP2 over the course of a malaria infection can inform optimal use of RDTs. METHODS: A previously published mathematical model was refined to mimic the production and decay of PfHRP2 during a malaria infection. Data from 15 individuals from volunteer infection studies were used to update the original model and estimate key model parameters. The refined model was applied to a cohort of patients from Namibia who received treatment for clinical malaria infection and for whom longitudinal PfHRP2 concentrations were measured. RESULTS: The refinement of the PfHRP2 dynamic model indicated that in malaria-naïve hosts, P. falciparum parasites of the 3D7 strain produce 33.6 × 10⁻¹⁵ g (95% CI 25.0-42.1 × 10⁻¹⁵ g) of PfHRP2 in vivo per parasite replication cycle, with an elimination half-life of 1.67 days (95% CI 1.11-3.40 days). The refined model included these updated parameters and incorporated individualized body fluid volume calculations, which improved predictive accuracy compared to the original model. The performance of the model in predicting clearance of PfHRP2 post treatment in clinical samples from six adults with P. falciparum infection in Namibia improved when using a longer elimination half-life of 4.5 days, with 14% to 67% of observations for each individual within the predicted range. CONCLUSIONS: The updated mathematical model can predict the growth and clearance of PfHRP2 during the production and decay of a mono-infection with P. falciparum, increasing the understanding of PfHRP2 antigen dynamics. This model can guide the optimal use of PfHRP2-based RDTs for reliable diagnosis of P. falciparum infection and re-infection in endemic settings, as well as for malaria surveillance and elimination programmes in low transmission areas.
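The decay half of such a model reduces to first-order elimination: with half-life t½, the fraction of PfHRP2 remaining after t days is 0.5^(t/t½). A sketch using the two half-life estimates quoted above; the starting concentration is arbitrary:

```python
# First-order elimination of PfHRP2 after treatment (production stopped):
# C(t) = C0 * 0.5 ** (t / t_half). The half-lives are those quoted in the
# abstract (1.67 days in vivo; 4.5 days for clinical samples); the starting
# concentration of 100 units is arbitrary, for illustration only.

def hrp2_remaining(c0, t_days, t_half=1.67):
    """Concentration after t_days of pure first-order decay."""
    return c0 * 0.5 ** (t_days / t_half)

# With the 1.67-day half-life, about 1/8 remains after ~5 days:
print(round(hrp2_remaining(100.0, 5.01), 1))
# With the 4.5-day clinical estimate, decay is far slower:
print(round(hrp2_remaining(100.0, 5.01, t_half=4.5), 1))
```

This slow decay is exactly why PfHRP2-based RDTs can stay positive long after parasite clearance, which the abstract frames as both a diagnostic limitation and a surveillance opportunity.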


Subject(s)
Malaria, Falciparum , Plasmodium falciparum , Adult , Antigens, Protozoan , Diagnostic Tests, Routine , Humans , Malaria, Falciparum/epidemiology , Models, Theoretical , Namibia , Protozoan Proteins
11.
PLoS One ; 17(5): e0267344, 2022.
Article in English | MEDLINE | ID: mdl-35511953

ABSTRACT

INTRODUCTION: Household food insecurity and inadequate water, sanitation, and hygiene (WASH) contribute to ill health. However, the interactions between household food insecurity, WASH and health have rarely been assessed concurrently. This study investigated the compounded impacts of household food insecurity and WASH on the self-reported physical and mental health of adults in the Vietnamese Mekong Delta. MATERIALS AND METHODS: This cross-sectional survey interviewed 552 households in one northern and one southern province of the Vietnamese Mekong Delta. The survey incorporated previously validated tools such as the Short Form 12-item Health Survey, the Household Food Insecurity Assessment Scale, and the Access and Behavioural Outcome Indicators for Water, Sanitation, and Hygiene. Physical and mental health were quantified using the physical health composite score (PCS) and mental health composite score (MCS), respectively. These measures were the dependent variables of interest for this study. RESULTS: Statistical analysis revealed that household food insecurity and using <50 litres of water per person per day (pppd) were independently associated with lower PCS (p<0.05), after adjusting for socio-economic confounders. Household food insecurity and lack of food availability, using <50 litres of water pppd, and the use of untreated drinking water were associated with lower MCS (p<0.05), with water usage being an effect modifier of the relationship between household food insecurity and MCS. The results indicate that being food insecure and having limited potable water had a compounding effect on MCS, compared with either condition alone. CONCLUSION: This study is one of only a few that have established a link between potable water availability, food insecurity and poorer physical and mental health.
The results also indicate a need to validate national data with fine-scale investigations in less populous regions to evaluate national initiatives with local populations that may be at higher risk. Adopting joint dual-action policies for interventions that simultaneously address water and food insecurity should result in larger improvements in health, particularly mental health, compared to targeting either food or water insecurity in isolation.


Subject(s)
Drinking Water , Mental Health , Adult , Asian People , Cross-Sectional Studies , Food Insecurity , Food Supply , Humans , Self Report
12.
PLOS Glob Public Health ; 2(1): e0000106, 2022.
Article in English | MEDLINE | ID: mdl-36962137

ABSTRACT

Malaria rapid diagnostic tests (RDTs) are dominated by products that use histidine-rich protein 2 (HRP2) to detect Plasmodium falciparum. The emergence of parasites lacking the pfhrp2 gene can lead to high rates of false-negative results amongst these RDTs. One solution to restore the ability to correctly diagnose falciparum malaria is to switch to an RDT that is not solely reliant on HRP2. This study used an agent-based stochastic simulation model to investigate the impact on prevalence and transmission of switching the type of RDT used once false-negative rates reached pre-defined thresholds within the treatment-seeking symptomatic population. The results show that low transmission settings were the first to reach the false-negative switch threshold, and that lower thresholds were typically associated with better long-term outcomes. Changing the diagnostic away from an HRP2-only RDT is predicted to restore the ability to correctly diagnose symptomatic malaria infections, but often did not lead to the extinction of HRP2-negative parasites, which continued to circulate in low-density infections, nor to a return to the parasite prevalence and transmission levels seen prior to the introduction of the HRP2-negative parasite. In contrast, failure to move away from HRP2-only RDTs leads to near fixation of these parasites in the population and the inability to correctly diagnose symptomatic cases. Overall, these results suggest pfhrp2-deleted parasites are likely to become a significant component of P. falciparum parasite populations, and that long-term strategies are needed for diagnosis and surveillance which do not rely solely on HRP2.

13.
Sci Rep ; 11(1): 21082, 2021 10 26.
Article in English | MEDLINE | ID: mdl-34702923

ABSTRACT

Eritrea was the first African country to complete a nationwide switch, in 2016, away from HRP2-based RDTs due to high rates of false-negative RDT results caused by Plasmodium falciparum parasites lacking hrp2/hrp3 genes. A cross-sectional survey was conducted during 2019, consecutively enrolling symptomatic malaria patients from nine health facilities across three zones, to investigate the epidemiology of P. falciparum lacking hrp2/3 after the RDT switch. Molecular analyses of 715 samples revealed the overall prevalence of hrp2-, hrp3-, and dual hrp2/3-deleted parasites as 9.4% (95% CI 7.4-11.7%), 41.7% (95% CI 38.1-45.3%) and 7.6% (95% CI 5.8-9.7%), respectively. The prevalence of hrp2- and hrp3-deletion is heterogeneous within and between zones: highest in Anseba (27.1% and 57.9%), followed by Gash Barka (6.4% and 37.9%) and the Debub zone (5.2% and 43.8%). hrp2/3-deleted parasites have multiple diverse haplotypes, with many shared or connected among parasites of different hrp2/3 status, indicating that mutant parasites have likely evolved from multiple, local parasite genetic backgrounds. The findings show that although the prevalence of hrp2/3-deleted parasites was lower two years after the RDT switch, HRP2-based RDTs remain unsuitable for malaria diagnosis in Eritrea. Continued surveillance of hrp2/3-deleted parasites in Eritrea and neighbouring countries is required to monitor the trend.


Subject(s)
Antigens, Protozoan/genetics , Malaria, Falciparum/genetics , Mutation , Plasmodium falciparum/genetics , Protozoan Proteins/genetics , Adolescent , Adult , Aged , Child , Child, Preschool , Eritrea/epidemiology , Female , Humans , Infant , Malaria, Falciparum/epidemiology , Male , Middle Aged
14.
Malar J ; 20(1): 39, 2021 Jan 13.
Article in English | MEDLINE | ID: mdl-33435999

ABSTRACT

BACKGROUND: The World Health Organization recommends confirmatory diagnosis by microscopy or malaria rapid diagnostic test (RDT) in patients with suspected malaria. In recent years, mobile medical applications (MMAs) that can interpret RDT results have entered the market. To evaluate the performance of commercially available MMAs, RDT results read by MMAs were compared to RDT results read by the human eye. METHODS: Five different MMAs were evaluated on six different RDT products using cultured Plasmodium falciparum blood samples at five dilutions ranging from 20 to 1000 parasites (p)/microlitre (µl), as well as malaria-negative blood samples. The RDTs were performed in a controlled laboratory setting by a trained operator who visually read the RDT results. A second trained operator then used the MMAs to read the RDT results. Sensitivity (Sn) and specificity (Sp) for the RDTs were calculated in a Bayesian framework using mixed models. RESULTS: At 20 p/µl, the Sn of the P. falciparum (Pf) test line was significantly higher when read by the trained human eye than when read by MMAs (74% vs. an average of 47%). In higher-density samples, the Sn of three MMAs was comparable to the human eye (97%). The Sn of test lines that detect all Plasmodium species (Pan line) was significantly higher when read by the trained human eye than when read by MMAs (79% vs. an average of 56%) across all densities. The Sp, whether read by the human eye or MMAs, was 99% for both the Pf and Pan test lines across all densities. CONCLUSIONS: The study results show that, in a laboratory setting, most MMAs interpreted the Pf test line of RDTs comparably to the human eye at parasite densities typically found in patients who experience malaria symptoms (> 100 p/µl). At low parasite densities for the Pf line, and across all parasite densities for the Pan line, MMAs were less accurate than the human eye.
Future efforts should focus on improving band/line detection at lower band intensities and on evaluating additional MMA functionalities, such as the ability to identify and classify RDT errors or anomalies.
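The abstract's Sn and Sp come from a Bayesian mixed model; the underlying point definitions are the standard ones. A plain (non-Bayesian) sketch with hypothetical reader-versus-reference counts:

```python
# Point estimates of sensitivity and specificity from a 2x2 confusion
# table. This is the standard (non-Bayesian) calculation; the abstract's
# estimates come from Bayesian mixed models. Counts below are hypothetical,
# chosen only to reproduce the headline figures (74% Sn, 99% Sp).

def sn_sp(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# e.g. 74 of 100 low-density positives detected, 99 of 100 negatives correct
sn, sp = sn_sp(tp=74, fn=26, tn=99, fp=1)
print(sn, sp)  # 0.74 0.99
```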


Subject(s)
Diagnostic Tests, Routine/statistics & numerical data , Malaria, Falciparum/diagnosis , Plasmodium falciparum/isolation & purification , Humans
15.
J Infect Dis ; 223(9): 1631-1638, 2021 05 20.
Article in English | MEDLINE | ID: mdl-32901248

ABSTRACT

BACKGROUND: Artemisinin monotherapy of Plasmodium falciparum infection is frequently ineffective due to recrudescence. Artemisinin-induced dormancy, shown in vitro and in animal models, provides a plausible explanation. To date, direct evidence of artemisinin-induced dormancy in humans is lacking. METHODS: Blood samples were collected from Plasmodium falciparum 3D7- or K13-infected participants before and 48-72 hours after single-dose artesunate (AS) treatment. Parasite morphology, molecular signature of dormancy, capability and dynamics of seeding in vitro cultures, and genetic mutations in the K13 gene were investigated. RESULTS: Dormant parasites were observed in post-AS blood samples of 3D7- and K13-infected participants. The molecular signature of dormancy, an up-regulation of acetyl CoA carboxylase, was detected in 3D7 and K13 samples post-AS, but not in pre-AS samples. Posttreatment samples successfully seeded in vitro cultures, with a significant delay in time to reach 2% parasitemia compared to pretreatment samples. CONCLUSIONS: This study provides strong evidence for the presence of artemisinin-induced dormant parasites in P. falciparum infections. These parasites are a likely reservoir for recrudescent infection following artemisinin monotherapy and artemisinin combination therapy (ACT). Combination regimens that target dormant parasites or remain at therapeutic levels for a sufficient time to kill recovering parasites will likely improve efficacy of ACTs.


Subject(s)
Antimalarials , Artesunate , Malaria, Falciparum , Antimalarials/pharmacology , Antimalarials/therapeutic use , Artesunate/therapeutic use , Drug Resistance/drug effects , Humans , Malaria, Falciparum/drug therapy , Plasmodium falciparum/drug effects , Plasmodium falciparum/genetics , Protozoan Proteins
16.
Environ Res ; 195: 110285, 2021 04.
Article in English | MEDLINE | ID: mdl-33027631

ABSTRACT

BACKGROUND: Dengue is a widespread mosquito-borne disease with the potential to become endemic in tropical Queensland, Australia. The aim of this study was to analyse the spatial variation of dengue notifications in relation to climate variability and socio-ecological factors in the tropical climate zone of Queensland, Australia. METHODS: Data on the number of dengue cases and on climate variables, including minimum temperature, maximum temperature and rainfall, for the period January 1st, 2010 to December 31st, 2015 were obtained for each Statistical Local Area (SLA) from Queensland Health and the Australian Bureau of Meteorology, respectively. Socio-ecological data, including estimated resident population, percentage of Indigenous population, housing structure (specifically terrace houses), socio-economic index and land use types for each SLA, were obtained from the Australian Bureau of Statistics and the Australian Bureau of Agricultural and Resource Economics and Sciences, respectively. To quantify the relationship between dengue, climate and socio-ecological factors, multivariate Poisson regression models were developed in a Bayesian framework with a conditional autoregressive prior structure. Posterior parameters were estimated using Bayesian Markov chain Monte Carlo simulation with Gibbs sampling. RESULTS: In the tropical climate zone of Queensland, the number of dengue cases was predicted to increase by 85% [95% Credible Interval (CrI): 25%, 186%] for a 1-mm increase in average annual rainfall and by 7% (95% CrI: 0.1%, 14%) for a 1% increase in the proportion of terrace houses. The estimated spatial variation (structured random effects) was small compared to the remaining unstructured variation, suggesting that the inclusion of covariates left no residual spatial autocorrelation in the dengue data. 
CONCLUSIONS: Climate and socio-ecological factors explained much of the heterogeneity of dengue transmission dynamics in the tropical climate zone of Queensland. These results help to clarify the risk factors of dengue transmission and provide scientific evidence for designing effective local dengue control programs in the areas of greatest need.
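The study above fits spatially structured Bayesian Poisson models; as a minimal non-spatial sketch of the same log-linear form (the conditional autoregressive prior and Gibbs sampler are omitted), a Poisson regression can be fitted by iteratively reweighted least squares, with exp(beta) giving the multiplicative effect per unit of covariate, analogous to the reported 85% increase per 1 mm of rainfall. The data here are simulated:

```python
import numpy as np

def poisson_irls(X, y, n_iter=50):
    """Maximum-likelihood Poisson regression, log E[y] = X @ beta,
    fitted by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)           # current fitted means
        z = X @ beta + (y - mu) / mu    # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    return beta

# Simulated example with true log-rate = 0.5 + 0.3 * x
rng = np.random.default_rng(0)
x = rng.normal(size=500)
X = np.column_stack([np.ones(500), x])
y = rng.poisson(np.exp(0.5 + 0.3 * x))
beta = poisson_irls(X, y)  # exp(beta[1]) is the rate ratio per unit of x
```

In the Bayesian CAR formulation the linear predictor additionally carries a spatially smoothed random effect per SLA, which is what allows the structured/unstructured variance comparison reported in the results.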


Subject(s)
Dengue , Animals , Australia , Bayes Theorem , Dengue/epidemiology , Incidence , Queensland/epidemiology , Spatial Analysis
17.
Malar J ; 19(1): 392, 2020 Nov 04.
Article in English | MEDLINE | ID: mdl-33148265

ABSTRACT

BACKGROUND: Malaria rapid diagnostic tests (RDTs) have greatly improved access to diagnosis in endemic countries. Most RDTs detect Plasmodium falciparum histidine-rich protein 2 (HRP2), but their sensitivity is seriously threatened by the emergence of pfhrp2-deleted parasites. RDTs detecting P. falciparum or pan-lactate dehydrogenase (Pf- or pan-LDH) provide alternatives. The objective of this study was to systematically assess the performance of malaria RDTs against well-characterized pfhrp2-deleted P. falciparum parasites. METHODS: Thirty-two RDTs were tested against 100 wild-type clinical isolates (200 parasites/µL) and 40 samples from 10 culture-adapted and clinical isolates of pfhrp2-deleted parasites. Wild-type and pfhrp2-deleted parasites had comparable Pf-LDH concentrations. Pf-LDH-detecting RDTs were also tested against 18 clinical isolates at higher density (2,000 parasites/µL) lacking both pfhrp2 and pfhrp3. RESULTS: RDT positivity against pfhrp2-deleted parasites was highest (> 94%) for the two pan-LDH-only RDTs. The positivity rate for the nine Pf-LDH-detecting RDTs varied widely, with similar median positivity between double-deleted (pfhrp2/3-negative; 63.9%) and single-deleted (pfhrp2-negative/pfhrp3-positive; 59.1%) parasites, both lower than against wild-type P. falciparum (93.8%). Median positivity for HRP2-detecting RDTs against 22 single-deleted parasites was 69.9% and 35.2% for HRP2-only and HRP2-combination RDTs, respectively, compared to 96.0% and 92.5% for wild-type parasites. Eight of nine Pf-LDH RDTs detected all clinical double-deleted samples at 2,000 parasites/µL. CONCLUSIONS: The pan-LDH-only RDTs evaluated performed well. Performance of Pf-LDH-detecting RDTs against wild-type P. falciparum does not necessarily predict performance against pfhrp2-deleted parasites. 
Furthermore, many, but not all, HRP2-based RDTs detect pfhrp2-negative/pfhrp3-positive samples, with implications for the HRP2-based RDT screening approach for detection and surveillance of HRP2-negative parasites.


Subject(s)
Antigens, Protozoan/genetics , Diagnostic Tests, Routine/statistics & numerical data , Gene Deletion , Malaria, Falciparum/diagnosis , Plasmodium falciparum/genetics , Protozoan Proteins/genetics , Malaria, Falciparum/epidemiology
18.
Malar J ; 19(1): 324, 2020 Sep 04.
Article in English | MEDLINE | ID: mdl-32887612

ABSTRACT

Microscopy performed on stained films of peripheral blood for detection, identification and quantification of malaria parasites is an essential reference standard for clinical trials of drugs, vaccines and diagnostic tests for malaria. The value of data from such research is greatly enhanced if this reference standard is consistent across time and geography. Adherence to common standards and practices is a prerequisite to achieve this. The rationale for proposed research standards and procedures for the preparation, staining and microscopic examination of blood films for malaria parasites is presented here with the aim of improving the consistency and reliability of malaria microscopy performed in such studies. These standards constitute the core of a quality management system for clinical research studies employing microscopy as a reference standard. They can be used as the basis for the design of training and proficiency testing programmes as well as for procedures and quality assurance of malaria microscopy in clinical research.


Subject(s)
Malaria/parasitology , Microscopy/methods , Diagnostic Tests, Routine/methods , Diagnostic Tests, Routine/standards , Humans , Laboratory Proficiency Testing/methods , Laboratory Proficiency Testing/standards , Microscopy/standards , Quality Control , Reproducibility of Results , Sensitivity and Specificity , Staining and Labeling/methods , Staining and Labeling/standards
19.
Environ Res ; 184: 109222, 2020 05.
Article in English | MEDLINE | ID: mdl-32114157

ABSTRACT

BACKGROUND: Dengue is a significant public health concern in northern Queensland, Australia. This study compared the epidemic features of dengue transmission among different climate zones and explored thresholds of weather variability in relation to dengue for each climate zone in Queensland, Australia. METHODS: Daily data on dengue cases and weather variables, including minimum temperature, maximum temperature and rainfall, for the period January 1, 2010 to December 31, 2015 were obtained from Queensland Health and the Australian Bureau of Meteorology, respectively. A climate-zone shapefile for Australia was also obtained from the Australian Bureau of Meteorology. A Kruskal-Wallis test was performed to check whether the distribution of dengue differed significantly between climate zones. A time-series regression tree model was used to estimate the threshold effects of the monthly weather variables on dengue in different climate zones. RESULTS: During the study period, the highest dengue incidence rate was found in the tropical climate zone (15.09/10,000), with the second highest in the grassland climate zone (3.49/10,000). Dengue responded differently to weather variability in different climate zones. In every climate zone, temperature was the primary predictor of dengue. However, the threshold values, type of temperature (e.g. maximum, minimum, or mean), and lag time for dengue varied between climate zones. Monthly mean temperature above 27°C at a lag of two months and monthly minimum temperature above 22°C at a lag of one month were found to be the most favourable weather conditions for dengue in the tropical and subtropical climate zones, respectively. However, in the grassland climate zone, maximum temperature above 38°C at a lag of five months was found to be the ideal condition for dengue. Monthly rainfall, with a threshold value of 1.7 mm, was found to be a significant contributor to dengue only in the tropical climate zone. 
CONCLUSIONS: The temperature threshold for dengue was lower in both the tropical and subtropical climate zones than in the grassland climate zone. The differing temperature thresholds between climate zones suggest that early warning systems need to be developed according to the local socio-ecological conditions of each climate zone to manage dengue control and intervention programs effectively.
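The threshold detection a regression tree performs can be sketched as a single split: scan candidate cut-points on a weather variable and keep the one that minimises the within-group squared error of the case counts. The data below are hypothetical, and the study's model additionally handles lags and multiple splits:

```python
import numpy as np

def best_threshold(x, y):
    """One-node regression tree: the cut-point on x that minimises the
    total within-group sum of squared errors of y."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_cut, best_sse = xs[0], np.inf
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_cut, best_sse = (xs[i - 1] + xs[i]) / 2, sse
    return best_cut

# Hypothetical monthly data: case counts jump once mean temperature exceeds 27
temp = np.linspace(20.0, 35.0, 100)
cases = np.where(temp > 27.0, 10.0, 1.0)
cut = best_threshold(temp, cases)  # close to 27
```

A full time-series regression tree repeats this split recursively over all candidate variables and lags, which is how zone-specific thresholds such as 27°C at a two-month lag emerge.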


Subject(s)
Climate , Dengue , Weather , Dengue/epidemiology , Humans , Incidence , Queensland/epidemiology , Temperature
20.
J Med Entomol ; 57(1): 241-251, 2020 01 09.
Article in English | MEDLINE | ID: mdl-31310648

ABSTRACT

Flood frequency is expected to increase across the globe with climate change. Understanding the relationship between flooding and arboviral disease can reduce disease risk and associated costs. South-eastern Australia is dominated by the flood-prone Murray-Darling River system, where the incidence of Australia's most common arboviral disease, Ross River virus (RRV), is high. This study aimed to determine the relationship between riverine flooding and RRV disease outbreaks in inland south-eastern Australia, specifically New South Wales (NSW). Each study month from 1991 to 2013, for each of 37 local government areas (LGAs), was assigned 'outbreak/non-outbreak' status based on long-term trimmed-average age-standardized RRV notification rates and 'flood/non-flood' status based on riverine overflow. LGAs were grouped into eight climate zones, and the relationship between flooding and RRV outbreaks was modeled using generalized estimating equations, adjusting for rainfall in the previous 1-3 mo. Spring-summer flooding increased the odds of summer RRV outbreaks in three climate zones, before and after adjusting for rainfall 1, 2, and 3 mo prior to the outbreak. Flooding at any time of the year was not predictive of RRV outbreaks in the remaining five climate zones. Predicting RRV disease outbreaks from flood events can assist with more targeted mosquito spraying programs, thereby reducing disease transmission and mosquito resistance.
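Before fitting generalized estimating equations, the flood/outbreak association within a climate zone can be screened with a plain 2x2 odds ratio and a Woolf confidence interval; the counts below are hypothetical, and this crude estimate ignores the within-LGA correlation and rainfall adjustment that the study's GEE models account for:

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 exposure/outcome table
    (a = flood & outbreak, b = flood & no outbreak,
     c = no flood & outbreak, d = no flood & no outbreak),
    with a 95% CI from the log-OR normal approximation (Woolf)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical LGA-month counts for one climate zone
or_, (lo, hi) = odds_ratio(30, 70, 10, 90)
```

GEEs refine this by modelling the outbreak indicator on flood status and rainfall covariates while using a working correlation structure for repeated months within the same LGA.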


Subject(s)
Alphavirus Infections/epidemiology , Disease Outbreaks , Floods , Alphavirus Infections/transmission , Animals , Culicidae/virology , Humans , New South Wales/epidemiology , Ross River virus/physiology , Seasons