Results 1 - 20 of 32
1.
Med J Aust ; 220(11): 566-572, 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38803004

ABSTRACT

OBJECTIVES: To investigate the distribution and prevalence of Japanese encephalitis virus (JEV) antibody (as evidence of past infection) in northern Victoria following the 2022 Japanese encephalitis outbreak, seeking to identify groups of people at particular risk of infection; to investigate the distribution and prevalence of antibodies to two related flaviviruses, Murray Valley encephalitis virus (MVEV) and West Nile virus Kunjin subtype (KUNV). STUDY DESIGN: Cross-sectional serosurvey (part of a national JEV serosurveillance program). SETTING: Three northern Victorian local public health units (Ovens Murray, Goulburn Valley, Loddon Mallee), 8 August - 1 December 2022. PARTICIPANTS: People opportunistically recruited at pathology collection centres and by targeted recruitment through community outreach and advertisements. People vaccinated against or who had been diagnosed with Japanese encephalitis were ineligible for participation, as were those born in countries where JEV is endemic. MAIN OUTCOME MEASURES: Seroprevalence of JEV IgG antibody, overall and by selected factors of interest (occupations, water body exposure, recreational activities and locations, exposure to animals, protective measures). RESULTS: 813 participants were recruited (median age, 59 years [interquartile range, 42-69 years]; 496 female [61%]); 27 were JEV IgG-seropositive (3.3%; 95% confidence interval [CI], 2.2-4.8%) (median age, 73 years [interquartile range, 63-78 years]; 13 female [48%]); none were IgM-seropositive. JEV IgG-seropositive participants were identified at all recruitment locations, including those without identified cases of Japanese encephalitis. The only risk factors associated with JEV IgG-seropositivity were age (per year: prevalence odds ratio [POR], 1.07; 95% CI, 1.03-1.10) and exposure to feral pigs (POR, 21; 95% CI, 1.7-190). The seroprevalence of antibody to MVEV was 3.0% (95% CI, 1.9-4.5%; 23 of 760 participants), and of KUNV antibody 3.3% (95% CI, 2.1-4.8%; 25 of 761). CONCLUSIONS: People living in northern Victoria are vulnerable to future JEV infection, but few risk factors are consistently associated with infection. Additional prevention strategies, including expanding vaccine eligibility, may be required to protect people in this region from Japanese encephalitis.
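
As a rough check of the headline seroprevalence figure above, the sketch below recomputes the point estimate and 95% confidence interval for 27 seropositives among 813 participants. The interval method is an assumption (Clopper-Pearson exact); the abstract does not state which method the study used.

```python
# Sketch: recompute the JEV IgG seroprevalence and its 95% CI.
# Clopper-Pearson (exact) is assumed here; the study's method is not stated.
from statsmodels.stats.proportion import proportion_confint

positive, tested = 27, 813
prevalence = positive / tested
lower, upper = proportion_confint(positive, tested, alpha=0.05, method="beta")

print(f"Seroprevalence: {prevalence:.1%} (95% CI {lower:.1%}-{upper:.1%})")
# Should print roughly 3.3% (95% CI ~2.2%-4.8%), consistent with the abstract.
```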


Subject(s)
Antibodies, Viral , Disease Outbreaks , Encephalitis Virus, Japanese , Encephalitis, Japanese , Humans , Cross-Sectional Studies , Encephalitis Virus, Japanese/immunology , Middle Aged , Seroepidemiologic Studies , Encephalitis, Japanese/epidemiology , Encephalitis, Japanese/immunology , Adult , Female , Male , Antibodies, Viral/blood , Aged , Victoria/epidemiology , Immunoglobulin G/blood , Young Adult , Encephalitis Virus, Murray Valley/immunology , Adolescent , Risk Factors
2.
Acta Paediatr ; 110(5): 1620-1632, 2021 05.
Article in English | MEDLINE | ID: mdl-33220086

ABSTRACT

AIM: Pneumonia is the leading infectious cause of death among children under five globally. Many pneumonia deaths result from inappropriate treatment due to misdiagnosis of signs and symptoms. This study aims to identify whether health extension workers (HEWs) in Ethiopia, using an automated multimodal device (Masimo Rad-G), adhere to required guidelines while assessing and classifying children under five with cough or difficulty breathing, and to understand device acceptability. METHODS: A cross-sectional study was conducted in three districts of Southern Nations, Nationalities, and Peoples' Region, Ethiopia. Between September and December 2018, 133 HEWs were directly observed using Rad-G while conducting 599 sick child consultations. Usability was measured as adherence to the World Health Organization requirements for assessing fast breathing and to the device manufacturer's instructions for use. Acceptability was assessed using semi-structured interviews with HEWs, first-level health facility workers and caregivers. RESULTS: Adherence after using the Rad-G routinely for 2 months was 85.3% (95% CI 80.2, 89.3). Health workers and caregivers stated a preference for Rad-G. Users highlighted a number of device design issues. CONCLUSION: While the device demonstrated high levels of acceptability and usability, modifications to consider include a better probe fit, an improved user interface with exclusive age categories and simplified classification outcomes.


Subject(s)
Case Management , Pneumonia , Child , Community Health Workers , Cross-Sectional Studies , Ethiopia , Humans , Pneumonia/diagnosis , Pneumonia/therapy , Respiratory Rate
3.
Acta Paediatr ; 109(6): 1196-1206, 2020 06.
Article in English | MEDLINE | ID: mdl-31638714

ABSTRACT

AIM: Manually counting respiratory rate (RR) is commonly practiced by community health workers to detect fast breathing, an important sign of childhood pneumonia. Correctly counting and classifying breaths manually is challenging, often leading to inappropriate treatment. This study aimed to determine the usability of a new automated RR counter (ChARM) by health extension workers (HEWs), and its acceptability to HEWs, first-level health facility workers (FLHFWs) and caregivers in Ethiopia. METHODS: A cross-sectional study was conducted in one region of Ethiopia between May and August 2018. A total of 131 HEWs were directly observed conducting 262 sick child consultations after training and 337 after 2 months. Usability was measured as adherence to the WHO requirements for assessing fast breathing and to the device manufacturer's instructions for use (IFU). Acceptability was measured through semi-structured interviews. RESULTS: After 2 months, HEWs were shown to adhere to the requirements in 74.6% of consultations, an increase of 18.6% from the period directly after training (P < .001). ChARM was acceptable to users and caregivers, with HEWs suggesting that ChARM increased client flow and stating a willingness to use ChARM in future. CONCLUSION: Further research on the performance, cost-effectiveness and implementation of this device is warranted to inform policy decisions in countries with a high childhood pneumonia burden.


Subject(s)
Pneumonia , Respiratory Rate , Child , Community Health Workers , Cross-Sectional Studies , Ethiopia , Humans , Pneumonia/diagnosis , Pneumonia/therapy
4.
Aust Health Rev ; 41(1): 26-32, 2017 Mar.
Article in English | MEDLINE | ID: mdl-27075773

ABSTRACT

Objective The aim of the present study was to quantify hospital steam steriliser resource consumption to provide baseline environmental data and identify possible efficiency gains. We sought to find the amount of steriliser electricity and water used for active cycles and for idling (standby), and the relationship between the electricity and water consumption and the mass and type of items sterilised. Methods We logged a hospital steam steriliser's electricity and water meters every 5 min for up to 1 year. We obtained details of all active cycles (standard 134°C and accessory or 'test' cycles), recording item masses and types. Relationships were investigated for both the weight and type of items sterilised with electricity and water consumption. Results Over 304 days there were 2173 active cycles, including 1343 standard 134°C cycles that had an average load mass of 21.2 kg, with 32% of cycles < 15 kg. Electricity used for active cycles was 32,652 kWh (60% of total), whereas the water used was 1,243,495 L (79%). Standby used 21,457 kWh (40%) electricity and 329,200 L (21%) water. Total electricity and water consumption per mass sterilised was 1.9 kWh/kg and 58 L/kg, respectively. The linear regression model predicting electricity use was: kWh = 15.7 + 0.14 × mass (in kg; R² = 0.58, P < 0.01). Models for water and item type were poor. Electricity and water use fell from 3 kWh/kg and 200 L/kg, respectively, for 5-kg loads to 0.5 kWh/kg and 20 L/kg, respectively, for 40-kg loads. Conclusions Considerable electricity and water use occurred during standby, load mass was only moderately predictive of electricity consumption and light loads were common yet inefficient. The findings of the present study are a baseline for steam sterilisation's environmental footprint and identify areas to improve efficiencies. What is known about the topic? There is increasing interest in the environmental effects of healthcare. Life cycle assessment ('cradle to grave') provides a scientific method of analysing environmental effects. Although data of the effects of steam sterilisation are integral to the life cycles of reusable items and procedures using such items, there are few data available. Further, there is scant information regarding the efficiency of the long-term in-hospital use of sterilisers. What does this paper add? We quantified, for the first time, long-term electricity and water use of a hospital steam steriliser. We provide useful input data for future life cycle assessments of all reusable, steam-sterilised equipment. Further, we identified opportunities for improved steriliser efficiencies, including rotating off idle sterilisers and reducing the number of light steriliser loads. Finally, others could use our methods to examine steam sterilisers and many other energy-intensive items of hospital equipment. What are the implications for practitioners? We provide useful input data for all researchers examining the environmental footprint of reusable hospital equipment and procedures using such equipment. As a result of the present study, staff in the hospital sterile supply department have reduced steam steriliser electricity and water use considerably without impeding sterilisation throughput (and reduced time inefficiencies). Many other hospitals could benefit from similar methods to improve steam steriliser and other hospital equipment efficiencies.
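
As a small illustration of the reported electricity model, the sketch below applies kWh = 15.7 + 0.14 × mass to a few load sizes to show why light loads are inefficient per kilogram sterilised. The load masses are examples, and water is omitted because the abstract notes the water models fitted poorly.

```python
# Sketch: apply the reported regression (kWh = 15.7 + 0.14 * mass_kg) to show
# why light loads are inefficient per kilogram sterilised.

def cycle_electricity_kwh(mass_kg: float) -> float:
    """Predicted electricity for one 134 degC cycle, per the published model."""
    return 15.7 + 0.14 * mass_kg

for mass in (5, 15, 21.2, 40):   # 21.2 kg was the mean load mass in the study
    kwh = cycle_electricity_kwh(mass)
    print(f"{mass:>5} kg load: {kwh:5.1f} kWh total, {kwh / mass:4.2f} kWh/kg")

# 5 kg load -> ~16.4 kWh (~3.3 kWh/kg); 40 kg load -> ~21.3 kWh (~0.53 kWh/kg),
# consistent with the ~3 vs ~0.5 kWh/kg figures quoted in the abstract.
```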


Subject(s)
Conservation of Energy Resources , Conservation of Natural Resources , Hospitals , Steam , Sterilization/methods , Water Supply , Humans , Victoria
5.
J Virol ; 89(1): 165-80, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25320291

ABSTRACT

UNLABELLED: The precise role(s) and topological organization of different factors in the hepatitis C virus (HCV) RNA replication complex are not well understood. In order to elucidate the role of viral and host proteins in HCV replication, we have developed a novel in vitro replication system that utilizes a rolling-circle RNA template. Under close-to-physiological salt conditions, HCV NS5BΔ21, an RNA-dependent RNA polymerase, has poor affinity for the RNA template. Human replication protein A (RPA) and HCV NS5A recruit NS5BΔ21 to the template. Subsequently, NS3 is recruited to the replication complex by NS5BΔ21, resulting in RNA synthesis stimulation by helicase. Both RPA and NS5A(S25-C447), but not NS5A(S25-K215), enabled the NS5BΔ21-NS3 helicase complex to be stably associated with the template and synthesize RNA product in a highly processive manner in vitro. This new in vitro HCV replication system is a useful tool that may facilitate the study of other replication factors and aid in the discovery of novel inhibitors of HCV replication. IMPORTANCE: The molecular mechanism of hepatitis C virus (HCV) replication is not fully understood, but viral and host proteins collaborate in this process. Using a rolling-circle RNA template, we have reconstituted an in vitro HCV replication system that allows us to interrogate the role of viral and host proteins in HCV replication and delineate the molecular interactions. We showed that HCV NS5A(S25-C447) and cellular replication protein A (RPA) functionally cooperate as a processivity factor to stimulate HCV replication by HCV NS5BΔ21 polymerase and NS3 helicase. This system paves the way to test other proteins and may be used as an assay for discovery of HCV inhibitors.


Subject(s)
Hepacivirus/enzymology , Hepacivirus/physiology , Host-Pathogen Interactions , Replication Protein A/metabolism , Viral Nonstructural Proteins/metabolism , Virus Replication , Humans , Mutant Proteins/genetics , Mutant Proteins/metabolism , Protein Binding , RNA, Viral/metabolism , Sequence Deletion , Viral Nonstructural Proteins/genetics
6.
Exp Cell Res ; 338(2): 203-13, 2015 Nov 01.
Article in English | MEDLINE | ID: mdl-26256888

ABSTRACT

The possibility of converting blood mononuclear cells (MNC) to liver cells provides promising opportunities for the study of diseases and the assessment of new drugs. However, clinical applications have to meet GMP requirements, and the methods for generating induced pluripotent cells (iPCs) have to avoid insertional mutagenesis, a possibility when using viral vehicles for the delivery of reprogramming factors. We have developed an efficient non-integration method for reprogramming fresh or frozen blood MNC, maintained in an optimised cytokine cocktail, to generate induced pluripotent cells. Using electroporation for the effective delivery of episomal transcription factors (Oct4, Sox2, Klf4, L-Myc, and Lin28) in a feeder-free system, without any requirement for small molecules, we achieved a reprogramming efficiency of up to 0.033% (65 colonies from 2×10^5 seeded MNC). Applying the same cytokine cocktail and reprogramming methods to cord blood or fetal liver-derived CD34+ cells, we obtained 148 iPS colonies from 10^5 seeded cells (0.148%). The iPS cell lines we generated maintained typical characteristics of pluripotent cells and could be successfully differentiated into hepatocytes with drug metabolic function.


Subject(s)
Cell Differentiation/physiology , Cellular Reprogramming/physiology , Fetal Blood/physiology , Hepatocytes/physiology , Leukocytes, Mononuclear/physiology , Plasmids/metabolism , Antigens, CD34/metabolism , Cell Culture Techniques/methods , Cell Line , Cytokines/metabolism , Fetal Blood/metabolism , Hepatocytes/metabolism , Humans , Induced Pluripotent Stem Cells/metabolism , Induced Pluripotent Stem Cells/physiology , Kruppel-Like Factor 4 , Leukocytes, Mononuclear/metabolism , Transcription Factors/metabolism
7.
BMC Health Serv Res ; 15: 67, 2015 Feb 18.
Article in English | MEDLINE | ID: mdl-25889803

ABSTRACT

BACKGROUND: Questions about the impact of large donor-funded HIV interventions on low- and middle-income countries' health systems have been the subject of a number of expert commentaries, but comparatively few empirical research studies. Aimed at addressing a particular evidence gap vis-à-vis the influence of HIV service scale-up on micro-level health systems, this article examines the impact of HIV scale-up on mechanisms of accountability in Zambian primary health facilities. METHODS: Guided by the Mechanisms of Effect framework and Brinkerhoff's work on accountability, we conducted an in-depth multi-case study to examine how HIV services influenced mechanisms of administrative and social accountability in four Zambian primary health centres. Sites were selected for established (over 3 yrs) antiretroviral therapy (ART) services and urban, peri-urban and rural characteristics. Case data included provider interviews (60); patient interviews (180); direct observation of facility operations (2 wks/centre) and key informant interviews (14). RESULTS: Resource-intensive investment in HIV services contributed to some early gains in administrative answerability within the four ART departments, helping to establish the material capabilities necessary to deliver and monitor service delivery. Simultaneous investment in external supervision and professional development helped to promote transparency around individual and team performance and also strengthened positive work norms in the ART departments. In the wider health centres, however, mechanisms of administrative accountability remained weak, hindered by poor data collection and under-capacitated leadership. Substantive gains in social accountability were also elusive, as HIV scale-up did little to address deeply rooted information and power asymmetries in the wider facilities. CONCLUSIONS: Short-term gains in primary-level service accountability may arise from investment in health system hardware. However, sustained improvements in service quality and responsiveness arising from genuine improvements in social and administrative accountability require greater understanding of, and investment in changing, the power relations, work norms, leadership and disciplinary mechanisms that shape these micro-level health systems.


Subject(s)
Community Health Centers/economics , Government Programs , HIV Infections , Primary Health Care/standards , Quality Improvement , Social Responsibility , HIV Infections/economics , Health Resources/economics , Humans , Interviews as Topic , Medical Assistance , Observation , Organizational Case Studies , Rural Population , Systems Analysis , Zambia
8.
Article in English | MEDLINE | ID: mdl-36850065

ABSTRACT

Introduction: Pathogens can enter the drinking water supply and cause gastroenteritis outbreaks. Such events can affect many people in a short time, making them a high risk for public health. In Australia, the Victoria State Government Department of Health is deploying a syndromic surveillance system for drinking water contamination events. We assessed the utility of segmented regression models for detecting such events and determined the number of excess presentations needed for such methods to signal a detection. Methods: The study involved an interrupted time series analysis of a past lapse in water treatment. The baseline period comprised the four weeks before the minimum incubation period of the suspected pathogens, set at two days post-event. The surveillance period comprised the week after. We used segmented linear regression to compare the count of gastroenteritis presentations to public hospital emergency departments (EDs) between the surveillance and baseline periods. We then simulated events resulting in varying excess presentations. These were superimposed onto the ED data over fifty different dates across 2020. Using the same regression, we calculated the detection probability at p < 0.05 for each outbreak size. Results: In the retrospective analysis, there was strong evidence of an increase in presentations shortly after the event. In the simulations, with no excess presentations (i.e., with the ED data as is), the models signalled an 8% probability of detection. The models returned a 50% probability of detection with 28 excess presentations and a 100% probability of detection with 78 excess presentations. Conclusions: The transient increase in presentations after the event may be attributed to microbiological hazards or to increased health-seeking behaviour following the issuing of boil water advisories. The simulations demonstrated the ability of segmented regressions to signal a detection, even without a large excess in presentations. The approach also demonstrated high specificity and should be considered for informing Victoria's syndromic surveillance system.
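
A minimal sketch of the kind of segmented (interrupted time series) regression described above, fitted to synthetic daily ED gastroenteritis counts. The simulated data, column names and level-change model form are illustrative assumptions, not the health department's actual specification.

```python
# Sketch: segmented regression of daily gastroenteritis ED counts around an event.
# Synthetic data and a simple level-change model, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_baseline, n_surveillance = 28, 7           # 4-week baseline, 1-week surveillance
days = np.arange(n_baseline + n_surveillance)
post = (days >= n_baseline).astype(int)       # 1 after the suspected contamination event

# Simulated counts: baseline mean ~20/day, +6/day excess after the event.
counts = rng.poisson(20 + 6 * post)
df = pd.DataFrame({"day": days, "post": post, "count": counts})

# Level-change segmented model: intercept + underlying trend + step at the event.
model = smf.ols("count ~ day + post", data=df).fit()
print(model.summary().tables[1])
print("Step-change p-value:", model.pvalues["post"])   # signal a detection if p < 0.05
```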


Subject(s)
Drinking Water , Gastroenteritis , Waterborne Diseases , Humans , Interrupted Time Series Analysis , Retrospective Studies , Sentinel Surveillance , Waterborne Diseases/epidemiology , Disease Outbreaks , Regression Analysis , Gastroenteritis/epidemiology , Victoria/epidemiology
9.
Scand J Infect Dis ; 44(4): 289-96, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22106922

ABSTRACT

OBJECTIVES: To assess the secondary attack rates (SAR) and impact of the 2009 H1N1 epidemic in Melbourne, Victoria, Australia, and the measures implemented to control household transmission. METHODS: Patients with polymerase chain reaction-confirmed influenza A and pandemic H1N1 (pH1N1) were identified from hospital and microbiology laboratory records and asked to take part in a retrospective survey. Information obtained included: the constellation of symptoms, contact history, secondary infection, and household information, including adherence and attitudes towards quarantine measures. RESULTS: The overall SAR for pH1N1 index patients was 30.6%, but a significantly lower SAR was noted with oseltamivir treatment (36.6% without treatment vs 22.8% with treatment, p < 0.05). The greatest reduction in SAR was observed when index patients aged 0-4 y received oseltamivir (83.3% vs 22.2%, p < 0.01). Quarantine was requested of 65.8% of patients, and 92.8% self-reported adhering to recommendations. pH1N1 index patients were bed-bound for a median of 2.5 days, were unable or too sick to work for a median of 5.0 days, and lost a median of 7.0 days of work for reasons related to an influenza-like illness. CONCLUSIONS: The pH1N1 influenza pandemic had a significant clinical impact on households. Public health interventions such as oseltamivir treatment of index cases were beneficial in reducing secondary attack rates, whilst quarantine measures were found to have high rates of self-reported compliance, understanding, and acceptability.
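
The sketch below shows how a household secondary attack rate and a treated-vs-untreated comparison can be computed. The household contact counts are hypothetical, chosen only so the two SARs match the 36.6% and 22.8% quoted above; they are not the study's data.

```python
# Sketch: secondary attack rate (SAR) and a two-group comparison.
# Contact counts below are hypothetical; only the structure mirrors the study.
from scipy.stats import chi2_contingency

def secondary_attack_rate(secondary_cases: int, susceptible_contacts: int) -> float:
    """SAR = secondary infections / susceptible household contacts of index cases."""
    return secondary_cases / susceptible_contacts

untreated = {"secondary": 52, "contacts": 142}   # hypothetical: SAR ~36.6%
treated = {"secondary": 31, "contacts": 136}     # hypothetical: SAR ~22.8%

for label, g in (("no oseltamivir", untreated), ("oseltamivir", treated)):
    print(f"SAR ({label}): {secondary_attack_rate(g['secondary'], g['contacts']):.1%}")

table = [[untreated["secondary"], untreated["contacts"] - untreated["secondary"]],
         [treated["secondary"], treated["contacts"] - treated["secondary"]]]
chi2, p, _, _ = chi2_contingency(table)
print(f"Chi-square p-value for the difference: {p:.3f}")
```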


Subject(s)
Influenza A Virus, H1N1 Subtype/isolation & purification , Influenza, Human/epidemiology , Influenza, Human/prevention & control , Quarantine/statistics & numerical data , Adolescent , Adult , Aged , Aged, 80 and over , Australia/epidemiology , Child , Child, Preschool , Family Characteristics , Female , Humans , Infant , Influenza, Human/psychology , Male , Middle Aged , Patient Compliance/statistics & numerical data , Quarantine/psychology , Retrospective Studies , Statistics, Nonparametric , Surveys and Questionnaires
10.
Stud Health Technol Inform ; 178: 26-32, 2012.
Article in English | MEDLINE | ID: mdl-22797015

ABSTRACT

INTRODUCTION: Information systems with clinical decision support (CDS) offer great potential to assist the co-ordination of patients with chronic diseases and to improve patient care. Despite this, few have entered routine clinical use. BACKGROUND: Tuberculosis (TB) is an infection of public health importance. It has complex interactions with many comorbid conditions, and requires closely supervised care and prolonged treatment for effective cure. These features make it suitable for use with an information management system with CDS features. In close consultation with key stakeholders, a clinical application was developed for the management of TB patients in Victoria. METHODS: A formal usability assessment using semi-structured, case-scenario based exercises was performed. Subjects were 12 individuals closely involved in the care of TB patients, including Infectious Diseases and Respiratory Physicians, and Public Health Nurses. Two researchers conducted the sessions and independently analysed the responses; discrepancies were checked against the voice recordings for validity. RESULTS: Despite varied computer experience, responses were positive regarding the user interface and content. Data location was not always intuitive; however, this improved with familiarity with the program. Decision support was considered valuable, with useful suggestions for expansion of these features. Automated reporting for correspondence and notification to the Health Department was felt to be worth the initial investment in data entry. An important workflow-based issue regarding dismissal of alerts, and several errors, were detected. CONCLUSION: The usability assessment validated many design elements of the system, provided a unique insight into workflow issues faced by users and, it is hoped, will improve the system's ultimate clinical utility.


Subject(s)
Information Management/organization & administration , Self Care , Tuberculosis , User-Computer Interface , Humans , Patient Participation , Task Performance and Analysis
11.
Asia Pac J Public Health ; 34(4): 377-383, 2022 05.
Article in English | MEDLINE | ID: mdl-35016535

ABSTRACT

The purpose of this study was to examine the determinants of health service utilization in a population at high risk of developing type 2 diabetes mellitus in India. Using Andersen's behavioral model of healthcare utilization, multivariate logistic regression analysis was performed on baseline data of the Kerala Diabetes Prevention Program. We examined the association of predisposing, enabling, and need factors with outpatient health service use in the past four weeks and inpatient health service use in the past 12 months. More than a quarter (27.9%) of the 1007 participants used outpatient services and 12.9% used inpatient services. Men were less likely to use outpatient services (odds ratio [OR] = 0.56). Outpatient service utilization was positively associated with low social support (OR = 1.69), low general health status (OR = 5.71), and time off from work due to illness (OR = 8.01). Higher educational status (OR = 0.63), low general health status (OR = 3.59), and time off from work due to illness (OR = 1.21) were associated with increased utilization of inpatient services. Although gender, educational status, and social support had important roles, health service utilization in this study population was largely dependent on general health status and the presence of illness.
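
A minimal sketch of the type of multivariate logistic regression described above, with predisposing, enabling and need factors as predictors. The data frame is synthetic and the variable names are illustrative, not the Kerala Diabetes Prevention Program dataset.

```python
# Sketch: logistic regression of outpatient service use on predisposing (sex),
# enabling (social support) and need (health status, illness) factors.
# Synthetic data; illustrative variable names only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1007
df = pd.DataFrame({
    "outpatient_use": rng.integers(0, 2, n),      # used outpatient care in past 4 weeks
    "male": rng.integers(0, 2, n),
    "low_social_support": rng.integers(0, 2, n),
    "low_general_health": rng.integers(0, 2, n),
    "time_off_work_ill": rng.integers(0, 2, n),
})

fit = smf.logit(
    "outpatient_use ~ male + low_social_support + low_general_health + time_off_work_ill",
    data=df,
).fit(disp=False)

# Exponentiated coefficients are odds ratios; with real data an OR below 1 for
# 'male' would mirror the reported OR = 0.56 for men (here the data are random,
# so the ORs will sit near 1).
print(np.exp(fit.params).round(2))
```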


Subject(s)
Diabetes Mellitus, Type 2 , Adult , Ambulatory Care , Diabetes Mellitus, Type 2/epidemiology , Diabetes Mellitus, Type 2/prevention & control , Health Status , Humans , India/epidemiology , Male , Patient Acceptance of Health Care
12.
BMC Public Health ; 11: 617, 2011 Aug 03.
Article in English | MEDLINE | ID: mdl-21810279

ABSTRACT

BACKGROUND: In India, 55% of women and 69.5% of preschool children are anaemic despite national policies recommending routine iron supplementation. Understanding factors associated with receipt of iron in the field could help optimise implementation of anaemia control policies. Thus, we undertook 1) a cross-sectional study to evaluate iron supplementation to children (and mothers) in rural Karnataka, India, and 2) an analysis of all-India rural data from the National Family Health Study 2005-6 (NFHS-3). METHODS: All children aged 12-23 months and their mothers served by 6 of 8 randomly selected sub-centres managed by 2 rural Primary Health Centres of rural Karnataka were eligible for the Karnataka Study, conducted between August and October 2008. Socioeconomic and demographic data, access to health services and iron receipt were recorded. Secondly, NFHS-3 rural data were analysed. For both studies, logistic regression was used to evaluate factors associated with receipt of iron. RESULTS: The Karnataka Study recruited 405 children and 377 of their mothers. 41.5% of children had received iron, and 11.5% received iron through the public system. By multiple logistic regression, factors associated with children's receipt of iron included: wealth (Odds Ratio (OR) 2.63 [95% CI 1.11, 6.24] for top vs bottom wealth quintile), male sex (OR 2.45 [1.47, 4.10]), mother receiving postnatal iron (OR 2.31 [1.25, 4.28]), mother having undergone antenatal blood test (OR 2.10 [1.09, 4.03]); Muslim religion (OR 0.02 [0.00, 0.27]), attendance at Anganwadi centre (OR 0.23 [0.11, 0.49]), fully vaccinated (OR 0.33 [0.15, 0.75]), or children of mothers with more antenatal health visits (8-9 visits OR 0.25 [0.11, 0.55]) were less likely to receive iron. Nationally, 3.7% of rural children were receiving iron; this was associated with wealth (OR 1.12 [1.02, 1.23] per quintile), maternal education (compared with no education: completed secondary education OR 2.15 [1.17, 3.97], maternal antenatal iron (2.24 [1.56, 3.22]), and child attending an Anganwadi (OR 1.47 [1.20, 1.80]). CONCLUSION: In rural India, public distribution of iron to children is inadequate and disparities exist. Measures to optimize receipt of government supplied iron to all children regardless of wealth and ethnic background could help alleviate anaemia in this population.


Subject(s)
Anemia, Iron-Deficiency/epidemiology , Dietary Supplements , Iron Compounds/administration & dosage , Rural Population , Adolescent , Adult , Anemia, Iron-Deficiency/blood , Anemia, Iron-Deficiency/therapy , Cross-Sectional Studies , Delivery of Health Care , Female , Health Services Research , Humans , India/epidemiology , Infant , Iron, Dietary/blood , Male , Middle Aged , Nutritional Status , Socioeconomic Factors , Surveys and Questionnaires , Young Adult
13.
J Adv Nurs ; 66(11): 2490-9, 2010 Nov.
Article in English | MEDLINE | ID: mdl-21039775

ABSTRACT

AIM: The aim of this study was to develop a clinical algorithm to assess chronic obstructive pulmonary disease exacerbation severity in a community setting. BACKGROUND: An important aspect of community management of exacerbations is assessing patient safety. Although researchers have investigated risk factors for rapid deterioration, there is a lack of evidence validating clinical measures of exacerbation severity. METHODS: This was a prospective, community-based cohort study of patients enrolled in the Melbourne Longitudinal Chronic Obstructive Pulmonary Disease Cohort. The outreach team collected data on symptom severity at baseline and exacerbation onset using the Medical Research Council Dyspnoea Scale, St George Quality-of-Life Questionnaire and Symptom Severity Index. RESULTS: Ninety-two patients were monitored from 2003 to 2005. There were 148 exacerbations: 121 (82%) were treated at home and 27 (17·5%) required hospitalization. An ordinal logistic regression model demonstrated that a combination of chronic obstructive pulmonary disease severity with dyspnoea and wheeze severity at exacerbation onset could differentiate severe from milder episodes [(OR 7·69, 95%CI: 3·9-11·5, P < 0·01), area under the receiver operating characteristics curve 0·75 (95%CI: 0·65-0·86)]. CONCLUSION: The majority of chronic obstructive pulmonary disease exacerbations can be safely managed in a community setting, but clinical assessment alone may not be sufficient to identify all patients who will develop complications such as respiratory failure. Further research is needed to validate clinical assessment and decision-making algorithms for community-management of chronic obstructive pulmonary disease exacerbations.
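
As an illustration of the discrimination statistic quoted above, the sketch below computes an area under the receiver operating characteristic curve for a severity score against hospitalisation. The scores and outcomes are synthetic, and a plain AUC calculation stands in for the ordinal logistic model actually used.

```python
# Sketch: AUC of a clinical severity score against hospitalisation, in the
# spirit of the reported 0.75. Synthetic data; not the Melbourne cohort.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n_exacerbations = 148
hospitalised = rng.binomial(1, 27 / 148, n_exacerbations)   # ~27 of 148 hospitalised
# Hypothetical combined score (COPD severity + dyspnoea + wheeze at onset),
# shifted upwards for the hospitalised episodes.
severity_score = rng.normal(loc=2.0 + 1.5 * hospitalised, scale=1.0)

auc = roc_auc_score(hospitalised, severity_score)
print(f"Area under ROC curve: {auc:.2f}")   # value depends on the synthetic data
```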


Subject(s)
Community Health Services , Decision Support Techniques , Disease Progression , Nurse Practitioners , Nursing Assessment/methods , Pulmonary Disease, Chronic Obstructive/physiopathology , Algorithms , Ambulatory Care , Dyspnea/complications , Hospitalization , Humans , Longitudinal Studies , Patient Selection , Pulmonary Disease, Chronic Obstructive/diagnosis , Pulmonary Disease, Chronic Obstructive/nursing , Quality of Life , Respiratory Sounds/diagnosis , Severity of Illness Index
14.
Sci Total Environ ; 737: 140263, 2020 Oct 01.
Article in English | MEDLINE | ID: mdl-32783854

ABSTRACT

BACKGROUND: In epidemic thunderstorm asthma (ETSA) events, a large number of people develop asthma symptoms over a short period of time. This is thought to occur because of a unique combination of high amounts of pollen and certain meteorological conditions. However, the exact cause and mechanism of epidemic thunderstorm asthma remain unclear. OBJECTIVES: The objective of this study was to test the hypothesis that convergence lines may be a causative factor in ETSA events, by investigating whether convergence line weather events are associated with the occurrence of high asthma presentation days during the Victorian grass pollen season (October-December). METHODS: A case-control method was used. All public hospitals within 75 km of the Melbourne weather radar were included, and data were taken from 2009 to 2017 during the Victorian grass pollen season. Cases were hospital-days with a high number of asthma presentations within a 24-h period, and controls were hospital-days with an expected number of asthma presentations. Exposure was defined as the geographical proximity of a convergence line to the case or control hospital. RESULTS: Eighty-one case hospital-days and 157 control hospital-days were included in the study. The odds of exposure to a convergence line were significantly higher for cases than for controls at all exposure distances. At 4 km, 80 of the 81 cases had been exposed to a convergence line. CONCLUSION: Convergence lines appear to be a necessary, but not sufficient, element in the cause of epidemic thunderstorm asthma. This is the first study to show a clear link between epidemic thunderstorm asthma and convergence lines.
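
A small sketch of the 2 × 2 odds-ratio calculation underlying this case-control design. The case exposure counts (80 of 81 at 4 km) come from the abstract, but the split of the 157 controls is hypothetical because the abstract does not report it.

```python
# Sketch: odds ratio for convergence-line exposure from a 2x2 case-control table.
# Case counts are from the abstract; the control split is hypothetical.
from statsmodels.stats.contingency_tables import Table2x2

#                 exposed  unexposed
table = Table2x2([[80,       1],      # case hospital-days at 4 km (from the abstract)
                  [90,      67]])     # control hospital-days (hypothetical split of 157)

print(f"Odds ratio: {table.oddsratio:.1f}")
lower, upper = table.oddsratio_confint(0.05)
print(f"95% CI: {lower:.1f}-{upper:.1f}")
```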


Subject(s)
Allergens , Asthma , Australia , Case-Control Studies , Humans , Pollen/immunology , Weather
15.
BMC Public Health ; 9: 59, 2009 Feb 17.
Article in English | MEDLINE | ID: mdl-19222859

ABSTRACT

BACKGROUND: Anaemia is an important problem amongst young children living in rural India. However, there has not previously been a detailed study of the biological aetiology of this anaemia, exploring the relative contributions of iron, vitamin B12, folate and Vitamin A deficiency, inflammation, genetic haemoglobinopathy, hookworm and malaria. Nor have studies related these aetiologic biological factors to household food security, standard of living and child feeding practices. Barriers to conducting such work have included perceived reluctance of village communities to permit their children to undergo venipuncture, and logistical issues. We have successfully completed a community based, cross sectional field study exploring in detail the causes of anaemia amongst young children in a rural setting. METHODS AND DESIGN: A cross sectional, community based study. We engaged in extensive community consultation and tailored our study design to the outcomes of these discussions. We utilised local women as field workers, harnessing the capacity of local health workers to assist with the study. We adopted a programmatic approach with a census rather than random sampling strategy in the village, incorporating appropriate case management for children identified to have anaemia. We developed a questionnaire based on existing standard measurement tools for standard of living, food security and nutrition. Specimen processing was conducted at the Primary Health Centre laboratory prior to transport to an urban research laboratory. DISCUSSION: Adopting this study design, we have recruited 415 of 470 potentially eligible children who were living in the selected villages. We achieved support from the community and cooperation of local health workers. Our results will improve the understanding into anaemia amongst young children in rural India. However, many further studies are required to understand the health problems of the population of rural India, and our study design and technique provide a useful demonstration of a successful strategy.


Subject(s)
Anemia/epidemiology , Avitaminosis/epidemiology , Anemia/diagnosis , Anemia/etiology , Anemia, Iron-Deficiency/epidemiology , Avitaminosis/complications , Cross-Sectional Studies , Diet , Female , Humans , India/epidemiology , Infant , Male , Pilot Projects , Prevalence , Risk Factors , Rural Health , Socioeconomic Factors , Surveys and Questionnaires
16.
Respir Med ; 101(12): 2472-81, 2007 Dec.
Article in English | MEDLINE | ID: mdl-17822891

ABSTRACT

Respiratory viruses are associated with severe acute exacerbations of chronic obstructive pulmonary disease (COPD) in hospitalized patients. However, exacerbations are increasingly managed in the community, where the role of viruses is unclear. In community exacerbations, the causal association between viruses and exacerbation may be confounded by random fluctuations in the prevalence of circulating respiratory viruses. Therefore, to determine whether viral respiratory tract infections are causally associated with community exacerbations, a time-matched case-control study was performed. Ninety-two subjects (mean age 72 years) with moderate to severe COPD (mean FEV1 40% predicted) were enrolled. Nasopharyngeal swabs for viral multiplex polymerase chain reaction and atypical pneumonia serology were obtained at exacerbation onset. Control samples were collected in synchrony from a randomly selected stable patient drawn from the same cohort. In 99 weeks of surveillance, there were 148 exacerbations. The odds of viral isolation were 11 times higher in cases than in their time-matched controls (34 discordant case-control pairs; in 31 pairs only the case had virus and in three pairs only the control). Picornavirus (26), influenza A (3), parainfluenza 1, 2 and 3 (2), respiratory syncytial virus (1), and adenovirus (1) were detected in cases, while adenovirus (1) and picornavirus (2) were detected in controls. In patients with moderate or severe COPD, the presence of a virus in upper airway secretions is strongly associated with the development of COPD exacerbations. These data support the causative role of viruses in triggering COPD exacerbations in the community.
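
For a time-matched (1:1) design like this, the matched odds ratio comes from the discordant pairs. The sketch below works through that arithmetic with the reported 31 case-only and 3 control-only pairs, assuming the remaining pairs were concordant-negative.

```python
# Sketch: matched-pair analysis of the time-matched case-control data above.
# The matched odds ratio is the ratio of discordant pairs (case-only vs
# control-only). Concordant pair counts are assumed for illustration.
from statsmodels.stats.contingency_tables import mcnemar

# Rows: case virus-positive? Columns: matched control virus-positive?
pairs = [[0, 31],     # concordant-positive pairs assumed to be 0
         [3, 114]]    # 148 pairs in total over the study period

matched_or = pairs[0][1] / pairs[1][0]
print(f"Matched odds ratio: {matched_or:.1f}")   # 31 / 3 ~ 10.3, close to the
                                                 # reported ~11-fold increase
result = mcnemar(pairs, exact=True)
print(f"McNemar exact p-value: {result.pvalue:.2e}")
```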


Subject(s)
Community-Acquired Infections/complications , Pulmonary Disease, Chronic Obstructive/virology , Virus Diseases/complications , Acute Disease , Adenoviruses, Human/genetics , Aged , Antibodies, Viral/blood , Common Cold/complications , Common Cold/diagnosis , Community-Acquired Infections/diagnosis , Epidemiologic Methods , Female , Fluorescent Antibody Technique, Indirect , Humans , Influenza A virus/genetics , Influenza A virus/immunology , Male , Middle Aged , Picornaviridae/genetics , Picornaviridae/immunology , Pneumonia, Viral/complications , Pneumonia, Viral/diagnosis , Polymerase Chain Reaction , Respiratory Syncytial Viruses/genetics , Respiratory Syncytial Viruses/immunology , Respirovirus/genetics , Respirovirus/immunology , Virus Diseases/diagnosis
17.
Respir Care ; 62(12): 1582-1587, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28951467

ABSTRACT

BACKGROUND: Widespread access to medical oxygen would reduce global pneumonia mortality. Oxygen concentrators are one proposed solution, but they have limitations, in particular vulnerability to electricity fluctuations and failure during blackouts. The low-pressure oxygen storage system addresses these limitations in low-resource settings. This study reports testing of the system in Melbourne, Australia, and nonclinical field testing in Mbarara, Uganda. METHODS: The system included a power-conditioning unit, a standard oxygen concentrator, and an oxygen store. In Melbourne, pressure and flows were monitored during cycles of filling/emptying, with forced voltage fluctuations. The bladders were tested by increasing pressure until they ruptured. In Mbarara, the system was tested by accelerated cycles of filling/emptying and then run on grid power for 30 d. RESULTS: The low-pressure oxygen storage system performed well, including sustaining a pressure approximately twice the standard working pressure before rupture of the outer bag. Flow of 1.2 L/min was continuously maintained to a simulated patient during 30 d on grid power, despite power failures totaling 2.9% of the total time, with durations of 1-176 min (mean 36.2, median 18.5). CONCLUSIONS: The low-pressure oxygen storage system was robust and durable, with accelerated testing equivalent to at least 2 y of operation revealing no visible signs of imminent failure. Despite power cuts, the system continuously provided oxygen, equivalent to the treatment of one child, for 30 d under typical power conditions for sub-Saharan Africa. The low-pressure oxygen storage system is ready for clinical field trials.
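
A back-of-envelope sketch of how much stored oxygen is needed to bridge the reported power cuts at the delivered flow. The "required store volume" framing is an illustration; the abstract does not specify the bladder capacity.

```python
# Sketch: oxygen drawn from storage during power cuts at the delivered flow.
# Figures for flow and cut durations are from the abstract; the sizing framing
# is illustrative only.
flow_l_per_min = 1.2          # continuous flow maintained to the simulated patient
cuts = {"longest": 176, "mean": 36.2}   # power-failure durations in minutes

for label, minutes in cuts.items():
    litres = flow_l_per_min * minutes
    print(f"Oxygen delivered from storage during the {label} power cut: {litres:.0f} L")
# Longest cut: ~211 L, i.e. the store must hold at least this much usable oxygen
# (plus a margin) to ride through such an outage without interrupting therapy.
```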


Subject(s)
Compressed Air/supply & distribution , Developing Countries , Electric Power Supplies/supply & distribution , Oxygen Inhalation Therapy/methods , Oxygen/supply & distribution , Health Resources , Humans , Pneumonia/therapy , Pressure , Uganda , Victoria
18.
Clin Infect Dis ; 43(9): 1185-93, 2006 Nov 01.
Article in English | MEDLINE | ID: mdl-17029140

ABSTRACT

Travelers returning to their country of origin to visit friends and relatives (VFRs) have increased risk of travel-related health problems. We examined GeoSentinel data to compare travel characteristics and illnesses acquired by 3 groups of travelers to low-income countries: VFRs who had originally been immigrants (immigrant VFRs), VFRs who had not originally been immigrants (traveler VFRs), and tourist travelers. Immigrant VFRs were predominantly male, had a higher mean age, and disproportionately required treatment as inpatients. Only 16% of immigrant VFRs sought pretravel medical advice. Proportionately more immigrant VFRs visited sub-Saharan Africa and traveled for >30 days, whereas tourist travelers more often traveled to Asia. Systemic febrile illnesses (including malaria), nondiarrheal intestinal parasitic infections, respiratory syndromes, tuberculosis, and sexually transmitted diseases were more commonly diagnosed among immigrant VFRs, whereas acute diarrhea was comparatively less frequent. Immigrant VFRs and traveler VFRs had different demographic characteristics and types of travel-related illnesses. A greater proportion of immigrant VFRs presented with serious, potentially preventable travel-related illnesses than did tourist travelers.


Subject(s)
Communicable Diseases/classification , Emigration and Immigration , Family , Friends , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Communicable Disease Control , Female , Humans , Infant , Male , Middle Aged , Travel
19.
J Health Serv Res Policy ; 21(3): 166-71, 2016 07.
Article in English | MEDLINE | ID: mdl-26769573

ABSTRACT

OBJECTIVES: Steam sterilization in hospitals is an energy- and water-intensive process. Our aim was to identify opportunities to improve electricity and water use. The objectives were to find: the time sterilizers spent active, idle and off; the variability in sterilizer use with the time of day and day of the week; and opportunities to switch off sterilizers instead of idling when no loads were waiting, and the resultant electricity and water savings. METHODS: Analyses of routine data for one year of the activity of the four steam sterilizers in one hospital in Melbourne, Australia. We examined active sterilizer cycles, routine sterilizer switch-offs, and when sterilizers were active, idle and off. Several switch-off strategies were examined to identify electricity and water savings: switching off idle sterilizers when no loads are waiting, and switching off one sterilizer after 10:00 h and a second sterilizer after midnight on all days. RESULTS: Sterilizers were active for 13,430 (38%) sterilizer-hours, off for 4822 (14%) sterilizer-hours, and idle for 16,788 (48%) sterilizer-hours. All four sterilizers were simultaneously active 9% of the time, and two or more sterilizers were idle 69% of the time. A sterilizer was idle for two hours or less 13% of the time and idle for more than 2 h 87% of the time. A strategy to switch off idle sterilizers would reduce electricity use by 66 MWh and water use by 1004 kL per year, saving 26% of electricity use and 13% of water use, resulting in financial savings of AUD$13,867 (UK£6,517) and a reduction of 79 tonnes of CO2 emissions per year. An alternative switch-off strategy of one sterilizer from 10:00 h onwards and a second from midnight would have saved 30 MWh and 456 kL of water. CONCLUSIONS: The methodology used here to identify improvements in hospital sterilizer use could be applied in all hospitals and, more broadly, to other equipment used in hospitals.
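
The sketch below relates the reported 66 MWh and 1004 kL savings to cost and CO2. The electricity tariff, water price and grid emission factor are assumed values chosen to be broadly consistent with the abstract's figures, not numbers taken from the paper.

```python
# Sketch: converting the reported savings into cost and CO2.
# Tariff, water price and emission factor below are assumptions, not study values.
electricity_saved_mwh = 66
water_saved_kl = 1004

tariff_aud_per_kwh = 0.15          # assumed hospital electricity tariff
water_aud_per_kl = 3.0             # assumed water price
emission_factor_t_per_mwh = 1.2    # assumed Victorian grid factor for the period

cost_saving = (electricity_saved_mwh * 1000 * tariff_aud_per_kwh
               + water_saved_kl * water_aud_per_kl)
co2_saving_t = electricity_saved_mwh * emission_factor_t_per_mwh

print(f"Approximate annual saving: AUD ${cost_saving:,.0f}")    # ~AUD $12,900, near
                                                                # the reported $13,867
print(f"Approximate CO2 reduction: {co2_saving_t:.0f} tonnes")  # ~79 tonnes, as reported
```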


Subject(s)
Conservation of Energy Resources , Hospitals , Sterilization , Australia , Electricity , Equipment Reuse , Steam , Water
20.
PLoS Negl Trop Dis ; 10(9): e0005018, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27661978

ABSTRACT

BACKGROUND: Effective response to emerging infectious disease (EID) threats relies on health care systems that can detect and contain localised outbreaks before they reach a national or international scale. The Asia-Pacific region contains low and middle income countries in which the risk of EID outbreaks is elevated and whose health care systems may require international support to effectively detect and respond to such events. The absence of comprehensive data on populations, health care systems and disease characteristics in this region makes risk assessment and decisions about the provision of such support challenging. METHODOLOGY/PRINCIPAL FINDINGS: We describe a mathematical modelling framework that can inform this process by integrating available data sources, systematically explore the effects of uncertainty, and provide estimates of outbreak risk under a range of intervention scenarios. We illustrate the use of this framework in the context of a potential importation of Ebola Virus Disease into the Asia-Pacific region. Results suggest that, across a wide range of plausible scenarios, preemptive interventions supporting the timely detection of early cases provide substantially greater reductions in the probability of large outbreaks than interventions that support health care system capacity after an outbreak has commenced. CONCLUSIONS/SIGNIFICANCE: Our study demonstrates how, in the presence of substantial uncertainty about health care system infrastructure and other relevant aspects of disease control, mathematical models can be used to assess the constraints that limited resources place upon the ability of local health care systems to detect and respond to EID outbreaks in a timely and effective fashion. Our framework can help evaluate the relative impact of these constraints to identify resourcing priorities for health care system support, in order to inform principled and quantifiable decision making.
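
A toy branching-process Monte Carlo in the spirit of the framework described above, estimating the probability that a single imported case grows into a large outbreak when detection (and a response that lowers transmission) is triggered at different cumulative case counts. The parameters and structure are illustrative assumptions, not the authors' published model.

```python
# Toy sketch: probability that one imported case grows into a large outbreak,
# with detection at a cumulative case threshold triggering a response that
# lowers the reproduction number. Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(3)

def prob_large_outbreak(r0=1.8, r_post=0.7, k=0.5, detect_at=20, large=100, runs=5000):
    """Fraction of simulated importations whose cumulative cases reach `large`."""
    large_outbreaks = 0
    for _ in range(runs):
        active, total = 1, 1
        while active and total < large:
            r = r0 if total < detect_at else r_post   # response cuts transmission
            # Negative-binomial offspring via a gamma-Poisson mixture (dispersion k).
            rates = rng.gamma(shape=k, scale=r / k, size=active)
            offspring = int(rng.poisson(rates).sum())
            total += offspring
            active = offspring
        if total >= large:
            large_outbreaks += 1
    return large_outbreaks / runs

for threshold in (5, 20, 80):              # earlier detection = lower threshold
    p = prob_large_outbreak(detect_at=threshold)
    print(f"Detection/response at {threshold:>2} cumulative cases: "
          f"P(outbreak >= 100 cases) ~ {p:.2f}")
```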
