Results 1 - 20 of 88
1.
Ned Tijdschr Tandheelkd ; 131(2): 51-58, 2024 Feb.
Article in Dutch | MEDLINE | ID: mdl-38318630

ABSTRACT

The diagnosis of an endodontic disease requires thorough investigation, collecting both clinical and radiographic information. The clinical examination includes history taking, visual inspection of the tooth and surrounding tissues, palpation of the soft and hard tissues, periodontal examination and percussion. The radiographic examination provides valuable information, but can never stand alone in arriving at a diagnosis. It is important to link the findings of the radiographic examination to other information. Sometimes, invasive examination is necessary, during which the coronal restoration is removed to allow better assessment of the tooth. This can provide additional information about the presence of caries, fractures, leakage of the restoration or other reasons for failure of the initial root canal treatment. A good diagnosis is essential for planning successful follow-up treatment.


Subject(s)
Root Canal Therapy , Tooth , Humans , Tooth Root
2.
Chaos ; 33(12)2023 Dec 01.
Article in English | MEDLINE | ID: mdl-38048255

ABSTRACT

Steady states are invaluable in the study of dynamical systems. High-dimensional dynamical systems, due to separation of time scales, often evolve toward a lower dimensional manifold M. We introduce an approach to locate saddle points (and other fixed points) that utilizes gradient extremals on such a priori unknown (Riemannian) manifolds, defined by adaptively sampled point clouds, with local coordinates discovered on-the-fly through manifold learning. The technique, which efficiently biases the dynamical system along a curve (as opposed to exhaustively exploring the state space), requires knowledge of a single minimum and the ability to sample around an arbitrary point. We demonstrate the effectiveness of the technique on the Müller-Brown potential mapped onto an unknown surface (namely, a sphere). Previous work employed a similar algorithmic framework to find saddle points using Newton trajectories and gentlest ascent dynamics; we, therefore, also offer a brief comparison with these methods.
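The gradient-extremal condition used above has a compact numerical form: a point lies on a gradient extremal when the gradient of the potential is an eigenvector of the Hessian, i.e. when H∇V is parallel to ∇V. A minimal sketch on the flat 2-D Müller-Brown potential (the standard published parameters), using finite differences; the paper's manifold-learning machinery for curved surfaces is out of scope here:

```python
import numpy as np

# Müller-Brown potential: standard 2-D test surface with the conventional
# published parameters (four Gaussian-like terms).
A  = np.array([-200.0, -100.0, -170.0, 15.0])
a  = np.array([-1.0, -1.0, -6.5, 0.7])
b  = np.array([0.0, 0.0, 11.0, 0.6])
c  = np.array([-10.0, -10.0, -6.5, 0.7])
x0 = np.array([1.0, 0.0, -0.5, -1.0])
y0 = np.array([0.0, 0.5, 1.5, 1.0])

def V(p):
    x, y = p
    return float(np.sum(A * np.exp(a*(x-x0)**2 + b*(x-x0)*(y-y0) + c*(y-y0)**2)))

def grad(p, h=1e-5):
    # central finite-difference gradient of V
    g = np.zeros(2)
    for i in range(2):
        e = np.zeros(2); e[i] = h
        g[i] = (V(p + e) - V(p - e)) / (2 * h)
    return g

def hess(p, h=1e-4):
    # finite-difference Hessian built from the gradient
    H = np.zeros((2, 2))
    for i in range(2):
        e = np.zeros(2); e[i] = h
        H[:, i] = (grad(p + e) - grad(p - e)) / (2 * h)
    return H

def extremal_residual(p):
    # Gradient-extremal condition: H.g parallel to g, i.e. the 2-D cross
    # product of H.g and g vanishes along the extremal curve.
    g = grad(p)
    Hg = hess(p) @ g
    return float(Hg[0] * g[1] - Hg[1] * g[0])

min_A = np.array([-0.558, 1.442])          # deepest minimum (approximate)
print(f"V at minimum A: {V(min_A):.2f}")   # published value is about -146.7
```

Locating a saddle then amounts to following the curve where the residual stays zero, uphill from a known minimum.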

3.
Ethics Med Public Health ; 27: 100876, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36846862

ABSTRACT

Objective: Telehealth has been an integral part of ensuring continued general practice access during the COVID-19 pandemic. Whether telehealth was similarly adopted across different ethnic, cultural, and linguistic groups in Australia is unknown. In this study, we assessed how telehealth utilisation differed by birth country. Methods: In this retrospective observational study, electronic health record data from 799 general practices across Victoria and New South Wales, Australia between March 2020 and November 2021 were extracted (12,403,592 encounters from 1,307,192 patients). Multivariate generalised estimating equation models were used to assess the likelihood of a telehealth consultation (against face-to-face consultation) by birth country (relative to Australia or New Zealand born patients), education index, and native language (English versus others). Results: Patients born in Southeastern Asia (aOR: 0.54; 95% CI: 0.52-0.55), Eastern Asia (aOR: 0.63; 95% CI: 0.60-0.66), and India (aOR: 0.64; 95% CI: 0.63-0.66) had a lower likelihood of having a telehealth consultation compared to those born in Australia or New Zealand. Northern America, the British Isles, and most European countries did not present with a statistically significant difference. Additionally, a higher education level (aOR: 1.34; 95% CI: 1.26-1.42) was associated with an increased likelihood of a telehealth consultation, while being from a non-English-speaking country was associated with a reduced likelihood (aOR: 0.83; 95% CI: 0.81-0.84). Conclusions: This study provides evidence of differences in telehealth use associated with birth country. Strategies to ensure continued healthcare access for patients whose native language is not English, such as providing interpreter services for telehealth consultations, would be beneficial.
Perspectives: Understanding cultural and linguistic differences may reduce health disparities in telehealth access in Australia and could present an opportunity to promote healthcare access in diverse communities.
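The adjusted odds ratios quoted above come from generalised estimating equation models; as a simpler illustration of the measure itself, an unadjusted odds ratio with a Wald 95% CI can be computed from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% confidence interval.
    a = exposed with outcome (e.g. telehealth consultations in group X),
    b = exposed without, c = unexposed with, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: telehealth vs face-to-face encounters for one
# birth-country group against the reference group.
print(odds_ratio_ci(540, 1000, 1000, 1000))
```

An OR below 1 with a CI excluding 1, as for several groups in the abstract, indicates significantly lower telehealth uptake than the reference group.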

4.
Gerontol Geriatr Med ; 8: 23337214221144192, 2022.
Article in English | MEDLINE | ID: mdl-36568485

ABSTRACT

Background: Adverse incidents are well studied within acute care settings, less so within aged care homes. The aim of this scoping review was to define the types of adverse incidents studied in aged care homes and highlight strengths, gaps, and challenges of this research. Methods: An expanded definition of adverse incidents including physical, social, and environmental impacts was used in a scoping review based on the PRISMA Extension for Scoping Reviews Checklist. MEDLINE, CINAHL, and EBSCOhost were searched for English-language, peer-reviewed studies conducted in aged care home settings between 2000 and 2020. Forty-six articles across 12 countries were identified, charted, and analyzed using descriptive statistics and narrative summary methods. Results: Quantitative studies (n = 42, 91%) dominated the adverse incident literature. The majority of studies focused on physical injuries (n = 29, 63%), with fewer examining personal/interpersonal (15%) and environmental factors (22%). Many studies did not describe the country's aged care system (n = 26, 56%). Only five studies (11%) included residents' voices. Discussion: This review highlights a need for greater focus on resident voices, qualitative research, and interpersonal/environmental perspectives in adverse event research in aged care homes. By addressing these gaps, future research may contribute to a better understanding of adverse incidents within this setting.

5.
BMJ Open ; 12(10): e063179, 2022 10 27.
Article in English | MEDLINE | ID: mdl-36302573

ABSTRACT

OBJECTIVES: Telehealth has emerged as a viable and safe mode of care delivery in Australia during the COVID-19 pandemic. However, electronic general practice data reveal differences in uptake and consultation mode, which we hypothesise may be due to potential barriers impacting on quality of care. We aimed to identify the benefits and barriers of telehealth use in general practice, using an 'Action Research' approach involving general practitioners (GPs) and general practice stakeholders. DESIGN: Qualitative focus group performed within a broader Action Research methodology. SETTING: A focus group was held in August 2021, with general practice participants from Victoria, Australia. PARTICIPANTS: The study consisted of a purposive sample of 11 participants, including GPs (n=4), representatives from three primary health networks (n=4) and data custodian representatives (n=3) who were part of a project stakeholder group guided by an Action Research approach. METHODS: Semistructured interview questions were used to guide focus group discussions via videoconference, which were recorded and transcribed verbatim for analysis. The transcript was analysed using an inductive thematic approach. RESULTS: Emerging themes included evolution of telehealth, barriers to telehealth (privacy, eligibility, technology, quality of care, sociodemographic and residential aged care barriers) and benefits of telehealth (practice, quality of care, sociodemographic and residential aged care benefits). CONCLUSION: The findings highlight a range of barriers to telehealth that impact general practice, but also provide justification for the continuation and development of telehealth. These results provide important context to support data-driven population-based findings on telehealth uptake. They also highlight areas of quality improvement for the enhancement of telehealth as a valuable tool for routine general practice patient care.


Subject(s)
COVID-19 , General Practice , Telemedicine , Humans , Aged , COVID-19/epidemiology , Pandemics , Qualitative Research , Victoria
7.
Ann R Coll Surg Engl ; 2022 Jan 04.
Article in English | MEDLINE | ID: mdl-34981986

ABSTRACT

We report a rare case of adrenal extramedullary haematopoiesis (EMH) in a thalassaemia patient in Cyprus. A 40-year-old woman with β-thalassaemia presented with a 2-day history of non-specific right-sided abdominal pain on routine follow-up for her thalassaemia treatment. Her laboratory tests were not dissimilar to her routine results and no palpable mass was detected. Computed tomography findings revealed a 5.8 × 4.2 × 4.6 cm solid lesion in the right adrenal gland. Surgical excision was advised for this symptomatic large tumour with the possibility of malignancy in a young patient, and a laparoscopic adrenalectomy was performed. Postoperative follow-up was uneventful. A review of the literature in PubMed and MEDLINE revealed 14 case reports worldwide with adrenal EMH secondary to β-thalassaemia. EMH tumours in patients with thalassaemia have been reported incidentally, which stresses the importance of considering this in the list of differentials of adrenal incidentalomas in this patient population.

8.
Appl Ergon ; 98: 103590, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34598079

ABSTRACT

Histopathologists make diagnostic decisions that are thought to be based on pattern recognition, likely informed by cue-based associations formed in memory, a process known as cue utilisation. Typically, the cases presented to the histopathologist have already been classified as 'abnormal' by clinical examination and/or other diagnostic tests. This results in a high disease prevalence, the potential for 'abnormality priming', and a response bias leading to false positives on normal cases. This study investigated whether higher cue utilisation is associated with a reduction in positive response bias in the diagnostic decisions of histopathologists. Data were collected from eighty-two histopathologists who completed a series of demographic and experience-related questions and the histopathology edition of the Expert Intensive Skills Evaluation 2.0 (EXPERTise 2.0) to establish behavioural indicators of context-related cue utilisation. They also completed a separate, diagnostic task comprising breast histopathology images where the frequency of abnormality was manipulated to create a high disease prevalence context for diagnostic decisions relating to normal tissue. Participants were assigned to higher or lower cue utilisation groups based on their performance on EXPERTise 2.0. When the effects of experience were controlled, higher cue utilisation was specifically associated with greater accuracy in classifying normal images, recording a lower positive response bias. This study suggests that cue utilisation may play a protective role against response biases in histopathology settings.
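The 'positive response bias' above is the signal-detection notion of a liberal decision criterion. A sketch of the standard measures, sensitivity d' and criterion c, computed from hypothetical hit/false-alarm counts (not the study's data); a negative c indicates a bias toward calling cases 'abnormal':

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity (d') and response bias (criterion c).
    Negative c = liberal bias (more 'abnormal' calls on normal tissue);
    positive c = conservative bias."""
    nd = NormalDist()
    # log-linear correction avoids infinite z-scores at rates of 0 or 1
    h = (hits + 0.5) / (hits + misses + 1)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    zh, zf = nd.inv_cdf(h), nd.inv_cdf(f)
    d_prime = zh - zf
    criterion = -(zh + zf) / 2
    return d_prime, criterion

# A reader calling many normals 'abnormal' under high disease prevalence:
print(sdt_measures(hits=90, misses=10, false_alarms=40, correct_rejections=60))
```

With these counts the criterion comes out negative, the liberal bias the study links to high-prevalence contexts.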


Subject(s)
Cues , Bias , Humans
9.
Hand Surg Rehabil ; 41(1): 125-130, 2022 02.
Article in English | MEDLINE | ID: mdl-34700023

ABSTRACT

Operative repair of flexor tendons after traumatic injury may be performed under general anesthesia (GA), regional blocks, or a wide-awake local anesthesia no tourniquet (WALANT) technique. To our knowledge there are currently no large-scale reports evaluating outcomes of flexor tendon repair in patients where wide-awake anesthesia was utilized in comparison to regional anesthesia (RA) and general anesthesia. We performed a retrospective analysis of patients who underwent treatment for flexor tendon injuries at a tertiary referral center for hand surgery over a two-year period. A total of 151 patients were included (53 WALANT, 57 RA, and 41 GA) and a total of 251 tendons were repaired (63 WALANT, 104 RA and 84 GA). No statistically significant difference was observed in rates of tendon rupture, adhesions, infection, or hand function. Flexor tendon repair under WALANT was found to be safe, with operative and functional outcomes comparable to more traditional anesthetic techniques. Additional advantages include the ability to test the repair intraoperatively, opportunities for patient education, and the potential to boost theatre efficiency. Further studies, preferably utilizing a randomized trial methodology, may further elucidate the benefits and risks of WALANT versus regional and general anesthesia.


Subject(s)
Anesthesia, Local , Anesthetics, Local , Anesthesia, General , Anesthesia, Local/methods , Humans , Retrospective Studies , Tendons/surgery
10.
BMJ Open ; 11(7): e046865, 2021 07 05.
Article in English | MEDLINE | ID: mdl-34226221

ABSTRACT

BACKGROUND AND OBJECTIVE: Serum iron results are not indicative of iron deficiency yet may be incorrectly used to diagnose iron deficiency instead of serum ferritin results. Our objective was to determine the association between serum iron test results and iron-deficiency diagnosis in children by general practitioners. DESIGN, SETTING, PATIENTS AND MAIN OUTCOME MEASURES: A retrospective observational study of 14 187 children aged 1-18 years with serum ferritin and serum iron test results from 137 general practices in Victoria, Australia, between 2008 and 2018. Generalised estimating equation models calculating ORs were used to determine the association between serum iron test results (main exposure measure) and iron-deficiency diagnosis (outcome measure) in the following two population groups: (1) iron-deplete population, defined as having a serum ferritin <12 µg/L if aged <5 years and <15 µg/L if aged ≥5 years and (2) iron-replete population, defined as having a serum ferritin >30 µg/L. RESULTS: 3484 tests were iron deplete and 15 528 were iron replete. Iron-deplete children were less likely to be diagnosed with iron deficiency if they had normal serum iron levels (adjusted OR (AOR): 0.73; 95% CI 0.57 to 0.96). Iron-replete children had greater odds of an iron-deficiency diagnosis if they had low serum iron results (AOR: 2.59; 95% CI 1.72 to 3.89). Other contributors to an iron-deficiency diagnosis were female sex and having anaemia. CONCLUSION: Serum ferritin alone remains the best means of diagnosing iron deficiency. Reliance on serum iron test results by general practitioners is leading to significant overdiagnosis and underdiagnosis of iron deficiency in children.


Subject(s)
Anemia, Iron-Deficiency , Anemia, Iron-Deficiency/diagnosis , Anemia, Iron-Deficiency/epidemiology , Child , Female , Ferritins , Humans , Iron , Retrospective Studies , Victoria
11.
Arch Oral Biol ; 129: 105167, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34126418

ABSTRACT

OBJECTIVES: The aim of this systematic review was to summarize the existing evidence on the local production and systemic traces of reactive oxygen species (ROS) in apical periodontitis (AP). DESIGN: A search of MEDLINE-PubMed and EMBASE was conducted up to 12 January 2021 to identify studies in six different languages. Eligibility was evaluated and data were extracted from the eligible studies following the predefined objective. The Newcastle-Ottawa Scale was used for quality assessment of the included studies. RESULTS: After screening, 21 papers met the inclusion criteria. Six studies were about systemic oxidative stress, 14 studies examined local production of reactive oxygen species and one studied both. ROS modulate cell signalling and cause oxidant imbalance locally at the site of AP. Cell signalling leads to a pro-inflammatory response, activation of MMPs and formation and progression of the AP lesion. Simultaneously, these oxidative stress biomarkers are also found in blood and saliva of subjects with AP. CONCLUSIONS: Understanding the mechanism of ROS generation, involved in chronic inflammation, can provide us with important information to enhance local and systemic healing and possibly improve diagnostic tools. A consideration for future research would be to use antioxidants to accelerate the return to oxidative balance.


Subject(s)
Periapical Periodontitis , Antioxidants , Biomarkers , Humans , Oxidative Stress , Reactive Oxygen Species
12.
Contemp Clin Trials Commun ; 21: 100686, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33490705

ABSTRACT

Increased systemic inflammation has been identified in the presence of oral disease, specifically endodontic disease. It is important to investigate whether treatment of the oral disease ameliorates systemic inflammation. Furthermore, there is no information about the extent to which different microorganisms may trigger the inflammatory response. OBJECTIVES: Primarily, (i) to compare the plasma concentrations of inflammatory mediators of apical periodontitis (AP) subjects to controls, (ii) to evaluate whether elimination of the endodontic infection reduces systemic inflammation, and (iii) to investigate the microbiome of root canal infections. Secondarily, (i) to correlate the inflammatory mediator data with the microbiome data to investigate whether the type of infection influences the type and severity of the inflammatory condition, and (ii) to examine patterns in the inflammatory mediator data before and after tooth extraction in order to establish a biomarker signature of AP/oral disease. This is a multi-centre prospective case-control intervention study. The cohort will consist of 30 healthy human volunteers with one or two teeth with a root-tip inflammation and 30 matched healthy controls. Peripheral blood will be drawn at 6 time points, 3 before and 3 after the extraction of the tooth with apical periodontitis. The teeth will be pulverized, and DNA extraction and sequencing will be performed. This study aims to compare the concentrations of inflammatory blood plasma proteins between AP subjects and controls at different time points before and after the tooth extraction in a systematic and complete way. Additionally, the composition of the root canal microbiome in association with the inflammatory response of the host will be assessed.

13.
BMC Health Serv Res ; 20(1): 883, 2020 Sep 18.
Article in English | MEDLINE | ID: mdl-32948168

ABSTRACT

BACKGROUND: Internationally, point prevalence surveys are the main source of antibiotic use data in residential aged care (RAC). Our objective was to describe temporal trends in antibiotic use and antibiotics flagged for restricted use, resident characteristics associated with use, and variation in use by RAC home, using electronic health record data. METHODS: We conducted a retrospective cohort study of 9793 unique residents aged ≥65 years in 68 RAC homes between September 2014 and September 2017, using electronic health records. We modelled the primary outcome of days of antibiotic therapy/1000 resident days (DOT/1000 days), and the secondary outcomes of number of courses/1000 days and the annual prevalence of antibiotic use. Antibiotic use was examined for all antibiotics and for antibiotics on the World Health Organization's (WHO) Watch List (i.e. antibiotics flagged for restricted use). RESULTS: In 2017, there were 85 DOT/1000 days (99% CI: 79, 92), 8.0 courses/1000 days (99% CI: 7.6, 8.5), and 63.4% (99% CI: 61.9, 65.0) of residents received at least one course of antibiotics. There were 7.7 DOT/1000 days (99% CI: 6.69, 8.77) of antibiotics on the WHO Watch List administered in 2017. Antibiotic use increased annually by 4.09 DOT/1000 days (99% CI: 1.18, 6.99) before adjusting for resident factors, and by 3.12 DOT/1000 days (99% CI: -0.05, 6.29) after adjustment. The annual prevalence of antibiotic use decreased from 68.4% (99% CI: 66.9, 69.9) in 2015 to 63.4% (99% CI: 61.9, 65.0) in 2017, suggesting fewer residents were on antibiotics but were using them for longer. Resident factors associated with higher use were increasing age, chronic respiratory disease, and a history of urinary tract infections or skin and soft tissue infections; dementia was associated with lower use. RAC home-level antibiotic use ranged from 44.0 to 169.2 DOT/1000 days in 2016. Adjusting for resident factors marginally reduced this range (42.6 to 155.5 DOT/1000 days).
CONCLUSIONS: Antibiotic course length and RAC homes with high use should be a focus of antimicrobial stewardship interventions. Practices in RAC homes with low use could inform interventions and warrant further investigation. This study provides a model for using electronic health records as a data source for antibiotic use surveillance in RAC.
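The primary outcome above, days of therapy per 1,000 resident days, can be sketched directly from course start and end dates; the counts below are illustrative, not the study's data:

```python
from datetime import date

def dot_per_1000(courses, resident_days):
    """Days of therapy (DOT) per 1,000 resident days: every calendar day a
    resident receives a given antibiotic counts as one DOT, summed over
    residents and agents, then scaled by the denominator of care days."""
    dot = sum((end - start).days + 1 for start, end in courses)
    return 1000 * dot / resident_days

# Hypothetical RAC home: three courses over one month with 30 residents
courses = [
    (date(2017, 9, 1), date(2017, 9, 7)),    # 7 DOT
    (date(2017, 9, 10), date(2017, 9, 14)),  # 5 DOT
    (date(2017, 9, 20), date(2017, 9, 22)),  # 3 DOT
]
resident_days = 30 * 30                      # 30 residents x 30 days
print(dot_per_1000(courses, resident_days))  # 15 DOT over 900 days, about 16.7
```

Because DOT counts days rather than courses, the metric rises when course lengths grow even if fewer residents receive antibiotics, which is the pattern the study reports.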


Subject(s)
Anti-Bacterial Agents/therapeutic use , Drug Utilization/statistics & numerical data , Electronic Health Records , Homes for the Aged/statistics & numerical data , Nursing Homes/statistics & numerical data , Aged , Aged, 80 and over , Antimicrobial Stewardship/statistics & numerical data , Australia , Female , Humans , Male , Retrospective Studies , Urinary Tract Infections/drug therapy
14.
Yearb Med Inform ; 26(1): 59-67, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28480477

ABSTRACT

Objectives: To set the scientific context and then suggest principles for an evidence-based approach to secondary uses of clinical data, covering both evaluation of the secondary uses of data and evaluation of health systems and services based upon secondary uses of data. Method: Working Group review of selected literature and policy approaches. Results: We present important considerations in the evaluation of secondary uses of clinical data from the angles of governance and trust, theory, semantics, and policy. We make the case for a multi-level and multi-factorial approach to the evaluation of secondary uses of clinical data and describe a methodological framework for best practice. We emphasise the importance of evaluating the governance of secondary uses of health data in maintaining trust, which is essential for such uses. We also offer examples of the re-use of routine health data to demonstrate how it can support evaluation of clinical performance and optimize health IT system design. Conclusions: Great expectations are resting upon "Big Data" and innovative analytics. However, to build and maintain public trust, improve data reliability, and assure the validity of analytic inferences, there must be independent and transparent evaluation. A mature and evidence-based approach requires not merely data science, but guidance from the broader concerns of applied health informatics.


Subject(s)
Medical Informatics Applications , Medical Records , Humans , Reproducibility of Results
15.
Yearb Med Inform ; (1): 61-69, 2016 Nov 10.
Article in English | MEDLINE | ID: mdl-27830232

ABSTRACT

BACKGROUND AND OBJECTIVES: With growing use of IT by healthcare professionals and patients, the opportunity for any unintended effects of technology to disrupt health care processes and outcomes is intensified. The objectives of this position paper by the IMIA Working Group (WG) on Technology Assessment and Quality Development are to highlight how our ongoing initiatives to enhance evaluation are also addressing the unintended consequences of health IT. METHODS: Review of WG initiatives. RESULTS: We argue that an evidence-based approach underpinned by rigorous evaluation is fundamental to the safe and effective use of IT, and for detecting and addressing its unintended consequences in a timely manner. We provide an overview of our ongoing initiatives to strengthen study design, execution and reporting by using evaluation frameworks and guidelines which can enable better characterization and monitoring of unintended consequences, including the Good Evaluation Practice Guideline in Health Informatics (GEP-HI) and the Statement on Reporting of Evaluation Studies in Health Informatics (STARE-HI). Indicators to benchmark the adoption and impact of IT can similarly be used to monitor unintended effects on healthcare structures, processes and outcomes. We have also developed EvalDB, a web-based database of evaluation studies to promulgate evidence about unintended effects, and are developing the content for courses to improve training in health IT evaluation. CONCLUSION: Evaluation is an essential ingredient for the effective use of IT to improve healthcare quality and patient safety. WG resources and skills-development initiatives can facilitate a proactive and evidence-based approach to detecting and addressing the unintended effects of health IT.


Subject(s)
Medical Informatics , Technology Assessment, Biomedical/standards , Telemedicine , Humans , Medical Informatics/education , Organizational Policy , Patient Safety , Societies, Medical , Technology Assessment, Biomedical/methods
16.
BMJ Open ; 6(10): e011811, 2016 10 21.
Article in English | MEDLINE | ID: mdl-27797997

ABSTRACT

INTRODUCTION: Medication errors are the most frequent cause of preventable harm in hospitals. Medication management in paediatric patients is particularly complex and consequently the potential for harm is greater than in adults. Electronic medication management (eMM) systems are heralded as a highly effective intervention to reduce adverse drug events (ADEs), yet internationally evidence of their effectiveness in paediatric populations is limited. This study will assess the effectiveness of an eMM system to reduce medication errors, ADEs and length of stay (LOS). The study will also investigate system impact on clinical work processes. METHODS AND ANALYSIS: A stepped-wedge cluster randomised controlled trial (SWCRCT) will measure changes pre-eMM and post-eMM system implementation in prescribing and medication administration error (MAE) rates, potential and actual ADEs, and average LOS. In stage 1, 8 wards within the first paediatric hospital will be randomised to receive the eMM system 1 week apart. In stage 2, the second paediatric hospital will randomise implementation of a modified eMM and outcomes will be assessed. Prescribing errors will be identified through record reviews, and MAEs through direct observation of nurses and record reviews. Actual and potential severity will be assigned. Outcomes will be assessed at the patient level using mixed models, taking into account correlation of admissions within wards and multiple admissions for the same patient, with adjustment for potential confounders. Interviews and direct observation of clinicians will investigate the effects of the system on workflow. Data from site 1 will be used to develop improvements in the eMM and implemented at site 2, where the SWCRCT design will be repeated (stage 2). ETHICS AND DISSEMINATION: The research has been approved by the Human Research Ethics Committee of the Sydney Children's Hospitals Network and Macquarie University.
Results will be reported through academic journals and seminar and conference presentations. TRIAL REGISTRATION NUMBER: Australian New Zealand Clinical Trials Registry (ANZCTR) 370325.


Subject(s)
Drug Monitoring/methods , Drug-Related Side Effects and Adverse Reactions/prevention & control , Electronics, Medical , Hospitals, Pediatric , Length of Stay , Medication Errors/prevention & control , Medication Systems, Hospital , Child , Humans , Pediatrics , Pharmaceutical Preparations , Research Design
17.
J Hosp Infect ; 92(4): 392-6, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26876747

ABSTRACT

BACKGROUND: In June 2014, a cluster of identical Stenotrophomonas maltophilia isolates was reported in an adult intensive care unit (ICU) at a district general hospital. An outbreak control team was convened to investigate the cluster and inform control measures. AIM: To identify potential risk factors for isolation of S. maltophilia in this setting. METHODS: We conducted a cohort study of ICU patients for whom a bronchoalveolar lavage (BAL) specimen was submitted between October 2013 and October 2014. Cases were patients with S. maltophilia-positive BAL. We calculated the association between isolation of S. maltophilia and patient characteristics using risk ratios (RRs) with 95% confidence intervals (95% CIs) and univariate logistic regression. Chi-squared or Fisher's exact tests were used. BAL specimens were microbiologically typed using pulsed-field gel electrophoresis (PFGE). FINDINGS: Eighteen patients met the case definition. Two patients had clinical presentations that warranted antibiotic treatment for S. maltophilia. All cases were exposed to bronchoscopy. PFGE typing revealed clusters of two strain types. We found statistically significant elevated risks of isolating BRISPOSM-4 in patients exposed to bronchoscope A (RR: 13.56; 95% CI: 1.82-100; P < 0.001) and BRISPOSM-3 in patients exposed to bronchoscope B (RR: 16.89; 95% CI: 2.14-133; P < 0.001). S. maltophilia type BRISPOSM-4 was isolated in water used to flush bronchoscope A after decontamination. CONCLUSION: Two pseudo-outbreaks occurred in which BAL specimens had been contaminated by reusable bronchoscopes. We cannot exclude the potential for colonization of the lower respiratory tract of exposed patients. Introduction of single-use bronchoscopes was an effective control measure.


Subject(s)
Bronchoscopes/microbiology , Cross Infection/epidemiology , Disease Outbreaks , Gram-Negative Bacterial Infections/epidemiology , Stenotrophomonas maltophilia/isolation & purification , Adult , Aged , Aged, 80 and over , Bronchoalveolar Lavage Fluid/microbiology , Cohort Studies , Cross Infection/microbiology , England/epidemiology , Female , Gram-Negative Bacterial Infections/microbiology , Hospitals, District , Hospitals, General , Humans , Intensive Care Units , Male , Middle Aged , Risk Assessment
18.
Appl Clin Inform ; 6(3): 443-53, 2015.
Article in English | MEDLINE | ID: mdl-26448790

ABSTRACT

OBJECTIVES: To assess the impact of introducing a new Picture Archiving and Communication System (PACS) and Radiology Information System (RIS) on: (i) Medical Imaging work processes; and (ii) turnaround times (TATs) for x-ray and CT scan orders initiated in the Emergency Department (ED). METHODS: We employed a mixed method study design comprising: (i) semi-structured interviews with Medical Imaging Department staff; and (ii) retrospectively extracted ED data before (March/April 2010) and after (March/April 2011 and 2012) the introduction of a new PACS/RIS. TATs were calculated as: processing TAT (median time from image ordering to examination) and reporting TAT (median time from examination to final report). RESULTS: Reporting TAT for x-rays decreased significantly after introduction of the new PACS/RIS; from a median of 76 hours to 38 hours per order (p<.0001) for patients discharged from the ED, and from 84 hours to 35 hours (p<.0001) for patients admitted to hospital. Medical Imaging staff reported that the changeover to the new PACS/RIS led to gains in efficiency, particularly regarding the accessibility of images and patient-related information. Nevertheless, assimilation of the new PACS/RIS with existing Departmental work processes was considered inadequate and in some instances unsafe. Issues highlighted related to the synchronization of work tasks (e.g., porter arrangements) and the material set up of the work place (e.g., the number and location of computers). CONCLUSIONS: The introduction of new health IT can be a "double-edged sword" providing improved efficiency but at the same time introducing potential hazards affecting the effectiveness of the Medical Imaging Department.
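The two TAT definitions above (processing: image ordering to examination; reporting: examination to final report) reduce to medians over per-order timestamp differences. A sketch with hypothetical timestamps, not the study's records:

```python
from datetime import datetime
from statistics import median

def turnaround_hours(orders):
    """Median processing TAT (order -> examination) and reporting TAT
    (examination -> final report) in hours, as defined in the study."""
    proc = [(exam - ordered).total_seconds() / 3600 for ordered, exam, _ in orders]
    rep = [(report - exam).total_seconds() / 3600 for _, exam, report in orders]
    return median(proc), median(rep)

# Hypothetical ED x-ray orders: (ordered, examined, final report)
fmt = "%Y-%m-%d %H:%M"
orders = [tuple(datetime.strptime(t, fmt) for t in row) for row in [
    ("2011-03-01 09:00", "2011-03-01 09:40", "2011-03-02 21:00"),
    ("2011-03-01 10:00", "2011-03-01 11:00", "2011-03-03 00:00"),
    ("2011-03-02 08:00", "2011-03-02 08:30", "2011-03-03 20:30"),
]]
print(turnaround_hours(orders))
```

Medians are used, as in the study, because report times are heavily right-skewed by backlogs.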


Subject(s)
Diagnostic Imaging , Medical Informatics/organization & administration , Radiology Information Systems , Workflow , Access to Information , Humans , Research Design , Time Factors
19.
Br J Anaesth ; 115(4): 601-7, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26385668

ABSTRACT

BACKGROUND: The intensive care unit (ICU) is a high-cost, high-risk area in which analysis of previous litigation cases may help to improve future practice. METHODS: Claims received by the National Health Service Litigation Authority (NHSLA) relating to ICU, from 1995 to 2012, were analysed according to clinical categories. The severity of outcome was classified as none, low, moderate, severe and death. The costs of the claims were corrected to 2013 values using the retail prices index. RESULTS: Of the 523 claims in the NHSLA dataset, 210 were excluded (as the claims did not relate to ICU care) and 313 were included in the analysis. The commonest claim categories were [number (% of all claims)]: positioning/nursing standards/skin care (mostly relating to pressure sores), 86 (28%); infection, 79 (26%); and respiratory/airway, 63 (20%). The commonest claims relating to patient death were: respiratory/airway (30%), missed/delayed diagnosis (20%) and paediatrics (17%). The claim categories with the highest proportion of severe outcomes were: positioning/nursing standards/skin care (52%) and infection (33%). The total cost of closed claims was £19,973,339. The categories incurring the highest costs were: infection (£6.6 million), positioning/skin care/nursing standards (£4.5 million) and delayed/inadequate treatment (£4.3 million). CONCLUSIONS: Litigation arising from care in the ICU is common, costly, and likely to follow a poor outcome. Whilst the importance of airway/respiratory care and infection control measures is highlighted, the clear prominence of pressure sores in ICU-related litigation is worrisome and represents one particular area for practice improvement.


Subject(s)
Intensive Care Units/legislation & jurisprudence , Intensive Care Units/statistics & numerical data , Malpractice/legislation & jurisprudence , Malpractice/statistics & numerical data , National Health Programs/legislation & jurisprudence , National Health Programs/statistics & numerical data , Child , England , Humans
20.
J Viral Hepat ; 22(12): 1079-87, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26146764

ABSTRACT

The kinetics of serum HBsAg and interferon-inducible protein 10 (IP10) levels in patients with chronic hepatitis B infection treated with tenofovir are unclear. We evaluated the changes of HBsAg levels and the predictability of IP10 for HBsAg decline in 160 HBeAg-negative patients receiving tenofovir for ≥12 months. Serum samples taken before and at 6, 12, 24, 36 and 48 months after tenofovir were tested for HBsAg levels. In 104 patients, serum samples before tenofovir were tested for IP10 levels. Compared to before tenofovir, HBsAg levels decreased by a median of 0.08, 0.11, 0.24, 0.33 and 0.38 log10 IU/mL at 6, 12, 24, 36 and 48 months, respectively (P < 0.001). HBsAg kinetics did not differ between nucleos(t)ide analogue(s) naive and experienced patients. The 12-, 24-, 36- and 48-month cumulative rates of ≥0.5 log10 HBsAg decline were 8%, 16%, 24% and 41% and of HBsAg ≤100 IU/mL were 9%, 12%, 14% and 18%, respectively. The only factor associated with HBsAg ≤100 IU/mL was lower HBsAg levels before tenofovir (P < 0.001), while HBsAg decline ≥0.5 log10 was associated with higher IP10 levels (P = 0.002) and particularly with IP10 > 350 pg/mL (P < 0.001). In conclusion, tenofovir decreases serum HBsAg levels in both nucleos(t)ide analogue(s) naive and experienced patients with HBeAg-negative chronic hepatitis B infection. After 4 years of therapy, HBsAg ≤100 IU/mL can be achieved in approximately 20% of patients, particularly in those with low baseline HBsAg levels. HBsAg decline is slow (≥0.5 log10 in 40% of patients after 4 years) and is associated only with higher baseline serum IP10 levels.
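The endpoints above are log10 declines from baseline plus an absolute HBsAg threshold; a small sketch with hypothetical values, not patient data from the study:

```python
import math

def log10_decline(baseline_iu_ml, followup_iu_ml):
    """HBsAg decline in log10 IU/mL relative to baseline (positive = drop)."""
    return math.log10(baseline_iu_ml) - math.log10(followup_iu_ml)

# Hypothetical patient: 4,200 IU/mL at baseline, 1,300 IU/mL at 48 months
d = log10_decline(4200, 1300)
met_half_log = d >= 0.5       # the >=0.5-log decline endpoint
met_100_iu = 1300 <= 100      # the HBsAg <=100 IU/mL endpoint
print(round(d, 2), met_half_log, met_100_iu)  # -> 0.51 True False
```

This illustrates why the two endpoints diverge in the abstract: a patient can clear the half-log decline while remaining far above 100 IU/mL.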


Subject(s)
Chemokine CXCL10/blood , Hepatitis B Surface Antigens/blood , Hepatitis B e Antigens/blood , Hepatitis B, Chronic/drug therapy , Tenofovir/therapeutic use , Antiviral Agents/therapeutic use , DNA, Viral/blood , Female , Hepatitis B Surface Antigens/immunology , Hepatitis B e Antigens/immunology , Hepatitis B virus/genetics , Hepatitis B virus/immunology , Hepatitis B, Chronic/blood , Hepatitis B, Chronic/immunology , Humans , Male , Middle Aged