ABSTRACT
Purpose: To investigate the effect of obesity on mortality and invasive respiratory care (IRC) in patients with COVID-19. Methods: Over 34 months we studied 1,105 patients and collected clinical data. The primary outcome was all-cause death at 29 days. The secondary outcome was IRC, indicated by pulse oximetry below 93% at a mask oxygen flow of 5 L/min or more. Results: Age- and sex-adjusted multivariate regression analysis for 29-day death showed that a body mass index (BMI) > 19.6 kg/m² was significant (odds ratio 0.117, 95% confidence interval 0.052-0.265, P < 0.001). Graphs with BMI on the abscissa showed, for BMI between 11 and 25 kg/m², a decreasing pattern for mortality and the IRC rate, with no increase in the overweight range. Conclusion: In Japanese COVID-19 patients, the risk of mortality and the IRC rate decreased with increasing BMI across the underweight range and remained low in overweight patients, suggesting the relevance of the obesity paradox.
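For readers reproducing this kind of analysis, the following is a minimal sketch (not the study's actual code) of an age- and sex-adjusted logistic regression of 29-day death against a BMI cutoff; the DataFrame and its column names are hypothetical.

```python
# Minimal sketch (not the study's code): age- and sex-adjusted logistic regression
# of 29-day death on a BMI cutoff. The DataFrame `df` and its columns
# ('death29', 'age', 'sex' coded 0/1, 'bmi') are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def bmi_cutoff_model(df: pd.DataFrame, cutoff: float = 19.6):
    X = pd.DataFrame({
        "bmi_above_cutoff": (df["bmi"] > cutoff).astype(int),
        "age": df["age"],
        "sex": df["sex"],
    })
    X = sm.add_constant(X)                              # intercept
    fit = sm.Logit(df["death29"], X).fit(disp=0)
    odds_ratio = np.exp(fit.params["bmi_above_cutoff"])
    ci_low, ci_high = np.exp(fit.conf_int().loc["bmi_above_cutoff"])
    return odds_ratio, (ci_low, ci_high), fit.pvalues["bmi_above_cutoff"]
```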
Subjects
COVID-19, Overweight, Humans, Overweight/complications, Overweight/epidemiology, Treatment Outcome, Obesity Paradox, East Asian People, COVID-19/epidemiology, COVID-19/complications, Body Mass Index, Risk Factors, Retrospective Studies
ABSTRACT
Clostridioides (Clostridium) difficile is the leading cause of healthcare-associated infectious diarrhea in the developed world. Retrospective studies have shown a lower incidence of C. difficile infection (CDI) in Japan than in Europe or North America. Prospective studies are needed to determine whether this is due to a lack of testing for C. difficile or a true difference in CDI epidemiology. A prospective cohort study of CDI was conducted from May 2014 to May 2015 at 12 medical facilities (20 wards) in Japan. Patients with at least three diarrheal bowel movements (Bristol stool grade 6-7) in the preceding 24 h were enrolled. CDI was defined by a positive result on enzyme immunoassay for toxins A/B, nucleic acid amplification test for the toxin B gene, or toxigenic culture. C. difficile isolates were subjected to PCR-ribotyping (RT), slpA-sequence typing (slpA-ST), and antimicrobial susceptibility testing. The overall incidence of CDI was 7.4/10,000 patient-days (PD). The incidence was highest in the five ICU wards (22.2 CDI/10,000 PD; range: 13.9-75.5/10,000 PD). The testing frequency and CDI incidence rate were highly correlated (R² = 0.91). Of the 146 isolates, RT018/018″ was dominant (29%), followed by types 014 (23%), 002 (12%), and 369 (11%). Among the 15 non-ICU wards, two had high CDI incidence rates (13.0 and 15.9 CDI/10,000 PD), with clusters of RT018/slpA-ST smz-02 and 018″/smz-01, respectively. Three binary toxin-positive isolates other than RT027 or RT078 were found. All RT018/018″ isolates were resistant to moxifloxacin, gatifloxacin, clindamycin, and erythromycin. By actively identifying and testing patients with clinically significant diarrhea, this study identified a higher CDI incidence in Japanese hospitals than previously reported. This suggests that numerous patients with CDI are being overlooked because of inadequate diagnostic testing in Japan.
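The rate and correlation arithmetic behind these figures can be sketched as follows; the per-ward values in the example are hypothetical and only the form of the R² computation mirrors the abstract.

```python
# Sketch of the rate and correlation arithmetic only; the per-ward values below are
# hypothetical and do not reproduce the study's data.
import numpy as np

def incidence_per_10000_pd(n_cdi: int, patient_days: int) -> float:
    return 10000.0 * n_cdi / patient_days

tests_per_10000_pd = np.array([20.0, 35.0, 50.0, 80.0, 120.0])   # testing frequency per ward
cdi_per_10000_pd = np.array([3.0, 6.0, 8.5, 14.0, 22.0])         # CDI incidence per ward

r = np.corrcoef(tests_per_10000_pd, cdi_per_10000_pd)[0, 1]
print(f"R^2 = {r**2:.2f}")    # the abstract reports R^2 = 0.91 for testing vs incidence
```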
Subjects
Clostridioides difficile, Clostridium Infections/epidemiology, Clostridium Infections/microbiology, Anti-Bacterial Agents/pharmacology, Anti-Bacterial Agents/therapeutic use, Clostridioides difficile/classification, Clostridioides difficile/drug effects, Clostridioides difficile/genetics, Medical Geography, Humans, Incidence, Japan/epidemiology, Microbial Sensitivity Tests, Molecular Typing, Public Health Surveillance, Retrospective Studies, Ribotyping
ABSTRACT
BACKGROUND: The optimal and practical laboratory diagnostic approach for detection of Clostridioides difficile to aid in the diagnosis of C. difficile infection (CDI) is controversial. A two-step algorithm with initial detection of glutamate dehydrogenase (GDH), or a nucleic acid amplification test (NAAT) alone, is recommended as the predominant method for C. difficile detection in developed countries. The aim of this study was to compare the performance of enzyme immunoassays (EIA) detecting toxins A and B, NAAT detecting the toxin B gene, and GDH against toxigenic culture (TC) for C. difficile as the gold standard, in patients with clinically significant diarrhea assessed prospectively and actively in 12 medical facilities in Japan. METHODS: A total of 650 stool specimens were collected from 566 patients with at least three diarrheal bowel movements (Bristol stool grade 6-7) in the preceding 24 h. EIA and GDH were performed at each hospital, and NAAT and toxigenic C. difficile culture with enriched media were performed at the National Institute of Infectious Diseases. All C. difficile isolates recovered were analyzed by PCR-ribotyping. RESULTS: Compared to TC, the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of EIA were 41%, 96%, 75%, and 84%, respectively, and of NAAT were 74%, 98%, 91%, and 92%, respectively. In the 439 specimens tested with GDH, the sensitivity, specificity, PPV, and NPV were 73%, 87%, 65%, and 91%, and for a two-step algorithm (GDH plus toxin EIA, arbitrated by NAAT) they were 71%, 96%, 85%, and 91%, respectively. Among the 157 isolates recovered, 75% corresponded to one of the PCR ribotypes (RTs) 002, 014, 018/018″, and 369; RT027 was not isolated. No clear differences were found in the sensitivities of EIA, NAAT, or GDH for the four predominant RTs. CONCLUSION: The analytical sensitivities of NAAT and the GDH algorithm for detecting toxigenic C. difficile in this study were lower than in most previous reports. This study also found a low PPV for EIA. The optimal method to detect C. difficile or its toxins to assist in the diagnosis of CDI needs further investigation.
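As a worked illustration of the comparisons reported here, the sketch below computes sensitivity, specificity, PPV, and NPV against toxigenic culture from paired results, and encodes one common reading of the two-step algorithm (GDH plus toxin EIA, with discordant results arbitrated by NAAT); the helper names are hypothetical.

```python
# Hypothetical helpers (not the study's code): 2x2 performance metrics against
# toxigenic culture (TC), and one common reading of the two-step algorithm
# (GDH plus toxin EIA, discordant results arbitrated by NAAT).
from typing import Sequence

def performance(index_test: Sequence[bool], tc: Sequence[bool]) -> dict:
    tp = sum(i and t for i, t in zip(index_test, tc))
    fp = sum(i and not t for i, t in zip(index_test, tc))
    fn = sum(not i and t for i, t in zip(index_test, tc))
    tn = sum(not i and not t for i, t in zip(index_test, tc))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

def two_step_algorithm(gdh: bool, toxin_eia: bool, naat: bool) -> bool:
    # Concordant GDH/toxin results are reported as-is; discordant results go to NAAT.
    return gdh if gdh == toxin_eia else naat
```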
Subjects
Bacteriological Techniques, Clostridioides difficile/genetics, Clostridium Infections/diagnosis, Clostridium Infections/microbiology, Bacterial Toxins/genetics, Bacteriological Techniques/methods, Bacteriological Techniques/standards, Clostridioides difficile/classification, Clostridioides difficile/isolation & purification, Clostridium Infections/epidemiology, Female, Humans, Japan/epidemiology, Male, Polymerase Chain Reaction, Prospective Studies, Ribotyping, Sensitivity and Specificity
ABSTRACT
This study was performed to elucidate the relationship between antimicrobial use density (AUD) and Clostridium difficile infection (CDI) manifesting as antimicrobial-associated diarrhea (AAD) in hospital wards over a 4-year period. CDI was defined as an adult exhibiting AAD with a daily stool frequency of three or more, arising at least 48 hours after ward admission, with fecal samples testing positive for toxin A and/or B. Metronidazole or vancomycin was administered orally as treatment. AUDs were calculated for a total of 21 antimicrobials over 48 months and nine wards. For each sample, we included in the analysis the AUD averaged over the two consecutive months around sample submission, together with data on the 2-year study-period division and intensified contact precautions. Of a total of 463 cases, 95 (20.5%) were CDI-positive. Multivariate regression analysis showed odds ratios (OR) of 1.739 (95% confidence interval [CI] 1.050-2.881, P = 0.032) for the AUD of clindamycin and 1.598 (95% CI 1.006-2.539, P = 0.047) for the AUD of piperacillin. Thus, increased ward AUDs of clindamycin and piperacillin may increase the risk of CDI.
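A possible data-handling step matching this description is sketched below: a two-month average of ward-level AUD merged onto each sample record. The exact window (here, the submission month plus the previous month) and the column names are assumptions, not taken from the paper.

```python
# Sketch with a hypothetical data layout: ward-level AUD averaged over a two-month
# window and merged onto each sample record. The window (submission month plus the
# previous month) and the column names are assumptions.
import pandas as pd

# aud: columns ['ward', 'month', 'drug', 'aud']; samples: ['sample_id', 'ward', 'month']
def two_month_aud(aud: pd.DataFrame, samples: pd.DataFrame, drug: str) -> pd.DataFrame:
    d = aud.loc[aud["drug"] == drug].sort_values(["ward", "month"]).copy()
    d["aud_2mo"] = (
        d.groupby("ward")["aud"]
         .rolling(window=2, min_periods=1).mean()
         .reset_index(level=0, drop=True)
    )
    return samples.merge(d[["ward", "month", "aud_2mo"]], on=["ward", "month"], how="left")
```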
Subjects
Anti-Infective Agents/adverse effects, Clostridioides difficile, Clostridium Infections/chemically induced, Diarrhea/chemically induced, Humans, Logistic Models, Risk
ABSTRACT
BACKGROUND: Casirivimab-imdevimab has been developed to neutralize SARS-CoV-2. The global clinical trials in outpatients documented several adverse effects (AE), which mandate caution in Japan, where some patients return home after infusion. We conducted a retrospective study to investigate post-infusion clinical events and their risk factors. MAIN BODY: The subjects were a consecutive series of inpatients with COVID-19 who received an infusion of casirivimab-imdevimab at our institute. The criteria for administration were in accordance with previous clinical trials, e.g., exclusion of patients requiring oxygen supplementation; in Japan, however, SARS-CoV-2 vaccinees were eligible. We reviewed background factors of clinical status, imaging, and laboratory findings against the outcomes of post-infusion events, such as temperature increase (Temp+), pulse oximetry below 94%, and other events. We also documented drug efficacy. Of a total of 96 patients with a median follow-up of 54 days, one (1.0%) died; this patient alone had required oxygen supplementation, as an exception to the criteria. The other 95 patients (99.0%) recovered from fever and hypoxia by Day 4 and had no subsequent worsening of COVID-19. The median increase in body temperature was 1.0 °C, which was used as the cutoff defining Temp+. Multivariate analysis showed that, for Temp+ (n = 47), a white blood cell count above 4.3 × 10³/µL (odds ratio [OR] 2.593, 95% confidence interval [CI] 1.060-6.338, P = 0.037) was a risk factor, whereas two-dose vaccination against SARS-CoV-2 (OR 0.128, 95% CI 0.026-0.636, P = 0.012) was protective. Likewise, for lowered oximetry (n = 21), CT showing bilateral ground-glass attenuation (OR 5.544, CI 1.599-19.228, P = 0.007) was a significant risk factor. Two patients (2.1%) showed bradycardia (asymptomatic, no intervention indicated) on Day 3, with recovery on Day 5. A limitation of this study was the difficulty of distinguishing AE from worsening of COVID-19; we therefore documented them as clinical events. CONCLUSIONS: For 24 h after infusion of casirivimab-imdevimab, COVID-19 patients with increased white blood cell counts may be predisposed to a temperature elevation of more than 1.0 °C, and those with bilateral ground-glass opacity to lowered oximetry. Thus, patients with leukocytosis and bilateral ground-glass attenuation may need precautions for transient fever and hypoxia, respectively.
ABSTRACT
We investigated the relation between hospital antimicrobial use density (AUD) and minimum inhibitory concentrations (MICs) for Pseudomonas aeruginosa in four community hospitals. The subjects were a total of 476 strains isolated from urine, sputum, and pus over seven years beginning in 2002, for which the 50th- and 90th-percentile MICs (MIC50 and MIC90) were analyzed. Hospitals A, B, and C moved in 2000, 2005, and 2009, respectively, but their MIC50 and MIC90 values were stable. MIC values differed significantly for five drugs; Hospital B showed the maximal values for all five, and Hospital D showed the minimal values for four. AUD values differed for nine drugs, with Hospital B showing the highest values for meropenem, flomoxef, and sulbactam/cefoperazone and Hospital D the lowest for meropenem, ceftazidime, cefotaxime, and sulbactam/cefoperazone. Thus, MICs for P. aeruginosa may rise toward resistance in the presence of high AUDs of broad-spectrum antimicrobials.
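For reference, MIC50 and MIC90 are conventionally the MIC values at or below which 50% and 90% of isolates are inhibited; the sketch below assumes that convention, with made-up values.

```python
# Sketch of the conventional MIC50/MIC90 calculation; the example MICs are made up.
import math

def mic_percentile(mics, p):
    """MIC at or below which p% of isolates are inhibited (conventional MIC50/MIC90)."""
    s = sorted(mics)
    k = math.ceil(p / 100 * len(s))   # rank of the percentile isolate
    return s[k - 1]

# Hypothetical MICs (µg/mL) for one drug at one hospital, for illustration only.
example = [0.25, 0.5, 0.5, 1, 1, 2, 2, 4, 8, 16]
print(mic_percentile(example, 50), mic_percentile(example, 90))   # -> 1 8
```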
Subjects
Anti-Bacterial Agents/pharmacology, Anti-Infective Agents, Ceftazidime/pharmacology, Ciprofloxacin/pharmacology, Drug Utilization/statistics & numerical data, Hospitals, Community/statistics & numerical data, Pseudomonas aeruginosa/drug effects, Pseudomonas aeruginosa/isolation & purification, Thienamycins/pharmacology, Drug Resistance, Bacterial, Humans, Japan, Meropenem, Time Factors
ABSTRACT
We aimed to evaluate the risk factors, including the hospital epidemiology of methicillin-resistant Staphylococcus aureus (MRSA), for central venous line-associated and laboratory-confirmed bloodstream infections (CLA-BSI and LC-BSI, respectively). The risk factors examined included the age and sex of patients, whether or not they were in the surgery service, the number of days of central line (CL) placement, the monthly numbers of inpatients and of inpatients positive for MRSA, and whether standard or maximal barrier precautions were observed at CL insertion. As the outcome factors, we selected CLA-BSI and LC-BSI, excluding repeated isolations within 28 days. Of a total of 22,723 device days in 927 patients with CL placement, we observed 81 CLA-BSIs and 40 LC-BSIs, rates of 3.56 and 1.76 per 1,000 device-days, respectively. Logistic regression analysis revealed a single significant factor, CL placement of more than 30 days, with odds ratios of 3.038 (95% confidence interval [CI] 1.733-5.326; P < 0.001) for CLA-BSI and 3.227 (95% CI 1.427-7.299; P = 0.005) for LC-BSI. MRSA was involved in seven events across both BSI types, without temporal clustering. We conclude that prolonged CL placement outweighs the other risk factors, including the hospital epidemiology of MRSA.
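The counting rules described here (excluding repeat isolations within 28 days, then expressing events per 1,000 device-days) can be sketched as follows; the event list and helper names are hypothetical, and only the final printed rate reproduces a figure from the abstract.

```python
# Sketch (hypothetical event list): excluding repeat isolations of the same organism
# in the same patient within 28 days, then expressing BSI counts per 1,000 device-days.
from datetime import timedelta

def dedup_bsi(events):
    """events: list of (patient_id, organism, isolation_date), chronologically sorted."""
    kept, last_counted = [], {}
    for pid, org, d in events:
        key = (pid, org)
        if key in last_counted and (d - last_counted[key]) < timedelta(days=28):
            continue                      # repeat isolation within 28 days: not a new BSI
        last_counted[key] = d
        kept.append((pid, org, d))
    return kept

def rate_per_1000_device_days(n_bsi: int, device_days: int) -> float:
    return 1000.0 * n_bsi / device_days

print(rate_per_1000_device_days(81, 22723))   # -> ~3.56, the CLA-BSI rate reported above
```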
Subjects
Bacteremia/epidemiology, Central Venous Catheterization/adverse effects, Cross Infection/epidemiology, Methicillin-Resistant Staphylococcus aureus, Staphylococcal Infections/epidemiology, Adolescent, Adult, Aged, Aged, 80 and over, Bacteremia/microbiology, Child, Child, Preschool, Cross Infection/microbiology, Female, Humans, Infant, Infant, Newborn, Male, Middle Aged, Risk Factors, Staphylococcal Infections/microbiology, Young Adult
ABSTRACT
PURPOSE: To reduce Clostridioides difficile infection (CDI), we implemented interprofessional antimicrobial, infection control, and diagnostic stewardship (ipAS) conducted by physicians/pharmacists, infection control nurses, and medical technologists, respectively. As a numerical indicator for ipAS, we used antimicrobial use density (AUD) in an 8-year study to validate its efficacy in CDI reduction. PATIENTS AND METHODS: This was an observational study. CDI was defined as stool samples or C. difficile isolates containing toxin A and/or B from a patient with diarrhea occurring three or more times per day. From 2011 to 2018, at a single 10-ward site, the subjects were inpatients with CDI, and the following data were collected: AUDs for 23 antibiotics and antimicrobial test results. By 2015, we had established ipAS, consisting of culture submission before the administration of broad-spectrum antimicrobials, promotion of point-of-care testing for diagnosis-based antimicrobial selection, perioperative prophylactic antibiotics, intervention on positive blood culture results, team rounds for diarrhea, and inspection of contact precautions and disinfection in CDI cases. The study outcomes included the annual numbers of CDI patients and blood culture sets. We compared annual AUDs between the former (2011-2014) and latter (2015-2018) periods using Kruskal-Wallis tests and examined the correlation between AUDs and CDI numbers. RESULTS: Of a total of 50,970 patients, 1,750 underwent C. difficile toxin tests, of whom 171 (9.8%) were positive for CDI. Between the former and latter periods, the AUDs for flomoxef (median 11.96 to 2.71), panipenem/betamipron (0.30 to 0.00), and clindamycin (3.87 to 2.19) decreased significantly (P < 0.05), as did the number of CDIs (26.5 to 10) (P = 0.043). The correlation analysis revealed significant positive Pearson correlations between the AUD for flomoxef and CDI numbers (P = 0.004) and between the AUD for piperacillin/tazobactam and CDI numbers (P = 0.010). CONCLUSION: The integrated antimicrobial, diagnostic, and infection control approach used in ipAS may reduce CDIs.
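The two statistical steps named here can be sketched with SciPy as follows; the annual series are made up for illustration and are not the study's data.

```python
# Sketch (hypothetical annual series, 2011-2018): Kruskal-Wallis comparison of annual
# AUDs between the former (2011-14) and latter (2015-18) periods, and Pearson
# correlation between annual AUD and annual CDI counts.
from scipy import stats

aud_flomoxef = [13.2, 12.0, 11.9, 10.5, 4.0, 3.1, 2.7, 2.3]   # made-up annual AUDs
cdi_counts   = [30, 27, 26, 24, 14, 11, 10, 8]                # made-up annual CDI counts

h, p_kw = stats.kruskal(aud_flomoxef[:4], aud_flomoxef[4:])   # former vs latter period
r, p_r = stats.pearsonr(aud_flomoxef, cdi_counts)             # AUD vs CDI correlation
print(f"Kruskal-Wallis P = {p_kw:.3f}; Pearson r = {r:.2f}, P = {p_r:.3f}")
```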
ABSTRACT
The aim of this study was to elucidate risk factors, including ward antimicrobial use density (AUD), for central line-associated bloodstream infection (CLABSI), as defined by the Centers for Disease Control and Prevention, in a 430-bed community hospital using central venous lines with closed-hub systems. We calculated AUD as (total dose)/(defined daily dose × patient-days) × 1,000 for a total of 20 drugs, nine wards, and 24 months. Into each line-day record we entered the AUD, the device utilization ratio, the number of central line days, and the occurrence of CLABSI. The ratio of susceptible strains among isolates was subjected to correlation analysis with AUD. Of a total of 9,997 line days over 24 months, CLABSI occurred in 33 cases (3.3 per 1,000 line days), 14 (42.4%) of which occurred on the surgical wards (of the nine wards). Of a total of 43 strains isolated, eight (18.6%) were methicillin-resistant Staphylococcus aureus (MRSA); none of the MRSA-positive patients had received cefotiam before the onset of infection. Receiver operating characteristic analysis showed that central line day 7 had the highest accuracy as a cutoff. Logistic regression analysis showed that the central line day variable had an odds ratio of 5.511 (95% confidence interval 1.936-15.690), and the AUD of cefotiam had an odds ratio of 0.220 (95% confidence interval 0.00527-0.922, P = 0.038). The susceptible-strain ratio and AUD showed a negative correlation (R² = 0.1897). Thus, CLABSI could be prevented by keeping the number of central line days as short as possible; the preventive role of AUD remains to be investigated.
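The AUD formula quoted above translates directly into code; the example figures (drug, defined daily dose, patient-days) are illustrative only.

```python
# Direct transcription of the AUD formula given in the abstract:
# AUD = total dose / (defined daily dose x patient-days) x 1,000,
# i.e. DDDs per 1,000 patient-days. The values below are hypothetical.
def aud(total_dose_g: float, ddd_g: float, patient_days: int) -> float:
    return total_dose_g / (ddd_g * patient_days) * 1000.0

# e.g. 600 g of a drug dispensed on a ward over a month with 2,500 patient-days,
# using an assumed DDD of 4 g (illustrative figures only)
print(aud(600.0, 4.0, 2500))   # -> 60.0 DDDs per 1,000 patient-days
```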
ABSTRACT
It is not known whether ward-specific antimicrobial use density (AUD) affects the ratio of methicillin-resistant Staphylococcus aureus (MRSA) among culture-positive S. aureus. A 60-month study was conducted to ascertain the association between the inpatient MRSA ratio and ward-specific AUDs, as well as the former and latter study intervals, specimen types, and ward specialty. During the study, the infection control professionals regulated the use of broad-spectrum antimicrobials and anti-MRSA agents. The ratio of inpatients positive for MRSA to those positive for S. aureus was calculated by month and by ward. Factors analyzed for association with the MRSA ratio included AUDs averaged over the sampling month and the previous month, the outpatient MRSA ratio by age, ward specialty, specimen type, and the half-interval division representing historical change. Of a total of 4,245 S. aureus strains isolated during the 5-year study, 2,232 (52.6%) were MRSA. By year, the outpatient MRSA ratio at age ≥ 15 decreased in later years, as did the inpatient MRSA ratio. Multivariate analysis of the inpatient MRSA ratio revealed positive associations with the AUDs of meropenem (odds ratio [OR] 1.761; 95% confidence interval [CI] 1.761-2.637, P = 0.01), imipenem-cilastatin (OR 1.583; 95% CI 1.087-2.306, P = 0.02), ampicillin-sulbactam (OR 1.623; 95% CI 1.114-2.365, P = 0.01), and minocycline (OR 1.680; CI 1.135-2.487, P = 0.01), with the respiratory care ward (OR 2.292; 95% CI 1.085-4.841, P = 0.03), and with the outpatient MRSA ratio (OR 1.536; 95% CI 1.070-2.206, P = 0.02). Use of broad-spectrum antimicrobials such as meropenem, imipenem-cilastatin, and ampicillin-sulbactam may increase the inpatient MRSA ratio. The ward factor should be included in MRSA surveillance because of its possible effect on AUD and the differing backgrounds of patients.
ABSTRACT
BACKGROUND: Prolonged use of totally implantable access ports (APs) and central lines (CLs) has been known to carry a risk of bloodstream infection (BSI), but the safe cutoff day for discontinuing use remains unknown. We performed a receiver operating characteristic (ROC) curve analysis to determine this cutoff. METHODS: A retrospective 24-month study covered a total of 22,481 days of device use. For each day of use, the following findings were recorded: patient age and sex; presence or absence of diabetes mellitus, preexisting sepsis, and renal disease; and occurrence of device-associated BSI. BSI was defined in accordance with the Centers for Disease Control and Prevention's definition of catheter-related infection. RESULTS: BSIs occurred in 81 patients with an AP, for a BSI rate of 2.81 cases per 1,000 days of use. Among the 896 patients with a CL, the BSI rate was 5.60 cases per 1,000 days of use. The ROC analysis found a cutoff time of 33 days for APs (median days of use, 48) and 10 days for CLs (median days of use, 20.5). For the total 22,481 days of use, the odds ratio between APs and CLs with respect to BSI was 0.556 (95% confidence interval [CI], 0.256-1.208; P = .138). Days of use beyond the cutoff had an odds ratio of 2.867 (95% CI, 1.823-4.507; P < .001). Among the risk factors, preexisting sepsis had an odds ratio of 7.843 (95% CI, 4.666-13.184; P < .001). CONCLUSION: Use of an AP for more than 33 days and a CL for more than 10 days may carry an increased risk of device-associated BSI. These cutoff periods are longer than those expected at the time of device placement and indicate the importance of postplacement care.
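The abstract does not state which ROC criterion defined the cutoffs; the sketch below uses the Youden index, a common choice, with hypothetical inputs and only the standard scikit-learn roc_curve call.

```python
# Sketch (hypothetical inputs): picking a days-of-use cutoff from an ROC analysis
# via the Youden index (sensitivity + specificity - 1). The Youden criterion is an
# assumption about the method, not taken from the abstract.
import numpy as np
from sklearn.metrics import roc_curve

def best_cutoff(days_of_use: np.ndarray, bsi: np.ndarray) -> float:
    fpr, tpr, thresholds = roc_curve(bsi, days_of_use)   # days of use acts as the score
    youden = tpr - fpr
    return float(thresholds[np.argmax(youden)])
```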