ABSTRACT
Enterococcal bone and joint infections (BJIs) are reported to have poor outcomes, but the published results are conflicting. This study aimed to describe the clinical characteristics and outcomes of patients with enterococcal BJI and to assess the factors associated with treatment failure. We conducted a retrospective cohort study at Nimes University Hospital from January 2007 to December 2020. The factors associated with treatment failure were assessed using a Cox model. We included 90 consecutive adult patients: 11 with native BJIs, 40 with prosthetic joint infections and 39 with orthopedic implant-associated infections. Two-thirds of patients had local signs of infection, but few (9%) had fever. Most BJIs were caused by Enterococcus faecalis (n = 82, 91%) and were polymicrobial (n = 75, 83%). The treatment failure rate was 39%; failure was associated with coinfection with Staphylococcus epidermidis (adjusted hazard ratio = 3.04, 95% confidence interval [1.31-7.07], p = 0.01) and with the presence of local signs of inflammation at the time of diagnosis (aHR = 2.39, 95% CI [1.22-4.69], p = 0.01). Our results confirm the poor prognosis of enterococcal BJIs, prompting clinicians to carefully monitor for local signs of infection and to optimize medical-surgical management in cases of coinfection, especially with S. epidermidis.
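The treatment-failure estimates above come from time-to-event (survival) analysis. As a minimal illustration of the underlying product-limit (Kaplan-Meier) estimate, using entirely hypothetical follow-up data rather than the study's, a sketch in Python:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up times (e.g. months to failure or censoring)
    events : 1 if treatment failure was observed, 0 if censored
    Returns a list of (time, survival probability) at each failure time.
    """
    data = sorted(zip(times, events))
    surv = 1.0
    curve = []
    for t in sorted({t for t, e in data if e == 1}):
        at_risk = sum(1 for tt, _ in data if tt >= t)            # still followed just before t
        failures = sum(1 for tt, e in data if tt == t and e == 1)
        surv *= 1 - failures / at_risk
        curve.append((t, surv))
    return curve

# Hypothetical follow-up of five patients: (months, failure flag)
curve = kaplan_meier([3, 6, 6, 12, 24], [1, 1, 0, 1, 0])
```

Each observed failure multiplies the running survival probability by the fraction of still-at-risk patients who did not fail at that time; censored patients simply leave the risk set.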
ABSTRACT
We aimed to assess the factors associated with mortality in patients treated with tocilizumab for SARS-CoV-2 pneumonia due to the delta or omicron variants of concern (VOC), and to detect an effect of tocilizumab on mortality. We conducted a prospective cohort study in a tertiary hospital from 1 August 2021 to 31 March 2022, including patients with severe COVID-19 treated with tocilizumab. Factors associated with mortality were assessed in a Cox model; the 60-day mortality rates of COVID-19 patients treated with standard of care (SoC) with or without tocilizumab were then compared after 1:1 propensity score matching. The mortality rate was 22% (N = 26/118) and was similar between delta and omicron cases (p = 0.6). The factors independently associated with mortality were age (HR 1.06; 95% CI (1.02-1.11), p = 0.002), Charlson index (HR 1.33; 95% CI (1.11-1.6), p = 0.002), WHO-CPS (HR 2.56; 95% CI (1.07-6.22), p = 0.03), and tocilizumab infusion within the first 48 h following hospital admission (HR 0.37, 95% CI (0.14-0.97), p = 0.04). No significant difference in mortality was found between the tocilizumab plus SoC and SoC alone groups (p = 0.5). However, patients treated with tocilizumab within 48 h of hospital admission had better survival (p = 0.04). In conclusion, our results suggest a protective effect on mortality of early tocilizumab administration in patients with severe COVID-19, regardless of the VOC involved.
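The 1:1 propensity score matching mentioned above pairs each tocilizumab-treated patient with the SoC-only patient whose estimated probability of receiving tocilizumab is closest. A minimal greedy nearest-neighbour sketch, assuming propensity scores have already been estimated (all patient identifiers, scores and the caliper below are hypothetical, not the study's):

```python
def greedy_match(treated, controls, caliper=0.05):
    """1:1 nearest-neighbour propensity-score matching without replacement.

    treated, controls : dicts mapping patient id -> estimated propensity score
    caliper           : maximum allowed score difference for a valid match
    Returns a list of (treated_id, control_id) pairs.
    """
    pool = dict(controls)                     # controls still available
    pairs = []
    for tid, score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not pool:
            break
        cid = min(pool, key=lambda c: abs(pool[c] - score))   # closest control
        if abs(pool[cid] - score) <= caliper:
            pairs.append((tid, cid))
            del pool[cid]                     # matching without replacement
    return pairs

# Hypothetical scores: two tocilizumab patients, three SoC-only controls
pairs = greedy_match({"t1": 0.30, "t2": 0.50},
                     {"c1": 0.32, "c2": 0.49, "c3": 0.90})
```

The caliper discards pairs whose scores are too far apart, which trades sample size for comparability between the matched groups.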
ABSTRACT
BACKGROUND: Despite their spread in daily practice, few data are available on clinical factors associated with peripherally inserted central catheter (PICC)-related bloodstream infections (PR-BSI). We aimed to assess PR-BSI incidence, microbiology, and factors associated with PR-BSI, with a focus on clinical symptoms. METHODS: We conducted a retrospective cohort study in a French university hospital. We screened all PICC insertions performed from April 1st, 2018, to April 1st, 2019, and included PICC insertions in adult patients. We assessed the PR-BSI incidence, the factors associated with PR-BSI using a Cox model, and the negative and positive predictive values (NPVs and PPVs) of each clinical sign for PR-BSI. RESULTS: Of the 901 PICCs inserted in 783 patients (38,320 catheter-days), 214 PICCs (24%) presented with a complication. The most prevalent complication was PR-BSI (1.9 per 1000 catheter-days; 8.1% of inserted PICCs). Enterobacterales (N = 27, 37%) and coagulase-negative staphylococci (N = 24, 33%) were the main microorganisms responsible for PR-BSI. Factors independently associated with the occurrence of PR-BSI were fever (hazard ratio 13.21, 95% confidence interval 6.00-29.11, p < 0.001) and chills (HR 3.66, 95% CI 1.92-6.99, p < 0.001). All clinical signs, as well as a duration of PICC maintenance ≥ 28 days, had low PPVs (≤ 67.1%) but high NPVs (≥ 92.5%) for PR-BSI. CONCLUSIONS: Careful monitoring of clinical signs, especially fever and chills, together with limiting the duration of device maintenance, could improve PICC management.
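The predictive values reported above follow directly from 2x2 counts, and the incidence is events per 1000 catheter-days. A short sketch: the incidence check uses the abstract's figures (about 73 PR-BSI, i.e. 8.1% of 901 PICCs, over 38,320 catheter-days), while the 2x2 counts for the PPV/NPV calculation are hypothetical:

```python
def ppv_npv(tp, fp, fn, tn):
    """Predictive values of a clinical sign for PR-BSI from a 2x2 table
    (tp/fp: sign present with/without infection; fn/tn: sign absent)."""
    ppv = tp / (tp + fp)   # P(infection | sign present)
    npv = tn / (tn + fn)   # P(no infection | sign absent)
    return ppv, npv

def incidence_per_1000_catheter_days(n_events, catheter_days):
    """Event rate expressed per 1000 device-days of exposure."""
    return 1000 * n_events / catheter_days

# Abstract figures: ~73 PR-BSI (8.1% of 901 PICCs) over 38,320 catheter-days
rate = incidence_per_1000_catheter_days(73, 38_320)

# Hypothetical counts for a sign such as fever
ppv, npv = ppv_npv(tp=40, fp=20, fn=5, tn=200)
```

A high NPV with a low PPV, as found in the study, means an absent sign is reassuring while a present sign is a weak confirmation on its own.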
ABSTRACT
OBJECTIVES: Our aim was to compare the clinical and virological outcomes of Omicron BA.1- and BA.2-infected patients who received sotrovimab with those of patients who received nirmatrelvir for the prevention of severe COVID-19. METHODS: In this multicentre, prospective ANRS 0003S CoCoPrev cohort study, patients at high risk of progression of mild-to-moderate BA.1 or BA.2 COVID-19 who received sotrovimab or nirmatrelvir were included. The proportion of patients with progression to severe COVID-19, the time from the start of treatment to negative PCR conversion, SARS-CoV-2 viral decay, and the characterization of resistance variants were determined. A multi-variable Cox proportional hazard model was used to determine the time to negative PCR conversion, and a mixed-effect model for the dynamics of viral decay. RESULTS: Amongst 255 included patients, 199 (80%) received ≥3 vaccine doses, 195 (76%) received sotrovimab, and 60 (24%) received nirmatrelvir. On day 28, new COVID-19-related hospitalization occurred in 4 of 193 (2%; 95% CI, 1-5%) sotrovimab-treated patients and 0 of 55 nirmatrelvir-treated patients (p = 0.24). One of the 55 nirmatrelvir-treated patients died (2%; 95% CI, 0-10%). The median time to negative PCR conversion was 11.5 days (95% CI, 10.5-13) in the sotrovimab-treated patients vs. 4 days (95% CI, 4-9) in the nirmatrelvir-treated patients (p < 0.001). Viral decay was faster in the patients who received nirmatrelvir (p < 0.001). In the multi-variable analysis, nirmatrelvir and nasopharyngeal PCR cycle threshold values were independently associated with faster conversion to negative PCR (hazard ratio, 2.35; 95% CI, 1.56-3.56; p < 0.0001 and hazard ratio, 1.05; 95% CI, 1.01-1.08; p = 0.01, respectively). CONCLUSIONS: Early administration of nirmatrelvir in high-risk patients, compared with sotrovimab, was associated with faster viral clearance. This may help decrease transmission and prevent viral resistance.
ABSTRACT
Introduction: Novel last-resort beta-lactam antibiotics are now available for the management of infections due to New Delhi metallo-beta-lactamase (NDM)-producing Enterobacterales and non-fermenters with difficult-to-treat resistance. However, data regarding the use of imipenem-cilastatin-relebactam (IMI-REL), cefiderocol (CFD) and ceftazidime-avibactam plus aztreonam (CAZ-AVI-ATM) are scarce in real-life settings. This study aimed to describe the use of last-resort beta-lactam antibiotics, the microbiology and the outcomes in patients hospitalized in a tertiary hospital. Methods: We conducted a monocentric observational cohort study from 1 January 2020 to 31 August 2022. Using the pharmacy database, we screened all patients admitted to Nimes University Hospital who had received ≥1 dose of a last-resort beta-lactam antibiotic during the study period. We included patients treated with IMI-REL, CFD and CAZ-AVI-ATM. The primary endpoint was the infection-free survival rate. We also calculated the rates of microbiological and clinical cure, recurrent infection, death and adverse events. Results: Twenty-seven patients were included in the study and 30 treatment courses were analyzed: CFD (N = 24; 80%), CAZ-AVI-ATM (N = 3; 10%) and IMI-REL (N = 3; 10%). Antibiotics were used in 21 males (70%) and 9 females (30%), with a median age of 65 years [50-73.5] and a median Charlson index of 1 [0-2]. Almost all patients had ≥1 risk factor for carbapenem-resistant bacteria, half of them were hospitalized for severe COVID-19, and most antibiotic courses (N = 26; 87%) were associated with ICU admission. In the study population, the probability of infection-free survival at day 90 after initiation of last-resort beta-lactam therapy was 48.4% (95% CI [33.2-70.5]). The clinical failure rate was 30%, the microbiological failure rate 33% and the mortality rate 23%. Adverse events were documented in 5 antibiotic courses (17%). In detail, P. aeruginosa was mainly treated with CFD and IMI-REL, S. maltophilia with CFD and CAZ-AVI-ATM, A. baumannii with CFD, and NDM-producing K. pneumoniae with CAZ-AVI-ATM and CFD. After a treatment course with CFD, CAZ-AVI-ATM and IMI-REL, the probability of infection-free survival was 48% (95% CI [10.4-73.5]), 33.3% (95% CI [6.7-100]) and 66.7% (95% CI [30-100]), respectively. Discussion/conclusion: In real-life settings, last-resort beta-lactam antimicrobials appeared to be a safe and effective therapeutic option for severe infections caused by Gram-negative bacteria with difficult-to-treat resistance.
ABSTRACT
Background: Postoperative hypotension, which is associated with postoperative morbidity and early mortality, has been studied previously. By comparison, hypertension and other hemodynamic, respiratory, and temperature abnormalities during the first postoperative days have been understudied. Methods: This bi-centre observational cohort study will include 114 adult patients undergoing non-cardiac surgery, hospitalized on an unmonitored general care floor and wearing a multi-signal wearable sensor allowing remote monitoring (Biobeat Technologies Ltd, Petah Tikva, Israel). The study will cover the first 72 hours after the patient's discharge from the post-anaesthesia care unit. Several thresholds will be used for each variable (arterial pressure, heart rate, respiratory rate, oxygen saturation, and skin temperature). Data obtained using the sensor will be compared to data obtained during routine nurse follow-up. The primary outcome is hemodynamic abnormality. The secondary outcomes are postoperative respiratory and temperature abnormalities, artefacts and blank/null outputs from the wearable device, postoperative complications, and finally, the ease of use of the device. We hypothesize that remote monitoring will detect abnormalities in vital signs more often or more quickly than nurses' routine surveillance. Discussion: Demonstrating that wireless sensors can outperform standard monitoring techniques would pave the way for a loop combining this monitoring mode, automated creation of alerts, and delivery of these alerts to caregivers. Trial registration: ClinicalTrials.gov, NCT04585178. Registered on October 14, 2020.
ABSTRACT
Context: Disseminated infections due to Mycobacterium bovis Bacillus Calmette-Guérin (BCG) are unusual and occur mostly in patients with inborn errors of immunity (IEI) or acquired immunodeficiency. However, cases of secondary BCGosis due to intravesical BCG instillation have been described. Herein, we present a case of severe BCGosis occurring in an unusual situation. Case Description: We report a case of severe disseminated BCG disease occurring after hematological malignancy in a 48-year-old man who had never received BCG instillation and had been vaccinated in infancy without complication. Laboratory investigations demonstrated that he was not affected by any known or candidate IEI gene defect or intrinsic cellular defect involving the IFNγ pathway. Whole genome sequencing of the BCG strain showed that it was most closely related to the M. bovis BCG Tice strain, suggesting an unexpected relationship between the patient's secondary immunodeficiency and the acquired BCG infection. Conclusion: This case highlights that, beyond IEI, physicians, as well as microbiologists and pharmacists, should be aware of possible acquired disseminated BCG disease in secondarily immunocompromised patients treated in centers that administer BCG for bladder cancer.
ABSTRACT
BACKGROUND: Streptococci involved in infective endocarditis (IE) primarily comprise alpha- or non-hemolytic streptococci (ANHS). However, beta-hemolytic streptococci (BHS) can also be involved, and in such cases guidelines recommend the addition of gentamicin for the first 2 weeks of treatment and consideration of early surgery. This study compared the morbidity and mortality associated with IE depending on the microorganisms involved (BHS, ANHS, staphylococci, and enterococci). METHODS: We conducted a retrospective observational study between 2012 and 2017 in a single hospital in France. The endpoints were overall in-hospital mortality, 1-year mortality and the occurrence of complications. RESULTS: We analyzed 316 episodes of definite IE, including 150 (38%), 96 (25%), 46 (12%), and 24 cases (6%) of staphylococcal, ANHS, enterococcal, and BHS IE, respectively. In-hospital mortality was significantly higher in the staphylococcal (n = 40; 26.7%) and BHS groups (n = 6; 25.0%) than in the ANHS (n = 9; 9.4%) and enterococcal groups (n = 5; 10.9%) (all p < 0.01). The rates of septic shock and cerebral emboli were also higher in the BHS group than in the ANHS group [n = 7 (29.2%) vs. n = 3 (3.1%), p < 0.001; and n = 7 (29.2%) vs. n = 12 (12.5%), p = 0.05, respectively]. CONCLUSION: This study confirmed that BHS IE has a more severe prognosis than ANHS IE. The virulence of BHS may be similar to that of staphylococci, justifying increased monitoring of these patients and more 'aggressive' treatments such as early surgery.
ABSTRACT
BACKGROUND: The management of patients with locally recurrent rectal cancer (LRRC) is often complex and requires multidisciplinary input, yet only a few patients are referred to a specialist centre. The aim of this study was to design a regional referral pathway for LRRC in Nouvelle Aquitaine (South-West France). METHODS: In 2016, together with a Study Steering Committee (SC), we conducted a three-phase mixed-methods study comprising identification of key factors, identification of key stakeholders, and a Delphi voting consensus. During three rounds of Delphi voting, consensus was defined as favorable if at least 80% of participating experts rated the factor at 3/10 or below on a Likert scale, or considered it "useful" on a binary scale (third round only). Finally, the SC drafted guidelines. RESULTS: Among the 423 physicians involved in 29 regional digestive Multi-Disciplinary Team (MDT) meetings, 59 participants (from 26 MDT meetings) completed all three rounds of Delphi voting. Thirteen of the twenty initially selected factors reached a favorable consensus. All patients with LRRC need to be included in the referral pathway. Patients with a central pelvic recurrence who can be offered curative treatment in their local hospital, and patients with unresectable metastatic disease, were excluded from referral. Key performance indicators were also agreed upon, including the time to referral and the completion of pelvic MRI, CT and PET scans prior to MDT referral. CONCLUSION: The development of this referral pathway represents an innovative health service that should improve the management of patients with LRRC in France.
ABSTRACT
BACKGROUND: Infective endocarditis (IE) remains a severe disease with a high mortality rate. Guidelines therefore encourage the setup of a multidisciplinary group in reference centers. The present study evaluated the impact of such an "Endocarditis Team" (ET). METHODS: We conducted a monocentric observational study at Strasbourg University Hospital, Strasbourg, France, between 2012 and 2017. The primary end point was in-hospital mortality. Secondary end points were 6-month and 1-year mortality, surgery rate, time to surgical procedure, duration of effective antibiotic therapy, length of in-hospital stay, and sequelae. We also assessed predictors of in-hospital mortality. RESULTS: We analyzed 391 episodes of IE. In the post-ET period, there was a nonsignificant decrease in in-hospital mortality (20.3% vs 14.7%; P = .27) and sequelae, along with a significant reduction in time to surgery (16.4 vs 10.3 days; P = .049), duration of antibiotic therapy (55.2 vs 47.2 days; P < .001), and length of in-hospital stay (40.6 vs 31.9 days; P < .01). In a multivariate analysis, the post-ET period was independently associated with lower in-hospital mortality (odds ratio, 0.45; 95% confidence interval, 0.20-0.96; P = .048). CONCLUSIONS: This multidisciplinary approach had a positive impact on the management of IE and should be considered in all hospitals managing IE.
ABSTRACT
Aspergillus spp. cause various diseases in both immunocompetent and immunocompromised patients. The most frequent Aspergillus disorders include chronic pulmonary aspergillosis (CPA), a life-threatening disease that affects at least 3 million people worldwide, and allergic bronchopulmonary aspergillosis (ABPA), which affects approximately 4.8 million severe asthmatic patients globally. Diagnosis of these diseases involves IgG serological testing; however, the currently available anti-Aspergillus IgG detection assays are ill-suited to resource-poor laboratory settings, as they are expensive, rely on automated procedures, and require stable electrical power. Facilities for accurate CPA or ABPA diagnosis are therefore lacking in most low- and middle-income countries. We evaluated a novel anti-Aspergillus antibody immunochromatographic test (ICT) that requires minimal laboratory equipment. Two evaluations were performed: a single-center, 4-month prospective study in a French reference laboratory (44 cases/257 patients) and a retrospective study in five French reference laboratories (262 cases and 188 controls). We estimated the ICT indices for the diagnosis of chronic aspergillosis, and the test results were compared with those of an anti-Aspergillus IgG immunoblot (IB) assay. Of the 713 patients included in the study, 306 had chronic aspergillosis. Sensitivity and specificity were 88.9% (95% CI [85-92]) and 96.3% (95% CI [94-98]) for the ICT, and 93.1% (95% CI [90-96]) and 94.3% (95% CI [92-96]) for the IB, respectively. Agreement between the two assays was almost perfect (kappa = 0.86). As this ICT displays good diagnostic performance and complies with the ASSURED (Affordable, Sensitive, Specific, User-friendly, Equipment-free, and Delivered) criteria, we conclude that this anti-Aspergillus antibody ICT can be used to diagnose Aspergillus diseases in resource-poor settings.
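The "almost perfect" agreement quoted above is Cohen's kappa, which rescales observed agreement by the agreement expected from chance alone. A sketch with a hypothetical 2x2 table (the study's underlying counts are not given in the abstract):

```python
def cohens_kappa(both_pos, ict_only, ib_only, both_neg):
    """Cohen's kappa for agreement between two binary assays.

    both_pos : both tests positive     ict_only : first test + / second -
    ib_only  : first test - / second + both_neg : both tests negative
    """
    n = both_pos + ict_only + ib_only + both_neg
    observed = (both_pos + both_neg) / n
    # Chance agreement from each test's marginal positive/negative rates
    p_a_pos = (both_pos + ict_only) / n
    p_b_pos = (both_pos + ib_only) / n
    expected = p_a_pos * p_b_pos + (1 - p_a_pos) * (1 - p_b_pos)
    return (observed - expected) / (1 - expected)

# Hypothetical 2x2 table for 100 sera tested by ICT and IB
kappa = cohens_kappa(both_pos=40, ict_only=5, ib_only=5, both_neg=50)
```

Kappa of 1 means perfect agreement; values above roughly 0.8, as in the study, are conventionally read as almost perfect.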
ABSTRACT
OBJECTIVE: Delayed implementation of effective road safety policies must be considered when quantifying the avoidable part of the burden of fatal and nonfatal injuries. We sought to assess the avoidable part of disability-adjusted life years (DALYs) lost due to road traffic injuries related to delays in implementing road safety laws in low- and lower-middle-income countries. METHODS: We chose one country for each World Health Organization (WHO) region and World Bank (WB) country income level. We used freely available data sets (WHO, International Traffic Safety Data and Analysis Group, the WB). Implementation delays were calculated up to 2013, from the year in which mandatory use of safety belts by motor vehicle front-seat occupants was first introduced worldwide. We used life expectancy tables and age groups as social values in the DALY calculation model. From the estimated total burden, avoidable DALYs were calculated using estimates of the effectiveness of seat belt laws on fatal and nonfatal injuries combined, as extracted from published international reviews of evidence. RESULTS: From the reference year 1972, implementation delays varied from 27 years (Uzbekistan) to 41 years (Bolivia, which had no seat belt law as of 2013). During these delays, the total absolute numbers of DALYs lost due to road traffic injuries reached 8,462,099 in Nigeria, 7,203,570 in Morocco, 4,695,500 in Uzbekistan, 3,866,391 in Cambodia, 3,253,359 in Bolivia, and 3,128,721 in Sri Lanka. Using effectiveness estimates ranging from a 3 to 20% reduction, the avoidable burden of road traffic injuries for car occupants was highest in Uzbekistan (avoidable part from 1.2 to 10.4%) and in Morocco (avoidable part from 1.5 to 12.3%). In countries where users of public transport and pedestrians were the most affected by the burden, the avoidable parts ranged from 0.5 to 4.4% (Nigeria) and from 0.5 to 3.4% (Bolivia). The burden of road traffic injuries mostly affected motorcyclists in Sri Lanka and Cambodia, where the avoidable parts were less than 2% in both countries. In all selected countries, the burden of traffic injuries mostly affected men (about 80%) and young people (15-34 years). CONCLUSIONS: Despite limited data availability in low- and middle-income countries, the avoidable part of the burden related to delayed intervention is measurable. These results can be used to convince countries to avoid delaying the provision of better protection to road users.
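At its core, the avoidable burden above is the DALYs accumulated during the implementation delay multiplied by an effectiveness estimate for seat-belt laws (3-20% in the study). A simplified sketch applying that range to the Uzbekistan total from the abstract; note the study applied effectiveness only to the car-occupant share of the burden, so these illustrative figures overstate the avoidable part:

```python
def avoidable_dalys(total_dalys, effectiveness):
    """DALYs that could have been averted had the law been in force during
    the implementation delay, given an effectiveness estimate in [0, 1]."""
    return total_dalys * effectiveness

# Uzbekistan (from the abstract): 4,695,500 DALYs lost during a 27-year delay
low = avoidable_dalys(4_695_500, 0.03)   # conservative 3% effectiveness
high = avoidable_dalys(4_695_500, 0.20)  # upper-bound 20% effectiveness
```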
ABSTRACT
Non-invasive respiratory variations in arterial pulse pressure measured by infrared plethysmography (PPVCNAP) can predict fluid responsiveness in mechanically ventilated patients; however, they could not previously be monitored continuously. The present study evaluated a new algorithm allowing continuous measurement of PPVCNAP (PPVCNAPauto) (CNSystem, Graz, Austria). Thirty-five patients undergoing vascular surgery were studied after induction of general anaesthesia. Stroke volume was measured using the VigileoTM/FloTracTM. Invasive pulse pressure variations were manually calculated using an arterial line (PPVART), while PPVCNAPauto was continuously displayed. PPVART and PPVCNAPauto were recorded simultaneously before and after volume expansion (500 ml hydroxyethyl starch). Subjects were defined as responders if stroke volume increased by ≥15%. Twenty-one patients were responders. Before volume expansion, PPVART and PPVCNAPauto exhibited a bias of 0.1% with limits of agreement from -7.9% to 7.9%. After volume expansion, they exhibited a bias of -0.4% with limits of agreement from -5.3% to 4.5%. A 14% baseline PPVART threshold discriminated responders with a sensitivity of 86% (95% CI 64-97%) and a specificity of 100% (95% CI 77-100%); the area under the receiver operating characteristic (ROC) curve for PPVART was 0.93 (95% CI 0.79-0.99). A 15% baseline PPVCNAPauto threshold discriminated responders with a sensitivity of 76% (95% CI 53-92%) and a specificity of 93% (95% CI 66-99%); the area under the ROC curve for PPVCNAPauto was 0.91 (95% CI 0.76-0.98), not different from that for PPVART. When compared with PPVART, PPVCNAPauto performs satisfactorily in assessing fluid responsiveness in hemodynamically stable surgical patients.
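The bias and limits of agreement quoted above are Bland-Altman statistics: the mean of the paired differences, and that mean plus or minus 1.96 standard deviations of the differences. A sketch with hypothetical paired PPV readings (not the study's data):

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)               # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired PPV readings (%) from the arterial line and the finger cuff
bias, lower, upper = bland_altman([10, 12, 11, 13], [10, 11, 12, 12])
```

Narrow limits of agreement, as reported in the study, indicate the non-invasive method can substitute for the invasive one in practice.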
ABSTRACT
OBJECTIVES: In this review, we discuss urologists' expectations of imaging in terms of the detection, characterization, treatment planning and follow-up of urinary stones. MATERIALS AND METHODS: Data regarding kidney stones and imaging were acquired through MEDLINE searches using combinations of the following keywords: urinary stones, CT urography, low-dose CT, MRI urography, renal stones ultrasound, conventional radiography, surgery. RESULTS: CT has become the gold standard for the evaluation of urinary stones. Scanning provides information regarding the stone (composition, size, burden, location), the collecting system and the renal parenchyma. These findings are crucial in determining appropriate treatment strategies. Because CT exposes the patient to substantial ionizing radiation, efforts have already been made to decrease the radiation dose of CT examinations (low-dose CT) while optimizing image quality. Efforts are also being made to use non-ionizing modalities such as ultrasound, in combination with radiography, particularly for the follow-up of renal stones. CONCLUSION: CT is the preferred method for the evaluation and treatment planning of urolithiasis, and its radiation dose can be reduced with low-dose CT. However, conventional radiography and ultrasound are still recommended for the follow-up of renal stones.
ABSTRACT
Potential differences in the toxicological properties of nanosized and non-nanosized particles have notably been pointed out for titanium dioxide (TiO2) particles, which are currently widely produced and used in many industrial areas. Nanoparticles of the iron oxides magnetite (Fe3O4) and hematite (Fe2O3) also have many industrial applications, but their toxicological properties are less documented than those of TiO2. In the present study, the in vitro cytotoxicity and genotoxicity of commercially available nanosized and microsized anatase TiO2, rutile TiO2, Fe3O4, and Fe2O3 particles were compared in Syrian hamster embryo (SHE) cells. Samples were characterized for chemical composition, primary particle size, crystal phase, shape, and specific surface area. In acellular assays, TiO2 and iron oxide particles were able to generate reactive oxygen species (ROS). At the same mass dose, all nanoparticles produced higher levels of ROS than their microsized counterparts. Measurement of particle size in the SHE culture medium showed that primary nanoparticles and microparticles were present as micrometric agglomerates of highly polydisperse size. Uptake of primary particles and agglomerates by SHE cells exposed for 24 h was observed for all samples. TiO2 samples were found to be more cytotoxic than iron oxide samples. Concerning primary size effects, anatase TiO2, rutile TiO2, and Fe2O3 nanoparticles induced higher cytotoxicity than their microsized counterparts after 72 h of exposure. Over this treatment time, anatase TiO2 and Fe2O3 nanoparticles also produced more intracellular ROS than the microsized particles. However, similar levels of DNA damage were observed in the comet assay after 24 h of exposure to anatase nanoparticles and microparticles, and rutile microparticles were found to induce more DNA damage than the nanosized particles. No significant increase in DNA damage was detected for either nanosized or microsized iron oxides, and none of the samples tested showed significant induction of micronucleus formation after 24 h of exposure. In agreement with previous size-comparison studies, we suggest that the in vitro cytotoxicity and genotoxicity induced by metal oxide nanoparticles are not always higher than those induced by their bulk counterparts.
ABSTRACT
This study modeled the urinary toxicokinetics of cobalt exposure based on 507 urine samples from 16 workers followed up for 1 week, and 108 related atmospheric cobalt measurements, to determine an optimal urinary cobalt sampling strategy at work and a corresponding urinary exposure threshold (UET). These data were used to calibrate a population toxicokinetic model, taking into account both measurement uncertainty and intra- and interindividual variability. Using the calibrated model, the sensitivity and specificity of urinary sampling in detecting exposure above the 20 µg/m³ threshold limit value-time-weighted average (TLV-TWA) were used to identify an optimal urine sampling time. The UET value was obtained by minimizing misclassification rates for workplace exposures below or above the TLV. Total atmospheric cobalt concentrations were in the 5-144 µg/m³ range, and total urinary cobalt concentrations were 0.5-88 µg/g creatinine. A two-compartment toxicokinetic model best described urinary elimination, with a terminal elimination half-life from the central compartment of 10.0 h (95% confidence interval [8.3-12.3]). The optimal urinary sampling time was identified as 3 h before the end of the shift at the end of the workweek. Assuming that misclassification errors are of equal cost, the UET associated with the TLV of 20 µg/m³ is 5 µg/L, which is lower than the ACGIH-recommended biological exposure index of 15 µg/L.
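A terminal half-life of about 10 h implies a first-order elimination rate constant k = ln(2)/t1/2, which is what makes end-of-workweek sampling informative: a meaningful fraction of the cobalt burden persists from one shift to the next. A simplified mono-exponential sketch of the terminal phase (the study's actual model has two compartments, so this is an approximation):

```python
import math

def fraction_remaining(half_life_h, t_h):
    """Fraction of a body burden remaining after t_h hours, assuming
    first-order (mono-exponential) terminal elimination."""
    k = math.log(2) / half_life_h       # elimination rate constant (1/h)
    return math.exp(-k * t_h)

# Terminal half-life from the abstract: 10.0 h
after_one_half_life = fraction_remaining(10.0, 10.0)
after_24h = fraction_remaining(10.0, 24.0)   # roughly one day between shifts
```

With a 10 h half-life, close to a fifth of the burden is still present 24 h later, so repeated daily exposure accumulates across the workweek.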