ABSTRACT
Fairness constitutes a cornerstone of social norms, emphasizing equal treatment and equitable distribution in interpersonal relationships. Unfair treatment often leads to direct responses and can spread to others through a phenomenon known as pay-it-forward (PIF) reciprocity. This study examined how unfairness spreads in interactions with new partners who have higher, equal, or lower status than the participants. In the present study, participants (N = 47, all Korean) were given either fair or unfair treatment in the first round of a dictator game. They then allocated monetary resources among partners positioned at various hierarchical levels in the second round. Our main goal was to determine if the severity of inequity inflicted on new partners was influenced by their hierarchical status. The results revealed an inclination among participants to act more generously towards partners of higher ranking despite prior instances of unfair treatment, whereas a tendency for harsher treatment was directed towards those with lower ranking. The interaction between the fairness in the first round (DG1) and the hierarchical status of the partner in the second round (DG2) was significant, indicating that the effect of previous fairness on decision-making differed depending on the ranking of the new partners. This study, therefore, validates the presence of unfairness PIF reciprocity within hierarchical contexts.
ABSTRACT
The phantom array effect is one of the temporal light artefacts that can decrease performance and increase fatigue. The phantom array effect visibility shows large individual differences; however, the dominant factors that can explain these individual differences remain unclear. We investigated the relationship between saccadic eye movement speed and phantom array visibility at two different angles and four different directions of saccadic eye movement. The peak speed of saccadic eye movement and the phantom array effect visibility were measured at different modulation frequencies of the light source. Our results show that phantom array visibility increased as eye movement speed increased; the phantom array visibility was higher at a wide viewing angle with fast eye movement speed than at a narrow viewing angle. Moreover, when clustered into subgroups according to individual eye movement speed, the mean speed of the saccadic eye movement of each subgroup is related to the variations in the visibility of the phantom array effect of the subgroup. Therefore, saccadic eye movement speed is related to variations in phantom array effect visibility.
ABSTRACT
BACKGROUND AND OBJECTIVES: Intensive care unit (ICU) physicians perform weaning procedures considering complex clinical situations and weaning protocols; however, liberating critically ill patients from mechanical ventilation (MV) remains challenging. Therefore, this study aims to aid physicians in deciding on the early liberation of patients from MV by developing an artificial intelligence model that predicts the success of spontaneous breathing trials (SBT). METHODS: We retrospectively collected data from 652 critical patients (SBT success: 641, SBT failure: 400) who received MV at the Chungbuk National University Hospital (CBNUH) ICU from July 2020 to July 2022, including mixed and trauma ICUs. Patients underwent SBTs according to the CBNUH weaning protocol or the physician's decision, and SBT success was defined as extubation performed by the physician on the SBT day. Our dataset comprised 11 numerical and 2 categorical features that can be obtained for any ICU patient, such as vital signs and MV setting values. To predict SBT success, we analyzed the tabular data using a graph neural network-based approach. Specifically, the graph structure was designed considering feature correlation, and a novel deep learning model, called feature tokenizer graph attention network (FT-GAT), was developed for graph analysis. FT-GAT transforms the input features into high-dimensional embeddings and analyzes the graph via the attention mechanism. RESULTS: The quantitative evaluation indicated that FT-GAT outperformed conventional models and clinical indicators, achieving the following performance (AUROC): FT-GAT (0.80), conventional models (0.69-0.79), and clinical indicators (0.65-0.66). CONCLUSIONS: Through timely detection of critical patients who can succeed in SBTs, FT-GAT can help prevent long-term use of MV and potentially improve patient outcomes.
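The FT-GAT model itself is not reproduced here, but the GAT-style aggregation step it builds on can be sketched in plain NumPy: each feature token attends to its neighbors on a correlation-derived adjacency matrix, with attention weights normalized by a softmax over each neighborhood. This is an illustrative simplification, not the authors' implementation; the names `attention_aggregate` and `w_att` are hypothetical.

```python
import numpy as np

def attention_aggregate(h, adj, w_att):
    """One simplified graph-attention step over feature-token embeddings.
    h: (n, d) node embeddings; adj: (n, n) 0/1 adjacency with self-loops;
    w_att: (2*d,) attention parameter vector (assumed, for illustration)."""
    n, d = h.shape
    scores = np.full((n, n), -np.inf)
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                # score depends on both endpoint embeddings, GAT-style
                scores[i, j] = np.tanh(np.concatenate([h[i], h[j]]) @ w_att)
    # softmax over each node's neighborhood; exp(-inf) -> 0 for non-edges
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    e[adj == 0] = 0.0
    alpha = e / e.sum(axis=1, keepdims=True)
    # aggregated embeddings and the attention weights themselves
    return alpha @ h, alpha
```

With self-loops in `adj`, every row of `alpha` is a proper probability distribution over the node's neighborhood.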
ABSTRACT
Recent studies have reported that the lower airway microbiome may play an essential role in the development and progression of interstitial lung disease (ILD). The aim of the current study was to evaluate the characteristics of the respiratory microbiome and intrasubject variation in patients with ILD. Patients with ILD were recruited prospectively for 12 months. The sample size was small (n = 11) owing to delayed recruitment during the COVID-19 pandemic. All subjects were hospitalized and were evaluated by a questionnaire survey, blood sampling, pulmonary function test, and bronchoscopy. Bronchoalveolar lavage fluid (BALF) was obtained at 2 sites, the most and least disease-affected lesions. Sputum collection was also performed. Furthermore, 16S ribosomal RNA gene sequencing was performed using the Illumina platform, and indexes of α- and β-diversity were evaluated. Species diversity and richness tended to be lower in the most-affected lesion than in the least-affected lesion. However, taxonomic abundance patterns were similar in these 2 groups. The phylum Fusobacteria was more prevalent in fibrotic ILD than in nonfibrotic ILD. Inter-sample differences in relative abundances were more prominent in BALF than in sputum specimens. Rothia and Veillonella were more prevalent in sputum than in BALF. We did not detect site-specific dysbiosis in the ILD lung. BALF was an effective respiratory specimen type for evaluating the lung microbiome in patients with ILD. Further studies are needed to evaluate the causal links between the lung microbiome and the pathogenesis of ILD.
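As a concrete reference for the α-diversity indexes evaluated above, the Shannon index can be computed directly from per-taxon read counts. A minimal sketch of the standard formula, not the study's actual pipeline:

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)
```

Four equally abundant taxa give H' = ln 4 ≈ 1.386, while a single dominant taxon gives H' = 0; lower values in the most-affected lesion would reflect the reduced diversity the study reports.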
ABSTRACT
The clinical effect of donor-derived natural killer cell infusion (DNKI) after HLA-haploidentical hematopoietic cell transplantation (HCT) was evaluated in high-risk myeloid malignancy in a phase 2 randomized trial. Seventy-six evaluable patients (aged 21-70 years) were randomized to receive DNKI (N = 40) or not (N = 36) after haploidentical HCT. For the HCT conditioning, busulfan, fludarabine, and anti-thymocyte globulin were administered. DNKI was given twice, 13 and 20 days after HCT. Four patients in the DNKI group failed to receive DNKI. In the remaining 36 patients, median DNKI doses were 1.0 × 10^8/kg and 1.4 × 10^8/kg on days 13 and 20, respectively. Intention-to-treat analysis showed a lower disease progression for the DNKI group (30-month cumulative incidence, 35% vs 61%, P = 0.040; subdistribution hazard ratio, 0.50). Furthermore, at 3 months after HCT, the DNKI patients showed a 1.8- and 2.6-fold higher median absolute blood count of NK and T cells, respectively. scRNA-sequencing analysis in seven study patients showed a marked increase in memory-like NK cells in DNKI patients, which, in turn, expanded the CD8+ effector-memory T cells. In high-risk myeloid malignancy, DNKI after haploidentical HCT reduced disease progression. This enhanced graft-vs-leukemia effect may be related to the DNKI-induced, post-HCT expansion of NK and T cells. Clinical trial number: NCT02477787.
ABSTRACT
In a prospective, explorative study, the donor-source difference of haploidentical family (HF), matched sibling (MS), and unrelated donors (UD) was evaluated for the outcome of haematopoietic cell transplantations (HCT) in 101 patients with acute myeloid leukaemia (AML) in complete remission (CR). To eliminate confounding effects, a uniform conditioning regimen containing antithymocyte globulin (ATG) was used. After transplantation, there was a significantly higher cumulative incidence of acute graft-versus-host disease (GVHD) in HF-HCT patients (49%, 7%, and 16% for HF-, MS- and UD-HCT respectively; p < 0.001). A quarter of acute GVHD cases observed in HF-HCT patients occurred within three days of engraftment and were characterized by diffuse skin rash, fever, weight gain, and hypoalbuminaemia. This peri-engraftment acute GVHD was not observed in MS-HCT or UD-HCT patients. Additionally, a significantly higher proportion of HF-HCT patients achieved complete donor chimaerism in the peripheral mononuclear cells at one month (88%, 46%, and 69% for HF-, MS- and UD-HCT respectively; p = 0.001). There was no significant difference in engraftment, chronic GVHD, leukaemia recurrence, non-relapse mortality, and patient survival. In patients with AML in CR who received HCT using ATG-containing conditioning, stronger donor-patient alloreactivity was observed in HF-HCT, in terms of increased acute GVHD and a higher likelihood of complete donor chimaerism.
ABSTRACT
A 68-year-old man was transferred to our tertiary hospital. Ten years earlier, he had received radiation therapy for tonsil cancer, and while there was no evidence of recurrence, he suffered from recurrent aspiration. We treated his aspiration pneumonia in the intensive care unit. Prior to discharge, he underwent percutaneous dilatational tracheostomy (PDT) before being transferred to a nursing hospital. Nine months later, he was readmitted owing to a tracheoesophageal fistula (TEF). However, he was considered unsuitable for conservative intervention after a multidisciplinary team discussion. Esophageal stent insertion was impossible owing to the high level of the TEF in the esophagus. Additionally, the TEF was too large to be covered by an endosponge or endoluminal vacuum therapy, and there was no tracheal stent that could cover his large trachea. A preceding percutaneous enteral gastrostomy (PEG) procedure was required for the primary closure operation of the esophagus; however, the family's consent could not be obtained. After 1 month, the patient and his family changed their minds and agreed to the procedure, and we attempted the PEG procedure. However, we could not proceed with PEG owing to stenosis at the inlet of the esophagus. The patient then deteriorated clinically and died of pneumonia with septic shock.
ABSTRACT
Interstitial lung disease (ILD) is widely known to be associated with high mortality and poor prognosis, especially in patients admitted to the intensive care unit (ICU). The objective of this study was to investigate clinical predictors to assist a relatively early decision on the level of treatment in the ICU. We retrospectively investigated patients with ILD who were admitted to the ICU between January 1, 2014, and September 30, 2019. A total of 64 patients were analyzed. The ICU and hospital mortality rates were 67.2% and 69.8%, respectively. Nonsurvivors had a higher fraction of inspired oxygen (FiO2) on days 1 (79 ± 21% vs 60 ± 21%, P = .001) and 3 (61 ± 31% vs 46 ± 19%, P = .004) and a lower partial pressure of oxygen/FiO2 (PF) ratio on days 1 (134 ± 80 vs 173 ± 102, P = .049) and 3 (147 ± 74 vs 235 ± 124, P = .003) than the survivor group. The lactic acid level obtained on day 1 and the PF ratio measured on day 3 were associated with mortality (odds ratio, 1.89; 95% confidence interval, 1.03-3.47 and odds ratio, 0.99; 95% confidence interval, 0.98-1.00, respectively). Among the 31 ICU survivors, 10 patients died in the general ward and 12 died after hospital discharge; only 9 patients survived beyond 1 year. We suggest that these clinical predictors could be used on day 3 of ICU admission to guide the level of further treatment or its withdrawal in patients with ILD, to minimize prolonged suffering.
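The PF (PaO2/FiO2) ratio used above is a simple quotient of arterial oxygen tension over the inspired oxygen fraction. A minimal helper, with the Berlin-definition ARDS severity bands added for orientation (function names are illustrative, not from the study):

```python
def pf_ratio(pao2_mmhg, fio2_fraction):
    # PaO2 in mmHg divided by FiO2 expressed as a fraction (0.21-1.0)
    return pao2_mmhg / fio2_fraction

def berlin_category(pf):
    # Berlin-definition ARDS severity bands (mmHg cutoffs)
    if pf <= 100:
        return "severe"
    if pf <= 200:
        return "moderate"
    if pf <= 300:
        return "mild"
    return "above ARDS threshold"
```

For example, a PaO2 of 80 mmHg on FiO2 0.6 gives a PF ratio of about 133, which falls in the moderate band, comparable to the day-1 nonsurvivor mean reported above.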
ABSTRACT
Bronchiectasis presents with various ventilatory disorders on pulmonary function testing, but data on the characteristics and severity of patients with bronchiectasis according to these pulmonary dysfunctions remain very limited. This study aimed to evaluate the clinical and radiologic features and the disease severity of patients with bronchiectasis according to spirometric pattern. We retrospectively evaluated 506 patients with bronchiectasis who underwent pulmonary function testing (PFT) at a referral hospital between 2014 and 2021. The cylindrical type was the most common (70.8%) form of bronchiectasis on chest computed tomography (CT), and 70% of patients had bilateral lung involvement. Obstructive ventilatory disorder was the most common spirometric pattern (51.6%), followed by normal ventilation (30%) and restrictive ventilatory disorder (18.4%). The modified Medical Research Council (mMRC) dyspnea score was highest in patients with obstructive ventilatory disorder, and the modified Reiff score [median (interquartile range), 6 (3-10), P < 0.001], FACED (FEV1, Age, Chronic colonization, Extension, and Dyspnea) score [3 (1-4), P < 0.001], and Bronchiectasis Severity Index (BSI) score [8 (5-11), P < 0.001] were significantly higher in obstructive ventilatory disorder than in restrictive ventilatory disorder and normal ventilation. More than half of the patients with bronchiectasis had obstructive ventilatory disorder, which was associated with more dyspnea and greater disease and radiologic severity. There was no significant association between spirometric pattern and radiologic type, but greater radiologic severity was associated with more severe lung function impairment.
ABSTRACT
BACKGROUND: Scoring systems developed for predicting survival after allogeneic hematopoietic cell transplantation (HCT) show suboptimal prediction power, and various factors affect posttransplantation outcomes. OBJECTIVE: A prediction model using a machine learning-based algorithm can be an alternative for concurrently applying multiple variables and can reduce potential biases. In this regard, the aim of this study is to establish and validate a machine learning-based predictive model for survival after allogeneic HCT in patients with hematologic malignancies. METHODS: Data from 1470 patients with hematologic malignancies who underwent allogeneic HCT between December 1993 and June 2020 at Asan Medical Center, Seoul, South Korea, were retrospectively analyzed. Using the gradient boosting machine algorithm, we evaluated a model predicting the 5-year posttransplantation survival through 10-fold cross-validation. RESULTS: The prediction model showed good performance with a mean area under the receiver operating characteristic curve of 0.788 (SD 0.03). Furthermore, we developed a risk score predicting probabilities of posttransplantation survival in 294 randomly selected patients, and an agreement between the estimated predicted and observed risks of overall death, nonrelapse mortality, and relapse incidence was observed according to the risk score. Additionally, the calculated score demonstrated the possibility of predicting survival according to the different transplantation-related factors, with the visualization of the importance of each variable. CONCLUSIONS: We developed a machine learning-based model for predicting long-term survival after allogeneic HCT in patients with hematologic malignancies. Our model provides a method for making decisions regarding patient and donor candidates or selecting transplantation-related resources, such as conditioning regimens.
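The AUROC reported above equals the probability that a randomly chosen positive case is scored higher than a randomly chosen negative case (with ties counting half). A minimal pairwise implementation of this definition, for illustration only, not the study's evaluation code:

```python
def auroc(labels, scores):
    """Empirical AUROC: P(score of a random positive > score of a random
    negative), counting ties as 0.5. labels are 0/1."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    # compare every positive against every negative
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Perfect separation of positives and negatives gives 1.0, random scores give about 0.5; a model at 0.788 ranks a random survivor above a random non-survivor roughly 79% of the time.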
ABSTRACT
Cavitary pulmonary tuberculosis (TB) is associated with poor outcomes, treatment recurrence, higher transmission rates, and the development of drug resistance. However, reports on its clinical characteristics, associated factors, and treatment outcomes are lacking. Hence, this study sought to evaluate the clinical factors associated with cavitary pulmonary TB and its treatment outcomes. We retrospectively evaluated 410 patients with drug-susceptible pulmonary TB in a university hospital in Korea between 2014 and 2019. To evaluate the factors associated with cavitary TB, multivariable logistic regression was performed with adjustment for potential confounders. We also compared treatment outcomes between patients with and without cavitary TB. Of the 410 patients, 244 (59.5%) had non-cavitary TB and 166 (40.5%) had cavitary TB. Multivariable logistic analysis with a forward selection method showed that body mass index (BMI) (adjusted OR = 0.88, 95% CI: 0.81-0.97), previous history of TB (adjusted OR = 3.45, 95% CI: 1.24-9.59), being an ex- or current smoker (adjusted OR = 1.77, 95% CI: 1.01-3.13), diabetes mellitus (adjusted OR = 2.72, 95% CI: 1.36-5.44), and positive results on the initial sputum acid-fast bacilli (AFB) smear (adjusted OR = 2.24, 95% CI: 1.26-3.98) were significantly associated with cavitary TB. Treatment duration was significantly longer in patients with cavitary TB than in those with non-cavitary TB (248 days (102-370) vs. 202 days (98-336), p < 0.001), and the recurrence rate after successful treatment was also significantly higher (3.0% vs. 0.4%, p = 0.042). In conclusion, ex- or current smoking, lower BMI, previous history of TB, diabetes mellitus, and positivity of the initial AFB smear were associated with cavitary TB. Patients with cavitary TB had more AFB culture-positive results at 2 months, longer treatment duration, and higher recurrence rates than those with non-cavitary TB.
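The adjusted odds ratios above are obtained by exponentiating logistic-regression coefficients, and their 95% confidence intervals follow the same way from each coefficient's standard error. A minimal sketch of this generic conversion (not the study's model):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Adjusted OR and approximate 95% CI from a logistic-regression
    coefficient (beta) and its standard error (se)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))
```

A coefficient of 1.0 corresponds to an OR of about 2.72 (e^1), the same magnitude reported for diabetes mellitus above; an interval excluding 1.0 indicates statistical significance at the 5% level.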
ABSTRACT
BACKGROUND: This study presents outcomes of the management of graft failure (GF) after allogeneic hematopoietic stem cell transplantation (HCT) and provides prognostic information, including rare cases of autologous reconstitution (AR). METHODS: We analyzed risk factors and outcomes of primary and secondary GF, and the occurrence of AR, in 1,630 HCT recipients transplanted over a period of 18 years (January 2000-September 2017) at our center. RESULTS: Primary and secondary GF occurred in 13 patients (0.80%) and 69 patients (10-year cumulative incidence, 4.5%), respectively. No peri-transplant variables predicted primary GF, whereas a reduced intensity conditioning (RIC) regimen (relative risk [RR], 0.97-28.0, P < 0.001) and lower CD34+ cell dose (RR, 2.44-2.84, P = 0.002) were associated with a higher risk of secondary GF in multivariate analysis. Primary GF demonstrated 100% mortality. In the secondary GF group, the 5-year Kaplan-Meier survival rate was 28.8%, relapse ensued in 18.8%, and AR was observed in 11.6% (n = 8). In survival analysis, a diagnosis of aplastic anemia (AA) or chronic myeloid leukemia and use of RIC had a positive impact. The 8 patients who experienced AR, which is rarely reported after transplantation for acute leukemia, shared common characteristics such as young age (median 25 years), use of a RIC regimen, and absence of profound neutropenia, and had a survival rate of 100% during the follow-up period without relapse. CONCLUSION: Primary GF exhibited a high mortality rate. Secondary GF had a 4.5% 10-year cumulative incidence, a median onset of 3 months after HCT, and a 5-year Kaplan-Meier survival of 28.8%. Diagnosis of severe AA and use of RIC were associated with both higher incidence and better survival in the secondary GF group. AR occurred in 11.6% of secondary GF cases and exhibited an excellent prognosis.
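The Kaplan-Meier survival rates quoted above follow the standard product-limit construction: at each observed event time, the running survival estimate is multiplied by (n at risk − deaths) / (n at risk), and censored patients simply leave the risk set. A self-contained sketch of that estimator (a hypothetical helper, not the study's survival code):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates.
    times: event/censoring times; events: 1 = death, 0 = censored.
    Returns a list of (time, S(t)) pairs at each observed death time."""
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            s *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, s))
        # advance past all records sharing this time
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve
```

Note that censored observations (e.g. patients alive at last follow-up) reduce the risk set at later times without producing a step in the curve, which is what distinguishes Kaplan-Meier survival from a naive death fraction.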
Asunto(s)
Rechazo de Injerto/epidemiología , Trasplante de Células Madre Hematopoyéticas/efectos adversos , Acondicionamiento Pretrasplante/métodos , Trasplante Homólogo/efectos adversos , Adolescente , Adulto , Anciano , Femenino , Enfermedad Injerto contra Huésped , Trasplante de Células Madre Hematopoyéticas/métodos , Humanos , Leucemia Mieloide Aguda/terapia , Masculino , Persona de Mediana Edad , Acondicionamiento Pretrasplante/efectos adversos , Insuficiencia del Tratamiento , Adulto JovenRESUMEN
BACKGROUND: Data on the clinical characteristics of delayed treatment initiation among pulmonary tuberculosis (TB) patients are lacking. Thus, this study aimed to identify the factors associated with delayed treatment in culture-confirmed pulmonary TB and to assess the outcomes of delayed treatment. METHODS: We retrospectively evaluated 151 patients with culture-confirmed pulmonary TB between 2015 and 2017. Delayed and timely treatment were defined as initiation of anti-TB treatment after and before identification of a Mycobacterium tuberculosis complex isolate, respectively. Factors related to delayed treatment, such as comorbidities, clinical presentation, and patterns of initial healthcare use, were collected. We analyzed whether delayed treatment was associated with all-cause mortality using a multivariate binary logistic regression model adjusted for age, sex, cardiovascular disease, and malignancy. RESULTS: In total, 55 (36.4%) patients had delayed treatment. The median interval between the first medical visit and treatment initiation was 9 days. Compared with timely treatment, delayed treatment was associated with an initial visit to a non-pulmonology department [adjusted odds ratio (aOR) = 10.49; 95% confidence interval (CI), 2.56-42.93] and absence of a nucleic acid amplification test (aOR = 7.54; 95% CI, 2.75-20.67). After adjusting for age, sex, cardiovascular disease, and solid malignancies, delayed treatment was significantly associated with all-cause mortality (aOR = 3.79; 95% CI, 1.36-10.58). The most frequent possible cause of delayed treatment was the doctor's low suspicion of active TB disease. CONCLUSIONS: Given that delayed treatment is associated with worse outcomes in South Korea, targeted interventions to raise awareness of TB in the healthcare community are necessary, along with additional mycobacterial testing and referral of patients with suspected TB to specialists.
ABSTRACT
OBJECTIVES: The aim was to determine whether various clinical specimens obtained from COVID-19 patients contain the infectious virus. METHODS: To demonstrate whether various clinical specimens contain the viable virus, we collected naso/oropharyngeal swabs and saliva, urine and stool samples from five COVID-19 patients and performed a quantitative polymerase chain reaction (qPCR) to assess viral load. Specimens positive with qPCR were subjected to virus isolation in Vero cells. We also used urine and stool samples to intranasally inoculate ferrets and evaluated the virus titres in nasal washes on 2, 4, 6 and 8 days post infection. RESULTS: SARS-CoV-2 RNA was detected in all naso/oropharyngeal swabs and saliva, urine and stool samples collected between days 8 and 30 of the clinical course. Notably, viral loads in urine, saliva and stool samples were almost equal to or higher than those in naso/oropharyngeal swabs (urine 1.08 ± 0.16-2.09 ± 0.85 log10 copies/mL, saliva 1.07 ± 0.34-1.65 ± 0.46 log10 copies/mL, stool 1.17 ± 0.32 log10 copies/mL, naso/oropharyngeal swabs 1.18 ± 0.12-1.34 ± 0.30 log10 copies/mL). Further, viable SARS-CoV-2 was isolated from naso/oropharyngeal swabs and saliva of COVID-19 patients, as well as nasal washes of ferrets inoculated with patient urine or stool. DISCUSSION: Viable SARS-CoV-2 was demonstrated in saliva, urine and stool samples from COVID-19 patients up to days 11-15 of the clinical course. This result suggests that viable SARS-CoV-2 can be secreted in various clinical samples and respiratory specimens.
ABSTRACT
BACKGROUND: Liberation and extubation are important for patients supported by mechanical ventilation. Extubation success is related to the duration of intensive care unit (ICU) stay and the mortality rate. High-flow nasal cannula (HFNC) oxygen therapy has physiological and clinical benefits in respiratory care. The present study compared clinical outcomes associated with HFNC and conventional oxygen therapy (COT) among patients at high risk for reintubation. METHODS: A single-center randomized clinical trial was conducted between March 2018 and June 2019. Sixty adults admitted to the ICU who were at high risk of reintubation and met the inclusion criteria were enrolled. "High risk" for reintubation was defined as having at least one of the following risk factors: age > 65 years, Acute Physiology and Chronic Health Evaluation II score > 12 points on extubation day, obesity, poor expectoration, airway patency problems, difficult or prolonged weaning, and more than one comorbidity. The primary outcome of interest was reintubation within 72 hours. Secondary outcomes included duration of ICU and hospital stay, mortality rate, and time to reintubation. RESULTS: Of the 60 patients, 31 received HFNC and 29 received COT (mean age, 78 ± 7.8 vs. 76 ± 6.5 years, respectively). The reintubation rate within 72 hours did not differ between the groups (3 patients [9.7%] vs. 1 patient [3.4%], respectively). Time to reintubation was shorter among patients who received COT than among those who received HFNC (0.5 hours vs. 25 hours), but this difference was not statistically significant. Duration of ICU stay did not differ between the groups (14.7 ± 9.6 days vs. 13.8 ± 15.7 days for HFNC and COT, respectively). CONCLUSION: Among patients at high risk for reintubation, HFNC did not reduce the risk of reintubation within 72 hours compared with COT.
ABSTRACT
Blood sampling via a heparin-locked central venous catheter, including for coagulation tests, is possible in accordance with the Clinical & Laboratory Standards Institute guidelines. However, differences exist between the test values of samples obtained from central venous catheters and those obtained from peripheral veins, even when the guidelines are followed. This study aimed to compare coagulation times between blood samples drawn from heparin-locked central venous catheters and from peripheral veins. In total, 72 hospitalized patients with heparin-locked Hickman catheters were enrolled. Blood samples for coagulation testing were obtained simultaneously via the peripheral veins and the heparin-locked Hickman catheters. For sampling from the catheters, 0.9% sodium chloride flushing was performed and 10 or 23 ml of blood was discarded before collecting the coagulation test samples. Correlation, Bland-Altman plot, covariate, and regression analyses were performed. Despite following the guidelines, the activated partial thromboplastin time values differed between sampling sites. In the 10 ml discard group, the correlation coefficient was 0.378 with a mean bias of 6.46 s, while in the 23 ml discard group, the correlation coefficient was 0.80 with a mean bias of 2.518 s. Therefore, the volume of blood discarded from heparin-locked Hickman catheters may affect activated partial thromboplastin time values.
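The mean bias figures above are the core of a Bland-Altman comparison: the average of the paired differences between the two sampling routes, with 95% limits of agreement set at ±1.96 standard deviations of those differences. A minimal sketch of that computation (illustrative, not the study's analysis script):

```python
import statistics

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement for paired measurements.
    a, b: equal-length sequences of paired values (e.g. catheter vs vein)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

A bias near zero with narrow limits of agreement would indicate the catheter samples are interchangeable with peripheral samples; the 6.46 s bias in the 10 ml group versus 2.518 s in the 23 ml group is what motivates the discard-volume conclusion.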
ABSTRACT
A globally imminent shortage of freshwater has been demanding viable strategies for improving desalination efficiencies with the adoption of cost- and energy-efficient membrane materials. The recently explored 2D transition metal dichalcogenides (2D TMDs) of near atomic thickness have been envisioned to offer notable advantages as high-efficiency membranes owing to their structural uniqueness; that is, extremely small thickness and intrinsic atomic porosity. Despite theoretically projected advantages, experimental realization of near atom-thickness 2D TMD-based membranes and their desalination efficiency assessments have remained largely unexplored mainly due to the technical difficulty associated with their seamless large-scale integration. Herein, we report the experimental demonstration of high-efficiency water desalination membranes based on few-layer 2D molybdenum disulfide (MoS2) of only ~7 nm thickness. Chemical vapor deposition (CVD)-grown centimeter-scale 2D MoS2 layers were integrated onto porous polymeric supports with well-preserved structural integrity enabled by a water-assisted 2D layer transfer method. These 2D MoS2 membranes of near atomic thickness exhibit an excellent combination of high water permeability (>322 L m-2 h-1 bar-1) and high ionic sieving capability (>99%) for various seawater salts including Na+, K+, Ca2+, and Mg2+ with a range of concentrations. Moreover, they present near 100% salt ion rejection rates for actual seawater obtained from the Atlantic coast, significantly outperforming the previously developed 2D MoS2 layer membranes of micrometer thickness as well as conventional reverse osmosis (RO) membranes. Underlying principles behind such remarkably excellent desalination performances are attributed to the intrinsic atomic vacancies inherent to the CVD-grown 2D MoS2 layers as verified by aberration-corrected electron microscopy characterization.
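The headline membrane figures (>322 L m-2 h-1 bar-1 permeability, >99% salt rejection) follow directly from raw permeation measurements: permeability is permeate volume normalized by membrane area, time, and applied pressure, and rejection is one minus the permeate-to-feed concentration ratio. A minimal sketch with hypothetical function names:

```python
def water_permeability(volume_l, area_m2, hours, pressure_bar):
    """Membrane water permeability in L m^-2 h^-1 bar^-1
    from permeate volume under a given applied pressure."""
    return volume_l / (area_m2 * hours * pressure_bar)

def salt_rejection_pct(feed_conc, permeate_conc):
    """Salt rejection (%) from feed and permeate concentrations
    (any consistent units, e.g. g/L)."""
    return 100.0 * (1.0 - permeate_conc / feed_conc)
```

For instance, a feed at typical seawater salinity (~35 g/L) producing a 0.35 g/L permeate corresponds to 99% rejection, the threshold the membranes above are reported to exceed.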