ABSTRACT
BACKGROUND: Oral vardenafil (VDF) tablets are an effective treatment for erectile dysfunction (ED), but intranasal administration with a suitable formulation can provide a faster onset of action and more convenient planning of ED treatment. AIM: The primary purpose of this pilot clinical study was to determine whether intranasal VDF in an alcohol-based formulation results in more "user-friendly pharmacokinetics" than oral tablet administration. METHODS: This single-dose randomized crossover study was conducted in 12 healthy young volunteers who received VDF as a 10-mg oral tablet or a 3.38-mg intranasal spray. Multiple blood samples were obtained, and VDF concentrations were determined with a liquid chromatography-tandem mass spectrometry assay. Pharmacokinetic parameters were compared between treatments, and adverse events were assessed. OUTCOMES: The following pharmacokinetic parameters were obtained: apparent elimination rate constant, elimination half-life, peak concentration, peak time, total area under the curve, and relative bioavailability. RESULTS: Although the mean apparent elimination rate constant, elimination half-life, peak concentration, and total area under the curve were similar between intranasal and oral administration, the median peak time was much shorter with intranasal administration (10 vs 58 minutes, P < .001, Mann-Whitney U test). The variability of the pharmacokinetic parameters was also lower with intranasal than with oral administration. The relative bioavailability of intranasal to oral administration was 1.67. Intranasal VDF caused transient but tolerable local nasal reactions in 50% of subjects. Other adverse events (eg, headache) were similar between treatments. The incidence of adverse events was, however, significantly lower in the second treatment period after initial exposure to VDF. No serious adverse events were noted.
CLINICAL IMPLICATIONS: Intranasal VDF potentially offers a faster onset at a lower dose for the treatment of ED in patients who can tolerate its transient local adverse reactions. STRENGTHS AND LIMITATIONS: The strength of this study is its randomized crossover design. Because the study was conducted in 12 healthy young subjects, the results may not reflect those in elderly patients, who are more likely to be taking VDF for ED. Nevertheless, the differences in pharmacokinetic parameters observed in this study likely reflect the differences between intranasal and oral administration of the formulations. CONCLUSION: Our study indicated that the present VDF formulation, when administered intranasally, achieves a similar plasma concentration more rapidly and at only about one-third of the dose compared with oral administration.
Subject(s)
Erectile Dysfunction, Male, Humans, Aged, Vardenafil Dihydrochloride, Intranasal Administration, Crossover Studies, Biological Availability, Area Under Curve, Tablets, Oral Administration
ABSTRACT
BACKGROUND: Methamphetamine is an addictive drug with various effects on neurotransmitters in the central nervous system. Methamphetamine-induced encephalopathy in the absence of hyperammonemia presents a unique challenge in the clinical setting. Previously published cases of methamphetamine-induced encephalopathy suggested that methamphetamine-induced hepatotoxicity and subsequent hyperammonemia may be the cause of encephalopathy. However, the literature on methamphetamine-induced encephalopathy without hyperammonemia is limited. CASE: We present a patient with methamphetamine use disorder in acute toxicity who was disoriented, unable to ambulate independently, and poorly responsive to verbal stimuli. The patient was found to have normal ammonia levels. DISCUSSION: This patient's presentation and laboratory findings, namely normal ammonia levels, suggest a different pathophysiological pathway for methamphetamine-induced encephalopathy. One potential pathway is the direct action of methamphetamine on the central nervous system through acute disruption of neurotransmitter signaling and of the blood-brain barrier. CONCLUSION: Further research should be conducted into the prevalence and pathophysiology of methamphetamine-induced encephalopathy in the absence of hyperammonemia. KEY POINTS: Methamphetamine-induced encephalopathy (MIE) in the absence of hyperammonemia presents a unique challenge in the clinical setting. Previously published cases of MIE suggest that methamphetamine-induced hepatotoxicity and subsequent hyperammonemia may be the cause of encephalopathy. Further research should be conducted into the prevalence and pathophysiology of MIE in the absence of hyperammonemia.
Subject(s)
Brain Diseases, Chemical and Drug Induced Liver Injury, Drug-Related Side Effects and Adverse Reactions, Hyperammonemia, Methamphetamine, Humans, Hyperammonemia/chemically induced, Methamphetamine/adverse effects, Ammonia/adverse effects, Ammonia/metabolism
ABSTRACT
INTRODUCTION: Tracheostomy management is a routine aspect of care in the critical care setting. While multiple complications can arise after creation of a tracheostomy, dislodgement of the tracheostomy tube is associated with high mortality and requires rapid intervention. It is therefore important to prevent this occurrence with proper securement of the tracheostomy. In this study, we examine two methods commonly used to secure tracheostomy tubes: suturing of the lateral flanges to the skin combined with cloth neck ties versus cloth neck ties alone. METHODS: This is a retrospective study with data collected from 1355 consecutive tracheostomy cases at a single institution, a level II trauma center serving the County of San Bernardino, California. Patients were selected between 2004 and 2018 and assigned to the skin-sutured with neck ties (ST) group or the non-sutured, neck ties only (NST) group by date of tracheostomy surgery. Our study investigates the dislodgement rate of percutaneous tracheostomies secured by either of these two methods. Because dislodgement before a mature fistula tract has formed carries greater morbidity, we were specifically interested in the dislodgement rate within 7 days. RESULTS: In total, 328 cases of NST and 1027 cases of ST were collected. Overall, there was no statistically significant difference in the rate of dislodgement and accidental decannulation between NST and ST (2.32% vs 4.46%, respectively, p = 0.1476). There was also no statistically significant difference in the rate of dislodgement and accidental decannulation within 7 days between NST and ST (1.54% vs 1.11%, respectively, p = 0.5608). DISCUSSION: It takes 5-7 days for a tracheostomy tract to mature, and therefore most dislodgement occurs perioperatively within the first week after placement.
Dislodgement of the tracheostomy tube can lead to devastating complications for these patients. To our knowledge, no study has investigated dislodgement in the early post-operative period in relation to tracheostomy securement method. CONCLUSION: Due to the emergent nature of tracheostomy dislodgement and loss of airway, prevention of this complication is critical. Our investigation found no statistically significant difference in the rate of early tracheostomy dislodgement between the skin-sutured with neck ties and non-sutured, neck ties only groups. This study contributes further data to the available literature regarding tracheostomy securement methods and dislodgement rate, specifically within the early post-operative period. LEVEL OF EVIDENCE: 2b.
Subject(s)
Neck/surgery, Postoperative Complications/epidemiology, Postoperative Complications/etiology, Skin, Suture Anchors, Suture Techniques, Tracheostomy/adverse effects, Tracheostomy/methods, Adult, Cohort Studies, Female, Humans, Male, Postoperative Period, Retrospective Studies, Time Factors
ABSTRACT
OBJECTIVE: To evaluate the biocompatibility of canine fascia lata (FL) in vitro and after FL allograft implantation in dogs with clinical disease. STUDY DESIGN: In vitro experiment and small case series. SAMPLE POPULATION: Six dogs treated with allogenic freeze-dried FL. METHODS: Fibroblasts were cultured on disks of FL, polypropylene mesh (PM; negative control), and porcine small intestinal submucosa (SIS; positive control). Constructs were compared at 3, 7, and 14 days for water content, DNA amounts, scanning electron microscopy, and histology. Records of dogs treated with FL allografts with follow-up examination were reviewed for signalment, indication for surgery, surgical procedure, and outcomes. All owners were invited to complete a standardized questionnaire for long-term follow-up. RESULTS: Water content was greater in FL and SIS than in PM (P = .03). Fascia lata constructs contained more DNA compared with PM constructs at days 7 and 14 (P < .05), whereas SIS constructs did not differ from FL or PM. Fibroblasts appeared spherical and distributed throughout FL constructs, whereas they appeared stellate and remained on the surface of SIS and PM. Fascia lata allografts were implanted in six dogs with surgical conditions. No incisional complications were noted. All dogs had good to excellent long-term outcomes, except one that experienced recurrence of a perineal hernia 2 years after repair. CONCLUSION: In vitro, canine FL allowed attachment and proliferation of fibroblasts throughout layers of the graft. Canine allogenic FL was clinically well tolerated in this small population of dogs. CLINICAL SIGNIFICANCE: Allogenic FL is biocompatible and can be considered an alternative to SIS for soft tissue augmentation in dogs.
Subject(s)
Dogs/surgery, Fascia Lata/transplantation, Fibroblasts/physiology, Animals
ABSTRACT
BACKGROUND: Hospital-acquired pressure injuries are a chronic phenomenon in health care, and their prevention is an ongoing challenge. This study aims to investigate whether the application of a silicone-bordered multilayered foam dressing during the initial trauma resuscitation reduces sacral hospital-acquired pressure injury occurrence in trauma patients. METHODS: This is a single-center quality improvement study using a nonequivalent control group posttest-only design to study the effect of silicone-bordered multilayered foam dressing on the incidence of hospital-acquired pressure injuries. The study population included admitted, highest-tier trauma activations aged 18 years and older. Preimplementation 2014 data were compared with postimplementation 2018 data. RESULTS: There was no statistically significant reduction in hospital-acquired pressure injury occurrence between the control and intervention groups. Incidence rates for sacral hospital-acquired pressure injuries were 0.23% (2014) compared with 0.21% (2018). No statistically significant difference was found in hospital or intensive care lengths of stay or in injury severity. Preventive dressing costs were $7,689 annually compared with estimated treatment costs of $70,000 per hospital-acquired pressure injury. CONCLUSION: Although the reduction in hospital-acquired pressure injuries was not statistically significant, the inclusion of multidisciplinary team members in the quality improvement project led to the cultural hardwiring of hospital-acquired pressure injury prevention among all team members, beyond nursing alone.
Subject(s)
Pressure Ulcer, Adolescent, Bandages, Critical Care, Humans, Sacrococcygeal Region, Silicones
ABSTRACT
BACKGROUND: Although specific specialties and residency programs have investigated student performance factors affecting matching, there is a paucity of information from medical schools. Furthermore, factors contributing to matching into first-choice residency have not been examined. This study aimed to identify academic performance factors affecting matching into first-choice residency and into highly competitive specialties. METHODS: The authors conducted a study of 1726 graduates from their institution from 2010 to 2017 and assessed pre- and post-admission academic variables associated with matching into first-choice and highly competitive specialties. RESULTS: 53.9% of graduates matched into their first choice. This was associated with passing COMLEX Level 2 CE (p = 0.01) and PE (p = 0.02) on the first attempt and with higher COMLEX Level 2 CE and USMLE Step 2 CK scores (p < 0.001 and p = 0.002; 14.1- and 3.9-point differences in mean scores, respectively). Pre-clinical GPA (p = 0.002) and highest MCAT score (p = 0.02) were also associated, although the differences in means were < 1 point for both. Factors associated with matching into first choice included MCAT score (OR = 0.95, 95% CI = (0.92, 0.98)), Level 2 CE score (OR = 1.01, 95% CI = (1.01, 1.02)), and passing Level 2 PE (OR = 3.68, 95% CI = (1.2, 11.28)). 12% of graduates matched into high- and 63% into low-competitiveness specialties. Matching into highly competitive specialties was associated with passing COMLEX Level 1 (p < 0.001), Level 2 CE (p < 0.001), USMLE Step 1 (p < 0.001), and Step 2 CK (p = 0.03) on the first attempt. Mean scores of students matching into high- versus low-competitiveness specialties differed as follows: COMLEX Level 1 by 62.7 points, Level 2 CE by 50.5 points, USMLE Step 1 by 13.6 points, and Step 2 CK by 7 points (all p < 0.001), as did pre-clinical GPA (2.4 points, p < 0.001). Level 1 score was the strongest predictor of matching into highly competitive specialties (OR = 1.04, 95% CI = (1.02, 1.05)).
CONCLUSIONS: Licensing exam performance is important for matching into first-choice residency and into highly competitive specialties. Differences in exam scores were more pronounced for matching into highly competitive specialties than into first choice: the difference in mean scores between students matching into specialties of high versus low competitiveness was larger than that between students matching into their first-choice versus non-first-choice residency. These results may help faculty prepare students and inform curriculum design to improve matching.
Subject(s)
Academic Performance, Educational Measurement, Internship and Residency, Medical Licensure, Osteopathic Medicine/education, Medical Students/statistics & numerical data, California, Curriculum, Multivariate Analysis, Retrospective Studies, Medical Schools
ABSTRACT
BACKGROUND: Blood product mistransfusions occur when a process error causes transfusion of incompatible blood products. These events are known sources of negative patient outcomes. One such event demonstrated an institutional knowledge gap and an opportunity to reduce this source of transfusion errors. The focus of this study was to evaluate the application of point-of-care cognitive aids to bridge potentially lethal knowledge gaps in blood product-to-patient compatibility. METHODS: A patient-donor ABO antigen compatibility grid for red blood cells (RBC) and fresh frozen plasma (FFP) was developed to create a cognitive aid and a blood product safety quiz. Participants included 117 registered nurses and postgraduate medical interns who were given 2 minutes to complete the quiz to establish institutional controls. A separate group of 111 registered nurses and interns were given the same timed quiz twice, first without and then with the blood product compatibility cognitive aid. An analysis of covariance was used to compare quiz results without versus with the cognitive aid, taking specialty (nurses versus interns) and baseline score into account. The blood bank adopted the grid as a forcing function to be completed before release of blood products. RESULTS: The percentage of correct RBC answers increased from 84.7% without the cognitive aid to 98.3% with it (average improvement 13.6%, standard deviation [SD] = 18.3%, 95% confidence interval, 10.1%-17.1%, P < .0001); the percentage of correct FFP answers increased from 54.2% to 99.6% (average improvement 45.4%, SD = 20.1%, 95% confidence interval, 41.7%-49.2%, P < .0001). Participants with lower baseline RBC and FFP scores showed greater improvement in the percentage of correct answers for RBC and FFP (P < .001), respectively.
CONCLUSIONS: The use of a cognitive aid for determining blood product ABO compatibility may improve performance during a time-limited test for matching correct patient and blood product ABO type. The use of the cognitive aid as a "forcing function" before the release of blood from the blood bank and before transfusion at the bedside may reduce transfusion mismatch associated with gaps in ABO compatibility knowledge.
Subject(s)
ABO Blood-Group System, Blood Grouping and Crossmatching/standards, Blood Transfusion/standards, Health Knowledge, Attitudes, Practice, Medical Errors/prevention & control, ABO Blood-Group System/administration & dosage, Blood Grouping and Crossmatching/methods, Blood Transfusion/methods, Erythrocyte Transfusion/methods, Erythrocyte Transfusion/standards, Humans, Pilot Projects, Plasma
ABSTRACT
Spinal orthotic bracing is a common modality for treating nonoperative spinal fractures, but it carries risks. This study aimed to assess the effect of an educational intervention on critical care nurses' clinical knowledge and comfort level in managing these patients. A literature review was conducted regarding common complications associated with spinal orthotics. This information was compiled and used to create a questionnaire and a spinal orthotic course for nurses. Pre- and postassessments of nurses' knowledge regarding spinal orthotics were conducted. A total of 197 nurses completed the presentation. The ability to correctly identify thoracolumbosacral orthotics (TLSO), lumbosacral orthotics (LSO), and cervicothoracic orthotics (CTO) all significantly increased. Regarding clinical knowledge, the percentage of correct answers to the question of whether the halo vest needs to be removed for cardiopulmonary resuscitation increased from 45.2% to 100% (p < .0001), and the percentage of correct answers to the question of whether TLSO braces need to be worn at all times by patients with spinal precautions increased from 62.4% to 100% (p < .0001). Nurses' reported comfort level in caring for patients with spinal precautions increased from 94.4% before the presentation to 100% after the presentation. This quality improvement project appeared to improve critical care nurses' ability to correctly identify different types of braces and their comfort level in managing patients with spinal precautions.
Subject(s)
Clinical Competence, Conservative Treatment/nursing, Inservice Training/organization & administration, Hospital Nursing Staff/education, Orthotic Devices/adverse effects, Spinal Fractures/therapy, Conservative Treatment/methods, Female, Humans, Male, Orthotic Devices/statistics & numerical data, Patient Safety, Spinal Fractures/diagnostic imaging, Trauma Centers
ABSTRACT
PURPOSE: To report the long-term outcomes of 78 adult patients who underwent coronectomy with bone grafting (CWG) of the bony crypt. MATERIALS AND METHODS: Seventy-eight patients who underwent CWG had a minimum follow-up of 5 years (range, 5 to 9 years). Preoperative imaging and probing depths were recorded, as were yearly follow-up radiographs or cone-beam computed tomograms and yearly postoperative probing depths. RESULTS: Periodontal probing depths and bone levels on the distal surfaces of adjacent mandibular molars showed marked improvement compared with preoperative values. All retained roots maintained their immediate postoperative positions, with no migrations and no reoperations required. CONCLUSION: CWG is a predictable procedure that should be considered for patients at risk for developing sensory disturbances or for those with deeper (>4 mm) preoperative probing depths on the distal surfaces of the adjacent molars. Adding bone graft appears to aid in preventing root migration and decreasing probing depths on the distal surfaces of adjacent molars.
Subject(s)
Bone Transplantation, Tooth Crown/diagnostic imaging, Tooth Crown/surgery, Adult, Cone-Beam Computed Tomography, Follow-Up Studies, Humans, Middle Aged, Dental Radiography, Retrospective Studies, Time Factors, Treatment Outcome, Young Adult
ABSTRACT
There is a need for evidence-based scientific research to address the question of the effectiveness of acupuncture in improving clinical signs of laminitis in horses. The objective of this study was to compare lameness levels before and after 2 acupuncture treatments in horses with chronic laminitis. Twelve adult horses with chronic laminitis received 2 acupuncture treatments 1 week apart. The points were treated using dry needling, hemo-acupuncture, and aqua-acupuncture. Lameness was objectively evaluated with an inertial sensor-based lameness evaluation system (Lameness Locator) and by routine examination using American Association of Equine Practitioners scoring, before the first and 1 week after the second acupuncture treatment. Data were analyzed using the Wilcoxon signed-rank test, and P-values < 0.05 were considered statistically significant. Both the Lameness Locator (P = 0.0269) and routine lameness examination (P = 0.0039) showed a significant reduction in lameness severity. Our results support using acupuncture, along with other treatment options, in treating chronic equine laminitis.
Subject(s)
Acupuncture Therapy/veterinary, Foot Diseases/veterinary, Horse Diseases/therapy, Animal Lameness/therapy, Acupuncture Therapy/methods, Animals, Female, Foot Diseases/therapy, Gait, Hoof and Claw/pathology, Horses, Male, Treatment Outcome
ABSTRACT
OBJECTIVE: To compare the in vitro biomechanical properties of the tube knot (TB), a crimp clamp (CC) system, and the square knot (SQ) using 3 monofilament materials. STUDY DESIGN: In vitro biomechanical study. SAMPLE POPULATION: Suture loops (n = 20 per material/knot construct). METHODS: Monotonic tensile loading (single pull to failure at 300 mm/min) was performed on loops secured with each of 3 methods (TB, 5-throw SQ, and the CC system) in each of 3 materials (40# Securos® nylon, #2 polypropylene, and #2 nylon). Ultimate tensile strength, elongation, and stiffness were measured and compared by sequential 1- and 2-way ANOVA. RESULTS: Ultimate tensile strength was greatest with 40# nylon CC (mean ± SD, 293.6 ± 26.2 N), followed by TB (289.8 ± 9.2 N) and SQ (252.2 ± 8.5 N), with no significant difference between CC and TB. TB with #2 polypropylene (158.1 ± 7.4 N) and #2 nylon (126.3 ± 5.5 N) had significantly greater tensile strength than SQ with #2 polypropylene (143.6 ± 5.3 N) and #2 nylon (110.7 ± 6.2 N). Elongation at failure was significantly greater with 40# nylon TB (25.3 ± 3.2 mm) and SQ (10.8 ± 1.6 mm) than with CC (5.3 ± 1.0 mm). On 2-way ANOVA, both material and knotting method affected ultimate tensile strength, elongation at failure, and stiffness. CONCLUSION: Ultimate tensile strength of the TB was equivalent to that of the CC; however, elongation at failure was greatest for the TB, which may be of concern for clinical applications.
Subject(s)
Materials Testing/veterinary, Surgical Instruments/veterinary, Suture Techniques/veterinary, Sutures/veterinary, Tensile Strength, Animals, Biomechanical Phenomena, Suture Techniques/instrumentation
ABSTRACT
Finding effective antibiotic alternatives is crucial to managing the re-emerging health risk of Clostridium perfringens (CP) type A/G-induced avian necrotic enteritis (NE), a disease that has regained prominence in the wake of governmental restrictions on antibiotic use in poultry. Bovine lactoferrin (bLF) is known for its antimicrobial and immunomodulatory effects, but its use in chickens has yet to be fully explored. In this study, we hypothesized that bLF can accumulate in the small intestines of healthy chickens through gavage and intramuscular supplementation and could serve as a potential antibiotic alternative. Immunohistochemistry located bLF in various layers of the small intestines, and ELISA testing confirmed its accumulation. Surprisingly, sham-treated chickens also showed the presence of bLF, prompting a western blotting analysis that ruled out cross-reactivity between bLF and the avian protein ovotransferrin. Although the significance of the route of administration remains inconclusive, this study supports the hypothesis that bLF is a promising and safe antibiotic alternative with demonstrated resistance to the degradative environment of the chicken intestines. Further studies are needed to determine its beneficial pharmacological effects in CP-infected chickens.
Subject(s)
Anti-Bacterial Agents, Chickens, Clostridium Infections, Clostridium perfringens, Lactoferrin, Poultry Diseases, Animals, Lactoferrin/administration & dosage, Lactoferrin/pharmacology, Clostridium perfringens/physiology, Clostridium perfringens/drug effects, Anti-Bacterial Agents/pharmacology, Anti-Bacterial Agents/administration & dosage, Poultry Diseases/drug therapy, Poultry Diseases/prevention & control, Poultry Diseases/microbiology, Clostridium Infections/veterinary, Clostridium Infections/prevention & control, Cattle, Animal Feed/analysis, Small Intestine/drug effects, Diet/veterinary, Enteritis/veterinary, Dietary Supplements/analysis
ABSTRACT
BACKGROUND: The recent global pandemic due to severe acute respiratory syndrome coronavirus 2 resulted in a high rate of multi-organ failure and mortality in a large patient population across the world. A possible correlation between acute kidney injury (AKI) and an increased mortality rate in these patients has been suggested in the literature. METHODS: This is a two-year retrospective study of critically ill adult patients with COVID-19 who were admitted to the intensive care unit (ICU) on ventilatory support. Two groups of patients were identified: those who were directly admitted to the ICU and those who were initially admitted to the Medical Floor and later transferred to the ICU due to worsening respiratory status or a change in their hemodynamic condition. Within each group, three subgroups were created based on AKI status: those who did not develop AKI, those who developed AKI, and those with a previous history of dialysis-dependent AKI. RESULTS: The AKI subgroup had the highest mortality rate among both ICU and Floor patients. Of note, patients who were initially admitted to the Floor and later transferred to the ICU for worsening conditions also experienced a higher mortality rate if they developed AKI during their hospital stay. CONCLUSIONS: This study identified significantly higher mortality among critically ill patients who developed AKI than among those who did not. TRIAL REGISTRATION: Clinicaltrials.gov registration number NCT05964088. Date of registration: July 24, 2023.
ABSTRACT
Introduction The management of maxillofacial trauma can be challenging given its varied clinical presentations. While the location of maxillofacial fractures varies with the mechanism of injury, mandibular fractures are among the most common facial fractures. The objective of this study was to explore the differences in injury patterns, outcomes, and demographics of isolated traumatic mandibular fractures between incarcerated and general populations. Methods This retrospective study analyzed consecutive patients presenting for trauma care from January 1, 2010, to December 31, 2020, at the Arrowhead Regional Medical Center (ARMC). Patients 18 years and older were included. Patients with a mandibular fracture as the primary diagnosis at admission and discharge were identified using International Classification of Diseases, Ninth and Tenth Revision (ICD-9, ICD-10) codes. Patient demographics, including race, marital status, and insurance status, were extracted from electronic medical records. Results A total of 1080 patients with confirmed mandibular fractures were included in the final analysis. Among these patients, 87.5% (n=945) were male, 40% (n=432) were Hispanic, and the average age was 31.55 years. The most common mechanism of injury was blunt trauma secondary to assault. Compared to the general population with mandibular fractures, incarcerated patients with mandibular fractures were more likely to be male (96.1% vs 86.1% for the incarcerated and general populations, respectively, p=0.0005). No other variables differed statistically between the two groups. Conclusion The evidence from this study suggests that the patterns, outcomes, and demographics of mandibular fractures are similar in incarcerated and general populations.
ABSTRACT
The opioid overdose epidemic brought a major health crisis to the front line of public health in the United States. Early efforts focused on preventing the production, distribution, and consumption of the drugs. However, there is little information about youth populations at risk for opioid overdose and their response to targeted treatment plans. The San Bernardino County Youth Opioid Response (SBCYOR) coalition, in collaboration with the San Bernardino County (SBC) Probation Department, organized a safety net system for at-risk youth by improving communication among county resources. The program mainly focused on individuals aged 12 to 24 years in the county's detention centers, along with educational and prevention projects such as naloxone programs for first responders in the region. To describe the impact of the SBCYOR program on at-risk youth, we compared the frequencies of patients referred for and treated with medications for opioid use disorder (MOUD) at the SBC Probation Department, which was responsible for individuals aged 12 to less than 18 years, with those at the West Valley Detention Center (WVDC), which was responsible for adults (18 to 24 years of age), from September 2020 through June 2022. Similar proportions of youths were referred for treatment of opioid use disorder (OUD) at the respective sites (3.7% at the SBC Probation Department, 3.6% at WVDC). Of these, however, 78.0% were treated with MOUD at the SBC Probation Department compared with only 7.1% at WVDC. SBCYOR coalition partners were able to transform their services into a comprehensive medical and behavioral health program for the incarcerated youth population at risk for OUD.
ABSTRACT
As business drivers create pressure to see more patients in a given period, there is no reliable guidance regarding the optimal allocation of resources in ambulatory visits. Many pediatric primary care clinics set appointment lengths in five-minute increments. Defining the appointment lengths of potentially longer visits by arbitrary increments (e.g., twice the time of an acute visit) is a common experience-based scheme. However, how much additional time is really needed when the patient is new to the practice, or when the visit is arranged for preventive services, is unknown. Identifying misallocation of clinic resources is fundamental because misallocation reduces access for patients and increases practice costs. In this study, using a time-motion approach, we examined the characteristics of 372 visits in a pediatric primary care clinic.
Subject(s)
Appointments and Schedules , Organizational Efficiency , Office Visits , Process Assessment (Health Care) , Adolescent , Child , Preschool Child , Humans , Infant , Newborn Infant , Pediatrics , Pilot Projects , Medical Practice Management , Primary Health Care , Time and Motion Studies , United States

ABSTRACT
Cranial cruciate ligament deficiency (CCLD) results in internal rotational laxity of the stifle (RLS). By contrast, tibial torsion (TT) is an anatomical rotation of the tibia along its longitudinal axis. The objective of this study was to validate a dynamic radiographic technique to measure internal rotational laxity of the stifle and differentiate it from TT. Models included transection of the CCL for RLS and an osteotomy for TT. One limb within eight pairs of canine cadaveric hind limbs was randomly assigned to CCLD. The contralateral limb underwent TT, followed by CCLD. Neutral and stress radiographs were taken with the limb in a custom rotating 3-D printed positioning device before and after each modification. The position of the calcaneus on neutral views and the magnitude of its displacement under standardized torque were compared within limbs and between groups. Transection of the CCL increased the magnitude of displacement of the calcaneus by 1.6 mm (0.3-3.1 mm, p < 0.05) within limbs. The lateral calcaneal displacement (dS-dN) tended to be greater when CCLD limbs were compared to limbs with intact CCL. A magnitude of calcaneal displacement of 3.45 mm differentiated limbs with RLS from intact limbs with 87.5% sensitivity and 68.7% specificity. The calcaneus was displaced further laterally by about 3 mm on neutral radiographs (dN) when limbs with experimental TT were compared to those without TT (p < 0.05). A calcaneus located at least 3.25 mm from the sulcus differentiated limbs with TT from intact limbs with 87.5% sensitivity and 87.5% specificity. The technique reported here allowed detection of RLS, especially within limbs. A calcaneus located at least 3.25 mm from the sulcus on neutral radiographs of large dogs should prompt a presumptive diagnosis of TT.
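The cutoff performance quoted above (e.g., a 3.45 mm displacement separating RLS from intact limbs with 87.5% sensitivity and 68.7% specificity) follows directly from counting true positives and true negatives at the threshold. A minimal sketch follows; the per-limb measurements are not given in the abstract, so the displacement values below are synthetic, chosen only to illustrate the calculation:

```python
def sens_spec(affected_mm, intact_mm, cutoff_mm):
    """Sensitivity and specificity of a displacement cutoff; values at or
    above the cutoff are called positive for rotational laxity."""
    tp = sum(1 for d in affected_mm if d >= cutoff_mm)  # true positives
    tn = sum(1 for d in intact_mm if d < cutoff_mm)     # true negatives
    return tp / len(affected_mm), tn / len(intact_mm)

# Synthetic displacements (mm): 8 CCL-transected limbs, 16 intact observations
affected = [3.5, 3.6, 3.7, 3.8, 3.9, 4.0, 4.2, 3.0]
intact = [1.0, 1.2, 1.5, 1.8, 2.0, 2.2, 2.5, 2.8,
          3.0, 3.1, 3.2, 3.5, 3.6, 3.7, 3.8, 4.0]
sens, spec = sens_spec(affected, intact, 3.45)  # 7/8 and 11/16
```

With this synthetic data the cutoff recovers a sensitivity of 0.875 and a specificity of 0.6875, mirroring the figures reported for the 3.45 mm threshold.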
ABSTRACT
A traumatic brain injury (TBI) is a significant factor in injury-related deaths in the United States and may lead to complex psychological disorders. Auto-cannibalism as a sequela of TBI has yet to be reported in the literature. The current literature most often associates such behavior with psychosis, intellectual disability, or substance use. A 35-year-old male had a past medical history significant for a TBI sustained a decade earlier. He was transferred to the emergency department due to a self-inflicted wound. The patient had been scratching his arms and legs for the preceding few months and had displayed an intense new pattern of self-destructive behavior in the past week. He underwent surgical wound debridement and psychiatric evaluation before being discharged home. This case depicts the importance of regular, long-term psychiatric and neurological follow-up for patients sustaining TBIs, regardless of whether they were previously deemed stable. A greater understanding of the many factors leading to self-destructive behavior following TBIs is needed to improve patient outcomes.
ABSTRACT
Shock is the clinical presentation of circulatory failure with impaired perfusion that results in inadequate cellular oxygen utilization. Treatment requires properly identifying the type of shock affecting the patient (obstructive, distributive, cardiogenic, and/or hypovolemic). Complex cases may involve numerous contributors to each type of shock and/or multiple types of shock, which can present interesting diagnostic and management challenges to the clinician. In this case report, we present a 54-year-old male with a remote history of right lung pneumonectomy presenting with multifactorial shock including cardiac tamponade, with the expanding pericardial effusion initially compressed by postoperative fluid accumulation within the right hemithorax. While in the emergency department, the patient gradually became hypotensive with worsening tachycardia and dyspnea. A bedside echocardiogram revealed an increase in the size of the pericardial effusion. An emergent ultrasound-guided pericardial drain was inserted, with gradual improvement of his hemodynamics, followed by placement of a thoracostomy tube. This unique case highlights the importance of utilizing point-of-care ultrasound along with emergent intervention in critical resuscitation.
ABSTRACT
Serum lactate levels have been recommended as a standard in guiding resuscitation and management of post-traumatic orthopedic injuries. Studies have suggested an increased incidence of postoperative complications in trauma patients with injury severity scores (ISSs) greater than 18. However, in trauma patients without an elevated ISS, the role of lactate in guiding operative timing has not been explored. This study considers the role of lactate measurement with respect to surgical timing and prediction of postoperative complications in trauma patients with long bone fractures and an ISS less than 16. Materials and methods: A total of 164 patients, aged 18 and above, with long bone fractures and an ISS less than 16 were sampled over the last 5 years. Demographic data were ascertained. Patients were placed into two cohorts: those with a serum preoperative lactate greater than or equal to 2.0 mmol/l and those with a serum preoperative lactate less than 2.0 mmol/l. Key endpoints included hospital mortality, length of hospitalization (LOH), discharge designation, and postoperative complications. Results: A total of 148 patients had a lactate level less than 2.0 mmol/l and 16 had a lactate greater than or equal to 2.0 mmol/l. There was no significant difference in demographics between these two preoperative lactate groups. There was no statistical difference in mortality, discharge designation, LOH, or postoperative complications. Conclusion: Lactate levels assist providers in guiding resuscitative efforts in trauma patients. However, this study finds that preoperative lactate measurements and efforts made to normalize lactate levels are not correlated with mortality, LOH, or postoperative complications in trauma patients with an ISS less than 16. This study does not support preoperative lactate normalization in guiding surgical timing.