Results 1-20 of 190
1.
Am Surg; 90(6): 1187-1194, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38197391

ABSTRACT

INTRODUCTION: This study aims to compare the impact of early initiation of enteral feeding on clinical outcomes in critically ill adult trauma patients with isolated traumatic brain injuries (TBI). METHODS: We performed a retrospective cohort analysis of the American College of Surgeons Trauma Quality Program Participant Use File 2017-2021 dataset, including critically ill adult trauma patients with moderate to severe blunt isolated TBI. Outcomes included ICU length of stay (ICU-LOS), ventilation-free days (VFD), and complication rates. Timing cohorts were defined as very early (<6 hours), early (6-24 hours), intermediate (24-48 hours), and late (>48 hours). RESULTS: 9210 patients were included in the analysis, of whom 952 were in the very early enteral feeding (EF) initiation group, 652 in the early, 695 in the intermediate, and 6938 in the late group. Earlier feeding was associated with a significantly shorter ICU-LOS (very early: 7.82 days; early: 11.28; intermediate: 12.25; late: 17.55; P < .001) and more VFDs (very early: 21.72 days; early: 18.81; intermediate: 18.81; late: 14.51; P < .001). Patients with late EF had a significantly higher risk of VAP than those with very early (OR 0.21, 95% CI 0.12-0.38, P < .001) or early EF (OR 0.33, 95% CI 0.17-0.65, P = .001), and a higher risk of ARDS than the intermediate group (OR 0.23, 95% CI 0.05-0.925, P = .039). CONCLUSION: In critically ill adult trauma patients with moderate to severe isolated TBI, earlier enteral feeding was associated with significantly fewer days in the ICU, more ventilation-free days, and lower odds of VAP and ARDS, with the most favorable outcomes when feeding was initiated within 6 hours.
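The odds ratios reported above come from standard 2x2 contingency comparisons. As a minimal sketch of that calculation (the counts below are invented for illustration and are not the study's data), an OR and its Wald 95% CI can be computed as:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = cases/non-cases in group 1, c/d = cases/non-cases in group 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts only (not the study's data):
or_, lo, hi = odds_ratio_ci(10, 942, 480, 6458)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR = 0.14 (95% CI 0.08-0.27)
```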


Subjects
Brain Injuries, Traumatic; Critical Illness; Enteral Nutrition; Length of Stay; Humans; Enteral Nutrition/methods; Brain Injuries, Traumatic/therapy; Brain Injuries, Traumatic/complications; Male; Female; Retrospective Studies; Critical Illness/therapy; Middle Aged; Adult; Length of Stay/statistics & numerical data; Time Factors; Intensive Care Units; Treatment Outcome
2.
Am Surg; 89(11): 4842-4852, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37167954

ABSTRACT

INTRODUCTION: Despite the increasing amount of evidence supporting its use, cell salvage (CS) remains an underutilized resource in operative trauma care in many hospitals. We aim to evaluate the utilization of CS in adult trauma patients and associated outcomes to provide evidence-based recommendations. METHODS: A systematic review was conducted using PubMed, Google Scholar, and CINAHL. Articles evaluating clinical outcomes and the cost-effectiveness of CS in trauma patients were included. The primary study outcome was mortality rate. The secondary outcomes included complication rates (sepsis and infection) and ICU-LOS. The tertiary outcome was the cost-effectiveness of CS. RESULTS: This systematic review included 9 studies comprising a total of 1119 patients: 519 who received both CS and allogeneic transfusion and 601 who received allogeneic blood transfusions only. In-hospital mortality rates ranged from 13% to 67% in patients where CS was used vs 6%-65% in those receiving allogeneic transfusions only; however, these differences were not statistically significant (P = .21-.56). Similarly, no significant differences in sepsis rates, infection rates, or ICU-LOS were found between patients receiving CS and those receiving allogeneic transfusions alone. Of the 4 studies that provided cost comparisons, 3 found the use of CS to be significantly more cost-effective. CONCLUSIONS: Cell salvage can be used as an effective method of blood transfusion for trauma patients without compromising patient outcomes, in addition to its possible cost advantages. Future studies are needed to further investigate the long-term effects of cell salvage utilization in trauma patients.


Subjects
Blood Transfusion, Autologous; Sepsis; Adult; Humans; Blood Transfusion, Autologous/methods; Cost-Benefit Analysis; Blood Transfusion/methods; Sepsis/therapy
3.
J Surg Res; 289: 106-115, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37087837

ABSTRACT

INTRODUCTION: Although it has been established that electrolyte abnormalities are a consequence of traumatic brain injury (TBI), the degree to which electrolyte imbalances impact patient outcomes has not been fully established. We aim to determine the impact of sodium, potassium, calcium, and magnesium abnormalities on outcomes in patients with TBI. METHODS: Four databases were searched for studies on the impact of electrolyte abnormalities on outcomes for TBI patients. Outcomes of interest were mortality rates, Glasgow Outcome Scale (GOS), and intensive care unit length of stay (ICU-LOS). The search included studies published up to July 21, 2022. Articles were then screened and included if they met the inclusion and exclusion criteria. RESULTS: In total, fourteen studies met the inclusion and exclusion criteria for this systematic review. In patients with TBI, an increased mortality rate was associated with hypernatremia, hypokalemia, and hypocalcemia in the majority of studies. Both hyponatremia and hypomagnesemia were associated with worse GOS at 6 months, whereas both hyponatremia and hypernatremia were associated with increased ICU-LOS. There was no evidence that other electrolyte imbalances were associated with either GOS or ICU-LOS. CONCLUSIONS: Hyponatremia and hypomagnesemia were associated with worse GOS. Hypernatremia was associated with increased mortality and ICU-LOS. Hypokalemia and hypocalcemia were associated with increased mortality. Given these findings, future practice guidelines should consider the effects of electrolyte abnormalities on outcomes in TBI patients prior to establishing management strategies.


Subjects
Brain Injuries, Traumatic; Hypernatremia; Hypocalcemia; Hypokalemia; Hyponatremia; Water-Electrolyte Imbalance; Humans; Hypernatremia/etiology; Hypokalemia/etiology; Hyponatremia/etiology; Hypocalcemia/epidemiology; Hypocalcemia/etiology; Brain Injuries, Traumatic/complications; Water-Electrolyte Imbalance/etiology; Electrolytes
5.
Parkinsonism Relat Disord; 103: 60-68, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36063706

ABSTRACT

OBJECTIVE: To systematically evaluate structural MRI and diffusion MRI features for cross-sectional discrimination and tracking of longitudinal disease progression in early multiple system atrophy (MSA). METHODS: In a prospective, longitudinal study of synucleinopathies with imaging on 14 controls and 29 MSA patients recruited at an early disease stage (15 with the predominant cerebellar ataxia subtype, MSA-C, and 14 with the predominant parkinsonism subtype, MSA-P), we computed regional morphometric and diffusion MRI features. We identified morphometric features by ranking them based on their ability to distinguish MSA-C from controls and MSA-P from controls, and evaluated diffusion changes in these regions. For the top-performing regions, we evaluated their utility for tracking longitudinal disease progression using imaging from 12-month follow-up and computed sample size estimates for a hypothetical clinical trial in MSA. We also computed these selected morphometric features in an independent validation dataset. RESULTS: We found that morphometric changes in the cerebellar white matter, brainstem, and pons can separate early MSA-C patients from controls both cross-sectionally and longitudinally (p < 0.01). The putamen and striatum, though useful for separating early MSA-P patients from control subjects at baseline, were not useful for tracking MSA disease progression. Cerebellar white matter diffusion changes aided in capturing early disease-related degeneration in MSA. INTERPRETATION: Regardless of the clinically predominant features at the time of MSA assessment, brainstem and cerebellar pathways progressively deteriorate with disease progression. Quantitative measurements of these regions are promising biomarkers for MSA diagnosis in the early disease stage and potential surrogate markers for future MSA clinical trials.
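The abstract mentions sample size estimates for a hypothetical MSA trial. A common back-of-envelope version is the normal-approximation formula for a two-arm comparison of means; the effect size and SD below are illustrative assumptions, not values from the study:

```python
import math

def n_per_arm(delta, sd, z_alpha=1.959964, z_beta=0.841621):
    """Per-arm sample size to detect a mean difference `delta` with
    common SD `sd` at two-sided alpha = 0.05 and 80% power."""
    return math.ceil(2 * ((z_alpha + z_beta) * sd / delta) ** 2)

# Illustrative: detect a 0.5-unit difference in an imaging progression
# measure with SD 1.0 between treatment and placebo arms.
print(n_per_arm(0.5, 1.0))  # 63
```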


Subjects
Multiple System Atrophy; Humans; Multiple System Atrophy/diagnostic imaging; Prospective Studies; Longitudinal Studies; Cross-Sectional Studies; Magnetic Resonance Imaging/methods; Cerebellum/diagnostic imaging; Disease Progression; Biomarkers; Diagnosis, Differential
6.
Diabet Med; 38(5): e14430, 2021 May.
Article in English | MEDLINE | ID: mdl-33073393

ABSTRACT

AIMS: Sustained engagement in type 1 diabetes self-management behaviours is a critical element in achieving improvements in glycated haemoglobin (HbA1c) and minimising risk of complications. Evaluations of self-management programmes, such as Dose Adjustment for Normal Eating (DAFNE), typically find that initial improvements are rarely sustained beyond 12 months. This study identified behaviours involved in sustained type 1 diabetes self-management, their influences and relationships to each other. METHODS: A mixed-methods study was conducted following the first two steps of the Behaviour Change Wheel framework. First, an expert stakeholder consultation identified behaviours involved in self-management of type 1 diabetes. Second, three evidence sources (systematic review, healthcare provider-generated 'red flags' and participant-generated 'frequently asked questions') were analysed to identify and synthesise modifiable barriers and enablers to sustained self-management. These were characterised according to the Capability-Opportunity-Motivation-Behaviour (COM-B) model. RESULTS: 150 distinct behaviours were identified and organised into three self-regulatory behavioural cycles, reflecting different temporal and situational aspects of diabetes self-management: Routine (e.g. checking blood glucose), Reactive (e.g. treating hypoglycaemia) and Reflective (e.g. reviewing blood glucose data to identify patterns). Thirty-four barriers and five enablers were identified: 10 relating to Capability, 20 to Opportunity and nine to Motivation. CONCLUSIONS: Multiple behaviours within three self-management cycles are involved in sustained type 1 diabetes self-management. There are a wide range of barriers and enablers that should be addressed to support self-management behaviours and improve clinical outcomes. The present study provides an evidence base for refining and developing type 1 diabetes self-management programmes.


Subjects
Diabetes Mellitus, Type 1/therapy; Motivation/physiology; Self-Management; Diabetes Mellitus, Type 1/epidemiology; Diabetes Mellitus, Type 1/psychology; Expert Testimony/statistics & numerical data; Health Behavior/physiology; Humans; Patient Advocacy/statistics & numerical data; Psychosocial Support Systems; Self-Management/methods; Self-Management/psychology; Self-Management/statistics & numerical data; Social Behavior; Systematic Reviews as Topic; United Kingdom/epidemiology
7.
Ecol Evol; 10(16): 8838-8854, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32884661

ABSTRACT

Population growth is highly sensitive to changes in reproductive rates for many avian species. Understanding how reproductive rates are related to environmental conditions can give managers insight into factors contributing to population change. Harvest trends of eastern wild turkey in northeastern South Dakota suggest a decline in abundance. We investigated factors influencing reproductive success of this important game bird to identify potential factors contributing to the decline. We monitored nesting rate, nest survival, renesting rate, clutch size, hatchability, and poult survival of 116 eastern wild turkey hens using VHF radio transmitters during the springs and summers of 2017 and 2018. Heavier hens were more likely to attempt to nest than lighter hens, and adult hens were more likely to renest than yearling hens. Nest survival probability was lowest in agricultural fields relative to all other cover types and positively related to horizontal visual obstruction and distance to the nearest road. Daily nest survival probability demonstrated an interaction between temperature and precipitation, such that nest survival probability was lower on warm, wet days, but lowest on dry days. Egg predation was the leading cause of nest failure, followed by haying of the nest bowl and death of the incubating hen. Poults reared by adult hens had a greater probability of survival than poults reared by yearling hens. Our estimate of survival probability of poults raised by yearling hens was low relative to other studies, which may be contributing to the apparent regional population decline. However, there is little managers can do to influence poult survival in yearling hens. Alternatively, we found nest survival probability was lowest for nests initiated in agricultural fields. Wildlife-friendly harvesting practices such as delayed haying or installation of flushing bars could help increase productivity of eastern wild turkey in northeastern South Dakota.
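Daily nest survival probabilities like those modeled above are typically compounded over the full incubation period to estimate overall nest survival. A minimal sketch (the daily rate and 28-day incubation length are illustrative assumptions, not estimates from this study):

```python
def period_survival(daily_survival, days):
    """Overall survival probability over `days` given a constant
    daily survival rate (Mayfield-style compounding)."""
    return daily_survival ** days

# Illustrative: a 0.97 daily survival rate over a 28-day incubation
# implies that roughly 43% of nests survive to hatch.
print(round(period_survival(0.97, 28), 3))  # 0.426
```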

8.
PeerJ; 8: e9143, 2020.
Article in English | MEDLINE | ID: mdl-32523807

ABSTRACT

We investigated survival and cause-specific mortality for a mountain goat (Oreamnos americanus) population during a period when the puma (Puma concolor) population was growing in the Black Hills, South Dakota, 2006-2018. We obtained survival data from 47 adult goats (n = 33 females, n = 14 males). Annual survival varied from 0.538 (95% CI [0.285-0.773]) to 1.00 (95% CI [1.00-1.00]), and puma predation was the primary cause-specific mortality factor over the 12-year period. Cumulative hectares of mountain pine beetle (Dendroctonus ponderosae) disturbance was a covariate of importance (wi = 0.972; β = 0.580, 95% CI [0.302-0.859]) influencing survival. To our knowledge, this is the first account of puma being the primary mortality factor of mountain goats over a long-term study. The Black Hills system is unique because we could examine the expanded realized niche of puma in the absence of other large carnivores and their influence on mountain goats. We hypothesize that puma were being sustained at higher densities by alternate prey sources (e.g., white-tailed deer, Odocoileus virginianus) and that this small population of mountain goats was susceptible to predation by one or several specialized puma in the Black Hills. However, we also hypothesize that a changing landscape with increased tree mortality due to insect infestation provided conditions for better predator detection by goats and increased survival. Alternatively, open canopy conditions may have increased understory forage production, potentially increasing mountain goat survival, but we did not evaluate this relationship. Survival and mortality rates of mountain goats should continue to be monitored, as this small population may be highly susceptible to population declines due to slow growth rates.
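The covariate importance value reported above (wi = 0.972) is an Akaike model weight. As a hedged sketch of how such weights are derived from candidate models' AIC scores (the AIC values below are invented for illustration, not from the study):

```python
import math

def akaike_weights(aics):
    """Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    where delta_i is each model's AIC difference from the best model."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three candidate survival models:
w = akaike_weights([100.0, 107.1, 110.0])
print([round(x, 3) for x in w])  # [0.966, 0.028, 0.007]
```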

9.
Mayo Clin Proc; 95(6): 1195-1205, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32498775

ABSTRACT

OBJECTIVE: To report population age-specific prevalence of core cerebrovascular disease lesions (infarctions, cerebral microbleeds, and white-matter hyperintensities detected with magnetic resonance imaging); estimate cut points for white-matter hyperintensity positivity; investigate sex differences in prevalence; and estimate prevalence of any core cerebrovascular disease features. PATIENTS AND METHODS: Participants in the population-based Mayo Clinic Study of Aging aged 50 to 89 years underwent fluid-attenuated inversion recovery and T2* gradient-recalled echo magnetic resonance imaging to assess cerebrovascular disease between October 10, 2011, and September 29, 2017. We characterized each participant as having infarct, normal versus abnormal white-matter hyperintensity, cerebral microbleed, or a combination of lesions. Prevalence of cerebrovascular disease biomarkers was derived through adjustment for nonparticipation and standardization to the population of Olmsted County, Minnesota. RESULTS: Among 1462 participants without dementia (median [range] age, 68 [50 to 89] y; men, 52.7%), core cerebrovascular disease features increased with age. Prevalence (95% CI) of cerebral microbleeds was 13.6% (11.6%-15.6%); infarcts, 11.7% (9.7%-13.8%); and abnormal white-matter hyperintensity, 10.7% (8.7%-12.6%). Infarcts and cerebral microbleeds were more common among men. In contrast, abnormal white-matter hyperintensity was more common among women aged 60 to 79 y and men aged 80 y and older. Prevalence of any core cerebrovascular disease feature, defined as the presence of at least one cerebrovascular disease feature, increased from 9.5% (ages 50 to 59 y) to 73.8% (ages 80 to 89 y). CONCLUSION: Although this study focused on participants without dementia, the high prevalence of cerebrovascular disease imaging lesions in elderly persons makes assignment of clinical relevance to cognition and other downstream manifestations more probabilistic than deterministic.
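The prevalence intervals above are adjusted for nonparticipation and standardized to the county population, so they will not match a naive calculation exactly. Still, a crude unadjusted Wald interval (my own simplification, not the study's method) shows the scale of the uncertainty:

```python
import math

def wald_ci(p, n, z=1.96):
    """Crude Wald 95% CI for a proportion. No nonparticipation
    adjustment or population standardization, unlike the study."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Cerebral microbleeds: 13.6% of 1462 participants.
lo, hi = wald_ci(0.136, 1462)
print(f"{lo:.1%} - {hi:.1%}")  # 11.8% - 15.4%
```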


Subjects
Aging/physiology; Cerebrovascular Disorders/diagnostic imaging; Aged; Aged, 80 and over; Cerebrovascular Disorders/epidemiology; Cerebrovascular Disorders/pathology; Cohort Studies; Female; Humans; Magnetic Resonance Imaging/methods; Male; Middle Aged; Prevalence; Sex Distribution; White Matter/diagnostic imaging; White Matter/pathology
10.
Ecol Evol; 10(7): 3491-3502, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32274004

ABSTRACT

Chronic pathogen carriage is one mechanism that allows diseases to persist in populations. We hypothesized that persistent or recurrent pneumonia in bighorn sheep (Ovis canadensis) populations may be caused by chronic carriers of Mycoplasma ovipneumoniae (Mo). Our experimental approach allowed us to address a conservation need while investigating the role of chronic carriage in disease persistence. We tested our hypothesis in two bighorn sheep populations in South Dakota, USA. We identified and removed Mo chronic carriers from the Custer State Park (treatment) population. Simultaneously, we identified carriers but did not remove them from the Rapid City population (control). We predicted removal would result in decreased pneumonia, mortality, and Mo prevalence. Both population ranges had similar habitat and predator communities but were sufficiently isolated to preclude intermixing. We classified chronic carriers as adults that consistently tested positive for Mo carriage over a 20-month sampling period (n = 2 in the treatment population; n = 2 in the control population). We failed to detect Mo or pneumonia in the treatment population after chronic carrier removal, while both remained in the control. Mortality hazard for lambs was reduced by 72% in the treatment population relative to the control (CI = 36%, 91%). There was also a 41% reduction in adult mortality hazard attributable to the treatment, although this was not statistically significant (CI = 82% reduction, 34% increase). Synthesis and Applications: These results support the hypothesis that Mo is a primary causative agent of persistent or recurrent respiratory disease in bighorn sheep populations and can be maintained by a few chronic carriers. Our findings provide direction for future research and management actions aimed at controlling pneumonia in wild sheep and may apply to other diseases.

11.
J Trauma Acute Care Surg; 88(3): 372-378, 2020 Mar.
Article in English | MEDLINE | ID: mdl-32107352

ABSTRACT

BACKGROUND: On the morning of June 12, 2016, an armed assailant entered the Pulse Nightclub in Orlando, Florida, and initiated an assault that killed 49 people and injured 53. The regional Level I trauma center and two community hospitals responded to this mass casualty incident. A detailed analysis was performed to guide hospitals who strive to prepare for future similar events. METHODS: A retrospective review of all victim charts and/or autopsy reports was performed to identify victim presentation patterns, injuries sustained, and surgical resources required. Patients were stratified into three groups: survivors who received care at the regional Level I trauma center, survivors who received care at one of two local community hospitals, and decedents. RESULTS: Of the 102 victims, 40 died at the scene and 9 died upon arrival to the Level I trauma center. The remaining 53 victims received definitive medical care and survived. Twenty-nine victims were admitted to the trauma center and five victims to a community hospital. The remaining 19 victims were treated and discharged that day. Decedents sustained significantly more bullet impacts than survivors (4 ± 3 vs. 2 ± 1; p = 0.008) and body regions injured (3 ± 1 vs. 2 ± 1; p = 0.0002). Gunshots to the head, chest, and abdominal body regions were significantly more common among decedents than survivors (p < 0.0001). Eighty-two percent of admitted patients required surgery in the first 24 hours. Essential resources in the first 24 hours included trauma surgeons, emergency room physicians, orthopedic/hand surgeons, anesthesiologists, vascular surgeons, interventional radiologists, intensivists, and hospitalists. CONCLUSION: Mass shooting events are associated with high mortality. Survivors commonly sustain multiple, life-threatening ballistic injuries requiring emergent surgery and extensive hospital resources. 
Given the increasing frequency of mass shootings, all hospitals must have a coordinated plan to respond to a mass casualty event. LEVEL OF EVIDENCE: Epidemiological Study, level V.


Subjects
Disaster Planning/organization & administration; Emergency Medical Services/organization & administration; Mass Casualty Incidents; Wounds, Gunshot/therapy; Florida/epidemiology; Hospitals, Community/organization & administration; Humans; Retrospective Studies; Trauma Centers/organization & administration; Wounds, Gunshot/mortality
12.
Ecol Evol; 9(20): 11791-11798, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31695888

ABSTRACT

Evaluating relationships between ecological processes that occur concurrently is complicated by the potential for such processes to covary. Ground-nesting birds rely on habitat characteristics that provide visual and olfactory concealment from predators; this protection often is provided by vegetation at the nest site. Recently, researchers have raised concern that measuring vegetation characteristics at nest fate (success or failure) introduces a bias, as vegetation at successful nests is measured later in the growing season (and has more time to grow) compared with failed nests. In some systems, this bias can lead to an erroneous conclusion that plant height is positively associated with nest survival. However, if the features that provide concealment are invariant during the incubation period, no bias should be expected, and the timing of measurement is less influential. We used data collected from 98 nests to evaluate whether there is evidence that such a bias exists in a study of wild turkey (Meleagris gallopavo) nesting in a montane forest ecosystem. We modeled nest survival as a function of visual obstruction and other covariates of interest. At unsuccessful nests, we collected visual obstruction readings at both the date of nest failure and the projected hatch date and compared survival estimates generated using both sets of vegetation data. In contrast to studies in grassland and shrubland systems, we found little evidence that the timing of vegetation sampling influenced conclusions regarding the association between visual obstruction and nest survival; model selection and estimates of nest survival were similar regardless of when vegetation data were collected. The dominant hiding cover at most of our nests was provided by evergreen shrubs; retention of leaves and slow growth of these plants likely prevent appreciable changes in visual obstruction during the incubation period. 
When considered in aggregate with a growing body of literature, our results suggest that the influence of timing of vegetation sampling depends on the study system. When designing future studies, investigators should carefully consider the type of structures that provide nest concealment and whether plant phenology is confounded with nest survival.

13.
Evol Appl; 12(9): 1823-1836, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31548860

ABSTRACT

The influence of human harvest on evolution of secondary sexual characteristics has implications for sustainable management of wildlife populations. The phenotypic consequences of selectively removing males with large horns or antlers from ungulate populations have been a topic of heightened concern in recent years. Harvest can affect size of horn-like structures in two ways: (a) shifting age structure toward younger age classes, which can reduce the mean size of horn-like structures, or (b) selecting against genes that produce large, fast-growing males. We evaluated effects of age, climatic and forage conditions, and metrics of harvest on horn size and growth of mountain sheep (Ovis canadensis ssp.) in 72 hunt areas across North America from 1981 to 2016. In 50% of hunt areas, changes in mean horn size during the study period were related to changes in age structure of harvested sheep. Environmental conditions explained directional changes in horn growth in 28% of hunt areas, 7% of which did not exhibit change before accounting for effects of the environment. After accounting for age and environment, horn size of mountain sheep was stable or increasing in the majority (~78%) of hunt areas. Age-specific horn size declined in 44% of hunt areas where harvest was regulated solely by morphological criteria, which supports the notion that harvest practices that are simultaneously selective and intensive might lead to changes in horn growth. Nevertheless, phenotypic consequences are not a foregone conclusion in the face of selective harvest; over half of the hunt areas with highly selective and intensive harvest did not exhibit age-specific declines in horn size. Our results demonstrate that while harvest regimes are an important consideration, horn growth of harvested male mountain sheep has remained largely stable, indicating that changes in horn growth patterns are an unlikely consequence of harvest across most of North America.

14.
PeerJ; 7: e7185, 2019.
Article in English | MEDLINE | ID: mdl-31293830

ABSTRACT

Percent of body fat and physiological stress are important correlates of wildlife demographics. We studied winter percent of body fat and physiological stress levels for a declining elk (Cervus canadensis nelsoni) population in South Dakota, 2011-2013. We obtained percent of winter body fat, pregnancy status, lactation status, and physiological stress data from 58 adult females (2+ years old). We compared physiological stress level data from 2011 with data collected from this same herd when elk densities were much higher (1995-1997). Our objectives were to determine percent of body fat during winter, examine whether winter body fat was correlated with pregnancy and lactation status, and quantify physiological stress hormone values and compare them with those from the mid-1990s. The probability of being pregnant increased with higher winter nutritional condition (percent of body fat), whereas females with a higher probability of previously lactating were lower in winter body fat. Mean fecal glucocorticoid metabolite (FGM) levels in 2011 (mean = 47.78 ng/g, SE = 2.37) were higher during summer compared with data collected in 1995-1997 (mean = 34.21 ng/g, SE = 3.71); however, mean FGM levels during winter did not differ between the two time periods. Although summer levels of FGM have significantly increased since the mid-1990s, we caution against any interpretation of the effect of increased FGM levels on elk fitness, as the increase may not be biologically significant. Mean winter percent of body fat of elk was lower than that of other populations in the west, but this difference does not appear to be limiting vital rates and population growth for this elk herd. We recommend future research focus on summer/autumn data collection to provide a more comprehensive understanding of percent of body fat for elk in our region.

15.
Diabet Med; 36(8): 995-1002, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31004370

ABSTRACT

AIM: To estimate the healthcare costs of diabetic foot disease in England. METHODS: Patient-level data sets at a national and local level, and evidence from clinical studies, were used to estimate the annual cost of health care for foot ulceration and amputation in people with diabetes in England in 2014-2015. RESULTS: The cost of health care for ulceration and amputation in diabetes in 2014-2015 is estimated at between £837 million and £962 million; 0.8% to 0.9% of the National Health Service (NHS) budget for England. More than 90% of expenditure was related to ulceration, and 60% was for care in community, outpatient and primary settings. For inpatients, multiple regression analysis suggested that ulceration was associated with a length of stay 8.04 days longer (95% confidence interval 7.65 to 8.42) than that for diabetes admissions without ulceration. CONCLUSIONS: Diabetic foot care accounts for a substantial proportion of healthcare expenditure in England, more than the combined cost of breast, prostate and lung cancers. Much of this expenditure arises through prolonged and severe ulceration. If the NHS were to reduce the prevalence of diabetic foot ulcers in England by one-third, the gross annual saving would be more than £250 million. Diabetic foot ulceration is a large and growing problem globally, and it is likely that there is potential to improve outcomes and reduce expenditure in many countries.
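The projected saving can be sanity-checked against the other figures in the abstract. Assuming (my reading, not stated explicitly) that a one-third reduction in ulcer prevalence scales the ulcer-related share of the cost proportionally:

```python
# Reported total annual cost of diabetic foot care, England 2014-2015.
low_total, high_total = 837e6, 962e6   # GBP
ulcer_share = 0.90                     # ">90% of expenditure" on ulceration

# A one-third reduction in ulcer prevalence, scaled proportionally,
# lands just above the GBP 250M figure quoted in the conclusion:
saving_low = low_total * ulcer_share / 3
saving_high = high_total * ulcer_share / 3
print(f"~£{saving_low/1e6:.0f}M to £{saving_high/1e6:.0f}M")  # ~£251M to £289M
```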


Subjects
Amputation, Surgical/economics; Diabetic Foot/economics; State Medicine/economics; Ambulatory Care/economics; Community Health Services/economics; Costs and Cost Analysis; Diabetic Foot/surgery; England; Female; Health Care Costs; Hospitalization/economics; Humans; Male; Postoperative Care/economics; Prospective Studies
16.
Clin Nutr; 38(4): 1828-1832, 2019 Aug.
Article in English | MEDLINE | ID: mdl-30086999

ABSTRACT

INTRODUCTION: The management of intestinal failure (IF) requires safe and sustained delivery of parenteral nutrition (PN). The long-term maintenance of central venous catheter (CVC) access is therefore vital, with meticulous catheter care and salvage of infected CVCs being of prime importance. CVC-related infection and loss of intravenous access are important causes of morbidity and mortality in IF. Avoidance, prompt recognition and appropriate management of CVC-related infections are crucial components of IF care. However, there are few, if any, data on the occurrence of CVC-related infections in patients with acute, type 2, IF managed on a dedicated IF unit and no data on the salvage outcomes of infected CVCs in this group of patients. METHODS: This is a retrospective observational study conducted between January 2011 and July 2017. All patients with acute, type 2 IF newly admitted to a national U.K. IF unit (IFU) during these dates were included. All patients admitted to the unit with a CVC in place underwent immediate 'screening' paired central and peripheral blood cultures on arrival before the CVC was used for any infusate. A prospectively maintained database was used to record all confirmed catheter-related blood stream infections (BSI)/colonisations, demographic and clinical data. Diagnosis of catheter-related BSI/colonisation was based on quantitative and qualitative analysis of paired central and peripheral blood cultures. A standardized 10-14-day catheter salvage treatment protocol involving antibiotic and urokinase CVC locks and systemic antibiotic administration was used to salvage any infected or colonised CVCs, as appropriate. The CVC was not used for PN until successful salvage had been confirmed by negative blood cultures drawn 48 h after antibiotic completion. The development of a subsequent catheter-related BSI was recorded for all patients, both during the remaining in-patient stay on the IFU and after discharge home on PN. 
RESULTS: Of the 509 patients with type 2 IF admitted to the IFU during the study period, 341 (54% female; mean age 54.6 years (range 16-86)) had an indwelling CVC that had been placed in the referring hospital. Surgical complications and mesenteric ischaemia were the most common underlying disease aetiologies. Sixty-five of 341 (19.1%) patients had an infected/colonised CVC on the initial screening set of blood cultures. A successful CVC salvage rate of 91% was achieved in this cohort after antibiotic therapy. The subsequent in-patient catheter-related BSI rate for those admitted with a CVC (n = 341) on the IFU was 0.042 per 1000 catheter days, over a total of 23,548 in-patient catheter days. Two hundred and seventy-nine of 341 patients were discharged on home PN (HPN), with a subsequent catheter-related BSI rate on HPN of 0.22 per 1000 catheter days (mean duration of HPN = 778 catheter days (range:)) over a follow-up period of 216,944 out-patient catheter days. There was no increased risk of HPN-related catheter-related BSI (p = 0.09) or mortality (p = 0.4) in those admitted with an infected CVC. CONCLUSION: This is the first study to report catheter-related BSI/colonisation rates and salvage outcomes in patients with type 2 IF newly admitted to a dedicated IF unit. We report that nearly one-fifth of all patients were referred with evidence of a catheter-related BSI/colonisation; despite this, successful catheter salvage is possible and, with stringent CVC care, extremely low subsequent catheter-related BSI rates can be achieved and maintained, both during in-patient stay on a dedicated IF unit and after discharge on HPN. These data provide novel evidence to support ESPEN recommendations that patients with type 2 IF be managed on a dedicated IF unit.


Subjects
Catheter-Related Infections , Central Venous Catheters , Intestinal Diseases/complications , Adolescent , Adult , Aged , Aged, 80 and over , Bacteremia/complications , Bacteremia/epidemiology , Bacteremia/microbiology , Bacteremia/therapy , Catheter-Related Infections/complications , Catheter-Related Infections/epidemiology , Catheter-Related Infections/microbiology , Catheter-Related Infections/therapy , Catheterization, Central Venous/adverse effects , Central Venous Catheters/adverse effects , Central Venous Catheters/microbiology , Hospital Units , Hospitalization , Humans , Intestinal Diseases/therapy , Middle Aged , Retrospective Studies , Young Adult
17.
J Trauma Acute Care Surg ; 84(1): 133-138, 2018 01.
Article in English | MEDLINE | ID: mdl-28640779

ABSTRACT

BACKGROUND: The Society for Vascular Surgery (SVS) guidelines currently suggest thoracic endovascular aortic repair (TEVAR) for grade II-IV and nonoperative management (NOM) for grade I blunt traumatic aortic injury (BTAI). However, there is increasing evidence that grade II injuries may also be observed safely. The purpose of this study was to compare the outcomes of TEVAR and NOM for grade I-IV BTAI and to determine whether grade II injuries can be safely observed with NOM. METHODS: The records of patients with BTAI from 2004 to 2015 at a Level I trauma center were retrospectively reviewed. Patients were separated into two groups: TEVAR versus NOM. All BTAIs were graded according to the SVS guidelines. Minimal aortic injury (MAI) was defined as BTAI grade I and II. Failure of NOM was defined as aortic rupture after admission or progression on subsequent computed tomography (CT) imaging requiring TEVAR or open thoracotomy repair (OTR). Statistical analysis was performed using Mann-Whitney U and χ² tests. RESULTS: A total of 105 adult patients (≥16 years) with BTAI were identified over the 11-year period. Of these, 17 patients who died soon after arrival and 17 who underwent OTR were excluded. Of the remaining 71 patients, 30 had MAI (14 TEVAR vs. 16 NOM). There were no failures in either group. No patients with MAI in either group died from complications of aortic lesions. Follow-up CT imaging was performed on all MAI patients. Follow-up CT scans for all TEVAR patients showed stable stents with no leak. Follow-up CT in the NOM group showed progression in two patients; neither required subsequent OTR or TEVAR. CONCLUSIONS: Although the SVS guidelines suggest TEVAR for grade II-IV and NOM for grade I BTAI, NOM may be safely used in grade II BTAI. LEVEL OF EVIDENCE: Therapeutic study, level IV.


Subjects
Aorta, Thoracic/injuries , Endovascular Procedures , Vascular System Injuries/therapy , Wounds, Nonpenetrating/therapy , Adult , Aged , Female , Humans , Male , Middle Aged , Patient Selection , Retrospective Studies , Time Factors , Treatment Outcome
18.
Clin Nutr ; 37(6 Pt A): 2097-2101, 2018 12.
Article in English | MEDLINE | ID: mdl-29046259

ABSTRACT

BACKGROUND & AIMS: Prevention of catheter-related blood stream infections (CRBSI) and salvage of infected central venous catheters (CVC) are vital to maintaining long-term venous access in patients needing home parenteral nutrition (HPN). It remains unclear whether patients are best trained for catheter care at home or in hospital, or whether CRBSI rates are lower when patients self-care for the CVC. Furthermore, there are minimal data on the longer-term outcome following salvage of infected catheters and limited consensus on agreed protocols for catheter salvage. METHOD: We conducted a retrospective 5-year evaluation of CRBSI occurrence and CVC salvage outcomes in adult patients requiring HPN managed at a national UK Intestinal Failure Unit from 2012 to 2016. Prior to 2012, patients were primarily trained to administer PN in hospital; thereafter, patients underwent training at home. RESULTS: A total of 134 CRBSI were recorded in 92 patients (62 patients with a single CRBSI and 30 patients with more than 1 CRBSI) in a cohort of 559 HPN patients, with a total of 1163 HPN years. The overall CRBSI rate was 0.31 per 1000 catheter days. Coagulase-negative staphylococci (CNS) were the most common isolates (41/134 (30.5%)), followed by polymicrobial infections (14/134 (10.4%)), Klebsiella spp. (16/134 (11.9%)) and methicillin-sensitive Staphylococcus aureus (MSSA) (5/134 (3.7%)). Salvage was not attempted in 34 cases due to methicillin-resistant Staphylococcus aureus (MRSA) infection (1/34), fungal infection (13/34) or clinical instability due to sepsis (20/34). Of the 100 cases where salvage was attempted, 67% were successful. 82.8% of CNS salvage attempts were successful; there was no difference in salvage rates between CNS CRBSIs salvaged with a 10-day (22/26) or a 14-day protocol (7/9) (p = 0.4). The CRBSI rate was lowest in patients cared for by trained home care nurses, at 0.270 per 1000 catheter days (self-care: 0.342; non-medical carer (e.g. family member): 0.320) (p = 0.03).
CONCLUSION: We previously reported a sustained very low CRBSI rate in a large cohort of HPN patients in a national unit; we now further report that this is not influenced by training patients at home rather than in hospital but is influenced by the individual managing the catheter at home. CNS remains the primary cause of CRBSIs and can be successfully salvaged with a reduced duration of antibiotic therapy compared to our previous experience.
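The "per 1000 catheter days" figures quoted in these abstracts all come from the same standard calculation: number of infections divided by total catheter-days of exposure, scaled by 1000. A minimal sketch, using the numerator (134 CRBSI) and denominator (1163 HPN years) reported above and assuming a 365-day year (the exact day count used by the authors is not stated, so the result matches the reported 0.31 only to within rounding):

```python
def rate_per_1000_catheter_days(infections: int, catheter_days: float) -> float:
    """Return an infection rate expressed per 1000 catheter days."""
    return infections / catheter_days * 1000

# 1163 HPN years converted to catheter days, assuming 365 days per year.
catheter_days = 1163 * 365

rate = rate_per_1000_catheter_days(134, catheter_days)
print(f"{rate:.2f} CRBSI per 1000 catheter days")  # ≈ 0.32; the paper reports 0.31
```

The same formula reproduces the in-patient and out-patient rates in the earlier abstract (e.g. 0.042 per 1000 over 23,548 in-patient catheter days implies a single in-patient CRBSI).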


Subjects
Bacteremia , Catheter-Related Infections , Catheters , Quality Improvement , Adolescent , Adult , Aged , Aged, 80 and over , Bacteremia/epidemiology , Bacteremia/microbiology , Bacteremia/prevention & control , Bacteremia/therapy , Catheter-Related Infections/epidemiology , Catheter-Related Infections/microbiology , Catheter-Related Infections/prevention & control , Catheter-Related Infections/therapy , Catheters/microbiology , Catheters/standards , Equipment Reuse , Humans , Intestinal Diseases/therapy , Middle Aged , Retrospective Studies , Young Adult
19.
Am Surg ; 83(6): 673-676, 2017 Jun 01.
Article in English | MEDLINE | ID: mdl-28637573

ABSTRACT

Bed availability remains a constant struggle for tertiary care centers, resulting in the use of management protocols to streamline patient care and reduce length of stay (LOS). A standardized perioperative management protocol for uncomplicated acute appendicitis (UA) was implemented in April 2014 to decrease both CT scan usage and LOS. Patients who underwent laparoscopic appendectomy for UA from April 2012 to May 2013 (PRE group) and April 2014 to May 2015 (POST group) were compared retrospectively. There were no differences in patient demographics or clinical findings between the groups. All patients in the PRE group had a CT scan for the diagnosis of appendicitis, whereas there was a 14 per cent decrease in CT usage in the POST group (P = 0.002). There was a significant decrease in median LOS between the groups (PRE 1.3 vs POST 0.9 days; P < 0.001). There was no difference between the groups in subsequent emergency department visits for complications (3 (4%) vs 4 (4%); P = 1.0) or in the 30-day readmission rate (1 (1%) vs 5 (5%); P = 0.22). A standardized perioperative management protocol for UA patients significantly decreased CT scan utilization and LOS without compromising patient care.


Subjects
Appendectomy , Appendicitis/surgery , Laparoscopy , Adult , Appendectomy/methods , Appendicitis/diagnostic imaging , Body Mass Index , Female , Humans , Laparoscopy/methods , Male , Middle Aged , Retrospective Studies , Risk Factors , Treatment Outcome