Results 1 - 20 of 37
1.
Transl Anim Sci ; 8: txae038, 2024.
Article in English | MEDLINE | ID: mdl-38572172

ABSTRACT

Alfalfa is a commonly grown forage in the Intermountain West region of the United States and is often included in the diets of dairy cattle. Alfalfa provides a variety of nutrients, but its nutrient content varies with factors such as soil, region, cutting, and climate. However, alfalfa leaves tend to vary less in nutrient content than alfalfa stems. Fractionating alfalfa may therefore improve control of the nutrients provided when formulating rations for developing dairy heifers. The purpose of this study was to determine whether including fractionated alfalfa in the diet affects the growth or conception rates of developing dairy heifers. Heifers were allocated to one of three treatments for 85 d: a control group fed a typical diet (CON; n = 8), a group fed a diet that replaced alfalfa with fractionated alfalfa leaf pellets and alfalfa stems (ProLEAF MAX + ProFiber Plus; PLM + PFP; n = 8), or a group fed a diet that replaced alfalfa with alfalfa stems (PFP; n = 8). Heifers were fed individually twice daily, and weight, hip height (HH), and wither height (WH) were recorded every 14 d. Additionally, blood was collected every 28 d, and conception rates were recorded at the end of the trial. Heifers receiving the PFP diet consumed less dry matter (P = 0.001) than heifers receiving the CON diet. Analyses of nutrient intake showed that heifers receiving the PFP diet also consumed less neutral detergent fiber (P = 0.02), acid detergent fiber (P = 0.02), crude protein (P = 0.001), and net energy for maintenance (P = 0.001) than heifers consuming the CON diet; however, no differences (P > 0.10) were observed between heifers fed the CON and PLM + PFP diets. Body weight gain over the feeding period did not differ (P = 0.52) among treatments. 
Additionally, treatment did not affect average daily gain (P = 0.49), gain:feed (P = 0.82), HH gain (P = 0.20), or WH gain (P = 0.44). Treatment × time altered (P < 0.001) blood urea nitrogen when analyzed as a repeated measure. Total feed cost was lowest (P < 0.001) for the PFP diet, and cost of gain tended (P = 0.09) to be greater for the PLM + PFP diet than for the CON diet. Overall, these data indicate that including alfalfa stems in a developing heifer diet may decrease dry matter intake, lower input costs, and increase profitability without negatively impacting growth.

2.
Am J Otolaryngol ; 45(3): 104231, 2024.
Article in English | MEDLINE | ID: mdl-38513514

ABSTRACT

PURPOSE: Hyperacusis is an audiological disorder in which patients become persistently sensitive and intolerant to everyday environmental sounds. For patients who fail conservative options, a minimally invasive surgical procedure has been developed. MATERIALS & METHODS: Retrospective case series of 73 adult patients with hyperacusis who underwent oval and round window reinforcement surgery between January 2017 and June 2023. Small pieces of temporalis fascia were used to reinforce the round and oval windows. Patients were separated into two groups based on their preoperative speech Loudness Discomfort Level (LDL): patients with a preoperative speech LDL ≤ 70 dB were placed in the "low LDL group," whereas patients with a preoperative speech LDL > 70 dB were placed in the "high LDL group." Preoperative and one-week postoperative audiograms and speech LDLs were compared. Quality of life was assessed using the Glasgow Benefit Inventory (GBI) survey. RESULTS: 73 patients met inclusion criteria: 21 patients in the low LDL group and 52 in the high LDL group. Patients in the high LDL group significantly improved their LDLs by an average of 3.5 dB (P < 0.0001), and 42 patients (80.8 %) in this group had improvement and would recommend the surgery for hyperacusis. Patients in the low LDL group significantly improved their LDLs by an average of 12.9 dB (P = 0.032); ten patients (47.6 %) from this group experienced improvement and would recommend hyperacusis surgery. CONCLUSION: Many patients with hyperacusis who undergo oval and round window reinforcement achieve significant improvement in sound tolerance and quality of life. Patients with a preoperative speech LDL > 70 dB have the greatest potential for improvement with surgery (80.8 %), probably because their hyperacusis was less severe. On a 1-10 severity scale, the high LDL group (> 70 dB) improved from 8.6 preoperatively to 2.4 postoperatively, and the low LDL group (≤ 70 dB) improved from 9.2 to 6.8. 
These findings were consistent with the GBI results.


Subject(s)
Hyperacusis , Quality of Life , Round Window, Ear , Humans , Hyperacusis/surgery , Male , Female , Round Window, Ear/surgery , Retrospective Studies , Adult , Middle Aged , Treatment Outcome , Aged , Otologic Surgical Procedures/methods
3.
Am J Otolaryngol ; 45(1): 104071, 2024.
Article in English | MEDLINE | ID: mdl-37793300

ABSTRACT

OBJECTIVE: The purpose of this study is to investigate how cognition, as measured using the Self-Administered Gerocognitive Examination (SAGE), and age affect speech recognition scores in older adults (age > 65 years) at one and two years after cochlear implantation. STUDY DESIGN: This is a prospective study. SETTING: This study was conducted at a single institution. METHODS: Unilateral cochlear implantation was performed by two surgeons on adult patients (> 65 years) with postlingual bilateral sensorineural hearing loss. Of 230 patients who underwent cochlear implantation from January 2016 to June 2023, 55 completed the SAGE questionnaire before implantation, one year after implantation, and two years after implantation. Paired t-test analysis was used to evaluate pre- and postoperative speech recognition scores (CNC, AzBio in Quiet). RESULTS: Patients with normal preoperative cognition on SAGE showed greater improvement in postoperative speech recognition tests at one and two years after implantation than patients with preoperative cognitive impairment. There were no significant differences in postoperative speech outcomes between age group 1 (65 to 80 years old) and age group 2 (over 80 years old) cochlear implant recipients. There were no changes in SAGE cognitive scores two years after implantation. CONCLUSION: Cognitive function, as measured by SAGE, is a more reliable predictor than age of speech recognition improvement after cochlear implantation. Cochlear implantation did not improve postoperative cognition.


Subject(s)
Cochlear Implantation , Cochlear Implants , Speech Perception , Humans , Aged , Infant , Prospective Studies , Speech , Treatment Outcome , Cognition
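The study above compares each patient's pre- and postoperative speech recognition scores with a paired t-test, in which each subject serves as their own control. A minimal sketch with SciPy, using invented scores rather than study data:

```python
# Paired t-test sketch for pre- vs post-implantation speech scores.
# All values are hypothetical illustrations, not data from the study.
from scipy import stats

pre_scores  = [20, 35, 28, 40, 15, 30, 25, 38]   # pre-op scores (% correct)
post_scores = [55, 70, 60, 75, 50, 68, 62, 80]   # 1-yr post-op scores (% correct)

# ttest_rel tests whether the mean within-subject difference is zero.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
```

The paired design matters here: an unpaired test (`stats.ttest_ind`) would discard the within-subject correlation and lose power when improvements are consistent across patients.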
4.
Article in English | MEDLINE | ID: mdl-37923370

ABSTRACT

BACKGROUND: Little is known about the persistence of antibodies beyond the first year following SARS-CoV-2 infection. We aimed to determine the proportion of individuals who maintain detectable levels of SARS-CoV-2 antibodies over an 18-month period following infection. METHODS: Population-based prospective study of 20 000 UK Biobank participants and their adult relatives recruited in May 2020. The proportion of SARS-CoV-2 cases testing positive for immunoglobulin G (IgG) antibodies against the spike protein (IgG-S) and the nucleocapsid protein (IgG-N) was calculated at varying intervals following infection. RESULTS: Overall, 20 195 participants were recruited. Their median age was 56 years (IQR 39-68), 56% were female, and 88% were of white ethnicity. The proportion of SARS-CoV-2 cases with IgG-S antibodies remained high (92%, 95% CI 90%-93%) at 6 months after infection. The proportion with IgG-N antibodies gradually decreased from 92% (95% CI 88%-95%) at 3 months to 72% (95% CI 70%-75%) at 18 months after infection. There was no strong evidence of heterogeneity in antibody persistence by age, sex, ethnicity or socioeconomic deprivation. CONCLUSION: This study adds to the limited evidence on the long-term persistence of antibodies following SARS-CoV-2 infection, with likely implications for waning immunity following infection and the use of IgG-N in population surveys.

5.
J Epidemiol Community Health ; 78(1): 3-10, 2023 12 08.
Article in English | MEDLINE | ID: mdl-37699665

ABSTRACT

BACKGROUND: The social determinants of ethnic disparities in risk of SARS-CoV-2 infection during the first wave of the pandemic in the UK remain unclear. METHODS: In May 2020, a total of 20 195 adults were recruited from the general population into the UK Biobank SARS-CoV-2 Serology Study. Between mid-May and mid-November 2020, participants provided monthly blood samples. At the end of the study, participants completed a questionnaire on social factors during different periods of the pandemic. Logistic regression yielded ORs for the association between ethnicity and SARS-CoV-2 immunoglobulin G antibodies (indicating prior infection) using blood samples collected in July 2020, immediately after the first wave. RESULTS: After exclusions, 14 571 participants (mean age 56; 58% women) returned a blood sample in July, of whom 997 (7%) had SARS-CoV-2 antibodies. Seropositivity was strongly related to ethnicity: compared with those of White ethnicity, ORs (adjusted for age and sex) for Black, South Asian, Chinese, Mixed and Other ethnic groups were 2.66 (95% CI 1.94-3.60), 1.66 (1.15-2.34), 0.99 (0.42-1.99), 1.42 (1.03-1.91) and 1.79 (1.27-2.47), respectively. Additional adjustment for social factors reduced the overall likelihood ratio statistics for ethnicity by two-thirds (67%; mostly from occupational factors and UK region of residence); more precise measurement of social factors may have further reduced the association. CONCLUSIONS: This study identifies social factors that are likely to account for much of the ethnic disparities in SARS-CoV-2 infection during the first wave in the UK, and highlights the particular relevance of occupation and residential region in the pathway between ethnicity and SARS-CoV-2 infection.


Subject(s)
COVID-19 , Adult , Humans , Female , Middle Aged , Male , SARS-CoV-2 , Social Factors , Biological Specimen Banks , Social Determinants of Health , Surveys and Questionnaires
6.
Am J Otolaryngol ; 44(5): 103951, 2023.
Article in English | MEDLINE | ID: mdl-37329694

ABSTRACT

OBJECTIVE: The Cochlear Osseointegrated Steady-State Implant Bone Anchored Hearing Device (Osia) is a surgically implanted titanium apparatus that uses a piezoelectric actuator under the skin to address conductive and mixed hearing loss as well as single-sided deafness. The purpose of this study is to examine the clinical, audiologic, and quality-of-life outcomes in patients who underwent Osia implantation. METHODS: This is a retrospective study of 30 adult patients (age 27-86) with conductive hearing loss (CHL), mixed hearing loss (MHL), or single-sided deafness (SSD) who were implanted with the Osia device from January 2020 to April 2023 at a single institution by the senior author. Preoperative speech score testing (CNC, AzBio in quiet, AzBio in noise) was performed in all subjects while unaided, while wearing conventional air conduction hearing aids, and while wearing a softband BAHA. These preoperative speech scores were then compared with post-implantation speech scores using paired t-test analysis to assess the degree of speech improvement. To assess quality of life after Osia implantation, each patient completed the Glasgow Benefit Inventory (GBI) survey, a series of 18 questions answered on a five-point Likert scale that addresses changes in general health status, physical health status, psychosocial health status, and social support after a medical intervention. RESULTS: CHL, MHL, and SSD patients had significant improvement in hearing and speech recognition scores after Osia implantation compared with preoperative unaided hearing: CNC (14 % vs 80 %, p < 0.0001), AzBio in Quiet (26 % vs 94 %, p < 0.0001), and AzBio in Noise (36 % vs 87 %, p = 0.0001). Preoperative speech scores using the softband BAHA were accurate predictors of post-implantation speech scores and can serve to determine surgical candidacy for the Osia. 
Post-implantation GBI surveys demonstrated significant improvement in quality of life, with patients scoring an average increase of +54.1 points in health satisfaction. CONCLUSION: Adult patients with CHL, MHL, and SSD can achieve significant improvement in speech recognition scores after implantation with the Osia device. This translates to improved quality of life, which was confirmed on the post-implantation GBI surveys.


Subject(s)
Deafness , Hearing Aids , Hearing Loss, Mixed Conductive-Sensorineural , Hearing Loss , Speech Perception , Adult , Humans , Middle Aged , Aged , Aged, 80 and over , Hearing Loss, Mixed Conductive-Sensorineural/surgery , Retrospective Studies , Quality of Life , Hearing , Deafness/surgery , Treatment Outcome
7.
Camb Prism Precis Med ; 1: e30, 2023.
Article in English | MEDLINE | ID: mdl-38550926

ABSTRACT

UK Biobank is an intensively characterised prospective cohort of 500,000 adults aged 40-69 years when recruited between 2006 and 2010. The study was established to enable researchers worldwide to undertake health-related research in the public interest. The existence of such a large, detailed prospective cohort with a high degree of participant engagement enabled its rapid repurposing for coronavirus disease-2019 (COVID-19) research. In response to the pandemic, the frequency of updates on hospitalisations and deaths among participants was immediately increased, and new data linkages were established to national severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing and primary care health records to facilitate research into the determinants of severe COVID-19. UK Biobank also instigated several sub-studies on COVID-19. In 2020, monthly blood samples were collected from approximately 20,000 individuals to investigate the distribution and determinants of SARS-CoV-2 infection, and to assess the persistence of antibodies following infection with another blood sample collected after 12 months. UK Biobank also performed repeat imaging of approximately 2,000 participants (half of whom had evidence of previous SARS-CoV-2 infection and half did not) to investigate the impact of the virus on changes in measures of internal organ structure and function. In addition, approximately 200,000 UK Biobank participants took part in a self-test SARS-CoV-2 antibody sub-study (between February and November 2021) to collect objective data on previous SARS-CoV-2 infection. These studies are enabling unique research into the genetic, lifestyle and environmental determinants of SARS-CoV-2 infection and severe COVID-19, as well as their long-term health effects. UK Biobank's contribution to the national and international response to the pandemic represents a case study for its broader value, now and in the future, to precision medicine research.

8.
J Dairy Sci ; 104(10): 10863-10878, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34389144

ABSTRACT

Dairy heifers developed in certified organic programs, especially those utilizing pasture-based management, have lower rates of gain than heifers raised in nonorganic confinement production systems in temperate climates, such as the Intermountain West region of the United States. This study investigates the effects of different forages in a rotational grazing system on the development of organically raised Jersey heifers. Over 3 years, 210 yearling Jersey heifers were randomly assigned to one of 9 treatments: a conventional confinement control in which animals were fed a total mixed ration, or one of 8 pasture treatments: Cache meadow bromegrass (Bromus riparius Rehmann), QuickDraw orchard grass (Dactylis glomerata L.), Amazon perennial ryegrass (Lolium perenne L.), or Fawn tall fescue (Schedonorus arundinaceus [Schreb.] Dumort.), and each individual grass interseeded with birdsfoot trefoil (Lotus corniculatus L.; BFT). Each treatment was replicated in 3 blocks per year over the 3-yr period, with each block containing a 0.4-ha pasture of each treatment. Every 35 d over a 105-d period, heifers were weighed and measured for hip height, and blood samples were collected to determine serum insulin-like growth factor-1 and blood urea nitrogen concentrations. Fecal egg counts were also assessed. Heifer body weight (BW), blood urea nitrogen, and insulin-like growth factor-1 concentrations were affected by treatment when analyzed over time. Heifers on grass-BFT pastures had increased BW compared with heifers on monoculture grass pastures. Heifers receiving a total mixed ration or grazing perennial ryegrass+BFT had increased BW gain over the 105-d period compared with heifers grazing tall fescue+BFT, orchard grass, perennial ryegrass, meadow bromegrass, or tall fescue. For each grass species, heifers grazing the +BFT pasture had greater ending BW and weight gain than heifers grazing the respective grass monoculture. 
Furthermore, weight gain for heifers on perennial ryegrass+BFT, meadow bromegrass+BFT, and orchard grass+BFT did not differ from that of heifers on a total mixed ration. Heifers grazing grass-BFT pastures had increased blood urea nitrogen compared with heifers grazing monoculture grass pastures. Heifer hip height and fecal egg counts were not affected by treatment. These results show that the addition of BFT to organic pasture improves growth of grazing replacement heifers. Economic analyses also demonstrate that interseeding grass pastures with BFT yields an increased economic return compared with grazing monoculture grass pastures. Grass pastures interseeded with BFT may be a sustainable option to achieve adequate growth of Jersey heifers raised organically on pasture in a temperate climate.


Subject(s)
Festuca , Lotus , Animal Feed/analysis , Animals , Cattle , Diet/veterinary , Weight Gain
9.
Transl Anim Sci ; 5(3): txab098, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34222826

ABSTRACT

Alfalfa is often included in the diets of beef animals; however, the nutrient content of alfalfa varies depending on the region in which it is grown, climate, soil, and many other factors. The leaf portion of alfalfa has a less variable nutrient composition than the stem portion of the plant. This variability in the alfalfa plant can make the development of total mixed rations of consistent nutrient content difficult. As such, the purpose of this study was to determine how the inclusion of fractionated alfalfa leaves and alfalfa stems impacts performance and carcass quality of finishing beef steers. Twenty-four steers were allocated to one of three treatments: a control group fed a typical finishing diet with alfalfa as the forage (CON; n = 8), a typical diet that replaced alfalfa with fractionated alfalfa leaf pellets and alfalfa stems (ProLEAF MAX™ + ProFiber Plus™; PLM+PFP; n = 8), or a typical diet that replaced alfalfa with alfalfa stems (PFP; n = 8) for 63 days. Steers were fed individually once daily and weighed every 14 days, and ultrasound images were collected every 28 days. At the end of the feeding trial, steers were harvested at a commercial facility and carcass data were obtained. Analysis of dry matter intake demonstrated that steers receiving the PFP and CON diets consumed more feed (P < 0.001) than steers consuming the PLM+PFP diet. Steers receiving the PLM+PFP diet gained less weight (P < 0.001) than steers receiving the other two dietary treatments. No differences (P > 0.10) in feed efficiency or carcass characteristics were observed. Steers receiving the PFP diet had improved (P = 0.016) cost of gain ($0.93 per kg) compared with steers receiving the PLM+PFP diet ($1.08 per kg). Overall, our findings demonstrate that the inclusion of PFP in place of alfalfa hay in a finishing diet has the potential to improve cost of gain without negatively affecting growth, performance, or carcass characteristics of finishing feedlot steers.

10.
Head Neck ; 43(5): 1695-1698, 2021 05.
Article in English | MEDLINE | ID: mdl-33506547

ABSTRACT

Significant dysphagia, pain, and risk of bleeding occur after transoral robotic surgery (TORS) radical tonsillectomy. We present a novel surgical technique utilizing robotically assisted submandibular gland transposition (SMGT) to reconstruct the radical tonsillar defect. A 48-year-old male with p16+ tonsillar squamous cell carcinoma underwent deep TORS radical tonsillectomy, contralateral tonsillectomy, ipsilateral neck dissection, and TORS-assisted reconstruction of the radical defect with ipsilateral SMGT. Postoperatively, the patient experienced minimal pain and was discharged on postoperative day (POD) 3 tolerating a soft diet. There were no episodes of postoperative bleeding. This procedure was performed in five other cases as well. Transoral robotic SMGT can be used successfully to repair deep TORS radical tonsillectomy defects and may theoretically reduce dysphagia, pain, and the risk of hemorrhage.


Subject(s)
Carcinoma, Squamous Cell , Robotic Surgical Procedures , Tonsillar Neoplasms , Carcinoma, Squamous Cell/surgery , Humans , Male , Middle Aged , Neck Dissection , Submandibular Gland/surgery , Tonsillar Neoplasms/surgery
11.
Case Rep Otolaryngol ; 2020: 8435140, 2020.
Article in English | MEDLINE | ID: mdl-32908755

ABSTRACT

Salivary gland choristoma is an extremely rare middle ear mass hypothesized to arise from second branchial arch developmental anomalies. We present a 14-year-old girl with Dandy-Walker syndrome and conductive hearing loss. Middle ear exploration revealed a large middle ear mass with absent incus and stapes and a displaced facial nerve. The mass was completely excised, with histological confirmation of salivary gland choristoma. Her hearing was improved with a bone-anchored hearing aid (BAHA). As facial nerve involvement is common, physicians should consider partial excision to avoid facial nerve palsy. Hearing restoration can be achieved with ossicular chain reconstruction (OCR) or a BAHA.

12.
Laryngoscope Investig Otolaryngol ; 5(4): 766-772, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32864450

ABSTRACT

OBJECTIVE: To determine sound levels resulting from aural suctioning of the external auditory canal. METHODS: Unweighted decibel (dB) and A-weighted decibel (dBA) sound pressure level measurements were recorded using a retrotympanic microphone in cadaveric human temporal bones. Sound measurements were made with common otologic suctions (size 3, 5, and 7 French) within the external ear canal at the tympanic membrane and at 5 and 10 mm from the tympanic membrane in the dry condition. In the wet condition, the ear canal was filled with fluid and completely suctioned clear to determine the sound effects of suctioning liquid from the ear canal. RESULTS: Sound levels generated by ear canal suctioning ranged from 68.3 to 97 dB and 62.6 to 95.1 dBA. Otologic suctions positioned closer to the tympanic membrane produced louder sound levels, but the difference was not statistically significant (P > .05). Larger-diameter suctions generated louder dB and dBA sound levels (P < .001), and the addition of liquid in the ear canal during suctioning also generated louder dB and dBA sound levels (P < .001). CONCLUSIONS: Smaller-caliber suctions and nonsuctioning techniques should be utilized for in-office aural toilet to reduce noise trauma and patient discomfort. LEVEL OF EVIDENCE: 5.

13.
OTO Open ; 4(3): 2473974X20948837, 2020.
Article in English | MEDLINE | ID: mdl-32923915
14.
Otol Neurotol ; 41(10): e1224-e1230, 2020 12.
Article in English | MEDLINE | ID: mdl-32810023

ABSTRACT

OBJECTIVE: Only a handful of case reports exist describing posttraumatic sutural diastasis in the calvarium and none report concurrent temporal bone involvement. We aim to describe diastasis along the temporal bone suture lines in the setting of temporal bone trauma and to identify clinical sequelae. STUDY DESIGN: Retrospective case review. SETTING: Tertiary Level 1 trauma center. PATIENTS: Forty-four patients aged 18 and younger who suffered a temporal bone fracture from 2013 to 2018 were identified. Diastasis and diastasis with displacement at the occipitomastoid, lambdoid, sphenosquamosal and petro-occipital sutures, and synchondroses were determined. MAIN OUTCOME MEASURES: The presence of temporal bone suture and synchondrosal diastasis following temporal bone trauma. Diastasis was defined as sutural separation of a distance greater than 1 mm in comparison to the contralateral side. RESULTS: Using our diastasis diagnostic criteria, diastasis occurred in 41.5% of temporal bone fractures. Transverse fracture types were significantly associated with diastasis (p ≤ 0.001). Lower Glasgow Coma Scale (GCS) and loss of consciousness (LOC) were associated with the presence of diastasis with displacement and diastasis (p = 0.034 and p = 0.042, respectively). Otic capsule violation was more common in fractures with diastasis but did not reach statistical significance. There were two cases of cerebrospinal fluid otorrhea and three deaths in cases that featured diastasis. CONCLUSION: Our findings indicate that diastasis is a positive predictor for higher disruptive force injuries and more severe outcomes and complications. Posttraumatic temporal bone suture diastasis may represent a separate clinico-pathologic entity in addition to the usual temporal bone fracture classification types.


Subject(s)
Fractures, Bone , Skull Fractures , Adolescent , Child , Cranial Sutures/diagnostic imaging , Cranial Sutures/surgery , Humans , Retrospective Studies , Skull Fractures/complications , Skull Fractures/diagnostic imaging , Skull Fractures/surgery , Sutures/adverse effects , Temporal Bone/diagnostic imaging , Temporal Bone/surgery
15.
J Dairy Sci ; 103(10): 8898-8909, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32713701

ABSTRACT

This study evaluated the effect of feeding a palmitic acid-enriched supplement on production responses and nitrogen metabolism of mid-lactating Holstein and Jersey cows. Eighty mid-lactating dairy cows, 40 Holstein and 40 Jersey, were used in a randomized complete block design with a split-plot arrangement; the main plot was breed and the subplot was fatty acid treatment. Cows within each breed were assigned to 1 of 2 treatments: (1) control diet with no fat supplement or (2) control diet plus a palmitic acid-enriched supplement dosed at 1.5% of diet dry matter (PA treatment). The treatment period was 6 wk with the final 3 wk used for data and sample collection. There were no treatment × breed interactions for the variables analyzed. Compared with control, PA treatment increased milk fat yield (1.36 vs. 1.26 kg/d) and tended to increase 3.5% fat-corrected milk (35.6 vs. 34.0 kg/d) and energy-corrected milk (35.7 vs. 34.1 kg/d). There was no effect of PA treatment on dry matter intake, milk yield, milk protein yield, milk lactose yield, body condition score, body weight (BW) change, nitrogen intake, and variables related to nitrogen metabolism and excretion. Compared with Holstein cows, Jersey cows had greater dry matter intake as a percent of BW (4.90 vs. 3.37% of BW) and lower milk production (29.6 vs. 32.7 kg/d) and milk lactose yield (1.58 vs. 1.42 kg/d), but tended to have greater milk fat yield (1.36 vs. 1.26 kg/d). There was a breed effect on BW change; Holstein cows gained 0.385 kg/d during the experiment, and Jersey cows gained 0.145 kg/d. Jersey cows had lower nitrogen intake (636 vs. 694 g/d), blood urea nitrogen (12.6 vs. 13.8 mg/dL), urine total nitrogen (125 vs. 145 g/d), and urine total nitrogen as a percent of nitrogen intake (19.5 vs. 21.1%). Overall, feeding a palmitic acid-enriched supplement increased milk fat yield as well as dry matter and fiber digestibility in both Holstein and Jersey cows. 
The PA treatment did not have any major effects on nitrogen metabolism in either Holstein or Jersey cows. In addition, our results indicated that Jersey cows had lower urinary nitrogen excretion (g/d) than Holstein cows.


Subject(s)
Cattle/metabolism , Lactation/drug effects , Nitrogen/metabolism , Palmitic Acid/administration & dosage , Animal Feed/analysis , Animals , Diet/veterinary , Dietary Fiber/administration & dosage , Dietary Supplements , Digestion/drug effects , Eating/drug effects , Female , Lactation/physiology , Lactose/analysis , Milk/chemistry , Milk/drug effects , Nitrogen/urine , Species Specificity
16.
Semin Thorac Cardiovasc Surg ; 32(4): 674-680, 2020.
Article in English | MEDLINE | ID: mdl-32105786

ABSTRACT

Multiple risk factors for operative mortality in the setting of acute type A aortic dissection (ATAAD) have been described. Recently, the combination of severe acidosis and malperfusion was found to significantly impact operative mortality following surgery for ATAAD, and a treatment algorithm was proposed. The purpose of this study is to validate these findings at our institution. A retrospective chart review was performed for patients who underwent ATAAD repair between February 1997 and January 2018. Preoperative nadir pH, bicarbonate, base deficit, organ malperfusion, and other relevant parameters were collected. Multivariable logistic regression was performed to evaluate operative mortality. A total of 298 patients underwent ATAAD repair. The highest operative mortality (18/49; 37%) was noted in patients with severe acidosis (base deficit ≤ -10). There were 96 patients (32%) with malperfusion, and in patients with abdominal malperfusion this trend was even more pronounced. Multivariable logistic regression showed that severe acidosis was associated with higher operative mortality (odds ratio 13.9, P = 0.001). The presence of diabetes and advanced age were also associated with higher operative mortality. These findings validate the previous report that severe acidosis is a strong predictor of operative mortality and that risk increases with associated organ malperfusion. This supports the suggestion that base deficit, which is easily measured at the bedside, should be used clinically to predict operative mortality and should be collected in aortic dissection databases.


Subject(s)
Acidosis/mortality , Aortic Aneurysm/surgery , Aortic Dissection/surgery , Decision Support Techniques , Vascular Surgical Procedures/mortality , Acid-Base Equilibrium , Acidosis/diagnosis , Acidosis/physiopathology , Acute Disease , Adolescent , Adult , Aged , Aged, 80 and over , Algorithms , Aortic Dissection/diagnostic imaging , Aortic Dissection/mortality , Aortic Dissection/physiopathology , Aortic Aneurysm/diagnostic imaging , Aortic Aneurysm/mortality , Aortic Aneurysm/physiopathology , Female , Hemodynamics , Hospital Mortality , Humans , Male , Middle Aged , Predictive Value of Tests , Reproducibility of Results , Retrospective Studies , Risk Assessment , Risk Factors , Severity of Illness Index , Treatment Outcome , Vascular Surgical Procedures/adverse effects , Young Adult
17.
JPEN J Parenter Enteral Nutr ; 44(8): 1461-1467, 2020 11.
Article in English | MEDLINE | ID: mdl-32010992

ABSTRACT

BACKGROUND: Vasoactive and inotropic support (VIS) may predispose cardiac surgery patients to ischemic gut complications (IGCx). The purpose of this study was to describe the effect of VIS on the manner in which we deliver tube feeds (TFs) and to determine its relationship with IGCx in cardiac surgery patients. METHODS: We reviewed cardiac surgery patients at a single institution and examined the effect of VIS (none, low, medium, high) on TF administration and evaluated IGCx. RESULTS: Of 3088 cardiac surgery patients, 249 (8%) required TFs, comprising 2151 total TF-days. Increasing VIS was associated with decreased amounts of TF administered per day (P = .001) and an increase in time that TF was held per day (P < .001). High VIS was associated with less intact and more semi-elemental/elemental formula use (P < .001) and increased use of the gastric route (P < .001). Of all cardiac surgery patients, 11 of 3125 suffered IGCx (0.4%), with a mortality of 73%. Of the 3 patients receiving TF, 2 IGCx were focal and consistent with acute embolus, whereas one was diffuse in a patient on high VIS with an intra-aortic balloon pump. Of the 8 IGCx in patients not receiving TF, 5 were focal, whereas 3 were diffuse and not embolic (P = .21). CONCLUSIONS: Despite 32% of TF-days occurring on moderate to high VIS, non-embolic IGCx were not increased compared with patients not receiving TF. As delivered at this institution, TF even in patients requiring moderate to high inotropic and pressor support was not associated with an increase in attributable IGCx.


Subject(s)
Cardiac Surgical Procedures , Cardiovascular Agents , Gastrointestinal Microbiome , Cardiac Surgical Procedures/adverse effects , Enteral Nutrition , Humans
18.
J Thorac Cardiovasc Surg ; 160(5): 1166-1175, 2020 Nov.
Article in English | MEDLINE | ID: mdl-31627951

ABSTRACT

OBJECTIVES: We sought to develop strategies for management of the aortic arch in patients with Loeys-Dietz syndrome (LDS) through a review of our clinical experience with these patients and a comparison with our experience in patients with Marfan syndrome (MFS). METHODS: We reviewed hospital and follow-up records of 79 patients with LDS and compared them with 256 patients with MFS who served as reference controls. RESULTS: In the LDS group, 16% of patients presented initially with acute aortic dissection (AAD) (67% type A, 33% type B) or developed AAD during follow-up, compared with 10% of patients with MFS (95% type A, 5% type B). There was no difference between patients with LDS and those with MFS in the need for subsequent arch interventions after aortic root surgery (46% vs 50%, P = 1.0). Among the patients who never had AAD, the need for arch repair at initial root surgery was greater in patients with LDS (5% vs 0.4%, P = .04), as was the need for any subsequent aortic surgery (12% vs 1.3%, P = .0004). Late mortality in patients with LDS after arch repair was greater than in those patients who had no arch intervention (33% vs 6%, P = .007). CONCLUSIONS: In the absence of dissection, patients with LDS have a greater rate of arch intervention after root surgery than patients with MFS. After a dissection, arch reintervention rates are similar in the 2 groups. Arch intervention portends greater late mortality in LDS.


Subject(s)
Aorta, Thoracic/surgery , Aortic Aneurysm/surgery , Loeys-Dietz Syndrome , Adolescent , Adult , Child , Child, Preschool , Humans , Loeys-Dietz Syndrome/epidemiology , Loeys-Dietz Syndrome/mortality , Loeys-Dietz Syndrome/surgery , Marfan Syndrome/epidemiology , Marfan Syndrome/mortality , Marfan Syndrome/surgery , Middle Aged , Retrospective Studies , Young Adult
19.
Neurobiol Dis ; 124: 152-162, 2019 04.
Article in English | MEDLINE | ID: mdl-30448285

ABSTRACT

Loss-of-function mutations in progranulin (GRN), most of which cause progranulin haploinsufficiency, are a major autosomal dominant cause of frontotemporal dementia (FTD). Individuals with loss-of-function mutations on both GRN alleles develop neuronal ceroid lipofuscinosis (NCL), a lysosomal storage disorder. Progranulin is a secreted glycoprotein expressed by a variety of cell types throughout the body, including neurons and microglia in the brain. Understanding the relative importance of neuronal and microglial progranulin insufficiency in FTD pathogenesis may guide development of therapies. In this study, we used mouse models to investigate the role of neuronal and microglial progranulin insufficiency in the development of FTD-like pathology and behavioral deficits. Grn-/- mice model aspects of FTD and NCL, developing lipofuscinosis and gliosis throughout the brain, as well as deficits in social behavior. We have previously shown that selective depletion of neuronal progranulin disrupts social behavior, but does not produce lipofuscinosis or gliosis. We hypothesized that reduction of microglial progranulin would induce lipofuscinosis and gliosis, and exacerbate behavioral deficits, in neuronal progranulin-deficient mice. To test this hypothesis, we crossed Grnfl/fl mice with mice expressing Cre transgenes targeting neurons (CaMKII-Cre) and myeloid cells/microglia (LysM-Cre). CaMKII-Cre, which is expressed in forebrain excitatory neurons, reduced cortical progranulin protein levels by around 50%. LysM-Cre strongly reduced progranulin immunolabeling in many microglia, but did not reduce total brain progranulin levels, suggesting that, at least under resting conditions, microglia contribute less than neurons to overall brain progranulin levels. 
Mice with depletion of both neuronal and microglial progranulin failed to develop lipofuscinosis or gliosis, suggesting that progranulin from extracellular sources prevented pathology in cells targeted by the Cre transgenes. Reduction of microglial progranulin also did not exacerbate the social deficits of neuronal progranulin-insufficient mice. These results do not support the hypothesis of synergistic effects between progranulin-deficient neurons and microglia. Nearly complete progranulin deficiency appears to be required to induce lipofuscinosis and gliosis in mice, while partial progranulin insufficiency is sufficient to produce behavioral deficits.


Subject(s)
Brain/metabolism , Brain/pathology , Microglia/metabolism , Neurons/metabolism , Progranulins/metabolism , Animals , Behavior, Animal , Female , Frontotemporal Dementia , Gliosis/metabolism , Lipofuscin/metabolism , Male , Mice, Inbred C57BL , Mice, Knockout , Progranulins/genetics , Social Behavior
20.
N Engl J Med ; 379(16): 1529-1539, 2018 10 18.
Article in English | MEDLINE | ID: mdl-30146931

ABSTRACT

BACKGROUND: Diabetes mellitus is associated with an increased risk of cardiovascular events. Aspirin use reduces the risk of occlusive vascular events but increases the risk of bleeding; the balance of benefits and hazards for the prevention of first cardiovascular events in patients with diabetes is unclear. METHODS: We randomly assigned adults who had diabetes but no evident cardiovascular disease to receive aspirin at a dose of 100 mg daily or matching placebo. The primary efficacy outcome was the first serious vascular event (i.e., myocardial infarction, stroke or transient ischemic attack, or death from any vascular cause, excluding any confirmed intracranial hemorrhage). The primary safety outcome was the first major bleeding event (i.e., intracranial hemorrhage, sight-threatening bleeding event in the eye, gastrointestinal bleeding, or other serious bleeding). Secondary outcomes included gastrointestinal tract cancer. RESULTS: A total of 15,480 participants underwent randomization. During a mean follow-up of 7.4 years, serious vascular events occurred in a significantly lower percentage of participants in the aspirin group than in the placebo group (658 participants [8.5%] vs. 743 [9.6%]; rate ratio, 0.88; 95% confidence interval [CI], 0.79 to 0.97; P=0.01). In contrast, major bleeding events occurred in 314 participants (4.1%) in the aspirin group, as compared with 245 (3.2%) in the placebo group (rate ratio, 1.29; 95% CI, 1.09 to 1.52; P=0.003), with most of the excess being gastrointestinal bleeding and other extracranial bleeding. There was no significant difference between the aspirin group and the placebo group in the incidence of gastrointestinal tract cancer (157 participants [2.0%] and 158 [2.0%], respectively) or all cancers (897 [11.6%] and 887 [11.5%]); long-term follow-up for these outcomes is planned. 
CONCLUSIONS: Aspirin use prevented serious vascular events in persons who had diabetes and no evident cardiovascular disease at trial entry, but it also caused major bleeding events. The absolute benefits were largely counterbalanced by the bleeding hazard. (Funded by the British Heart Foundation and others; ASCEND Current Controlled Trials number, ISRCTN60635500; ClinicalTrials.gov number, NCT00135226.)
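As a quick arithmetic check (not part of the trial report), the headline percentages can be recomputed from the counts above, assuming equal 1:1 randomization of the 15,480 participants. Note that the crude ratio of event counts (~0.89) only approximates the published rate ratio of 0.88, which was estimated from person-time at risk.

```python
# Hypothetical sketch: recomputing ASCEND's headline percentages from the
# reported counts. Assumes equal 1:1 randomization (7,740 per arm), which
# the abstract does not state explicitly.
n_per_arm = 15480 // 2

aspirin_events, placebo_events = 658, 743       # serious vascular events
aspirin_pct = 100 * aspirin_events / n_per_arm  # ~8.5%
placebo_pct = 100 * placebo_events / n_per_arm  # ~9.6%

# Crude ratio of event counts; the published rate ratio (0.88) is based on
# person-time, so the two agree only approximately.
crude_ratio = aspirin_events / placebo_events

print(f"aspirin {aspirin_pct:.1f}% vs placebo {placebo_pct:.1f}%, "
      f"crude ratio {crude_ratio:.2f}")
```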


Subject(s)
Aspirin/therapeutic use , Cardiovascular Diseases/prevention & control , Diabetes Complications/prevention & control , Diabetes Mellitus/drug therapy , Hemorrhage/chemically induced , Platelet Aggregation Inhibitors/therapeutic use , Primary Prevention , Aged , Aged, 80 and over , Aspirin/adverse effects , Cardiovascular Diseases/epidemiology , Diabetes Complications/epidemiology , Female , Follow-Up Studies , Hemorrhage/epidemiology , Humans , Male , Middle Aged , Platelet Aggregation Inhibitors/adverse effects , Poisson Distribution , Risk Factors