ABSTRACT
BACKGROUND: Natural service breeding is common in U.S. cow-calf operations. Diseases impacting bull reproductive performance have significant economic consequences for producers. Anaplasmosis may be an underappreciated cause of poor reproductive performance in bulls. The primary systemic effects of bovine anaplasmosis, including anemia, fever, and weight loss, can all result in unsatisfactory reproductive performance. The objective of this pilot study was to evaluate breeding soundness examination (BSE) outcomes and clinical changes in bulls during and upon resolution of clinical anaplasmosis. Anaplasma marginale-challenged bulls were observed for 16 weeks for clinical disease and infection progression, and changes in breeding soundness were compared to uninfected control bulls. RESULTS: All A. marginale-challenged bulls were PCR-positive, seropositive, and showed clinical signs by 3, 17, and 24 days post-challenge, respectively. Clinical signs of anaplasmosis included pallor, icterus, fever (≥ 40.2 °C), and weight loss. Acute anemia was observed in all challenged bulls, with PCV nadirs ≤ 18% and peak percentages of parasitized erythrocytes ≥ 50%. Decreased scrotal circumference and poor semen quality (e.g., an increased percentage of abnormal spermatozoa and decreased progressively motile sperm) were first observed within days of the onset of clinical anaplasmosis signs and persisted for weeks beyond disease resolution. Control bulls remained negative for A. marginale. CONCLUSION: This pilot study demonstrates that clinical anaplasmosis reduces breeding soundness in beef bulls. Anaplasmosis should be considered as a differential for bulls with decreased semen quality, especially within endemic areas. A retest window of 90 days or greater is recommended for bulls with unsatisfactory breeding potential that have recently recovered from clinical anaplasmosis.
Subject(s)
Anaplasmosis, Cattle Diseases, Female, Cattle, Animals, Male, Semen Analysis/veterinary, Pilot Projects, Semen, Scrotum, Weight Loss

ABSTRACT
Formed at the confluence of marine and fresh waters, estuaries experience both the seaside pressures of rising sea levels and increasing storm severity, and watershed and precipitation changes that are shifting the quality and quantity of freshwater and sediments delivered from upstream sources. Boating, shoreline hardening, harvesting pressure, and other signatures of human activity are also increasing as populations swell in coastal regions. Given this shifting landscape of pressures, the factors most threatening to estuary health and stability are often uncertain. To identify the greatest contemporary threats to coastal wetlands and oyster reefs across the southeastern United States (Mississippi to North Carolina), we summarized recent population growth and land-cover change and surveyed estuarine management and science experts. From 1996 to 2019, human population growth in the region varied from a 17% decrease to a 171% increase (mean = +43%), with only 5 of the 72 SE US counties losing population and nearly half growing by more than 40%. Individual counties experienced between 999 and 19,253 km² of new development (mean: 5,725 km²), with 1-5% (mean: 2.6%) of undeveloped lands undergoing development over this period across the region. Correspondingly, our survey of 169 coastal experts highlighted development, shoreline hardening, and upstream modifications to freshwater flow as the most important local threats facing coastal wetlands. Similarly, experts identified development, upstream modifications to freshwater flow, and overharvesting as the most important local threats to oyster reefs. With regard to global threats, experts categorized sea level rise as the most pressing to wetlands, and acidification and precipitation changes as the most pressing to oyster reefs.
Survey respondents further identified that more research, driven by collaboration among scientists, engineers, industry professionals, and managers, is needed to assess how precipitation changes, shoreline hardening, and sea level rise are affecting coastal ecosystem stability and function. Due to the profound role of humans in shaping estuarine health, this work highlights that engaging property owners, recreators, and municipalities to implement strategies to improve estuarine health will be vital for sustaining coastal systems in the face of global change.
Subject(s)
Ostreidae, Wetlands, Animals, Ecosystem, Estuaries, Humans, North Carolina

ABSTRACT
Erosion, sediment production, and routing on a tectonically active continental margin reflect both tectonic and climatic processes; partitioning the relative importance of these processes remains controversial. The Gulf of Alaska contains a preserved sedimentary record of the Yakutat Terrane collision with North America. Because tectonic convergence in the coastal St. Elias orogen has been roughly constant for 6 My, variations in its eroded sediments preserved in the offshore Surveyor Fan constrain a budget of tectonic material influx, erosion, and sediment output. Seismically imaged sediment volumes calibrated with chronologies derived from Integrated Ocean Drilling Program boreholes show that erosion accelerated in response to Northern Hemisphere glacial intensification (~2.7 Ma) and that the 900-km-long Surveyor Channel inception appears to correlate with this event. However, tectonic influx exceeded integrated sediment efflux over the interval 2.8-1.2 Ma. Volumetric erosion accelerated following the onset of quasi-periodic (~100-ky) glacial cycles in the mid-Pleistocene climate transition (1.2-0.7 Ma). Since then, erosion and transport of material out of the orogen have outpaced tectonic influx by 50-80%. Such a rapid net mass loss explains apparent increases in exhumation rates inferred onshore from exposure dates and mapped out-of-sequence fault patterns. The 1.2-My mass budget imbalance must relax back toward equilibrium in balance with tectonic influx over the timescale of orogenic wedge response (millions of years). The St. Elias Range provides a key example of how active orogenic systems respond to transient mass fluxes, and of the possible influence of climate-driven erosive processes that diverge from equilibrium on the million-year scale.
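The 50-80% imbalance between sediment efflux and tectonic influx described above is simple flux arithmetic. A minimal sketch, using illustrative placeholder rates rather than any measured values from the study:

```python
# Hypothetical flux-balance sketch of an orogenic mass budget.
# All rates are illustrative placeholders, not measured St. Elias values.

def mass_budget(influx_rate_km3_per_my, efflux_rate_km3_per_my, duration_my):
    """Net change in stored orogenic mass (km^3) over an interval.

    Negative result means more material leaves the orogen than enters it.
    """
    return (influx_rate_km3_per_my - efflux_rate_km3_per_my) * duration_my

# Efflux outpacing influx by 50% over a 1.2-My interval (placeholder rate):
influx = 1000.0              # km^3/My, hypothetical
efflux = 1.5 * influx        # 50% excess, the low end of the abstract's range
net = mass_budget(influx, efflux, 1.2)
print(net)                   # negative => net mass loss from the wedge
```

A sustained negative budget of this kind is what must relax back toward equilibrium over the orogenic wedge response timescale noted above.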
ABSTRACT
Our objective was to determine if the addition of a concentrated human recombinant transforming growth factor beta-1 (TGF) to bovine semen at the time of AI would result in increased risk of pregnancy in beef and dairy cows. Suckled beef cows (n = 1,132) in 11 herds across 2 states and lactating dairy cows (n = 2,208) in one organic-certified herd were enrolled. Beef cows received fixed-time AI (FTAI) following a 7 d CO-Synch + controlled internal drug release estrous synchronization protocol. Dairy cows were inseminated following observation of natural estrus expression. Cows received either no treatment as a control (CON) or 10 ng of TGF in 10 µL added through the cut-end of a thawed straw of semen immediately prior to AI. At the time of FTAI of beef cows, the mean ± SD age was 5.0 ± 2.4 yr, BCS was 5.3 ± 0.7, and days postpartum was 78.2 ± 15.5 d. The overall pregnancy risk (PR) in beef cows was 55.2% to AI and 90.5% season-long. PR in beef cows was not affected (P = 0.27) by the addition of TGF (53.1% vs. 58.1%). Furthermore, there was no difference (P = 0.88) for season-long PR in beef cows that received TGF (91.2% vs. 91.5%). At the time of insemination of dairy cows, the mean ± SD lactation was 3.0 ± 1.3 lactations, BCS was 2.9 ± 0.3, days in milk was 115.6 ± 56.6 d, and cows had received 2.4 ± 1.5 inseminations/cow. The overall pregnancy risk to AI in dairy cows was 23.1%. PR to AI for dairy cows was not affected (P = 0.32) by addition of TGF (22.0% vs. 23.8%). In conclusion, PR to AI was not affected by addition of TGF to thawed semen immediately prior to AI in beef or dairy cows.
Seminal plasma is the fluid portion of the ejaculate that is routinely removed or significantly diluted when preparing semen for artificial insemination. Seminal plasma has been shown to elicit changes in the tissues of the uterus at the time of insemination that improve pregnancy outcomes in rodents and swine. Here, we supplemented a seminal plasma molecule, transforming growth factor beta-1, to semen at the time of artificial insemination in an attempt to improve pregnancy rates in beef and dairy cattle. In total, 3,340 cows were inseminated; half received no treatment, and the other half received a supplementation of transforming growth factor beta-1. We found that supplementing transforming growth factor beta-1 did not improve the pregnancy rate in beef or dairy cattle. We conclude that the pregnancy rate was not affected by the supplementation of transforming growth factor beta-1 to semen at the time of insemination. Future studies should consider the effects of transforming growth factor beta-1 on other pregnancy outcomes, such as calving rate, birth weight, and postnatal growth.
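The pregnancy-risk contrasts reported above came from models accounting for herd and cow-level covariates. As a rough illustration only, the kind of unadjusted two-proportion comparison underlying such a contrast can be sketched in pure Python; the counts below are hypothetical values chosen merely to approximate the reported 53.1% vs. 58.1%, and a naive test like this will not reproduce the study's adjusted P-value:

```python
import math

def two_proportion_z(x_a, n_a, x_b, n_b):
    """Naive two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = x_a / n_a, x_b / n_b
    pooled = (x_a + x_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts approximating 53.1% (CON-like) vs. 58.1% (TGF-like):
z, p = two_proportion_z(300, 565, 329, 567)
```

The gap between this unadjusted result and a model-based P-value is one reason mixed models are preferred for multi-herd field trials.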
Subject(s)
Artificial Insemination, Semen, Transforming Growth Factor beta1, Animals, Cattle/physiology, Artificial Insemination/veterinary, Female, Pregnancy, Transforming Growth Factor beta1/metabolism, Male, Estrus Synchronization, Lactation

ABSTRACT
OBJECTIVE: The objectives of this study were to evaluate the prevalence of chronic Anaplasma marginale infection in beef bulls from eastern Kansas and compare breeding soundness parameters between A. marginale-infected and uninfected bulls. We hypothesized that bulls with chronic anaplasmosis would have inferior breeding soundness examination (BSE) outcomes, either as a result of persistent A. marginale infection or as a consequence of initial clinical disease, compared to uninfected bulls. ANIMALS: 535 client-owned beef bulls from eastern Kansas undergoing routine BSE. METHODS: Complete BSEs were conducted by participating veterinarians according to the second edition of the Society for Theriogenology Manual for Breeding Soundness Examination of Bulls. Blood samples were collected for PCV determination and analysis of A. marginale infection status via quantitative PCR and cELISA. Logistic and linear regression methods were used to evaluate factors associated with A. marginale infection status and BSE parameters. RESULTS: Prevalence of chronic A. marginale infection was 46% (245/535) among bulls. Unsatisfactory BSE outcome was not statistically associated with chronic anaplasmosis in this study population, although more bulls with chronic anaplasmosis had unsatisfactory BSE outcomes (15.0 ± 2.4% vs 12.0 ± 2.2%). CLINICAL RELEVANCE: Chronic anaplasmosis is prevalent among eastern Kansas breeding bulls; however, no negative association between chronic anaplasmosis and breeding soundness at the time of BSE was observed.
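A point prevalence like the 46% (245/535) reported above is usually accompanied by a sampling interval. As a minimal sketch, a standard Wilson score interval can be computed in pure Python; the interval bounds produced here are computed by this sketch, not values reported in the study:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion.

    z = 1.96 gives an approximate 95% interval.
    """
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Prevalence of chronic infection reported in the abstract: 245 of 535 bulls
lo, hi = wilson_ci(245, 535)
```

The Wilson interval is preferred over the simple Wald interval for proportions near 0 or 1 and for modest sample sizes, though at n = 535 the two are very close.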
Subject(s)
Anaplasmosis, Cattle Diseases, Humans, Male, Cattle, Animals, Scrotum, Anaplasmosis/epidemiology, Kansas/epidemiology, Breeding, Physical Examination, Cattle Diseases/epidemiology

ABSTRACT
The objective of this study was to compare the influence of two low-stress weaning methods with conventional weaning on post-weaning performance and carcass characteristics of beef steers. Single-sourced steer calves (n = 89) were stratified by body weight (BW) and dam age into three groups in a completely randomized design (n = 29 or 30 steers/treatment): ABRUPT (calves isolated from dams on the day of weaning), FENCE (calves separated from dams via a fence for 7 d prior to complete weaning), and NOSE (nose-flap inserted and calves remained with dams for 7 d prior to complete weaning). At day +7 post-weaning, calves were transported to a commercial feedlot where they received standard step-up and finishing rations typical for a Northern Plains feedlot. BWs were recorded on study days -7 (PreTreat), 0 (Weaning), 7 (PostWean), 26 (Receiving), 175 (Ultrasound), and 238 or 268 (Final), and average daily gains (ADG) were calculated for each time period. Blood samples were collected via coccygeal venipuncture on days -7 (PreTreat), 0 (Weaning), and +7 (PostWean) from a subsample of calves (n = 10 per treatment) and analyzed for haptoglobin (acute-phase stress protein) concentrations using a bovine haptoglobin ELISA kit. On day 175, ultrasound fat thickness and intramuscular fat were determined and utilized to project marketing dates when steers reached 1.27 cm of backfat (day 238 or 268). Carcass measurements were recorded at the time of harvest. Weaning method interacted (P < 0.01) with time period for ADG and BW. Calf ADG was greater (P < 0.01) in the NOSE treatment during PreTreat to Weaning than in ABRUPT or FENCE. In the Weaning to PostWean period, FENCE calves had greater (P < 0.01) ADG than ABRUPT and NOSE. During the PostWean to Receiving period, ADG was greater (P < 0.04) for ABRUPT compared to FENCE and NOSE. Calf ADG was similar (P > 0.05) among treatments for the remainder of the feeding period.
Calf BW did not differ among treatments (P > 0.05) at any time of weighing. Haptoglobin was undetectable in all samples except two samples collected on day -7. Weaning method did not influence (P > 0.05) carcass measurements. Collectively, these data suggest low-stress weaning methods do not significantly improve post-weaning growth performance or carcass characteristics compared with conventional methods, despite minor, short-term alterations in ADG during the weaning period.
ABSTRACT
The objective of this study was to compare the influence of beef production systems using additive combinations of growth-promotant technologies on meat quality. Steer calves (n = 120) were assigned to 1 of 4 treatments: 1) no technology (NT; control), 2) antibiotic treated (ANT; NT plus therapeutic antibiotics, monensin, and tylosin), 3) implant treated (IMP; ANT plus a series of three implants), and 4) beta-agonist treated (BA; IMP plus ractopamine-HCl). Muscle biopsy samples from the longissimus lumborum were extracted from a subset (n = 4 per treatment) of steers to evaluate expression of calpain-1, calpain-2, and calpastatin using real-time RT-PCR. Following carcass chilling, objective color (L*, a*, and b*) was evaluated. The right strip loin was removed from each carcass, portioned into 2.54-cm steaks, and designated to 7, 14, or 21 d postmortem aging periods for analysis of cook loss and Warner-Bratzler shear force (WBSF). The anterior face of each strip loin was used for analysis of crude fat and moisture. Treatment influenced (P < 0.001) L*, a*, and b*. The NT and IMP treatments had greater (P < 0.01) L* values, ANT was intermediate, and BA had the lowest (P < 0.01) L* values. The NT and IMP treatments had higher (P < 0.01) a* and b* values compared with ANT, which were higher (P < 0.01) than BA. Steaks from implanted steers (IMP and BA) tended (P ≤ 0.067) to exhibit higher a* and b* than steaks from nonimplanted steers. Cattle in the NT and ANT treatments produced steaks with increased (P < 0.01) crude fat percentage compared with the IMP and BA treatments, which were similar (P > 0.05). Percent moisture of NT steaks was lower (P < 0.01) than all other treatments, ANT was intermediate, and IMP and BA were similar (P > 0.05) and had the highest (P < 0.01) moisture content. Cook loss tended to be greater (P = 0.088) for implanted steers (IMP and BA) compared to nonimplanted steers (NT and ANT).
Steaks from NT and ANT treatments were more tender (P < 0.05) than IMP and BA, which were similar (P > 0.05). Thus, WBSF was lower (P < 0.001) in nonimplanted than implanted steaks. Expression of calpastatin was increased (P ≤ 0.025) in ANT and BA treatments, and there was a tendency for expression of calpain-2 to be increased (P = 0.081) in ANT compared to NT. These results suggest that production systems with limited use of growth-promoting technology produced strip loins with more crude fat, less moisture and cook loss, and improved tenderness.
ABSTRACT
Silk-amyloid-mussel foot protein (SAM) hydrogels made from recombinant fusion proteins containing β-amyloid peptide, spider silk domain, and mussel foot protein (Mfp) are attractive bioadhesives as they display a unique combination of tunability, biocompatibility, bioabsorbability, strong cohesion, and underwater adhesion to a wide range of biological surfaces. To design tunable SAM hydrogels for tailored surgical repair applications, an understanding of the relationships between protein sequence and hydrogel properties is imperative. Here, we fabricated SAM hydrogels using fusion proteins of varying lengths of silk-amyloid repeats and Mfps to characterize their structure and properties. We found that increasing silk-amyloid repeats enhanced the hydrogel's β-sheet content (r = 0.74), leading to higher cohesive strength and toughness. Additionally, increasing the Mfp length beyond the half-length of the full Mfp sequence (1/2 Mfp) decreased the β-sheet content (r = -0.47), but increased hydrogel surface adhesion. Among different variants, the hydrogel made of 16xKLV-2Mfp displayed a high ultimate strength of 3.0 ± 0.3 MPa, an ultimate strain of 664 ± 119%, and an attractive underwater adhesivity of 416 ± 20 kPa to porcine skin. Collectively, the sequence-structure-property relationships learned from this study will be useful to guide the design of future protein adhesives with tunable characteristics for tailored surgical applications.
ABSTRACT
Performance of cows and calves during 63-d early or conventional weaning periods was evaluated. Spring-calving beef cows (n = 167) of similar age, body condition score (BCS), and body weight (BW = 599 ± 54.5 kg), and their calves (initial BW = 204 ± 26.7 kg; 153 ± 15 d of age) were assigned randomly to 1 of 4 weaning treatments: weaning at 153 d of age followed by 56 d of limit feeding in confinement (E-D), confinement of cow and calf for a 56-d period of limit feeding followed by weaning at 209 d of age (C-D), weaning at 153 d of age followed by a 56-d grazing period (E-P), and a 56-d grazing period for both cow and calf followed by weaning at 209 d of age (C-P). Cows and calves assigned to pasture treatments grazed native range pastures without supplement. Cows and calves assigned to drylot treatments were fed complete diets. Calves assigned to E-D were fed a concentrate-based diet at 2.5% of BW, whereas cows assigned to E-D were fed a forage-based diet at 1.6% of BW. Cows assigned to C-D were offered the diet fed to E-D cows at 2.0% of BW. Cows and calves assigned to all treatments were limit-fed common diets for 7 d at the end of the study to equalize gut fill. Calf average daily gain (ADG) was influenced by diet and weaning treatments (diet × weaning, P ≤ 0.03). In general, calves managed in confinement and fed concentrate-based diets (i.e., E-D and C-D) had greater ADG than non-supplemented calves maintained on pasture (i.e., E-P and C-P). Cow BW and BCS change (days 0 to 63) were influenced by both diet and weaning status (P ≤ 0.05). Non-lactating cows maintained on pasture had lesser BW loss than cows on other treatments, whereas non-lactating cows fed in confinement had lesser BCS on day 63 and greater BCS loss from days 0 to 63 than cows on other treatments. Conversely, rump-fat depth on day 63 was greater (P < 0.01) for non-lactating cows maintained on pasture than for lactating cows in either pasture or drylot environments.
Similarly, change in rump-fat depth was greatest (diet × weaning, P < 0.01) for non-lactating cows on pasture and least for lactating cows in either pasture or drylot environments. Results were interpreted to indicate that early-weaning spared cow BW and rump fat compared to weaning at conventional ages. Performance of cows appeared to be similar when limit-fed under drylot conditions or maintained in a pasture environment. Conversely, calf performance was generally greater in confinement than on pasture.
ABSTRACT
Health and performance of early-weaned steers were evaluated during a 56-d weaning period, a 56-d feedlot receiving period, and a 165-d feedlot finishing period. Steers (n = 239; 128 ± 14 d of age) were assigned to a 56-d weaning treatment: drylot weaning (D) or pasture weaning (P). Pasture steers grazed mature, native tallgrass range (89.2% dry matter [DM], 9.08% crude protein [CP]) without supplementation. A concentrate-based diet (18.7% CP and 1.15 Mcal NEg/kg) was fed to D steers. Later, all steers were transitioned to a receiving and then a finishing diet and fed to a common endpoint. Body weight (BW) after weaning and average daily gain (ADG) during weaning were greater (P < 0.01) for D than for P. Incidence of undifferentiated fever during weaning tended to be greater (P = 0.10) for D steers than for P steers. Conversely, incidence of keratoconjunctivitis was greater (P < 0.01) for P than for D during weaning (40.2% vs. 0%, respectively) and receiving (P < 0.01; 14.3% vs. 1.6%, respectively). At the start and end of receiving, D steers had greater (P < 0.01) BW compared with P steers. Drylot steers had greater (P = 0.03) ADG compared with P steers during receiving. Pasture steers tended to have greater dry matter intake (DMI; P = 0.09) during receiving than D steers. In contrast, gain:feed (G:F) was improved (P < 0.01) for P steers compared with D steers during receiving. Incidence of undifferentiated fever was not different (P = 0.99) between D and P steers during receiving. At the start of finishing, D steers were heavier (P < 0.01) than P steers; however, finishing ADG was greater (P < 0.01) for P compared with D. Conversely, hot carcass weight of P steers was less (P < 0.01) compared with D steers. Drylot steers had greater DMI (P < 0.01) than P steers during finishing, whereas P steers had improved G:F (P < 0.01) compared with D steers.
There were no differences (P ≥ 0.19) between treatments in days on feed (DOF), carcass characteristics, or United States Department of Agriculture (USDA) yield grade. Growth and health during a 56-d weaning period and a 56-d receiving period were improved when steers were weaned in a drylot environment and fed a concentrate-based diet compared with non-supplemented steers weaned in a pasture environment. We interpret these data to suggest that, under the conditions of our experiment, steers preconditioned on mature, native, warm-season pasture for 56 d without supplementation were unable to compensate during finishing for previous nutrient restriction.
ABSTRACT
The objective of this study was to determine the impact of beef production systems utilizing additive combinations of growth-promotant technologies on animal and carcass performance and environmental outcomes. Crossbred steer calves (n = 120) were stratified by birth date, birth weight, and dam age and assigned randomly to one of four treatments: 1) no technology (NT; control), 2) antibiotic treated (ANT; NT plus therapeutic antibiotics, monensin, and tylosin), 3) implant treated (IMP; ANT plus a series of 3 implants), and 4) beta-agonist treated (BA; IMP plus ractopamine-HCl for the last 30 d prior to harvest). Weaned steers were fed in confinement (dry lot) and finished in an individual feeding system to collect performance data. At harvest, standard carcass measures were collected and the United States Department of Agriculture (USDA) Yield Grade and Quality Grade were determined. Information from the cow-calf, growing, and finishing phases was used to simulate production systems using the USDA Integrated Farm System Model, which included a partial life cycle assessment of cattle production for greenhouse gas (GHG) emissions, fossil energy use, water use, and reactive N loss. Body weight in the suckling, growing, and finishing phases, as well as hot carcass weight, was greater (P < 0.05) for steers that received implants (IMP and BA) than for non-implanted steers (NT and ANT). Average daily gain was greater (P < 0.05) for steers that received implants (IMP and BA) than for non-implanted steers during the suckling and finishing phases, but no difference (P = 0.232) was detected during the growing phase. Dry matter intake and gain:feed were greater (P < 0.05) for steers that received implants than for non-implanted steers during the finishing phase.
Steers that received implants responded (P < 0.05) with a larger loin muscle area, less kidney, pelvic, and heart fat, advanced carcass maturity, reduced marbling scores, and a greater percentage of carcasses in the lower third of the USDA Choice grade. This was offset by a lower percentage of USDA Prime grading carcasses compared with steers receiving no implants. Treatments did not influence (P > 0.05) USDA Yield Grade. The life cycle assessment revealed that the IMP and BA treatments reduced GHG emissions, energy use, water use, and reactive nitrogen loss compared to NT and ANT. These data indicate that growth-promoting technologies increase carcass yield while concomitantly reducing carcass quality and environmental impacts.
ABSTRACT
The uncertain response of marine-terminating outlet glaciers to climate change at time scales beyond short-term observation limits models of future sea level rise. At temperate tidewater margins, abundant subglacial meltwater forms morainal banks (marine shoals) or ice-contact deltas that reduce water depth, stabilizing grounding lines and slowing or reversing glacial retreat. Here we present a radiocarbon-dated record from Integrated Ocean Drilling Program (IODP) Site U1421 that tracks the terminus of the largest Alaskan Cordilleran Ice Sheet outlet glacier during Last Glacial Maximum climate transitions. Sedimentation rates, ice-rafted debris, and microfossil and biogeochemical proxies show repeated abrupt collapses and slow advances typical of the tidewater glacier cycle observed in modern systems. When global sea level rise exceeded the local rate of bank building, the cycle of readvances stopped, leading to irreversible retreat. These results support theory suggesting that sediment dynamics can control tidewater terminus position on an open shelf under temperate conditions, delaying climate-driven retreat.
ABSTRACT
Forty-eight male calves (3/4 Brahman × 1/4 Charolais) were used to determine carcass cutability and meat tenderness of Longissimus lumborum (LL), Gluteus medius (GM), Semitendinosus (ST), and Psoas major (PM) steaks from lighter weight carcasses of bulls and steers castrated at 3, 7, or 12 mo of age grown under tropical pasture conditions. Steaks from steers had lower (more tender) LL Warner-Bratzler shear force (WBSF) values than those from bulls. Steaks from steers castrated at 3 mo had lower GM WBSF than those from bulls. For PM steaks, those aged 28 d had lower WBSF than those aged 2 d. Steaks aged 28 d had the lowest LL and GM WBSF, and steaks aged 2 d had the highest LL, GM, and ST WBSF. Castration at younger ages is recommended because it provides improvement in LL and GM tenderness over bulls, with no differences in carcass traits or subprimal yields.