1.
J Dairy Sci ; 99(12): 10150-10160, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27743666

ABSTRACT

The need for vitamin D supplementation of dairy cattle has been known for the better part of the last century and is well appreciated by dairy producers and nutritionists. Whether current recommendations and practices for supplemental vitamin D are meeting the needs of dairy cattle, however, is not well known. The vitamin D status of animals is reliably indicated by the concentration of the 25-hydroxyvitamin D [25(OH)D] metabolite in serum or plasma, with a concentration of 30 ng/mL proposed as a lower threshold for sufficiency. The objective of this study was to determine the typical serum 25(OH)D concentrations of dairy cattle across various dairy operations. The serum 25(OH)D concentration of 702 samples collected from cows across various stages of lactation, housing systems, and locations in the United States was 68 ± 22 ng/mL (mean ± standard deviation), with the majority of samples between 40 and 100 ng/mL. Most of the 12 herds surveyed supplemented cows with 30,000 to 50,000 IU of vitamin D3/d, and average serum 25(OH)D of cows at 100 to 300 DIM in each of those herds was near or above 70 ng/mL regardless of season or housing. In contrast, average serum 25(OH)D of a herd supplementing with 20,000 IU/d was 42 ± 15 ng/mL, with 22% below 30 ng/mL. Cows in early lactation (0 to 30 d in milk) also had lower serum 25(OH)D than did mid- to late-lactation cows (57 ± 17 vs. 71 ± 20 ng/mL, respectively). Serum 25(OH)D of yearling heifers receiving 11,000 to 12,000 IU of vitamin D3/d was near that of cows at 76 ± 15 ng/mL. Serum 25(OH)D concentrations of calves, on the other hand, were 15 ± 11 ng/mL at birth and remained near or below 15 ng/mL through 1 mo of age if the calves were fed pasteurized waste milk and had little to no summer sun exposure. In contrast, serum 25(OH)D of calves fed milk replacer containing 6,600 and 11,000 IU of vitamin D2/kg of dry matter was 59 ± 8 and 98 ± 33 ng/mL, respectively, at 1 mo of age. 
Experimental data from calves similarly indicated that serum 25(OH)D achieved at approximately 1 mo of age would increase 6 to 7 ng/mL for every 1,000 IU of vitamin D3/kg of dry matter of milk replacer. In conclusion, the vitamin D status of dairy cattle supplemented with vitamin D3 according to typical practices, about 1.5 to 2.5 times the National Research Council recommendation, is sufficient as defined by serum 25(OH)D concentrations. Newborn calves and calves fed milk without supplemental vitamin D3, however, are prone to deficiency.
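The reported dose-response for milk replacer can be sketched as a simple linear predictor. This is an illustrative reading of the abstract, not the study's fitted model; the function name, the 15 ng/mL baseline, and the 6.5 midpoint slope are assumptions taken from the figures quoted above:

```python
def predict_calf_25ohd(vitd3_iu_per_kg_dm, baseline_ng_ml=15.0,
                       slope_per_1000_iu=6.5):
    """Rough linear estimate of calf serum 25(OH)D (ng/mL) at ~1 mo of age.

    baseline_ng_ml: status of calves fed milk with no supplemental vitamin D
        (near or below 15 ng/mL per the survey).
    slope_per_1000_iu: midpoint of the reported 6 to 7 ng/mL rise per
        1,000 IU of vitamin D3/kg DM of milk replacer.
    """
    return baseline_ng_ml + slope_per_1000_iu * vitd3_iu_per_kg_dm / 1000.0

# A replacer near 6,600 IU/kg DM predicts roughly 58 ng/mL, in the same
# range as the 59 +/- 8 observed, and well above the 30 ng/mL threshold.
print(round(predict_calf_25ohd(6600), 1))
```

Under this sketch, any replacer supplying more than about 2,300 IU/kg DM would be expected to keep calves above the 30 ng/mL sufficiency threshold.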


Subject(s)
Dairying , Vitamin D/blood , Animals , Calcifediol , Cattle , Female , Lactation , Milk , Vitamins
2.
Br J Nutr ; 111(2): 261-9, 2014 Jan 28.
Article in English | MEDLINE | ID: mdl-23880397

ABSTRACT

Dissimilatory reduction of sulphate by sulphate-reducing bacteria in the rumen produces sulphide, which can lead to a build-up of the toxic gas hydrogen sulphide (H2S) in the rumen when increased concentrations of sulphate are consumed by ruminants. We hypothesised that adding ferric Fe would competitively inhibit ruminal sulphate reduction. The effects of five concentrations and two sources (ferric citrate or ferric ammonium citrate) of ferric Fe were examined in vitro (n 6 per treatment). Rumen fluid was collected from a steer that was adapted to a high-concentrate, high-sulphate diet (0·51 % S). The addition of either source of ferric Fe decreased (P< 0·01) H2S concentrations without affecting gas production (P= 0·38), fluid pH (P= 0·80) or in vitro DM digestibility (P= 0·38) after a 24 h incubation. An in vivo experiment was conducted using eight ruminally fistulated steers (543 (sem 12) kg) in a replicated Latin square with four periods and four treatments. The treatments included a high-concentrate, high-sulphate control diet (0·46 % S) or the control diet plus ferric ammonium citrate at concentrations of 200, 300 or 400 mg Fe/kg diet DM. The inclusion of ferric Fe did not affect DM intake (P= 0·21). There was a linear (P< 0·01) decrease in the concentration of ruminal H2S as the addition of ferric Fe concentrations increased. Ferric citrate appears to be an effective way to decrease ruminal H2S concentrations, which could allow producers to safely increase the inclusion of ethanol co-products.
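The linear decrease in ruminal H2S with increasing ferric Fe can be illustrated with a least-squares slope over dose levels. The readings below are invented for illustration (the study's raw data are not in the abstract); only the dose levels match the in vivo treatments:

```python
# Hypothetical H2S readings (not the study's data) at the four in vivo
# ferric-Fe doses, illustrating the linear dose-response the authors report.
doses = [0, 200, 300, 400]        # mg Fe/kg diet DM
h2s = [6000, 4800, 4200, 3600]    # H2S concentration, arbitrary units

d_bar = sum(doses) / len(doses)
h_bar = sum(h2s) / len(h2s)

# Least-squares slope of H2S on dose: a negative slope means H2S
# falls as ferric Fe increases, consistent with the reported linear effect.
slope = (sum((d - d_bar) * (h - h_bar) for d, h in zip(doses, h2s))
         / sum((d - d_bar) ** 2 for d in doses))
print(round(slope, 1))
```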


Subject(s)
Animal Feed/analysis , Cattle/physiology , Diet/veterinary , Ferric Compounds/pharmacology , Hydrogen Sulfide/metabolism , Rumen/drug effects , Animal Nutritional Physiological Phenomena , Animals , Housing, Animal , Hydrogen Sulfide/chemistry , Male , Rumen/metabolism
3.
BMC Genomics ; 14: 730, 2013 Oct 25.
Article in English | MEDLINE | ID: mdl-24156620

ABSTRACT

BACKGROUND: As consumers continue to request food products that have health advantages, it will be important for the livestock industry to supply products that meet these demands. One such class of nutrients is fatty acids, which have been implicated as playing a role in cardiovascular disease. Therefore, the objective of this study was to determine the extent to which molecular markers could account for variation in fatty acid composition of skeletal muscle and to identify genomic regions that harbor genetic variation. RESULTS: Subsets of markers on the Illumina 54K bovine SNP chip were able to account for up to 57% of the variance observed in fatty acid composition. In addition, these markers could be used to calculate a direct genomic breeding value (DGV) for a given fatty acid with an accuracy (measured as the simple correlation between DGV and phenotype) ranging from -0.06 to 0.57. Furthermore, 57 1-Mb regions were identified that were associated with at least one fatty acid with a posterior probability of inclusion greater than 0.90. 1-Mb regions on BTA19, BTA26, and BTA29, which harbored the fatty acid synthase, stearoyl-CoA desaturase, and thyroid hormone responsive candidate genes, respectively, explained a high percentage of genetic variance in more than one fatty acid. It was also observed that the correlation between DGVs for different fatty acids at a given 1-Mb window ranged from almost 1 to -1. CONCLUSIONS: Further investigations are needed to identify the causal variants harbored within the identified 1-Mb windows. For the first time, Angus breeders have a tool with which they could select for altered fatty acid composition. Furthermore, these results could improve our understanding of the biology of fatty acid metabolism and deposition.
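The two quantities at the heart of this abstract, a DGV and its accuracy, reduce to simple arithmetic: a DGV is the sum of an animal's genotypes weighted by estimated marker effects, and accuracy here is the plain correlation between DGV and phenotype. The toy genotypes, effects, and phenotypes below are invented for illustration and have nothing to do with the 54K panel or the study's Bayesian model:

```python
def dgv(genotypes, effects):
    """Direct genomic breeding value: genotype dosage (0/1/2) x marker effect."""
    return sum(g * e for g, e in zip(genotypes, effects))

def pearson(x, y):
    """Simple correlation, the accuracy measure used in the study."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented 3-SNP example: effects in arbitrary units of a fatty acid trait
effects = [0.40, -0.25, 0.10]
animals = [[2, 0, 1], [1, 1, 2], [0, 2, 0], [2, 1, 1]]  # genotype dosages
dgvs = [dgv(a, effects) for a in animals]
phenos = [1.0, 0.5, -0.4, 0.7]  # invented observed phenotypes

print([round(v, 2) for v in dgvs])
print(round(pearson(dgvs, phenos), 2))  # the toy DGV's "accuracy"
```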


Subject(s)
Fatty Acids/metabolism , Genome-Wide Association Study , Genome , Animals , Breeding , Cattle , Fatty Acid Synthases/genetics , Fatty Acid Synthases/metabolism , Genotype , Meat/analysis , Models, Statistical , Phenotype , Polymorphism, Single Nucleotide , Stearoyl-CoA Desaturase/genetics , Stearoyl-CoA Desaturase/metabolism
4.
J Anim Sci ; 101, 2023 Jan 03.
Article in English | MEDLINE | ID: mdl-36592745

ABSTRACT

An experiment was conducted over 2 yr to measure performance and greenhouse gas (GHG) emissions of weaned calves from two cow-calf production systems. Crossbred steers and heifers (n = 270, initial body weight (BW) = 207 kg, SD = 35) were used in a randomized complete block design, with treatments applied to the cow-calf system. Treatments were: 1) a traditional system consisting of April to June calving with smooth bromegrass pasture and grazed corn residue as forage resources (TRAD); 2) an alternative system consisting of July to September calving utilizing partial-drylot feeding, summer-planted oats, and corn residue grazing (ALT). Calves from both production systems were weaned at the same age and grown (diet NEg = 1.05 Mcal kg-1) for approximately 117 d. The calves then transitioned to a high-grain finishing diet (year 1: NEg = 1.32 Mcal kg-1; year 2: NEg = 1.39 Mcal kg-1) and were fed to a targeted 1.52 cm backfat. In the grower phase, ALT calves had greater (P < 0.01) average daily gain (1.39 vs. 1.22 ± 0.02 kg) and greater (P < 0.01) gain:feed (0.157 vs. 0.137 ± 0.003) than TRAD calves. However, a lower initial BW (P < 0.01; 185 vs. 229 ± 4.9 kg) resulted in a lower ending BW (P < 0.01; 347 vs. 371 ± 2.9 kg) for ALT calves compared to TRAD calves in spite of improved growth performance. In the finisher phase, ALT calves gained less (1.52 vs. 1.81 ± 0.218 kg; P = 0.02) and were less efficient (0.139 vs. 0.173 ± 0.0151; P = 0.01) but exhibited similar hot carcass weights (HCW) (388 vs. 381 ± 3.8 kg; P = 0.14) compared to TRAD calves. Each pen of calves was put into a large pen-scale chamber that continuously measured carbon dioxide (CO2) and methane (CH4) for 5 d during the grower and finisher phases. The average CH4 and CO2 production per unit of feed intake was used to calculate total GHG emissions over the entire grower and finisher phase. 
Overall, there were no differences (P ≥ 0.17) between treatments for CH4 per day and per kilogram of dry matter intake (DMI). However, ALT calves tended (P ≤ 0.10) to produce less CO2 per day and per kilogram of DMI than TRAD calves. Methane emissions per unit of carcass weight were greater for ALT calves (110.7 vs. 92.2 ± 8.3 g CH4 kg-1 HCW; P = 0.04) than for TRAD calves. The ALT calves required 27 additional days on feed to reach market, which resulted in more total CH4 per animal across the entire feeding period (P = 0.02) than for TRAD calves. Production systems that reduce days to market while achieving similar HCW may reduce GHG emissions.
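The carcass-scaled metric behind the 110.7 vs. 92.2 comparison is just total CH4 over the feeding period divided by HCW. The daily CH4 value and days on feed below are assumed round numbers, not the study's measurements; only the HCW values and the 27-day difference come from the abstract:

```python
def ch4_per_kg_hcw(daily_ch4_g, days_on_feed, hcw_kg):
    """Total enteric CH4 over the feeding period, scaled to hot carcass weight."""
    return daily_ch4_g * days_on_feed / hcw_kg

# Assumed inputs: equal daily CH4 per head, but ALT needs 27 extra days
trad = ch4_per_kg_hcw(daily_ch4_g=180.0, days_on_feed=195, hcw_kg=381.0)
alt = ch4_per_kg_hcw(daily_ch4_g=180.0, days_on_feed=195 + 27, hcw_kg=388.0)
print(round(trad, 1), round(alt, 1))  # the extra days raise CH4 per kg HCW
```

This shows why similar HCW plus similar daily emissions still yields higher lifetime CH4 per kilogram of carcass when days to market increase.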


There are many reasons (e.g., drought, limited perennial forage, calving) for using intensive or partially intensive production practices (e.g., drylotting or confinement) in a cow-calf enterprise. These practices may impact subsequent calf growth and feedlot performance. In addition, limited data are available comparing the environmental impacts (i.e., greenhouse gas (GHG) emissions) of different cow-calf production systems. This experiment evaluated the effects of a partial-intensive cow-calf production system on post-weaning calf growth performance, carcass characteristics, and GHG emissions. During the grower phase, calves from the partial-intensive system had improved growth compared to calves from the extensive system. During finishing, calves from the partial-intensive system had poorer growth performance and required an additional 27 d on feed to reach the same finish as calves from the traditional system. These differences likely reflect compensatory gain, in which a period of lower gain is followed by improved gain in the subsequent growth period. Cow-calf production system did not alter methane and carbon dioxide emissions per kilogram of intake. However, because calves in the partial-intensive system required additional days on feed, absolute methane and carbon dioxide emissions per animal were greater for the partial-intensive system than for the extensive system, suggesting that reducing days to market may reduce emissions from beef systems.


Subject(s)
Greenhouse Gases , Methane , Cattle , Animals , Female , Carbon Dioxide , Poaceae , Eating , Diet/veterinary , Animal Feed/analysis
5.
Transl Anim Sci ; 6(1): txac023, 2022 Jan.
Article in English | MEDLINE | ID: mdl-35356231

ABSTRACT

Annual forages provide a valuable grazing resource for cattle producers; however, annuals are prone to accumulating nitrate and have the potential to cause nitrate toxicity. Although these forages pose a risk of containing high nitrate concentrations, they can be a high-quality feed source. Understanding the factors that affect the potential for toxicity when using these forages is important to help nutritionists and producers make management decisions. This review describes the previous research, current guidelines for nitrate toxicity, and the potential for improvement in our current recommendations. Current extension toxicity guidelines appear to be founded primarily on drenching-based studies and overestimate the nitrate toxicity potential of forages. Recommendations need to account for multiple factors that affect the threshold for toxicity. There is evidence that fresh forages have a lower risk of toxicity because of slower release of nitrate into the rumen and a slower rate of dry matter intake. Increased dietary energy and sulfur content reduce the potential for toxicity. Microbial adaptation can reduce the risk and allow use of potentially toxic forages. These factors should influence feeding recommendations. However, there are currently not enough data available to establish new guidelines that account for these main factors, so renewed research in this area is needed. The limited number of studies in which cattle grazed elevated-nitrate forages suggests that there is less risk in grazing situations, especially if animals graze selectively. Guidelines for nitrate toxicity and management recommendations specific to grazing are needed; developing them will require more studies that evaluate the effects of nitrate concentration, forage quality, and grazing management on the potential for nitrate toxicity. 
While the conservative guidelines currently in use reduce the risk of nitrate toxicity, they may also significantly increase feed costs for producers.

6.
Transl Anim Sci ; 6(3): txac090, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35854967

ABSTRACT

An experiment was conducted to measure production responses of an alternative cow-calf production system integrated into a cropping system without access to perennial forage compared to a traditional cow-calf system utilizing perennial forage. Multiparous crossbred beef cows (n = 160; average age = 6.2 ± 2.8 yr) were utilized in a randomized complete block experimental design with an unstructured treatment design. Upon initiation, cows were blocked by age, stratified by source, and assigned randomly to one of two production systems, each with four replicates (n = 20 cows/replicate). Once allotted to their treatment groups, cows remained in their experimental units for the duration of the experiment. Treatments were: 1) a traditional system consisting of April to May calving with smooth bromegrass pasture and grazed corn residue as forage resources (TRAD); 2) an alternative system consisting of July to August calving utilizing partial-drylot feeding, summer-planted oats, and corn residue grazing (ALT). There were no differences (P ≥ 0.27) in calving rates (91.8 vs. 86.7 ± 2.92%), pregnancy rates (89.3 vs. 89.9 ± 2.66%), or weaning rates (87.2 vs. 82.3 ± 3.29%) for TRAD vs. ALT, respectively. However, there was an increase (P = 0.04) in the rate of twin offspring in ALT (2.9 vs. 9.4 ± 2.36% for TRAD vs. ALT, respectively). One calf from each set of twins was selected randomly at birth to be removed from the experiment, so the production data are only from single calves. There was no difference (P = 0.47) in calf body weight at birth (40 vs. 39 ± 0.7 kg for TRAD vs. ALT, respectively). At weaning, calves in the ALT system were lighter (P < 0.01) at the same day of age (184 vs. 229 ± 5.5 kg) compared to TRAD calves. Cows from the ALT system also weaned fewer (P < 0.01) kilograms of calf per cow exposed to the bull (150 vs. 199 ± 7.2 kg) compared to TRAD cows. Apart from the twinning rate, no differences in reproductive performance were observed between systems. 
However, the reduced weaning weights and kilograms of weaned calf per cow exposed may negatively impact revenue to the cow-calf enterprise in the ALT system.
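The "kg weaned per cow exposed" metric is the product of weaning weight and weaning rate, which roughly reproduces the reported 199 and 150 kg from the other numbers in the abstract. The function name is just for illustration:

```python
def kg_weaned_per_cow_exposed(weaning_wt_kg, weaning_rate):
    """Weaned calf weight produced per cow exposed to the bull."""
    return weaning_wt_kg * weaning_rate

# Weaning weight x weaning rate from the abstract's values:
trad = kg_weaned_per_cow_exposed(229, 0.872)  # ~200 kg (reported: 199)
alt = kg_weaned_per_cow_exposed(184, 0.823)   # ~151 kg (reported: 150)
print(round(trad), round(alt))
```

The small gaps from the reported values likely reflect rounding in the abstract, since both inputs are themselves means rounded for publication.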

7.
Transl Anim Sci ; 6(2): txac076, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35769452

ABSTRACT

Two studies were conducted to determine interactions of urea inclusion in a dried distillers grains plus solubles (DDGS; 29.4% crude protein, 5.48% ether extract) supplement fed at two amounts and two frequencies to steers on a high-forage diet. In Exp. 1, 120 steers (247 kg; SD = 20) were fed individually for 84 d. Steers received ad libitum grass hay (6.8% crude protein) and one of eight treatments. The treatment design was a 2 × 2 × 2 factorial: supplement was fed daily or three times per week; the amount of supplement fed was 6.36 kg dry matter (DM)/week [0.37% of body weight (BW); LO] or 12.73 kg DM/week (0.74% BW; HI); and the supplement contained either no urea or 1.3% urea on a DM basis. Steer BW was measured at the start and end of the trial, and hay DM intake (DMI) was measured weekly. In Exp. 2, ruminally cannulated steers (310 kg; SD = 25) were used in a row-column design with eight steers and six 14-d periods. Treatments were the same as in Exp. 1, except that supplement was fed at 0.4% of BW (LO) or 0.8% of BW (HI), either daily (DY) or every other day (ALT). Hay DMI, rumen ammonia-N, rumen pH, in situ neutral detergent fiber (NDF) disappearance, and rumination were measured. In Exp. 1, average daily gain (ADG) was affected by amount of supplement, with steers on HI gaining 0.30 kg/d more (P < 0.01) than LO. Hay DMI was reduced by the increased amount of supplement (0.39 kg/d; P < 0.01) and by decreased frequency of supplementation (0.54 kg/d; P < 0.01). In Exp. 2, hay DMI was also reduced by increased amount of supplement and decreased frequency of supplementation (P < 0.01). Rumen pH was decreased on the day of supplement feeding for steers on ALT (P < 0.01) and was lower for steers fed HI vs. LO. There was no difference in NDF digestibility between DY and ALT (P > 0.05). For ALT steers, there was a reduction (P < 0.01) in in situ NDF disappearance for the HI compared to the LO amount of supplementation on the day of supplementation. 
Infrequent supplementation of DDGS results in no difference in ADG but decreased hay DMI compared with daily supplementation. Urea had no effect on digestion or ADG, suggesting rumen-degradable protein was not deficient when supplementing DDGS. There was little change in rumen fermentation parameters between supplementation frequencies, indicating that forage digestion is not impacted by supplementation frequency. Dried distillers grains can be supplemented infrequently without a reduction in animal performance.
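The weekly supplement allowances in Exp. 1 map directly onto the stated % BW figures: divide the weekly amount by 7 days and by body weight. A quick check using the abstract's own numbers:

```python
def supplement_pct_bw(kg_dm_per_week, body_wt_kg):
    """Weekly supplement allowance expressed as a daily % of body weight."""
    return kg_dm_per_week / 7.0 / body_wt_kg * 100.0

lo = supplement_pct_bw(6.36, 247)   # LO treatment, 247 kg initial BW
hi = supplement_pct_bw(12.73, 247)  # HI treatment
print(round(lo, 2), round(hi, 2))   # ~0.37 and ~0.74% BW, as reported
```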

8.
Animals (Basel) ; 11(8)2021 Aug 07.
Article in English | MEDLINE | ID: mdl-34438788

ABSTRACT

The objective was to determine the effects of an injectable trace mineral (TMI; Multimin 90) containing copper (Cu), manganese (Mn), selenium (Se), and zinc (Zn) on trace mineral status and the resulting impacts on reproduction of beef cows and the growth of their calves. Beef cows (n = 200) were assigned to receive TMI or no injection (CON) prior to calving and breeding over two consecutive years. Calves born to cows receiving TMI also received TMI at birth in both years and at 49 ± 1.3 days of age in year 1. The TMI increased (p = 0.01) liver Zn and tended (p = 0.06) to increase liver Cu concentrations. Short-lived effects of TMI on Se were observed. Liver Cu and Zn would have been considered adequate and Se marginal in the CON. Pregnancy due to artificial insemination and overall pregnancy rate did not differ (p ≥ 0.36) between treatments. Use of TMI did not increase calf pre-weaning gain. These data indicate that TMI does not improve the reproductive performance of beef cows with adequate trace mineral status or the pre-weaning performance of their calves.

9.
Access Microbiol ; 3(1): acmi000180, 2021.
Article in English | MEDLINE | ID: mdl-33997611

ABSTRACT

Methane produced by cattle is one of the contributors to anthropogenic greenhouse gas emissions. Methods to lessen methane emissions from cattle have been met with varying success; thus, establishing consistent methods for decreasing methane production is imperative. Ferric iron may decrease methane by acting as an alternative electron acceptor. The objective of this study was to assess the effect of ferric citrate on the rumen bacterial and archaeal communities and its impact on methane production. Eight steers were used in a repeated Latin square design with 0, 250, 500 or 750 mg Fe/kg DM of ferric iron (as ferric citrate) across four periods. Each period consisted of a 16-day adaptation period and a 5-day sampling period. During each sampling period, methane production was measured, and rumen content was collected for bacterial and archaeal community analyses. Normally distributed data were analysed using a mixed-model ANOVA with the GLIMMIX procedure of SAS, and non-normally distributed data were analysed in the same manner following ranking. Ferric citrate did not have any effect on bacterial community composition, methanogenic archaea, or methane production (P>0.05). Ferric citrate may not be a viable option for decreasing enteric methane production in the rumen.

10.
Transl Anim Sci ; 4(2): txaa047, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32705044

ABSTRACT

To determine the effects of harvest method and ammoniation (3.7% of dry matter) on consumption and waste of baled corn residue, a 6 × 6 Latin square with a 3 × 2 factorial treatment structure was conducted. The six treatments consisted of either nonammoniated or ammoniated residue, harvested in one of three ways: conventional rake and bale (CONV), New Holland Cornrower with two rows of stem chopped into the windrow with tailings (2ROW), or EZBale system (EZB) with a disengaged combine spreader and tailings dropped in a windrow. Open cows were grouped by body weight to produce a light block of two pens (448 ± 49.6 kg) and a heavy block of four pens (649 ± 65.9 kg). One bale was fed to each pen during each of six 7-d periods using round bale ring feeders with closed bottom panels. Residue falling around (waste) and remaining in (refusals) the feeder was collected. Daily nutrient intake was estimated as the difference between what was offered and what remained (waste plus refusals). Crude protein (CP) of residue offered did not differ (P = 0.58) among harvest methods. The digestible organic matter (DOM) content of residue offered in 2ROW and EZB bales did not differ (P = 0.86) and was greater (P < 0.01) than CONV. Ammoniation increased (P < 0.01) the CP and DOM content of the residue offered. Total wasted and refused residue did not differ (P = 0.12) between 2ROW (29%) and EZB (37%), while CONV (42%) was greater (P = 0.02) than 2ROW but did not differ (P = 0.34) from EZB. Ammoniation reduced (P = 0.03) total waste and refusals from 41% to 32%. The nutrient content of both waste and refusals did not differ (P ≥ 0.34) among harvest methods and, with the exception of CP, was not affected (P ≥ 0.15) by ammoniation. The CP content of the waste was greater (P = 0.02) and that of refusals tended to be greater (P = 0.08) from ammoniated bales. The CP intake of 2ROW was greater (P ≤ 0.02) than both EZB and CONV, while EZB tended (P = 0.06) to be greater than CONV. 
The CP intake of all ammoniated residues was greater (P < 0.01) than the nonammoniated residue. The DOM intake of nonammoniated 2ROW and EZB did not differ (P = 0.61) but was greater than nonammoniated CONV (P < 0.01). Ammoniation increased (P < 0.01) DOM intake. Overall, ammoniation had much larger effects than harvest method, resulting in reduced waste and refusals and greater intake of DOM and CP. However, the combination of both ammoniation and selective harvest (2ROW or EZB) was needed to result in energy and protein intakes that would meet the needs of a mature cow in mid-gestation.
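The intake estimate described above is a simple mass balance per bale: intake = offered − (waste + refusals), with the loss percentage as the complement. The 500 kg bale below is a hypothetical example, chosen so the losses land near the reported CONV level:

```python
def residue_accounting(offered_kg, waste_kg, refusals_kg):
    """Estimated intake and percent lost (waste + refusals) for one bale."""
    intake = offered_kg - (waste_kg + refusals_kg)
    lost_pct = (waste_kg + refusals_kg) / offered_kg * 100.0
    return intake, lost_pct

# Hypothetical 500 kg bale with losses near the reported CONV level (42%)
intake, lost = residue_accounting(500.0, waste_kg=130.0, refusals_kg=80.0)
print(intake, round(lost))
```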

11.
Transl Anim Sci ; 3(1): 42-50, 2019 Jan.
Article in English | MEDLINE | ID: mdl-32704776

ABSTRACT

To determine the effect of harvest method and ammoniation on both in vivo and in vitro digestibility of corn residue, six corn residue treatments consisting of three different harvest methods, either with or without anhydrous ammonia chemical treatment (5.5% of dry matter [DM]), were evaluated. The harvest methods included conventional rake-and-bale (CONV) and New Holland Cornrower with eight rows (8ROW) or two rows (2ROW) of corn stalks chopped into the windrow containing the tailings (leaf, husk, and upper stem) from eight rows of harvested corn (ammoniated bales of each harvest method resulted in treatments COVAM, 8RAM, and 2RAM). Nine crossbred wether lambs (49.2 ± 0.5 kg BW) were fed 64.2% corn residue, 29.8% wet corn gluten feed, 3.3% smooth-bromegrass hay, and 2.8% mineral mix (DM basis) in a 9 × 6 Latin rectangle metabolism study with a 3 × 2 factorial treatment structure to measure total tract disappearance. Six 21-d periods consisted of 14 d of adaptation and 7 d of total fecal collection; lambs were fed ad libitum (110% of the previous day's DM intake [DMI]) during days 1 to 12, and intake was reduced to 95% of ad libitum for days 13 to 21. There was a harvest method by ammoniation interaction (P < 0.01) for ad libitum DMI (days 7 to 11). Ammoniation increased (P < 0.01) intake across all harvest methods, with DMI at 4.1% of BW for 2RAM, 3.6% for COVAM, and 3.1% for 8RAM, all different (P < 0.01) from each other, whereas all untreated residues were consumed at 2.6% of BW (P ≥ 0.92) regardless of harvest method. There were no interactions (P > 0.34) between harvest method and ammoniation for any total tract or in vitro digestibility estimate. Harvest method affected (P < 0.04) DM, neutral detergent fiber (NDF), and acid detergent fiber (ADF) digestibility, with 2ROW greater than both CONV and 8ROW, which did not differ. The organic matter (OM) digestibility (P = 0.12) and digestible energy (DE; P = 0.30) followed the same numerical trend. 
Both in vitro DM digestibility (IVDMD) and in vitro OM digestibility (IVOMD) of the residue were affected (P < 0.01) by harvest method, with 2ROW being greater (P < 0.01) than both CONV and 8ROW. For IVDMD, 8ROW was not (P = 0.77) different from CONV, but 8ROW IVOMD was lower (P = 0.03) than CONV. Ammoniation improved (P < 0.01) DM, OM, NDF, and ADF digestibility of all harvest methods, resulting in a 26% increase (P < 0.01) in DE due to ammoniation. Similar digestibility improvements were observed in vitro with ammoniation improving IVDMD and IVOMD by 23% and 20%, respectively. Both selective harvest methods and ammoniation can improve the feeding value of baled corn residue.

12.
J Anim Sci ; 96(8): 3503-3512, 2018 Jul 28.
Article in English | MEDLINE | ID: mdl-30060232

ABSTRACT

Data from a recent survey suggest that the major reasons Nebraska farmers plant cover crops are to improve soil organic matter, reduce erosion, improve soil water holding capacity, produce forage, and increase soil microbial biomass. Many of these benefits appear to be positively correlated with production of above-ground biomass. Thus, selecting species that will produce the greatest biomass should be beneficial for both soil conservation and forage production. Furthermore, the limited data available suggest that grazing of cover crops does not have large negative crop production, soil, or environmental impacts. In the Midwestern United States, the production window following wheat harvest, male row destruction in seed corn, and to a lesser extent following corn silage harvest is long enough to produce 2,500 to 4,500 kg DM per hectare of high-nutritive value fall forage. In the past 4 yr, we have conducted eight trials using predominantly oats and brassicas planted in mid- to late-August. Forage nutritive value of oats and brassicas is extremely high in early November (70% to 80% IVDMD; 14% to 23% CP) and remains high through December with only a 4 to 7 percentage unit decrease in IVDMD and no change in CP concentration. Thus, it appears that delayed grazing could be an option to maximize potential forage yield. Fall-weaned calves (200 to 290 kg BW) grazing oats with or without brassicas in November and December (48 to 64 d) at stocking rates of 2.5 to 4.0 calves per hectare have ADG between 0.60 and 1.10 kg. The cost of gain has ranged from $0.53 to $2.08/kg when accounting for seed costs plus establishment ($60 to $117/ha), N plus application ($0 to $58/ha), fencing ($11/ha), and yardage ($0.10 calf-1 d-1). 
Although soybeans and corn harvested for grain do not provide a large enough growing window to accomplish fall grazing, similar dual purpose cover crop practices are often accomplished by planting winter-hardy small grain cereal grasses, such as cereal rye or winter triticale in the fall and grazing in the spring. However, traditional planting dates for corn and soybean result in a 30 to 45 d grazing period prior to corn and a 45 to 60 d period prior to soybean planting. Planting cover crops to provide late fall or early spring grazing has potential. However, incorporating forage production from cover crops into current cropping systems greatly increases the need for timeliness of management since the window of opportunity for forage production is quite narrow.
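The cost-of-gain range above follows from spreading per-hectare costs over the calves stocked and dividing total cost per calf by total gain. The sketch below uses assumed mid-range inputs drawn from the ranges reported; it is an illustration of the arithmetic, not the authors' budgeting model:

```python
def cost_of_gain(land_cost_per_ha, calves_per_ha, yardage_per_day, days, adg_kg):
    """$ per kg of gain for calves grazing a fall-planted cover crop."""
    cost_per_calf = land_cost_per_ha / calves_per_ha + yardage_per_day * days
    gain_per_calf = adg_kg * days
    return cost_per_calf / gain_per_calf

# Assumed mid-range inputs: $100/ha total establishment cost, 3 calves/ha,
# $0.10/d yardage, 56-d grazing period, 0.85 kg/d ADG
cog = cost_of_gain(land_cost_per_ha=100.0, calves_per_ha=3.0,
                   yardage_per_day=0.10, days=56, adg_kg=0.85)
print(round(cog, 2))  # lands inside the reported $0.53 to $2.08/kg range
```

Note how sensitive the result is to ADG and stocking rate: the low end of both pushes cost of gain toward the top of the reported range.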


Subject(s)
Cattle/physiology , Conservation of Natural Resources , Crops, Agricultural , Environment , Livestock , Animals , Biomass , Edible Grain , Male , Midwestern United States , Nutritive Value , Poaceae , Seasons , Silage , Soil
14.
J Vet Diagn Invest ; 24(4): 702-9, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22643342

ABSTRACT

To determine if ruminal hydrogen sulfide, urine thiosulfate, or blood sulfhemoglobin could be used as diagnostic indicators for sulfur-induced polioencephalomalacia, 16 steers (8 cannulated, 368 ± 12 kg; 8 unmodified, 388 ± 10 kg; mean ± standard error) were fed 1 of 2 dietary treatments. Diets consisted of a low-sulfate (0.24% S; control) wheat midd-based pellet or the control pellet with sodium sulfate added to achieve a high-sulfate (0.68% S) pellet. As designed, intake did not differ (P = 0.80) between treatments. At 8 hr postfeeding, ruminal hydrogen sulfide was not affected by cannulation (P = 0.35) but was greater (P < 0.01) in high-S (6,005 ± 475 mg/L) than control (1,639 ± 472 mg/L) steers. Time of day of sampling affected (P = 0.01) ruminal hydrogen sulfide, with peak concentrations occurring 4-12 hr after feeding. Urine was collected prefeeding (AM) and 7-9 hr postfeeding (PM). Urine thiosulfate concentrations of high-S steers sampled in the PM were greater (P < 0.01) than in the AM. However, there was no difference due to time of sampling for control. In both the AM and PM, urine thiosulfate concentrations of high-S steers were greater (P < 0.01) than control. Although hydrogen sulfide and thiosulfate were elevated by increased dietary S intake, a concentration at which polioencephalomalacia is likely to occur could not be determined. Sampling urine for thiosulfate or rumen gas for hydrogen sulfide of nonsymptomatic pen mates 4-8 hr after feeding may be useful to assess sulfur exposure and differentiate between causes of polioencephalomalacia.


Subject(s)
Cattle Diseases/metabolism , Encephalomalacia/veterinary , Hydrogen Sulfide/metabolism , Rumen/metabolism , Sulfates/metabolism , Sulfates/toxicity , Sulfhemoglobin/analysis , Thiosulfates/urine , Animals , Cattle , Cattle Diseases/chemically induced , Cattle Diseases/diagnosis , Cattle Diseases/urine , Encephalomalacia/diagnosis , Encephalomalacia/metabolism , Encephalomalacia/urine , Hydrogen-Ion Concentration , Male , Random Allocation , Sulfates/administration & dosage