Feed Intake

From BIF Guidelines Wiki

This information is intended to cover feed intake in growing cattle rather than in mature cows.

The following is an updated addendum to the feed intake measurement recommendations in the 9th edition of the Beef Improvement Federation Guidelines. The key updates below focus on warm-up and testing periods, as well as newly suggested approaches to appropriate contemporary grouping for application to genetic evaluation. Testing equipment and test diet guidelines were not revised from the original recommendations.

Pre-Test Information

For feed intake records to be suitable for inclusion in genetic evaluation programs, pre-test information on individual animals should be recorded. Individual animal identification (e.g., registration number) should be unique and easily compatible with other databases. Depending on the traits included in genetic evaluation(s), birth and weaning dates and weights, age of dam, and information to define contemporary groups will also be required. Feed intake has been shown to be related to the age of animals when feeding tests are conducted. Animals entering a feed intake test should have an actual birth date recorded so that age at the beginning of the test can be calculated. Weaning data are generally required to be collected before animals reach 260 d of age. An animal should begin a feed intake test after weaning but at no younger than 240 d of age. Within a feeding contemporary group, start-of-test ages should fall within a 60-d range. Feed intake measurement on test should be completed before an animal reaches 390 d of age.
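As a hedged illustration, the age rules above (recorded birth date, a start-of-test age of at least 240 d, completion before 390 d of age, and a 60-d within-group spread in start ages) can be checked with a short script. The function and constant names are hypothetical, not part of the guidelines, and the after-weaning requirement is omitted because it needs each animal's weaning date:

```python
from datetime import date

# Age-window rules from the guidelines, in days (names are illustrative).
MIN_START_AGE = 240       # no younger than 240 d at start of test
MAX_END_AGE = 390         # intake measurement complete before 390 d of age
MAX_START_AGE_RANGE = 60  # within-group spread in start-of-test ages

def start_age(birth: date, test_start: date) -> int:
    """Age in days at the beginning of the test."""
    return (test_start - birth).days

def animal_eligible(birth: date, test_start: date, test_end: date) -> bool:
    """True if an animal satisfies the individual age rules.
    (Weaning-date check omitted in this sketch.)"""
    return (start_age(birth, test_start) >= MIN_START_AGE
            and (test_end - birth).days < MAX_END_AGE)

def group_ages_ok(birth_dates: list, test_start: date) -> bool:
    """True if start-of-test ages fall within a 60-d range."""
    ages = [start_age(b, test_start) for b in birth_dates]
    return max(ages) - min(ages) <= MAX_START_AGE_RANGE
```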

Warm-up period recommendations

BIF recommends that a warm-up or acclimation period of at least 21 days be included in the feed intake test. The goal of the acclimation period is to reduce within-contemporary-group variation in feed intake due to non-genetic factors (e.g., pre-test environment) and to acclimate animals to both the test diet and the testing equipment. Animals should be transitioned from receiving to test diets gradually to minimize digestive upset. Frequently these transitions move animals from a primarily roughage-based receiving/backgrounding diet (fed either ad libitum or with restricted intake) to a higher energy, concentrate-based growing diet fed ad libitum. If calves entering a test have already been transitioned to the diet that will be used in the formal feed intake test, the acclimation period may be substantially reduced (by a week or more) to cover acclimation to the feed intake equipment only. However, users should take care that animals acclimated to a high-concentrate diet are not restricted from that diet during equipment training, to prevent acidosis when they return to normal consumption levels. The overall test period (warm-up plus testing) may be shortened by adjusting the start date of the trial based on daily intake records: when intakes have stabilized across the entire group following a week or more of observations, the trial may begin. While adjusting the start date may save days on feed in the feed intake facilities, it typically will not reduce the days on feed for growing or developing animals, and it incurs additional labor and data analysis costs to reliably determine the 'start' date of the test. In practice, it may be simpler and more reliable for producers and central tests to set a minimum warm-up period (21 days) as the standard operating test protocol.
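The guidelines do not formally define when intakes have "stabilized." The following sketch is one possible heuristic, assuming stabilization when the coefficient of variation of group-mean daily intake over a trailing week falls below an illustrative threshold; both the window and the threshold are assumptions, not BIF standards:

```python
def intakes_stabilized(daily_group_means, window=7, cv_threshold=0.05):
    """Heuristic check: group-mean daily intake (kg) over the trailing
    window varies by less than cv_threshold (CV = sd / mean).
    Window and threshold are illustrative, not a BIF standard."""
    if len(daily_group_means) < window:
        return False
    recent = daily_group_means[-window:]
    mean = sum(recent) / window
    var = sum((x - mean) ** 2 for x in recent) / (window - 1)
    return (var ** 0.5) / mean < cv_threshold
```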

Test Diets

Diets used in feeding tests will vary according to animal type, animal gender, environmental constraints, feed ingredient availability, cost, and management. Therefore, data collection should be implemented such that diets can be adjusted insofar as possible to a common nutritional base. All animals within one test should be fed the same test diet, and the diet should be formulated to provide essential nutrients and sufficient energy to ensure the expression of animal differences for intake. The ingredient composition of the diet should be recorded and maintained throughout the test period. It is desirable for samples of diet ingredients or of the complete diet to be sent to a commercial laboratory for complete chemical analysis.

Diets used in tests with growing bulls should contain at least 2.4 Mcal ME/(kg DM). Diets used in tests with finishing steers should contain at least 2.9 Mcal ME/(kg DM). A growing number of reports in the scientific literature adjust data from intake tests to a common energy content, mainly to increase across-test comparability. Such statistical adjustment to a constant energy density requires recording enough chemical composition data on the diet(s) to derive metabolizable energy (ME) in megacalories (Mcal) on a dry matter basis. Average daily intake and functions of intake data should be reported on a dry matter basis. Expressing daily feed intake values on a dry matter basis removes variability in moisture content across a diversity of diets and increases comparability across multiple tests and studies. As-fed measurements of daily feed intake can be recorded as well, but for further data analyses, sufficient information must be supplied to convert feed intake to a dry matter (DM) basis.
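A minimal sketch of the conversions and checks described above; the 2.4 and 2.9 Mcal ME/(kg DM) minimums follow the text, while the function names and class labels are illustrative assumptions:

```python
# Minimum diet energy density, Mcal ME per kg DM (values from the guidelines;
# class labels are illustrative).
MIN_ME = {"growing_bull": 2.4, "finishing_steer": 2.9}

def as_fed_to_dmi(as_fed_kg: float, dm_fraction: float) -> float:
    """Convert as-fed daily intake (kg) to a dry matter basis.
    dm_fraction is the diet's dry matter content (0-1), ideally from
    laboratory analysis of the complete diet."""
    if not 0.0 < dm_fraction <= 1.0:
        raise ValueError("dm_fraction must be in (0, 1]")
    return as_fed_kg * dm_fraction

def diet_meets_minimum(me_mcal_per_kg_dm: float, animal_class: str) -> bool:
    """True if the diet's energy density meets the recommended minimum."""
    return me_mcal_per_kg_dm >= MIN_ME[animal_class]
```

For example, 12 kg of as-fed feed at 70% dry matter corresponds to 8.4 kg of dry matter intake.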

Test period recommendations

From an industry-wide genetic improvement perspective where there are limited feed intake measuring facilities, it is desirable to minimize the length of the test period while still ensuring a sufficiently precise measure of feed intake. A shorter test period not only reduces the cost of testing an animal, but it also increases the number of animals that can be tested.

A 42-day test length is sufficient for obtaining a high-quality observation of daily feed intake (Table 1). While shorter test periods result in less precise measures of average daily gain (Table 2), this issue can be overcome by producing feed intake EPDs from a sensible model that includes post-weaning gain and weaning weight as correlated traits[1] and includes complete pedigree information for all animals in the genetic evaluation. Because shorter test periods permit the collection of feed intake data on more animals, more selection candidates with potentially higher accuracy of EPD will be available, thus resulting in improved selection response[1]. If additional precision in measuring test average daily gain were desired, continuing to feed test animals outside of the feed intake collection facility after 42 days of measuring intake would accomplish this and still allow feed intake to be measured on additional animals. A 42-day test length will typically ensure 35 days of reliable and precise feed intake measures, which is considered the minimum.

Days when animals are removed from the pen for any reason are excluded from the calculation of average dry matter intake (ADMI) by some and included by others. There is currently not sufficient information to recommend one approach over the other; until such information becomes available, either approach is acceptable. At a minimum, animals should be weighed twice, at the beginning and end of the testing period. A preferable approach is to weigh animals every two weeks during the testing period and use a regression approach to calculate average daily gain. However, the additional costs and the loss of feed intake data for those days must be considered.
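The regression approach to average daily gain can be sketched as the ordinary least-squares slope of weight on test day. This simple implementation is illustrative, not a prescribed method:

```python
def adg_by_regression(days, weights):
    """Average daily gain (kg/d) as the slope of an ordinary
    least-squares regression of body weight (kg) on test day,
    using weights taken periodically (e.g., every two weeks)."""
    n = len(days)
    mean_d = sum(days) / n
    mean_w = sum(weights) / n
    num = sum((d - mean_d) * (w - mean_w) for d, w in zip(days, weights))
    den = sum((d - mean_d) ** 2 for d in days)
    return num / den
```

With biweekly weights of 300, 321, 342, and 363 kg on days 0, 14, 28, and 42, the fitted slope is 1.5 kg/d.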

Table 1. Average correlation and regression coefficients between standard (70-day) and varying shortened test period lengths. (adapted from [2])

Test Length    ADMI Pearson Correlation    RFI Pearson Correlation    ADMI Regression    RFI Regression
28 days        0.94                        0.835                      0.83               0.765
42 days        0.975                       0.905                      0.93               0.87
56 days        0.99                        0.95                       0.985              0.955


Table 2. Average correlation and regression coefficients between standard (70-day) and varying shortened test period lengths for average daily gain (ADG) and metabolic mid-weight (MMWT). (adapted from [2])

Test Length    ADG Pearson Correlation    MMWT Pearson Correlation    ADG Regression    MMWT Regression
28 days        0.66                       0.975                       0.275             0.92
42 days        0.835                      0.99                        0.525             0.98
56 days        0.945                      1                           0.825             1.005

Contemporary grouping for genetic evaluation

The most efficient use of feed intake measures to achieve genetic progress is in genetic evaluation, with EPD delivered to breeders for use in selection decisions. Feed intake test contemporary groups should consist of the weaning contemporary group in addition to any post-weaning treatment differences, including 'on test' date, test or acclimation duration, test location and/or test equipment, and minimum age on test. Combining these factors into a single feed intake contemporary group fitted as a fixed effect should result in a relatively uniform pre-test environment and, in turn, an equal opportunity to express genetic differences during the test. Alternatively, the contemporary group definition could be partitioned into two components, weaning and post-weaning, whereby the weaning contemporary group is fitted as a random effect nested within the fixed effect of post-weaning contemporary group. Including the grouping related to the trait being measured, in this case post-weaning performance, as a fixed effect avoids bias in the genetic predictions related to differences in management practices. Given that testing stations may combine animals from multiple sources (breeders) and weaning contemporary groups in a single test pen, there may be a desire to make use of these data. In this case, test group (pen, period/test, diet) could be included as a separate effect in the model in addition to the post-weaning gain contemporary group. This maximizes contemporary group size while still attempting to account for environmental differences. However, as a general rule, mixed-breeder contemporary groups are not advised. Appropriate contemporary grouping ensures that within-group performance differences are minimally affected by differences in non-genetic factors, thereby reducing bias in genetic predictors.
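One way to operationalize a single feed intake contemporary group is to concatenate the weaning contemporary group with the post-weaning test factors listed above into one code. The field names and delimiter in this sketch are illustrative assumptions, not a prescribed format:

```python
def feed_intake_cg(weaning_cg: str, on_test_date: str, test_location: str,
                   equipment: str, acclimation_days: int) -> str:
    """Build a single feed intake contemporary group code by joining
    the weaning contemporary group with post-weaning test factors.
    Field names and delimiter are illustrative."""
    return "_".join([weaning_cg, on_test_date, test_location,
                     equipment, str(acclimation_days)])
```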

Dry matter intake vs residual feed intake genetic evaluation

Organizations producing genetic predictions for feed consumption and partial efficiency differ in the expression of the EPD. While some EPD are expressed as measured daily dry matter intake (DMI), others are published in index form to quantify partial efficiency such as residual feed intake (RFI) and residual average daily gain (RADG).

RFI and RADG

Phenotypic-based RFI attempts to adjust observed intake for phenotypically correlated sources of variation so that RFI is not correlated with the indicator traits. Most commonly these include gain and metabolic mid-weight, although measures of body composition have also been used. This process creates a restricted selection index based on phenotypes, whereby selection for RFI will reduce intake without changing gain. Alternately, RADG is a restricted index that allows change in gain while holding feed intake constant. To generate EPD for RFI, two alternative methods have been proposed. A phenotypic-based RFI can be calculated using estimated relationships between maintenance requirements and anticipated requirements for growth and fat deposition, and this phenotype becomes the dependent variable in the genetic evaluation. More commonly, the DMI phenotype is the dependent variable in a model that includes correlated factors as covariables (e.g., weight, gain, and fat thickness). The resulting genetic prediction of RFI is intended to be genetically independent of the covariates included in the model. Alternately, RFI can be obtained using an index that includes the DMI EPD and the EPD of the RFI covariables. The same issues and approaches exist for producing genetic predictions of RADG.
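A phenotypic RFI of the kind described above (DMI regressed on gain and metabolic mid-weight, with the residual taken as RFI) can be sketched as follows. This is a minimal illustration; a production analysis would also account for contemporary group and possibly body composition, which are omitted here:

```python
import numpy as np

def phenotypic_rfi(dmi, adg, mmwt):
    """Phenotypic residual feed intake: residuals from an OLS
    regression of dry matter intake (DMI) on average daily gain
    (ADG) and metabolic mid-weight (MMWT), with an intercept.
    A minimal sketch; contemporary group effects are omitted."""
    dmi = np.asarray(dmi, dtype=float)
    X = np.column_stack([np.ones(len(dmi)), adg, mmwt])
    beta, *_ = np.linalg.lstsq(X, dmi, rcond=None)
    return dmi - X @ beta  # positive RFI = eats more than expected
```

Because the regression includes an intercept, the RFI values sum to zero across the animals in the analysis, consistent with RFI being a deviation from expected intake.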

DMI

EPD for DMI are produced simply by fitting an analytical model that does not adjust for genetically correlated covariables. Instead, these other traits may serve as indicators of DMI in a multiple-trait model, analogous to how all other traits in genetic evaluations are handled. Dry matter intake data need to be evaluated for heterogeneity of variation due to alternative test periods, diets, and feeding conditions; methods have been proposed for normalizing the variation.[3]

Kennedy et al. (1993)[4] showed the equivalence of selection indexes that incorporate intake or RFI when the economic weights are calculated correctly. Of course, this assumes the RFI phenotype is produced sensibly when this method is used. By definition, RFI is not an economically relevant trait, given that it accounts for only a portion of the feed consumed, and thus cannot be a sensible trait in an economically rational breeding objective. Moreover, RFI could have a dramatically different definition depending on the class of cattle used to develop and apply it, whereas DMI can be applied to a wide range of animals. It has been argued that RFI should be published because not all producers use selection index methods. However, this logic promotes sub-optimal selection practices, including single-trait or two-trait selection. Given the straightforward definition of DMI, the fact that it is an economically relevant trait, and the relative ease with which an economic value can be assigned to it, BIF recommends that if an EPD for growing-animal intake and/or partial efficiency is published, it be a DMI EPD rather than RFI or RADG EPD. Moreover, BIF recommends that economic selection indexes be made available to select for feed efficiency in an economic context alongside other appropriate economically relevant traits related to more comprehensive breeding objectives.

Impacts of changing technologies

New remote-sensing technologies continue to be developed, and older technologies such as automated animal weighing systems and ear tags that monitor feeding behavior continue to improve. Given these rapid advancements, the guidelines for measuring individual feed intake and gain will likely need review on an ongoing basis, and these technologies will likely result in changes to the current recommendations.

References

  1. Thallman, R. M., L. A. Kuehn, W. M. Snelling, K. J. Retallick, J. M. Bormann, H. C. Freetly, K. E. Hales, G. L. Bennett, R. L. Weaber, D. W. Moser, and M. D. MacNeil. 2018. Reducing the period of data collection for intake and gain to improve response to selection for feed efficiency in beef cattle. J. Anim. Sci. 96:854-866. doi:10.1093/jas/skx077
  2. Culbertson, M. M., S. E. Speidel, R. K. Peel, R. R. Cockrum, M. G. Thomas, and R. M. Enns. 2015. Optimum measurement period for evaluating feed traits in beef cattle. J. Anim. Sci. 93:2482-2487. doi:10.2527/jas.2014-8364
  3. MacNeil, M. D., N. Lopez-Villalobos, and S. L. Northcutt. 2011. A prototype national cattle evaluation for feed intake and efficiency of Angus cattle. J. Anim. Sci. 89:3917-3923. doi:10.2527/jas.2011-4124
  4. Kennedy, B. W., J. H. J. van der Werf, and T. H. E. Meuwissen. 1993. Genetic and statistical properties of residual feed intake. J. Anim. Sci. 71:3239-3250. doi:10.2527/1993.71123239x