Soils - Part 10: The Scientific Basis for Making Fertilizer Recommendations
In this lesson, you will gain an understanding of the history of fertilizer use and the ideas behind fertilizer recommendations. Three major crop nutrition concepts will be discussed in terms of their benefits and disadvantages.
[This lesson, as well as the other nine lessons in the Soils series, is taken from the "Soils Home Study Course," published in 1999 by the University of Nebraska Cooperative Extension.]
History of Fertilization Recommendations
Crop rotations and animal manures have been used to fertilize crops for a long time. However, the scientific study of matching crop requirements with controlled applications of nutrients is relatively recent. The process of discovering the role of mineral elements in crop nutrition and of applying these nutrients efficiently has taken 400 years and is not yet complete. Along the way, some researchers drew incorrect conclusions from the available data. Scientific interpretation is usually made with partial data and is subject to constant revision. This is a normal part of the scientific process.
Fertilizer nitrogen became available to farmers after 1945. Excess ammonia production capacity remaining from World War II provided the source of manufactured nitrogen fertilizers. Soil testing began and was used primarily as a sales tool. Up to this time, fertilizers were not used routinely. Soil testing helped convince farmers of the need for fertilizer and sped its adoption.
Once it was known that fertilizers are effective yield enhancers, interest began in using them efficiently. During the mid to late 1940s, university agronomists began thinking about how to recognize soils that need fertilization and how to determine application rates. As described in Soils - Part 9, this led to research in soil test correlation and calibration. During the 1950s, specific procedures were developed according to nutrient and geographic region. The process of developing a useful soil test takes time and money, since an extensive database needs to be compiled. Most of this work was done by federally-funded USDA research or at land grant universities. Initially, there were no commercial laboratories conducting soil analyses, so university laboratories began providing both the analyses and recommendations to farmers.
The 1960s were a significant transition time for soil testing. Fertilizer prices were low: $0.04/lb. nitrogen and $0.08/lb. phosphate. There was a great push for increasing crop yields: hybrids were improved, insect control became a reality, and weed control with herbicides became widespread. Higher fertilizer rates were needed to produce higher yields; and with low-cost fertilizer, annual rates of 300 lb. nitrogen/A ($12/A) and 100 lb. phosphate/A ($8/A) were not uncommon for corn production.
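The per-acre dollar figures above follow directly from multiplying the application rate by the price per pound. As a quick check of that arithmetic, here is a minimal sketch (the function name is illustrative, not from the original lesson):

```python
# Worked check of the 1960s per-acre fertilizer costs quoted in the text.
# Rates (lb/A) and prices ($/lb) are the values given above.

def fertilizer_cost_per_acre(rate_lb_per_acre, price_per_lb):
    """Cost ($/A) = application rate (lb/A) * price ($/lb)."""
    return rate_lb_per_acre * price_per_lb

nitrogen_cost = fertilizer_cost_per_acre(300, 0.04)   # 300 lb N/A at $0.04/lb
phosphate_cost = fertilizer_cost_per_acre(100, 0.08)  # 100 lb phosphate/A at $0.08/lb

print(f"Nitrogen: ${nitrogen_cost:.2f}/A")   # matches the $12/A figure
print(f"Phosphate: ${phosphate_cost:.2f}/A") # matches the $8/A figure
```

Together these come to about $20/A, which helps explain why fertilizer efficiency was not a pressing economic concern at the time.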
Fertilizer efficiency was not of great economic importance, and environmental concerns were not a high priority. Much of the correlation-calibration research at universities was scaled down, and more effort was devoted to high-yield research. This was a transition period when more soil samples were analyzed by commercial laboratories than by university laboratories, and commercial laboratories began making fertilizer recommendations. Commercial soil testing laboratories did not conduct further research into field verification of recommendations and generally did not emphasize soil testing.
How a field is fertilized depends on many factors. One of the most important is the farm manager's objectives for that field. Many ideas have been proposed to put fertilization in the context of a general management principle. The rest of this lesson describes various ideas commonly discussed on the issue of crop fertilization.