PANHANDLE PERSPECTIVES: Stripe Rust in Nebraska


Robert M. Harveson
Extension Plant Pathologist Panhandle R&E Center, Scottsbluff

Stripe rust is a serious disease that commonly occurs wherever wheat is grown under cool, moist environmental conditions during the season. It is considered to be the primary rust disease in wheat production, and has been studied extensively for more than 100 years.


Stripe rust was first found in the United States in 1915 by a visiting Danish scientist, F. Kolpin Ravn, when he traveled with a U.S. Department of Agriculture crop survey team in the western U.S. Ravn recognized the stripe rust on wheat near Sacaton, Ariz., as it had long been a destructive problem in Great Britain and northern and central Europe and was easily distinguished from other rust diseases of cereals and grasses. It was later determined to have been present in the western United States for at least 23 years, based on herbarium samples collected from Washington State, Utah, Wyoming and Montana. Other research suggested it had also been present in California in the 18th century.



After it was discovered in North America, many feared it would spread east into other major wheat-growing areas, such as the Great Plains, the Mississippi Valley and the Atlantic Coast. That concern, coupled with an increased need to expand wheat production during World War I, provided the stimulus for immediate research efforts. Early research topics focused on economic importance, geographic distribution, host range and cultivar resistance.

By the 1930s, it had not spread into wheat-growing areas east of the Rocky Mountains, as had been expected. Its absence from the Great Plains reduced the emphasis on investigations, and for almost 30 years — from the 1930s to the late 1950s — there was minimal concern for the disease. During this time, no major research projects were conducted and only occasional reports of its appearance were published.


In the late 1950s and early 1960s, the disease was discovered for the first time in the Great Plains, and severe epidemics in California and the Pacific Northwest occurred as well. These reports renewed previous fears of economic losses, which then stimulated new research aimed at disease control by the USDA and state experiment stations. This was also the peak of the Cold War.

The U.S. Army began investigating the potential of the pathogen as a biological weapon against the Russians and their vast acreage planted to wheat, and began studying the impact of the disease on national security.

Interest in the disease waned again in the late 1960s and 1970s, as research focused elsewhere. Stripe rust was noticed at this time, but never became problematic east of the Rocky Mountains. In the 1980s and 1990s, it started to occur periodically in the south central United States at levels that caused noticeable yield losses. However, leaf rust remained a larger problem, and breeding efforts concentrating on stripe rust resistance were still not a priority. That began to change after 2000.


Rust diseases on cereals are not a new phenomenon. Awareness of their effects goes back thousands of years. Rusts were one of the major blights of ancient history mentioned in the Bible and various writings of the Greeks and Romans that periodically led to famine and increased fears and superstitions when the nature of the cause could not be determined.


Early records do not distinguish among the three types of rust disease, but each is distinct, caused by a different pathogen, though all three are similar in their biological life histories. The life cycles of wheat rust fungi are very complex, involving five different spore stages — each producing a distinct spore type — that appear in a regular sequence and require two unrelated hosts, also known as alternate hosts, to complete the life cycle.

The spore type that causes yield reductions on wheat is called the urediniospore, and it is also the one easiest to notice. These spores are reddish-brown to yellow or orange in color, depending on the disease, which also is the source of the name “rust.” Stripe rust spores tend to be more yellow while those of leaf rust are orange.

The spores are produced in blister-like lesions called pustules on wheat leaves. The urediniospore stage is also called the repeating stage because it is the only one with the ability to cause multiple infections on the wheat crop within the same year.


Wheat rust pathogens are particularly prone to rapid, large-scale spread because of their capacity to produce spores. Rust pustules can produce 10,000 urediniospores per day, each of which could theoretically produce another new pustule within 10 days. Fortunately, not all spores cause infection — generally only about 10 percent do — but this example shows the high reproductive potential of these pathogens under optimal environmental conditions.
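The compounding described above can be sketched as a back-of-the-envelope calculation. The figures (10,000 spores per pustule per day, roughly 10 percent infection success, a 10-day generation time) come from the paragraph above; the simplifying assumptions (no spore loss, unlimited leaf area) are ours and make this an upper-bound illustration only.

```python
# Back-of-the-envelope sketch of rust reproductive potential.
# Figures from the text: 10,000 urediniospores per pustule per day,
# ~10% of spores cause infection, ~10 days from infection to new pustule.
# Simplifications (no spore loss, unlimited leaf area) are assumptions.

SPORES_PER_PUSTULE_PER_DAY = 10_000
INFECTION_RATE = 0.10
GENERATION_DAYS = 10

def pustules_after(generations: int) -> int:
    """Pustules descended from a single pustule after n 10-day generations."""
    pustules = 1
    for _ in range(generations):
        # successful infections founded per day by the current pustules
        daily_infections = pustules * SPORES_PER_PUSTULE_PER_DAY * INFECTION_RATE
        # over one generation, each successful infection becomes a new pustule
        pustules = int(daily_infections * GENERATION_DAYS)
    return pustules

for gen in range(3):
    print(f"after {gen * GENERATION_DAYS} days: ~{pustules_after(gen):,} pustules")
```

Even with only one spore in ten succeeding, a single pustule can found ten thousand pustules in one generation, which is why favorable weather can turn a trace infection into an epidemic within weeks.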

The repeating urediniospore is also well-suited for wind transport and can be carried efficiently and rapidly over hundreds or thousands of miles. Airplane surveys in the 1930s detected rust spores from grasses and grains on petroleum jelly-coated microscope slides at elevations as high as 16,000 feet. Furthermore, in North America rust infections have been documented migrating from northern Mexico and south Texas through the Great Plains to North Dakota within six months.


Stripe rust of wheat has long been regarded as a low-temperature disease, primarily problematic in cool, wet conditions. Optimal spore germination has been documented at 50-54 degrees Fahrenheit, with an optimal temperature for disease development of 55-60 degrees.

This is 10-15 degrees lower than the average for the leaf and stem rust diseases, which have historically been more problematic for Nebraska production. Yet in the last decade, the disease has emerged to consistently induce severe economic losses in Nebraska under very warm conditions previously thought to be unlikely.


Stripe rust of wheat was traditionally considered a problem under cool, wet conditions. It was therefore more important in northern and western Europe and the Pacific Northwest of the U.S. than the other rust diseases of wheat.


The first symptoms of the disease are chlorotic patches on leaves. Blister-like lesions called pustules, consisting of yellowish-orange spores, then develop within these patches. The pustules are aligned in narrow stripes between leaf veins. Under favorable environmental conditions, entire leaf surfaces can be covered with stripes.

Pustules and spores can additionally form on leaf sheaths, awns or glumes on the wheat head. Occasionally immature grain can be infected, but distinct stripes are not formed on seedlings or young plants.


Before 2000, stripe rust in the U.S. was primarily problematic as a cool weather-oriented disease in the Pacific Northwest and California. However, in 2000, the disease spread further than ever documented in the U.S. It was reported from California to Virginia, through the entire Great Plains from Texas to North Dakota, on all forms of wheat. The epidemic in 2000 attacked wheat in 20-plus states and caused significant yield losses in new locations where it had rarely occurred before.


The pathogen causing stripe rust in wheat is quite variable; it consists of genetically different biological forms known as races. Before 2000, a well-known group of races occurred throughout the country, and wheat varieties resistant to these races were readily available. In 2000, new races were identified with the ability to overcome that resistance.

New races helped explain the large yield loss in 2000, but didn’t explain the widespread nature of the epidemic. As epidemics continued, atypical behavior of the stripe rust pathogen began to be observed. For example, stripe rust no longer slowed its spread when it was warm, nor shut down completely when it became hot.

Something changed.

First, it was concluded that the new races had shorter latent periods, the latent period being the time between infection and new spore production. Isolates with shorter latent periods were estimated to cause 2.5 times more disease during a typical season.

Second, results showed the new races were more aggressive at warmer temperatures than the old races.


These discoveries demonstrated the new races could cause disease more quickly than before and were more aggressive at higher temperatures. As a result, they began to overcome the most effective genetic resistance. Furthermore, they displaced the old isolates across the expanded geographic range, suggesting the new isolates were more fit than the older ones.

These changes indicated that the old pathogen races occurring in cooler climates are distinct from the invasive new isolates occurring in warmer climates after 2000. This suggested the new races were likely introduced recently into the United States from a still-unknown location.


The first step to successfully combating stripe rust is to modify previously held mindsets.

Before 2000, stripe rust wasn’t a serious threat, so problems such as wheat streak mosaic were the focus. After 2000, a paradigm shift occurred, and genetic resistance to the disease became a priority for plant breeders. Resistance is now given serious consideration and incorporated into breeding programs for new tolerant varieties.

In the past, based on wheat prices and economics, a maximum of one fungicide application was generally employed, if that. But with the new races, the use of single applications should be reconsidered and revised to better manage this disease.

There is no one silver bullet that will control this disease. Many factors must be considered, and the integration of multiple techniques will likely be the most successful and economically sound, such as combining the planting of disease-tolerant varieties with fungicide applications.

The use of varieties with better tolerance can delay disease development or reduce disease severity to the point that fungicides may not be necessary. Fungicides, if utilized, can be effective, but scouting and early detection are also important. Timing of fungicide applications depends on a number of factors.

Factors to consider when it comes to applying fungicides

» It may not be economical to apply fungicides in the absence of high levels of disease. If the pathogen is observed primarily on the lower leaves and the potential for disease progress is reduced, fungicides may not be needed.

» 10- to 14-day weather forecast: Weather plays a significant role in disease development and can be used to estimate the need for applications.

» It is important to protect the flag leaf, as it contributes greatly to grain fill. If disease is noticed early, additional applications may be necessary later. However, certain fungicides cannot be applied after flowering. Check labels to avoid having loads rejected at the processor due to non-compliance with fungicide label application instructions.

» If few inputs have been made or prices of wheat are low, using generic fungicides may be the most cost-effective method of control. It may also be most advantageous to not spray at all if yields are expected to be low in certain fields, for example in dryland fields with little precipitation during the season. ❖
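The checklist above can be condensed into a toy decision sketch. Every threshold and input name below is an illustrative assumption of ours, not an agronomic recommendation; a real spray decision would also weigh wheat price, fungicide cost and label restrictions for the specific product.

```python
# Toy sketch of the fungicide-decision checklist above.
# All inputs and the rule ordering are illustrative assumptions,
# not agronomic recommendations.

def spray_recommended(disease_on_flag_leaf: bool,
                      disease_only_on_lower_leaves: bool,
                      forecast_favors_rust: bool,
                      expected_yield_low: bool,
                      past_flowering: bool) -> bool:
    """Rough yes/no mirroring the bullet points, in priority order."""
    if expected_yield_low:
        return False   # low-yield dryland fields may not repay any spray
    if past_flowering:
        return False   # some labels prohibit post-flowering applications
    if disease_on_flag_leaf:
        return True    # protecting the flag leaf drives grain fill
    if disease_only_on_lower_leaves and not forecast_favors_rust:
        return False   # low disease plus unfavorable weather: hold off
    return forecast_favors_rust
```

For example, disease already on the flag leaf before flowering argues for spraying, while trace infection on lower leaves under a dry forecast argues for scouting again instead.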