In arid and semiarid regions throughout the world, water shortage necessitates the utilization of marginal water for agricultural irrigation. Because of its availability and relatively low cost, treated wastewater is commonly considered an alternative water source for agricultural needs. Application of treated wastewater for agricultural irrigation may, however, expose soil to pathogens, creating potential public health problems. Raw sewage is known to contain a variety of human pathogens. Although their concentrations decrease during the wastewater reclamation process, the secondary-treated effluents most commonly used for irrigation today still contain bacterial human pathogens, and irrigation with these effluents therefore introduces such pathogens to the soil. Although not in their natural host, human pathogenic bacteria can survive for long periods in soil and water and thereby have the potential to contaminate crops in the field. There is thus a risk of direct contamination of crops by human pathogens from the treated effluents used for irrigation, as well as a risk of indirect contamination of the crops from contaminated soil at the agricultural site. Bacterial human pathogens were recently demonstrated to enter plants through their roots and to translocate to, and survive in, aerial plant tissues. The practical implications of these findings for food safety depend on the ability of pathogenic bacteria to survive and multiply in the irrigated soil, in the water, and in the crop.
Shortage of water in arid and semiarid areas throughout the world makes utilization of marginal water for agricultural irrigation a necessity. The marginal water most used for irrigation in Israel is secondary-treated urban effluent. In spite of the treatment process, these effluents often contain higher levels of bacterial human pathogens than the potable water from which they were derived. Utilization of treated effluents for irrigation in Israel is strictly regulated according to water quality and the irrigated crop. Due to health concerns and a lack of experimental data, treated effluents are not yet used for irrigation of vegetables. In the present study we evaluated safety and agronomic issues involved in irrigation of summer melon with secondary-treated urban effluents, applied to the production field by surface and subsurface drip irrigation in accordance with the national regulations. Two water qualities were compared: secondary-treated wastewater and potable water. The effluents had higher EC and pH, and higher concentrations of Na, Cl, N, P, K, microelements, and heavy metals, than the potable water. Potable water was applied by surface drip irrigation, and three irrigation regimes were compared for the treated effluents: surface irrigation, and subsurface irrigation at 20 or 40 cm below the soil surface. No differences in yield quantity and quality were found between treatments. Na concentrations and SAR levels of the soil were higher under irrigation with the effluents. Contamination by E. coli, fecal coliforms, and total coliform bacteria was found on the melon peel in all treatments, and the level and composition of the contamination did not vary significantly between treatments. E. coli and fecal coliforms were found in the upper (0-2 cm) soil samples of treatments irrigated with either water quality by surface drippers, but no contamination was found in the treatments irrigated by subsurface irrigation.
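The soil SAR values compared above follow the standard sodium adsorption ratio formula, SAR = Na / sqrt((Ca + Mg) / 2), with all cation concentrations in meq/L. A minimal sketch of the calculation, using illustrative values only (not measurements from this study):

```python
from math import sqrt

def sar(na_meq_l: float, ca_meq_l: float, mg_meq_l: float) -> float:
    """Sodium adsorption ratio: Na / sqrt((Ca + Mg) / 2).

    All concentrations must be expressed in meq/L.
    """
    return na_meq_l / sqrt((ca_meq_l + mg_meq_l) / 2)

# Hypothetical soil-extract values, for illustration only:
print(round(sar(8.0, 4.0, 4.0), 2))  # -> 4.0
```

Higher Na relative to Ca and Mg raises the SAR, which is why effluent-irrigated soil, with its higher Na load, shows elevated SAR values.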
The fact that microbial contamination of the fruit was not prevented by subsurface drip irrigation or by irrigation with fresh water suggests that environmental factors, rather than an irrigation treatment effect, were the cause of the microbial spread. Further analysis is required concerning the effects of environmental factors, such as the interaction between weather conditions and distance from the effluent oxidation ponds, on the temporal and spatial distribution of bacterial human pathogens and the potential for subsequent contamination of fresh produce in the field.
Visual leaf damage symptoms affect plant and flower development. A variety of physiological leaf symptoms are induced by environmental and growing conditions, including light intensity during cultivation and the nutritional status of the leaves. In the present study, we examined the effects of leaf age, leaf ionome, and shade factor during cultivation (20% and 47% shade, applied with shade nets) on the development of leaf disorders in two cultivars of Phlox paniculata. The leaf ionome of both cultivars changed with leaf age and varied between cultivars. The percentage of shade applied during cultivation had a minor effect on the leaf ionome, and it affected neither the type and severity of the leaf disorders that developed on the plants nor the developmental stage at which they appeared. The ionome of young and mature leaves affected by a purple spotting disorder was similar to that of healthy-looking leaves, demonstrating that this disorder is not related to the nutritional status of the tissue. Our results further excluded leaf age, plant age, plant trimming, and shade factor during cultivation as inducers of the purple spot disorder. This study is the first to explore the ionome of Phlox paniculata, and the first to relate it to leaf age, physiological leaf disorders, and shade factor during cultivation.
Eight Frankia strains were isolated from root nodules of Casuarina cunninghamiana trees growing in a sandy loam soil at an irrigated landscaping site on Kibbutz Naan, in Israel. The diversity of the isolated strains was assessed by analysis of their 16S rRNA gene sequences, which were compared to those of two known Frankia strains: DSM-44251, isolated from Alnus rubra (family Betulaceae), and DSM-43829, isolated from Colletia cruciata (family Rhamnaceae). The phylogenetic tree constructed from the 16S rDNA sequences revealed that the strains isolated from Casuarina cunninghamiana had high phylogenetic similarity to the known strain isolated from Alnus rubra, whereas the strain isolated from Colletia cruciata was located on a distant branch of the tree. These results demonstrate for the first time the existence of "actinorhizal" symbiosis in Israeli soil and the relation of Israeli Frankia strains to known strains from different regions of the world.
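Phylogenetic placement of the kind described above typically starts from pairwise comparisons of aligned 16S sequences, whose distances then feed a tree-building method such as neighbor-joining. As a hedged illustration only (the sequences below are toy fragments, not the study's 16S data, and real workflows first align the sequences with dedicated software), percent identity between two aligned sequences can be computed as:

```python
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percent identity between two aligned sequences of equal length.

    Toy, gap-free comparison; real pipelines align sequences first
    and use dedicated phylogenetics tools for tree construction.
    """
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Hypothetical fragments, one mismatch in 20 positions:
isolate = "AGCTTAGCGGTACGATCGAA"
reference = "AGCTTAGCGGTACGATCGTA"
print(percent_identity(isolate, reference))  # -> 95.0
```

Strains with high pairwise identity cluster on nearby branches of the resulting tree, which is the pattern reported above for the Casuarina isolates and the Alnus rubra reference strain.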
Ca deficiencies induce a range of physiological disorders in plants. The disorders typically appear in young growing tissues, which are characterized by a high demand for Ca and a restricted Ca supply due to low transpiration. In this study, we examined the effect of supplementing Ca to Anemone coronaria plants by foliar spray and through the irrigation solution, in order to evaluate whether the flower abortions and leaf damage that appear in production fields are related to Ca deficiencies. With the goal of developing a preventive nutritional regime, four Ca treatments were evaluated: Ca supplied with the fertigation solution at 60 or 110 ppm, and, in combination with the 60 ppm treatment, additional foliar sprays of 3 or 6 g/l Ca, as Ca(NO3)2. The plants were cultivated in a net-house, in soilless culture (tuff) beds. Application of 110 ppm Ca with the fertigation solution, compared to 60 ppm, increased the concentration of Ca in the leaf tissue, resulting in an increase in the quantity and quality of the flowers. Ca supply by foliar spray, at both 3 and 6 g/l Ca(NO3)2, caused leaf necrosis and did not improve yield. Application of 110 ppm Ca reduced the concentrations of Mn, Cl, and Na in the leaves. Application of Ca in the irrigation solution, or by foliar spray, did not reduce the percentage of non-marketable flowers. The lower concentrations of Ca identified in damaged compared to non-damaged leaves on the flower stem suggest that the damage to the flowers and leaves is related to local Ca deficiencies.
Shortage of water in Israel necessitates utilization of increasing volumes of marginal water for irrigation. Marginal water is characterized by higher concentrations of heavy metals than the potable water from which it is derived. Over time, irrigation with treated water carrying appreciable amounts of heavy metals can contribute to their build-up in soils. The actinomycete Frankia is a nitrogen-fixing root-nodule endosymbiont that is present free-living in the soil or in association with the roots of dicotyledonous plants. Frankia can bind and sequester several toxic heavy metals and is a potential bioremediation agent. The sensitivity to heavy metals of seven Frankia strains (1F, 5F, 6F, 7F, Fb, Fc, and d) recently isolated in our lab and one reference strain (DSM-44251) was determined. A differential response of these strains to Cd, Al, and B was observed. Toxic levels of Cd and B, as well as deficiency levels of B, were determined for the different strains. For all Frankia strains except 7F, increasing Al concentrations enhanced growth at low pH. Strain d had the highest tolerance to Cd and to toxic levels of B, with no inhibitory effect of Al, albeit with lower growth enhancement by Al than the other strains. Irrigation with treated wastewater may therefore reduce the growth of some Frankia strains and reduce their nodulation efficiency. Given the potential of Frankia in bioremediation and phytoremediation applications, it is important to elucidate the sensitivity and tolerance of Frankia isolates to water pollutants such as heavy metals.
The requirements of Ranunculus asiaticus L. for N and K fertigation were recently studied in an experimental soilless cultivation system, and the results identified a lower requirement for N and K fertilization than the regime routinely practiced in Israeli production fields. Fertigation with 50 mg N L-1 and 60 mg K L-1 excelled in terms of marketable flower production, flower quality, and vase life duration, and reduced damage to the cut-flower yield by "stem topple". In the present study, to assess the applicability of the newly optimized fertigation regime for soil cultivation in commercial production fields, we compared the conventional high-input fertigation practice (100 mg N L-1 and 120 mg K L-1) with the reduced regime (50 mg N L-1 and 60 mg K L-1) under agronomic conditions routinely practiced for commercial cultivation in the Bessor region, in the south of Israel. The optimized regime, which requires half the input of N and K fertilizers, was sufficient for optimal yield quality and quantity in agronomic soil production setups. It supported production of the same number of marketable cut flowers, with better stem length and vase life than the high-input commercial regime. Utilization of the optimized regime therefore has the potential to increase farmers' income by reducing fertilizer input costs and increasing profit from the improved flower quality.
Eucalyptus silver dollar (Eucalyptus cinerea) is cultivated under intensive agronomic practices for production of cut foliage branches for the floriculture industry. A range of damage symptoms, suspected to be related to unoptimized mineral nutrition, routinely occurs in the leaves at the production plantations and reduces yield quality. No information is available about the nutritional requirements of Eucalyptus silver dollar, or of any other Eucalyptus species, under intensive cultivation for cut foliage branch production. In this study we evaluated the hypotheses that: (1) leaf damage symptoms in the Eucalyptus silver dollar plantations might be related to the nutritional status of the leaves; and (2) they are affected by environmental and growing conditions, and will therefore differ between seasons and plantation locations. To test these hypotheses we studied the seasonal and location variations in the ionome of damaged and healthy leaves, physiological parameters, and postharvest attributes of cut foliage branches during vase life in four plantations of Eucalyptus silver dollar in Israel. The observed leaf symptoms were also characterized anatomically. The range of concentrations for individual macronutrients in the leaves was (in g kg–1): N (18–40); P (1.2–3.0); K (5.5–17.0); Ca (3.5–14.0); Mg (1.1–2.8); S (1.3–2.6). The concentration range for micronutrients was (in mg kg–1): B (10–100); Fe (30–170); Zn (14–27); Mn (38–190); Cu (3.5–5.9). None of the identified leaf symptoms correlated with a consistent increase or decrease in the content of a specific mineral nutrient or heavy metal compared to healthy leaves, suggesting that they were not caused by mineral deficiency or toxicity. The leaf ionome was affected by season and varied between locations. The main damage symptoms observed in the four examined plantations during the four harvests were red and purple spots, and oil stains.
Postharvest experiments showed that branch quality declined during 7–15 days of vase life following transport simulation to the local market. The degree of quality loss during vase life also depended on the location of the plantation and the season of harvest. The oil stains appeared in the two southernmost locations during summer, suggesting that this symptom might derive from summer conditions such as the high temperatures and high light intensities occurring in the southern part of Israel.