Brief History of Food Safety and Agriculture

Agriculture has evolved since humans first domesticated plants such as corn more than 6000 years ago. Although current agricultural practices vary worldwide, agriculture in the United States and other developed countries has become increasingly industrialized since the 1940s and 1950s, resulting in greater efficiency and production on the farm. Mechanical inventions such as the self-propelled combine reduced the need for manual labor and encouraged the production of grain commodities, which led to the practice of monocropping, or monoculture, as farmers began to focus on growing the most profitable crops, such as corn, soy, and wheat. Though profitable, monocropping displaced earlier soil-enriching practices such as crop rotation and livestock grazing, making agriculture more dependent on synthetic, petroleum-based fertilizers in place of natural manure for amending the soil. Furthermore, although arsenic- and lead-based pesticides had been used widely since the late 1800s, new pesticide formulations came on the market during the agricultural boom of the mid-20th century. These included methyl bromide, a fumigant approved in 1947 and once widely applied to soil and crops to kill insects and weeds; atrazine, a herbicide approved in 1959; and chlorpyrifos, an organophosphate pesticide approved for use in 1965. Since the 1960s, pesticide use in the United States has more than tripled. Despite bans on several toxic pesticides, such as the organochlorines, in the United States over the past several years, more than 1 billion pounds of agricultural pesticides are still purchased each year in the United States. Globally, pesticide use has also increased, and the types used, the amounts applied, and the regulations governing them vary regionally (4-6).

Since 1962, the Codex Alimentarius Commission (CAC), established jointly by the Food and Agriculture Organization (FAO) of the United Nations and the World Health Organization (WHO), has been responsible for developing standards, guidelines, and other recommendations on the quality and safety of food to protect the health of consumers and to ensure fair practices in the food trade. In the United States, various regulations exist to enhance food safety. Early actions of the U.S. Department of Agriculture (USDA) culminated in the passage of the 1906 Food and Drug Act, which helped increase food safety for the public. In 1910, the Insecticide Act established product-labeling provisions. The Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) of 1947 required registration of pesticide products with the USDA prior to domestic or foreign sale. The Federal Food, Drug, and Cosmetic Act, which evolved from the 1906 statute, was expanded in 1954 by the Miller Amendment, which established pesticide tolerances in or on agricultural commodities based primarily on good agricultural practices. The Delaney Clause of 1958 prohibited the use of any carcinogenic food additive in processed foods. Regulatory authority was subsequently enhanced by the creation of the U.S. Environmental Protection Agency (EPA) in 1970 and by a 1972 FIFRA amendment that required manufacturers to demonstrate that use of a product "would not cause adverse effects on human health or the environment" (7-9).

Recurrent outbreaks of foodborne and waterborne diseases have highlighted the importance of sustaining safe food and water supplies. In response to threats to food safety, the U.S. government and other entities have made several changes to the United States food safety regulatory structure. These include implementation in 1995 of the Pathogen Reduction: Hazard Analysis and Critical Control Point (HACCP) Systems Final Rule for Meat and Poultry by USDA's Food Safety and Inspection Service (FSIS); creation of FoodNet, a sentinel surveillance system for active collection of foodborne disease surveillance data; creation of PulseNet, a national molecular subtyping network for foodborne bacterial disease surveillance; and revisions to the Food Code and the National Primary Drinking Water Regulations (10-12).
