
The history of food quality control

Ever since microorganisms were discovered in our environment and linked to typhoid fever and other diseases that have plagued humanity, public health authorities have been concerned with the accumulation of filth and foul odours in urban areas. The first inspection systems, based on sensory evaluation, were legally enforced at the beginning of the 20th century. Bacteriological techniques to detect pathogenic bacteria in foods, such as shellfish, appeared soon after.

From that point on, the food and beverage industry has applied stricter product inspection procedures and increasingly effective production methods to preserve the freshness of natural raw materials. Today, the establishment of good manufacturing practices (GMP) and good hygienic practices (GHP) in many countries has significantly reduced the risk of spoilage and pathogenic microorganisms in modern food products. In addition to complying with national and international food regulations, food manufacturers typically follow international quality standards, such as ISO standards and the Hazard Analysis Critical Control Point (HACCP) system.

In recent years, there has been an increasing focus on traceability in food production, following public concerns arising from various food contamination scares and the development of foods containing ingredients derived from genetically modified (GM) crops. The Codex Committee on General Principles is to examine the role of traceability as a potential risk management tool for public health purposes, and the EU Commission is also leading the way in establishing basic standards and guidelines (e.g. Regulation (EC) No 178/2002). To some extent, the principles involved are similar to those already employed in quality management systems.
