Advances in food testing

Laurence Castle reflects on the recent advances in food science and food testing analytical capabilities.

The Food and Environment Research Agency (Fera) in the UK is a centre of excellence for diagnostics and analytical science and a leader in the identification of new and previously uncharacterised threats.

New opportunities and challenges are now presented by recent advances in our analytical capabilities. I am thinking mainly about our ability to detect, identify and quantify small molecules (up to perhaps 1000 Daltons) as well as larger molecules such as proteins. These capabilities have come on in leaps and bounds, driven largely by instrument manufacturers working with academics and users. 

So, taking liquid chromatography coupled to high-resolution mass spectrometry as an example, labs now have the tools to detect and identify all sorts of chemicals in food and animal feed with a coverage and sensitivity that was almost unimaginable (or at best a pipe dream) just a few years ago. And that is exactly the opportunity – but also the challenge.

As analytical chemists we are finding more and more of less and less – more and more chemicals in the samples that we test and at lower and lower detection limits.

After scandals such as the economically motivated adulteration of food and feed with melamine, and horse meat passed off as beef, there is an increased emphasis on detecting food contamination and, even more so, a new emphasis on detecting food fraud using non-targeted profiling techniques.

As their name indicates, these analytical techniques aim to gain as much information about a sample as possible, taking a holistic look rather than testing for a limited list of targeted substances.

The profiling techniques that exist now and are under further development, such as chromatography coupled to mass spectrometry and nuclear magnetic resonance spectroscopy, help distinguish between ‘normal’ and ‘suspect’ samples – and they are indeed very powerful. They can help to identify whether a sample has been adulterated, or whether it is authentic with respect to geographic origin or production method, and they can check food quality in general. These profiling tools also come with the ability to drill down into the data to identify the chemical substance(s) that cause any deviation from normality. A prerequisite, of course, is knowing what constitutes a ‘normal’ sample. This can reveal information on the detailed chemical composition of food and feed that was hitherto unknown – and we need to be prepared for the unexpected.
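To make the idea concrete, the following is a minimal, illustrative sketch (not Fera’s actual workflow) of how a non-targeted screen might flag a ‘suspect’ sample: a set of known-good reference profiles defines ‘normal’, a new sample is compared feature by feature, and the most deviant features are reported for drilling down. All data, feature indices and cut-off values here are hypothetical.

```python
# Illustrative sketch only: flag a 'suspect' sample by comparing its
# (hypothetical) profiling features against a set of 'normal' reference
# profiles, then list the features driving the deviation.
import numpy as np

def screen_sample(reference_profiles, test_profile, z_cutoff=5.0, top_n=5):
    """reference_profiles: (n_samples, n_features) intensities from known-good samples.
    test_profile: (n_features,) intensities for the sample under test."""
    ref = np.asarray(reference_profiles, dtype=float)
    test = np.asarray(test_profile, dtype=float)
    centre = np.median(ref, axis=0)                             # robust 'normal' profile
    spread = np.median(np.abs(ref - centre), axis=0) * 1.4826   # MAD scaled to ~std dev
    spread[spread == 0] = np.finfo(float).eps                   # avoid division by zero
    z = (test - centre) / spread                                # deviation per feature
    flagged = np.where(np.abs(z) > z_cutoff)[0]
    drill_down = sorted(flagged, key=lambda i: -abs(z[i]))[:top_n]
    return ("suspect" if flagged.size else "normal"), [(int(i), float(z[i])) for i in drill_down]

# Example with synthetic data: 50 'normal' samples, 200 features, and a
# test sample spiked on feature 42 (standing in for an unexpected adulterant signal).
rng = np.random.default_rng(0)
normals = rng.normal(100, 5, size=(50, 200))
test = rng.normal(100, 5, size=200)
test[42] += 60
verdict, top_features = screen_sample(normals, test)
print(verdict, top_features)   # 'suspect', with feature 42 ranked first
```

The point of the sketch is the workflow, not the statistics: the reference set defines normality, the comparison flags deviation, and the ranked features are where the analyst starts drilling down.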

The issue of detection limits has already been faced. As analytical sensitivity increases, the default position of zero tolerance – ‘should not be detected at the detection limit of the analytical method’ – becomes untenable for prohibited additives, contaminants and residues of food production chemicals. If we can (now, or in the future) detect ‘just a few molecules’, then we will detect everything in everything – rather simplistically, all chemicals in all samples. In the majority of cases, the presence of a few parts-per-billion or parts-per-trillion of a chemical would present no real concern with regard to adulteration or consumer safety, and so a non-zero threshold limit must be accepted for pragmatic reasons.
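A simple sketch of what that pragmatic position means in practice is given below: the reporting decision is made against an agreed action threshold rather than against the ever-shrinking limit of detection. The numbers are hypothetical and are not regulatory limits.

```python
# Illustrative sketch only: a reporting rule that replaces strict zero
# tolerance ('not detected at the limit of detection') with an agreed,
# non-zero action threshold. All values are hypothetical, in micrograms per kg.

def assess_result(measured_ug_per_kg, lod_ug_per_kg, action_threshold_ug_per_kg):
    """Return a decision for one analyte in one sample."""
    if measured_ug_per_kg < lod_ug_per_kg:
        return "not detected (below LOD)"
    if measured_ug_per_kg < action_threshold_ug_per_kg:
        # Detected only because the instrument is now sensitive enough:
        # present at trace level, but below the agreed point of concern.
        return "detected, below action threshold - no action"
    return "detected, above action threshold - investigate"

# A trace finding of 0.5 µg/kg, with an LOD of 0.1 µg/kg and an agreed
# action threshold of 10 µg/kg, is reportable but not actionable under this rule.
print(assess_result(0.5, 0.1, 10.0))
```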

Similarly, in terms of analytical coverage, we will surely find that even normal food samples contain chemicals that we do not yet know about. The best recent example is acrylamide. Acrylamide may be a ‘normal’ product of some cooking processes, but being normal – and therefore present in all of the affected food types – is not (and was not) a cause or even an excuse for complacency and inaction. Another example could be as-yet unknown bioactive substances present in plant products such as fruits, vegetables, herbs and spices. Some may be beneficial, some may be detrimental. Some may be both, depending on the intake, since ‘the dose makes the poison’. As we apply profiling techniques more widely, we will indeed find more and more of less and less. Many analytical instruments, such as LC-MS, GC-MS and NMR, will give ‘straight out of the box’ data and information on a sample. But we can agree with Stoll: ‘Data is not information, information is not knowledge, knowledge is not understanding’.

The findings from the analytical lab must eventually lead to risk management or quality management decisions, and vice versa: such decisions should initiate and guide the analytical work to be conducted. For food safety and risk, analytical thresholds can be entirely consistent with modern risk assessment principles, such as thresholds of toxicological concern.
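As a rough illustration of how such a threshold links a measured concentration to a risk-based decision, the sketch below turns a concentration into a crude exposure estimate and compares it with a threshold of toxicological concern (TTC). The TTC figures used are the commonly cited Cramer-class values, quoted here purely for illustration; a real assessment would follow the relevant guidance in full.

```python
# Illustrative sketch only: translate a measured concentration into an
# exposure estimate and compare it with a threshold of toxicological
# concern (TTC). The TTC values (µg per kg body weight per day) are the
# commonly cited Cramer-class figures, used here for illustration only.

TTC_UG_PER_KG_BW_DAY = {"Cramer I": 30.0, "Cramer II": 9.0, "Cramer III": 1.5}

def exposure_vs_ttc(conc_ug_per_kg_food, intake_kg_food_per_day,
                    body_weight_kg, cramer_class):
    """Crude point estimate of daily exposure versus the TTC for the class."""
    exposure = conc_ug_per_kg_food * intake_kg_food_per_day / body_weight_kg
    ttc = TTC_UG_PER_KG_BW_DAY[cramer_class]
    verdict = "below TTC" if exposure < ttc else "above TTC - refine the assessment"
    return exposure, ttc, verdict

# A few parts-per-billion (5 µg/kg) in a food eaten at 0.2 kg/day by a
# 60 kg adult gives roughly 0.017 µg/kg bw/day, below even the Class III TTC.
print(exposure_vs_ttc(5.0, 0.2, 60.0, "Cramer III"))
```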

Quality management decisions, by contrast, rest on the contract between the supplier and the customer. Both sides must agree on what exactly constitutes the principle that food should be of the nature and quality expected by the consumer: it should be safe, wholesome, nutritious and authentic – not subject to adulteration or fraud.

When does ‘a few molecules’, a practically zero but not actually zero concentration, become unacceptable? We do need to lift the stone and look underneath, or open Pandora’s box (to mix metaphors), but we also need to be prepared to interpret, understand and act upon what is found. These are the opportunities – and particularly the challenges – offered by our advances in analytical chemistry.

Laurence Castle is a principal scientist at The Food and Environment Research Agency (Fera) in York, UK. 
