Analysis of the North American Olive Oil Association’s (NAOOA) Report “Random Testing of Store-bought California Extra Virgin Olive Oils: 67% Fail New Olive Oil Commission of California (OOCC) Standards”
By Alexandra Kicenik Devarenne
The North American Olive Oil Association (NAOOA), the olive oil importers’ trade association in North America, did off-the-shelf testing of eighteen California extra virgin olive oils to check their compliance with the new Olive Oil Commission of California’s (OOCC) Grade and Labeling Standards, reporting that 67% failed to meet the new standards. Although a lack of information makes full interpretation of the test data impossible, an unofficial analysis by an industry consultant, writer and researcher reveals some significant errors and omissions.
- Rather than proving that the OOCC standards do not work, as stated by the NAOOA report, the test results show that the standards do work, providing good information about the product as it reaches the consumer and highlighting areas for improvement
- The report repeatedly states that the OOCC standards have eliminated certain purity criteria that detect adulteration. This is a distortion of the facts: all of the purity parameters that are present in the USDA olive oil standards apply to all California-produced olive oil, and have since 2009. Blacked-out columns in the NAOOA tables for these parameters are sensational and misleading, since values do exist for those tests
- The reason for the OOCC seeking to alter the limits for some of these defining parameters is not “loosening purity criteria,” it is to accommodate the natural variability of olive oil made outside the traditional olive oil-producing regions
- The samples were tested for chemical compliance with IOC standards but two IOC quality tests were inexplicably omitted: sensory analysis and fatty acid ethyl esters
- The samples were tested for PPP and DAGs (tests included in the OOCC standards), but the NAOOA only reports that the Italian lab was IOC-accredited; IOC accreditation does not include proficiency in those tests, so the reliability of the results is unknown
- Testing for PPP and DAGs is not “rejected by world experts.” These tests have been in use for a decade by retailers in Northern Europe and are used extensively by olive oil traders to determine the quality and age of oil they are buying
This article is not an official response from the OOCC or any other organization. It is an analysis done by an individual, a long-time California olive oil industry observer, with financial support from members of the industry.
The most recent communication regarding California’s new olive oil standards from the North American Olive Oil Association, the trade association that represents primarily big-brand olive oil importers, is a report entitled “Random Testing of Store-bought California Extra Virgin Olive Oils: 67% Fail New Olive Oil Commission of California (OOCC) Standards.” Dated May 2015, the report draws heavily on an NAOOA-funded report called “The Olive Oil Commission of California’s 2014 Grade and Labeling Standards: Analysis and Implications” by Islam A. Siddiqui, dated April 7, 2015.
The NAOOA collected sixteen California olive oils in mid-January 2015 from unspecified stores in California and two samples in March in New Jersey. These samples represented sixteen different brands and eighteen unique SKUs. They report that the samples were properly stored under cool, dark conditions before being transferred to 500 ml unbranded dark glass jars for shipping to an IOC-certified lab in Italy for (blind) testing. The samples were analyzed for the chemical parameters in the IOC standards as well as the PPP (pyropheophytin) and DAGs (diacylglycerol) measures included in the new OOCC standards.
An analysis of this report falls into two parts. One of these, a complete analysis of the results of the NAOOA’s testing, is not possible without additional information. Although the report states that the lab used is IOC-accredited, which indicates competence in olive oil testing, it does not state which methods were used in the testing. Since there are numerous testing methods, it is important to know whether official methods were used. Also, the IOC proficiency certification does not include the ISO methods for DAGs and PPP testing.
The test results are coded, and state any harvest or Best Before dates included on the labels. They do not, however, provide the brand names or lot codes. This makes any further investigation to better interpret the results impossible. The NAOOA states that it has sent the results to the producers, but the OOCC has not reported receiving them.
Puzzling is the omission of an important quality parameter that is a crucial part of the IOC standards for virgin olive oil: sensory analysis. The OOCC and the IOC standards contain the same sensory parameters for extra virgin olive oil: a median of defects equal to zero and a median of fruitiness greater than zero. In other words, the oil must have no flavor defects and must taste of ripe and/or green olives. Also present in the IOC standards, but inexplicably absent from the NAOOA testing, is the test for fatty acid ethyl esters.
Off-the-shelf testing provides valuable information for everyone in the supply chain from producer to distributor to retailer. Properly interpreting test results is a complex and revealing process and there is surely good product knowledge to be gleaned from a complete analysis of these test data and follow-up investigation of the possible causes. In order to do this, more information is needed.
There are, however, some errors in the report that do not hinge on the results of the testing.
Errata in the NAOOA Testing Report: Elimination of the IOC Adulteration Standards
“Elimination of the IOC adulteration standards in the OOCC standards appears to have been unwarranted—all eighteen of the California samples met the IOC standards the OOCC claims are difficult to meet—unnecessarily exposing consumers to the risk of adulterated olive oils.”
This statement is totally misleading, as are the tables included in the report that show blacked out columns labeled “OOCC – N/A” for some of the purity criteria. As stated in the NAOOA’s own analysis of the OOCC standards[i]:
In 2008, the California Legislature passed SB 634 led by Senator Pat Wiggins, which was signed into law by Governor Schwarzenegger on September 30 and became effective January 1, 2009. It established grade and labeling standards for olive oils, as defined under the California Health and Safety Code (CHSC): Olive Oil Grades, Manufacturing and Marketing: Division 104, Part 6, Chapter 9, Sections 112875-112880. (Siddiqui, p.2)
It [the CDFA] also made several changes in the… purity parameters requirements of the proposed grading and labeling standards recommended by the OOCC: …. (b) Several purity parameters were deleted from Tables 2, 3 and 4 of the proposed standard, as they were less stringent than the current requirements under the CHSC. (Siddiqui, p.6)
As correctly stated in the above NAOOA-commissioned analysis, all olive oil produced in California is covered by the CHSC, including the olive oil that falls under the new OOCC standard. It is inaccurate to state, as the NAOOA does in its press release of May 27, 2015, that “the OOCC standards eliminate certain chemical analyses necessary to detect adulteration. This testing does not ensure the authenticity and quality of olive oil consumers deserve.” The OOCC standards eliminate nothing that is included in the CHSC standards for olive oil; they add to them.
It is also misleading to say that California proposes “loosening” purity criteria. This is a distortion of the facts.
It is true that the OOCC standard proposed different limits for particular fatty acids and sterols. This is not an attempt to allow adulteration with seed oils; it is aimed at ensuring that all the genuine olive oils produced in California will meet the legal definition of olive oil. This is why:
Olive oil is composed of a mixture of fatty acids—monounsaturated, polyunsaturated and saturated. The proportions of these fatty acids vary based on the olive variety, the climate and the maturity of the fruit at harvest. The permitted range of variation in these fatty acids was set by the IOC, based on the typical levels in olive oil made from traditional varieties grown in the traditional Mediterranean olive oil-producing areas. As olive oil production has spread around the globe, resulting in new combinations of olive variety and climate, global producers have seen their genuine olive oil fall outside the IOC parameters for fatty acids such as oleic, linolenic or palmitic. In other words, it’s not officially “olive oil” even though it obviously is.
According to Islam A. Siddiqui’s report, “With few exceptions, the revised U.S. standards for grades of olive oil [adopted in 2010] are in line with the IOC standards for quality and purity of olive oil.” Why those “few exceptions” in the USDA standards, those deviations from the IOC standards? Precisely because of the natural variability in olive oil composition cited above. The USDA has different values than the IOC for some of the fatty acids to better accommodate the range of our domestic olive oils. And since that time, as olive oil production has spread into more extreme climates such as the far southern reaches of California, genuine olive oils made in these regions have fallen outside the IOC definition of “olive oil” because of their natural composition. Exactly the same thing happens with some of the minor components such as sterols.
The California olive oil industry is not alone in this fight to adopt purity criteria that encompass the entire range of olive oil production. Producers in Argentina, the state of Georgia, Israel, Australia, Tunisia, even parts of Spain and Southern Italy sometimes struggle to meet the IOC’s limits for certain sterols and fatty acids. The CDFA rejected the new levels proposed by the OOCC—resulting in the gaps in the OOCC tables where the existing CHSC parameters apply—because they wanted to see more data from California olive oils to support the changes in the limits. The OOCC is working with producers and researchers to gather more of these supporting data.
The argument that these changes in the fatty acid and sterol levels will permit adulteration is specious.
Errata in the NAOOA Testing Report: New Tests Rejected by World Experts
The NAOOA report states that the new OOCC standards “include two new test measures known as PPP and DAGs which have been rejected as unreliable by world experts for industry use.” On page 4 of the report, the author(s) states:
The OOCC claims that PPP and DAGs indicate the freshness of the oil and the presence of refined olive oil. The claim is based on data and recommendations from the UC Davis Olive Center, the Australian Oils Research Laboratory, the Modern Olives Laboratory of Australia (and soon California), and the American Oil Chemists Society (AOCS). On the other hand, world experts and the IOC have examined PPP and DAGs and rejected them for use in trade standards for two reasons – first, research showed a large number of false positives when testing the oils for freshness and second, there is no way of knowing if the results are due to poor quality olive oil or simply to storage conditions. There is no scientific evidence that PPP or DAGs directly detect refined olive oil mixtures. [ii]
This implication that second-rate or insider data were presented to support the value of PPP and DAGs testing is spurious. It is also simply wrong to say that world experts have rejected the tests for industry use, although it is true that the IOC has still not included them in the IOC standards. The reality of PPP and DAGs testing is quite a different story.
Far from being fringe measures and untested technology, the use of PPP and DAGs to assess oil quality and age goes back to the mid-2000s in Germany. The German Society for Fat Science issued a statement after a workshop in 2005 describing the testing of the methods and supporting the use of PPP and DAGs to measure olive oil quality, age and thermal treatment (as in refining). The tests were adopted as part of the standard quality testing required by the large retailers in Germany. The value of these tests in showing the age of olive oil has made them a standard part of the testing done by many olive oil traders in Europe and around the world.
PPP and DAGs are also part of the Australian Standard® for Olive oils and olive-pomace oils, adopted in 2011, and have been extensively studied by both the Australian Oils Research Laboratory and the Modern Olives Laboratory near Melbourne. Both of these labs hold multiple accreditations, including ISO 17025, IOC and AOCS. It is also worth noting that although it is called “American” Oil Chemists’ Society, AOCS is a century-old international body that, among other things, developed numerous testing methods that are cited as official methods in global olive oil standards such as Codex Alimentarius.
In June of 2013, the European Commission convened a workshop on olive oil authentication at the IOC headquarters in Madrid, specifically to address: “In particular: i) the blend of extra virgin olive oil (EVOO) or virgin olive oil (VOO) with soft deodorized OO or ii) with other adulterant oils; iii) the evaluation of quality parameters related to ‘freshness’.”[iii]
Deodorized or soft-refined oil is oil that has been refined by a “gentler” method than traditional refining, and so shows less impact on the traditional chemical parameters that are altered during conventional refining.
Among the findings presented at this EC workshop was a report from the person in charge of coordination of inspections of fraud for the Spanish Ministry of Agriculture, Food and Environment regarding samples that were found to be non-compliant with the provisions of the EU standards. “He highlighted that the more common fraud cases can be detected by the currently available analytical methods, however frauds using deodorized oils are more challenging to detect.”[iii]
This group of scientists, meeting at the behest of the EC, at the IOC headquarters, delivered a series of presentations outlining the limitations of current methods and promising solutions. PPP and DAGs were part of this dialogue, which included debate about the uses of the tests. Although resistance can be found—particularly in IOC satellite organizations—it is interesting that the IOC has a provisional official method for determining DAGs.
The value of these parameters for flagging the presence of deodorized oil and indicating the age of an olive oil should not be dismissed because they can also be affected by storage conditions. This is true of many of the olive oil quality parameters, including ones that are part of the official IOC standards. Test results must always be looked at in combination, including the results of sensory analysis, in order to get a full picture of the product.
A peer-reviewed paper, “Pyropheophytin a and 1,2-Diacyl-glycerols Over Time Under Different Storage Conditions in Natural Olive Oils,” published in the Journal of the American Oil Chemists’ Society, showed that PPP and DAGs were good indicators of olive oil quality and freshness and that they also highlighted problems during storage. Research from the Australian Oils Research Laboratory (AORL) has confirmed this. These parameters were not affected by the olive cultivar or the growing region, and showed good correlation with some of the other quality tests such as UV and sensory.
These findings point to the obvious question: why should olive oil standards shy away from tests that can provide good information about the quality of an olive oil when it reaches the consumer? If an olive oil has suffered because of improper storage, transport or display, that is important information—information that is provided by the use of the best testing we have available. And like all science, it is continually being updated.
The NAOOA’s off-the-shelf testing of California olive oil revealed some issues, but not the ones that the NAOOA has focused on. For example, there is the confusion about harvest dating. The OOCC standard originally contained a formula for the (optional) harvest date stating that olives harvested during the fall-winter season—olive oil in California can be made from September to March—would be dated by the latter of the two years, the year the oil would be released. Therefore, oil made in October of the 2013-14 harvest season would be considered “Harvest Date 2014.”
This has since been changed by the commission board: moving forward, the harvest date will reflect the year the olives were grown, or the earlier of the two years; e.g., the 2015-16 season will be called Harvest Date 2015. The new rules, however, do not go into effect until the 2015-16 harvest. This makes the harvest dates listed on the NAOOA’s test results confusing; in reality it is unlikely that any of the 2014-15 harvest oil was on the shelf in January.
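The old and new labeling rules described above can be sketched as a small, purely illustrative helper (the function name and its parameters are hypothetical, not part of any OOCC regulation text):

```python
def label_harvest_year(season_start_year: int, new_rule: bool) -> int:
    """Return the labeled 'Harvest Date' year for a fall-winter season that
    begins in season_start_year (e.g. 2013 for the 2013-14 season).

    Original OOCC rule: label with the later year (the release year).
    Revised rule (from the 2015-16 season on): label with the earlier year,
    the year the olives were grown.
    """
    return season_start_year if new_rule else season_start_year + 1


# Oil milled in October of the 2013-14 season, under the original rule:
print(label_harvest_year(2013, new_rule=False))  # → 2014
# The 2015-16 season under the revised rule:
print(label_harvest_year(2015, new_rule=True))   # → 2015
```

This makes it easy to see why labels from the two regimes can look a year apart for oil of the same age.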
There is also a lot of confusion about when the standards take effect. In the order establishing the CDFA standards, it states that “each handler in California who processes olives grown in California into a minimum of 5,000 gallons of olive oil during the marketing season, beginning July 1, 2014 and continuing through June 30, 2015, shall comply with the provisions of the California Grade and Labeling Standards for Olive Oil, Refined-Olive Oil and Olive-Pomace Oil. Said handlers shall meet the grade and labeling requirements set forth…”[iv]
In the Preface of Appendix “A” of the CDFA standards, however, it outlines some limitations including the lack of accredited local labs and says “… a period of transition is required. This testing appendix is thus a beginning only and will develop as resources allow. The Commission is committed to full implementation for the 2016 fiscal year.”[v]
This uncertainty over the implementation timeline was correctly highlighted in the Siddiqui report. What is not in question, however, is whether the quality and purity provisions of the CHSC still apply to all olive oil produced in California: the CHSC applies. Are California producers of over 5,000 gallons expected to additionally meet the more stringent CDFA requirements with olive oil produced after the September 26, 2014 implementation date? It seems so, but this is not entirely clear because of the language contained in Appendix A and clarification will need to come from the OOCC and CDFA.
The Test Results: A First Impression
As previously mentioned, it is impossible to make any sort of full analysis of the NAOOA testing results without additional information, but an initial look at the results actually reveals a lot. Two particularly interesting results are samples 1269 and 1277. The NAOOA testing report says:
Only one of the samples (1269) failed a purity/adulteration measure (for stigmastadienes, which suggests contamination with refined oil) in addition to failing IOC and OOCC quality limits. However, that same sample passed the DAGs test which OOCC claims can detect adulteration.[vi]
First, the statement “the DAGs test which OOCC claims can detect adulteration” is not completely accurate. PPP and DAGs are valuable in combination with each other and with other tests. To say that an oil failing PPP and passing DAGs is proof of a failure of the standards is wrong. Even more revealing is sample 1277, which would have passed IOC standards but under the OOCC standard would have been flagged as having issues—definitely temperature trauma at some point, and potentially the presence of soft-refined olive oil, if this lab’s PPP testing was accurate. The standard IOC tests would have missed this and passed this oil as “extra virgin” without any problem.
PPP and DAGs both change in a predictable way during the natural aging process. PPP also rises when oil is subjected to high heat or light. Additional information about sample 1277 would be needed to do the detective work needed to figure out what happened: was it in clear glass and displayed in a window or was it apparently a new dark bottle, just pulled out of the case? What were the results of the sensory analysis? If it was a fresh bottle in a high turnover location there is a high probability that there is some soft refined olive oil in that bottle. The UV and FFA are suspiciously low for an oil showing such a high level of PPP. For reference, a properly stored EVOO will start with low PPP—close to zero—and increase at approximately 7% per year. To reach 49 this oil would have had to age at normal temperatures for about 7 years. A look at the UV K232—a measure of secondary oxidation—shows an oil that easily passes both IOC and OOCC K232 limits with 1.85. The free fatty acid is 0.21—very low and consistent with a fresh, high quality oil—or one that has been cut with deodorized oil (which has practically no free fatty acids and low UV absorbency as long as it’s fresh).
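The back-of-envelope arithmetic above can be sketched as a hypothetical helper. The near-zero starting PPP and the rise of roughly 7 percentage points per year are the figures stated in this article, not laboratory constants, and the function name is illustrative only:

```python
PPP_RISE_PER_YEAR = 7.0  # approx. percentage points per year (article's stated assumption)


def implied_age_years(ppp_percent: float, starting_ppp: float = 0.0) -> float:
    """Years of normal-temperature aging implied by a PPP reading,
    assuming a linear rise from a near-zero starting value."""
    return (ppp_percent - starting_ppp) / PPP_RISE_PER_YEAR


# Sample 1277's reported PPP of 49 implies roughly seven years of aging:
print(implied_age_years(49))  # → 7.0
```

A seven-year-old oil with fresh-looking UV and free fatty acid values is exactly the inconsistency that points toward heat/light abuse or a deodorized-oil admixture rather than ordinary aging.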
Rather than proving that the OOCC standards don’t work, the NAOOA report has done quite the opposite.
A full analysis of these test data—supplemented with additional information such as lot codes and more about the lab’s experience with PPP and DAGs testing—done by experts with experience in such “olive oil forensics” would be very illuminating. Ideally, retesting at a different lab to confirm anomalous results would be done. Science and secrecy are poor bedfellows. Only by gathering all the information and examining it from all angles is it possible to properly interpret the results and hopefully understand the story of those oils. If such an investigation reveals a problem after the product left the producer on its way to the consumer, that is important; only by learning the shortcomings can we improve the systems. Hopefully the NAOOA will share the additional information with the OOCC for the benefit of California’s consumers.
The OOCC standard aims to do something ambitious: to provide assurance to the consumer that they are getting a genuine quality product from California. The inclusion of the three-year transitional period for full implementation will allow the OOCC to work out the bugs in their processes. Among other things, the importance of an off-the-shelf verification program will no doubt be a topic of conversation for the commission in the future. By continually learning and improving, the OOCC can achieve its goal of building consumer confidence in healthy, flavorsome California olive oil.
Alexandra Kicenik Devarenne is an international olive oil consultant, researcher and writer based in California. A global advocate for quality extra virgin olive oil, she is the author of numerous academic and popular pieces about various aspects of olive oil including the pocket reference Olive Oil: A Field Guide.
[i] I. Siddiqui, The Olive Oil Commission of California’s 2014 Grade and Labeling Standards: Analysis and Implications, April 7, 2015.
[ii] Random Testing of Store-bought California Extra Virgin Olive Oils: 67% Fail New Olive Oil Commission of California (OOCC) Standards, NAOOA, May 2015.
[iii] European Commission Newsletter, Workshop on Olive Oil Authentication, Madrid, Spain, 10 & 11 June 2013.
[v] CDFA Grade and Labeling Standards for Olive Oil, Refined-Olive Oil and Olive-Pomace Oil, Appendix “A” – Sampling, Testing and Grading Methodology for Olive Oil, Refined-Olive Oil and Olive-Pomace Oil, September 2014.
[vi] Random Testing of Store-bought California Extra Virgin Olive Oils: 67% Fail New Olive Oil Commission of California (OOCC) Standards, NAOOA, May 2015.