
2-Minute Tip: 6 Ways Ingredients Communicate Value

by NaturPro in Uncategorized

Product development is an increasingly painful process, taking weeks and months to sort through and evaluate ingredients.

That’s because the evaluation process involves cutting through the marketing fluff and understanding (and communicating) the core value of your product.  This makes it a difficult and time-consuming task for your customers.

Why should your customer pick your product or ingredient over all the others?  Because you are able to communicate its value.

Effective customer education  is one great way to help customers navigate the pitfalls of the product development process, and keep your product top of mind.  The results often include higher customer conversion and less wasted activity.

 Here’s a 2-Minute Tip listing a few things to be sure to include in your customer education materials:

2-Minute Tip: Six Ways Ingredients Communicate Value

 

 

What do supplement testing and Star Wars have in common?

by NaturPro in Uncategorized

Star Wars and science fiction fans know that technology is a double-edged sword. On one hand, advances in science offer us fantastic powers to solve difficult problems (space travel, light sabers). On the other hand, the potential for catastrophe is also greater. With better technology comes a greater responsibility to prevent its misuse.

Early botanical scientists understood both the power and limitations of science to describe a complex natural world. Carl Linnaeus, who developed the original system to classify plants and animals, recognized that organisms are not necessarily discrete species, but exist on a continuous spectrum of life.

Five ways NaturPro helps to ensure scientific validity


Today, scientists in academia work to identify and quantify the diverse array of chemical constituents in botanical products, while industry works to ensure a safe, effective and consistent product. At our disposal are alphabet soups of various analytical technologies that offer increasingly better detection of constituents, even down to the picogram, which relative to a gram can be visualized as a drop of water in a thousand swimming pools.

But with picoscale resolution comes a lot of noise (a picogram is one part per trillion of a gram) and even more responsibility to reliably separate a signal from it. Even at the parts-per-million (ppm) level—equivalent to a cup of water in a swimming pool—we often observe unexplainable results that defy logic.
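
For readers who like to see the arithmetic, the parts-per notation reduces to plain fractions. Here is a minimal sketch in Python; the function names are ours, and the values are definitional rather than measured:

```python
def picograms_per_gram(pg):
    """Mass fraction represented by `pg` picograms in one gram of sample."""
    return pg * 1e-12  # 1 pg/g = one part per trillion

def as_ppm(fraction):
    """Express a mass fraction in parts per million."""
    return fraction * 1e6

# One picogram per gram is one part per trillion:
print(picograms_per_gram(1))  # → 1e-12
# ...which is a millionth of one ppm:
print(as_ppm(picograms_per_gram(1)))
```

In other words, moving from ppm detection to picogram detection adds six orders of magnitude of scale, and with it far more opportunities for noise.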

How our “UnLab” approach controls for shoddy methods and unexplainable results… 

For example, only today’s best and most expensive instruments can account for the matrix effects that occur when testing complex mixtures. One such instrument is multiple mass spectrometers linked to a chromatograph, LC/MS/MS (also known as tandem MS, meaning two mass spectrometers hooked to each other; the first MS removes much of the “junk” that would interfere with the result from the second). Complex mixtures are difficult to examine because they contain so many different compounds that the chances are relatively high that one of them appears at the same retention time (or peak) on the chromatogram as the compound a scientist is trying to quantify. Also, because the sample is injected into super-heated, high-pressure instruments, chemical reactions often create new interfering compounds. Matrix effects can falsely change results in significant ways that cannot be resolved without further work. Results should always be questioned and replicated, and ultimately, investment in method development is required to generate confidence.

Validation of matrix-specific methods across multiple laboratories addresses these challenges; however, few methods have been validated to the extent required to be confident in the results. An example from the nutrition field: the inherent challenges in the quantification of vitamin D (a pure compound and age-old vitamin, no less!).

Both the best and worst thing about good science is that with each answer comes another question. There is always more work to be done to achieve the greater goal: reproducible results. Needless to say, rigorous analysis of complex mixtures such as botanical products is often not straightforward. Unfortunately, the aims of science often oppose the aims of high-throughput lab testing.
How do you know whether a lab is focused on getting the right results? Here are some criteria to help decide whether or not to work with an independent laboratory:

  • Is it transparent? Does it share methods, chromatograms, observations, historical data and control charts?
  • Does it perform validation? Does it verify methods using appropriate controls such as calibration curves and spike recovery? What steps are taken when it initially sets up a method?
  • Does it have a process for dealing with out-of-specification results, and will it share that process? Does it have an internal recordkeeping system that tracks method precision and alerts them when a method or system is out of calibration?
  • Does it run internal control samples? Does it run samples in triplicate or duplicate at least, and does it report statistical analysis on the certificate of analysis (CoA), such as standard deviation from multiple runs?
  • How does it validate the purity of reference standards? When it gets a new batch of reference standard, does it run it against an internal control sample? How often does it make fresh reference standard solution?
  • Is it a proactive communicator? For example, how often does it advise on the best methods to use, and alert its customers to new developments in methods?
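
Two of the checks above, replicate precision and spike recovery, reduce to simple arithmetic. Here is an illustrative sketch in Python; the sample values are invented, and acceptance ranges are method- and matrix-dependent:

```python
from statistics import mean, stdev

def rsd_percent(replicates):
    """Relative standard deviation (%) across replicate runs of one sample."""
    return stdev(replicates) / mean(replicates) * 100

def spike_recovery_percent(spiked, unspiked, spike_added):
    """Percent of a known spike of reference standard recovered by the method."""
    return (spiked - unspiked) / spike_added * 100

runs = [98.2, 101.5, 99.8]                # triplicate assay results, mg/g
print(f"RSD: {rsd_percent(runs):.1f}%")   # → RSD: 1.7%

# The same sample re-run after spiking with 50 mg/g of reference standard:
print(f"recovery: {spike_recovery_percent(148.0, 99.8, 50.0):.0f}%")  # → recovery: 96%
```

A lab that reports these numbers on the CoA, rather than a single value, is showing its work.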

 

Not all testing needs to be expensive or high-tech, but every method needs to be rigorous enough to provide results that are reproducible in another lab. For example, thin layer chromatography (TLC) is not high-tech, but it can be a valid way to determine botanical identity, given the right mix of expertise, a rigorous and validated set of reference standards, and enough trial and error to develop the method and be confident that results are reproducible. High-performance liquid chromatography (HPLC) is great when the method has actually been validated and the reference standards have been certified for purity.

The true test of scientific validity is when multiple labs running different methods achieve the same result, especially when they are blinded as to the expected result.

Despite all of the challenges in quality control (QC) testing of botanicals, the world is changing, and our industry is rapidly improving. With scientific validity mandated by supplement GMPs (good manufacturing practices), and increasing demands for transparency and validity from all stakeholders, everyone is upping their game. Good science, not science fiction, provides reproducible results we can all be confident in.

Learn about reproducible results through our UnLab…


By: Blake Ebersole

This article appears with revisions, and was originally published in the March 2014 issue of Natural Products Insider.

Eight Steps to Developing Research Relationships

by NaturPro in Uncategorized

Developing relationships with scientists is much like developing any other relationship: the first step is understanding their challenges and needs. Sensitivity to the ways of the scientific research world, especially academia, is one of the best ways to get the most out of your research investment.

As for what else a supplement manufacturer needs to do:

Show an interest in the science. Like anyone, scientists can sense whether you’re more interested in doing great science or just in the marketing benefits that flow from it. Offer solutions that boost both scientific and business objectives. Add to the debate and question assumptions.

Try to discover something new. There are thousands of questions to be answered and thousands of different study designs. To be industry-relevant, adopt “standard” methods used widely—but allow some space for new discoveries. Also, test some new hypothesized bioactivity or clinical effect.  One-hundred percent “me-too” science just isn’t very interesting to scientists or consumers. Plus, new findings are more likely to go viral.

Decide on a budget and be realistic. Most research costs money, unless you can get into a study funded by someone like the NIH. But government funding is decreasing every year, while grant applications have multiplied exponentially. Performing strong research often requires expensive labor and materials, and the coordination of many different shared resources.

Offer unrestricted grants for basic research. Research seeking to understand mechanisms of action is often best developed step-by-step, making long-term planning difficult. Unrestricted grants that don’t guarantee a specific study plan allow you to support critical shared resources, and they prevent you from painting yourself into a corner at the beginning of your scientific journey.

Agree to milestones for projects, but anticipate delays. University-based, public-funded research requires the alignment of many parts, so some projects hit snags. Plan in advance to prevent potential troubles with approval, recruitment, testing, or finances. Add a “delay buffer” to your timeline for a more realistic expectation.

Decide whether to publish research results and, if so, where. Agree early on who owns the data and who has final decision on whether to publish results. Deciding this early on is a good idea because it sets the standard for the rigor of study design. It’s not necessary to always publish in a patent application or journal. Consider the fact that by publishing, you are likely helping both humankind and your competition. Decide which one outweighs the other.

Presentations at research conferences are sometimes a good idea because you can “publish” data that is somewhat peer-reviewed, and isn’t widely available to the public.

Scrutinize everything. Analyze all methods, data, and reports closely; question them to the best of your ability. Form an internal peer review panel of experts from related disciplines. Be sure to give yourself and others sufficient time to review and discuss revisions.

License technology. Many universities have inventions or start-ups that quietly clamor for attention and funding. Look for available technologies that are scalable and offer a new benefit for humankind.

By: Blake Ebersole

First published in Natural Products Insider, December 8, 2015

Keys for Meeting Supplement GMP Testing Requirements

by NaturPro in Uncategorized

A core concept across GMPs for many industries is scientific validity, and this is also one of the necessary requirements of the dietary supplement GMPs. For example, the purpose of an ingredient specification is to disclose scientifically valid methods and results for the tests, and these methods and results are used to verify the quality and identity of the material being sold.

Scientific validity means that tests must be suitable for what they are intended to measure. In a rapidly evolving industry, scientific validity is a core principle guiding our efforts to ascertain the identity, safety, and label claims of the material that millions of people take to support their health.


Here are some ways NaturPro helps to ensure scientific validity


Applying scientific principles to measurement means developing a foundation of confidence in test results, one that accumulates only through repeated testing of viable hypotheses. Along the way, we understand that, as with many scientific measurements, sources of error exist and tend to increase with complexity. For example, complex samples containing thousands of chemical constituents (e.g., botanical extracts) and instrument methods with many variables all contribute to our bank of “known unknowns” and “unknown unknowns.”

A result from any single method can amount to an educated guess, or even an answer to a different question than the one being asked, especially at labs that test a given matrix only sporadically or rely on a single type of test.

Today’s analytical technology for measuring analytes in complex mixtures is way ahead of the not-too-distant past, but now we understand a complicating factor: with greater power and resolution comes an increasing number of factors that may cause test results to be inaccurate or imprecise.

For example, it can be difficult to account for systematic error associated with dirty chromatography columns or non-optimal instrument conditions. Inaccurate purity data on reference standards (due to either inaccurate standard purity values, or unaccounted-for degradation during storage) are also common sources of error — when we are simply trying to figure out the “actual” composition of a material. Another source of error arises from the calculation of the results; for example, moisture can account for a certain amount of the measured weight of both samples and standards, and it is often simply estimated, even when it is accounted for.
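
As an illustration of the moisture point, assay results are often corrected to a dry-weight basis; here is a minimal sketch with hypothetical numbers:

```python
def dry_basis_assay(as_is_percent, moisture_percent):
    """Convert an as-is assay result to a dry-weight basis: the water in a
    sample adds weight but contains none of the analyte."""
    return as_is_percent / (1 - moisture_percent / 100)

# A sample assaying 9.0% as-is, with 10% moisture, is 10% on a dry basis:
print(round(dry_basis_assay(9.0, 10.0), 4))  # → 10.0
```

If the moisture figure is itself an estimate, the error propagates straight into the reported assay.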


What more do supplement testing and Star Wars have in common?


Other sources of error in testing can be chalked up to incomplete extraction and isolation during sample preparation.  The subject of dissolution is an interesting one. For example, it is a common assumption that when a sample “dissolves” during HPLC sample prep, it is fully “ionized” and thus not strongly bound to any solid particles (which otherwise get caught on the filter and never pass into the detector).

If both standard and sample dissolve to the same degree, no problem!  But (unknown unknown) error due to lower-than-expected percent recovery creeps in when your sample is prepared with heat and time, forming different compounds and binding differently to the protein-fat-and-sugar matrix of a biological product.  The analyte you are trying to extract into another phase often comes out far more easily from the pure, unbound chemical reference standard, leading to a difference in percent recovery.  So chemical reference standards are best complemented in testing with an additional control: the original, authentic botanical reference (yes, a whole plant part) taken from the same source as the raw material in question.  Sounds easy, but it actually isn’t for a lot of people.

Then compound the sample preparation challenges with the high heat and pressure applied by an analytical instrument like HPLC, where more chemical reactions can happen in the complex sample to degrade what you are measuring, all while your pure reference standard survives nicely to the detector. (Theoretically, this scenario can also happen the other way around, where the matrix stabilizes the analyte better than the standard solution under the HPLC conditions.)

Exciting stuff, all this mystery, which we eventually find answers to through validation and repeated testing.  While it’s difficult to predict analytical uncertainty, the point is to control it to the extent possible, hopefully to within 5-10% of your expected result, which is not bad compared to the 20% tolerance limit applied to pharmaceuticals.

The practical question facing suppliers and manufacturers is how to ensure a specification accounts for testing variance.  One solution commonly opted for in the short term is surprisingly simple: add the testing variance to the label or spec requirement, to ensure a high statistical probability that the material won’t fail due to the inherent imprecision of the test.

An imprecise test often means that manufacturers are forced to add an ‘overage’ of material, which essentially makes the material 10% more expensive for every 10% of variance in test results.
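
A rough sketch of that arithmetic, assuming the overage is sized to cover roughly two standard deviations of method variance (the sizing rule here is our illustration, not a regulatory formula):

```python
def overage_percent(method_rsd_percent, k=2.0):
    """Overage, as % of label claim, sized so a result k standard deviations
    below target still meets the claim (k=2 is roughly 95% one-sided)."""
    return k * method_rsd_percent

def cost_multiplier(overage_pct):
    """Extra active ingredient translates directly into material cost."""
    return 1 + overage_pct / 100

ov = overage_percent(5.0)   # a method with 5% RSD
print(ov)                   # → 10.0 (% overage)
print(cost_multiplier(ov))  # → 1.1 (10% more material cost)
```

Tightening the method’s RSD is usually cheaper, over time, than paying for overage on every batch.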

Scientific validity in QC testing for supplements is all too often discussed not on a daily basis, but only when the cost of “mistakes” has finally sunk in.  Many a product formulator has seen months of work go down the drain due to quality testing failures, and everyone involved in product development can testify to the measurable waste of time and resources that results from testing failures, which include both the approval of bad material and the rejection of good material.


Five ways NaturPro helps to ensure scientific validity


Here is a short list of some practices that QC units can perform to achieve scientific validity as per GMPs:

–Review your lab’s methods for their suitability for the intended purpose. There are good independent labs out there that will share method information, and answer your questions. Always ask whether the sample is being tested in triplicate and request to receive the individual values.

–Review the documentation on the reference standard, specifically the methods and results of the testing used to determine its purity. When was the standard made, when was its purity last tested, and how was it stored in between?

–Blind your sample so your lab does not know what value to expect.

–Test control samples (samples that do not contain the suspected analyte, OR samples that you previously sent to the same lab).

–Work with labs that can demonstrate having worked to some basic degree to optimize/validate the method.
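
As one concrete example of method optimization, most quantitative chromatography rests on a linear calibration curve fitted to reference-standard injections. Here is a minimal least-squares sketch; the standard concentrations and peak areas are invented:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Standard concentrations (µg/mL) vs. detector response (peak areas):
conc = [0, 10, 20, 40, 80]
area = [2, 105, 198, 402, 797]
slope, intercept = fit_line(conc, area)

# Back-calculate an unknown sample's concentration from its peak area:
sample_area = 300
sample_conc = (sample_area - intercept) / slope
print(f"{sample_conc:.1f} µg/mL")
```

A lab that shares its curve, its residuals, and the concentration range it was fitted over is demonstrating exactly the transparency the list above asks for.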

Sounds like costly work, but not so much when weighed against the potential costs of failure. With transparency among customer, supplier, and lab, a little teamwork goes a long way to reduce the costs and maximize the benefits of quality systems.

By: Blake Ebersole

This article was first published in Natural Products Insider in June 2013

 

Dose Delivery: Oil Into Water


The gut is by nature one of the best machines imaginable for chemical conversion of food into energy. As a result, the majority of what we consume is changed into something different with incredible efficiency. The stomach begins this process with the pH of battery acid, plus enzymes. Then, the intestines—25 feet long and filled with hungry bacteria and more enzymes— do the rest. Considering the environment, it is surprising that anything we consume actually absorbs into our bloodstream intact.

Bioavailability, often defined as the amount of a compound put into the body compared to the amount reaching blood circulation, is an unexpectedly complex subject. Luckily, the pharmaceutical scientific literature has given us some good tools to understand it adequately.
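
That definition is usually made concrete by comparing dose-normalized areas under the plasma concentration-time curve (AUC), a standard tool from the pharmaceutical literature. A minimal sketch, with invented curves:

```python
def auc_trapezoid(times, concs):
    """Area under a concentration-time curve by the trapezoid rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))

def absolute_bioavailability(auc_oral, dose_oral, auc_iv, dose_iv):
    """F: dose-normalized oral AUC relative to dose-normalized IV AUC."""
    return (auc_oral / dose_oral) / (auc_iv / dose_iv)

# Invented plasma curves (hours, ng/mL) after oral and IV dosing:
t = [0, 1, 2, 4, 8]
oral = auc_trapezoid(t, [0, 40, 60, 30, 5])    # oral dose: 100 mg
iv = auc_trapezoid(t, [100, 70, 50, 25, 6])    # IV dose: 50 mg
F = absolute_bioavailability(oral, 100, iv, 50)
print(f"F = {F:.0%}")  # → F = 41%
```

The IV arm is the reference because, by definition, 100% of an IV dose reaches circulation.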

In order to be bioavailable, a compound must first remain stable and retain its identity in the gut, which is no small feat. Assuming it stays intact, compounds can be divided into four main classes under the Biopharmaceutics Classification System (BCS). According to the BCS, there are only two chemical factors driving bioavailability: solubility and permeability. Water-soluble compounds like ascorbic acid are considered Class I (high solubility and permeability), the ideal scenario for bioavailability. However, many fat-soluble compounds such as curcumin are considered BCS Class IV, possessing low solubility and permeability. With the potential of curcumin and its more than 6,700 published studies, this means a lot of opportunity in the face of a lot of challenge.
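
The four BCS quadrants can be captured in a few lines; a sketch, with the example compounds following the article’s own assignments:

```python
def bcs_class(high_solubility, high_permeability):
    """Biopharmaceutics Classification System quadrant for a compound."""
    if high_solubility and high_permeability:
        return "I"    # e.g., ascorbic acid: the easy case
    if high_permeability:
        return "II"   # low solubility, high permeability
    if high_solubility:
        return "III"  # high solubility, low permeability
    return "IV"       # e.g., curcumin: the hard case

print(bcs_class(True, True))    # → I
print(bcs_class(False, False))  # → IV
```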

For Class I to III compounds, dose delivery is often fairly easy to solve. Optimizing these based on pH, particle size, solid dispersions, crystallization or salt forms can be enough to ensure adequate absorption. But what to do about the difficult Class IV compounds?

The water solubility of a compound is based on its chemical polarity, which depends on the electric charge of a compound (given by the presence of charged atoms like oxygen) and its asymmetry. Water (H2O) is polar, because it’s an asymmetric molecule containing mostly oxygen by weight. On the other hand, fats and oils, generally symmetric compounds with a lot of uncharged carbons, are nonpolar and not very water soluble.

The spectrum of polarity includes some compounds called amphiphilic (meaning both-loving) which are able to interact with both polar and nonpolar compounds. As a result, they are ideal to include in many dose-delivery systems. Because ‘like dissolves like’, amphiphiles can dissolve both fat- and water-soluble compounds; detergents like soap, which dissolves grease into water, are one everyday example of this type of compound.

The use of amphiphiles to improve bioavailability has already been perfected in our gut. High-purity phospholipids are the key components used by the small intestine to absorb dietary fat. Phospholipids are also a main part of cell membranes, whose critical function is to separate the cell from its environment while at the same time allowing both polar and nonpolar nutrients to pass through. In the past few years, science has harnessed these unique and interesting properties of phospholipids to better deliver active compounds to target tissues.

To date, thousands of studies have been published on improved bioavailability technologies such as solid-lipid particles, nanoparticles, micelles, liposomes, emulsions, microparticles and others. Phospholipids are one common factor among these technologies, which ultimately stabilize and solubilize compounds of a class IV nature. Curcumin, resveratrol and other fat-soluble compounds all clearly benefit from some of these advanced dose delivery systems.

Yet challenges remain. Judging from the literature and medical use, reliable human efficacy is achieved infrequently; the number of successful human studies using these advanced technologies pales in comparison to the number of successful test tube or animal studies. Why? From a dose delivery view, it is well understood that there is an ideal concentration of active in the body: too little is ineffective, while too much may be counterproductive. Striking that balance is a difficult task.

The human body and how exactly it works remains much of a mystery, still with vast areas of uncharted territory. And some big differences exist in how we absorb, metabolize and excrete what we consume, due to genetics, diet, what we consume it with and other variables. Plus, each active compound is an individual chemical entity with unique physicochemical characteristics and bioavailability. Hence, the need for good and rigorous science.

Also, our current capabilities and standards for measuring bioavailability are sometimes not relevant to efficacy. For example, bioavailability of fat-soluble compounds is generally measured in blood plasma, because plasma is mostly water-based and the best medium to measure water-soluble compounds. However, plasma data often does not reflect actual bioavailability of fat-soluble compounds, because plasma does not generally attract these compounds like blood cell membranes and organ tissues do. Many dose-delivery technologies work magnificently in the test tube, but may not survive stomach acid or the small intestine intact. And some delivery systems may appear to improve bioavailability but only for an inactive metabolite like a glucuronide.

In these cases, the chemical identity of what we measure in the blood, in addition to what part of the blood we are measuring, proves to be more important than the amount we are measuring. When the science of a bioavailability study is off-kilter, it can ultimately lead to the selection of poor clinical study material and a failure to show efficacy. Good science in the early stages of development is critical, and while oil and water can mix, dose delivery is ultimately just a means to promoting health.

By: Blake Ebersole

This article was originally published in Natural Products Insider in August 2014

 

Supplier Verification Key to New FDA Rules

by NaturPro in Quality

Long before the New York Attorney General made a move to DNA test mainstream supplement products (and well before the Dietary Supplement Health and Education Act of 1994 [DSHEA], 21 CFR 111 and now CFR 117), foods and supplements were prone to adulteration—both deliberate and unintentional.

In today’s modern supply chain, ingredients generally come in powder or liquid form, masking their true identity to the naked eye. They are shipped from afar—and grown and processed from even farther. Their travel across dusty roads and through busy ports is accompanied in many cases simply by a piece of paper that “certifies” their analyses.


Contact us about independent supplier auditing and verification with Supplier Verified


The adulterants of today are similar to those of yesterday, but more advanced. Undeclared fillers and analysis-interfering ingredients are today’s sawdust and snake oil. And even raw materials innocently misidentified or mistested can ruin all the good intention in the world.

“Caveat emptor,” say some suppliers, “I don’t confirm my supplier’s certificate of analysis (CoA) because that’s my supplier’s job.” Passing the buck doesn’t—or shouldn’t—work in today’s regulated industry. A money-back guarantee cannot erase the taint of inferior ingredients consumed by people daily, and in relatively large amounts for their health.

When the analysis certified by that piece of paper is a third- or fourth-hand document whose validity has not been independently verified or tested, a shred of trust quickly turns into a shroud of mystery: Who tested the material? How did they test it? Can we trust the results?

On another part of the supply-chain spectrum exist many upstanding, quality-invested companies that simply come from the old days and have not accumulated sufficient expertise or information about how their ingredients are made or tested. “Now we are made to actually test against the spec instead of trusting it,” they say. “Isn’t that enough?”

Not anymore. Many are damning the attorneys general for their lack of expertise about the uses and limitations of DNA testing of botanicals. But others believe that their actions, and the new FSMA (Food Safety Modernization Act)/CFR 117 rules serve not as a red herring, but a beneficial spotlight on some real issues: What tests are suitable for identity? How do we ensure identity and ingredient integrity is maintained?

Many of those who are committed to quality see the rainbow in the storm, and want to invest in doing the right thing as part of continuous improvement. They know transparency and traceability are not just buzzwords: the concepts actually mean something to CPG (consumer packaged good) marketers—and not just because they are starting to mean something to the end consumer.

The new requirements have a catch: maintaining and improving on the principles of transparency and traceability requires more than just creating product specifications and testing against them. Adopting these values to meet the new requirements will require investment in strong relationships and control of supply chains, and proper audits and qualification of suppliers. Trust but verify.


More about verification of specifications using Spec Verified


Now that many U.S. manufacturers are getting up to par on GMPs (good manufacturing practices), FDA appears to be (rightfully) focusing on suppliers of ingredients imported from foreign countries such as China and India. So, under the new CFR 117, supplier verification programs are required. Some say the new requirements, which also require verification of food safety principles like HACCP, are the missing link between the requirements of DSHEA and how to responsibly ensure food and supplement product safety and integrity.

Here’s one example to illustrate: many factories overseas performing extractions also process pharmaceutical actives, including cytotoxic drugs and other materials that can possess bioactivity at low concentrations. These materials, which could cross-contaminate a natural product brand’s material, are rarely listed on the specification. The reason stated for the absence of cross-contaminants on the spec is that they are unnecessary—because cleaning is performed between manufacturing runs. But how is the validation of the cleaning performed? Are all nooks and crannies of the production line cleaned? A weak cleaning validation or pre-production line clearance, lacking any actual limits or in-process testing for contaminants, serves as a key difference between a safe, legal product and an adulterated, potentially unsafe one.

Ingredient manufacturing facilities should perform cleaning validation, and should let their customers know of potentially toxic products that are made on the same production lines. This information isn’t found on the ingredient specification, but is part of a good supplier qualification program. If a brand doesn’t ask about cleaning validation or audit the facility’s production records, it would never know whether the cleaning was effective to prevent adulteration or a potential product liability issue.
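
For illustration, one common, simplified cleaning-carryover criterion caps residue of the previous product at a fixed ppm of the next batch. A sketch (the 10 ppm figure is a widely cited industry convention, not a universal limit, and potent actives warrant far stricter, health-based limits):

```python
def max_allowable_carryover_mg(next_batch_kg, limit_ppm=10.0):
    """Residue of the previous product (mg) permitted in the next batch
    under a simple ppm-based carryover criterion (ppm here = mg per kg)."""
    return next_batch_kg * limit_ppm

# A 500 kg batch at a 10 ppm criterion tolerates at most 5 g of carryover:
print(max_allowable_carryover_mg(500))  # → 5000.0
```

A cleaning validation without any such numeric limit, and without testing against it, is a procedure on paper rather than a control.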

“Trust but verify” is the central mantra of GMP and of good quality practices. For these reasons, and many others, many experts believe the new CFR 117 is a critical and important move toward improving product quality and integrity in dietary supplements. So, if the industry in 2015 is remembered by the word “identity,” 2016 may be known for the words “supplier qualification.”


Contact us about supplier verification with Supplier Verified


If a brand isn’t well-versed in these areas, help is available. New compliance programs offered by groups such as NSF International, IDDI and others work to close gaps by offering independent review and verification of supplier GMPs. Others focus on verifying ID and product specifications, based on oft-cited FDA inspection violations. New independent supplier verification services are available to assist clients with meeting the new rules.

Once, during a foreign-supplier qualification audit, I politely noted a potential risk to product safety that was due to the absence of a critical control point. After the facility supervisor’s response along the lines of “it’s not a big deal—we have never had a problem noted,” I acknowledged that may be the case. But now knowing the potential risk, would he let his children consume the product without fixing the issue? There was a pause in his response.

Likewise, we should all be willing to swallow the product whose safety and quality we are responsible for. But before taking the plunge, we should get to know our suppliers, and verify their quality practices.

By: Blake Ebersole

This article was first published by Natural Products Insider, December 2015.

Traceability: What’s the Point?

by NaturPro in Quality

Traceability is an industry buzzword, quality issue and regulatory requirement for products intended for human consumption, with the latest focus fueled by the adulteration of baby formula with melamine in 2008 and the subsequent signing of the Food Safety Modernization Act (FSMA). Quality issues that can be solved by traceability systems date back at least 2,000 years, to when Dioscorides developed methods to differentiate Balsamodendron (balsam), Commiphora (myrrh) and Boswellia (frankincense) gum resins. Likely thanks to early traceability and quality systems, the Magi were able to differentiate among these valuable materials with similar visual and sensory attributes:

Balthazar: Melchior, you say this material is frankincense, but how do you know for sure?

Melchior: This material has met the basic quality requirements of our time: it is easily flammable, with clear smoke and pleasant, characteristic fragrance. Also, my supplier harvested it from his Boswellia tree just this morning. This information is written on parchment with his signature.

Balthazar: Your argument is convincing, and you have established adequate traceability. Let us approve this material fit for its intended purpose.

Today, there is no single, universal definition or standard for traceability. Traceability systems are intended to track the flow of materials through the supply chain, and consist mainly of documentation plus the supporting legwork done to create and verify those documents. The ISO 9000:2000 guidelines define traceability as the “ability to trace the history, application or location of that which is under consideration.” In some food systems, traceability is maintained back to the farm or even the seed, while in others it is maintained back to a point in a manufacturing process intended to control a key quality attribute—microbial load, for example.

Traceability requirements depend on the type of product and the regulations of various countries. In Europe, documentation is required to identify suppliers of food ingredients. In the United States, FSMA requires the country of origin to be labeled. However, for the U.S. dietary supplement industry, ingredient traceability systems at reputable firms go beyond country of origin, requiring the name and address of ingredient manufacturing facilities, and product quality and traceability information that in many cases requires a chain of custody back to the farm.

In general, the requirements for traceability in the industry differ widely, depending on the type of product and market demands. For example, a small volume of conventional chamomile tea made only from dried chamomile flowers sourced from a single farm and sold at a single local market can have a very simple (yet effective) traceability system: “I know the farmer who grew and dried these flowers.”

On the other hand, coffee that is mass-marketed in high volumes for attributes like shade-grown and fair-trade will require a relatively complex and resource-intensive system, particularly as higher volumes are demanded. This is because of the multiple steps in the supply chain and the fact that many coffee farms are small—so a large number of farmers need to be supervised and documented, which can be a costly endeavor. However, coffee that is labeled simply as Arabica need not have any traceability other than that required to maintain product quality and safety.

Even within a traceability system, there are different strengths of supporting documentation. A statement from the raw material buyer that a fair price was paid to the harvesters serves as one layer of support, but solid verification of this claim may require an in-person audit of the system and periodic visits with the individual farmers and harvesters. How far does a brand want to go?

Ultimately, “full traceability” is difficult to achieve for agricultural products (until we can figure out a way to bar code each individual plant). So, the objectives and costs of an adequate traceability system depend on the nature of the product and market demands. Three main objectives for traceability systems include:

1. To effectively manage the supply chain: Supply chain management aims to determine the most efficient way to produce and procure products. Documentation of the products throughout the cycle from start to finish is key to understanding how they are made and how much they should cost.

2. To support marketing claims: Today’s discerning consumers demand many attributes that they cannot taste or otherwise perceive in the product itself. For example, dolphin-safe tuna can only be verified through supporting documentation; no analytical test exists to show that the tuna is dolphin safe.

3. To provide information for quality assurance systems or food safety investigations: When product quality issues occur, traceability documents are integral to track back to the root cause of the issue and correct it.

In the world of botanical ingredients, traceability may extend all the way to the farm, while for agro- or petrochemicals, it may extend back to the manufacturing level. Regardless of the level to which a material is traced, one of the key requirements for any system is based on the concept of segregation, which for the purpose of determining GMO (genetically modified organism) status is known as identity preservation. In proper systems, materials have discrete lot numbers and sizes, and are kept physically separate from other materials or inputs.
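The mechanics of lot-level segregation and trace-back can be sketched in a few lines of code. The following is a minimal illustration, not any specific traceability standard; all lot IDs and the record structure are hypothetical. Each lot records the input lots it was made from, so any finished lot can be traced upstream to its origin:

```python
# Minimal sketch of lot-level traceability: every lot keeps a list of the
# input lots that went into it, enabling trace-back to the point of origin.

def trace_back(lots, lot_id):
    """Return every upstream lot ID reachable from lot_id, in order found."""
    upstream = []
    for parent in lots[lot_id]["inputs"]:
        upstream.append(parent)
        upstream.extend(trace_back(lots, parent))
    return upstream

# Hypothetical supply chain: farm -> dried flowers -> finished tea lot
lots = {
    "FARM-001":  {"desc": "chamomile flowers, single farm", "inputs": []},
    "DRY-2015A": {"desc": "dried flowers",                  "inputs": ["FARM-001"]},
    "TEA-0042":  {"desc": "finished tea lot",               "inputs": ["DRY-2015A"]},
}

print(trace_back(lots, "TEA-0042"))
```

A real system adds the documentation layer described above (signatures, dates, chain-of-custody records) on top of this basic parent-lot structure, and identity preservation amounts to ensuring no lot ever lists a commingled or unknown input.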

In determining lot size, there is a balance that takes into account the level of “precision” required for a product. Too few lots for a given amount of material, and the amount of material may be too large to control and keep consistent; too many lots require legwork and testing that may be too costly. Ultimately, the customer sets the expectation for traceability and will value (and pay for) the benefits and peace of mind that it can offer.

By: Blake Ebersole

This article was originally published in Natural Products Insider in November 2014

How To Design a Clinical Trial


Designing and executing a clinical trial that meets scientific and marketing requirements can be a tall order. Many variables exist, and a meaningful clinical study result is often a moving target. Study design therefore requires significant expertise in the therapeutic area and an understanding of market dynamics.

Here are four main questions to ask.

1.) What are the rationale and central questions of the study? There are a number of questions that need answers, but in general, clinical studies should start with one or two central questions, plus a reason why the material should be studied. This drives the development of primary and secondary endpoints. Are you trying to see whether a nutritional product can improve joint pain in baby boomers, or muscle pain in athletes? The study design may be completely different for what appear to be very similar studies.

How the product will be perceived by the market post-study is also important. What study endpoints will allow for solid marketing claims? If your product has a significant effect in the study, will it help to differentiate your product against the leaders in the category? In the drug industry, studies on new products are compared against the “standard of care,” and supplement trials can take the same approach, particularly if the product is not well differentiated in other ways. Are there new mechanisms of action or emerging markers that can be added as secondary endpoints, which would help to differentiate your product?

Accumulation of data to support safety and global regulatory acceptance such as GRAS determinations should always be an objective, so any efficacy study is also a great opportunity to inexpensively accumulate safety data.

2.) What is the dose? Often, this is the most challenging and critical question across all drug and nutrition clinical studies. For many products that are complex mixtures of active compounds, pharmacokinetics or bioavailability is unknown or impractical to measure, making dosing a wild guess. In cases where there are only a couple of active compounds, bioavailability should be assessed before moving on to clinical efficacy trials.

In cases where bioavailability cannot be easily determined, a dose-response study (using multiple doses) should be performed. Ideally, a dose-response study observes a small effect at a lower dose, and a greater effect at a higher dose. In other cases, a linear dose-response relationship should not be assumed; a higher dose may not work as well (or reveal safety issues) compared to a lower dose.
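The dose-response relationship described above is often summarized with a simple saturating (Emax-style) curve: effect rises with dose toward a plateau. Here is a sketch of fitting such a curve with SciPy; the doses, response values, and the Emax model choice are illustrative assumptions, not data or methods from any study mentioned in this article:

```python
import numpy as np
from scipy.optimize import curve_fit

def emax_model(dose, emax, ed50):
    """Saturating dose-response: effect approaches emax; ed50 is the
    dose producing half the maximal effect."""
    return emax * dose / (ed50 + dose)

# Hypothetical study data: mean effect observed at four daily doses (mg/day)
doses = np.array([50.0, 100.0, 200.0, 400.0])
effects = np.array([20.0, 33.0, 50.0, 66.0])

params, _ = curve_fit(emax_model, doses, effects, p0=[100.0, 100.0])
emax, ed50 = params
print(f"Estimated Emax = {emax:.1f}, ED50 = {ed50:.1f} mg/day")
```

A fit like this makes the design point concrete: if the highest tested dose sits well below the estimated plateau, a higher arm may be warranted; if responses flatten or decline at the top dose, a linear dose-response assumption would have been misleading.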

Market considerations, such as cost per day and number of capsules should also be included in this evaluation. While a randomized, placebo-controlled clinical trial is wonderful to have, if the product never reaches the shelf (or the dose is too high for the consumer to stomach) then the best-designed study is like a tree falling in the woods.

3.) How many subjects are needed for the study to be adequately powered? A minimum requirement today for nutritional products is that the changes in the group taking the active dose must be significantly different than the changes in the placebo or control group. It makes no sense to design and invest in a study that will show no difference between your product and a sugar pill. For some subjective measures such as pain, the placebo effect and inter-individual variation can be very high, due to the subjective and ever-changing nature of pain perception. In this case, the number of subjects required to get reliable statistical separation between the active versus control groups is relatively high. For other endpoints, such as blood concentrations of actives in pharmacokinetic studies, placebo effects are almost nil, and therefore a lower ‘n’ is likely to result in significant changes versus controls.
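The relationship between expected effect size, variability, and the required ‘n’ can be estimated with a standard power calculation. Below is a sketch using the normal-approximation formula for a two-group comparison; the specific effect sizes (0.3 and 0.8, i.e., Cohen’s d) are arbitrary examples standing in for a noisy subjective endpoint and a clean objective one, not recommendations:

```python
import math
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate subjects per group for a two-sample comparison using
    the normal approximation. effect_size is Cohen's d: the expected
    mean difference divided by the common standard deviation."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value, two-sided test
    z_beta = norm.ppf(power)            # quantile for desired power
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# A high-variability subjective endpoint (small d) needs far more subjects
# than a low-noise objective marker (large d).
print(n_per_group(0.3))   # e.g., subjective pain score with placebo effect
print(n_per_group(0.8))   # e.g., blood concentration in a PK study
```

The small-effect scenario requires roughly seven times the subjects of the large-effect one, which is exactly why subjective endpoints such as pain drive up study cost so quickly. (Exact t-based calculations, as in statsmodels’ power functions, give answers one or two subjects higher.)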

4.) What is the budget and timeline? Research is an investment, one that can be expensive and time-consuming. For example, if the therapeutic area and endpoints include the testing of blood markers, then the drawing, processing and testing of blood samples is a major cost center in the research budget. Common blood markers such as blood lipids are relatively easy to measure using standard kits, while less common markers can require method development, increase costs, and may provide unreliable data that needs to be repeated.

A university-based study offers the independence and clout of world-class clinical research, but the prestige must be balanced against increased costs and more uncertainty in the timeline, particularly when your study is relatively small and relies on shared resources. While a contract research organization is often faster than a university, this option can also come with greater costs. A research services contract with a detailed protocol and time-based milestones is critical to have in place.

Ethical approval (typically through an Institutional Review Board, or IRB) is also required for all human studies. Some research centers can get IRB approval within a month, while others are mired in bureaucracy and generally take six months or more.

Recruiting also contributes to the study timeline. If you exclude a lot of lifestyle factors, then your available population is small, and getting the required number of subjects can be costly if not impossible. Many clinical studies never get off the ground because recruiting was not taken into account.

Lastly, it is critical to do the homework up front and ask a lot of questions. Make sure you have someone in your corner, who speaks the language and is looking out for your best interests. Only then can you ensure the returns on your research investment are maximized.

By: Blake Ebersole

This article was previously published in Natural Products Insider, June 2015.

The Way to My Heart? Through My Stomach…


Heart health, gut microbiota and diet are closely linked in ways we are just beginning to understand. It is well-known that diet can alter microflora balance and tip the scales toward a pro-inflammatory status affecting heart health, but new research has uncovered other interesting links between gut and heart health. A 2015 study published in Metabolism found women with and without metabolic syndrome who produced equol, a metabolite generated by gut bacteria from soy isoflavones, enjoyed cardiovascular benefits from consuming soy nuts.1 However, non-equol producers experienced no improvement. This suggests the possibility that in order to enjoy the cardiovascular benefits from soy, a certain balance or type of gut bacteria is required.

Many nutritional interventions appear to work regardless of gut microbiota. A 2015 randomized, controlled clinical trial published in the journal Hypertension by a university group in London found that the primary active constituents of beet root are dietary nitrates.2 In this study, 250 ml of beet root juice (compared to a placebo of nitrate-free beet juice) reliably lowered blood pressure in hypertensive patients, as well as improved endothelial function by 20 percent (p<0.001). Remarked the authors, “This is the first evidence of durable BP reduction with dietary nitrate supplementation in a relevant patient group.”

But juicers might want to keep the fiber. A study in the American Journal of Clinical Nutrition following 7,216 men and women for eight years found baseline consumption of fruits and fiber was associated with a significantly lower death rate, and those consuming the highest level of fruits (>210 g/d) had a 41-percent lower risk of mortality, which was mainly associated with cardiovascular disease.3

Questions around the cardioprotective effects of whole grains continue. The Dietary Guidelines for Americans recommends that at least half of our grain consumption come from whole grains, but study findings tend to be inconsistent. In a well-designed controlled crossover study in the Journal of Nutrition, co-authored by researchers from Nestlé and General Mills, an increase of 140 g/d in whole grain consumption did not result in significant effects on blood pressure, fecal measurements or gut microbiology.4

Studies like this one lead to more questions than answers, such as whether the “gold standard” randomized controlled trial is adequate to measure the effects of interventions such as whole grains, especially when it is difficult to control every possible mitigating factor (such as the elimination of whole grains from subjects’ diets during the washout period). Perhaps the type of whole grain was a factor as well. Some also suggest that a lack of effect illustrates why simply eating a balanced diet according to prevailing nutrition recommendations may not be sufficient to impact health, especially as we age.

Lest we forget that diet does not exist in a vacuum, there are a number of psychological and social factors that impact nutrition and cardiovascular outcomes. In the Cardiovascular Risk in Young Finns Study published in Circulation, 1,089 children were followed for 27 years, which resulted in a fantastic dataset.5 Higher ratings of emotional, parental health and self-control behavior patterns in children resulted in a significantly better cardiovascular risk rating as adults. Although the study did not focus on specific nutritional aspects, it may be worth our time as an industry to consider ways to integrate dietary interventions with lifelong behaviors that optimize health outcomes.

Reams of evidence suggest polyphenols support cardiovascular health. A recent six-week controlled clinical trial in Portugal, published in the American Journal of Clinical Nutrition, compared the effects of two olive oils containing different levels of polyphenols on proteomic biomarker scores related to coronary artery disease.6 The findings were surprising: the olive oil lower in polyphenols was slightly more effective than the enriched olive oil. Could compounds in olive oil other than polyphenols be responsible for its well-known health benefits?

Regardless, the research on polyphenols continues, with berries as the main focus. Ongoing trials on polyphenols from colored berries and flowers, based on a search of ClinicalTrials.gov, include the following: a study on a hibiscus extract beverage on cardiovascular and endothelial health, which completed in February 2015; another study on a chokeberry extract in former smokers, to complete in May; and another study on cranberry extract in obese, insulin-resistant humans at Pennington Biomedical Research Center, anticipated to complete in July.

On berries, a study from Italy published in April 2015 found that a formulation of white mulberry leaf extract, berberine and red yeast rice both lowered low-density lipoprotein (LDL) and raised high-density lipoprotein (HDL) cholesterol in humans with high cholesterol not already on statins.7 This formulation was compared to a similar one without mulberry, but with astaxanthin, folic acid, policosanol and CoQ10. Based on the complexity of the formulations, it is difficult to conclude much about the contributions of each ingredient; however, the authors suggested that the mulberry extract might have made the difference for the high-performing formulation.

Future research is expected to add to our increasing knowledge of how to reach the heart through the gut.

References:

1.       Acharjee S et al. “Effect of soy nuts and equol status on blood pressure, lipids and inflammation in postmenopausal women stratified by metabolic syndrome status.” Metabolism. 2015 Feb;64(2):236-43. DOI: 10.1016/j.metabol.2014.09.005.

2.       Kapil V et al. “Dietary nitrate provides sustained blood pressure lowering in hypertensive patients: a randomized, phase 2, double-blind, placebo-controlled study.” Hypertension. 2015 Feb;65(2):320-7. DOI: 10.1161/HYPERTENSIONAHA.114.04675.

3.       Buil-Cosiales P et al. “Fiber intake and all-cause mortality in the Prevención con Dieta Mediterránea (PREDIMED) study.” Am J Clin Nutr. 2014 Dec;100(6):1498-507. DOI: 10.3945/ajcn.114.093757.

4.       Ampatzoglou A et al. “Increased whole grain consumption does not affect blood biochemistry, body composition, or gut microbiology in healthy, low-habitual whole grain consumers.” J Nutr. 2015 Feb;145(2):215-21. DOI: 10.3945/jn.114.202176.

5.       Pulkki-Råback L et al. “Cumulative effect of psychosocial factors in youth on ideal cardiovascular health in adulthood: the Cardiovascular Risk in Young Finns Study.” Circulation. 2015 Jan 20;131(3):245-53. DOI: 10.1161/CIRCULATIONAHA.113.007104.

6.       Silva S et al. “Impact of a 6-wk olive oil supplementation in healthy adults on urinary proteomic biomarkers of coronary artery disease, chronic kidney disease, and diabetes (types 1 and 2): a randomized, parallel, controlled, double-blind study.” Am J Clin Nutr. 2015 Jan;101(1):44-54. DOI: 10.3945/ajcn.114.094219.

7.       Trimarco V et al. “Effects of a New Combination of Nutraceuticals with Morus alba on Lipid Profile, Insulin Sensitivity and Endotelial Function in Dyslipidemic Subjects. A Cross-Over, Randomized, Double-Blind Trial.” High Blood Press Cardiovasc Prev. 2015 Apr 14.

By: Blake Ebersole

This article was first published in Natural Products Insider in June 2015

Extracts: More Than a Cup of Tea


Why extract when you can just consume the whole plant? Extracts concentrate the bioactive part of plants into a manageable dose, while removing the inert parts such as cellulose. And since a lot of botanicals that support health don’t taste very good, we would prefer to be able to consume them as one or two capsules—not 10 or 20.

On a basic level, making a botanical extract is like making a cup of tea: Just soak some plant material in some hot water and enjoy.

Yet, as many tea connoisseurs know, making tea is both an art and a science. The quality of the cup of tea is predicated on a number of variables, including raw material composition, the solvent (such as water or alcohol), the ratio of tea to water, the water’s temperature and the steeping time. Changes in these variables necessarily result in differences in the end product that are detectable by the human palate.

Let’s say you want to make a powdered extract from this cup of tea. The temperature, time and method of removing the water all impact the quality of the end product. Standardizing the extract to a certain specification, including potency, color, particle size and impurities, requires an additional set of controls and experience.

Lastly, maintaining consistency from batch to batch is an additional challenge with natural products prone to variations in climate, geography and harvest methods.

The choice of solvent is a key variable that, along with raw material selection, has the most impact on the final extract. Different solvents will extract different classes of bioactive compounds, so it is important to know what you are trying to extract.

Historically, extraction facilities often selected solvents that provided the best yield, with little regard for safety or regulatory acceptance. As regulators and consumers have become more discerning, so have the processing methods. Today, “green” extraction methods offer a lot of the positives consumers demand—but not without some key tradeoffs.

Like dissolves like, so water will dissolve similarly polar compounds such as flavonoids. Water as a solvent is often preferred by consumers because of its “clean” image; however, it is also a challenge to work with, since it readily promotes oxidation and is a great medium for microbial growth.

Due to its low vapor pressure, water is also among the most difficult solvents to remove during drying, resulting in extra heat and time that can further degrade the native composition of the original plant. Powdered extracts made with water are often hygroscopic, meaning they attract moisture from the air readily, which can lead to clumping and microbial growth in what was once a perfectly clean and flowable extract.

Ethanol is often preferred as a solvent, because it does not present many of the challenges of water. Many generations of physicians have produced liquid extracts known as tinctures—herbs steeped typically in ethanol at established concentrations.

Ethanol is good at dissolving diverse types of compounds, but for many fat-soluble molecules, saturation is reached at a low concentration, resulting in poor extraction efficiency. Thus, ethanol-only extracts often demand a premium price, and may not reach the level of potency offered by non-polar solvents.

Supercritical extracts using solvents such as carbon dioxide (CO2) have become popular, and for good reason. This method of extraction can be performed at moderate temperatures, and CO2 is one of the cleanest and lowest-cost solvents around. Supercritical CO2 is often used to remove caffeine from tea, and to extract essential oils from spices and herbs.

The main disadvantages of supercritical extraction include high capital and operating costs, poor selectivity of compounds without optimization, and the time and expertise required to perfect or optimize a process. Often, to achieve a standardized product, a supercritical extraction may have to be paired with other processing methods, which can add to cost.

Standard methods of extraction can be complemented with emerging technologies to achieve a superior product.

Ion-exchange chromatography is one of the best ways to purify natural products, although the higher concentrations of actives achieved are offset by lower yields and higher processing costs. Ultrasound and microwave-assisted extraction are newer ways to achieve better yields during standard solvent extraction, as they act to break the plant cells and release active components better than simple heat or static mixing.

Today’s botanical extraction toolbox offers endless possibilities to achieve desired purity while retaining the natural composition of the botanical.

 

By: Blake Ebersole

This article was first published in Natural Products Insider in February 2015