Green. Natural. Organic. Plant-based. These words have moved beyond food and diet as more Americans grow concerned not only about what goes into their bodies, but also about what goes on them and is absorbed through the skin. Two years ago, nearly half of American women who used skincare products reported looking for those labeled natural or organic, according to market research; their chief concern was avoiding “toxins.” However, “toxins” is not a well-defined term, said Dr. Catherine Harrison, a local scientist turned natural skincare brand founder.