In 1856, 18-year-old chemist William Henry Perkin was experimenting with coal tar-derived compounds in a crude laboratory in his attic.
His teacher, August Wilhelm von Hofmann, had published a hypothesis on how the prized malaria drug quinine might be made from coal tar chemicals, and as his assistant, Perkin hoped he would be the one to achieve it.
The experiment was a failure. Rather than the prized drug, Perkin created a thick brown sludge. However, when he went to wash out the beakers with alcohol, the sludge left behind a bright purple residue.
That residue became the world's first synthetic dye: mauve.
Before the invention of synthetic dyes, people obtained dyes from natural materials such as plants, clay, minerals, and certain animals such as insects and squid.
Natural dyes such as those from clay tended to fade quickly, and those that were long-lasting, such as natural indigo dyes, required an arduous extraction process that made them expensive.
However, Perkin’s mauve dye was stable and easy to make.
Perkin’s discovery and commercial success prompted chemists in Europe to find more dyes in coal tar; magenta was discovered in 1858, methyl violet in 1861, and Bismarck brown in 1863.
Synthetic dyes would soon be added to everything—clothing, plastics, wood, and food.
Dyes in Food
For centuries, people have colored food to make it appear more appealing. Butter, for example, is not always yellow. Depending on the cattle feed, breed, and period of lactation, the color of butter can fluctuate seasonally, from bright yellow in the summer to pale white in the winter.

Natural colors, unlike artificial ones, are susceptible to changes in pH, temperature, and moisture. They can shift in hue and intensity, and yellows can fade to pale.
The practice of mass coloring and striving for uniformity likely emerged as a result of industrialization in the late 19th century, when packaged and processed foods became widely available, according to Hisano.
“Mass production and industrialization required easier, more convenient ways of making food, and using coal-tar dyes was one of the solutions for creating more standardized food products,” Hisano told The Epoch Times.
As packaged foods lose freshness, they can also lose color or look less natural. Previously, some companies added compounds such as potassium nitrate and sodium sulfite to products such as meats to preserve their color. These compounds were relatively harmless.

Dairy products such as butter and cheese were the first foods authorized by the federal government for artificial coloring.
Despite the criticisms of chemist Harvey Wiley, by the time his book “Foods and Their Adulteration” was written, practically all the butter on the market was artificially colored.
“The object of coloring butter is, undoubtedly, to make it appear in the eyes of the consumer better than it really is, and to this extent can only be regarded as an attempt to deceive,” Wiley wrote, arguing that if the cows were properly fed during winter, they would naturally produce butter of the appealing yellow shade.
The FDA
The previous year, in 1906, Congress passed the Food and Drugs Act, prohibiting the use of poisonous or dangerous colors in food. The FDA was formed on the same day the bill was signed into law. After the prohibition, the FDA approved seven synthetic food dyes, most of which would be banned in the 1950s after new animal studies indicated their toxic effects.
However, the FDA has always given greater scrutiny to synthetic dyes than to natural ones. Synthetic food dyes must be certified by the FDA before they can be used, but there is no such requirement for natural dyes. While the FDA regulates synthetic dyes as food additives, natural dyes can be regulated as generally recognized as safe (GRAS), a less stringent authorization pathway.
In 1938, new laws were passed requiring all food dyes, whether synthetic or natural, to be listed on product labels.
Modern synthetic food dyes are made from petroleum rather than coal tar, but these newer petroleum-based dyes are considered very similar in composition and chemistry to their earlier coal tar counterparts, food scientist Bryan Quoc Le told The Epoch Times.
“Petroleum is cheaper, safer, and available in greater quantities,” he said.
The use of synthetic food dyes has been steadily increasing every decade. Data based on FDA dye certification suggest that food dye consumption has increased fivefold since 1955.

Cancer Concern
By the time Wiley became the first commissioner of the FDA, experts were debating which food dyes were riskier than others. Over the following decades, the dyes that were initially approved were gradually whittled down to the six remaining today. Congressional hearings also led to the passage of the Delaney Clause, which prohibits the FDA from approving any food additive that can cause cancer in either humans or animals.
Orange No. 1 and several other approved dyes were removed after evidence of animal carcinogenicity.
The Delaney Clause is what prompted the FDA's removal of Red No. 3 in January 2025.
Professor Lorne Hofseth, director of the Center for Colon Cancer Research and associate dean for research in the College of Pharmacy at the University of South Carolina, is one of the few researchers in the United States studying the health effects of synthetic food dyes.
These dyes are xenobiotics, which are substances that are foreign to the human body, and “anything foreign to your body will cause an immune reaction—it just will,” he told The Epoch Times.
“So if you’re consuming these synthetic food dyes from childhood to your adulthood, over years and years and years and years, that’s going to cause a low-grade, chronic inflammation.”
Hofseth has tested the effects of food dyes by sprinkling red, yellow, and blue food dyes on cells in his laboratory, and he has observed DNA damage. “DNA damage is intimately linked to carcinogenesis,” he said.
The mechanism by which food dyes may cause cancer remains to be elucidated.
Behavioral Problems
While the link between food dyes and cancer may remain elusive, the link between food dyes and behavioral problems in some children is much more accepted.

Rebecca Bevans, a professor of psychology at Western Nevada College, started looking into food dyes after her son became suicidal at the age of 7.
His suicidal ideations went away once food dyes were removed from his diet.
More surprisingly, Bevans noticed that her own anxiety dissipated after she removed synthetic red and yellow food dyes from her diet.
“I had a little mini existential crisis at 52,” Bevans told The Epoch Times.

Benjamin Feingold, a pediatric allergist, proposed the Feingold diet, a diet free of additives, for children in the 1970s. His theories garnered widespread attention in the media, but the medical community and the American Academy of Pediatrics were unmoved at the time.
When Feingold died in 1982, interest in his hypothesis faded.
However, interest was later renewed by a University of Southampton study that linked mixtures of food dyes to hyperactivity in children. The study explored only the effects of dye mixtures and included dyes not used in the United States, so the effects of individual dyes present in the U.S. food supply remain unclear.
“We don’t know exactly the mechanism of how these metabolites from these dyes or how these dyes themselves directly affect the brain,” Bevans told The Epoch Times.
One explanation is that the dyes cause behavioral problems by harming the gut, since gut bacteria help produce and regulate neurotransmitters that affect the brain. Gut problems can also lead to nutritional deficiencies, which may further impair brain health.
“There’s just a lot of unknowns as to the mechanism of function in the body, but there is enough evidence demonstrating through studies that some individuals have much more negative reactions to these food dyes than others,” Bevans said.
Current Food Dyes Used in the U.S.
Three synthetic food dyes have been banned so far in 2025, leaving six currently in use. Among them, red and yellow dyes account for 90 percent of all the dyes used in the United States.

Although evidence of carcinogenicity for these dyes is still inconclusive, the behavioral effects of food dyes are better supported by research.
Current research in children suggests that “there may be some small subset of children who appear to have altered behavior if they consume these synthetic food dyes,” Susan Mayne, former head of the FDA’s Center for Food Safety and Applied Nutrition, told The Epoch Times.
Mayne said that current findings are still murky because the studies examine mixtures of dyes rather than individual dyes.
Phasing Out Colors
On April 22, the FDA announced a voluntary phaseout of petroleum-based synthetic food dyes in the United States, with plans to have them removed from the food supply by the end of 2026.

Whether this can be done is still unclear.
Mayne said that because the phaseout is not a legal requirement, it is hard to ensure that all companies remove the dyes.
Natural dyes must be sourced from agricultural products, which places additional agricultural and supply pressures on food companies.
Companies would need to experiment with new formulations, potentially resulting in products that are less visually appealing and have shorter shelf lives—all of which could risk customer loss and increase production costs, according to Renee Leber, food science and technical services manager at the Institute of Food Technologists.
“Sometimes we talk about this like it’s a one-to-one substitution. It is not. ... There are so many things to keep in mind,” Leber told The Epoch Times.
Nevertheless, some large companies such as Pepsi and Tyson Foods have announced that they will be removing synthetic dyes from their products.
Leber said it is consumers who ultimately decide and shape the market, noting that they need to understand the process and its effects as the food industry goes through this change. Consumers may be surprised by the changes to shelf life and food prices that result from the switch.
“I think we’re going to need to set new expectations,” Leber said.
“Most companies will try to make sure that they’re bringing their consumers along with them when they start to do this and to give them the narrative that they are doing this in order to meet this public interest.”