Unhealthy Old-School Ingredients Everyone Used To Eat

Despite advancements in manufacturing and availability of ingredients from around the world, many of our food preferences have remained the same for thousands of years. Just like the Ancient Romans, we are still hooked on fine wine, bread, and cheese, and just like the Ancient Egyptians, people continue to eat chickpeas, watermelon, and garlic on a regular basis. While many of the raw materials in our diets have remained largely unchanged for centuries, some ingredients have had a much shorter shelf-life.

Technological advances in food manufacturing have allowed scientists to create new preservatives, flavorings, dyes, and supposedly healthy facsimiles of unhealthy foods. From white flour to high-fructose corn syrup, many of these ingredients exploded in popularity during the 20th century, only to decline shortly thereafter. Some seemingly miraculous ingredients, such as the calorie-free fat substitute Olean, were all but eliminated due to health concerns, while others, such as lard, simply fell out of favor. As ingredient labels continue to lengthen, let's look back on some once wildly popular ingredients that you probably won't see on many food packages nowadays.

Trans fat butter alternatives

In 1911, Procter & Gamble introduced a culinary marvel: a thick, white, buttery substance that looked and acted like lard but was made entirely of cottonseed oil. Crisco was the first product made through hydrogenation, a process that combines vegetable oil with hydrogen at high temperatures, turning the liquid oil into a solid mass of fat. Americans went wild for the new product, which had none of the animal odor of lard or the perishability of butter. Five years after it hit the market, they were buying more than 60 million cans of it each year.

Around the same time, margarine began to appear on store shelves. Originally made with beef tallow, the butter substitute was eventually made with hydrogenated vegetable oil as well. Its profile soared alongside Crisco's. By 1958 it was more popular than butter, and in 1976, Americans were eating about 12 pounds of it per capita each year.

In the 1990s, everything changed. Unfortunately for fans of these alternative fat products, hydrogenated oil is the manufactured version of a trans fat, which doctors eventually identified as the worst type of fat. Heart disease, diabetes, and colon cancer are just a few of the risks it poses. Even though Crisco eliminated trans fat from its formula in 2007, and many similar brands have opted to use the word "spread" instead of "margarine," neither product is as popular as it once was.

Olestra

If the word "olestra," or its brand name "Olean," means anything to you, chances are, you came of age before the 1990s. If you haven't heard of it, it's because its reign on American food labels was extremely short. Created by Procter & Gamble, olestra was a synthetic fat molecule that was too large to be absorbed by the intestine, boasting a miraculous zero grams of fat and not a single calorie. 

After 25 years and over $200 million of development, it was approved by the FDA in 1995 and became the hero ingredient of the low-calorie, low-fat craze sweeping the nation in the '90s. Olestra Pringles debuted in 1996, followed by Lay's WOW Chips in 1998. Consumers went wild, buying more than 100 million bags of the salty snacks within six months. But the love affair fizzled quickly, with sales of Procter & Gamble's products falling from $41 million per month to $28 million within only seven months of their release.

The problem lay in a pesky little disclaimer found on every bag of Olean-fried chips: "Olestra may cause abdominal cramping and loose stools." For many customers, this was an understatement. For the lucky ones, cramping and gas were the only symptoms, but others were afflicted with days of explosive diarrhea and fecal incontinence. Not surprisingly, olestra became a pariah and, although the FDA has never banned it, it is much harder to find on food labels than it was in the '90s.

Red Dye No. 2

People have been enhancing the color of foods with natural dyes such as turmeric and beet juice for millennia, but it wasn't until the late 1800s that synthetic dyes entered the scene. The invention changed the food industry, ushering in an era of products that tempted consumers as much for their color as their taste. Until Congress passed the Pure Food and Drug Act in 1906, it was a largely unregulated industry, with some dyes containing toxic ingredients like arsenic, lead, and mercury.

Although the law banned poisonous dyes, one product slid under the radar for nearly a century despite extensive testing. Red Dye No. 2, created in the 1870s from a coal by-product, was one of the most common food colorings throughout much of the 20th century. Found in items like soft drinks, candy, sausages, dairy products, and baked goods, it was present in about $10 billion worth of foods by the mid-'70s.

Despite its prevalence, Red No. 2 was the subject of hot debate for decades, with some scientists arguing that it posed health risks. Studies showed that it gave rats cancer when consumed in large doses, yet other scientists believed it was one of the safest food colorings on the market. In 1976, after facing increasing pressure, the FDA banned the dye. It became such a hot-button issue that Mars, Inc. briefly stopped making red M&M's even though it had never used Red Dye No. 2 to color them.

High-fructose corn syrup

High-fructose corn syrup was once viewed as the perfect replacement for table sugar (sucrose). Invented in the 1950s and used widely by the 1980s, it is made by turning corn starch into syrup and introducing an enzyme that converts some of the glucose into fructose. The result is a syrup about as sweet as sugar, which came in handy as sugar costs escalated. It quickly took over the food industry, and consumption spiked in the late '90s. In 2003, Americans were consuming roughly the same amount of high-fructose corn syrup (60.6 pounds per capita) as sugar (61 pounds) (per the USDA via Mother Jones).

By the late 2000s, however, sentiments about the corn-based sweetener had begun to sour as scientists suggested that consumption of fructose might lead to obesity. The decline was part of a larger trend of consumers turning away from sugary products such as soft drinks, but even as table sugar consumption remained relatively stable, high-fructose corn syrup consumption began to slide in 2006. By the end of the decade, "real sugar" had become a marketing buzzword, with brands like Pepsi, Snapple, and Pizza Hut publicizing their divestment from corn syrup for certain products and their return to good old-fashioned sucrose.

Lard

If you happen to peruse an American cookbook from the late 19th or early 20th century, you will likely see lard in many of the ingredient lists. Derived from the belly fat of pigs, lard is a thick white substance that makes pie crusts flaky and cakes deliciously tender. Before vegetable oils were readily available, it was the gold standard, used in both sweet and savory recipes. Then came Crisco, a cottonseed oil product with a nearly identical texture and effect in recipes, but cheaper and without any of the porky flavor. Procter & Gamble even tried to convince the public that Crisco was healthier by advertising its product as more digestible.

Lard was hit with an even bigger blow in the 1950s, when doctors began to point the finger at saturated fats as the cause of heart disease. A few decades later, Crisco met a similar fate when it transpired that trans fats were just as bad, if not worse. By that time, vegetable oils were affordable and plentiful, so lard was not ushered back into the mainstream culinary fold. In the 2010s, some cooks started to popularize pig fat again, even claiming that it has under-appreciated health benefits. While it's undeniable that it makes a melt-in-your-mouth pie crust, it's still difficult to argue that lard is better for you than unsaturated fats like olive oil. Currently, it is nowhere near as popular as it was during its heyday.

Saccharin

From high-fructose corn syrup to monk fruit extract, it seems like there is a new sweetener bursting onto the market every few years. Some of them have staying power. Medjool dates, for example, are the world's oldest cultivated fruit and take the top spot on our list of 13 types of sweeteners ranked worst to best. Others have a bumpier track record. Saccharin is one such ingredient, also known by trade names such as Sweet'N Low and Necta Sweet.

Discovered way back in the 1870s, it is 300 times sweeter than table sugar and has zero calories. It became a popular ingredient during World Wars I and II, when sugar was in short supply, and its use exploded in the 1960s as artificial sweeteners caught on among dieters. By the late '70s, 44 million Americans used saccharin every day (per Science History Institute).

Around the same time, government regulations on food processing became stricter, and saccharin was caught in a lengthy public battle. The FDA moved to ban it in 1977 after findings that high doses caused cancer in rats, but those results were deemed irrelevant to humans in another study in 2000. These days, the FDA asserts that saccharin is safe to use, but many brands have replaced it in their formulas with more natural-tasting aspartame, and it seems unlikely that it will ever regain the status it held in the '70s.

Brominated vegetable oil

When you think of vegetable oil, you might think of its uses in baked goods, fried foods, and salad dressing, but one of the most widely used oils of recent decades is known for its presence in soft drinks. Brominated vegetable oil, usually derived from corn or soybeans, is used by beverage companies to emulsify flavoring oils and keep drinks from separating. The FDA officially approved its use in small quantities in 1970 as a way to keep citrus flavorings from floating to the top of certain soft drinks, stipulating that it must appear in ingredient lists.

Health concerns about brominated vegetable oil have been widespread for decades, with health experts warning that it can irritate the skin, nose, mouth, and stomach, and can even cause neurological symptoms in people who drink more than two liters of beverages containing the ingredient each day. Many brands, including Coca-Cola and PepsiCo, voluntarily removed brominated vegetable oil from their products, and in 2024, the FDA revoked its 1970 regulation and banned the substance outright.

Azodicarbonamide

Anyone who reads food labels will be familiar with the jumble of letters that constitute the names of many modern additives. These words are tricky to read, let alone pronounce or spell, which makes it difficult to look them up to find out what, exactly, they are. Azodicarbonamide is a prime example. Unless you're a food scientist, there are few clues in its name as to what it might be. Is it a texture enhancer? A dye? The answer is a little bit of both.  

Azodicarbonamide is used as a bleaching agent and flour improver, especially in bread products. It is also, as many critics have pointed out, a compound used to make yoga mats. Food manufacturers love azodicarbonamide because it helps proteins bond and form gluten, making it easier to use cheaper, low-quality flour that would otherwise struggle to develop the elasticity needed for bread dough. In 2014, it was being used in hundreds of products, including Pillsbury Dinner Rolls, Little Debbie cakes and pastries, and Wonder Bread.

There is disagreement among scientists over whether the ingredient poses a health risk when ingested in bread. In its raw form, azodicarbonamide is not believed to be toxic. However, some health experts, including the Center for Science in the Public Interest, an independent consumer advocacy organization, argue that it poses a cancer risk once baked. The FDA maintains that it is safe, but many brands, including Subway and Nature's Own, have phased it out.

Cyclamate

Some controversial ingredients receive a disproportionate amount of criticism. Saccharin, for example, has been the target of debate for over a century, even as the FDA continues to insist that it is safe to use. Cyclamate is another story. The sugar alternative has been around since 1937, when a graduate student at the University of Illinois was taking a smoke break while working on a fever drug and tasted something sweet on his hands. It was about 50 times sweeter than sugar, calorie-free, and cheap to produce. 

In the 1950s, when diet sodas were becoming all the rage, products including Tab and Diet Pepsi used the sweetener, and it was the main ingredient in the early formulas of Sweet'N Low. Many products paired it with saccharin, an artificial sweetener that is more than 300 times sweeter than sugar and has a slightly metallic flavor on its own.

The love affair ended in 1970, when cyclamate was banned in the U.S. due to studies showing that it caused bladder cancer in animals. Similar findings about saccharin led to a proposed ban, but they were eventually dismissed. Cyclamate was not as lucky. Although the findings were later deemed irrelevant to human health, there are still concerns that the sweetener may amplify carcinogens in the body. It remains a banned food additive in the U.S., though it is still legal in Canada, Europe, and elsewhere.

Safrole

If you've ever wondered what sets root beer apart from any other dark-colored soda, the answer lies underground. Sassafras is a tree native to North America with roots (hence the name "root beer") that Native Americans used medicinally. When colonists arrived on the continent, they made beer with whatever ingredients they could forage, including sassafras root, ginger, molasses, herbs, tree bark, and berries. Some of these ingredients stood the test of time.

By the mid-20th century, root beer was made with sassafras, the vine sarsaparilla, licorice, mint, and nutmeg. It was a spicy, refreshing concoction that couldn't be replicated. Until it had to be. In the 1960s, studies showed that safrole, an oil found in sassafras and sarsaparilla, caused liver cancer in rats, leading the FDA to ban both ingredients from being used as food additives.

Subsequent studies cast doubt on these findings, especially in relation to their implications for human health, but the ban stuck. Meanwhile, other ingredients containing safrole, such as nutmeg, continue to be used widely without negative effects. Nowadays, root beer is made with a variety of artificial ingredients instead of those found in nature.

Artificial flavors

Around the time scientists were discovering synthetic food dyes, another branch of the profession was developing artificial flavors. As early as the 1850s, scientists had managed to chemically recreate various fruit flavors, and by the end of the century, artificial flavors were found in candies, beverages, and foods sold around the world. With the rise of processed foods in the mid-20th century, artificial flavors abounded. Even as other food additives came under increasing scientific scrutiny and public health concern, artificial flavors escaped largely unscathed. The FDA even exempted them from the testing standards imposed on most other ingredients because such small quantities of the flavorings were required.

But the tide began to turn as low-fat and low-calorie crazes evolved into increased consumer demand for "natural" ingredients. The difference between natural and artificial flavors is not a simple binary, nor are the health implications, but brands like General Mills have moved to phase out the latter category in favor of the former.

A further blow arrived in 2018 when the FDA, facing a lawsuit from various environmental and consumer groups, banned seven artificial flavorings after they were found to cause cancer in lab animals. The agency stressed that it was delisting the ingredients in accordance with a 1938 law requiring additives to be banned if they are found to cause cancer in animals or humans. Still, the agency argued that extensive testing had shown the flavorings were not harmful to human health.

White bread

Bakers have been making white flour since about 3000 B.C. by discarding the outer husks of wheat kernels and retaining the smooth, white interior, which yields lighter, softer baked goods that rise to loftier heights. Toward the end of the 19th century, manufacturers mechanized the separation process, creating flour that was smoother and whiter than ever. In the 1910s and '20s, uniform loaves of squishy white bread were a sign of wholesomeness and progress, even though they lacked the nutritional value of whole wheat bread. When Wonder Bread hit grocery store shelves in 1921, its colorful packaging and blindingly bleached color signaled a bold new world of processed food, one that proved wildly popular with people who had been eating humble homemade bread their whole lives.

Things changed in the 1960s. As perhaps the most common food item in the home, bread was a symbolic target for the counterculture movement, which rejected capitalism, social conformity, and industrialization. As concerns about nutrition grew through the latter half of the century and into the 21st, the nutritional value (or lack thereof) of white bread took center stage, further denting the reputation of the once highly sought-after kitchen staple. More significantly, sales of whole wheat bread have outpaced those of white bread since 2009. Nowadays, bread brands emblazon the whole grain content of their loaves on their packaging, even when a careful read of the ingredients reveals there isn't much of it to speak of.