There was an absolutely fascinating article in The Atlantic last week that I think we need to spend some time looking at together.
I think it marks a sea change in how we as a country are taking control of our health and making better choices about what we eat, free from decades of “meat is bad” propaganda.
The piece is titled "Is America Done With Plant-Based Meat?"
After doing a pretty good job of breaking down the big media pushes over the last few decades to move us away from animal fats and toward seed oils and plants, the author concludes, "Now the goal of eating less meat has lost its appeal." No kidding.
The most fascinating thing about the article is that the author puts the blame for this shift squarely on politics, especially all those evil right-wing MAHA people with their Joe Rogan podcasts and RFK press conferences.
Never once does she deal with the fact that we all believed the propaganda for decades, ate less meat and just got fatter and sicker because of it.
Fortunately, I think we are all waking up from some kind of bizarre fever dream of the last few decades, in which we were asked to take it as an article of faith that plants always equal healthy, that meat would clog our arteries, and that even a single steak would give us a stroke or take years off our lives.
I have extremely high hopes that this is the beginning of a food renaissance in which we can all make better food choices, free from vegan propaganda, and begin to reverse some of the horrifying trends we are seeing in our health.
