The other day I was studying for Praxis, working through a lecture entitled “Visions of History”. Its point was to encourage us to view history in light of the progress that has come through entrepreneurs, trade, and individuality. Instead of thinking of history as a list of dates to remember (wars, discoveries, etc.), view it as the story of man pursuing a better life while also helping others.
I won’t go into the deep economics, but the lecture painted a clear picture of the progress that has come from the accumulation of capital, and through trade and invention.
But something interesting happened once the lecture ended. During Q&A, someone asked this question: “Why isn’t this story more widely accepted? Things are getting better and better all the time, but most people think things are getting worse. Why is that?”
The answer was simple and short, but it has been in the back of my mind ever since: “Bad news sells.” For some reason (maybe it was the wording that hit me), I started to see that phrase ring true in different areas. Think of advertisements; think of politics. It has pretty much become a habit for most people to be sold by negativity. It makes me wonder if it stems from a lack of adventure that most of us crave. Maybe a lack of danger? I think it’s part of human nature to want to be wild, and I think society has tried to keep that curiosity tamed. Maybe bad news has become “attractive” because it lets us say things like “I told you so” or “I knew it all along.”
People want to feel important. We just don’t want to take the risks, so we buy into bad news because it’s easier, and because bad news is powerful.
At least, that’s my theory.