In my last post, I went on a little (OK, so it was a lengthy) diatribe about the stratification of society as seen through an albeit unscientific surveying-the-landscape point of view. (That last post was written in two separate writing sessions, separated by an evening of sleep and one proper dose of Tramadol, so I apologize for any rambling or self-repetition.) 😉
I hinted in that post that maybe I had already nailed down some of the “why”s, but it turns out that I had left out some important aspects. A big Thank You goes out to a great friend and commenter for reminding me of some of those important aspects, leading ultimately to my realization that my work here was not quite yet done (grin).
And so, here I am, gladly tying up (most of) the loose ends, although I’m sure that I’ll probably forget something (or a few somethings) along the way (grinning again).
I think that my lovely commenter was spot-on when she brought up the failure(s) of Reaganomics, the trickle-down economic theory that Reagan supported and enacted. This probably laid the early foundation, paving the way for the situation we have today. Or at least, it was probably a large early contributor.
On paper, Reagan’s economic theory was sound: cut the tax rates of the wealthiest people and biggest companies to encourage investment and all that, because if you don’t penalize something and instead create incentives for it, you’ll get more of it. He figured that the wealthiest entities create jobs for others, which in turn leads to more people employed. If businesses aren’t having to shell out a huge tax bill, then they’ve got more money left over with which to accomplish this.
So far, so good. Except, like communism and socialism, it might look utopian and altruistic on paper, but it doesn’t pan out in practice. Cutting the top tier of the tax brackets didn’t actually work.
The article doesn’t explore the reasons why it didn’t work, and I have few hard facts to stand on. But as usual, I have theories! If there’s one thing I’m not short on, it’s theories (grin again).
I’m guessing that since large businesses are ultimately driven by human beings, a big ol’ tax cut probably only went toward job creation when it was in their best interest to do so. Ultimately, however, additional jobs add up to additional headaches (I should know, and I’ve only created one measly job position!), and they present a large ongoing expense. So it only makes fiscal sense to create a job position when that worker can make you more money than they cost you. That in itself is not necessarily a bad thing; it’s part of Business Wisdom 101–that is, if you want your business to remain solvent and continue to exist.
Most of the time, however, I have the sneaking suspicion that the extra money in the hands of corporate head honchos went into their own investments and retirement funds, not toward job creation or stimulation of the general economy. And with a few more people having a lot more money, this advanced the development and proliferation of luxury items.
I don’t think it stopped there, nor do I think it actually started there.
To see how and where (and maybe why) it started, one must flip back a few more chapters in the history books, to a section that took place long before our time: The Great Depression of the 1930s, which was kicked off by the stock market crash of 1929.
I feel that it’s important to point out that not everybody suffered so much during this time. Unemployment reached a Mount Everest-esque peak of 25%, but it did not reach that peak for several years, nor did it remain there for very long. As for everyone else? They were still employed, although times were rough all over and things were tight. (The actual numbers paint a picture that sounds eerily similar to our recent Great “Recession”, which will more than likely get its own post.)
Regardless, it was not so much the raw numbers of people affected, but rather the type(s) of people affected. In other words, it was not so much “how many?”, but “who?”.
Those affected were considered important people, the movers and shakers of the country: the investors–and they lost their shirts in this crash.
Because of their pull and clout in the grand scheme of things, laws were enacted to protect them. Among them were two points of interest (for the purposes of this post), which, to paraphrase, are:
- Fiduciary responsibility: the legal mandate that if and when push ever comes to shove, the stockholders/shareholders always get considered first, and
- Investor restrictions: you must meet certain financial benchmarks (i.e., you must already be wealthy) in order to take part in “risky” start-up ventures, where the real money can be made, if the venture is successful.
One can already begin to see how this further separates the top-tier rich from practically everyone else.
The fiduciary responsibility aspect will get its own post, I promise. I have one brewing in my notes as we speak. For now, suffice it to say that in any publicly traded company, the shareholders come first, and this is mandated by strictly-enforced law.
What this means for the purposes of this post is that since a corporation must act in the best interest of its shareholders, its leadership is under mandatory pressure to keep changing, evolving, innovating, cutting corners, increasing profits, and getting leaner and meaner–not only every year, but every freaking quarter, a period of merely three months. This pressure is ongoing and constant for the life of the company: new directions and improvised strategies, forthcoming on a constant basis. And those strategies must not merely keep coming; they must also satisfy the herd-like, skeptical mentality of the ignorant but all-powerful shareholder. It becomes very much a “what have you done for me lately?” vibe, and the ideas presented had better wow.
The strategy that many larger companies have turned to in recent years is one of a gentrification of sorts. “Middle of the road” is simply not good enough anymore. “Normal”, “regular” people don’t spend enough money. With healthy competition between businesses, many of the products sold have become commodities.
Intersect this situation with the advent and proliferation of online entities; with their lower overhead and more efficient business models, they’ve been able to give–and have given–brick-and-mortar stores a run for their money–literally.
What’s a higher-overhead brick-and-mortar store to do?
For many, the answer to that question is: go yuppie. Offer high-quality products, with marketing that appeals to the self-righteous, delusionally-superior “intelligence” (attitude) of the higher-income shopper with a more expansive line of credit, and do it with a hands-on, full-service involvement that makes them feel important and catered to.
Another strategy has been to embrace the latest, hottest trends, in an effort to prove to the world how in-touch, “with it”, and cutting edge the company still is (although I’m not sure that Sears or JC Penney got that memo).
A third strategy has been to offer products of slowly-and-quietly-eroding quality while continuing to sell them at the same middle-class prices, hoping that no one notices. (Heh. That didn’t work on me; I noticed, and jumped ship accordingly. It’s been over 11 years since I shopped at Walmart for that very reason.)
Hence, the growing–and no longer invisible–divide between the offerings for the well-to-do and those for the not-so-much.
I don’t think that’s going to go away any time soon. After all, the stockholder as the primary concern is legally-mandated. And businesses are simply trying to do what they can to survive in a trickier economy, with a populace that can now compare prices and features of the same or similar items right there in a store, on a handheld smartphone.
It’s also not impossible to find middle class stores and merchandise; it’s just not necessarily the default anymore, so that, too, has become more challenging.
One last factor I can think of that might have unwittingly led us into our current situation involves the customer base itself. In the world at large, there’s this ridiculous pressure to keep up with one’s neighbors and community. There’s a semi-neurotic compulsion to have the latest gadgets, the trendiest clothing, the nicest house, the status symbol vehicle, and so on. Everyone’s trying to meet impossible benchmarks in multiple categories: physical appearance, parenting, economic status (or at least the appearance thereof), and so on. I’ve written before about what I believe is driving this compulsion (a deep, unrealized or unadmitted emptiness).
The inner emptiness driving this compulsion spills out into the external world, and influences our outward culture and landscape. Shopping has become semi-compulsory, too, and for understandable reason: it boosts dopamine, the pleasure-based neurochemical in the brain.
Basically, some (many) people feel empty inside, an inner deficit, which becomes an itch they can’t scratch, but the closest they can come to satisfying themselves and filling that hopeless void is by shopping, because it gives a temporary boost to their pleasure chemicals. And shopping in nice places, surrounded by an air of attainable luxury, boosts that dopamine supply even more. Spending time in an upscale environment gives the shopper the feeling that they’re important, that they’re worthy of something nice or worthy of doting service. The whole activity of shopping for purposes of leisure, or electing to spend more time shopping than is necessary, is essentially an act of (at least an attempt at) self-medication.
In short, people often shop to feel better about themselves, and they feel even better about themselves when they’re surrounded by a yuppie-style environment. They can forget, even if just for a while, that they’re actually average, middle class people.
People who depend on shopping, especially in pleasant surroundings, for their happiness, will ultimately spend more time there. They will linger longer than they practically need to. They’re after their “fix”, after all.
And the increase in credit card use during the same time period is simply the last nail in the coffin. People will whip out the plastic to spend money that might not actually be sitting in their bank accounts in order to satisfy this empty hole. It’s no longer a $10 or $20 bill that’s burning a hole in their pocket; the hole is already there, and it’s an emotional one they’re trying to fill.
Combining all of these factors gives us the current situation. On one side, you have shoppers who feel empty, sense the peer pressure, crave a yuppie feel-good environment in which to exist for a while, and don’t hesitate to use their plastic to do so. On the other, you have businesses that are legally required to satisfy money-hungry but legally-preferred shareholders who demand ever-greater profits and the changes required to make that happen.
It’s a perfect storm having come to a thunderhead, a three-way encounter between business, government, and consumer. Nobody and everybody is to “blame”. Everyone who plays a part, plays a part. It’s been created, and it’s still being perpetuated. It’s not as simple as greed, because there’s also long-time legislation, contemporary psychology, and ancient neurology involved.
It’s a complex web, and I had fun detangling that web as I see it. Hopefully, you had fun, too!
I’m always open to hearing your thoughts! 🙂