Over the past year or so, and increasingly of late, I've written several columns about structural unemployment, described in this context as technology displacing human labor.
One of the things I find most interesting, from a sociological standpoint, is the denial that this shift is occurring at all, let alone that it has been accelerating for the past three decades.
The question is not whether this shift is happening but whether it is permanent, and, if it is, what the trajectory implies about human employment prospects over the next generation.
I will get to that in a moment, but first I want to address the denial. Last week, the Bureau of Labor Statistics released its employment situation report for July 2013.
Leading into the report, payroll tax receipts, which the U.S. Treasury collects and reports daily and makes publicly available to anyone who bothers to track them, indicated that the growth rate in receipts was decelerating rapidly and approaching zero on an annualized basis.
The most logical conclusion was that essentially no new jobs had been created during the month. And yet the publicly available forecasts from the largest institutions called for an increase of about 185,000. When the Bureau of Labor Statistics reported 162,000, the general consensus in the financial markets and media was that the number was close to expectations and, at least, still positive.
The reality, though, is that the majority of the created jobs were in low-end service-sector employment. This largely reflected employers taking two 40-hour-per-week jobs and making three 27-hour-per-week jobs out of them, probably so they could avoid paying for employees' healthcare insurance.
There's nothing new about this, either, and it surely should have been noted by any reputable analyst making a prediction for job creation. I did not see it noted in any of the reports I read, and I heard it discussed only in passing, as an indirect or ancillary issue.
We measure employment because it is a proxy for economic growth. But splitting existing work into more, smaller jobs creates "jobs" without creating growth: if you cut a pie into quarters and then into eighths, you have more pieces, but the size of the pie hasn't changed.
The U.S., on a per capita basis, is experiencing a decline in the number of jobs available and in the payment received for performing the work associated with them.
Currently, about 30% of the total population of the U.S. is employed full time in the private sector. On the current trajectory, that number will decline to 25% within 10 years, and to 20% within 20 years -- one generation. Concurrent with the decline in jobs is the decline in payment for them.
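The trajectory just described amounts to a roughly linear decline of about half a percentage point per year. A minimal sketch of that projection, using the column's own round numbers rather than official statistics:

```python
# Linear projection of the full-time, private-sector employment share
# of the total U.S. population, per the column's round figures:
# ~30% today, ~25% in 10 years, ~20% in 20 years (one generation).
current_share = 30.0     # percent of total population (column's figure)
decline_per_year = 0.5   # percentage points per year implied by 30 -> 25 -> 20

for years_out in (0, 10, 20):
    share = current_share - decline_per_year * years_out
    print(f"Year +{years_out}: {share:.0f}% employed full time, private sector")
```

A linear decline is the simplest reading of "current trajectory"; a compounding decline would reach the same milestones at slightly different rates.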
On the pay side, the average U.S. worker today receives an hourly wage that, in purchasing-power terms, is about equal to what the minimum wage was 40 years ago.
The average salary offered for entry-level jobs requiring college degrees is now below the minimum wage of 40 years ago in purchasing-power terms.
Had the minimum wage kept pace with the M2 money supply over the past 40 years, or been raised in line with the real increases in the cost of living over that period, it would be about $18 per hour today. Instead, it's $7.25 and has been for four years.
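As a rough back-of-the-envelope check on the M2-indexed figure, here is a sketch in Python. The inputs are approximate public figures supplied here for illustration, not numbers from the column: a 1973 federal minimum wage of $1.60 per hour, and M2 of roughly $0.8 trillion in 1973 versus roughly $10.6 trillion in mid-2013.

```python
# Scale the 1973 minimum wage by the growth in the M2 money supply.
# All inputs are approximate, round figures used for illustration only.
min_wage_1973 = 1.60   # dollars per hour (historical federal minimum)
m2_1973 = 0.8          # trillions of dollars, ~1973 (approximate)
m2_2013 = 10.6         # trillions of dollars, mid-2013 (approximate)

implied_wage = min_wage_1973 * (m2_2013 / m2_1973)
print(f"M2-indexed minimum wage: ${implied_wage:.2f}/hour")
# Lands a bit above $20/hour with these inputs.
```

Pure M2 indexing with these round inputs lands somewhat above the column's $18 estimate, which also blends in cost-of-living adjustments; either way, the result is far above today's $7.25.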
These conditions are not compatible with a consumer-driven capitalist economy. If something is not done by both public and private-sector leaders to alter this trajectory, the U.S. will experience a crisis on a scale most can't even fathom today, a real existential crisis.
As investors rationally and understandably invest in the companies creating the technologies that are now destroying jobs and driving down incomes, they need to be aware that they are simultaneously undermining the notion that the past is prologue.
Like Stephen King's Langoliers eating the past, technological advancement is now challenging the applicability of our social, political, governmental, judicial and economic foundations.