Saturday, August 09, 2025

The Great Flattening

The combination of Artificial Intelligence tools, continued layoffs in IT fields, and reduced opportunities for recently laid-off workers and new college graduates has produced a flood of news stories and online commentary converging on a common term for the phenomenon -- The Great Flattening. In this simplified narrative, the job market is being squeezed because corporations have concluded that a significant swath of middle management labor, previously providing the "analysis" and "tracking" and "forecasting" required by upper management, is either no longer needed or can be done via AI-based automation.

That's a concise summary. It fits into one paragraph to match the attention span of modern readers, and it even gels with other popular narratives helping to hype AI. Unfortunately, this explanation is missing crucial details that would provide a more complete picture of the causes of this trend. As a result, it does little to help society understand what (if anything) can be done, and it doesn't help individuals who might be affected, or already have been, prepare for what comes next.

What's wrong with this simplistic "great flattening" theory? It fails to reflect a unique combination of regulatory failure in the economy (the American and worldwide economies alike) and a perpetual disconnect between the theory of business operations and the actual power politics of large corporations. Those two forces are creating a discontinuity in staffing requirements that is unique in its SIZE but not its NATURE, and it happens to be coinciding with the advent of new AI technologies that benefit from a lack of regulation while promising additional labor savings in many of the most expensive labor categories within large corporations.


Business School Theory

In efficient economies, efficient companies led by professional managers continuously monitor the entire business environment and examine market demand, willingness to pay, technology that allows trade-offs between labor and automation, and the status of competitors. These observations are then used to make alterations to a company's products and the processes used to make them, including choices between investments in labor and training versus technology and automation. An efficiently operated firm that knows demand for its product(s) equates to X when it only has labor to produce 75% of X can choose to a) spend more money on overtime to meet demand with the existing labor supply, b) hire more workers, c) adopt new technology that increases output without requiring more labor, or d) do nothing and cede market share to competitors.

In theory, a professional manager will choose (a) (overtime) if demand is only thought to be temporarily high or seasonal. In theory, a professional manager seeing a permanent increase in demand would analyze the productivity trade-off between labor and capital equipment, then pick a mix that would increase capacity to the full value of X.

Conversely, the same firm with capacity for X units facing demand for only 75% of X also faces choices – a) keep producing X units with the current labor and build unsold inventory, b) only produce 0.75X units with the current labor force to avoid costs for extra materials and unsold inventory, or c) reduce the labor force to the level required to produce only 0.75X units to save money on both labor and materials and avoid excess inventory.

In theory, a professional manager would only choose (a) (no changes) if contractually bound to continue buying supplies, etc. Option (b) would be chosen if the shortfall in demand was viewed as temporary or seasonal and the cost of idling workers or retraining replacements exceeded labor savings. Option (c) would be chosen if the business recognized the shortfall in demand as persistent or long term, in which case any other choice is just delaying an inevitable reckoning of unprofitability.
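
To make that trade-off logic concrete, here is a minimal sketch in Python comparing the annualized cost of the three capacity-expansion options from the first scenario. Every figure in it (the overtime premium, fully loaded labor cost, automation price and useful life) is a hypothetical placeholder chosen only to show the shape of the comparison a manager is supposed to be making, not data from any real firm.

```python
# Minimal sketch of the "meet demand for X with labor for 0.75X" decision.
# All figures are hypothetical placeholders, not data from any real company.

current_capacity_units = 75_000      # what the existing workforce can produce per year
demand_units = 100_000               # forecast demand (the "X" in the text)
shortfall_units = demand_units - current_capacity_units

cost_per_unit_regular = 40.0         # fully loaded labor cost per unit at straight time
overtime_premium = 1.5               # time-and-a-half
cost_per_new_hire = 90_000           # fully loaded annual cost of one additional worker
units_per_worker = 2_500             # annual output per worker
automation_capex = 1_800_000         # one-time cost of equipment covering the shortfall
automation_useful_life_years = 6

# Option (a): cover the shortfall with overtime
overtime_cost = shortfall_units * cost_per_unit_regular * overtime_premium

# Option (b): hire enough workers to cover the shortfall
hires_needed = -(-shortfall_units // units_per_worker)   # ceiling division
hiring_cost = hires_needed * cost_per_new_hire

# Option (c): amortize the automation spend over its useful life
automation_cost_per_year = automation_capex / automation_useful_life_years

print(f"Overtime:   ${overtime_cost:,.0f} per year")
print(f"Hiring:     ${hiring_cost:,.0f} per year ({hires_needed} new workers)")
print(f"Automation: ${automation_cost_per_year:,.0f} per year (amortized)")
```

In theory, the option with the lowest sustained cost per unit of demand wins. In practice, as the next section argues, the decision rarely gets framed this cleanly.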

In short, in both scenarios – market growth and market shrinkage – a professional manager should be continuously monitoring the need for labor and making regular (monthly? quarterly? yearly?) adjustments to staffing levels to meet output demand. Visually, a company's headcount should theoretically exhibit a stairstep pattern that remains tightly correlated over time with the output of the company's processes. Something like this:

That's why MBA students take classes in managerial accounting and operations management, right? Perhaps those MBA students need a better class in Human Behavior and Organizational Design. What actually happens in many (most?) corporations is vastly different.


Corporate Reality

This process of continual optimization theorized by business school curricula (and suggested by common sense) is distorted, or short-circuited entirely, in large corporations by a consistent set of human behaviors:

  • Middle management is reluctant to share early indications of future bad news with senior executives. Senior execs don't want to hear bad news or excuses, only results, no matter how absurd the goals may be or how bad the external environment might be.
  • Senior executives are often reluctant to act upon legitimate bad news after hearing it, either for fear of consequences from a board or a belief they can achieve impossible results by sheer force of their will (see The Steve Jobs Reality Distortion Field).
  • People in organizations are prone to empire building. Headcount is equated with influence, and increasing headcount is frequently a requirement for promotion to higher titles and pay, so virtually no manager is going to VOLUNTEER to surrender headcount (even after a voluntary exit) as "unneeded".
  • Senior executives often protect their turf at the expense of other organizations. When a company reaches a point where wholesale cuts are required, many executives will argue their department is different and is tied to revenue or "customer experience" and that cuts need to come from elsewhere. Anywhere but my department. I run a tight ship, my department is perfectly sized, everyone else is bloated and inefficient.
  • Seemingly continuous "re-organizations" every time an executive role changes hands, which shift responsibilities to new leaders who don't understand the roles operating under them. This often produces one of two opposite but equally harmful problems. The new leader may not recognize that a newly inherited function is over-staffed for current needs and thus allow its bloat to go uncorrected. Alternatively, the new leader may accept a new responsibility without some or all of its current headcount, ensuring their new team will be perpetually overworked and further heightening middle managers' reluctance to let go of headcount without a gun to their head.

All of these human behavior traits lead to a consistent, inevitable result. Instead of headcount levels closely synchronizing with production demand as taught in school, headcount levels consistently trail increasing demand and are adjusted even less frequently on the downside of any demand curve. Instead of relatively small incremental adjustments that might actually line up with normal job-churn attrition over the course of a year, needed reductions queue up and become mass layoffs, dumping a much larger set of similar workers into the job market at the same time and making it harder for them to find new employment. Visually, the result looks like this:

Regulatory Failure

All of the behaviors described previously take place in any significantly large company with hundreds or thousands of employees. Obviously, looming financial problems are harder to ignore for smaller companies lacking the financial inertia of firms with millions or billions in revenue, but the principles at work are identical. A unique factor in the current environment is that many of the most notable corporations tied to large layoffs appear to be among the most profitable firms in the economy. One immediate response to that observation is SEE? That's why these companies are so profitable… They are immediately leveraging new technologies and laying off unneeded workers the second they are no longer needed. Isn't that what business school theory says they SHOULD be doing?

Excellent counterpoint.

Theoretically, that counterpoint might be valid in some cases. However, the biggest firms tied to this "flattening" share at least some non-coincidental traits:

  • They develop software for core AI algorithms
  • They develop hardware optimized to execute AI algorithms at vast scales
  • They sell "compute" (processing power, storage and network connectivity) required to develop and operate AI systems at vast scales
  • Their AI development work has violated copyrights and intellectual property rights of literally MILLIONS of individuals around the world.
  • Their EXISTING online platforms SHOULD require vast amounts of human labor to accurately / fairly enforce copyright, intellectual property rights and CSAM (Child Sexual Abuse Material) protections yet NONE of these firms meaningfully handle these responsibilities, saving themselves billions in costs while creating a wild west online environment.
  • Their NEW online AI platforms should ALSO require vast amounts of human labor to properly test these systems for adequate guardrails, yet NONE of these firms have devoted meaningful resources to properly testing this unproven technology – the entire world is beta testing these technologies in the real world.

Underpinning all of these factors is that many of these existing firms (Google, Microsoft, Meta/Facebook, Amazon, Apple) are gargantuan by every measure of power. Meta is currently the smallest of these firms, with a market capitalization of $1.941 trillion. Microsoft is currently the largest among this group by market capitalization at $3.9 trillion. Nvidia is currently THE largest corporation by market capitalization at a staggering $4.5 trillion, but they have NOT engaged in mass layoffs… at least yet.

It is absolutely the case that there are characteristics of the computer software and hardware industries that provide economies of scale that make operating these types of businesses at these scales extremely profitable. However, there is nothing unique about the computer software and hardware industries that negate lessons learned over centuries about the harm done to society by monopolies. In every generation, in every economy, in every sector, monopolies reduce supply, raise prices, limit choice and stifle innovation. EVERY. TIME.

The unique aspects of these firms and their line of business do not mitigate these damages; they make them far worse. By limiting choice and innovation in the functionality of systems used as mass media for news, marketing and personal communication, firms operating at these scales are creating distinctly high levels of damage to the societies in which they operate. Damage which should have been corrected ten to fifteen years into their existence through proper enforcement of anti-trust laws already on the books.

So why are these tech giants making such large staffing cuts amid record profits? Quite simply, because ongoing operation as monopolies with little meaningful correcting influence by the government has trained them to believe they can continue developing more software and hardware used for ever more critical purposes with less quality control and continue to enjoy all of the upside. All of the downside stemming from poor, un-innovative design and non-existent quality control becomes an externality applied to customers, who spend extra patching buggy systems or buying even more software automation to monitor and correct security problems produced by these products. And all of the traffic that used to arrive on web sites looking for content or product information simply disappears, gradually or all at once, as AI summaries make click-through to original content pointless. How can a small company prove a giant Goliath stole a click that never arrived? In such an insular environment, there's no incentive to retain extra staff beyond what's required to deploy the next buggy release. As Martin Weir of Get Shorty might have asked… What's my motivation (to do anything different)?


Back to the Larger Great Flattening

So if these tech giants already making oodles of money off AI technology are just obeying their monopolistic instincts, what is motivating companies in the larger economy? Again, the motivations affecting these big tech firms are only unique in their MAGNITUDE, not their KIND. All of the human behavior patterns discussed previously take place in every company. All of those behaviors create lag effects that build up over decades and, like water, become invisible to the fish swimming in the corporate environment. So why do these job reductions seem to be concentrated on the vaunted "knowledge workers" of only a decade ago?

Multiple reasons...

First, by definition, "knowledge worker" tasks such as software coding, software testing, software requirements writing, budget analysis, etc. all involve a great deal of reading, writing and summarization, and 100% of it is done electronically in Word documents, emails and spreadsheets. This is the perfect work product for slurping into an AI training cycle or entering into an AI prompt for new content, since the source "knowledge" and the "ask" are both in text formats suitable for processing with Large Language Model based tools. In contrast, it could be forever before AI systems come after the jobs of carpenters, roofers, plumbers, barbers, dentists and surgeons. It will prove impractical to devise a robotic system that can perform a task that is demanding in both physicality and skill. Fast food jobs are different because they are highly specialized and simplified to require virtually zero training or physical strength / skill.
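
To make that concrete, here is a minimal sketch of how a block of typical "knowledge worker" text can be handed straight to an LLM. It uses the OpenAI Python client purely as one familiar example; the model name, the prompt wording and the idea of summarizing a status report are all illustrative assumptions, not a recommendation of any vendor or workflow.

```python
# Minimal sketch: a status document is already plain text, so passing it to an
# LLM for summarization is a one-call operation. Model choice and prompt are
# illustrative assumptions only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

status_report = """
Week of 8/4: portal login defect count down from 42 to 17.
Vendor integration slipped one sprint due to certificate issues.
Forecast spend for Q3 tracking 6% over plan.
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical choice; any chat-capable model would do
    messages=[
        {"role": "system", "content": "Summarize project status for an executive in three bullets."},
        {"role": "user", "content": status_report},
    ],
)

print(response.choices[0].message.content)
```

The point is not that the output is good; the point is that no physical-world translation step stands between this work product and the model, which is exactly why these roles are first in line.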

Second, as the innovations of the first Internet technology wave of 1996 to 2004 took root in corporations, most recognized the need to redesign corporate systems for commerce, internal production and order fulfillment, which triggered a stream of internal software development projects to design customer portal websites, internal customer service agent tools, etc. Applications that may have started only needing to serve up simple pages to hundreds or thousands of users per day rapidly grew to where the system needed to support thousands of users per minute. These development efforts required more formal design and planning to keep them remotely near their budgets and delivery dates. This resulted not only in hiring more developers and testers but also people with new-fangled responsibilities manufactured as part of new software development processes that promised to solve all of the productivity and quality problems exploding with this crush of work. Few of those new methodologies worked as claimed, yet the new functions became ingrained in headcount charts across nearly every corporation.

When the second "big data" technology wave arrived between 2004 and 2016, the poorly implemented systems developed in the first wave often underwent "enhancements" to tie them together with real-time feeds or leverage massive data repositories to automate customer service inquiries and trouble diagnostics. This work required even more rounds of development, with all of these new software development lifecycle (SDLC) processes requiring even MORE workers who weren't actually tied to core development or testing of code. An application whose original iteration in 2000 might have taken 15-20 people to code, test and deploy over 9 months might now be assigned 75-100 people over 15-18 months to refactor it to support iPhone, Android and web views. Curiously, the count of actual CODERS and TESTERS might still only be 20-25 people.

In short, software development processes in most corporations are incredibly bloated with headcount that never touches a line of code during development or testing and has no involvement with deploying, running and monitoring the system in production. Within development circles, the term "10x developer" is commonly thrown around when talking about "productivity." Productivity for developers is inherently difficult to quantify consistently / fairly – is "more source code" for a given problem "more" productive than less source code? NO. Is faster calendar delivery of code better than slower calendar delivery of code? NOT NECESSARILY. In general, this "10x developer" concept describes a pattern seen in nearly every large development organization: in a team of twenty developers, a small subset – maybe only four – seems to do 80% of all of the development of code that reaches production. This is seldom quantified or proven, but based on decades of experience, the SENSE of it definitely rings true for actual coding work. When applied to the larger collection of headcount AROUND core development but not DOING core development, it is ABSOLUTELY the case. This means there is likely a large number of people who describe their role as related to software development who never actually write a single line of SQL, C#, Java or Python yet count themselves as developers when laid off. They are not.
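
For anyone who wants a crude, first-order look at that concentration claim on a real codebase, the sketch below tallies commits per author from git history and reports what share the top fifth of contributors account for. Commit counts are a poor proxy for value (the paragraph above explains why productivity resists simple measurement), so treat this strictly as an illustration; the repository path is a placeholder.

```python
# Rough sketch: measure how concentrated commit authorship is in a repository.
# Commit counts are NOT a productivity metric; this only illustrates the
# "small subset does most of the committing" pattern described above.
import subprocess
from math import ceil

repo_path = "/path/to/some/repo"  # placeholder path

# `git shortlog -s -n --no-merges HEAD` prints "<count>\t<author>" per line,
# sorted by commit count in descending order.
output = subprocess.run(
    ["git", "-C", repo_path, "shortlog", "-s", "-n", "--no-merges", "HEAD"],
    capture_output=True, text=True, check=True,
).stdout

counts = [int(line.strip().split("\t")[0]) for line in output.splitlines() if line.strip()]
total = sum(counts)
top_n = max(1, ceil(len(counts) * 0.2))   # top 20% of authors
top_share = sum(counts[:top_n]) / total   # output is already sorted descending

print(f"{len(counts)} authors, {total} commits")
print(f"Top {top_n} authors account for {top_share:.0%} of commits")
```

Run against most long-lived enterprise repositories, numbers like these tend to support the SENSE described above, even though they say nothing about the quality or difficulty of the work behind each commit.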

Lastly, this "overhang" of dead weight in indirect job functions related to software projects has gone uncorrected over the last twenty years precisely because of the empire building and turf-protecting patterns described previously. Many of these roles were not added to the managers operating actual development teams but to separate "project management" teams operating in parallel within the development organization. That organizational chart design choice immediately destroyed any incentive to eliminate job functions that had no demonstrable effect on productivity or quality and instead created a new turf to be defended and even expanded to justify the existence of new chains of management. Ask any person employed in an IT organization that has adopted "Agile" development in the last decade and they will confirm this phenomenon.

As a final factor acting to cement this inefficiency into organizations, the rapid evolution of software architectures has virtually guaranteed that mid-level technical managers have zero intuition for appropriate technical designs and required development time for current applications. Architectures have evolved from mainframe apps in the 1960s and 1970s, to isolated desktop apps in the 1980s, to client/server applications in the early 1990s, to server-based web portals from roughly 1995 to 2010, to browser / smartphone centric apps from 2015 to the present. Any mid-level or senior-level IT leader has experience that is likely two generations out of date, making it very difficult for them to argue against the continued use of the inefficient practices that contribute to so much headcount bloat. They may recognize the problem intuitively, but they are unable to articulate it effectively enough to convince everyone else to abandon a failing methodology.

The net-net of all of the above is that IT roles in particular seem uniquely primed for cutbacks even without Artificial Intelligence solutions promising productivity improvements. Artificial Intelligence merely provides an easy EXTERNAL excuse to make these giant corrections without existing management having to state these inefficiencies were present all along and should have been identified and eliminated over the last ten to fifteen years.

For functions outside IT like financial analysis, the rationale might be similar but not as exaggerated. Large corporations typically create special budget analysis teams that operate in parallel with large departments like IT, Customer Service, Manufacturing, etc. and use data collected from payroll and ERP systems to track departmental spending against budgets on a weekly basis. At my former employer, this continuous true-up process was a joke and a nightmare because data quality in the source systems was sketchy and the analysis and true-ups were performed in Excel (in a department spending $100 million per year in expense and $50+ million in capital). At some level, the entire exercise was even more pointless because at any point in the year, approved capital dollars might be reduced or eliminated for a project with zero notice, requiring panicked rebalancing and reprioritization of work. If reports confirmed a project was going to materially overspend but the project was tied to a pet executive goal, funds WOULD be found, making the time mid-level managers spent truing up the data worthless. In these types of environments, it's possible this brain damage could be done equally well by AI-based tools that could skim the source data and produce the same summary Excel spreadsheets and PowerPoint bullet summaries. The analysis was already consistently flawed, why not let AI do it and let five budgeteers per department go?
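
The mechanical core of that weekly true-up is not complicated, which is part of why it is such an easy target. Below is a minimal pandas sketch of the variance calculation; the file names, column names and the 10% overrun threshold are invented for illustration and will not match any particular ERP or payroll export.

```python
# Minimal sketch of a weekly budget vs. actuals true-up.
# File names, columns and thresholds are hypothetical placeholders.
import numpy as np
import pandas as pd

budget = pd.read_csv("budget_by_cost_center.csv")    # columns: cost_center, month, budget_usd
actuals = pd.read_csv("actuals_by_cost_center.csv")  # columns: cost_center, month, actual_usd

# Line up budget and actuals, treating missing rows as zero spend / zero budget.
report = budget.merge(actuals, on=["cost_center", "month"], how="outer").fillna(0.0)

report["variance_usd"] = report["actual_usd"] - report["budget_usd"]
report["variance_pct"] = report["variance_usd"] / report["budget_usd"].replace(0, np.nan)

# Flag anything running more than 10% over plan for the summary deck.
overruns = (
    report[report["variance_pct"] > 0.10]
    .sort_values("variance_usd", ascending=False)
)

overruns.to_csv("weekly_trueup_summary.csv", index=False)
print(overruns.head(10).to_string(index=False))
```

None of this is sophisticated, and that is the point: when the "analysis" amounts to merging two extracts and flagging variances, the case for keeping a parallel team of budgeteers rests on trust and institutional habit, not on the difficulty of the work.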


Implications for the Future Job Market

Any analysis of future impacts from this wave of layoffs must address four different pools of workers, which will be termed here 1) core IT workers, 2) near-IT workers, 3) analysis / reporting workers, and 4) entry level workers.

For core IT workers directly writing code, testing code or performing work that directly affects the functionality being built, there will likely be continued pressure to eliminate roles in this area until the longer life-cycle impacts of involving AI systems in development become known. Experts in development have already commented that AI excels at developing SMALL bits of STANDALONE code and can outperform most developers at that function. The same analysis has confirmed that AI is exceedingly poor at developing small code changes to EXISTING complicated systems or at designing and creating code for LARGE systems integrated with multiple other systems. That's no guarantee that an executive hearing an external vendor or consultant whisper in their ear about how they can rewrite all of your systems with AI for a mere $4 million won't take that bet and try it. However, it seems highly unlikely AI will be able to replace competent core developers doing real integration work in a typical corporate "enterprise" setting in the next five to ten years.

For "near-IT" workers, those working in proximity to actual development work and trying to translate internal development mumbo jump like stories, sprints, epics, roadmaps, backlog, burndown, retrospectives, etc. into upper management speak, the attempt to adopt AI for that reporting could likely be the straw that breaks the back of the camel known as agile and leads management to abandon most of this noise. If this is what you do for a living, it is likely the number of openings for this work will decline rapidly, meriting a pivot to other work.

For analysis / reporting workers, the impact of AI on these types of reporting roles is likely to be similar but smaller in magnitude than the impact forecast for "near-IT" workers. Simply put, most of the reporting and detail tracked by near-IT workers made sense to few inside IT and made ZERO sense to anyone outside IT, including clients who relied on IT to build their systems. Financial and operational reporting is different. It can have material impacts on processes that affect publicly reported numbers on quarterly statements, and – let's face it – MONEY means much more to any executive than technical gobbledygook about "development sprints" on an "agile project" building a portal for Customer Service. As a result, the internal consumers of this "non-IT" reporting are FAR more conservative about changing ANYTHING in the process and will be very slow to trust any change in those processes. That being said, anyone remaining in this line of work should broaden their skills, learn how AI can provide a front-end to the existing ETL (Extract / Transform / Load) tools that may already be used to compile the data being summarized and audited, and master those tools.

For entry-level workers, frankly the outlook is easier to predict. The types of work outlined above often provided opportunities for entry-level workers directly, or served as promotional slots whose turnover freed up openings for new entry-level hires. Any management trend that eliminates five or ten percent of jobs in any company, even if the elimination targets mid-career functions, inevitably harms entry-level opportunities by reducing the number of slots people can move up into and therefore the number of lower-level jobs they vacate.

It might be possible to suggest (hope?) that one outcome of using AI to push out a share of mid-career workers is that people in many other mid-career roles with aversions to technology may choose to leave those roles as well once they are forced to interact with AI-adjusted processes. If this were to happen, it MIGHT open up the remaining positions to people more comfortable with AI, providing an opportunity. Maybe… But those few openings would not only require familiarity with and confidence using AI for assembling that reporting but would still require significant understanding of the underlying business processes being tracked at both a financial and needs-of-the-business level. Domain expertise, as it is termed within industries.

For existing employees at all levels, one final general piece of advice would be to begin forming a view of the degree to which the management above you exhibits a steady hand on the proverbial wheel or follows the latest management fads or advice from expensive consultants who never seem to leave the executive floor of the building. If you feel your management tends to follow fads, especially in disciplines they already show no interest in understanding, pay very close attention to the introduction of any tools or systems tagged as "AI", especially when they directly impact your job role. Trust your instincts if you suspect you have become a target for elimination and plan accordingly.

The most useful advice for new graduates and those in college looking at an employment landscape being re-sculpted by AI like a combination tsunami, earthquake and hurricane might be this. Learn the basic fundamentals of AI from a terminology and functionality perspective. This does not mean reading and memorizing the whitepaper Attention Is All You Need and understanding the matrix algebra therein. It means understanding how AI systems encode data, train on it, then use it for processing, and the shortcomings of those processes. (HINT: Read up on the variability of training data in different languages and the selection of sources that provided the petabytes of data needed to improve system metrics.) If you are still in school, to the extent possible, fit courses providing basic insight into business operations, accounting or marketing into your remaining coursework to gain familiarity with how "business" people think and the terminology they use to describe their work. Over the course of a career, being able to converse in business terms with different teams across a company will be more valuable than knowing how to generate a bitchin' graph of revenue across products over the last two years. If you're a recent graduate still looking for a first full-time job, work to develop a conversational familiarity with the basics of AI and, as best you can, ask implicitly or explicitly during any interviews how the prospective employer is approaching AI as it relates to the role you're considering.
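
As a starting point for that "how AI systems encode data" homework, the minimal sketch below shows the first step an LLM actually performs on text: splitting it into tokens and mapping each token to an integer ID. It uses the tiktoken library as one concrete example of a tokenizer; the encoding name is just a common choice and the sample sentences are invented.

```python
# Minimal sketch of the first stage of LLM text processing: tokenization.
# tiktoken is used here only as a convenient, widely available example tokenizer.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a common encoding; the choice is illustrative

samples = [
    "Quarterly revenue grew 4% despite headcount reductions.",
    "El ingreso trimestral creció un 4% a pesar de los recortes.",  # the same idea in Spanish
]

for text in samples:
    token_ids = enc.encode(text)
    pieces = [enc.decode([tid]) for tid in token_ids]
    print(f"{len(token_ids):3d} tokens: {pieces}")
```

Running something like this on the same sentence in a couple of languages makes the hint above tangible: text outside English is typically split into more, smaller tokens, one visible symptom of how unevenly the underlying training data is distributed across languages.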


WTH