RIP To The Career Ladder
For young workers in AI-exposed jobs, the ladder is already collapsing. What replaces it is a lattice, with a different set of skills at the top. Which version of that lattice wins depends on what we do right now.
The Writing On The Wall
Applied AI practitioners have seen this coming for more than a year. In March 2026, Gary Sheng and Ron Roberts named it directly in The Writing on the Wall: The Rise of Applied AI and the Life-or-Death Choice Every CEO Must Make Now: every leader, every worker, every young person deciding what to study this year is standing at a fork. The runway to decide is measured in quarters.
The data has caught up. Three papers, published between November 2025 and March 2026, put numbers on what practitioners have been living.
The Canary (Stanford, Nov 2025)
Stanford's Digital Economy Lab published Canaries in the Coal Mine. Economist Bharat Chandar and his collaborators Erik Brynjolfsson and Ruyu Chen tracked millions of U.S. workers through ADP payroll data.
In AI-exposed occupations (software development, customer service, administrative roles), employment for workers aged 22 to 25 was growing 16% slower than in less-exposed occupations. Workers aged 30 and over in the same fields were still on trend.
Follow-on figures sharpened it. Employment for software developers aged 22 to 25 has now fallen nearly 20% since late 2022, the exact moment generative AI tools entered mainstream use. Devs 30 and over in the same companies saw employment grow 6% to 12% in the same window.
Same jobs. Same firms. Different ages. Opposite direction. That is the shape of a rung collapsing.
The Corporate Signal (Duke + Atlanta & Richmond Fed, March 2026)
The decision-maker data came next. The Duke CFO Survey, run with the Federal Reserve Banks of Atlanta and Richmond, polled 750 CFOs and projected roughly 502,000 AI-driven job cuts in 2026, about nine times the 2025 figure.
The authors are careful: that is still 0.4% of the U.S. workforce ("not the doomsday scenario"). But the shape matters.
- Half the losses are projected in high-skill services.
- Most cuts are preemptive: companies acting on what they expect AI to do, not what it has already demonstrably done in their own firm.
- Over the next three years, the same executives predict AI will boost productivity 1.4%, raise output 0.8%, and cut employment 0.7%.
This is what the Survivor Economy looks like from the finance team's spreadsheet. Inside the companies that employ most of the workforce, a quiet sorting has begun. The people who can harness AI become load-bearing. The ones who cannot are being written out of the org chart, one quarterly plan at a time.
The Compounding Gap (Anthropic, March 2026)
And the capability side is widening the gap every day. Anthropic's Economic Index: Learning Curves report found that experienced users achieve roughly 10% higher success rates in conversations than newer users, controlling for task type and country. High-tenure users tackle measurably more sophisticated work over time.
The report names the dynamic directly: skill-biased technological change. Early, more-technical adopters pull ahead, and the gap compounds. The people already building with AI every day in 2026 are becoming a different economic class than the people who started last month.
The Picture
Put the three signals together:
- Entry ramps narrowing. (Stanford)
- Organizations reallocating preemptively. (Duke / Atlanta Fed)
- Early adopters compounding. (Anthropic)
That is the writing on the wall. The runway to respond is measured in months.
The Ladder That's Breaking
Classical career progression assumed a ladder. Junior roles did the implementation work (book knowledge, rote execution, entry-level tasks) and earned the tacit knowledge that unlocked senior roles (judgment, strategy, relationships, taste). The ladder worked because the implementation work had to be done by someone, and doing it was how the next generation learned.
AI is increasingly capable of the implementation work. The rungs are collapsing upward.
For young people, this creates a specific problem: the on-ramp is narrower. For incumbents, it creates a subtler one: the next generation of senior operators is not being trained the old way, even in firms that think they are fine today.
And individual companies have a weak incentive to solve it. Chandar's point: firms know they need future seniors, but they also know their juniors can leave. So each firm hires and trains fewer juniors than would be good for society as a whole. That gap between private incentive and social benefit is where a movement like this one has to work.
The Lattice That Replaces It
Chandar and his collaborators name the replacement: a career lattice. Instead of one track from junior to senior, you get movement across roles driven by where demand is showing up. Careers become more fluid, more responsive, more learnable on the fly. That is the hopeful version. It only shows up if we upgrade the way people learn.
The pessimistic version is real too. If we do not change anything, young workers get stuck on the wrong side of the automation line and incumbents rest on tacit knowledge that AI eventually learns too. Nobody is permanently safe.
The Applied AI Society exists to build the hopeful version.
What Becomes More Valuable
Chandar identifies three categories of work AI is less capable of in the short to medium term: physical tasks, strategic thinking, and social interaction. Everything on that list becomes leverage. We would extend it in a few directions our field notes keep surfacing.
Strategic thinking
Strategy is the new execution. As implementation commoditizes, the bottleneck moves to: what should be built, what should be executed, what is true about our situation, what do we actually want?
This is what a manager does. It is also what a founder does. And increasingly, it is what every individual contributor does as they guide AI agents to produce work. Knowing how to define reality, set objectives, evaluate outputs, and steer a system is the skill that compounds fastest right now.
Agentic strategy is one of the clearest expressions of this shift. When you have an AI with deep context on your life and work, you can pressure-test plans, surface blind spots, and interrogate your own certainty on a daily basis. That practice trains strategic thinking the way reps train a muscle.
Social skills and relationship capital
The usual AI-and-work discourse underweights this category. It should be at the center. A few beats:
- Trust is built in person. Founders close deals because someone believed in them. Partnerships compound because two people actually like each other. Communities form around people who show up. AI does not make any of that less true. It makes it more true, because the ambient level of slop rises and the premium on a real human relationship rises with it.
- Cultural nuance matters more as work globalizes. Globalized AI means globalized opportunity. The practitioners pulling work toward them are the ones who can read a room, adapt across contexts, and communicate with warmth. Our own chapter leaders opening AAS presence in new cities and countries are living this every day: language, empathy, and cultural pattern-matching are the difference between a chapter that takes root and one that doesn't. (See Tim Dort-Golts' transformation for one close-up.)
- Being someone's go-to person is a moat. When capability levels off across everyone with a Jarvis, the differentiator is the human your client calls first. Relationship capital is the hardest form of leverage in the AI economy, full stop.
Tacit and hyper-local knowledge
The book-learnable work is the most exposed to AI. What is less exposed: the knowledge that lives in bodies and neighborhoods. How does this city actually work? Who are the real decision-makers? What is the unwritten etiquette of this partnership? What do customers say off the record that they would never put in a survey?
AI learns from text. Tacit knowledge is earned on the ground. The people who can go, see, listen, adapt, and bring that knowledge back (and then document it so others can build on it) are the ones AI cannot replace.
Taste, judgment, and knowing what to build
Chandar frames this well: humans have to "reflect on what it is that we want." That is taste and judgment. It is one of the last things AI will reliably do for you, because it requires a body, a life, lived values, and the kind of reflection that only happens when you are the person who has to live with the consequences.
The practitioners pulling away from the pack are the ones who can look at a tool-driven output and say: this is not quite right, and here is what would make it true. That is discernment. It compounds with every loop.
Community leadership
This is the meta-skill that ties the others together.
We keep watching the same pattern emerge: cities become soulful AI cities when one to three people decide to host the recurring thing. Capital does not do it. People do. They show up, they make it legible, they invite, they introduce, they sponsor, they keep going. In Austin, in LA, in Dallas, in Bordeaux, every healthy chapter started this way.
Community leadership is disposition-intensive more than training-intensive. Warmth, consistency, abundance mindset, willingness to make things happen. These are human capacities that AI amplifies rather than replaces, and they are now among the most important skills anyone can cultivate. (See Build Your Jarvis In Community.)
The AAS Response
Our mandate sits directly on this shift.
We help people get Jarvised. A Personal Agentic OS puts you on the augmentation side of the line. You stop competing with AI on tasks it is better at. You start using it to sharpen judgment, amplify thinking, and multiply your reach on tasks only you can do. The Supersuit Up workshop is the on-ramp.
We teach the meta-skills, not the tool. The specific harness will change every six months. The practice of managing your own context, defining what you want, iterating on your own thinking, leading your own community: that does not. Hyperagency is vendor-neutral and life-portable.
We refuse to foster dependence. Our playbooks are open source. Our docs are public. Here is the best we know. Go do it. Improve it. Share back. That is how the floor rises across the whole labor market at once.
We build local community because social skill and relational capital are now load-bearing. People learn applied AI by doing it next to each other. Chapter leaders are a critical pillar of the transition, and the Community Leader role is one of the emerging careers we are documenting as it forms.
What to Do If You're Young
Use the tools as much as possible. Build with them. Get so comfortable with them that you can tell where they are strong and where your judgment has to step in.
Double down on what AI is bad at. Read widely. Have hard conversations in person. Show up to recurring things. Be the one who follows up. Build a relationship graph by being useful and by being a real friend to people.
Treat strategic thinking as a muscle you train every week. Brain dumps, artifact creation, decision logs. That is how strategy becomes the new execution in your own life.
And find your local community. If there isn't one, start it. The skill of building soulful community in your city is one of the highest-ROI investments you can make right now, and it is not on any standard curriculum.
What to Do If You're Leading
Hire young people who are already applied-AI fluent. Not just any young people: the ones who have built a Jarvis, who think in skill files, who have been running their own operating system for months. That capability compounds on day one inside your business.
Train them deliberately. The private incentive is to hire fewer juniors because they might leave. The social consequence of everyone acting on that incentive is a generation that never develops. Be the kind of operator who takes the social side of the tradeoff seriously. The short-term cost is real. The long-term payoff is a bench of applied-AI-native operators who know your business from the inside out.
Invest in the dimensions AI does not commoditize. Judgment, taste, relationships, community, and the practice of defining reality. These are the things your organization will compete on in five years.
The Opening
The ladder is breaking. That is real. The lattice is possible. That is also real.
What the Applied AI Society is building (Supersuit Up workshops, open playbooks, local chapters, the public docs) is the infrastructure that turns the possibility into the default. If we do this well, the next generation does not end up stuck on the wrong side of the automation line. They end up operating the system from above it, with a Jarvis at their side and a community around them.
That is the work.
Further Reading
External research and framing
- The Writing on the Wall (Sheng & Roberts, March 2026): The framing piece. Names the applied-AI transition as the life-or-death choice every leader now faces.
- Canaries in the Coal Mine (Brynjolfsson, Chandar, Chen, Stanford Digital Economy Lab, Nov 2025): The paper that named the employment signal.
- Anthropic Economic Index: Learning Curves (March 2026): Skill-biased technological change already visible in Claude usage; experienced users pull ahead.
- Anthropic Economic Index: Labor market impacts (March 2026): Observed-exposure framework. Slower hiring into entry-level AI-exposed roles rather than mass layoffs.
- Artificial Intelligence, Productivity, and the Workforce (Duke CFO Survey + Atlanta & Richmond Fed, March 2026): 750-CFO survey. Projected ~502K AI-driven cuts in 2026 (~9x 2025), largely preemptive, concentrated in high-skill services.
AAS concepts and playbooks
- Jevons Paradox: The economic mechanism underneath. As AI commoditizes execution, total demand for smart humans expands into work that was previously uneconomic.
- The Survivor Economy: What the preemptive sorting looks like from inside the companies doing it.
- Agentic Strategy: Strategy is the new execution, practiced daily.
- Hyperagency: What it looks like when a human wraps AI around themselves.
- Being Someone's Go-To Person: Why relationship capital is now the hardest form of leverage.
- Build Your Jarvis In Community: Why this isn't a solo activity.
- The Amplification Effect: How skill gaps compound at AI speed.
- Your Two Futures: The daily decision underneath all of this.
- Community Leader: The emerging career built around local applied AI community.
- Tim Dort-Golts' Personal Transformation: One young practitioner living the shift in real time.