In partnership with

You Deserve a Better Intranet

A modern intranet like Haystack streamlines workplace operations by centralizing knowledge, communication, and resources.

Employees will no longer waste time hunting through email chains or scattered folders—they can find what they need in seconds.

With customizable templates, clear layouts, and multimedia capabilities, teams can create and share content that is easy to read, navigate, and reference. Haystack turns your intranet into an interactive, engaging resource hub that supports collaboration and knowledge retention.

Upgrading your intranet boosts efficiency across departments, reduces duplicated work, and ensures consistent, accurate information is accessible to everyone. Employees stay informed, aligned, and empowered, while leadership gains visibility into engagement and usage.

Haystack transforms your intranet from a static repository into a dynamic platform that drives productivity, connection, and culture.

TL;DR


• Spotify’s top engineers haven’t written code since December. They supervise AI via Slack. 50+ features shipped in 2025 using this model.

• CS enrollment posted its first decline in 20 years. 62% of computing programs nationally reported drops. Students are migrating to AI-specific degrees.

• Google’s startup lead says LLM wrappers and aggregators are dead. The only moat is domain expertise, proprietary data, or vertical depth.

• 93% of developers use AI assistants. But organizational productivity gains have plateaued at ~10%. The bottleneck has shifted from writing code to reviewing, judging, and directing it.

• Anthropic’s own Agentic Coding Trends Report confirms it: the engineering role is moving from “code writer” to “orchestrator and verifier.”

• The actionable framework: Audit your role. Automate the commodity tasks. Double down on what AI can’t replicate: judgment, relationships, and domain knowledge.

Five Signals, One Story

In a single two-week span this February, five seemingly unrelated events landed in the same news cycle: Spotify revealed its top engineers haven’t typed code since December. UC system CS enrollment dropped for the first time since the dot-com bust. A Google VP declared two hot AI startup models dead on arrival. Anthropic shipped Opus-class AI at one-fifth the price. And Anthropic’s own research team published a report describing the end of “code-first” engineering.

Read separately, each story gets a day of headlines and a week of LinkedIn debates. Read together, they tell one story: the era where “I can code” was a sufficient career strategy is ending. The era where “I know what to build, why, and how to direct AI to build it” is beginning.

This deep dive unpacks each signal, connects the dots, and gives you a concrete action plan to future-proof your career or your team before the shift becomes obvious to everyone else.

By the Numbers

• Spotify features shipped via AI (2025): 50+, with zero hand-written code from top devs since Dec
• UC system CS enrollment decline: 6% YoY (first drop in 20 years)
• National computing programs reporting declines: 62% (Computing Research Association, Oct 2025)
• Princeton CS majors (Class of 2028 vs 2026): 74 vs 150 (–51%)
• Developers using AI coding assistants: 93% monthly; 75% weekly (DX Research, Feb 2026)
• AI-authored production code: 26.9% of all code merged (up from 22% last quarter)
• Org-level productivity gain from AI coding: ~10% (plateaued since mid-2025)
• Sonnet 4.6 vs Opus pricing: $3/$15 vs $15/$75 per million tokens (5x cheaper)
• AI startups raising $100M+ (first 49 days of 2026): 17 companies

Signal 1: Spotify’s Engineers Became Supervisors

During Spotify’s Q4 earnings call on February 10, co-CEO Gustav Söderström made a statement that ricocheted through every engineering Slack channel in Silicon Valley: the company’s most senior developers have not written a single line of code since December 2025. They describe ideas in natural language through Slack, and an internal system called “Honk” — built on Anthropic’s Claude Code — generates, deploys, and delivers the finished feature back to the engineer’s phone for review.

This isn’t a prototype workflow for a hackathon team. Spotify shipped over 50 features and changes to its app throughout 2025 using this system, including AI-powered Prompted Playlists, audiobook Page Match, and About This Song. Söderström called it “just the beginning.”

What makes this different from the usual CEO AI hype: Spotify isn’t claiming AI will someday transform engineering. They’re describing a system that is already running in production, already shipping features to 675 million users, and already eliminating the manual act of writing code for their most experienced engineers. The critical detail most coverage misses is that Honk isn’t generic Claude Code. It’s a custom layer optimized for Spotify’s codebase, coding standards, and deployment infrastructure. That specificity is what makes it reliable enough for production — and it’s the same kind of “deep vertical expertise” that Google’s Mowry says is the only real moat.

The counterargument surfaced immediately. Software engineer Siddhant Khare published a viral essay the same week arguing that AI-generated code is making his job harder, not easier. His complaint: reviewing an endless assembly line of AI pull requests feels like being a quality inspector who never gets to build anything. This tension — between the efficiency Spotify is describing and the drudgery Khare is experiencing — is the exact fault line the orchestration shift will create. The engineers who thrive will be the ones who set the direction, not the ones who merely approve the output.

Signal 2: Students Are Voting with Their Feet

The enrollment data is striking. Computer science majors across the University of California system dropped 6% this year after a 3% decline in 2024 — the first sustained retreat since the dot-com bust. The UC system still has nearly 12,700 CS undergrads, almost double the figure from a decade ago, so this isn’t a collapse. But it is an inflection point.

Nationally, 62% of computing programs reported enrollment declines this fall, per the Computing Research Association. At Princeton, the CS B.S.E. major dropped from 150 students in the Class of 2026 to 74 in the Class of 2028. Students are not leaving technology. They’re migrating. UC San Diego’s new AI-specific major attracted one in five applicants to its CS department. MIT’s AI and Decision-Making major is now the school’s second-largest. The University of South Florida enrolled 3,000 students in its new AI and cybersecurity college in its first semester.

The signal beneath the signal: Parents are steering children away from computer science and toward mechanical or electrical engineering — fields perceived as harder for AI to automate. Students are making rational economic calculations: if more than a quarter of production code is already AI-generated and Spotify’s best developers don’t write code, what exactly does a traditional CS degree prepare you for? The answer, increasingly, is “the foundational knowledge needed to orchestrate AI systems” — but that’s not how most CS programs are marketed or structured.

Signal 3: Google Says the Wrapper Era Is Over

On February 21, Darren Mowry — who leads Google’s global startup organization across Cloud, DeepMind, and Alphabet — went on TechCrunch’s Equity podcast and said the quiet part out loud: LLM wrapper startups and AI aggregators have their “check engine light” on.

The timing was brutal. In the 49 days between January 1 and February 17, seventeen U.S. AI startups raised over $100 million each — many on exactly the business model Mowry was declaring dead. His message was precise: wrapping “very thin intellectual property around Gemini or GPT-5” is no longer a path to growth. The models themselves keep improving and absorbing what wrappers used to provide. What’s left for startups? “Deep, wide moats that are either horizontally differentiated or something really specific to a vertical market.”

Why this matters beyond startups: Mowry’s analysis applies to individual careers just as well as it applies to companies. If your professional value is “I can operate the AI tool” — essentially wrapping yourself around a model — you face the same commoditization risk as a thin wrapper startup. The durable value, for both companies and people, comes from domain expertise that AI can’t easily replicate: knowing which problems to solve, which tradeoffs to make, and which outputs are actually correct in your specific context.

Signal 4: The Productivity Paradox

Here’s the data that ties everything together. Laura Tacho, CTO at DX and Austrian Innovator of the Year, presented research at the Pragmatic Summit in February 2026 covering 121,000 developers across 450+ companies. The headline findings: 92.6% of developers use an AI coding assistant at least monthly. 75% use one weekly. AI-authored code now accounts for 26.9% of all production code merged, up from 22% last quarter.

But here’s the paradox: organizational productivity has plateaued at roughly 10%, the same level it hit when AI coding tools first gained traction in mid-2025. Developers report saving 3.6 to 4 hours per week. That number hasn’t budged in two quarters.

The Faros AI research report, which analyzed telemetry from 10,000+ developers across 1,255 teams, explains why. AI is making individual developers faster at writing code — but it’s shifting the bottleneck to review, testing, and integration. Code is getting bigger and buggier. Pull requests need more scrutiny, not less. The teams seeing real gains are the ones that paired AI adoption with structural changes: better review processes, governance, and — critically — engineers who can make high-stakes judgment calls about what the AI produced.

This is the orchestration argument in data form. The companies where AI is a “force multiplier” (Tacho’s term) are the ones with engineers who direct, verify, and override AI output. The companies seeing twice as many customer-facing incidents are the ones that let AI generate without sufficient human judgment at the helm.

Signal 5: The AI Builders Agree

Anthropic’s 2026 Agentic Coding Trends Report — published by the company that builds Claude Code, the tool Spotify relies on — puts it plainly: the engineering role is shifting from writing code to orchestrating agents. Their specific predictions include that most “tactical work of writing, debugging, and maintaining code shifts to AI while engineers focus on higher-level work like architecture, system design, and strategic decisions about what to build.”

The report describes a future of multi-agent architectures where a human orchestrator coordinates specialized agents working in parallel — each handling defined tasks while the human makes judgment calls, resolves ambiguity, and sets direction. This closely matches what Spotify is already doing with Honk, what Google’s Mowry says is the only durable business model, and what the productivity data shows separates effective AI-augmented teams from struggling ones.

Notably, the report also acknowledges that AI is expanding who can write code. Security teams, operations teams, data analysts, and design teams are all increasingly using AI coding tools for tasks that previously required a developer. The barrier between “people who code” and “people who don’t” is dissolving — which means the old definition of what makes a developer valuable is dissolving with it.

What the Orchestrator Role Actually Looks Like

Nicholas C. Zakas, a veteran software engineer, published a widely circulated analysis in January 2026 titled “From Coder to Orchestrator.” He describes the transition in three phases, borrowing Addy Osmani’s framing: coder → conductor → orchestrator. We’re in the conductor phase now. The full orchestrator phase — where engineers manage fleets of specialized agents working in parallel — is likely 2–3 years away for most organizations, but Spotify is demonstrating it today.

The skills that matter in the orchestration era:

System Design & Architecture. Understanding how components fit together, where failure points exist, and how to structure systems for reliability. This was always valuable; now it’s the primary job.

Domain Expertise. Knowing your industry’s constraints, regulations, customer expectations, and edge cases. AI can generate a HIPAA-compliant data handler, but only a human who understands healthcare workflows knows whether it actually solves the right problem.

Verification & Judgment. 96% of developers say they don’t fully trust AI-generated code. The skill isn’t writing code — it’s knowing when the AI’s output is subtly wrong in ways that won’t show up until production.

AI Fluency. Understanding context windows, model strengths, prompt architecture, agent coordination, and cost optimization. This is the new technical literacy, replacing syntax mastery.

Communication & Translation. The ability to translate business requirements into AI-actionable instructions, and to explain AI outputs to non-technical stakeholders. This was a “soft skill.” It’s now a core competency.
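Cost optimization, one piece of that AI fluency, is plain arithmetic. A minimal sketch using the Sonnet 4.6 pricing quoted above ($3 input / $15 output per million tokens); the workload numbers are made-up illustration values, not a benchmark:

```python
# Per-million-token prices from the By the Numbers table above.
SONNET_INPUT_PER_M = 3.00    # USD per 1M input tokens
SONNET_OUTPUT_PER_M = 15.00  # USD per 1M output tokens

def call_cost(input_tokens, output_tokens,
              in_price=SONNET_INPUT_PER_M, out_price=SONNET_OUTPUT_PER_M):
    """Dollar cost of one model call at per-million-token pricing."""
    return (input_tokens / 1_000_000) * in_price \
         + (output_tokens / 1_000_000) * out_price

# Hypothetical workload: 200 agent runs a day, each reading a
# 30k-token context and producing 4k tokens of code.
daily = 200 * call_cost(30_000, 4_000)
print(f"${daily:.2f}/day")  # 200 * ($0.09 + $0.06) = $30.00/day
```

Someone fluent enough to notice that input context dominates that bill knows to trim or cache context before switching models — exactly the kind of cost judgment the skill list is pointing at.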

Your Career Hardening Checklist

Whether you’re a developer, a manager, or a non-technical professional who uses AI daily, here’s what to do this month:

This Week

Run the Role Disruption Audit from this week’s newsletter. Map your 10 most time-consuming tasks against AI Automation Risk and Human Edge Score. Identify your moat.

Audit your AI memory. Open ChatGPT, Copilot, or Claude settings and review stored memories. Delete anything you didn’t put there.

Try building one thing with an AI orchestration tool. Use Replit Agent, Claude Code, or Cursor to build a prototype for something you’d normally file as a feature request. Experience the orchestrator role firsthand.
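The audit in the first step above can be run in a spreadsheet, or in a few lines of code. A minimal sketch, assuming you score each task 1–5 on the two axes the newsletter names (AI Automation Risk, Human Edge); the task list and scores here are purely illustrative:

```python
# Each task scored 1-5 on the audit's two axes:
# risk = how automatable by AI, edge = how much human judgment it needs.
tasks = [
    # (task, automation_risk, human_edge) -- example values only
    ("writing boilerplate CRUD code", 5, 1),
    ("quarterly architecture review", 2, 5),
    ("triaging customer escalations", 3, 4),
    ("formatting status reports", 5, 2),
]

# Your moat: high human edge, low automation risk.
# Your automation candidates: the reverse.
moat = [t for t, risk, edge in tasks if edge >= 4 and risk <= 3]
automate = [t for t, risk, edge in tasks if risk >= 4 and edge <= 2]

print("Moat:", moat)
print("Automate:", automate)
```

The output of the audit feeds directly into the "propose one automation" step below: the `automate` list is what you hand to AI, the `moat` list is what you double down on.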

This Month

Shift 20% of your learning time from syntax to systems. Replace one coding tutorial with content on system design, architecture patterns, or your industry’s regulatory landscape.

Propose one automation to your manager. Pick one commodity task from your audit and present a plan to hand it to AI. Be the person who volunteers to automate part of your own job — that’s how you become the person who directs, not the one who gets directed.

Start a “Verification Habit.” Every time you use AI-generated output this month, spend 30 seconds documenting what it got wrong. You’re building the judgment muscle that separates orchestrators from rubber-stampers.

This Quarter

Double down on your domain. Take one course, join one community, or attend one conference in your industry vertical (not in “AI” generally). The orchestration premium goes to people who know their domain deeply enough to direct AI accurately.

Build a “supervisor portfolio.” Document projects where you directed AI to produce meaningful outcomes. This replaces the traditional code portfolio for your next career move.

Reassess your competitive position. Rerun the Role Disruption Audit. If your high-risk tasks haven’t changed, you haven’t moved.

Go Deeper

Spotify earnings call (TechCrunch): techcrunch.com

UC CS enrollment decline (TechSpot): techspot.com

Google VP’s wrapper warning (TechCrunch): techcrunch.com

DX Productivity Research (ShiftMag): shiftmag.dev

Anthropic Agentic Coding Trends Report: anthropic.com (PDF)

“From Coder to Orchestrator” (Nicholas C. Zakas): humanwhocodes.com

AI Productivity Paradox Report (Faros AI): faros.ai

This deep dive accompanies the iPrompt Newsletter for the week of February 23, 2026. Read the newsletter for the quick-hit version.

Stay curious — and stay paranoid.

— R. Lauritsen
