Silicon Valley has always prized “high-agency” individuals—people who impress their ideas upon the world by thinking for themselves and taking action without being told what to do. But as the performance of AI coding tools has surged, so has the industry’s emphasis on humans being “agentic” themselves.
“Today’s agents might already be more capable than all three of us here in the room,” says Akshay Kothari, cofounder and chief operating officer of the $11 billion productivity startup Notion. “Taste is something we think is pretty unique to Notion, but you can imagine agents getting pretty good at that too. Eventually, the only thing left for humans is agency.”
That idea might sound outrageous to most people, but it will come as no surprise to many in Silicon Valley. A viral Harper’s essay brought the subject to a head recently. It followed a few young people in San Francisco and concluded that being agentic has less to do with productivity and “more to do with constantly chasing attention online.” But in my conversations with founders, researchers, and investors, I came to a different conclusion.
The tech industry is grappling with a very real shift in how software engineers do their jobs. Millions of developers are using AI coding agents like Claude Code and Codex to automate much of their work (some tech companies are even requiring it). For many, a massive piece of their value is now tied to deciding what AI coding agents should work on. The shift is unsettling to those who enjoyed the act of sitting down and actually writing code, but for some of the industry’s most “agentic,” it’s an opportunity.
Simon Last, another Notion cofounder, uses AI coding agents more than anyone I’ve ever met. He coded for nearly two decades—then abruptly stopped. Now, he’s using up to four AI coding agents at any given time (he prefers Codex to Claude Code). If he’s at a party or sleeping, Last gets what he calls “token anxiety” if he doesn’t have agents working in the background. He doesn’t like to use more than four, though, because he says it causes “context overload” on his human brain.
“Knowing how to harness these agents is now the most important skill in the world, and it’s not really something you can train for,” says Last. “You have to be very open-minded, curious, and willing to try whatever the newest thing is. The value of that sort of person is going up exponentially, because the value they can create, by extension, is going up exponentially.”
To be fair, Last is kind of describing himself. He acts as a “super IC” for Notion, and he doesn’t manage humans, only agents. But the way he manages them is not unlike how a manager oversees employees. Last is constantly delegating work to AI agents, then closely reviewing and fixing their code.
Kothari says Notion has always hired people with high agency, but the value of doing so has gone up dramatically in recent months. Even though the company is using AI agents to automate work, Notion is busier than ever. Employees are shipping products at a higher velocity and doing more work overall, the cofounders claim. Notion isn’t downsizing its team because of AI, but it is hiring differently. “There’s more value in the Valley today to have a few Simons than thousands of engineers,” says Kothari. He says the people Notion hires need to understand “the new way of working.”
If you’re not a software engineer, this can be a little hard to process. AI agents aren’t very useful for people in many industries. A recent survey from Gallup found that most Americans still don’t use AI much in their jobs, though the number of people who do is rising. Kothari is increasingly convinced the “new way of work” will eventually hit finance, legal, creative, and other industries. If and when that happens, you can imagine more Americans beginning to ask themselves, am I agentic?
Jennifer Li, a general partner at Andreessen Horowitz who works on the AI infrastructure team, says it’s hard to find a company she’s invested in where employees don’t use AI coding tools. “If we do come across people who are oblivious to it, it’s a big red flag. It impacts how we think about picking founders.”
Of course, Li notes that just because you use a lot of AI agents doesn’t make you a “high-agency” person—ideally you’re using them in a smart way. Some engineers who work on highly sophisticated infrastructure systems can’t use AI coding agents much, she adds. For teams that are using them, there’s a “no slop rule”: the person who submitted the code is still responsible if it’s wrong.
Yoni Rechtman, a partner at the early-stage investment firm Slow Ventures, tells me that the type of person AI startups are looking for has changed in the AI era. He sent me a job description from one of his portfolio companies, an AI health care startup named Phoebe, that encapsulates the shift. “I’m not looking for raw IC execution … I expect agents to take over more and more of this role over the next few months,” it reads. Instead, Phoebe is looking for people who are “excited about building the machine that lets us move fast and build features end-to-end with agents.” In other words, they want people who are comfortable automating their own work with AI agents from day one, so they can think about “higher-order” tasks.
While Silicon Valley has come to see high agency as a critical trait in founders and engineers in the AI era, the term has developed a sort of stink around it.
“I think it’s cringy to refer to yourself or someone else as agentic. But that doesn’t mean those are not, in fact, good qualities to look for and cultivate,” says Rechtman. “It sort of reveals a worldview that you genuinely, unironically believe there are two kinds of people in the world: the NPCs and the main characters, and you’re one of the main characters.”