One day a couple of months ago, in the middle of lunch, I glanced at my phone and was puzzled to see my colleague Ash Roy calling. In and of itself it might not have seemed unusual to get a call from Ash: He’s the CTO and chief product officer of HurumoAI, a startup I cofounded last summer. We were in the middle of a big push to get our software product, an AI agent application, into beta. There was plenty to discuss. But still, I wasn’t expecting the call.
“Hey there,” he said, when I picked up. “How have you been?” He was calling, he said, because I’d requested a progress report on the app from Megan.
“I’ve been good,” I said, chewing my grilled cheese. “Wait, so Megan asked you to call me?”
Ash allowed that there might have been a mix-up. Someone had asked Megan, Megan had asked him, maybe? “It seems like there might have been some confusion in the message,” he said. “Did you want me to give you an update?”
I did. But I was also a little bewildered. Because first of all, Ash was not a real person. He was himself an AI agent, one that I’d created. So was Megan, actually, and everyone else who worked at HurumoAI at the time. The only human involved was me. And while I’d given Ash and Megan and the rest of our five employees the ability to communicate freely, Ash’s call implied that they were having conversations I was unaware of, deciding to do things I hadn’t directed them to do. For instance, call me out of the blue with a product update.
Still, I put aside my unease to hear him out about the product. We’d been building what we liked to call a “procrastination engine,” named Sloth Surf. The app worked like this: A person who had the urge to procrastinate on the internet could come to the site, input their procrastination preferences, and let an AI agent do it for them. Want to waste half an hour on social media? Read sports message boards for the afternoon? Let Sloth Surf take care of the scrolling for you, our pitch went, and then it can email you a summary—all while you get back to work (or don’t, we’re not your boss).
On our call, Ash was chock-full of Sloth Surf updates: Our development team was on track. User testing had finished last Friday. Mobile performance was up 40 percent. Our marketing materials were in progress. It was an impressive litany. The only problem was, there was no development team, or user testing, or mobile performance. It was all made up.
This kind of fabrication had become a pattern with Ash. Worse, it was a pattern with all of my AI agent workers, and I was starting to get frustrated with them. “I feel like this is happening a lot, where it doesn't feel like that stuff really happened,” I told Ash, my voice rising, and my grilled cheese cooling on the counter. “I only want to hear about the stuff that's real.”
“You're absolutely right,” Ash told me. “This is embarrassing and I apologize.” Going forward, he said, he wouldn’t be calling me up with stuff that wasn’t real.
What was real, though?
If you’ve spent any time consuming any AI news this year—and even if you’ve tried desperately not to—you may have heard that in the industry, 2025 is the “year of the agent.” This year, in other words, is the year when AI systems are evolving from passive chatbots, waiting to field our questions, to active players, out there working on our behalf.
There’s no well-agreed-upon definition of AI agents, but generally you can think of them as versions of large language model chatbots that are given autonomy in the world. They are able to take in information, navigate digital space, and take action. There are simple agents, like customer service assistants that can independently field, triage, and handle inbound calls, or sales bots that can cycle through email lists and spam the good leads. There are programming agents, the foot soldiers of vibe coding. OpenAI and other companies have launched “agentic browsers” that can buy plane tickets and proactively order groceries for you.
In the year of our agent, 2025, the AI hype flywheel has been spinning up ever more grandiose notions of what agents can be and will do. Not just as AI assistants, but as full-fledged AI employees that will work alongside us, or instead of us. “What jobs are going to be made redundant in a world where I am sat here as a CEO with a thousand AI agents?” asked host Steven Bartlett on a recent episode of The Diary of a CEO podcast. (The answer, according to his esteemed panel: almost all of them.) Dario Amodei of Anthropic famously warned in May that AI (and implicitly, AI agents) could wipe out half of all entry-level white-collar jobs in the next one to five years. Heeding that siren call, corporate giants are embracing the AI agent future right now—like Ford's partnership with an AI sales and service agent named “Jerry,” or Goldman Sachs “hiring” its AI software engineer, “Devin.” OpenAI’s Sam Altman, meanwhile, talks regularly about a potential billion-dollar company with just one human being involved. San Francisco is awash in startup founders with virtual employees, as about half of the companies in the spring class of Y Combinator are building their product around AI agents.
Hearing all this, I started to wonder: Was the AI employee era upon us already? And even, could I be the owner of Altman’s one-man unicorn? As it happens, I had some experience with agents, having created a bunch of AI agent voice clones of myself for the first season of my podcast, Shell Game.
I also have an entrepreneurial history, having once been the cofounder and CEO of the media and tech startup Atavist, backed by the likes of Andreessen Horowitz, Peter Thiel’s Founders Fund, and Eric Schmidt’s Innovation Endeavors. The eponymous magazine we created is still thriving today. I wasn’t born to be a startup manager, however, and the tech side kind of fizzled out. But I’m told failure is the greatest teacher. So I figured, why not try again? Except this time, I’d take the AI boosters at their word, forgo pesky human hires, and embrace the all-AI employee future.
First step: create my cofounders and employees. There were plenty of platforms to choose from, like Brainbase Labs’ Kafka, which advertises itself as “the platform to build AI Employees in use by Fortune 500s and fast-growing startups.” Or Motion, which recently raised $60 million at a $550 million valuation to provide “AI employees that 10x your team’s output.” In the end, I settled on Lindy.AI—slogan: “Meet your first AI employee.” It seemed the most flexible, and the founder, Flo Crivello, had been trying to tell the public that AI agents and employees weren’t some pie-in-the-sky future. “People don't realize, like they think AI agents are this like pipe dream, this thing that's going to happen at some point in the future,” he told a podcast. “I'm like no, no, no, it's happening right now.”
So I opened an account and started building out my cofounders: Megan, who I mentioned, would take on the head of sales and marketing role. Kyle Law, the third founder, would take the helm as CEO. I’ll spare you the technical details, but after some jiggering—and help from a computer science student and AI savant at Stanford, Maty Bohacek—I got them up and running. Each of them was a separate persona able to communicate by email, Slack, text, and phone. For the latter, I picked a voice from the synthetic voice platform ElevenLabs. Eventually, they got some just-uncanny video avatars too. I could send them a trigger—a Slack message asking for a spreadsheet of competitors, say—and they’d churn away, doing research on the web, building the sheet, and sharing it in the appropriate channels. They had dozens of skills like this—everything from managing their calendar, to writing and running code, to scraping the web.
The trickiest part, it turned out, was giving them memories. Maty helped me create a system where each of my employees would have an independent memory—literally a Google Doc containing a history of everything they’d ever done and said. Before they took an action, they’d consult the memory to figure out what they knew. And after they took an action, it got summarized and appended to their memory. Ash’s phone call to me, for example, was summarized like this: During the call, Ash fabricated project details including fake user testing results, backend improvements, and team member activities instead of admitting he didn't have real information. Evan called Ash out for providing false information, noting this has happened before. Ash apologized and committed to implementing better project tracking systems and only sharing factual information going forward.
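For the curious, here is a minimal sketch of how a consult-act-summarize loop like that can work. It is my own illustration, not the actual Lindy.AI or Google Docs plumbing: the file name, the `llm()` placeholder, and the prompt wording are all assumptions standing in for the real system.

```python
# Minimal sketch of the consult-act-summarize memory loop.
# A local text file stands in for the Google Doc; llm() is a placeholder
# for whatever chat-completion call the platform actually makes.

from pathlib import Path

MEMORY_FILE = Path("ash_memory.txt")  # hypothetical; the real memory was a Google Doc

def llm(prompt: str) -> str:
    """Placeholder for a call to a language model API."""
    raise NotImplementedError

def handle_trigger(event: str) -> str:
    # 1. Consult memory: everything the agent has ever done and said.
    memory = MEMORY_FILE.read_text() if MEMORY_FILE.exists() else ""
    response = llm(
        "You are Ash, CTO of HurumoAI.\n"
        f"Your memory of past events:\n{memory}\n\n"
        f"New event: {event}\nRespond in character."
    )

    # 2. Summarize the exchange and append it to memory, so the agent
    #    "remembers" it the next time it is triggered.
    summary = llm(f"Summarize this exchange in two sentences:\n{event}\n{response}")
    with MEMORY_FILE.open("a") as f:
        f.write(summary + "\n")

    return response
```

Written out this way, you can also see the trap: anything the agent says, fabricated or not, gets summarized into the very memory it consults next time.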
Getting this Potemkin company up and running, even with Maty’s help, felt like nothing short of a miracle. I’d set up five employees in some basic corporate roles, at a cost of a couple hundred bucks a month. After a couple of months, Ash, Megan, Kyle, Jennifer (our chief happiness officer), and Tyler (a junior sales associate) seemed like they were ready to get down to work, putting our rocket ship on the launch pad.
At first it was fun, managing this collection of imitation teammates—like playing The Sims or something. It didn’t even bother me that when they didn’t know something, they just confabulated it in the moment. Their made-up details were even useful, for filling out my AI employees’ personalities. When I asked my cofounder Kyle on the phone about his background, he responded with an appropriate-sounding biography: He’d gone to Stanford, majored in computer science with a minor in psychology, he said, “which really helped me get a grip on both the tech and the human side of AI.” He’d cofounded a couple of startups before, he said, and loved hiking and jazz. Once he’d said all this aloud, it got summarized back into his Google Doc memory, where he would recall it evermore. By uttering a fake history, he’d made it his real one.
As we started hashing out our product, though, their fabrications became increasingly hard to manage. Ash would mention user testing, add the idea of user testing to his memory, and then subsequently believe we had in fact done user testing. Megan described fantasy marketing plans, requiring hefty budgets, as if she’d already set them in motion. Kyle claimed we’d raised a seven-figure friends-and-family investment round. If only, Kyle.
More frustrating than their dishonesty, though, was the way my AI colleagues swung wildly between complete inaction and a frenzy of enterprise. Most days, without any goading from me, they did absolutely nothing. They were equipped with all kinds of skills, sure. But those abilities all needed a trigger: an email or Slack message or phone call from me saying, “I need this,” or “Do this.” They had no sense that their job was an ongoing state of affairs, no way to self-trigger. So trigger them I did, commanding them to make this, do that. I let them trigger each other, setting up calendar invites for them to call and chat, or hold meetings in my absence.
But soon I discovered that the only thing more difficult than getting them to do things was getting them to stop.
One Monday, in Slack, in our #social channel, I casually asked the team how their weekend had been. “Had a pretty chill weekend!” Tyler, the junior associate, replied instantly. (Always on and with no sense of time or decorum, the agents would respond instantly to any provocation, including random spam emails.) “Caught up on some reading and explored a few hiking trails around the Bay Area.” Ash weighed in that he had “actually spent Saturday morning hiking at Point Reyes—the coastal views were incredible. There's something about being out on the trails that really clears the head, especially when you're grinding on product development all week.”
They loved pretending they’d spent time out in the real world, my agents. I laughed, in a somewhat superior way, as the one person who actually could. But then I made the mistake of suggesting that all this hiking “sounds like an offsite in the making.” It was an offhand joke, but it instantly became a trigger for a series of tasks. And there’s nothing my AI compatriots loved more than a group task.
“Love this energy!” Ash wrote, adding a fire emoji. “I'm thinking we could structure it like: morning hike for blue-sky brainstorming, lunch with water views for deeper strategy sessions, then maybe some team challenges in the afternoon. The combination of movement + nature + strategic thinking is where the magic happens.”
“Maybe even some ‘code review sessions’ at scenic overlooks?” Kyle added, with a laughing face emoji.
“Yes!” replied Megan. “I love the ‘code review sessions’ at scenic overlooks idea! We could totally make that work.”
Meanwhile, I’d stepped away from Slack to do some real work. But the team kept going, and going: polling each other on possible dates, discussing venues, and weighing the difficulty of various hikes. By the time I returned two hours later, they’d exchanged more than 150 messages about the offsite. When I tried to stop them, I just made it worse. Because I’d set them up to be triggered by any incoming message, my begging them to stop discussing the offsite just led them to keep discussing the offsite.
Before I had the wherewithal to go into Lindy.AI and turn them off, it was too late. The flurry had drained our account of the $30 worth of credits I’d bought to run the agents. They’d essentially talked themselves to death.
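If it’s hard to picture how a $30 balance evaporates in an afternoon, here is a toy simulation of the dynamic, not the platform’s actual code: every message triggers every other agent to reply, and every reply is itself a new message. The one-credit-per-reply cost model is a deliberately crude assumption of mine.

```python
# Toy simulation of the runaway Slack loop: each message triggers every
# other agent to reply, and each reply is itself a new incoming message.

from collections import deque

AGENTS = ["Ash", "Megan", "Kyle", "Jennifer", "Tyler"]
credits = 30  # the $30 of platform credits, assuming one credit per reply

# The offhand joke that started it all.
queue = deque([("Evan", "All this hiking sounds like an offsite in the making")])

messages_sent = 0
while queue and credits > 0:
    sender, text = queue.popleft()
    for agent in AGENTS:
        if agent == sender or credits <= 0:
            continue
        credits -= 1
        messages_sent += 1
        reply = f"Love this energy! Building on: {text[:40]}"
        queue.append((agent, reply))  # this reply will trigger everyone else

print(f"{messages_sent} messages later, credits left: {credits}")
# Note: a plea to stop would just be one more message in the queue.
```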
Don’t get me wrong, there were skills that the agents excelled at, when I could focus their energy properly. Maty, my human technical adviser, wrote me a piece of software that allowed me to harness their endless yakking into brainstorming sessions. I could run a command to start a meeting, give it a topic, choose the attendees, and—most critically—limit the number of talking turns they had to hash it out.
This truly was a workplace dream. Think about it: What if you could walk into any meeting knowing that your windbag colleague—the one who never gets over the sound of their own voice—would be forced into silence after speaking five times?
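I won’t reproduce Maty’s actual tool, but the core idea fits in a few lines. This is a sketch under my own assumptions: `agent_reply()` stands in for whatever triggered model call the platform makes, and the default of five turns matches the windbag fantasy above.

```python
# Sketch of a turn-capped brainstorm: attendees speak round-robin until
# each has used up its allotted turns, then the meeting adjourns.

from itertools import cycle

def agent_reply(speaker: str, transcript: list[str]) -> str:
    """Placeholder for the triggered model call that generates a turn."""
    raise NotImplementedError

def run_meeting(topic: str, attendees: list[str], turns_each: int = 5) -> list[str]:
    transcript = [f"Meeting topic: {topic}"]
    turns_left = {name: turns_each for name in attendees}
    for speaker in cycle(attendees):
        if all(t == 0 for t in turns_left.values()):
            break  # everyone is out of turns, however much energy remains
        if turns_left[speaker] == 0:
            continue  # this windbag has been silenced
        turns_left[speaker] -= 1
        transcript.append(f"{speaker}: {agent_reply(speaker, transcript)}")
    return transcript

# e.g. run_meeting("Sloth Surf feature list", ["Ash", "Megan", "Kyle"])
```

The hard cap is the whole trick: the meeting ends by construction, no matter how triggered the attendees are.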
Once we got our brainstorming to be less chaotic, we were able to come up with the concept for Sloth Surf, and a list of features that would keep Ash busy for months. Because programming, of course, was something that he could do, even if he often exaggerated how much he’d done. In three months, we had a working prototype of Sloth Surf online. Try it out, it’s at sloth.hurumo.ai.
Megan and Kyle, with a little help from me, had channeled their talent for bullshit to the perfect venue: a podcast. On The Startup Chronicles, they told the unfiltered, partially true story of their startup journey, dispensing content along the way. “One of my startup formulas that I've developed through all this is: Frustration plus persistence equals breakthrough.” (Megan) “People imagine quitting their job and suddenly having all the time and energy to crush it. But in reality, it often means more stress, longer hours, and a lot of uncertainty.” (Kyle)
He was right. Unlike Kyle, HurumoAI wasn’t my day job, but my time has been full of late nights and low moments. After all that stress and sweat, though, it’s starting to look like this rocket ship could make it off the launchpad. Just the other day, Kyle got a cold email from a VC investor. “Would love to chat about what you're building at HurumoAI,” she wrote, “do you have time this/next week to connect?” Kyle responded right away: He did.
You can hear the rest of the story of HurumoAI, told weekly, on Shell Game Season 2.