From the Desk of

Even Garry wants a faster horse

My inbox has been blowing up with friends sending me Garry Tan’s request for startups about a “Secure AI App Store”— I’ve spent the last few years building the future of macOS and talking all my friends’ ears off about it —and now they’re delighting in Garry joining the chat.

If you’re not familiar with Garry or his work, he’s the president of Y Combinator. If you’re not familiar with Y Combinator, it’s the gold standard for startup incubators. YC is trusted by investors, by founders, and by early employees; it’s more or less the bedrock of the entire startup ecosystem. It’s a vetted brand with a stamp of credibility that, if nothing else, is strong enough to get your parents off your back about not going to grad school after college so you can take a risk on a startup.

Its primary offerings are twofold. First, a trusted library of playbooks, strategies, and tactics for building a fast-growing startup, starting with “make something people want” and getting into the blocking and tackling of acquiring users, fundraising, recruiting, and managing early-stage chaos. YC startups tend to follow this blueprint pretty closely, because it sometimes works spectacularly well, and because startups are risk machines, so why take on additional strategy risk when there’s a tried-and-true playbook to run?

But second, and more importantly, beyond trusted playbooks YC is a merchant of trust itself. Founders pay dearly for Y Combinator’s money because its brand does a lot of work in the early life of a company, especially when that company’s software needs to touch user data or live somewhere deeply intimate, like inside your inbox, calendar, or financial accounts. Almost every YC company makes sure to flash that badge early and often. It’s a shortcut to building enough trust with early users to begin the process of brand and product adoption. In a world of App Stores and search results and data-permission prompts, that trust is precious —and it’s no surprise Garry would want to preserve the ecosystem where YC’s reputation gives founders that kind of head start.

So with all due respect to Garry, asking for an AI App Store is the 2025 version of asking Henry Ford for a faster horse.

If you didn’t read the linked Request for Startup, in a nutshell he’s asking for a new kind of AI-native App Store—one that gives users a lot more control over their computers. He outlines five main components: user-consented shared memory between apps, a secure permissions model that users actually understand, a marketplace for AI agents that act on your behalf, some way to prevent data leakage and bad behavior, and developer infrastructure to make it all viable. It’s a smart diagnosis of the problem: trust, privacy, and clarity are totally broken in the current model. His answer is to build better tooling to let users manage that complexity themselves—better knobs, clearer prompts, more control.

The shift that LLMs are kicking off is way bigger than anything that can be contained in an app store. These old metaphors—stores, catalogs, search results—start to look hilariously quaint when you compare them to the scale and dynamism of something like TikTok. App stores have hundreds of thousands of entries. TikTok has more uploads than that each hour. It's not just bigger, it's a different shape entirely—something built for asynchronous, continuously generated media that’s far, far more creative than the queries users can imagine punching into a search field.

Seeing the present can be hard for me! Seeing the future is hard for just about everyone all the time. Let’s talk about what Garry’s RFS misses.


Before we explore the future, let’s reimagine the past in an alternate history— like For All Mankind or The Man in the High Castle, but instead of the bad guys winning the space race or the world war, what if television had evolved differently?

So, okay, starting point: photographs, movies, and TV were never invented.

In 1823 a mysterious, magical ore was discovered: Visionium. You can split it into two halves and whatever is seen through one side is visible from the other. No delay, no intermediaries, it’s like a mirror meets glass, powered by quantum entanglement (or something).

At first, it feels like sorcery. You can instantly see your loved ones across the world and talk to them like they're right there with you (FaceTime but via literal magic glass). Before anything like television or films existed, entertainment started as bringing a chunk of Visionium to a play and pointing it at the stage. The paired piece of ore stays at home and your family can watch a performance without going to the theater. Eventually, that evolves into Visionium theaters and then studios, with full-on arrays of ore slices all pointed at the stage, broadcasting out to paired pieces around the world.

But remember, each pane is live-paired to the viewer, so the studios can peer into millions of viewers’ living rooms. That’s when businesses start paying attention, starting with grocery stores trying to analyze what kinds of popcorn viewers are eating during the show. Then furniture stores want to know whose sofa looks forlorn so they can send coupons for an upgrade. It gets creepy for a while, then the government steps in with a rule that stores can’t be in the studios anymore. So then they hire intermediaries to stand in for them and report back what they espy. Then come the legal disclosures: at the beginning of every show, a cue card that says something inane like “by watching this performance, you consent to being seen” and it's like... yeah, that's how the technology works, man... so what?

People are creative, clever, and rebellious; they come up with workarounds, rules, etiquette— maybe you’d wear a mask while viewing or hide your sofa under a blanket. Whole new layers of regulation spring up around who can look through what, and when...


OK, ha— slightly cute, slightly trite story, right? But it’s not just a weird little sci-fi alt-history detour: it’s no accident if it reads to you a lot like how the software world works today.

Since the beginning of the internet there's been an increasingly bizarre forced intimacy between software developers and users. The operating systems— including the web and native iOS, macOS, Windows, etc. —force developers to possess user data to make their programs; then, adding insult to injury, they force users to traverse a gauntlet of nagging prompts granting access to the systems their tools obviously need to deliver their utility. It’s like you as a developer have to be in the room with someone just to make some cool, helpful code.

But Apple doesn't have to force developers to possess user data, it doesn't have to be this way!

Imagine building a maps of smiles app— like, it just tracks places that made you smile. On iOS, Apple forces developers to contemplate the user's location as globally identifiable latitude-longitude coordinate pairs. If you want to make a plus button that stores locations and a map that shows stored locations, you have to know the lat/longs along the way, shuttling them from the Location Services API, to a datastore or whatever, just to pass them back into the system Maps View API. Why? What if the system gave me an opaque token— $LocationABC123 —that meant nothing to me, but still let me show stuff on a map or generate location-aware insights? Then I wouldn’t need to nag the user for location permission because I wouldn’t be handling their identifiable data —I get to live the dream of having no idea where you are!
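To make the contrast concrete, here is a toy Python sketch of that hypothetical opaque-token model. Everything in it is invented for illustration (the token format, the `request_location` and `show_on_map` calls); no such API exists on iOS or anywhere else. The point is just that the app stores and passes around tokens while only the system ever holds real coordinates.

```python
# Toy sketch of the hypothetical opaque-token model described above.
# All names here (OpaqueLocation-style tokens, request_location,
# show_on_map) are invented for illustration, not real iOS APIs.
import uuid

class _System:
    """Stands in for the OS: it alone maps tokens to real coordinates."""
    def __init__(self):
        self._private = {}  # token -> (lat, lon); never exposed to the app

    def request_location(self):
        # The OS knows where the user is; the app only receives a token.
        token = f"$Location{uuid.uuid4().hex[:6].upper()}"
        self._private[token] = (37.7749, -122.4194)  # pretend GPS fix
        return token

    def show_on_map(self, tokens):
        # Rendering happens inside the system's map view; the app just
        # hands tokens back. Returns whether each token resolved.
        return [t in self._private for t in tokens]

system = _System()

# The "maps of smiles" app: it stores tokens, never sees a lat/long.
smiles = []
smiles.append(system.request_location())  # user taps the plus button
rendered = system.show_on_map(smiles)     # map view resolves tokens itself
print(smiles[0], rendered)
```

The design choice being illustrated: because the app's datastore only ever contains opaque strings, there is nothing identifiable to leak, and so nothing to prompt the user about.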

And it’s not just the internal minutiae of each app, it’s the whole computing architecture of having to pre-decide which apps are installed. Like, every user is supposed to be some clairvoyant savant who knows what future tasks they'll want to do, and also has photographic memory of which apps they already have and what they’re good for. It’s absurd.

And because basically everything cool that gets made in modern times— whether it’s a new Charli XCX album or a James Patterson novel or a James Cameron movie —is getting made somewhere by someone using a computer, that absurdity compounds into a mass of friction that slows down all of human progress.

Software is a cursed medium because Apple et al. force developers to be intimately aware of user data. We don't need newer, better rules that perpetuate the cat-and-mouse that played out in the hypothetical about Visionium.

What we really need is a new kind of runtime that resolves the original sin of developers being forced to touch user data.

Apple, for example, could stop wasting billions of people's precious time with zillions of nagging prompts if its leadership pushed forward a pro-user-privacy vision for the future of computing, underpinning an operating system that's simply and absolutely secure, instead of endlessly abdicating responsibility for pushing the platform forward in favor of another OS "upgrade" with yet more unanswerable permission prompts.


It's not crazy, it's not even that imaginative or visionary. Books, photographs, videos, all already work like this— inherently asynchronous and intermediated, having matured before constant connectivity via the internet was a thing. As a result, our expectations around consuming those media forms are deeply shaped by that asynchronicity —and so are the tools for creating and distributing them.

And that shows (and it matters a lot!) in the many, many orders of magnitude more people able to produce and consume those kinds of media compared to the few who actively participate in software.

I kinda think we’ve mostly assumed software is smaller because it’s harder to make, like I think almost everyone thinks writing a program is harder than writing a Tweet? But now that LLMs make it instantaneous to generate software, we’re all about to learn the deeper truth: the problem wasn’t making the software, it was distributing it.

Every business school graduate working in tech can tell you new apps have to be "10X better" —not because it's a law of physics but because this absurd, baked-in intimacy means users have to exercise constant hypervigilance about privacy, trust, and the intentions of whoever made the tools they might use.

We should fix it, and in service of unlocking the world-changing potential of LLMs, we must fix it. We need a new runtime that’s totally opaque to developers so that users can step out of the loop of app stores and Google search results.

And when we do that, we can bring the algorithmic distribution layer that makes other media hyper-liquid— like TikTok or YouTube —to software itself. That’s when everything changes.

We're basically at the end of the epoch when software ate the world— we've unlocked vast advances in living standards for everyone by transforming analog processes into software accelerated systems. Now imagine what happens when software gets to eat itself. Fixing the intimacy problem at the core of how apps are made and distributed finally unlocks accelerating software with software.

And once that original sin is gone, we’ll unlock something wild. Not just more software, but new types of things that people can make. We’ll stop asking “do I trust this developer with my data?” and start delighting in “what weird and beautiful thing arrives next?”


I don't know Garry but I would guess that, like myself, he would describe himself as being involved in a lifelong love affair with computers. 

When software becomes as freely and safely distributed as other forms of media, we’re not just scaling up participation— we’re expanding the entire space of possibility. We’ll get a full-blown kaleidoscopic renaissance of new forms, new creators, new utilities we can't imagine yet. And as a lover of computers, I cannot wait to live in that future.