Inside RapidNative's Export Pipeline: From AI-Generated Code to App Store


By Rishav

27th Apr 2026

Last updated: 27th Apr 2026


Most demos of AI app builders end with a glossy preview. The cursor blinks, the simulator rotates, the founder smiles, the video cuts. What you almost never see is the part that actually matters: the moment that AI-generated code stops being a sandboxed toy and becomes a real app sitting in TestFlight, awaiting Apple's review queue.

That last mile is where most "vibe coding" tools quietly fall apart. The output is real React Native, but the bundle is gone the moment you close the tab. Or the export is a ZIP of half-resolved imports that no human can run. Or the project ships, but it depends on a private bundler nobody else can reproduce.

RapidNative's export pipeline was built specifically to close that gap. This post walks through the full architecture — the Postgres file store, the in-browser bundler running in a Web Worker, the on-demand ZIP route, the GitHub template handoff, and the bridge to Expo Application Services — so you can see exactly how an AI-generated screen becomes a real, deployable React Native app.

Mobile app developer reviewing code on a laptop. Photo by Igor Miske on Unsplash.

The Real Problem: AI App Builders Have a Last-Mile Problem

There are roughly three things any AI app builder has to do well: generate code that compiles, render that code somewhere a user can poke at it, and hand the user something they can ship. The first two are well-trodden territory. The third is where the architecture choices get interesting.

The naive approach is to treat "export" as an afterthought — wait until a user clicks Download, then frantically materialize files from a runtime store. That works for toy projects. It falls apart the moment you have a 200-file Expo Router app with images stored across multiple buckets, environment variables that need to be production-aware, and dependencies that have to resolve in a way the user can npm install against on a real machine.

RapidNative treats export as a first-class architectural concern. The system is built so that at any moment, for any project, a fully-runnable Expo project can be reconstructed on demand — not because we cached it, but because the source of truth is structured to make reconstruction cheap.

A Three-Stage Architecture

Before diving in, here's the mental model. The product has three decoupled stages, each owned by a different subsystem:

| Stage | Owner | Where it runs |
| --- | --- | --- |
| Generation | Two-step AI pipeline (/api/user/ai/generate-v2) | Server-side, streamed via SSE |
| Preview | Containerized Metro bundler | External workspace server, Docker |
| Export | Browser bundler + on-demand ZIP route | Client-side bundling, server-side packaging |

Decoupling these matters. Generation can fail without taking down preview. Preview can be down without breaking export. The shared contract between them is a single Supabase table — files — which stores every artifact the AI ever produces.

This post focuses on the third stage, but you can't talk about export without first understanding where the bytes live.

Stage 1: Files Live in Postgres, Not Object Storage

Here's the first non-obvious decision: project source files don't live in a blob store. They live in Postgres.

The files table looks roughly like this:

| Column | Type | Purpose |
| --- | --- | --- |
| id | UUID | Primary key |
| project_id | UUID | Foreign key into projects |
| file_path | VARCHAR(500) | e.g., app/(tabs)/index.tsx |
| content | TEXT | The actual source code |
| mime_type | VARCHAR(100) | application/typescript, etc. |
| file_type | ENUM | tsx, jsx, json, markdown, etc. |
| encoding | ENUM | utf-8, base64, binary |
| is_external | BOOLEAN | Marks binary assets stored in Supabase Storage |

Indexes on project_id, file_type, and file_path keep lookups O(log n). A unique constraint on (project_id, file_path) guarantees no two files in the same project can collide.
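In application code, a row from this table might be typed like so. This is a sketch: the interface and the helper below are illustrative, not RapidNative's actual data-access layer.

```typescript
// Shape of one `files` row, mirroring the columns above.
type FileEncoding = "utf-8" | "base64" | "binary";

interface ProjectFile {
  id: string;                 // UUID primary key
  projectId: string;          // foreign key into projects
  filePath: string;           // e.g. "app/(tabs)/index.tsx"
  content: string;            // source text (binaries live in Storage)
  mimeType: string;           // "application/typescript", etc.
  fileType: "tsx" | "jsx" | "json" | "markdown" | string;
  encoding: FileEncoding;
  isExternal: boolean;        // true => bytes live in Supabase Storage
}

// In-memory mirror of the DB's unique (project_id, file_path) constraint:
// collapse rows into a path-keyed map, rejecting collisions loudly.
function indexByPath(files: ProjectFile[]): Map<string, ProjectFile> {
  const byPath = new Map<string, ProjectFile>();
  for (const f of files) {
    if (byPath.has(f.filePath)) {
      throw new Error(`duplicate file_path in project: ${f.filePath}`);
    }
    byPath.set(f.filePath, f);
  }
  return byPath;
}
```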

Why Postgres instead of S3-style blob storage? Three reasons that turned out to matter.

Transactional consistency. When the AI generates 40 files in a single response, they need to land atomically. Either every file makes it in or none does. Postgres transactions give us that for free; reconciling object-store writes with metadata writes is a known footgun.

Cheap diffs. Every modification produces an updated_at. Computing what changed since the user last hit "preview" is a single indexed query, not a paginated bucket listing.

On-demand reconstruction. Because every file is a row, we never have to maintain a "current export." Need a ZIP? Scan the rows. Need to recompute a bundle? Scan the rows. There is no cache to invalidate, because there is no cache.

For binary files (images, fonts, anything heavy), we use a hybrid: the row in files carries metadata and an is_external: true flag, while the actual bytes sit in a Supabase Storage bucket at projects/{projectId}/fs/{filePath}. The export pipeline knows to reach into Storage for those.
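That bucket layout is simple enough to capture in a one-line helper. This is hypothetical; the pipeline's real resolver isn't shown in this post.

```typescript
// Build the Storage object key for an external asset, following the
// projects/{projectId}/fs/{filePath} layout described above.
function storageKeyFor(projectId: string, filePath: string): string {
  // Strip any leading slashes so the key stays relative inside the bucket.
  const relative = filePath.replace(/^\/+/, "");
  return `projects/${projectId}/fs/${relative}`;
}
```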

Database server racks in a data center. Photo by Taylor Vick on Unsplash.

Stage 2: The Browser-Based Bundler

Here's where things get unusual. RapidNative does not run a Metro bundler on the server to produce preview output. We run a bundler in the browser, inside a Web Worker, on the user's own machine.

The bundler lives at src/modules/file/bundler.worker.ts, with the orchestration layer in src/modules/file/almostmetro-bridge.ts. Under the hood it's browser-metro (an open-source-friendly fork called almostmetro) configured with a transformer chain that mirrors what a real Metro setup would do:

  1. TypeScript stripping via Sucrase — fast, no type-checking, just remove the types.
  2. JSX transform to plain React.createElement calls.
  3. React Refresh instrumentation so hot module replacement works.
  4. Path resolution with a custom plugin that handles RapidNative's project layout.
  5. Nativewind shim that rewrites Tailwind className strings into react-native-web style objects at runtime.
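The chain above can be sketched as a list of pure code-to-code functions. The two steps below are trivial placeholders standing in for Sucrase and the JSX transform, not the real plugins.

```typescript
// A transformer takes source text (plus its path) and returns new source.
type Transformer = (code: string, filePath: string) => string;

const chain: Transformer[] = [
  // 1. Placeholder "type stripping": drop one trivial annotation pattern.
  (code) => code.replace(/: string\b/g, ""),
  // 2. Placeholder "JSX transform": just mark the module as transformed.
  (code) => `/* transformed */\n${code}`,
];

// Run every step in order, threading the output of one into the next.
function runChain(code: string, filePath: string, steps: Transformer[]): string {
  return steps.reduce((acc, step) => step(acc, filePath), code);
}
```

The real pipeline works the same way in spirit: each stage is independent, so a step like the Nativewind shim can be added or swapped without touching the others.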

The communication protocol between the main thread and the worker is a small message bus:

Main → Worker:  { type: 'watch-start', files, packageServerUrl }
Main → Worker:  { type: 'watch-update', changes }
Worker → Main:  { type: 'watch-ready', code, stubbedFiles? }
Worker → Main:  { type: 'hmr-update', update, bundle }
Worker → Main:  { type: 'error', message }
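Written out as TypeScript discriminated unions, the protocol looks roughly like this. The message names come from the sketch above; the payload field types are assumptions.

```typescript
// Messages the main thread sends into the worker.
type MainToWorker =
  | { type: "watch-start"; files: Record<string, string>; packageServerUrl: string }
  | { type: "watch-update"; changes: Record<string, string> };

// Messages the worker sends back.
type WorkerToMain =
  | { type: "watch-ready"; code: string; stubbedFiles?: string[] }
  | { type: "hmr-update"; update: string; bundle: string }
  | { type: "error"; message: string };

// A switch on `type` narrows the payload automatically — the standard
// pattern for handling worker messages safely.
function describe(msg: WorkerToMain): string {
  switch (msg.type) {
    case "watch-ready":
      return `bundle ready (${msg.stubbedFiles?.length ?? 0} stubbed)`;
    case "hmr-update":
      return "hot update";
    case "error":
      return `bundler error: ${msg.message}`;
  }
}
```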

Why bundle on the client? Three answers, in order of importance.

Cost. A serverless bundler that rebuilds on every keystroke gets expensive fast. Pushing the work to the user's CPU is essentially free.

Latency. Round-tripping every file change to a server adds 100–300ms even on a fast network. In-process bundling in a Worker is single-digit milliseconds for incremental updates.

Resilience. When the AI emits broken JSX — and it does, sometimes — the bundler's acorn-based code-frame generator stubs the offending file with a visible BrokenComponentStub instead of nuking the whole bundle. Other screens keep working, and the user can keep editing while the one broken component gets fixed.
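That stub-on-failure behavior can be sketched like this. Only the BrokenComponentStub name comes from the pipeline itself; the transform hook, the @rapidnative/runtime import path, and the generated stub source are hypothetical.

```typescript
// Try to transform a file; if the transform throws, emit a module that
// renders a visible stub instead of failing the whole bundle.
function transformOrStub(
  filePath: string,
  source: string,
  transform: (src: string) => string
): { code: string; stubbed: boolean } {
  try {
    return { code: transform(source), stubbed: false };
  } catch (err) {
    const message = err instanceof Error ? err.message : String(err);
    // Hypothetical stub module: surfaces the file and error on screen
    // while the rest of the app keeps running.
    const code =
      `import { BrokenComponentStub } from "@rapidnative/runtime"; // hypothetical path\n` +
      `export default () => BrokenComponentStub(${JSON.stringify({ filePath, message })});`;
    return { code, stubbed: true };
  }
}
```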

This is the same bundler that powers exports. When a user requests a downloadable build, the system already has a battle-tested transform pipeline; it just runs it once more, against the latest snapshot in files, and produces a clean output.

Stage 3: Packaging for Production

The actual export endpoint lives at POST /api/user/projects/[projectId]/download. This is where AI-generated code becomes a project you can unzip, npm install, and run — the same shape as one scaffolded by npx create-expo-app.

The route does roughly the following:

  1. Authorize. Verify the requester owns the project (or is an admin).
  2. Gate by plan. Free tier users get a soft block; Pro and above proceed. This is the only place the export pipeline cares about subscription state.
  3. Open an adm-zip archive in memory.
  4. Iterate the file list from the request body (the client already has the working set in memory; the server can also rehydrate from the files table when needed).
  5. For each text file: apply small migrations — for example, normalizing legacy LinearGradient imports — and write to the ZIP as UTF-8.
  6. For each external asset: stream-fetch from projects/{projectId}/fs/{filePath} in Supabase Storage and write the binary buffer to the ZIP.
  7. Generate a .env file populated with the project's production environment variables, filtering out anything marked secret.
  8. Stream the ZIP back with Content-Disposition: attachment; filename=<projectName>-expo.zip.
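To make step 5 concrete, here is a sketch of one such migration: rewriting the legacy LinearGradient import from the old expo package to expo-linear-gradient. The regex is simplified, not the production code.

```typescript
// Rewrite `import { LinearGradient } from 'expo'` (legacy) to the
// current expo-linear-gradient package. Already-migrated imports are
// left untouched, so the rewrite is safe to run repeatedly.
function normalizeLinearGradientImport(source: string): string {
  return source.replace(
    /import\s*\{\s*LinearGradient\s*\}\s*from\s*['"]expo['"]/g,
    `import { LinearGradient } from 'expo-linear-gradient'`
  );
}
```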

A few details are worth lingering on.

There is no cached export. Every download re-bundles from current state. This guarantees the ZIP is always in sync with what the user just edited. The cost is a few seconds of compute per download; the win is zero stale-export bug reports.

Environment separation matters. The runtime preview operates in a designer environment with its own variable set. Export switches to production. Variables flagged as secret never leave the server boundary. Without this split, half the AI-generated apps would ship with development-only API keys baked in.
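The secret-filtering step can be sketched in a few lines. The EnvVar shape here is an assumption, not RapidNative's actual schema.

```typescript
// One environment variable as the export route might see it.
interface EnvVar {
  key: string;
  value: string;
  isSecret: boolean;
}

// Render the production .env contents: drop anything flagged secret,
// serialize the rest as KEY=value lines.
function renderDotEnv(vars: EnvVar[]): string {
  return vars
    .filter((v) => !v.isSecret)          // secrets never cross the boundary
    .map((v) => `${v.key}=${v.value}`)
    .join("\n");
}
```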

The dependency tree is fixed and tested. The exported package.json pins versions that we test daily: expo@54.0.13, expo-router@6.0.12, react@19.1.0, react-native@0.81.4, react-native-web@0.21.1, nativewind@4.2.1, @gluestack-ui/core@3.0.10. This is not a "latest-and-greatest" exercise — it's a known-good combination that compiles on a stock Mac with a stock Xcode install.

Mobile phones on a desk showing app interfaces. Photo by Christopher Gower on Unsplash.

The GitHub Path: A Real Repo, Not a Tarball Push

For users who prefer Git over a ZIP, RapidNative exposes a second route: POST /api/user/projects/[projectId]/init-git.

The implementation is deliberately simple. Instead of running git init, building a tree, generating commits, and pushing — which would require either a server-side Git binary or a heavy library like isomorphic-git — we use GitHub's template repository generation API. RapidNative maintains a public template called rapidnative/expo-template that contains the canonical scaffold of an exportable Expo project: app.json, tsconfig.json, eas.json, the right .gitignore, and so on.

The flow:

  1. POST to https://api.github.com/repos/rapidnative/expo-template/generate with a Bearer token from GITHUB_TOKEN_FOR_PROJECTS.
  2. Receive html_url, clone_url, ssh_url for the new repo.
  3. Return them to the client, where the user can git clone and then either push the export ZIP's contents on top, or use the GitHub web editor.
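Sketched as a request builder, so the shape is testable without touching the network, the call looks like this. The endpoint and headers follow GitHub's "create repository from template" API; the target owner and repo name are the caller's choice.

```typescript
// Build the fetch arguments for GitHub's template-generation endpoint.
// The token would come from GITHUB_TOKEN_FOR_PROJECTS in practice.
function buildGenerateRequest(token: string, owner: string, name: string) {
  return {
    url: "https://api.github.com/repos/rapidnative/expo-template/generate",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        Accept: "application/vnd.github+json",
        "Content-Type": "application/json",
      },
      // GitHub creates the new repo under `owner` with this name.
      body: JSON.stringify({ owner, name, private: true }),
    },
  };
}
```

Usage would be a single `const { url, init } = buildGenerateRequest(...)` followed by `await fetch(url, init)`; the JSON response carries html_url, clone_url, and ssh_url.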

This is intentional minimalism. We outsource Git mechanics to GitHub, get a real repository with real history, and avoid maintaining server-side Git infrastructure. The trade-off is that the user does one extra step — but they end up with a normal repo that any CI system already understands.

From ZIP to App Store: The EAS Build Bridge

Once the user has the export — either as a ZIP or as a freshly-minted GitHub repo — they enter the standard Expo Application Services flow. RapidNative does not currently run EAS Build on the user's behalf. Here's why that's a deliberate choice today, and what's likely to change.

The Expo deployment path is well-documented and works:

npx expo install expo-dev-client
eas build --platform ios
eas build --platform android
eas submit -p ios
eas submit -p android

eas build provisions a clean macOS or Linux VM, installs dependencies, runs the React Native build, signs the binary with your Apple Developer or Google Play credentials, and uploads the resulting .ipa or .aab to a download URL. eas submit then uploads that artifact to App Store Connect or Google Play Console — TestFlight from there is automatic for iOS.

The reason RapidNative doesn't wrap this today is twofold. First, code-signing requires the user's Apple Developer credentials and provisioning profiles, which we don't want to custody on someone else's behalf. Second, EAS already does this job better than we could replicate; building a thin proxy adds latency without adding value.

What we do guarantee is that the exported project is EAS-ready out of the box. The app.json has a valid bundle identifier slot, the eas.json defaults to a working production build profile, and the dependency versions are known to compile under the current Expo SDK. Users who paste their export into a fresh directory and run eas build get a real binary on the first try, not on the fifth.
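A minimal eas.json of the shape described might look like the following — illustrative defaults, not RapidNative's exact template.

```json
{
  "cli": { "version": ">= 5.0.0" },
  "build": {
    "production": {
      "autoIncrement": true
    }
  },
  "submit": {
    "production": {}
  }
}
```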

This is the bridge most "AI app builder" demos skip. Production deployment to the App Store or Google Play has its own learning curve — provisioning profiles, certificate management, app review responses — and an exported codebase is the prerequisite, not the finish line. RapidNative's job is to hand you a codebase that compiles cleanly under EAS Build; the rest is the standard React Native publishing flow that's been battle-tested by thousands of teams.

What "Production-Ready" Actually Means

It's worth being precise about the word "production-ready," because it's the word that gets misused most often in this space.

When RapidNative's export pipeline calls a project production-ready, it specifically means:

  • The dependency tree resolves cleanly. No phantom imports, no peer-dependency mismatches, no expo-doctor warnings on the canonical template.
  • The Expo SDK and React Native versions are paired correctly. Expo SDK 54 expects React Native 0.81.x — get that wrong by one minor version and pod install fails on iOS.
  • The Expo Router file structure is valid. Every file in app/ resolves to a route, every layout file matches its directory, no orphaned _layout.tsx files.
  • All assets are physically present. No require('./missing.png') calls left after an AI rewrite.
  • The TypeScript config is real. tsconfig.json extends expo/tsconfig.base with proper path aliases, so @/components/Button actually resolves.
  • The build runs on EAS. The eas.json production profile is configured for both iOS and Android with sane defaults.

The pipeline doesn't claim that every screen is bug-free — that's a product-quality question, not an architecture one. What it claims is that the export is the same shape as what an experienced React Native developer would commit on day one of a new project. From there, normal mobile development workflows take over.

What's Next: Async Builds and Deeper Store Integration

Looking at the current pipeline, the obvious next steps are visible.

The Inngest job system is already wired into the codebase for background work — currently used for email sequences and Slack notifications. The same machinery is a natural fit for async export jobs: queue an EAS Build trigger, poll for completion, push the resulting binary URL back to the user. For very large projects (think 500+ files), shifting ZIP generation off the request thread also fits cleanly into Inngest.
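The poll-for-completion piece of that flow is generic enough to sketch. The checkStatus callback is an assumption; in practice it would query EAS Build's status API, and the loop would live inside an Inngest step rather than a bare function.

```typescript
// Poll a status callback until the job leaves "pending" or attempts
// run out. Returns the terminal status.
async function pollUntilDone(
  checkStatus: () => Promise<"pending" | "done" | "failed">,
  opts: { intervalMs: number; maxAttempts: number }
): Promise<"done" | "failed"> {
  for (let attempt = 0; attempt < opts.maxAttempts; attempt++) {
    const status = await checkStatus();
    if (status !== "pending") return status;
    await new Promise((resolve) => setTimeout(resolve, opts.intervalMs));
  }
  return "failed"; // give up after maxAttempts
}
```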

A second direction is direct integration with App Store Connect's REST API and Google Play's Publishing API. With user-supplied credentials stored in a vault (not in our database), we could surface real-time submission status inside the editor — the user sees "Apple is reviewing your build" without leaving the project. This is the kind of feature that's only worth building once the export pipeline itself is rock-solid, which is why it sits behind the architecture work described above.

For now, the contract is clean: AI-generated code lives in Postgres, the in-browser bundler keeps it always-runnable, the export route packages it on demand, GitHub gives you a real repo, and EAS Build takes it from a codebase to a binary in the App Store. That's the full path from a chat prompt to a downloadable app — and every stage is independently observable, replaceable, and debuggable.

If you want to see this end-to-end yourself, the fastest way is to start a project on RapidNative, generate a few screens, and then click Download. You'll get the ZIP this post describes — open it, run npm install, then eas build, and you've gone from prompt to App Store binary in an afternoon.

For more on what powers the upstream stages, see how the two-step AI pipeline and browser bundler power instant React Native and the architecture behind real-time React Native preview. Pricing details and free-tier limits are on the pricing page.
