Blog · March 25, 2026 · 9 min read

Screenshots.live Team
From Xcode Simulator to App Store: A Developer's Screenshot Workflow

Walk through the full manual Xcode screenshot workflow, identify the bottlenecks, and replace each step with an automated pipeline using Screenshots.live templates, API rendering, and Fastlane uploads.

The Screenshot Problem Every iOS Developer Knows

You have spent weeks perfecting your app. The UI is polished, the animations are smooth, and the beta testers are happy. Now comes the part nobody warns you about in tutorials: creating App Store screenshots for every device, every localization, and every screen size Apple demands.

If you have shipped even one app, you know the drill. Open the Xcode Simulator, navigate to the right screen, press Cmd+S, repeat for six different device sizes, then open Figma or Photoshop to add text overlays and backgrounds. Multiply that by the number of localized languages you support, and you are staring at hours of repetitive manual work for something that is not even code.

This article walks through the traditional manual workflow step by step, identifies where the bottlenecks live, and then systematically replaces each painful step with automation. By the end, you will have a fully automated pipeline that goes from template design to App Store Connect upload without manual intervention.

The Manual Workflow: Step by Step

Step 1: Capturing Screenshots in the Simulator

The journey starts in Xcode. You boot up the simulator for each required device: iPhone 6.7", iPhone 6.5", iPhone 5.5", iPad Pro 12.9", and iPad Pro 11". For each simulator, you navigate your app to the exact screen you want to capture, make sure the data looks right, and press Cmd+S or use the screenshot button.

Sounds simple enough, but the problems start immediately. Your app might show different content depending on screen size, so you need to verify each capture. Simulators are slow to boot. If your app requires authentication, you are logging in on every device. One wrong tap and you have to start over.

For a typical app with 6 screenshots across 5 device sizes, that is 30 individual captures. Add 3 localizations and you are at 90 screenshots before you have even opened a design tool.

Step 2: Resizing and Formatting

Apple has strict size requirements for App Store screenshots. Each device class expects exact pixel dimensions. iPhone 6.7" needs 1290x2796, iPhone 6.5" needs 1242x2688, and so on. If your simulator output does not match exactly, App Store Connect will reject the upload.

So you open each screenshot and verify the dimensions. Maybe your simulator was running at a different scale factor. Maybe you accidentally captured the status bar in a way that adds extra pixels. Each mismatch means manual cropping or resizing, and any scaling introduces quality loss.

Step 3: Adding Text Overlays and Backgrounds

Raw simulator screenshots are not enough for a competitive listing. The top apps in every category use carefully designed screenshots with text captions, branded backgrounds, device frames, and visual hierarchy that tells a story as users scroll through the listing.

This is where most developers either open Figma, Sketch, or Photoshop, or they pay a designer. You create a template with your brand colors, add a headline like "Track your habits effortlessly," position the device mockup, and export. Then you do it again for the next screenshot. And again for each device size, adjusting text size and layout to fit each aspect ratio.

This step alone can take an entire day. And every time you update your app's UI, you have to redo it.

Step 4: Exporting for Every Device

With your designs ready, you export each one at the correct resolution. Figma makes this somewhat easier with export presets, but you still need to manage naming conventions so you know which file goes where in App Store Connect. A naming scheme like en-US_iPhone67_01_habits.png keeps things organized, but maintaining that scheme manually is error-prone.
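One way to make the scheme less error-prone is to generate the names instead of typing them. This hypothetical helper produces the en-US_iPhone67_01_habits.png pattern described above; the field order is an illustration, not a requirement of any tool.

```javascript
// Build a screenshot filename from its parts, so the naming
// convention lives in one function instead of in your head.
function screenshotName(locale, device, index, slug) {
  const ordinal = String(index).padStart(2, "0"); // 1 -> "01"
  return `${locale}_${device}_${ordinal}_${slug}.png`;
}
```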

Step 5: Uploading to App Store Connect

Finally, you open App Store Connect, navigate to your app version, and start dragging screenshots into the right slots. Each device size has its own section. Each localization has its own tab. You drag, wait for the upload, verify the preview, and move on.

If you spot an error, a typo in your text overlay for example, you go back to Figma, fix it, re-export, and re-upload. The feedback loop is painfully slow.

Identifying the Bottlenecks

Looking at the manual workflow, the pain points cluster around three themes:

  • Repetition across devices. Every action is multiplied by the number of device sizes. Apple currently supports up to 6 different screenshot size classes.
  • Repetition across languages. If you localize your screenshots, every text overlay change means re-exporting for every language.
  • No single source of truth. Your Figma file, your simulator captures, and your App Store Connect uploads are disconnected. Changing one does not automatically update the others.

The ideal workflow would let you define a screenshot design once, render it for every device and language automatically, and push the results straight to App Store Connect. That is exactly what we are going to build.

The Automated Pipeline

Step 1: Design Once in Screenshots.live

Instead of designing in Figma and manually combining device mockups with text overlays, you create a template in the Screenshots.live visual editor. The editor works directly in your browser and is purpose-built for app store screenshots.

Each template supports dynamic text layers and image layers. You define placeholders for your headline text, your app screenshot, and your background. The editor shows you a real-time preview at the exact dimensions Apple requires.

The key advantage: one template works for multiple device sizes. Screenshots.live handles the device frame rendering, so you design the composition once and the system adapts it for iPhone 6.7", 6.5", iPad, and any other size you need.

Step 2: Define Your Content as Data

Instead of manually typing text overlays into a design file, you define your screenshot content as structured data. Here is a simple example:

const screenshots = [
  {
    templateId: "habit-tracker-main",
    layers: {
      headline: "Track your habits effortlessly",
      appScreenshot: "https://your-cdn.com/screens/home.png"
    }
  },
  {
    templateId: "habit-tracker-stats",
    layers: {
      headline: "See your progress at a glance",
      appScreenshot: "https://your-cdn.com/screens/stats.png"
    }
  }
];

For localization, you simply swap the text values:

const localizedContent = {
  "en": { headline: "Track your habits effortlessly" },
  "de": { headline: "Verfolge deine Gewohnheiten mühelos" },
  "es": { headline: "Rastrea tus hábitos sin esfuerzo" }
};

Your screenshots are now data-driven. Change a headline in one place, and every device size and language picks it up automatically.
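To see how the pieces combine, here is one way to expand that data into concrete render jobs: every (screenshot, locale, device size) combination becomes one entry, ready to send to the render API. The device list and the merge logic are illustrative assumptions, not a fixed Screenshots.live schema.

```javascript
// Example device sizes (extend with every class you ship).
const deviceSizes = [
  { name: "iPhone67", width: 1290, height: 2796 },
  { name: "iPhone65", width: 1242, height: 2688 },
];

// Expand screenshots x locales x devices into a flat job list.
// Locale-specific layer values override the template defaults.
function buildJobs(screenshots, localizedContent, devices) {
  const jobs = [];
  for (const shot of screenshots) {
    for (const [locale, overrides] of Object.entries(localizedContent)) {
      for (const device of devices) {
        jobs.push({
          templateId: shot.templateId,
          locale,
          device: device.name,
          width: device.width,
          height: device.height,
          layers: { ...shot.layers, ...overrides },
        });
      }
    }
  }
  return jobs;
}
```

With 2 screenshots, 3 locales, and 5 device sizes, this yields 30 jobs from a handful of lines of data.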

Step 3: Render via the API

Screenshots.live provides a REST API that renders your templates to pixel-perfect images. A single API call produces a finished screenshot at the exact resolution App Store Connect requires:

curl -X POST https://api.screenshots.live/v1/render \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "templateId": "habit-tracker-main",
    "format": "png",
    "width": 1290,
    "height": 2796,
    "layers": {
      "headline": "Track your habits effortlessly",
      "appScreenshot": "https://your-cdn.com/screens/home.png"
    }
  }'

The response gives you a URL to the rendered image, ready for download or direct upload. No Figma, no Photoshop, no manual resizing.

To render all your screenshots for all devices and languages, you write a simple script that loops through your content data and calls the API for each combination. What used to take hours now runs in minutes.

Step 4: Automate the Upload with Fastlane

Fastlane's deliver tool handles App Store Connect uploads from the command line. Combined with Screenshots.live's API, you can build a fully automated pipeline:

# render_and_upload.sh

# 1. Render all screenshots via Screenshots.live API
node render-screenshots.js --output ./fastlane/screenshots

# 2. Upload to App Store Connect via Fastlane
fastlane deliver \
  --skip_binary_upload true \
  --skip_metadata true \
  --overwrite_screenshots true \
  --screenshots_path ./fastlane/screenshots

Your render script downloads the API output into Fastlane's expected directory structure:

fastlane/screenshots/
  en-US/
    iPhone 67 inch-01_habits.png
    iPhone 67 inch-02_stats.png
    iPad Pro 129 inch-01_habits.png
  de-DE/
    iPhone 67 inch-01_habits.png
    ...

Fastlane picks up the files, matches them to the correct device slots in App Store Connect, and uploads everything in one run.
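The directory layout above can also be generated rather than hand-maintained. This hypothetical helper maps a rendered job to the structure deliver expects; note that the "iPhone 67 inch" prefix is what Fastlane uses to match files to device slots, so verify the exact device names against your Fastlane version's documentation.

```javascript
// Map (locale, device, index, slug) to Fastlane's screenshot tree,
// e.g. fastlane/screenshots/en-US/iPhone 67 inch-01_habits.png
function fastlanePath(root, locale, deviceName, index, slug) {
  const ordinal = String(index).padStart(2, "0");
  return `${root}/${locale}/${deviceName}-${ordinal}_${slug}.png`;
}
```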

Step 5: Integrate into CI/CD

The final step is running this pipeline automatically. Add it to your GitHub Actions workflow or your CI system of choice:

name: Update App Store Screenshots
on:
  workflow_dispatch:
  push:
    paths:
      - 'screenshot-content/**'

jobs:
  screenshots:
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v4
      - name: Render screenshots
        run: node render-screenshots.js --output ./fastlane/screenshots
        env:
          SCREENSHOTS_API_KEY: ${{ secrets.SCREENSHOTS_API_KEY }}
      - name: Upload to App Store Connect
        run: fastlane deliver --skip_binary_upload true --skip_metadata true --overwrite_screenshots true
        env:
          FASTLANE_USER: ${{ secrets.FASTLANE_USER }}
          FASTLANE_PASSWORD: ${{ secrets.FASTLANE_PASSWORD }}

Now, whenever you update your screenshot content, whether it is a new headline, a new localization, or a fresh app UI capture, the pipeline renders and uploads everything automatically.

What About Android?

If you also ship on Google Play, the manual workflow doubles. Different screenshot sizes, different aspect ratios, different upload interfaces. Screenshots.live supports Android device frames natively, and the template porting feature lets you auto-convert your iOS templates to Android dimensions. Your same content data and the same API calls produce Google Play-ready screenshots alongside your App Store assets.

Fastlane's supply tool handles Google Play uploads the same way deliver handles App Store Connect, so the pipeline extends naturally.

Time Comparison

Here is a realistic comparison for an app with 6 screenshots, 5 device sizes, and 3 languages (90 total images):

Step | Manual | Automated
Capture in simulator | 2 hours | 0 (use existing UI screenshots)
Design in Figma | 4 hours | 1 hour (one-time template setup)
Export all variations | 1.5 hours | 3 minutes (API rendering)
Upload to App Store Connect | 1 hour | 2 minutes (Fastlane)
Total | 8.5 hours | 1 hour + 5 min per update

The one-time setup takes about an hour. Every subsequent update, whether it is a text change, a new localization, or a fresh app UI, takes about 5 minutes. Over the lifetime of an actively maintained app, that adds up to weeks of saved time.

Getting Started

If you are an iOS developer tired of the screenshot grind, here is the path forward:

  1. Sign up at Screenshots.live and create your first template in the visual editor.
  2. Set up your content data as a simple JSON or JavaScript file with your headlines and app screen URLs for each language.
  3. Write a small render script that calls the Screenshots.live API for each content variation and saves the output in Fastlane's directory structure.
  4. Run Fastlane deliver to upload everything to App Store Connect in one command.
  5. Add it to CI so updates happen automatically whenever your content changes.

The entire setup takes about an hour, and it pays for itself the very first time you need to update a screenshot across multiple devices and languages. Your future self, shipping the next update at midnight before a deadline, will thank you.
