# snapshot

greenstevester/fastlane-skill

# Install this skill:
npx skills add greenstevester/fastlane-skill --skill "snapshot"

Installs a specific skill from a multi-skill repository.

# Description

Automate App Store screenshot capture across devices and languages

# SKILL.md


---
name: snapshot
description: Automate App Store screenshot capture across devices and languages
argument-hint: [--devices "iPhone 15 Pro"] [--languages "en-US,ja"]
allowed-tools: Bash, Read, Write, Edit
---


Automated App Store Screenshots

Set up Fastlane Snapshot to automatically capture App Store screenshots across multiple devices and languages.

Pre-flight Checks

  • Fastlane installed: !fastlane --version 2>/dev/null | grep "fastlane " | head -1 || echo "✗ Not installed - run: brew install fastlane"
  • Fastfile exists: !ls fastlane/Fastfile 2>/dev/null && echo "✓ Found" || echo "✗ Not found - run /setup-fastlane first"
  • Existing Snapfile: !ls fastlane/Snapfile 2>/dev/null && echo "✓ Already configured" || echo "○ Not configured yet"
  • UI Test target: !find . -maxdepth 3 -name "*UITests*" -type d 2>/dev/null | head -1 || echo "○ No UI test target found"
  • Simulators available: !xcrun simctl list devices available | grep -E "iPhone|iPad" | head -3

Arguments: ${ARGUMENTS:-setup}


Why Automate Screenshots?

The App Store requires screenshots for multiple device sizes. Capturing them manually means:
- 5+ device sizes × 5+ screenshots × N languages (5 devices × 5 screenshots × 3 languages is already 75 captures) = hours of work
- Risk of inconsistency between screenshots
- Repeating everything for each app update

Snapshot automates this: run once, get all screenshots.


Step 1: Initialize Snapshot

fastlane snapshot init

This creates:
- fastlane/Snapfile - Configuration file
- fastlane/SnapshotHelper.swift - Helper for UI tests


Step 2: Configure Snapfile

Edit fastlane/Snapfile:

# Devices to capture (App Store requirements)
devices([
  "iPhone 15 Pro Max",      # 6.7" display (required)
  "iPhone 15 Pro",          # 6.1" display
  "iPhone SE (3rd generation)", # 4.7" display (if supporting older phones)
  "iPad Pro 13-inch (M4)",  # iPad screenshots (if universal app)
])

# Languages to capture
languages([
  "en-US",
  # "ja",      # Japanese
  # "de-DE",   # German
  # "fr-FR",   # French
  # "es-ES",   # Spanish
])

# UI Test scheme
scheme("YourAppUITests")

# Output directory
output_directory("./fastlane/screenshots")

# Clear old screenshots before capture
clear_previous_screenshots(true)

# Stop on first error (set false to continue despite failures)
stop_after_first_error(true)

# Dark mode variants (iOS 13+)
# dark_mode(true)

# Workspace or project (uncomment one)
# workspace("YourApp.xcworkspace")
# project("YourApp.xcodeproj")

Step 3: Add SnapshotHelper to UI Tests

  1. Add SnapshotHelper.swift to your UI test target:
     • Drag fastlane/SnapshotHelper.swift into Xcode
     • Ensure it's added to your UITests target (not the main app)

  2. Import and configure in your UI test file:

import XCTest

class ScreenshotTests: XCTestCase {

    override func setUpWithError() throws {
        continueAfterFailure = false
        let app = XCUIApplication()
        setupSnapshot(app)  // Initialize snapshot
        app.launch()
    }

    func testTakeScreenshots() throws {
        let app = XCUIApplication()

        // Screenshot 1: Home screen
        snapshot("01_HomeScreen")

        // Navigate to feature and capture
        app.buttons["Feature"].tap()
        snapshot("02_FeatureScreen")

        // Screenshot with content
        app.textFields["Search"].tap()
        app.textFields["Search"].typeText("Example")
        snapshot("03_SearchResults")

        // Settings screen
        app.buttons["Settings"].tap()
        snapshot("04_Settings")

        // Any additional screens...
        snapshot("05_DetailView")
    }
}
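
setupSnapshot(app) launches the app with extra launch arguments, so the app itself can detect a screenshot run and show polished demo content (see Best Practices below). A minimal sketch of the app-side check, assuming the default FASTLANE_SNAPSHOT argument set by the generated SnapshotHelper; the data-loading functions and sample strings are hypothetical placeholders:

import Foundation

// In your app target (not the UI test target). setupSnapshot(app) launches the app
// with "-FASTLANE_SNAPSHOT YES", which UserDefaults picks up at launch.
let isSnapshotRun = UserDefaults.standard.bool(forKey: "FASTLANE_SNAPSHOT")

func makeInitialContent() -> [String] {
    if isSnapshotRun {
        // Hypothetical demo data so screenshots always look populated
        return ["Welcome tour", "Sample project", "Getting started guide"]
    }
    return loadSavedContent()
}

// Hypothetical stand-in for the app's normal data-loading path
func loadSavedContent() -> [String] {
    return []
}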

Step 4: Run Snapshot

# Capture all screenshots
fastlane snapshot

# Specific device only
fastlane snapshot --devices "iPhone 15 Pro Max"

# Specific language only
fastlane snapshot --languages "en-US"

# Don't open the HTML results summary when finished
fastlane snapshot --skip_open_summary

Screenshots are saved to fastlane/screenshots/{language}/{device}/.


Step 5: Upload to App Store Connect

After capturing, upload with deliver:

# Upload screenshots only (no binary)
fastlane deliver --skip_binary_upload --skip_metadata

# Or use the screenshots lane from setup-fastlane
fastlane ios screenshots

App Store Screenshot Requirements (2024)

Required Device Sizes

| Display size | Example devices | Dimensions |
| --- | --- | --- |
| 6.7" | iPhone 15 Pro Max, 14 Pro Max | 1290 × 2796 |
| 6.5" | iPhone 15 Plus, 14 Plus, 11 Pro Max | 1284 × 2778 |
| 5.5" | iPhone 8 Plus (legacy) | 1242 × 2208 |
| 12.9" iPad | iPad Pro 12.9" | 2048 × 2732 |

Minimum: You need at least 6.7" or 6.5" iPhone screenshots. Other sizes can be auto-generated by App Store Connect.

Screenshot Count

  • Minimum: 1 per device size
  • Maximum: 10 per device size
  • Recommended: 5-6 highlighting key features

Optional: Frame Screenshots with Device Bezels

Add device frames around screenshots using frameit:

# frameit ships with fastlane; install its ImageMagick dependency
brew install imagemagick

# Frame screenshots
fastlane frameit

# Silver device frames
fastlane frameit silver

Create fastlane/screenshots/Framefile.json for custom titles:

{
  "default": {
    "title": {
      "font": "./fonts/MyFont.ttf",
      "color": "#000000"
    },
    "background": "#FFFFFF",
    "padding": 50,
    "show_complete_frame": true
  }
}

Troubleshooting

"SnapshotHelper.swift not found"

Re-run fastlane snapshot init and add the helper to your UI test target.

"Unable to boot simulator"

Reset the simulator:

xcrun simctl shutdown all
xcrun simctl erase all

Screenshots are black/blank

  • Ensure setupSnapshot(app) is called before app.launch()
  • Add a small delay if content loads asynchronously (or wait for a specific element, as sketched below):
sleep(1)  // Wait for content
snapshot("01_HomeScreen")

"No matching device found"

Check available simulators:

xcrun simctl list devices available

Update Snapfile device names to match exactly.

UI test fails to find element

Use accessibility identifiers:

// In your app code
button.accessibilityIdentifier = "settingsButton"

// In UI test
app.buttons["settingsButton"].tap()

Integrate with Fastfile

Add a dedicated lane for screenshots:

lane :screenshots do
  snapshot(
    scheme: "YourAppUITests",
    devices: ["iPhone 15 Pro Max", "iPad Pro 13-inch (M4)"],
    languages: ["en-US"]
  )
  # Optional: frame screenshots
  # frameit(white: true)
end

lane :upload_screenshots do
  deliver(
    skip_binary_upload: true,
    skip_metadata: true,
    overwrite_screenshots: true
  )
end

Best Practices

  1. Use sample data: Pre-populate app with attractive demo content
  2. Consistent state: Reset app state before each test run
  3. Accessibility IDs: More reliable than text matching
  4. Handle async: Add waits for network content to load
  5. Dark mode: Capture both light and dark variants
  6. Localization: Test with actual translations, not placeholders
  7. Landscape: Include landscape screenshots for iPad if relevant (see the sketch below)
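
For practice 7, the simulator can be rotated from within the UI test before the capture; a minimal sketch, using a placeholder screenshot name:

// Inside a screenshot test method, before the capture
XCUIDevice.shared.orientation = .landscapeLeft   // rotate the simulator
snapshot("06_Landscape_Home")                     // example screenshot name

// Restore portrait for any later screenshots
XCUIDevice.shared.orientation = .portrait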

Files Created

fastlane/
├── Snapfile                    # Snapshot configuration
├── SnapshotHelper.swift        # Helper for UI tests (copy to test target)
└── screenshots/
    ├── en-US/
    │   ├── iPhone 15 Pro Max/
    │   │   ├── 01_HomeScreen.png
    │   │   ├── 02_FeatureScreen.png
    │   │   └── ...
    │   └── iPad Pro 13-inch (M4)/
    │       └── ...
    └── ja/
        └── ...

Complete Workflow

# 1. Set up snapshot
fastlane snapshot init

# 2. Write UI tests with snapshot() calls

# 3. Capture screenshots
fastlane snapshot

# 4. Review screenshots in fastlane/screenshots/

# 5. Optional: add device frames
fastlane frameit

# 6. Upload to App Store Connect
fastlane deliver --skip_binary_upload --skip_metadata

# Supported AI Coding Agents

This skill is compatible with the SKILL.md standard and works with all major AI coding agents.

Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.