
Automating Image Optimization in Your CI/CD Pipeline

Manual image optimization does not scale. A developer remembers to run an image through a compression tool before committing it, but another developer forgets. A designer uploads a 4 MB hero image directly to the CMS. A batch of product photos lands in the repository untouched. Over time, the problem compounds, and your site slows down one unoptimized image at a time.

The solution is to take human judgment out of the equation entirely. By integrating image optimization into your CI/CD pipeline, every image that reaches production is guaranteed to be compressed, regardless of who committed it or how it got there.

This tutorial covers several approaches to pipeline integration using the MegaOptim API, from GitHub Actions and GitLab CI to pre-commit hooks and build tool plugins.

Why Manual Optimization Fails at Scale

On a small team shipping a few pages a month, manual optimization might work. Someone runs images through a tool, checks the output, and commits the result. But this approach breaks down quickly:

  • Inconsistency. Different team members use different tools with different settings. One developer compresses to 60% quality, another to 85%. There is no single standard.
  • Forgotten steps. Under deadline pressure, optimization is the first step to be skipped. It is invisible until a performance audit surfaces the bloated images weeks later.
  • No enforcement. Code reviews catch bad logic, but reviewers rarely check whether a 2.4 MB PNG could have been a 180 KB WebP.
  • Volume. E-commerce sites with thousands of product images, content-heavy blogs, and user-generated content platforms simply cannot rely on someone manually processing every file.

Automation eliminates all of these problems. The pipeline applies the same compression settings to every image, every time, with zero developer effort after initial setup.

Where to Add Optimization in the Pipeline

There are three natural integration points, each with different tradeoffs:

Pre-commit hooks catch unoptimized images before they enter the repository. This is the earliest intervention point and keeps the repository clean, but it requires every developer to have the hook installed locally.

Build step optimization runs during CI, compressing images as part of the build process. This is the most reliable approach because it runs in a controlled environment regardless of individual developer setups.

Deployment step optimization happens just before or during deployment. This works well when images are sourced from a CMS or external storage rather than the repository itself.

For most teams, the build step is the right default. It is centralized, consistent, and does not depend on local developer tooling.

GitHub Actions Example

The following workflow finds the images changed in the latest commit, sends each one to the MegaOptim API for optimization, and replaces the originals with the compressed versions. It runs on every push that includes image file changes.

name: Optimize Images

on:
  push:
    paths:
      - '**.jpg'
      - '**.jpeg'
      - '**.png'
      - '**.webp'

jobs:
  optimize:
    runs-on: ubuntu-latest
    permissions:
      contents: write  # allow the workflow to push the optimized images back
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 2  # fetch the previous commit so HEAD~1 exists for the diff

      - name: Find and optimize images
        env:
          MEGAOPTIM_API_KEY: ${{ secrets.MEGAOPTIM_API_KEY }}
        run: |
          # Find changed image files
          CHANGED_IMAGES=$(git diff --name-only HEAD~1 HEAD -- '*.jpg' '*.jpeg' '*.png' '*.webp' || true)

          for IMAGE in $CHANGED_IMAGES; do
            [ -f "$IMAGE" ] || continue
            echo "Optimizing: $IMAGE"

            # Submit image for optimization
            RESPONSE=$(curl -s -X POST https://api.megaoptim.com/v1/optimize \
              -H "X-API-Key: $MEGAOPTIM_API_KEY" \
              -F "file=@$IMAGE" \
              -F "compression=intelligent")

            REQUEST_ID=$(echo "$RESPONSE" | jq -r '.request_id')

            # Poll for result
            for i in $(seq 1 30); do
              RESULT=$(curl -s https://api.megaoptim.com/v1/optimize/$REQUEST_ID/result \
                -H "X-API-Key: $MEGAOPTIM_API_KEY")
              STATUS=$(echo "$RESULT" | jq -r '.status')

              if [ "$STATUS" = "completed" ]; then
                DOWNLOAD_URL=$(echo "$RESULT" | jq -r '.result.url')
                curl -s -o "$IMAGE" "$DOWNLOAD_URL"
                SAVINGS=$(echo "$RESULT" | jq -r '.result.saved_percent')
                echo "  Saved ${SAVINGS}%"
                break
              fi
              sleep 2
            done
          done

      - name: Commit optimized images
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add -A
          git diff --staged --quiet || git commit -m "Optimize images via MegaOptim"
          git push

Store your API key as a repository secret named MEGAOPTIM_API_KEY. The workflow only processes images that changed in the most recent commit, so it does not re-optimize files that have already been compressed.
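
One caveat: a single push can contain several commits, and diffing against HEAD~1 misses images added earlier in the range. A possible refinement, sketched here as a helper function (it assumes jq is available on the runner and fetch-depth: 0 on the checkout step, so the pre-push SHA is reachable):

```shell
# Sketch: list changed images across the entire pushed range, not just the
# last commit. GitHub writes the push event payload to $GITHUB_EVENT_PATH;
# its ".before" field is the SHA the branch pointed at before the push.
changed_images_for_push() {
  local base
  base=$(jq -r '.before' "$GITHUB_EVENT_PATH")
  git diff --name-only "$base" HEAD -- '*.jpg' '*.jpeg' '*.png' '*.webp'
}
```

You would then iterate over `changed_images_for_push` instead of the `git diff HEAD~1 HEAD` output.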

GitLab CI Example

The same approach works in GitLab CI. Add this to your .gitlab-ci.yml:

optimize-images:
  stage: build
  image: alpine:latest
  before_script:
    - apk add --no-cache curl jq git bash
  script:
    - |
      CHANGED_IMAGES=$(git diff --name-only HEAD~1 HEAD -- '*.jpg' '*.jpeg' '*.png' '*.webp' || true)
      for IMAGE in $CHANGED_IMAGES; do
        [ -f "$IMAGE" ] || continue
        echo "Optimizing: $IMAGE"

        RESPONSE=$(curl -s -X POST https://api.megaoptim.com/v1/optimize \
          -H "X-API-Key: $MEGAOPTIM_API_KEY" \
          -F "file=@$IMAGE" \
          -F "compression=intelligent")

        REQUEST_ID=$(echo "$RESPONSE" | jq -r '.request_id')

        for i in $(seq 1 30); do
          RESULT=$(curl -s https://api.megaoptim.com/v1/optimize/$REQUEST_ID/result \
            -H "X-API-Key: $MEGAOPTIM_API_KEY")
          STATUS=$(echo "$RESULT" | jq -r '.status')
          if [ "$STATUS" = "completed" ]; then
            DOWNLOAD_URL=$(echo "$RESULT" | jq -r '.result.url')
            curl -s -o "$IMAGE" "$DOWNLOAD_URL"
            echo "  Saved $(echo "$RESULT" | jq -r '.result.saved_percent')%"
            break
          fi
          sleep 2
        done
      done
  only:
    changes:
      - '**/*.{jpg,jpeg,png,webp}'

Add MEGAOPTIM_API_KEY as a CI/CD variable in your GitLab project settings; it is injected into the job environment automatically. Note that this job optimizes images in its own workspace only: to persist the results, pass them to later stages as artifacts or add a commit-and-push step similar to the GitHub Actions example.

Pre-Commit Hooks for Images

For teams that want to catch unoptimized images before they even reach CI, a Git pre-commit hook provides an immediate feedback loop. Create .git/hooks/pre-commit (or use a framework like pre-commit):

#!/bin/bash
# Pre-commit hook: optimize staged images via MegaOptim API

STAGED_IMAGES=$(git diff --cached --name-only --diff-filter=ACM -- '*.jpg' '*.jpeg' '*.png' '*.webp')

if [ -z "$STAGED_IMAGES" ]; then
  exit 0
fi

if [ -z "$MEGAOPTIM_API_KEY" ]; then
  echo "Error: MEGAOPTIM_API_KEY not set. Aborting commit."
  exit 1
fi

for IMAGE in $STAGED_IMAGES; do
  echo "Optimizing: $IMAGE"

  RESPONSE=$(curl -s -X POST https://api.megaoptim.com/v1/optimize \
    -H "X-API-Key: $MEGAOPTIM_API_KEY" \
    -F "file=@$IMAGE" \
    -F "compression=intelligent")

  REQUEST_ID=$(echo "$RESPONSE" | jq -r '.request_id')

  for i in $(seq 1 30); do
    RESULT=$(curl -s https://api.megaoptim.com/v1/optimize/$REQUEST_ID/result \
      -H "X-API-Key: $MEGAOPTIM_API_KEY")
    STATUS=$(echo "$RESULT" | jq -r '.status')
    if [ "$STATUS" = "completed" ]; then
      curl -s -o "$IMAGE" "$(echo "$RESULT" | jq -r '.result.url')"
      git add "$IMAGE"
      echo "  Optimized and re-staged."
      break
    fi
    sleep 2
  done
done

The hook optimizes each staged image and re-adds the compressed version to the staging area. The commit proceeds with the optimized files. Set the MEGAOPTIM_API_KEY environment variable in your shell profile or a local .env file that is excluded from version control.
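
If your team standardizes on the pre-commit framework, the hook can be declared once in the repository rather than installed by hand on every machine. A minimal sketch, assuming you commit the hook script above under a `scripts/` directory (the `id`, `name`, and `entry` values here are illustrative):

```yaml
# .pre-commit-config.yaml
repos:
  - repo: local
    hooks:
      - id: megaoptim-images
        name: Optimize images via MegaOptim
        entry: scripts/optimize-staged-images.sh
        language: script
        files: \.(jpe?g|png|webp)$
```

Developers then run `pre-commit install` once, and the hook is managed like any other dependency.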

Build Tool Integration

If your project uses a JavaScript bundler, you can optimize images as part of the build process using plugins that handle image assets.

For Vite, the vite-plugin-imagemin plugin compresses images during the build. For deeper integration with MegaOptim, you can write a custom plugin that sends images to the API during the build step:

// vite.config.js
import { defineConfig } from 'vite';

export default defineConfig({
  plugins: [
    {
      name: 'megaoptim-optimize',
      async generateBundle(_, bundle) {
        for (const [fileName, asset] of Object.entries(bundle)) {
          if (asset.type === 'asset' && /\.(jpe?g|png|webp)$/i.test(fileName)) {
            // Send asset.source (Buffer) to MegaOptim API
            // Replace asset.source with the optimized result
          }
        }
      }
    }
  ]
});

For webpack, the image-minimizer-webpack-plugin supports custom optimization functions where the MegaOptim API can be called during asset processing.

Build tool integration is best suited for projects where images are bundled application assets rather than user-uploaded content.

Using the MegaOptim API in Custom Scripts

For workflows that do not fit neatly into CI or build tools, a standalone script gives you full control. This is useful for batch processing images in a content directory, optimizing uploads in a staging environment, or integrating with a custom CMS deployment process.

#!/bin/bash
# optimize-directory.sh - Optimize all images in a directory

DIRECTORY=${1:-.}
API_KEY="${MEGAOPTIM_API_KEY}"
COMPRESSION="${2:-intelligent}"

find "$DIRECTORY" -type f \( -name '*.jpg' -o -name '*.jpeg' -o -name '*.png' -o -name '*.webp' \) | while IFS= read -r IMAGE; do
  ORIGINAL_SIZE=$(stat -c%s "$IMAGE" 2>/dev/null || stat -f%z "$IMAGE")  # GNU stat, BSD fallback

  RESPONSE=$(curl -s -X POST https://api.megaoptim.com/v1/optimize \
    -H "X-API-Key: $API_KEY" \
    -F "file=@$IMAGE" \
    -F "compression=$COMPRESSION")

  REQUEST_ID=$(echo "$RESPONSE" | jq -r '.request_id')

  for i in $(seq 1 30); do
    RESULT=$(curl -s https://api.megaoptim.com/v1/optimize/$REQUEST_ID/result \
      -H "X-API-Key: $API_KEY")
    STATUS=$(echo "$RESULT" | jq -r '.status')
    if [ "$STATUS" = "completed" ]; then
      curl -s -o "$IMAGE" "$(echo "$RESULT" | jq -r '.result.url')"
      NEW_SIZE=$(stat -c%s "$IMAGE" 2>/dev/null || stat -f%z "$IMAGE")  # GNU stat, BSD fallback
      SAVED=$(( (ORIGINAL_SIZE - NEW_SIZE) * 100 / ORIGINAL_SIZE ))
      echo "$IMAGE: ${ORIGINAL_SIZE}B -> ${NEW_SIZE}B (${SAVED}% saved)"
      break
    fi
    sleep 2
  done
done

For a full walkthrough of the API endpoints used in these scripts, see the API getting started guide.

Monitoring Optimization Results

Automation without visibility is just hope. After setting up pipeline optimization, track the results to confirm it is working and to catch regressions.

Log output from CI jobs. The examples above print per-image savings. Aggregate these into a summary at the end of the job — total images processed, total bytes saved, average compression ratio.
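
As a sketch, a small helper (the function name is illustrative) can turn per-image before/after byte counts into that end-of-job summary:

```shell
# Illustrative summary helper: reads "original_bytes new_bytes" pairs, one
# per optimized image, and prints job-level totals.
summarize_savings() {
  awk '
    { total_orig += $1; total_new += $2; count++ }
    END {
      saved = total_orig - total_new
      printf "images=%d bytes_saved=%d avg_ratio=%.1f%%\n",
        count, saved, (total_orig ? saved * 100 / total_orig : 0)
    }'
}

# Example: two images, 1000 -> 400 bytes and 2000 -> 1000 bytes
printf '1000 400\n2000 1000\n' | summarize_savings
# -> images=2 bytes_saved=1600 avg_ratio=53.3%
```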

Set size budgets. Add a CI step that fails the build if any single image exceeds a size threshold (for example, 500 KB for content images, 200 KB for thumbnails). This prevents oversized images from slipping through even if the optimization step is misconfigured.
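
A minimal sketch of such a gate, assuming bash and a flat 500 KB budget (adjust the threshold and file patterns to your project):

```shell
#!/bin/bash
# Size-budget gate (sketch): report any image over the byte budget.
BUDGET_BYTES=$((500 * 1024))

check_budget() {
  # $1 = file path, $2 = budget in bytes; returns nonzero if over budget
  local size
  size=$(stat -c%s "$1" 2>/dev/null || stat -f%z "$1")  # GNU stat, BSD fallback
  if [ "$size" -gt "$2" ]; then
    echo "FAIL: $1 is ${size} bytes (budget: $2)"
    return 1
  fi
}

FAIL=0
while IFS= read -r IMAGE; do
  check_budget "$IMAGE" "$BUDGET_BYTES" || FAIL=1
done < <(find . -type f \( -name '*.jpg' -o -name '*.jpeg' \
  -o -name '*.png' -o -name '*.webp' \))
[ "$FAIL" -eq 0 ] || echo "Size budget exceeded; fail the build here with 'exit 1'."
```

In CI, end the step with `exit $FAIL` so an over-budget image fails the job.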

Track API usage. The MegaOptim dashboard shows your API call history and remaining balance. Set up alerts if your usage approaches your plan limits so that optimization does not silently stop.

Storing Optimized Images: Cache and Artifacts

Repeated optimization of the same unchanged images wastes API calls and slows down builds. Use caching to avoid reprocessing:

CI cache by file hash. Before optimizing, compute a hash of each image. Store a mapping of hash-to-optimized-file in the CI cache. On subsequent runs, skip any image whose hash matches a cached entry.
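
Sketched in bash (the `.megaoptim-cache` file name and helper names are illustrative; persist the file between runs with your CI's cache mechanism):

```shell
# Hash-based skip list (sketch): record the SHA-256 of every image that has
# already been optimized, and skip matching files on later runs.
CACHE_FILE=".megaoptim-cache"
touch "$CACHE_FILE"

already_optimized() {
  grep -qxF "$1" "$CACHE_FILE"
}

mark_optimized() {
  echo "$1" >> "$CACHE_FILE"
}

process_image() {
  local hash
  hash=$(sha256sum "$1" | cut -d' ' -f1)
  if already_optimized "$hash"; then
    echo "Skipping (cached): $1"
    return 0
  fi
  # ... call the MegaOptim API here, as in the earlier examples ...
  # Record the hash of the now-optimized file so later runs skip it.
  mark_optimized "$(sha256sum "$1" | cut -d' ' -f1)"
}
```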

Build artifacts. In GitHub Actions, use actions/cache to persist the optimized images directory between runs. In GitLab CI, define the images directory as an artifact or use the built-in cache mechanism.
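
In GitHub Actions, caching a hash manifest like the one described above might look like this fragment (the file name and key scheme are assumptions; `hashFiles` keys the cache to the current set of images):

```yaml
- uses: actions/cache@v4
  with:
    path: .megaoptim-cache   # manifest of hashes for already-optimized images
    key: megaoptim-${{ hashFiles('**/*.jpg', '**/*.jpeg', '**/*.png', '**/*.webp') }}
    restore-keys: |
      megaoptim-
```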

Content-addressable storage. For large-scale pipelines, store optimized images in a content-addressable store (keyed by the original file’s hash). This works especially well for monorepos or multi-project setups where the same image might appear in multiple locations.

The goal is simple: each unique image should be optimized exactly once.

Next Steps

Integrating image optimization into your CI/CD pipeline turns a manual, error-prone task into a reliable, zero-effort process. Start with the approach that fits your existing tooling — GitHub Actions or GitLab CI for most teams, pre-commit hooks for stricter workflows, build plugins for JavaScript-heavy projects.

If you are new to the MegaOptim API, start with the getting started guide to understand the endpoints and authentication. For background on why image optimization matters and how compression techniques work, see What Is Image Optimization.