Image Weight on Portfolio Sites: The 2026 Compression Workflow That Holds Up
Compression is not a one-off task. Here is the 2026 workflow that keeps portfolio images sharp, small, and maintainable across projects.

Why Workflows Matter More Than Tools
Every few months, a new image optimization tool appears. It promises better compression, faster encoding, or simpler integration. Studios try it on one project, get decent results, then forget about it. The next project goes back to manual Photoshop exports or no optimization at all.
The problem is not tools. The problem is the lack of a repeatable process that runs the same way every time, regardless of who adds images, which project it is, or whether anyone remembers to optimize. A compression workflow needs to be automatic, consistent, and integrated into the build pipeline so that unoptimized images cannot reach production.
This article describes the workflow we use across projects. It handles multi-format output, responsive sizing, quality validation, and integration with static site generators. It is not tied to any specific tool, though we will name the ones we use.
The Pipeline
The workflow has four stages:
1. Source image preparation
Original images go into a source directory that is not served to visitors. These are the highest quality versions: full-resolution exports from Figma, Photoshop, or camera RAW processing. They should be stored at the maximum quality you will ever need, typically at 2x the largest display size.
For a portfolio hero image that displays at 1600 CSS pixels on desktop, the source should be at least 3200 pixels wide. For a supporting image at 1200 CSS pixels, the source should be 2400 pixels wide.
Store these as lossless PNG or high-quality JPEG (quality 95+). They are your master copies. Everything downstream is derived from these.
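As a sketch, a pre-flight check along these lines (the directory path and the 3200px threshold are illustrative assumptions, not fixed rules) can flag undersized masters before they enter the pipeline:

```js
// check-sources.mjs -- flag master images that fall below the 2x rule.
// The source directory and minimum width are illustrative assumptions.
import { readdir } from "node:fs/promises";
import path from "node:path";
import sharp from "sharp";

const SOURCE_DIR = "src/images"; // masters, never served to visitors
const MIN_WIDTH = 3200;          // 2x a 1600 CSS px hero display size

const files = (await readdir(SOURCE_DIR)).filter((f) => /\.(png|jpe?g)$/i.test(f));
for (const file of files) {
  const { width } = await sharp(path.join(SOURCE_DIR, file)).metadata();
  if (width && width < MIN_WIDTH) {
    console.warn(`${file}: ${width}px wide, below the ${MIN_WIDTH}px minimum`);
  }
}
```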
2. Resize and crop
Generate the responsive variants your site needs. Typical breakpoints for portfolio sites:
- 400px (mobile, 1x)
- 800px (mobile, 2x or tablet, 1x)
- 1200px (tablet, 2x or desktop, 1x)
- 1600px (desktop, standard)
- 2400px (desktop, high DPI, optional)
Use a tool that handles this automatically. Sharp (Node.js), Pillow (Python), or ImageMagick (CLI) all work. The key is that the resize step is scripted, not manual.
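A minimal sketch of the scripted resize step with Sharp, assuming the breakpoints above; the paths and intermediate naming are illustrative:

```js
// resize.mjs -- generate the responsive width variants for one source image.
// Paths and naming are illustrative; widths match the breakpoints above.
import sharp from "sharp";

const WIDTHS = [400, 800, 1200, 1600, 2400];

async function resizeVariants(sourcePath, outDir, slug) {
  for (const width of WIDTHS) {
    await sharp(sourcePath)
      .resize({ width, withoutEnlargement: true }) // never upscale a master
      .toFile(`${outDir}/${slug}-${width}.png`);   // lossless intermediate
  }
}

await resizeVariants("src/images/hero.png", "build/resized", "hero");
```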
Cropping should be art-directed for key breakpoints. A desktop hero image with a subject on the right may need a different crop on mobile to keep the subject visible. This is the one step that benefits from human judgment, at least for hero images.
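Sharp can still automate the mechanical part of an art-directed crop. A sketch with illustrative dimensions: a gravity-positioned cover crop for the desktop variant, and Sharp's attention strategy as a starting point for mobile, with the output reviewed by a human either way:

```js
import sharp from "sharp";

// Desktop hero: keep the right-hand subject by cropping from the right edge.
await sharp("src/images/hero.png")
  .resize(1600, 900, { fit: "cover", position: "right" })
  .toFile("build/resized/hero-1600x900.png");

// Mobile crop: let Sharp's attention strategy find the salient region.
await sharp("src/images/hero.png")
  .resize(800, 1000, { fit: "cover", position: sharp.strategy.attention })
  .toFile("build/resized/hero-800x1000.png");
```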
3. Encode to multiple formats
For each resized variant, generate:
- AVIF at quality 55 to 65
- WebP at quality 78 to 82
- JPEG at quality 80 to 85
These quality targets are starting points. Visually inspect a representative sample and adjust. Some images (high-detail product photography) need higher quality. Others (textured backgrounds, blurred bokeh shots) can tolerate lower quality.
We use Sharp for encoding because it handles all three formats, runs in Node.js (matching our build toolchain), and is fast enough for batch processing.
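A sketch of the encoding step, using the starting-point qualities above; the paths and output naming are illustrative:

```js
// encode.mjs -- emit AVIF, WebP, and JPEG for one resized variant.
// Quality values follow the starting points above; adjust per image type.
import sharp from "sharp";

async function encodeVariant(inputPath, outBase) {
  const image = sharp(inputPath);
  await Promise.all([
    image.clone().avif({ quality: 60 }).toFile(`${outBase}.avif`),
    image.clone().webp({ quality: 80 }).toFile(`${outBase}.webp`),
    image.clone().jpeg({ quality: 82, mozjpeg: true }).toFile(`${outBase}.jpg`),
  ]);
}

await encodeVariant("build/resized/hero-1600x900.png", "public/images/hero-1600x900");
```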
4. Validate and deploy
Before deploying, validate that:
- Every source image has all expected format variants
- No variant exceeds the size budget (we use 150KB per image as a soft limit for hero images)
- No variant has visible artifacts at normal viewing distances
This validation step catches missing images, botched crops, and accidental quality regressions. It runs as part of the build pipeline.
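A sketch of the completeness check, assuming the sizes and output directory shown here (the naming convention itself is covered below):

```js
// validate.mjs -- fail the build when an expected variant is missing.
// The sizes, formats, and output directory are illustrative.
import { access } from "node:fs/promises";

const SIZES = ["400x225", "800x450", "1200x675", "1600x900"];
const FORMATS = ["avif", "webp", "jpg"];

async function assertVariants(slug, outDir = "public/images") {
  for (const size of SIZES) {
    for (const format of FORMATS) {
      const file = `${outDir}/${slug}-${size}.${format}`;
      await access(file).catch(() => {
        throw new Error(`Missing variant: ${file}`);
      });
    }
  }
}

await assertVariants("journal/core-web-vitals");
```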
Integration With Static Sites
For a static site generator like Astro, Eleventy, or a static export from Next.js, the compression pipeline runs at build time.
The approach we use:
- Source images live in a src/images/ directory (or equivalent source folder)
- A build script processes them into public/images/ with proper naming conventions
- The site templates reference images by a naming convention that assumes all variants exist
- The build fails if any expected variant is missing
On this very site, images follow a path convention like /images/journal/. The download and compression step runs before the main build and produces the files that the templates expect. The image manifest tracks what exists and what is missing.
The Naming Convention
Consistent naming is surprisingly important. When images follow a predictable pattern, you can generate picture elements and srcset attributes programmatically and validate image completeness with scripts.
Our convention:
```
{section}/{slug}-{width}x{height}.{format}
```
Examples:
```
journal/core-web-vitals-1600x900.jpg
journal/core-web-vitals-1600x900.avif
journal/core-web-vitals-800x450.webp
work/project-name-hero-1600x900.jpg
```
This convention makes it trivial to check whether a file exists, generate responsive srcset values, and find orphaned images that are no longer referenced.
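For example, srcset generation becomes a few lines of string assembly; the 16:9 ratio and width list here are illustrative:

```js
// Given the naming convention above, build a srcset string for one format.
// The 16:9 aspect ratio and the width list are illustrative assumptions.
function srcsetFor(slug, format, widths = [400, 800, 1200, 1600]) {
  return widths
    .map((w) => `/images/${slug}-${w}x${Math.round((w * 9) / 16)}.${format} ${w}w`)
    .join(", ");
}

// "/images/journal/core-web-vitals-400x225.avif 400w, ..."
console.log(srcsetFor("journal/core-web-vitals", "avif"));
```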
Tools We Actually Use
Sharp (Node.js): Primary encoding tool. Fast, supports AVIF/WebP/JPEG, integrates with Node.js build scripts. We use it for both resizing and encoding.
Squoosh CLI: Useful for one-off compression experiments and quality comparisons. Not ideal for pipeline automation, but excellent for finding the right quality settings before automating.
ImageMagick: For batch operations like cropping, color space conversion, and metadata stripping. Heavier than Sharp but more flexible for edge cases.
Pillow (Python): When the build pipeline is Python-based (like our image downloader), Pillow handles resize and JPEG encoding well. For AVIF, pillow-avif-plugin adds support.
Handling Client-Provided Images
Studio projects often include images provided by clients: product photography, team photos, event images. These arrive in unpredictable formats and sizes. The workflow needs to handle them gracefully.
Our approach:
- Drop client images into the source directory
- Run the same pipeline as internally produced images
- Manually review the output for quality, especially if client images are low-resolution to start with
If a client image is too small for the required display size, do not upscale it. Serve it at its native resolution and accept the quality limitation. Upscaled images look worse than slightly small images and waste bandwidth.
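Sharp's withoutEnlargement option enforces this rule mechanically; the paths here are illustrative:

```js
import sharp from "sharp";

// When the requested width exceeds the source, withoutEnlargement
// leaves the image at its native resolution instead of upscaling.
await sharp("src/images/client-photo.jpg")
  .resize({ width: 1600, withoutEnlargement: true })
  .toFile("build/resized/client-photo-1600.jpg");
```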
Budget Enforcement
Page weight budgets only work if they are enforced automatically. Our build pipeline includes a check that fails if any image variant exceeds its budget:
- Hero images (1600px wide): max 150KB JPEG, max 80KB AVIF
- Supporting images (1200px wide): max 100KB JPEG, max 60KB AVIF
- Thumbnails (400px wide): max 30KB JPEG, max 15KB AVIF
These are soft limits. If an image legitimately needs more (high-detail technical diagram, for example), we override the limit explicitly with a documented reason. But the default enforcement catches accidental over-sized images before they reach production.
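A sketch of the enforcement check, using the budgets above; the override allowlist is an assumption about how exceptions get documented:

```js
// budget.mjs -- enforce the per-variant size budgets above (bytes).
// The override allowlist with documented reasons is an assumption.
import { stat } from "node:fs/promises";

const BUDGETS = {
  "1600": { jpg: 150_000, avif: 80_000 },
  "1200": { jpg: 100_000, avif: 60_000 },
  "400":  { jpg: 30_000,  avif: 15_000 },
};

const OVERRIDES = new Set([
  // "images/work/wiring-diagram-1600x900.jpg", // high-detail diagram
]);

async function checkBudget(file, width, format) {
  if (OVERRIDES.has(file)) return;
  const budget = BUDGETS[width]?.[format];
  if (!budget) return; // no budget defined for this variant
  const { size } = await stat(file);
  if (size > budget) {
    throw new Error(`${file}: ${size} bytes exceeds ${budget} byte budget`);
  }
}

await checkBudget("public/images/hero-1600x900.jpg", "1600", "jpg");
```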
Common Mistakes
Optimizing images manually. Any process that depends on a human remembering to compress images will eventually produce unoptimized images. Automate it.
Using one quality setting for all images. A hero photograph and a screenshot of a code editor have different compression characteristics. The screenshot might compress well at quality 60 while the photograph needs quality 80. Use different quality presets for different image types.
Ignoring image metadata. Camera images include EXIF metadata (GPS coordinates, camera settings, timestamps) that adds kilobytes to the file and may include privacy-sensitive data. Strip metadata during processing.
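Sharp makes this the default: output images carry no metadata unless you ask for it. A minimal illustration:

```js
import sharp from "sharp";

// Sharp strips EXIF, GPS, and other metadata on output by default;
// only an explicit .withMetadata() call would preserve it.
await sharp("src/images/camera-photo.jpg")
  .jpeg({ quality: 82 })
  .toFile("public/images/camera-photo.jpg");
```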
Serving full-width images to mobile. Without responsive sizing, a mobile visitor downloads the full desktop image and the browser downscales it in the viewport. This wastes bandwidth and slows LCP for no visual benefit.
FAQ
How much time does the compression pipeline add to the build?
For a portfolio site with 50 to 100 images and three format variants each, the pipeline runs in 30 to 90 seconds. That is a one-time cost per build, not per page.
Should I use a CDN image service instead of build-time processing?
Both approaches work. CDN image services (Cloudinary, Imgix, Cloudflare Images) handle resizing and format negotiation at request time, which is simpler to set up. Build-time processing gives you more control and avoids runtime dependencies. For static sites, build-time processing is usually the better fit.
What about AI-powered upscaling for low-resolution client images?
AI upscaling has improved significantly, but the results are inconsistent for portfolio-quality requirements. Use it cautiously and always compare the upscaled result against the original at the target display size.
The Core Principle
A compression workflow is only as good as its worst image. If 49 out of 50 images are optimized but one hero image ships at 2 MB because someone forgot the pipeline, the page weight budget is blown. Automation removes the possibility of forgetting. That is the whole point.