Guide·April 29, 2026·7 min

AI Image Editing: The CHANGE / PRESERVE / MATCH Framework

How to edit AI images without drift, identity loss, or the 'pasted-in' look.

Why editing is harder than generating

Generating an image from a prompt is a single-shot operation. The model has total freedom — it just needs to satisfy your constraints.

Editing is fundamentally different. You're asking the model to change one thing while keeping everything else identical. That's a much harder problem, because "everything else" is a massive set of visual properties: lighting angle, color temperature, skin tones, fabric texture, shadow direction, grain structure, focal plane, and thousands of other details you don't consciously notice until they change.

Without structure, edits drift. Skin tones shift. Backgrounds bleed into subjects. Hair changes texture. Shadows point the wrong way. The result looks "pasted in" — which is worse than just generating from scratch.

The framework: CHANGE / PRESERVE / MATCH

Every edit prompt should have exactly three blocks:

CHANGE: The one thing (or small set of things) you want different. Be specific about what replaces what.

PRESERVE: An explicit, exhaustive list of everything that must stay exactly the same. If you don't name it, the model may change it.

MATCH: The lighting, color, and grain logic that bridges the old and new elements. This is what prevents the "pasted in" look.
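The three blocks are simple enough to generate mechanically. Here is a minimal sketch in Python (the `EditPrompt` class and its field names are illustrative helpers, not part of any real API):

```python
from dataclasses import dataclass

@dataclass
class EditPrompt:
    """Three-block edit prompt: what changes, what stays, what bridges."""
    change: str    # the one thing (or small set of things) to alter
    preserve: str  # exhaustive list of what must stay identical
    match: str     # lighting/color/grain logic bridging old and new

    def render(self) -> str:
        return (
            f"CHANGE: {self.change}\n\n"
            f"PRESERVE: {self.preserve}\n\n"
            f"MATCH: {self.match}"
        )

prompt = EditPrompt(
    change="Replace the background with a rainy Tokyo street at night.",
    preserve="Subject's face, expression, pose, hair, clothing, skin tones.",
    match="Keep original lighting direction; add cool neon color spill.",
)
print(prompt.render())
```

Keeping the three blocks as separate fields makes it harder to accidentally drop one, which is exactly the failure mode the framework guards against.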

Why three blocks, not two

Most people write edit prompts with just CHANGE and PRESERVE. They skip MATCH — and that's where the uncanny results come from.

When you swap a studio background for a rainy street, the subject's lighting shouldn't stay studio-flat. The new environment should cast subtle color spill onto the subject — cool blue from neon, warm orange from streetlights. Without the MATCH block, the model either keeps the original lighting (subject looks pasted) or re-lights everything (subject's face changes).

MATCH tells the model: keep the original lighting direction and intensity on the subject, but let the new environment influence color temperature and reflections naturally.

Example 1: Background swap

Rough idea: "Put this portrait in a rainy Tokyo street at night"

CHANGE: Replace the background with a rainy Tokyo street at night —
neon signs reflecting on wet pavement, blurred passing pedestrians,
soft bokeh from illuminated signage, light rain visible against
dark sky.

PRESERVE: Subject's face, expression, pose, hair, clothing, body
framing, skin tones, and exact silhouette — locked as in source.

MATCH: Keep original lighting direction on subject (rim from
camera-right). Subject's color temperature stays neutral, but add
subtle cool blue and magenta reflections on clothing and skin from
neon environment. Match original film grain and depth of field.

Example 2: Object removal

Rough idea: "Remove the trash can and power lines"

CHANGE: Remove the green trash can in lower-left corner and the
power line crossing the upper third. Reconstruct the wall behind
the trash can using the existing brick pattern. Reconstruct the
sky as a clean continuous gradient matching the surrounding area.

PRESERVE: Every person, the storefront awning, parked car,
street signs, sidewalk texture, all cast shadows, window
reflections, and wall graffiti.

MATCH: Golden hour lighting from camera-right, warm color
temperature, existing film grain and vignette.

Example 3: Style transfer

Rough idea: "Make this photo look like a watercolor"

CHANGE: Convert the photographic style to loose watercolor
painting — visible paper texture, soft wet-on-wet color bleeds,
unpainted white areas where highlights fall, slightly imprecise
edges.

PRESERVE: Composition, subject placement, relative proportions,
facial features and expression, clothing silhouette, background
element positions.

MATCH: Maintain the same value structure (darks and lights in the
same places). Simplify the color palette to 5-6 watercolor-plausible
hues. Preserve the original light direction through shadow placement.

Multi-turn editing: the drift problem

Single edits are manageable. The real challenge is multi-turn editing — making 3, 4, 5 changes to the same image across multiple generations. Each turn introduces small drift, and drift compounds.

By turn 3 or 4, the subject's face has subtly changed, the lighting is inconsistent, and the image has lost coherence.

The fix: restate PRESERVE every turn. Don't rely on the model remembering what to keep from previous turns. Copy-paste your full PRESERVE block into every edit prompt, updated to include anything new you added in previous turns.

The second fix: work from the best previous output, not the original. Each edit should reference the most recent good result, so the chain stays grounded in what you have already accepted.
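Both fixes can be sketched as a loop (hypothetical Python; `generate_edit` stands in for whatever image-editing call you use and is not a real API):

```python
prompts_sent = []  # record of what each turn actually asked for

def generate_edit(image, prompt):
    """Placeholder for your image-editing call (hypothetical)."""
    prompts_sent.append(prompt)
    return image  # a real call would return the edited image

def multi_turn_edit(original, turns, preserve_block):
    """Apply one edit per turn, restating the full PRESERVE block every
    time and chaining from the last accepted output, not the original."""
    current = original
    for change, match in turns:
        prompt = (
            f"CHANGE: {change}\n\n"
            f"PRESERVE: {preserve_block}\n\n"  # restated in full, every turn
            f"MATCH: {match}"
        )
        current = generate_edit(current, prompt)  # chain from last good result
        # Fold this turn's addition into PRESERVE so later turns keep it:
        preserve_block += f"; the result of: {change}"
    return current

result = multi_turn_edit(
    "portrait.png",
    [("Swap background to rainy street.", "Keep rim light from camera-right."),
     ("Remove the lamppost.", "Match existing grain.")],
    preserve_block="Face, pose, hair, clothing, skin tones.",
)
```

The key detail is the last line of the loop body: each turn's CHANGE is appended to the PRESERVE block, so turn 3 explicitly protects what turns 1 and 2 added.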

When to edit vs. when to regenerate

Edit when:

  • You like 80%+ of the image and want to fix one element
  • The composition, pose, and lighting are right but a detail is wrong
  • You need to swap a background or remove an object
  • You're doing style transfer on a specific composition

Regenerate when:

  • The composition is fundamentally wrong
  • Multiple major elements need changing
  • The pose or framing doesn't work
  • You're on edit turn 4+ and drift has accumulated

Regenerating with a refined prompt is often faster than fixing a broken edit chain.
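The two checklists above condense into a rough heuristic (a sketch only; the thresholds mirror the rules of thumb in this post, not hard limits):

```python
def should_regenerate(kept_fraction: float, major_changes: int, turn: int) -> bool:
    """Rough heuristic: regenerate instead of editing when less than ~80%
    of the image is right, multiple major elements must change, or the
    edit chain has reached turn 4+ and drift has accumulated."""
    return kept_fraction < 0.8 or major_changes > 1 or turn >= 4

# Like 90% of the image, one fix needed, first edit -> keep editing
print(should_regenerate(0.9, 1, 1))  # False
# Same image, but on turn 5 of an edit chain -> start over
print(should_regenerate(0.9, 1, 5))  # True
```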

Common editing failures and fixes

"Pasted in" look: You're missing the MATCH block. Add color spill, reflection, and grain matching from the new environment.

Face drift after 2-3 edits: Your PRESERVE block isn't specific enough. Add: "facial bone structure, eye color, skin texture, exact hairline."

Shadow inconsistency: Name the light direction in MATCH. "Shadows cast to lower-left, consistent with key light from upper-right."

Background bleeding into subject edges: Add to PRESERVE: "exact silhouette edges, no feathering or blending at subject boundary."

Grain mismatch: Add to MATCH: "Match ISO grain structure and intensity of the source image."

The shortcut

Writing CHANGE / PRESERVE / MATCH blocks manually is tedious, especially the exhaustive PRESERVE list. Depikt's critique tool can analyze your edit prompts and flag missing PRESERVE elements before you waste a generation on drift.

Generate yours

Generate polished prompts in seconds.

Paste a rough idea. Get back a structured prompt that ships.