
Multi-Room + Multi-Angle Consistency Pipeline

Use reference-anchored generation plus stable anchor prompts to keep style and identity aligned across room sets and product angle sets. This guide uses shipped endpoints only.

Why this closes active sales objections

Property-level room consistency

Keep geometry and staging style aligned per listing with one anchor strategy.

SKU-level angle consistency

Run front, 45-degree, and detail shots while preserving product identity.

One CSV for both segments

A single schema supports real estate and product-photo pipelines.

Retry-safe outputs

Persist shot-level status lines and replay only failed shots.

1. Prepare a mixed consistency job CSV

Group all room shots for one property and all angle shots for one SKU under the same anchor_id.

csv
job_id,segment,anchor_id,source_image_url,shot_name,prompt_append
RE-9001,living_room,prop-1107,"https://cdn.example.com/listings/1107/living-room-empty.jpg",daylight_stage,"Scandinavian staging, oak coffee table, neutral palette, no structural edits"
RE-9001,bedroom,prop-1107,"https://cdn.example.com/listings/1107/bedroom-empty.jpg",daylight_stage,"Matching Scandinavian style, linen bedding, wall art, no structural edits"
RE-9001,kitchen,prop-1107,"https://cdn.example.com/listings/1107/kitchen-empty.jpg",daylight_stage,"Matching Scandinavian style, clean counters, warm practical lighting, no structural edits"
SKU-2202,product,sku-2202,"https://cdn.example.com/products/2202/front.png",front_hero,"Front hero angle on white seamless, soft shadow, premium catalog look"
SKU-2202,product,sku-2202,"https://cdn.example.com/products/2202/front.png",left_45,"Left 45-degree angle, same lens feel, same lighting direction as front_hero"
SKU-2202,product,sku-2202,"https://cdn.example.com/products/2202/front.png",detail_zipper,"Close detail crop of zipper and texture, keep material and color exact"
2. Run the consistency batch script

The script calls POST /v1/images/generations with image_url as the reference and requests one output per shot, so each manifest line maps to exactly one reviewable image.

python
import asyncio
import csv
import json
import os
from dataclasses import dataclass

import httpx

API_KEY = os.environ["CREATIVEAI_API_KEY"]
BASE_URL = "https://api.creativeai.run/v1/images/generations"
INPUT_CSV = "consistency-jobs.csv"
OUTPUT_JSONL = "consistency-outputs.jsonl"

MAX_CONCURRENCY = 6
MAX_RETRIES = 2


@dataclass
class JobRow:
    job_id: str
    segment: str
    anchor_id: str
    source_image_url: str
    shot_name: str
    prompt_append: str


ANCHOR_PROMPTS = {
    "real_estate": (
        "Photorealistic real estate listing image. Preserve original room geometry, wall and floor materials, "
        "window positions, and camera perspective from reference image. Keep listing-safe composition."
    ),
    "product": (
        "Commercial product photography. Keep exact product silhouette, proportions, label/logo placement, "
        "material texture, and brand colors from reference image."
    ),
}


def choose_anchor_prompt(segment: str) -> str:
    if segment == "product":
        return ANCHOR_PROMPTS["product"]
    return ANCHOR_PROMPTS["real_estate"]


def build_prompt(row: JobRow) -> str:
    return f"{choose_anchor_prompt(row.segment)} Shot: {row.shot_name}. {row.prompt_append}."


async def submit_row(client: httpx.AsyncClient, sem: asyncio.Semaphore, row: JobRow) -> dict:
    payload = {
        "model": "gpt-image-1",
        "prompt": build_prompt(row),
        "image_url": row.source_image_url,  # CreativeAI extension for reference-anchored generation
        "size": "1536x1024" if row.segment != "product" else "1024x1024",
        "quality": "high",
        "n": 1,  # one output per shot keeps review one-to-one with the manifest
    }

    async with sem:
        for attempt in range(MAX_RETRIES + 1):
            try:
                resp = await client.post(BASE_URL, json=payload)
                if (resp.status_code == 429 or resp.status_code >= 500) and attempt < MAX_RETRIES:
                    await asyncio.sleep(1.5 * (attempt + 1))
                    continue
                resp.raise_for_status()
                data = resp.json()
                first = (data.get("data") or [{}])[0]
                return {
                    "job_id": row.job_id,
                    "anchor_id": row.anchor_id,
                    "segment": row.segment,
                    "shot_name": row.shot_name,
                    "status": "completed",
                    "image_url": first.get("url"),
                    "request_id": data.get("id"),
                    "model_actual": data.get("model_actual"),
                }
            except Exception as exc:
                if attempt == MAX_RETRIES:
                    return {
                        "job_id": row.job_id,
                        "anchor_id": row.anchor_id,
                        "segment": row.segment,
                        "shot_name": row.shot_name,
                        "status": "failed",
                        "error": str(exc),
                    }
                await asyncio.sleep(1.5 * (attempt + 1))


def load_jobs(path: str) -> list[JobRow]:
    out: list[JobRow] = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        for raw in reader:
            out.append(
                JobRow(
                    job_id=raw["job_id"].strip(),
                    segment=raw["segment"].strip(),
                    anchor_id=raw["anchor_id"].strip(),
                    source_image_url=raw["source_image_url"].strip(),
                    shot_name=raw["shot_name"].strip(),
                    prompt_append=raw["prompt_append"].strip(),
                )
            )
    return out


async def main() -> None:
    jobs = load_jobs(INPUT_CSV)
    sem = asyncio.Semaphore(MAX_CONCURRENCY)
    headers = {"Authorization": f"Bearer {API_KEY}"}

    async with httpx.AsyncClient(headers=headers, timeout=120.0) as client:
        results = await asyncio.gather(*[submit_row(client, sem, row) for row in jobs])

    with open(OUTPUT_JSONL, "w", encoding="utf-8") as f:
        for item in results:
            f.write(json.dumps(item) + "\n")

    grouped: dict[str, int] = {}
    for item in results:
        if item["status"] != "completed":
            continue
        grouped[item["anchor_id"]] = grouped.get(item["anchor_id"], 0) + 1

    print("Completed shot counts by anchor_id:")
    for anchor_id, count in grouped.items():
        print(f"  {anchor_id}: {count}")


if __name__ == "__main__":
    asyncio.run(main())
3. Write a manifest and route exceptions

Keep each line keyed by anchor_id + shot_name so QA can approve or retry with full traceability.

jsonl
{"job_id":"RE-9001","anchor_id":"prop-1107","segment":"living_room","shot_name":"daylight_stage","status":"completed","image_url":"https://.../living-room.png","request_id":"gen_re_01","model_actual":"openai/gpt-image-1"}
{"job_id":"RE-9001","anchor_id":"prop-1107","segment":"bedroom","shot_name":"daylight_stage","status":"completed","image_url":"https://.../bedroom.png","request_id":"gen_re_02","model_actual":"openai/gpt-image-1"}
{"job_id":"SKU-2202","anchor_id":"sku-2202","segment":"product","shot_name":"left_45","status":"completed","image_url":"https://.../left45.png","request_id":"gen_sku_02","model_actual":"openai/gpt-image-1"}
{"job_id":"SKU-2202","anchor_id":"sku-2202","segment":"product","shot_name":"detail_zipper","status":"failed","error":"HTTP 500: {\"error\":{\"code\":\"server_error\"}}"}
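Because each manifest line carries anchor_id and shot_name, the retry set can be recovered straight from the JSONL. A minimal sketch, assuming the exact field names written by the script above:

```python
import json


def failed_shots(jsonl_text: str) -> list[tuple[str, str]]:
    """Return (anchor_id, shot_name) keys for every non-completed line."""
    keys = []
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue
        rec = json.loads(line)
        if rec.get("status") != "completed":
            keys.append((rec["anchor_id"], rec["shot_name"]))
    return keys


manifest = "\n".join([
    '{"job_id":"RE-9001","anchor_id":"prop-1107","shot_name":"daylight_stage","status":"completed"}',
    '{"job_id":"SKU-2202","anchor_id":"sku-2202","shot_name":"detail_zipper","status":"failed","error":"HTTP 500"}',
])
print(failed_shots(manifest))  # [('sku-2202', 'detail_zipper')]
```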

Consistency Guardrails

1. Keep a stable anchor prompt block per segment and only vary shot-specific suffix text.
2. Use one canonical reference image per listing or SKU whenever possible.
3. Persist request_id per shot for audit and replay.
4. Run human QA before publishing; no model guarantees pixel-perfect identity across all generations.
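Guardrail 1 can be enforced mechanically: checksum the anchor prompt blocks and fail fast when someone edits them without updating the pinned value. A sketch under illustrative prompt text; in practice PINNED would be a literal committed to your config repo:

```python
import hashlib

# Abbreviated stand-ins for the full anchor prompt blocks.
ANCHOR_PROMPTS = {
    "real_estate": "Photorealistic real estate listing image. ...",
    "product": "Commercial product photography. ...",
}


def anchor_fingerprint(prompts: dict[str, str]) -> str:
    """Stable short hash over segment names and prompt text, order-independent."""
    blob = "\n".join(f"{k}\t{v}" for k, v in sorted(prompts.items()))
    return hashlib.sha256(blob.encode("utf-8")).hexdigest()[:12]


PINNED = anchor_fingerprint(ANCHOR_PROMPTS)  # commit this literal to the config repo


def assert_unchanged(prompts: dict[str, str]) -> None:
    fp = anchor_fingerprint(prompts)
    if fp != PINNED:
        raise RuntimeError(f"anchor prompts drifted: {fp} != {PINNED}")


assert_unchanged(ANCHOR_PROMPTS)  # passes while prompts stay locked
```

Wire `assert_unchanged` into CI so an unreviewed prompt edit breaks the build rather than the next batch.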

Production Checklist

1. Pilot with 3-5 properties and 20-30 SKUs before wider rollout.
2. Lock anchor prompts and shot naming conventions in your config repo.
3. Route failed records to a retry queue instead of rerunning the full batch.
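In practice the retry queue can be just a filtered copy of the input CSV: join the non-completed (anchor_id, shot_name) keys from the manifest back to the original rows and resubmit only those. A minimal sketch under the same schema and manifest field names as above:

```python
import csv
import io
import json


def build_retry_rows(csv_text: str, jsonl_text: str) -> list[dict]:
    """Keep only input rows whose manifest line is not 'completed'."""
    failed = set()
    for line in jsonl_text.splitlines():
        if line.strip():
            rec = json.loads(line)
            if rec.get("status") != "completed":
                failed.add((rec["anchor_id"], rec["shot_name"]))
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r for r in rows if (r["anchor_id"], r["shot_name"]) in failed]


# Hypothetical two-shot job where only detail_zipper failed.
csv_text = """job_id,segment,anchor_id,source_image_url,shot_name,prompt_append
SKU-2202,product,sku-2202,u1,front_hero,p1
SKU-2202,product,sku-2202,u2,detail_zipper,p2
"""
manifest = "\n".join([
    '{"anchor_id":"sku-2202","shot_name":"front_hero","status":"completed"}',
    '{"anchor_id":"sku-2202","shot_name":"detail_zipper","status":"failed"}',
])
retry = build_retry_rows(csv_text, manifest)
print([r["shot_name"] for r in retry])  # ['detail_zipper']
```

Write the surviving rows back out with `csv.DictWriter` and feed that file to the same batch script unchanged.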