Colorspace Field Guide

March 31, 2026

Luma’s HDR model is designed to integrate directly into professional VFX and film pipelines.

That means one thing above all:

clear, production-ready color specifications.

This article is split into two parts:

  1. Exact technical specs (what you need to plug into your pipeline)
  2. Context (why those specs matter and how to think about them)

Ray3.14 HDR Tech Specs (TL;DR)

If you only need the implementation details, here they are:

Model: Ray3.14 HDR

Primary output (video):

  • Format: OpenEXR (.exr)
  • Bit depth: 16-bit
  • Compression: DWAB (level ~45, perceptually lossless)
  • Color space: ACES 2065-1 (AP0 primaries)
  • Encoding: scene-referred linear

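A note on "16-bit" here: EXR's 16-bit mode stores IEEE half floats, not 16-bit integers, which is why scene-linear values far above 1.0 survive in the file. A minimal numpy sketch (numpy's float16 shares the IEEE half-float layout EXR uses):

```python
import numpy as np

# Scene-linear HDR values: mid-gray, diffuse white, and a highlight at 100x white.
scene_linear = np.array([0.18, 1.0, 100.0], dtype=np.float32)

# Half float keeps values above 1.0 instead of clipping them,
# unlike a normalized 16-bit integer encoding.
as_half = scene_linear.astype(np.float16)

print(as_half)                    # the 100x highlight survives the round trip
print(np.finfo(np.float16).max)  # half-float ceiling is roughly 65504
```

The practical consequence: multiple stops of highlight headroom ride along in the file for free.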
Key properties:

  • Wide gamut (AP0 — larger than ACEScg)
  • Full dynamic range preserved (no tone mapping applied)
  • Suitable for ACES-based pipelines

Conversions:

  • Directly convertible to ACEScg, Rec.709, P3, BT.2020, Log formats (LogC, etc.)
  • ACEScg is a subset of AP0 → conversion is straightforward and stable
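Because ACEScg (AP1) sits inside AP0, the conversion is a single 3x3 matrix in linear light. A sketch using the AP0-to-AP1 matrix from the ACES specification:

```python
import numpy as np

# AP0 (ACES 2065-1) -> AP1 (ACEScg) primaries matrix, per the ACES spec.
AP0_TO_AP1 = np.array([
    [ 1.4514393161, -0.2365107469, -0.2149285693],
    [-0.0765537734,  1.1762296998, -0.0996759264],
    [ 0.0083161484, -0.0060324498,  0.9977163014],
])

def ap0_to_acescg(rgb):
    """Convert a scene-linear ACES 2065-1 pixel to ACEScg (still scene-linear)."""
    return AP0_TO_AP1 @ np.asarray(rgb, dtype=np.float64)

# Neutral values stay neutral: each matrix row sums to 1.0.
print(ap0_to_acescg([1.0, 1.0, 1.0]))
```

In practice you would let OpenColorIO or your DCC's built-in ACES config perform this, but the transform itself really is this small, which is why it is stable.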

What this means for your pipeline

  • Treat output as scene-linear ACES 2065-1
  • Convert to ACEScg for rendering/compositing if needed
  • Apply ODT / tone mapping at display stage only
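The third point is worth sketching. A real ACES Output Transform is far more involved than this; the following is a deliberately simplified stand-in (a Reinhard tone curve plus an sRGB encode) meant only to show where tone mapping belongs in the chain, not how ACES implements it:

```python
import numpy as np

def srgb_encode(x):
    """Linear -> sRGB transfer function (display encoding)."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * x ** (1 / 2.4) - 0.055)

def simple_display_transform(scene_linear):
    """Stand-in for an ODT: tone-map, then display-encode.
    A production pipeline would use the actual ACES Output Transform here."""
    tone_mapped = scene_linear / (1.0 + scene_linear)  # compress highlights
    return srgb_encode(tone_mapped)

# Everything upstream stays scene-linear; only this last step is display-referred.
pixel = np.array([0.18, 4.0, 50.0])
print(simple_display_transform(pixel))
```

The design point is the ordering: compositing and grading happen on the scene-linear data, and this lossy step runs once, at the very end, per display target.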

This behaves like high-end camera or rendered CG data, not display-referred media.

The idea

Most AI-generated images and videos are designed to look good on a screen.

Luma (via Ray3.14 HDR) is designed to hold up in a pipeline.

In VFX and filmmaking, you’re not just creating images; you’re creating assets that need to survive compositing, grading, relighting, and delivery across formats.

That comes down to one thing:

how much color and light information you actually have to work with.

The real limitation of most AI outputs

Most generative models today output:

  • 8-bit images or compressed video
  • Rec.709 / sRGB color space
  • Display-referred imagery

In practice, that means:

  • Highlights are clipped or baked in
  • Colors are already compressed
  • Very limited flexibility in post

They’re optimized for final pixels, not intermediate assets.

What Ray3.14 HDR does differently

Ray3.14 HDR generates outputs that behave like camera data, not just images.

Instead of baking everything into a display format, it generates in a scene-referred, wide-gamut space used in professional cinema workflows.

You’re not getting a finished image. You’re getting a source asset with headroom.

Why ACES AP0 matters

ACES AP0 is one of the largest color spaces used in production.

  • Larger than ACEScg
  • Larger than Rec.2020
  • Extends beyond the visible spectrum (as a mathematical container)

Why that matters:

  • No premature clipping
  • Stable downstream transforms
  • Maximum flexibility for grading and compositing

You can always map down to a smaller space.

You cannot recover information that was never captured.
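You can see the one-way nature of mapping down with a saturated color. The matrix below is the approximate scene-linear AP0-to-Rec.709 transform (values rounded here for illustration); a color AP0 represents comfortably lands outside Rec.709, and the clamp that delivery forces is irreversible:

```python
import numpy as np

# Approximate scene-linear ACES 2065-1 (AP0) -> linear Rec.709 matrix
# (rounded values, for illustration only).
AP0_TO_REC709 = np.array([
    [ 2.52169, -1.13413, -0.38756],
    [-0.27648,  1.37272, -0.09624],
    [-0.01538, -0.15298,  1.16838],
])

# A fully saturated green that AP0 can hold...
ap0_pixel = np.array([0.0, 1.0, 0.0])

rec709 = AP0_TO_REC709 @ ap0_pixel
print(rec709)   # negative R and B components: outside the Rec.709 gamut

# Delivering in Rec.709 forces a clamp, and the clamp destroys information.
clamped = np.clip(rec709, 0.0, 1.0)
print(clamped)
```

Nothing downstream of that clamp can tell the clamped pixel apart from a color that was genuinely on the gamut boundary, which is exactly the information loss a wide-gamut source avoids.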

Dynamic range

Because output is scene-referred linear:

  • Highlights are not baked
  • Shadows retain detail
  • Exposure adjustments remain physically consistent

This allows you to treat outputs like footage, not flattened renders.
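A small numpy sketch of why this matters. In scene-linear data, one stop of exposure is a factor of two; pull a highlight gradient down a few stops and its structure is still there, while the same gradient clipped to a display range collapses:

```python
import numpy as np

# A highlight gradient in scene-linear: values above 1.0 are real data.
scene_linear = np.array([0.5, 2.0, 8.0, 32.0])

# Display-referred: the same pixels after clipping to the 0-1 range.
display_referred = np.clip(scene_linear, 0.0, 1.0)

def expose(img, stops):
    """Exposure in linear light: each stop is a factor of two."""
    return img * 2.0 ** stops

# Pull exposure down 3 stops to look into the highlights.
print(expose(scene_linear, -3))      # the gradient's structure is preserved
print(expose(display_referred, -3))  # clipped pixels collapse to one value
```

This is the "treat it like footage" property in miniature: the underexposed version of the scene-linear data is still a faithful picture of the scene.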

Scene-referred vs display-referred

Ray3.14 HDR output:

  • Scene-referred
  • Linear
  • Not tone-mapped

Typical AI output:

  • Display-referred
  • Tone-mapped
  • Gamut-compressed

This is the difference between:

“final video”

vs

“production asset”

Color spaces

Two concepts matter:

Color gamut = range of colors
Dynamic range = range of brightness

Key takeaway:

  • Rec.709 / sRGB → small container
  • ACES AP0 → extremely large container

Ray3.14 HDR operates at the largest end of that range.

Delivery vs source

Like any modern pipeline:

  • EXR = source (full fidelity)
  • Derived formats = previews / delivery

Depending on pipeline configuration, previews may be:

  • HDR (e.g. Rec.2020 PQ/HLG)
  • or SDR tone-mapped

The important part:

The original EXR remains scene-referred and untouched.

Why this matters

This enables:

  • Direct ACES pipeline integration
  • Reliable compositing in Nuke / Houdini
  • High-end grading in Resolve / Baselight
  • Consistent relighting and look development
  • HDR delivery workflows

You are not adapting AI outputs into production.

You are generating assets that already conform to it.

A helpful reminder

If you’re integrating into a pipeline, the only questions that matter are:

  • What color space?
  • What bit depth?
  • Scene-referred or display-referred?

Ray3.14 HDR answers:

  • ACES 2065-1 (AP0)
  • 16-bit EXR
  • Scene-referred linear

Key takeaway

By outputting 16-bit EXR in ACES AP0 (scene-referred), Luma's Ray3.14 HDR model gives you the same flexibility as high-end camera pipelines, so you can shape color and light after generation, not be limited by it.