When Should You Use ACES?

Introduction
Color space is as baffling as quantum physics, and honestly, I don’t fully understand it either. While I respect the efforts behind ACES (Academy Color Encoding System) to improve computer graphics, it can also add confusion instead of clarity.
My experience with ACES in studios? Endless trial and error with Nuke nodes until something “just works.” No one can really explain why. The best explanation is usually, “All the other settings didn’t work.”
Why Does Color Space Exist?

Render engines work in linear space because light in physics behaves linearly:
A 1,000-lumen light source simply emits twice as much light as a 500-lumen one.
But human perception isn’t linear. That 1,000-lumen source doesn’t look twice as bright as the 500-lumen one; our eyes adjust based on the overall brightness level.
Our eyes are more sensitive in the dark and less responsive to brightness changes in well-lit conditions.
For example, lighting a candle in a pitch-black room illuminates everything, creating a dramatic difference. In full daylight, however, adding the same candle’s brightness goes unnoticed. This falling-off sensitivity helps us avoid perceiving the sun’s constant fluctuations, preventing a lifelong epileptic flickering nightmare.
So how do you convert a linear 3D render into an image that mimics how the human eye would perceive that environment in real life? The answer: colorspace.
The 2.2 gamma
There is something between the image and your eye: your screen.
It’s your screen that applies this 2.2 gamma. A gamma of 2.2 ended up being adopted as the standard for all screens through a mixture of hardware constraints, software conventions, and arbitrary choices.
That was back when we had CRT screens and the internet was just becoming a thing.
We now have much more capable displays (DCI-P3, Rec. 2020, HDR, etc.) and better tools to handle complex colors.
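To make the trade-off concrete, here is a minimal sketch of a 2.2 gamma encode/decode round trip (plain power functions only; the real sRGB standard uses a slightly different piecewise curve):

```python
import numpy as np

def encode_gamma22(linear):
    """Compress linear light into display values, approximating a 2.2-gamma screen."""
    return np.clip(linear, 0.0, 1.0) ** (1.0 / 2.2)

def decode_gamma22(encoded):
    """Inverse transform: what the screen effectively does when emitting light."""
    return np.clip(encoded, 0.0, 1.0) ** 2.2

# 18% linear reflectance (photographic mid-grey) encodes to ~0.46,
# which is roughly where our eyes place "half-bright".
mid_grey = encode_gamma22(0.18)
```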

Digital Cameras and Color Space

All cameras capture light differently. Each one’s raw file has different specs, so footage must be properly converted using the camera’s proprietary LUTs. (A LUT, or look-up table, is a basic math table for transforming images from one color space to another.)
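To show what such a table actually does, here is a toy 1D LUT applied with linear interpolation; the curve values are invented for illustration, not taken from any real camera:

```python
import numpy as np

# A 1D LUT is just a sampled transfer curve: input value in, corrected value out.
# This 5-entry LUT lifts shadows the way a camera tone curve might
# (made-up numbers, purely illustrative).
lut_inputs  = np.array([0.00, 0.25, 0.50, 0.75, 1.00])
lut_outputs = np.array([0.00, 0.35, 0.62, 0.83, 1.00])

def apply_lut(pixels):
    """Look up each pixel value in the table, interpolating between entries."""
    return np.interp(pixels, lut_inputs, lut_outputs)

footage = np.array([0.1, 0.5, 0.9])  # some raw pixel values
graded = apply_lut(footage)
```

Real camera LUTs work the same way, just with far more entries (and often in 3D, one axis per channel).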
With countless cameras now on the market, tracking formats is an exponential nightmare.
Many unpopular cameras have had their support discontinued or poorly documented.
Dig up random footage from a shoot a few years ago, and realize you can’t find which camera it was shot on?
This is why remastering old films is challenging and not an exact science: the original output intent has been lost.
And this is exactly the problem ACES was invented to solve.
What Is ACES?
Its first purpose was archiving and exchange; its original name was actually “Image Interchange Framework” (IIF).

Roundhay Garden Scene is the oldest surviving film known to us
No data will ever be lost: even in a century, an editor can pick up raw footage and be sure they get the right output from it, regardless of which camera it was shot with.
And to make it future-proof, you need to account for the fact that our displays will keep improving. So they created an ACES gamut that encompasses all colors visible to the human eye.
That means ACES already supports screens that are yet to be invented. Cool, huh?
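You can even sanity-check how much bigger that gamut is. Using the published CIE xy primaries and the shoelace formula, the ACES2065-1 (AP0) triangle covers roughly 3.5 times the chromaticity area of sRGB:

```python
import numpy as np

# CIE xy chromaticity primaries (standard published values).
# AP0 was chosen so its triangle covers every visible chromaticity;
# note its blue primary even sits slightly outside the visible locus (negative y).
ap0  = np.array([[0.7347, 0.2653], [0.0000, 1.0000], [0.0001, -0.0770]])
srgb = np.array([[0.6400, 0.3300], [0.3000, 0.6000], [0.1500, 0.0600]])

def triangle_area(p):
    """Shoelace formula: area of a gamut triangle in xy chromaticity space."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))

ratio = triangle_area(ap0) / triangle_area(srgb)  # roughly 3.5x the sRGB area
```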
Color spectrum visible to the naked eye:

The Problem
As ACES grew, so did demand for tools to create and edit directly in its color space. However, ACES wasn’t designed for content creation and editing at all. This led to various versions:
ACES2065-1: The backbone, used for exchange and archiving.
ACEScg: For 3D rendering (rendering directly in ACES2065-1 creates artifacts at render time, introduces unsupported colors such as negative color values, and messes up energy conservation).
ACEScc/ACEScct: For grading, mimicking log workflows.
ACESproxy: For on-set previews (not designed to be saved/stored).
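As a concrete example of hopping between these spaces, here is the commonly published linear-sRGB to ACEScg matrix (D65 to D60 chromatically adapted), as distributed with OpenColorIO configs; treat the exact digits as reference values, not something derived here:

```python
import numpy as np

# Reference linear-sRGB -> ACEScg (AP1) matrix, Bradford-adapted D65 -> D60,
# as commonly shipped in OpenColorIO ACES configs.
SRGB_TO_ACESCG = np.array([
    [0.613097, 0.339523, 0.047379],
    [0.070194, 0.916354, 0.013452],
    [0.020616, 0.109570, 0.869815],
])

def srgb_linear_to_acescg(rgb):
    """Convert one linear-sRGB triplet into the ACEScg working space."""
    return SRGB_TO_ACESCG @ np.asarray(rgb, dtype=float)

# Each row sums to ~1.0, so white stays white: (1,1,1) maps to (1,1,1).
white = srgb_linear_to_acescg([1.0, 1.0, 1.0])
```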
Having many ACES variants directly contradicts the idea of a universal, unambiguous colorspace. That’s the first backfire.
Bear in mind, you do NOT necessarily need all of these. Cherry-pick only what you need. More on this in the conclusion. But if you want to be confused, here is what a hardcore full-ACES workflow looks like:

Practical Challenges
Want to bang your head against the wall yet? Take a ticket, there’s a queue.

Every arrow is room for a mistake, and unless you know what you are doing (which you don’t), you will end up randomly toggling settings until you magically see a correct output.
What I have found most annoying is that during production you often have to share frames/previews with producers, directors, or clients who have no access to your CG software.
So at any point in the workflow you may need to export a good old sRGB frame of your work, which adds confusion. Now you need to support two output formats, ACES and sRGB, and sometimes swap between the two many times a day. Easy to fuck things up here.
Many times I have seen playables on Ftrack/ShotGrid published with the wrong colorspace.
Also, depending on the software, some color pickers will simply sample the wrong values.
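The sRGB preview export itself is a small transform, which makes it all the easier to apply twice or not at all. A minimal sketch, assuming your frame is already in linear sRGB primaries (converting from ACEScg would need an extra matrix step first):

```python
import numpy as np

def linear_to_srgb(x):
    """Official piecewise sRGB encoding (IEC 61966-2-1), not a plain 2.2 gamma."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x <= 0.0031308,
                    12.92 * x,
                    1.055 * x ** (1.0 / 2.4) - 0.055)

def to_preview_8bit(linear_frame):
    """Encode a linear frame for a client preview and quantize to 8 bits."""
    return np.round(linear_to_srgb(linear_frame) * 255.0).astype(np.uint8)

frame = np.array([[0.0, 0.18, 1.0]])  # linear pixel values
preview = to_preview_8bit(frame)
```

Apply this once on delivery, and never bake it into files that go back into the ACES chain.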
When to Use ACES
It’s not an all-or-nothing ACES situation. I have seen companies that were asked to deliver their final files in ACES and, from there, got dragged into converting their whole pipeline to ACES. Before you know it, no one can remember why they did it in the first place, but now they have to support a legacy pipeline no one can understand or fix.
Dealing with Multiple Cameras
Use ACES for ingesting footage and editing/grading (and potentially compositing), but you don’t need an end-to-end ACES pipeline.
Collaborating with Vendors
Convert incoming ACES files to linear sRGB EXRs for your workflow and export back to ACES right before delivery.
Or cherry-pick which parts of the workflow to keep in ACES. For instance, if you received ACES plates and you are commissioned to do the 3D on top, it can make sense to save your 3D renders in ACEScg and composite in ACES2065-1 when you merge everything together.
Full-CG Projects with more than 1000 shots
At this scale and budget, it starts to make sense to adopt a full ACES workflow. Your pipeline team is strong enough to absorb the cost of deployment and support.
Never
If your final work will never be remastered (who worked on NFTs?).
If you hardly ever share files with other vendors.
If you hardly ever deal with camera footage.
Working for feature film
Your film may get remastered in the future, so save/deliver your final output in ACES.
Is it worth remastering Fast & Furious 32, though?
Future-Proof Vision
You like the ACES vision and want to contribute to making it happen.
Conclusion
You still don’t understand colorspaces. Neither do I, fully. So cherry-pick: use ACES where it solves a real problem for you (ingesting mixed camera footage, exchanging with vendors, archiving deliverables) and keep the rest of your pipeline simple.