
I hope this question is acceptable here. I'm impressed by the results of Filmic in Blender, and I would like to do something similar in my own code, which is for processing photographs and doesn't use Blender. (My code is likely to be released as an open source project eventually.)

Because of this, I'm wondering if it's possible to learn the details of what Filmic does, in a form that would make it possible to re-implement it. Presumably that's possible by reading the source code, but that's a huge amount of work, and if the technical specifications are written up anywhere it would be very helpful to know them.

Alternatively (this might be wishful thinking), is it possible somehow to access Filmic's colour transforms from within a Python script, without working inside Blender? Then I could just use it directly instead of reimplementing it. Essentially what I want to do is take photographs, do the inverse of what Filmic does (i.e. convert them from pixel values to intensities of light), then process them in the light intensity space, then do what Filmic does to render them back as photographs with a realistic film response.

  • What about starting here: github.com/sobotka/filmic-blender?
    – Bruno
    Commented Jan 26, 2020 at 11:37
  • @Bruno it has usage instructions, but not a clear description of the processing pipeline, which is what I'd need in order to reimplement it. I'm not familiar with a lot of the terminology used in that description, so I'd be looking for something with more of a step-by-step explanation of how it operates.
    – N. Virgo
    Commented Jan 26, 2020 at 11:39
  • @Bruno I'm also particularly interested in the way Filmic reduces saturation at high intensities, which seems not to be covered on that page at all.
    – N. Virgo
    Commented Jan 26, 2020 at 11:41
  • Hello :). Why not just ask Troy Sobotka directly? He's pretty responsive on Twitter @troy_s
    Commented Jan 26, 2020 at 12:54
  • Related: Render with a wider dynamic range in cycles to produce photorealistic looking images
    – p2or
    Commented Jan 26, 2020 at 14:15

1 Answer


Filmic is a package of view transforms created to convert scene-referred image data into display-referred values, using a large dynamic range and desaturating the highlights so that the result resembles images created with a film camera.

Refer to: Render with a wider dynamic range in cycles to produce photorealistic looking images

Blender's color management is implemented through OpenColorIO (https://opencolorio.org/). That might be the place to start a journey down the rabbit hole of color.

The file with the rules used for color management can be found in /blender/(blender version)/datafiles/colormanagement/config.ocio.

The definitions and stanzas in that config.ocio file are written to be interpreted by OCIO; they determine how data is transformed and whether LUTs are used.
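If you want to explore that configuration from a Python script rather than from inside Blender, the PyOpenColorIO bindings can load it directly. A minimal sketch, assuming the PyOpenColorIO package is installed and with a placeholder path that you would point at your own Blender installation (exact method names vary slightly between OCIO 1.x and 2.x):

```python
# Sketch: load Blender's OCIO config and list what it defines.
import PyOpenColorIO as OCIO

# Placeholder path - point this at your Blender installation's config.ocio.
config = OCIO.Config.CreateFromFile(
    "/path/to/blender/<version>/datafiles/colormanagement/config.ocio")

# Displays and their views (this is where "Filmic" shows up in Blender's UI).
for display in config.getDisplays():
    for view in config.getViews(display):
        print(display, "->", view)

# The individual color spaces defined by the config (Linear, Filmic Log, ...).
for cs in config.getColorSpaces():
    print(cs.getName())
```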

LUTs used for transformations are contained in the luts and views sub-folders within the color management folder.

See also: How to make 3D LUTs and use them in Blender?

To use Filmic outside Blender, you would need an application that is OCIO-aware (Nuke, Mari, Affinity Photo, Krita and others already use it, so there might be no need to re-invent the wheel).
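Your own Python code can be made OCIO-aware the same way, through the PyOpenColorIO bindings. The following is only a sketch, assuming OCIO 2.x's Python API and the display/view names used by Blender's config ("sRGB" / "Filmic"); the Filmic view goes through a 3D LUT, so the inverse direction is only an approximate way back to scene-linear values:

```python
# Sketch: apply Blender's Filmic view to scene-linear pixels, and invert it.
# Assumes OCIO 2.x Python bindings; names are taken from Blender's config.ocio.
import numpy as np
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("/path/to/config.ocio")  # placeholder path

# Scene-linear -> display-referred (what Blender does when showing a render).
forward = config.getProcessor(
    OCIO.DisplayViewTransform(src="Linear", display="sRGB", view="Filmic")
).getDefaultCPUProcessor()

# Display-referred -> scene-linear (LUT-based, so only approximately invertible).
inverse = config.getProcessor(
    OCIO.DisplayViewTransform(
        src="Linear", display="sRGB", view="Filmic",
        direction=OCIO.TransformDirection.TRANSFORM_DIR_INVERSE)
).getDefaultCPUProcessor()

# Stand-in for a scene-linear image; applyRGB works in place on a packed
# float32 RGB buffer.
pixels = np.random.rand(4, 4, 3).astype(np.float32)
display_pixels = pixels.copy()
forward.applyRGB(display_pixels)

recovered = display_pixels.copy()
inverse.applyRGB(recovered)   # roughly back to the original scene-linear values
```

Pointing the OCIO environment variable at Blender's config.ocio is another common way to make OCIO-aware applications pick up the same transforms.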

Image data must first be converted to Rec.709/sRGB primaries, transformed to linear values, and then scaled so it behaves like real scene-referred data (or at least falls within the range Filmic expects).
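For an ordinary 8-bit sRGB photograph, that preparation is roughly the inverse sRGB transfer function followed by an exposure scale. A rough NumPy sketch (the exposure factor is a made-up placeholder that you would choose per image):

```python
# Sketch: turn an 8-bit sRGB photo into rough scene-linear values.
import numpy as np

def srgb_to_linear(srgb: np.ndarray) -> np.ndarray:
    """Inverse of the sRGB transfer function (input in 0..1)."""
    return np.where(srgb <= 0.04045,
                    srgb / 12.92,
                    ((srgb + 0.055) / 1.055) ** 2.4)

img8 = np.zeros((4, 4, 3), dtype=np.uint8)   # stand-in for a loaded photograph
linear = srgb_to_linear(img8.astype(np.float32) / 255.0)

# Scale into a scene-referred range; the factor here is arbitrary and in
# practice depends on how you want to place middle grey / exposure.
exposure = 4.0                               # placeholder value
scene_linear = linear * exposure
```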

If the image was captured with a camera, the desaturation and conversion to display-referred values may have already taken place, so there might be nothing to gain from using Filmic, even when shooting raw. Filmic was intended for CG-rendered imagery: it was meant to overcome the limitations of the plain sRGB transform and to compensate for the lack of cross-talk between color channels during rendering. Camera sensors and filters create images in a completely different way. The color filters over the sensor allow some overlap between different frequencies, which already makes the saturation in the bright parts of the image more "film-like".

Other kinds of transformations might be more useful than Filmic for in-camera generated images.

  • Thank you, I will go through all of this in detail the next time I have time to work on this project. What I'm doing is sort of weird - I'm using photographs but it's also a kind of rendering. I take many photographs, tint them and then combine them to make these images: deviantart.com/nathanielvirgo/gallery/61269036/time-is-colour - my aim is to make the effect as realistic as I can.
    – N. Virgo
    Commented Jan 27, 2020 at 20:20
  • Cegaton? Come back! :D
    – p2or
    Commented Jan 27, 2020 at 20:25
