
A good example would be a giant screen in a football stadium. The last rendered frame would be used as texture on the screen for the current frame.

This texture would be emitting light (as it is a screen), so I really need it as a texture input for an emission shader in cycles.

It differs a bit from Is it possible to use the output of a Renderlayer in the material nodes of another RenderLayer?, as the solution given there leaves me with a pink screen for the first ~20 frames.

Is it possible in Blender and if yes, how?

  • Using the answer you pointed me to, I get a pink texture on the first render. That pink texture then also appears in the second render (smaller, since it is part of the first), and so on until it is less than a pixel wide. I tried starting at frame 2, but it doesn't change anything. My problem is similar, but not exactly the same: I need the render layer to actually be used as a texture during rendering (the screen emits light, so I can't do it with compositing alone).
    – matali
    Commented Jun 21, 2014 at 9:38
  • The other post uses it in material nodes too (you could easily connect it to an emission shader).
    – gandalf3
    Commented Jun 21, 2014 at 18:27
  • This is what I've done, but I get that pink texture problem on the first frame, which is reproduced (a bit smaller) on every following frame, because the first frame is not rendered.
    – matali
    Commented Jun 22, 2014 at 6:40
  • The only solution I can think of is just rendering the first frame until the pink is no longer visible (though using an initial color that stands out less helps).
    – gandalf3
    Commented Jun 22, 2014 at 7:11

1 Answer


The answer is basically the same as in Is it possible to use the output of a Renderlayer in the material nodes of another RenderLayer?, except that you must render the first frame repeatedly until the replicated images of the render are smaller than can be seen (how many passes that takes depends on the resolution).
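To get a rough estimate of how many passes that takes: if the screen covers a constant linear fraction of the frame, each pass shrinks the innermost copy of the image by that fraction, so the count follows from a logarithm. This is a minimal sketch of that estimate; the function name and the constant-fraction simplification are mine, not from the original answer.

```python
import math

def iterations_needed(frame_width_px, screen_fraction):
    """Passes until the innermost recursive copy is under one pixel wide.

    Assumes the screen occupies a constant linear fraction of the frame
    at every recursion level (a simplification).
    """
    return math.ceil(math.log(frame_width_px) / math.log(1 / screen_fraction))

# e.g. a 1920 px wide render whose screen spans 20% of the frame width:
print(iterations_needed(1920, 0.2))  # -> 5
```

In practice you would round up and add a pass or two of margin, since the visible threshold also depends on how sharply the screen texture is sampled.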

I've thought about this, and as far as I can tell there is no real way around it with compositing alone.

The reason is that this is a feedback loop: the rendered result depends on the lighting, which is affected by the texture showing the rendered result, which in turn affects the lighting, and so on.

The only thing you can do is try to make the initial render a reasonably close approximation of how it will look after enough iterations to appear infinite (i.e. an initial bright pink emission doesn't help). That way it looks "good enough" in fewer iterations.

Using the setup in my other answer, rendering a single frame writes a file via the File Output node, which is then read by an image-sequence node in the material nodes the next time the scene is rendered. So all you have to do is press F12 repeatedly.

To automate pressing F12, you can render an "animation" in which nothing changes, with as many frames as you want iterations.
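The repeated renders can also be scripted from Blender's Text Editor (or with `blender -b file.blend -P script.py`). This is a minimal sketch, assuming the File Output / image-sequence feedback setup from the linked answer is already in the .blend file; `ITERATIONS` is a hypothetical count you tune until the seed color is no longer visible.

```python
# Minimal sketch; assumes the File Output -> image-sequence feedback
# setup from the linked answer already exists in the .blend file.
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None

ITERATIONS = 20  # hypothetical; tune until the initial color has shrunk out of sight

def feedback_frames(start, iterations):
    """One frame per feedback pass: frame N's material reads frame N-1's output."""
    return list(range(start, start + iterations))

if bpy is not None:
    scene = bpy.context.scene
    for frame in feedback_frames(scene.frame_start, ITERATIONS):
        scene.frame_set(frame)                   # advance so the image sequence re-reads the last output
        bpy.ops.render.render(write_still=True)  # the File Output node saves this pass
```

Stepping the frame number matters because the image-sequence texture picks up a new file per frame; re-rendering the same frame would keep reading the same stale image.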

  • Could you please reupload that example blend file? It no longer works. I tried to replicate your method and have not been able to get it to work without seeing the blend file. Commented Feb 1, 2018 at 1:44
  • @MarkFoxworthy Fixed, thanks for letting me know :)
    – gandalf3
    Commented Feb 2, 2018 at 8:27
