Today I tried RenderTexture in order to compose layers: I have a number of opaque nodes in a layer, some on top of each other, and I want to set an opacity for the entire layer in such a way that nodes behind don't become partly visible, as they would if you attempted this by simply cascading opacity.
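For what it's worth, the difference between the two approaches comes down to alpha-blending arithmetic. A standalone sketch with made-up channel values (plain numbers, not engine code):

```javascript
// Standard "source over" blend of a single channel: draw `src` with
// opacity `a` on top of `dst`.
function over(src, dst, a) { return src * a + dst * (1 - a); }

var bg = 0;             // black background
var back = 200;         // channel value of the node behind
var front = 100;        // channel value of the opaque node covering it
var layerOpacity = 0.5;

// Cascaded opacity: each node is drawn individually at 50%, so the back
// node bleeds through the now half-transparent front node.
var cascaded = over(front, over(back, bg, layerOpacity), layerOpacity); // 100

// Composite first: the opaque front node fully hides the back one inside
// the texture, then one opacity is applied to the flattened result.
var composited = over(front, bg, layerOpacity); // 50 (no trace of `back`)
```

With cascading, the back node contributes to the final pixel; with the flattened layer it does not, which is exactly the effect rendering the layer to a texture first should give.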
Unfortunately, I had a very hard time understanding how it is supposed to be used, since its implementation has too many bugs and strange behaviors. In fact, I had to run lots of experiments just to tell the bugs apart, because at first the behavior of RenderTexture seemed completely random.
I've classified them depending on the use case of RenderTexture, since there are several ways to use it and every one of them has its own gotchas. Feel free to link them to individual issues if you want.
Bugs that affect all use cases:
In WebGL, RenderTexture does not set the viewport correctly when the content scale factor is > 1, so nodes appear much smaller and distorted. (I've created a PR for this one: Fix viewport bug in RenderTextureWebGLRender #3480.)
Multi-sampling is different when rendering to the texture. This may be fine in some cases for performance reasons, but it's not documented and seemingly there is no option to change it. This would be a problem if some sprites in your scene have scale != 1 and you need to take a precise screenshot of them.
Elements are projected in the global viewport. Say, for instance, you have a 640x480 viewport, create a 100x100 RenderTexture, and visit a 100x100 sprite: the sprite will be deformed to roughly 16x21. This did not happen in older revisions (2b3a2df).
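That figure is consistent with the sprite being run through the full 640x480 projection and the result being mapped onto the 100x100 texture. A quick back-of-the-envelope check (my own arithmetic, not engine code):

```javascript
// If the 100x100 sprite is projected as covering 100/640 x 100/480 of the
// screen, and that same fraction of the 100x100 texture gets filled:
var w = 100 * (100 / 640); // 15.625, i.e. "about 16"
var h = 100 * (100 / 480); // 20.833..., i.e. "about 21"
```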
Strange behaviors you may encounter if you create a custom Sprite to show the rendered texture:
The texture is flipped vertically in WebGL, but not in canvas mode, so it's not clear what you should do to handle both cases when loading it in a new cc.Sprite.
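Until this is unified, the only option seems to be branching on the renderer type yourself. A minimal sketch (the helper is hypothetical, and the commented usage assumes the cocos2d-js v3 API):

```javascript
// Hypothetical helper: the FBO texture comes out upside down only under
// the WebGL renderer, so only there must the display sprite be flipped.
function displayOrientation(isWebGL) {
  return { flippedY: isWebGL };
}

// Usage sketch (untested):
// var isWebGL = cc._renderType === cc.game.RENDER_TYPE_WEBGL;
// var sprite = new cc.Sprite(rt.getSprite().getTexture());
// sprite.setFlippedY(displayOrientation(isWebGL).flippedY);
```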
Bugs and strange behaviors you may encounter adding the RenderTexture to the scene (and using the built-in Sprite that is its child):
Its bounding boxes are very weird: it has no content size, so anchoring it does not work. The child sprite it contains (the one showing the rendered texture) is anchored at (0.5, 0.5), so it ends up working if you assume a center anchor, but it appears in weird places in every other case.
Setting opacity on the RenderTexture node does not work in WebGL: as long as opacity is greater than zero, the texture is fully visible.
In WebGL mode, its built-in sprite does not use flippedY = true but rather scaleY = -1 and a reversed anchor. That works too, but it's more confusing.
In WebGL mode, if you clear the texture with transparent red (255, 0, 0, 0) and put the sprite in a scene with a dark clear color (e.g. black, which is the default), the texture is rendered as completely red! But if you do the same thing with the scene clear color set to white, the sprite is now invisible... so it's not even consistent. Colors in between produce equally funny results.
The canvas renderer isn't flawless either: if you set an opacity on an empty texture, cleared with any transparent color, the lower the opacity, the clearer your screen gets. Setting opacity to 1 creates an interesting grey.
Strange behaviors you may find when using the auto draw mode:
There are no built-in constants for setClearFlags(), so you need to look for them in WebGLRenderingContext or hardcode them.
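Since the flag values are fixed by the OpenGL/WebGL specification, one workaround is to hardcode them (the constant names below are my own):

```javascript
// Clear-flag bit values as defined by the WebGL specification:
var GL_COLOR_BUFFER_BIT   = 0x00004000; // 16384
var GL_DEPTH_BUFFER_BIT   = 0x00000100; // 256
var GL_STENCIL_BUFFER_BIT = 0x00000400; // 1024

// Usage sketch with an auto-draw RenderTexture `rt` (untested):
// rt.setClearFlags(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
```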