I followed the tutorial to create my own mesh effect.
I've already implemented the effect as a FragmentFilter, and wanted to turn it into a MeshStyle to get the advantages of batching and thus fewer draw calls.
The issue is that in the vertex shader, the texture coordinates are set like this:
"mov v0, va1", // pass texture coordinates to fragment program
But I get the feeling this is just the texture coordinate for this one vertex.
My fragment shader relies on the texture coordinate per pixel, which my FilterEffect implementation receives. The MeshEffect, however, only seems to get the per-vertex texture coordinates.
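For context, this is roughly how my program pair is assembled (a simplified sketch following Starling's MeshEffect conventions; the exact register layout is my assumption):

// vertex program: transform the position and hand the UVs to the fragment stage
var vertexShader:String = [
    "m44 op, va0, vc0",  // transform vertex position into clip space
    "mov v0, va1"        // pass per-vertex texture coordinates via varying v0
].join("\n");

// fragment program: sample the texture at the (interpolated?) coordinates
var fragmentShader:String = [
    tex("ft0", "v0", 0, texture),  // read base texture at v0
    "mov oc, ft0"                  // write the sampled color
].join("\n");

My understanding is that the GPU interpolates v0 across each triangle, so the fragment program should receive per-pixel coordinates rather than the per-vertex values.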
When I do something like this in my fragment shader:
tex("ft0", "v0", 0, texture), //read base texture
"mul ft0.w, ft0.w, v0.y", //shouldn't that create something like an alpha gradient?
//y should be a value between -1 and 1 right?
"mov oc, ft0"
The resulting image gets the same alpha value for every pixel, but the image itself is displayed correctly. So I guess v0 actually does contain the right UV coordinates for every pixel of the texture lookup. But why am I not getting an alpha gradient by multiplying v0.y into each pixel's alpha value?
Am I doing something wrong?