Multitexturing and labels in OpenGL ES 3.0 - Java

I want to draw numbers on an object using multitexturing, but the final image comes out lighter than expected.
Is it possible to exclude the white color from the multitexturing and make the digit darker?
Here's my fragment shader:
#version 300 es
precision mediump float;
in vec2 v_textureCoord;
out vec4 outColor;
uniform sampler2D base_texture;
uniform sampler2D number_texture;
void main() {
    // wall texture
    vec4 baseColor = texture(base_texture, v_textureCoord);
    // texture with digit
    vec4 numberColor = texture(number_texture, v_textureCoord);
    // resulting pixel color based on two textures
    outColor = baseColor * (numberColor + 0.5);
}
I tried to do this:
GLES30.glEnable(GLES30.GL_BLEND);
GLES30.glBlendFunc(GLES30.GL_SRC_ALPHA, GLES30.GL_ONE);
GLES30.glActiveTexture(GLES30.GL_TEXTURE1);
...
GLES30.glDisable(GLES30.GL_BLEND);
But this did not solve the problem.
Thank you for any answer/comment!
Solution:
On Rabbid76's advice, I applied this:
outColor = baseColor * mix(numberColor, vec4(1.0), 0.5);
Result:

Mix the color of number_texture with white, rather than adding a constant:
outColor = baseColor * mix(numberColor, vec4(1.0), 0.5);
Actually that is the same as:
outColor = baseColor * (numberColor * 0.5 + 0.5);
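For completeness, this setup also assumes both textures are bound to separate texture units on the Java side. A minimal sketch, where programId, baseTexId and numberTexId are placeholder names rather than code from the question:
// Bind base_texture to unit 0 and number_texture to unit 1 (android.opengl.GLES30).
GLES30.glUseProgram(programId);
GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, baseTexId);
GLES30.glUniform1i(GLES30.glGetUniformLocation(programId, "base_texture"), 0);
GLES30.glActiveTexture(GLES30.GL_TEXTURE1);
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, numberTexId);
GLES30.glUniform1i(GLES30.glGetUniformLocation(programId, "number_texture"), 1);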

Related

OpenGL Alpha values having no effect in shader

I have a simple shader that draws scanlines in my 2D game. It works fine and is as follows:
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
void main() {
    vec2 p = vec2(floor(gl_FragCoord.x), floor(gl_FragCoord.y));
    if (mod(p.y, 6.0) == 0.0)
        gl_FragColor = vec4(0.0, 0.0, 0.0, 0.1);
    else
        gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
}
But if I change the alpha value in the vec4 for the fragment colour, it has no effect; the scanlines are just as black at 0.1 as they are at 1.0. I saw some other questions about this that advised enabling blending, but I tried that to no avail.
This is my render method with blending enabled.
@Override
public void render(float delta) {
    Gdx.gl.glClearColor(0.0f, 0.0f, 0.0f, 0);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    Gdx.gl.glEnable(GL20.GL_BLEND);
    Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
    // batch.enableBlending(); tried this too but no effect
    batch.setProjectionMatrix(cam.combined);
    batch.setShader(shaderProgram);
    batch.begin();
    rendup.renderAll(batch); // renders all sprites, tiles etc.
    batch.setShader(null);
    batch.end();
}
This isn't a blending problem. The issue is your shader only draws either a translucent black pixel, or the color of whatever texture region is being drawn. These black pixels are just getting blended with what's already on the screen (in this case the black clear color).
I assume what you actually want is for the scan lines not to be pure black. So you should draw the color of the texture region everywhere and just darken it slightly where the scan lines are. You don't want to modify the alpha that the shader outputs, or the scan lines will appear darker wherever sprites overlap.
So change your shader like this:
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
const float DARK_LINE_BRIGHTNESS = 0.9;
void main() {
    vec4 color = v_color * texture2D(u_texture, v_texCoords);
    vec2 p = vec2(floor(gl_FragCoord.x), floor(gl_FragCoord.y));
    if (mod(p.y, 6.0) == 0.0)
        gl_FragColor = vec4(color.rgb * DARK_LINE_BRIGHTNESS, color.a);
    else
        gl_FragColor = color;
}
However, if/else should generally be avoided in fragment shaders, because divergent branching performs significantly worse on many GPUs. (The exception is when the condition evaluates to the same value for many pixels in a row, such as if (u_someUniformBoolean).) You could rewrite it like this:
void main() {
    vec4 color = v_color * texture2D(u_texture, v_texCoords);
    vec2 p = vec2(floor(gl_FragCoord.x), floor(gl_FragCoord.y));
    gl_FragColor = (step(0.01, mod(p.y, 6.0)) * (1.0 - DARK_LINE_BRIGHTNESS) + DARK_LINE_BRIGHTNESS) * color;
}
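For completeness, a minimal libgdx-side sketch of compiling this shader and checking its log before handing it to the batch; the file names are placeholders:
// Compile the scanline shader and report errors instead of failing silently.
ShaderProgram.pedantic = false; // don't throw on unused uniforms/attributes
ShaderProgram scanlineShader = new ShaderProgram(
        Gdx.files.internal("scanline.vert"),   // placeholder file names
        Gdx.files.internal("scanline.frag"));
if (!scanlineShader.isCompiled()) {
    Gdx.app.error("Shader", scanlineShader.getLog());
}
batch.setShader(scanlineShader);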

OpenGL ES 2.0 Android Clipping Color

I'm using my fragment shader to clip objects in OpenGL ES 2.0. Everything is working well, however the colour of the clipped surface is all black... I cannot figure out how to change the colour (ideally I'd want to give it a texture similar to the rest of the object). I have included the code for my fragment shader below.
precision mediump float;
varying vec2 texCoord;
varying vec3 v_Normal;
varying vec3 v_Position;
varying vec4 originalPosition;
uniform sampler2D texSampler2D;
uniform vec3 lightPosition;
uniform vec4 lightColor;
void main()
{
    vec3 L = normalize(lightPosition - v_Position);
    vec3 N = normalize(v_Normal);
    float NdotL = max(dot(N, L), 0.0);
    if (originalPosition.y >= 2.0) {
        discard;
    } else {
        gl_FragColor = NdotL * lightColor * texture2D(texSampler2D, texCoord);
    }
}
Using discard in a fragment shader doesn't render anything to the pixel; it just leaves it exactly as it was before, so the black color is probably your clear color or whatever you had rendered previously. If you want to render a particular color to the pixels you are currently discarding, add another uniform to your shader for the clip color, like this:
uniform vec4 clipColor;
Set it the same way you set lightColor. Then, instead of discarding the pixel when clipping, you can set it to the clip color:
if (originalPosition.y >= 2.0) {
    gl_FragColor = clipColor;
} else {
    gl_FragColor = NdotL * lightColor * texture2D(texSampler2D, texCoord);
}
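On the Java side the new uniform is set like any other uniform; a minimal sketch, where program is a placeholder for your linked program handle:
// Upload a clip color (light grey here) to the new uniform (android.opengl.GLES20).
GLES20.glUseProgram(program);
int clipColorHandle = GLES20.glGetUniformLocation(program, "clipColor");
GLES20.glUniform4f(clipColorHandle, 0.8f, 0.8f, 0.8f, 1.0f);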

2D glsl shader transformation

I would like to create a shader to simulate a pseudo-3D water surface in a 2D scene built with libgdx. The idea is to recreate the following effect:
http://forums.tigsource.com/index.php?topic=40539.msg1104986#msg1104986
But I am stuck on creating the trapezoid shape; I don't think I understand how texture coordinates are calculated in OpenGL shaders. Should I modify the vertices in the vertex shader, or displace the texture coordinates in the fragment shader?
Here is a test I did, but it doesn't work as expected.
vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoord0;
void main()
{
    v_color = a_color;
    float factor = (a_texCoord0.x - 0.5) * (a_texCoord0.y);
    v_texCoord0 = a_texCoord0;
    v_texCoord0.x += factor;
    gl_Position = u_projTrans * a_position;
}
The fragment shader is just a passthrough:
varying vec4 v_color;
varying vec2 v_texCoord0;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoord0);
}
The resulting image is not what I expected; I am sure my approach is too naive.
Finally, I decided to use only the fragment shader to achieve the effect.
The fake depth is produced by applying an X displacement that depends on the Y coordinate.
Here is the snippet:
// simulate 3D surface
vec2 new_coord = v_texCoord0;
// calculate the stretching factor
float factor = ((0.5 - new_coord.x) * new_coord.y);
// apply the factor and stretch the image
new_coord.x += factor;
gl_FragColor = texture2D(u_texture, new_coord);
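To see what the displacement does, note that the warped coordinate is new_x = x + (0.5 - x) * y = x * (1.0 - y) + 0.5 * y: at y = 0 the row is sampled unchanged, and as y grows every x is pulled toward the texture centre, so rows with larger y sample an increasingly narrow horizontal band. A tiny, purely illustrative Java check of that arithmetic:
// Illustrative only: evaluate the warp new_x = x + (0.5 - x) * y at a few points.
public class WarpDemo {
    static float warp(float x, float y) {
        return x + (0.5f - x) * y;
    }

    public static void main(String[] args) {
        System.out.println(warp(0f, 0f));   // 0.0  -> row at y = 0 is unchanged
        System.out.println(warp(1f, 0f));   // 1.0  -> row at y = 0 is unchanged
        System.out.println(warp(0f, 0.5f)); // 0.25 -> at y = 0.5 the row narrows toward 0.5
        System.out.println(warp(1f, 0.5f)); // 0.75
        System.out.println(warp(0f, 1f));   // 0.5  -> at y = 1 every x collapses to the centre column
    }
}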

GLSL renders textures wrong

I am trying to make a lighting system: the program changes a block texture's brightness depending on the light it receives, and it does this for every block (texture) visible to the player.
The lighting system itself works perfectly, but when it comes to rendering with shaders, everything falls apart.
This code is in the render loop -
float light = Lighting.checkLight(mapPosY, mapPosX, this); // Returns the light the current block gets
map[mapPos].light = light; // Applies the light to the block
Shaders.block.setUniformf("lightBlock", light); // Sets the light value to the shader's uniform, to change the brightness of the current block / texture.
batch.draw(map[mapPos].TEXTURE, (mapPosX * Block.WIDTH), (mapPosY * Block.HEIGHT), Block.WIDTH, Block.HEIGHT); // Renders the block / texture to the screen.
The result is pretty random.
As I said, the first two lines work perfectly; the problem is probably in the third line or in the shader itself.
The shader:
Vertex shader -
attribute vec4 a_color;
attribute vec3 a_position;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 vColor;
varying vec2 vTexCoord;
void main() {
    vColor = a_color;
    vTexCoord = a_texCoord0;
    gl_Position = u_projTrans * vec4(a_position, 1.0);
}
Fragment shader -
varying vec4 vColor;
varying vec2 vTexCoord;
uniform vec2 screenSize;
uniform sampler2D tex;
uniform float lightBlock;
const float outerRadius = .65, innerRadius = .4, intensity = .6;
const float SOFTNESS = 0.6;
void main() {
    vec4 texColor = texture2D(tex, vTexCoord) * vColor;
    vec2 relativePosition = gl_FragCoord.xy / screenSize - .5;
    float len = length(relativePosition);
    float vignette = smoothstep(outerRadius, innerRadius, len);
    texColor.rgb = mix(texColor.rgb, texColor.rgb * vignette * lightBlock, intensity);
    gl_FragColor = texColor;
}
I fixed the problem, but I have no idea why the fix works.
Thanks to Pedro, I focused on the render loop instead of the shader itself.
Before the loop I added these two lines:
List<Integer> lBlocks = new ArrayList<Integer>();
Shaders.block.setUniformf("lightBlock", 0.3f);
Basically, I created a list to store the bright blocks for later.
I set the shader's uniform to 0.3f, which is fairly dark; the value should be in the range 0-1.
Now, in the render loop, inside the for loop:
float light = Lighting.checkLight(mapPosY, mapPosX, this);
map[mapPos].light = light;
if (light == 1.0f) {
    lBlocks.add(mapPos);
} else {
    batch.draw(map[mapPos].TEXTURE, (mapPosX * Block.WIDTH), (mapPosY * Block.HEIGHT), Block.WIDTH, Block.HEIGHT);
}
As you can see, I add the bright blocks to the list and render only the dark ones, with the uniform set to 0.3f before the render loop as shown in the first code sample.
After the render loop I iterate over the bright blocks, because they haven't been rendered yet.
Shaders.block.setUniformf("lightBlock", 1.0f);
for (int i = 0; i < lBlocks.size(); i++) {
    batch.draw(map[lBlocks.get(i)].TEXTURE, ((lBlocks.get(i) % width) * Block.WIDTH), ((lBlocks.get(i) / width) * Block.HEIGHT), Block.WIDTH, Block.HEIGHT);
}
Now the bright blocks are rendered too, and the result looks good.
But I have no idea why this works; it's like splitting the render loop in two, one pass for the dark blocks and one for the bright ones.
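My best guess (not verified): SpriteBatch queues sprites and submits them in a single draw call when it flushes, so a uniform changed between batch.draw() calls is not applied per sprite; every queued sprite is drawn with whatever value the uniform holds at flush time. Grouping the blocks by light value avoids changing the uniform mid-batch. The alternative would be to flush before every change, at the cost of extra draw calls, roughly like this:
// Sketch: force the queued sprites out before changing the per-block uniform.
batch.flush();                                   // draw everything queued so far
Shaders.block.setUniformf("lightBlock", light);  // the new value now applies to later sprites
batch.draw(map[mapPos].TEXTURE, (mapPosX * Block.WIDTH), (mapPosY * Block.HEIGHT),
        Block.WIDTH, Block.HEIGHT);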
Thanks :)

Normals transforming, but lighting still weird?

I have a basic model viewer which displays a parsed OBJ model. I have a light positioned where the camera is, so the model being viewed is fully lit on one side. For some reason, my transformations aren't doing what I expected in my renderer. When I rotate my model 90 degrees about the x- or y-axis, the mesh is completely dark; rotate another 90 degrees, and it's fully lit again.
Am I doing the transformations wrong? Or are my normals wrong to begin with?
I calculated and applied the proper transform to the normals in my app (transpose of the inverse of the ModelView matrix)
Matrix.invertM(mNormalMatrix, 0, mMVMatrix, 0);
Matrix.transposeM(mNormalMatrix, 0, mNormalMatrix, 0);
before passing it into my shaders:
/*Vertex shader*/
attribute vec3 vPosition;
attribute vec3 vNormal;
uniform mat4 modelViewMatrix;
uniform mat4 mMVPMatrix;
uniform mat4 mViewMatrix;
uniform mat4 normalMatrix;
uniform float lightingEnabled;
varying float lightsEnabled;
varying vec3 lightPosEye;
varying vec3 normalEye;
varying vec3 vertEye;
void main() {
    /* Transform the normal with the normal matrix */
    vec4 normal = vec4(vNormal, 0.0);
    normalEye = normalize(vec3(normalMatrix * normal));
    lightsEnabled = lightingEnabled;
    lightPosEye = vec3(mViewMatrix * vec4(0.0, 0.0, 3.0, 1.0));
    vertEye = vec3(modelViewMatrix * vec4(vPosition, 1.0));
    gl_Position = mMVPMatrix * vec4(vPosition, 1.0);
}
/*Fragment shader*/
precision mediump float;
/*uniform vec4 vColor; */
varying float lightsEnabled;
varying vec3 lightPosEye;
varying vec3 normalEye;
varying vec3 vertEye;
void main() {
    /* Light output components */
    vec3 Ia;
    vec3 Id;
    /* light source components */
    vec3 La = vec3(0.5);
    vec3 Ld = vec3(1.0);
    /* vec3 Ls = vec3(1.0); */
    vec3 Ka = vec3(0.3); /* ambient reflectance term */
    vec3 Kd = vec3(1.0); /* diffuse term */
    /* ambient light term */
    Ia = La * Ka;
    float dotProd;
    vec3 lightToSurface;
    if (lightsEnabled > 0.5) {
        /* diffuse light term */
        lightToSurface = normalize(lightPosEye - vertEye);
        dotProd = dot(lightToSurface, normalEye);
        dotProd = max(dotProd, 0.0);
    } else {
        dotProd = 1.0;
    }
    Id = Ld * Kd * dotProd;
    gl_FragColor = vec4(Ia + Id, 1.0);
}
I know it's been a while, but I've found the solution. The problem was this line in my vertex shader:
normalEye = normalize(vec3(normalMatrix * normal));
I changed it to:
normalEye = normalize(vec3(modelViewMatrix * normal));
and everything works fine, although I have no idea why the second line works when the first one should.
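One plausible cause (just a guess): android.opengl.Matrix.transposeM is not safe to use in place, so transposeM(mNormalMatrix, 0, mNormalMatrix, 0) reads values it has already overwritten and yields a corrupted normal matrix. Using the model-view matrix directly happens to work when it contains only rotations, translations and uniform scale, because the inverse-transpose of its upper 3x3 is then proportional to the matrix itself and normalize() removes the scale. A sketch of the original approach with a scratch array instead:
// Sketch: build the normal matrix without transposing in place (android.opengl.Matrix).
float[] scratch = new float[16];
Matrix.invertM(scratch, 0, mMVMatrix, 0);         // inverse of the model-view matrix
Matrix.transposeM(mNormalMatrix, 0, scratch, 0);  // transpose into a different array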
