Normals transforming, but lighting still weird? - java

I have a basic model viewer which displays a parsed OBJ model. I have a light positioned where the camera is, so the model being viewed is fully lit on one side. For some reason, my transformations aren't doing what I expected in my renderer. When I rotate my model 90 degrees about the x- or y-axis, the mesh is completely dark, but rotate it another 90 degrees and it's fully lit again.
Am I doing the transformations wrong? Or are my normals wrong to begin with?
I calculated and applied what should be the proper transform for the normals in my app (the transpose of the inverse of the ModelView matrix):
Matrix.invertM(mNormalMatrix, 0, mMVMatrix, 0);
Matrix.transposeM(mNormalMatrix, 0, mNormalMatrix, 0);
before passing it into my shaders:
/*Vertex shader*/
attribute vec3 vPosition;
attribute vec3 vNormal;
uniform mat4 modelViewMatrix;
uniform mat4 mMVPMatrix;
uniform mat4 mViewMatrix;
uniform mat4 normalMatrix;
uniform float lightingEnabled;
varying float lightsEnabled;
varying vec3 lightPosEye;
varying vec3 normalEye;
varying vec3 vertEye;
void main() {
    /* Transform the normal into eye space with the normal matrix */
    vec4 normal = vec4(vNormal, 0.0);
    normalEye = normalize(vec3(normalMatrix * normal));

    lightsEnabled = lightingEnabled;
    lightPosEye = vec3(mViewMatrix * vec4(0.0, 0.0, 3.0, 1.0));
    vertEye = vec3(modelViewMatrix * vec4(vPosition, 1.0));

    gl_Position = mMVPMatrix * vec4(vPosition, 1.0);
}
/*Fragment shader*/
precision mediump float;
/*uniform vec4 vColor; */
varying float lightsEnabled;
varying vec3 lightPosEye;
varying vec3 normalEye;
varying vec3 vertEye;
void main() {
    /* Light output components */
    vec3 Ia;
    vec3 Id;

    /* Light source components */
    vec3 La = vec3(0.5);
    vec3 Ld = vec3(1.0);
    /* vec3 Ls = vec3(1.0); */

    vec3 Ka = vec3(0.3); /* ambient reflectance term */
    vec3 Kd = vec3(1.0); /* diffuse term */

    /* Ambient light term */
    Ia = La * Ka;

    float dotProd;
    vec3 lightToSurface;
    if (lightsEnabled > 0.5) {
        /* Diffuse light term */
        lightToSurface = normalize(lightPosEye - vertEye);
        dotProd = dot(lightToSurface, normalEye);
        dotProd = max(dotProd, 0.0);
    } else {
        dotProd = 1.0;
    }
    Id = Ld * Kd * dotProd;

    gl_FragColor = vec4(Ia + Id, 1.0);
}

I know it's been a while, but I've found the solution. The problem was this line in my vertex shader:
normalEye = normalize(vec3(normalMatrix * normal));
I changed it to:
normalEye = normalize(vec3(modelViewMatrix * normal));
and everything works fine, although I have no idea why the second line works when the first one should.
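A likely explanation for why the "wrong" line works and the "right" one doesn't: at least in the AOSP implementation, android.opengl.Matrix.transposeM writes into the destination while it is still reading the source, so transposing a matrix into the same array it came from (as in the snippet above) corrupts the result, and the normalMatrix uniform ends up holding garbage. Meanwhile, if the ModelView matrix contains only rotation, translation and uniform scale, its upper 3x3 transforms normals correctly once they are re-normalized, which is why multiplying by modelViewMatrix appears to work. A minimal sketch of the scratch-array version (class and method names here are made up for illustration):

import android.opengl.Matrix;

// Sketch only: compute the normal matrix (inverse-transpose of ModelView)
// without transposing in place. Names are illustrative, not from the app above.
public class NormalMatrixHelper {
    private final float[] tempInverse = new float[16]; // scratch for the inverse

    /** Writes the inverse-transpose of modelView (column-major 4x4) into normalMatrix. */
    public void computeNormalMatrix(float[] modelView, float[] normalMatrix) {
        // Invert into a scratch array first...
        Matrix.invertM(tempInverse, 0, modelView, 0);
        // ...then transpose into a different destination array; an in-place
        // transpose overwrites elements before they are read back.
        Matrix.transposeM(normalMatrix, 0, tempInverse, 0);
    }
}

The inverse-transpose is still the safer choice once non-uniform scaling enters the ModelView matrix.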

Related

Multitexturing and labels in OpenGL ES 3.0

I want to draw numbers on an object using multitexturing, but the final image comes out lighter than expected.
Is it possible to exclude the white color from the multitexturing and make the digit darker?
Here's my fragment shader:
#version 300 es
precision mediump float;
in vec2 v_textureCoord;
out vec4 outColor;
uniform sampler2D base_texture;
uniform sampler2D number_texture;
void main() {
    // wall texture
    vec4 baseColor = texture(base_texture, v_textureCoord);
    // texture with digit
    vec4 numberColor = texture(number_texture, v_textureCoord);
    // resulting pixel color based on two textures
    outColor = baseColor * (numberColor + 0.5);
}
I tried to do this:
GLES30.glEnable(GLES30.GL_BLEND);
GLES30.glBlendFunc(GLES30.GL_SRC_ALPHA, GLES30.GL_ONE);
GLES30.glActiveTexture(GLES30.GL_TEXTURE1);
...
GLES30.glDisable(GLES30.GL_BLEND);
But this did not solve the problem.
Thank you for any answer/comment!
Solution:
On Rabbid76's advice, I applied this:
outColor = baseColor * mix(numberColor, vec4(1.0), 0.5);
Mix the color of number_texture with white, rather than adding a constant:
outColor = baseColor * mix(numberColor, vec4(1.0), 0.5);
Actually that is the same as:
outColor = baseColor * (numberColor * 0.5 + 0.5);
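Not part of the fix above, but worth noting for anyone reproducing it: a two-sampler shader like this also relies on base_texture and number_texture being pointed at different texture units on the Java side. A minimal sketch of that setup, assuming GLES30 as in the question; the handle names (program, baseTexId, numberTexId) are placeholders, not the original variables:

import android.opengl.GLES30;

// Sketch only: bind each sampler uniform to its own texture unit.
public final class MultitextureBinding {
    public static void bind(int program, int baseTexId, int numberTexId) {
        GLES30.glUseProgram(program);

        // Point each sampler uniform at a texture unit.
        GLES30.glUniform1i(GLES30.glGetUniformLocation(program, "base_texture"), 0);
        GLES30.glUniform1i(GLES30.glGetUniformLocation(program, "number_texture"), 1);

        // Bind the actual textures to those units.
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, baseTexId);
        GLES30.glActiveTexture(GLES30.GL_TEXTURE1);
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, numberTexId);
    }
}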

GLSL Moving Point Light

I have been writing a point light shader for my LWJGL + Java application, based on this tutorial. My problem is that when I "walk around" with the camera, the light moves as well. Also, when I rotate the sphere, the light rotates with it.
I believe the problem is in the vertex shader, but I have included the fragment shader as well, just in case.
Example 1 (No movement)
Example 2 (Moved Left and rotated the camera)
Vertex Shader
#version 330
in vec4 in_Position;
in vec3 in_Normal;
in vec2 in_TextureCoord;
uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;
uniform mat3 normal;
uniform vec4 light_Pos; //Set to 0, 3, 0, 1
out Data {
    vec3 normal;
    vec3 eye;
    vec3 lDir;
    vec2 st;
} Out;

void main(void) {
    vec4 vert = view * model * light_Pos;
    vec4 pos = model * view * in_Position;

    Out.normal = normalize(in_Normal);
    Out.lDir = vec3(vert - pos);
    Out.eye = vec3(-pos);
    Out.st = in_TextureCoord;

    gl_Position = projection * view * model * in_Position;
}
Fragment Shader
#version 330
uniform sampler2D texture_diffuse;
in Data {
    vec3 normal;
    vec3 eye;
    vec3 lDir;
    vec2 st;
} In;

out vec4 color;

void main(void) {
    vec4 diffuse = texture(texture_diffuse, In.st);
    vec4 spec = vec4(0.0);

    vec3 n = normalize(In.normal);
    vec3 l = normalize(In.lDir);
    vec3 e = normalize(In.eye);

    float i = max(dot(n, l), 0.0);
    if (i > 0.0) {
        vec3 h = normalize(l + e);
        float intSpec = max(dot(h, n), 0.0);
        spec = vec4(1) * pow(intSpec, 50); // 50 is the shininess
    }

    color = max(i * diffuse + spec, vec4(0.2));
}
I have already tried the solution presented in this question, but it did not solve my problem.
Just from a quick glance, it looks like you're multiplying the light's position by the view and model matrices:
vec4 vert = view * model * light_Pos;
This means that whenever you walk around or move the camera you're changing the view matrix, which affects the light's position; likewise, when you move the sphere you're changing the model matrix, which also affects the light's position.
In other words, if you want the light to be stationary in relation to the world, then don't transform it by any matrices.
The problem is that your normal, Out.lDir and Out.eye are not in the same coordinate system. The normal is in your model's coordinates while the other two are in eye space. Try passing the eye position as a uniform in a similar way to light_Pos; light_Pos and eye_Pos are then both in the world coordinate system. Now just calculate
vec4 pos = model * in_Position;
vec4 vert = light_Pos;
and
Out.eye = vec3(eye_Pos);
It should work. When performing spatial operations, always make sure that all points and vectors are in the same coordinate system.
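For reference, a minimal sketch of what uploading those two uniforms in world space could look like on the Java/LWJGL side; the program handle and camera coordinates are placeholder names, not taken from the original code:

import org.lwjgl.opengl.GL20;

// Sketch only: upload the world-space light and eye positions each frame.
public final class LightUniforms {
    public static void upload(int programId, float camX, float camY, float camZ) {
        GL20.glUseProgram(programId);

        // The light stays fixed in the world, so upload it untransformed.
        GL20.glUniform4f(GL20.glGetUniformLocation(programId, "light_Pos"),
                0.0f, 3.0f, 0.0f, 1.0f);

        // Eye (camera) position in world coordinates, matching the suggested eye_Pos uniform.
        GL20.glUniform3f(GL20.glGetUniformLocation(programId, "eye_Pos"),
                camX, camY, camZ);
    }
}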

OpenGL ES 2.0 Android Clipping Color

I'm using my fragment shader to clip objects in OpenGL ES 2.0. Everything is working well; however, the colour of the clipped surface is all black. I cannot figure out how to change the colour (ideally I'd want to give it a texture similar to the rest of the object). I have included the code for my fragment shader below.
precision mediump float;
varying vec2 texCoord;
varying vec3 v_Normal;
varying vec3 v_Position;
varying vec4 originalPosition;
uniform sampler2D texSampler2D;
uniform vec3 lightPosition;
uniform vec4 lightColor;
void main()
{
    vec3 L = normalize(lightPosition - v_Position);
    vec3 N = normalize(v_Normal);
    float NdotL = max(dot(N, L), 0.0);

    if (originalPosition.y >= 2.0) {
        discard;
    } else {
        gl_FragColor = NdotL * lightColor * texture2D(texSampler2D, texCoord);
    }
}
Using discard in a fragment shader doesn't render anything to the pixel; it just leaves it exactly as it was before, so the black color is probably your clear color or whatever you had rendered previously. If you want to render a particular color to the pixels you are currently discarding, add another uniform to your shader for the clip color, like this:
uniform vec4 clipColor;
Set it in the same way you set the lightColor, then instead of discarding the pixel when clipping, you can set the pixel to the clip color:
if (originalPosition.y >= 2.0) {
    gl_FragColor = clipColor;
} else {
    gl_FragColor = NdotL * lightColor * texture2D(texSampler2D, texCoord);
}
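A minimal sketch of setting that new uniform from the Java side, assuming GLES20 and a placeholder program handle (mirror whatever you already do for lightColor):

import android.opengl.GLES20;

// Sketch only: upload the clip color used where fragments would otherwise be discarded.
public final class ClipColorUniform {
    public static void set(int programHandle, float r, float g, float b, float a) {
        GLES20.glUseProgram(programHandle);
        int clipColorLoc = GLES20.glGetUniformLocation(programHandle, "clipColor");
        GLES20.glUniform4f(clipColorLoc, r, g, b, a); // vec4, so four components
    }
}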

2D glsl shader transformation

I would like to create a shader to simulate a pseudo-3D water surface on a 2D scene built with libgdx. The idea is to recreate the following effect:
http://forums.tigsource.com/index.php?topic=40539.msg1104986#msg1104986
But I am stuck at creating the trapezoid shape; I don't think I understand how texture coordinates are handled in OpenGL shaders. Should I modify the vertices in the vertex shader, or should I displace the texture coordinates in the fragment shader?
Here is a test I have done, but it doesn't work as expected.
Vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoord0;
void main()
{
    v_color = a_color;

    float factor = (a_texCoord0.x - 0.5) * (a_texCoord0.y);
    v_texCoord0 = a_texCoord0;
    v_texCoord0.x += factor;

    gl_Position = u_projTrans * a_position;
}
The fragment shader is just a passthrough:
varying vec4 v_color;
varying vec2 v_texCoord0;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoord0);
}
The result image shows the distortion isn't right; I am sure that my approach is too naive.
Finally, I decided to use only the fragment shader to achieve the effect.
The fake depth is obtained by applying an X displacement that depends on the Y coordinate.
Here is the snippet:
// simulate 3D surface
vec2 new_coord = v_texCoord0;
// calculate the stretching factor
float factor = ((0.5 - new_coord.x) * new_coord.y);
// apply the factor and stretch the image
new_coord.x += factor;
gl_FragColor = texture2D(u_texture, new_coord);
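For completeness, a minimal sketch of one way such a fragment shader could be wired into a libgdx SpriteBatch; the shader file names are placeholders, not the poster's actual assets:

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

// Sketch only: compile the water shader and route SpriteBatch draws through it.
public final class WaterShaderSetup {
    public static ShaderProgram create() {
        ShaderProgram.pedantic = false; // don't fail on unused uniforms/attributes
        ShaderProgram shader = new ShaderProgram(
                Gdx.files.internal("water.vert").readString(),
                Gdx.files.internal("water.frag").readString());
        if (!shader.isCompiled()) {
            throw new IllegalStateException(shader.getLog());
        }
        return shader;
    }

    public static void draw(SpriteBatch batch, ShaderProgram shader) {
        batch.setShader(shader); // subsequent draws use the water shader
        // ... batch.begin(); batch.draw(...); batch.end(); ...
        batch.setShader(null);   // restore the default shader
    }
}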

Flickering when rendering with LWJGL on Windows 8.1

When I render a mesh with LWJGL on my Windows 8 computer I get weird flickering.
video
With exactly the same code (except for the LWJGL natives) the image renders properly on my old MacBook. So why does this only happen on Windows 8?
Fragment shader
#version 400 core

in vec2 texCords;
in vec3 surfaceNormal;
in vec3 toLight;
in vec3 toCamera;
in vec3 position_out;
in float distanceToCamera_out;

uniform vec3 diffuseLightColor;
uniform vec3 specularLightColor;

out vec4 out_Color;

void main(void) {
    vec3 unitNormal = normalize(surfaceNormal);
    vec3 unitToLight = normalize(toLight);

    // temporary
    vec4 color = vec4(1, 1, 1, 1);

    // diffuse light
    float diffuseFactor = max(dot(unitNormal, unitToLight), 0);
    vec3 diffuse = diffuseLightColor * diffuseFactor;

    // specular light
    vec3 reflected = reflect(-unitToLight, unitNormal);
    float specularFactor = max(dot(reflected, normalize(toCamera)), 0);
    specularFactor = pow(specularFactor, 50); // dampening
    vec3 specular = specularLightColor * specularFactor;

    // final light
    vec3 finalLight = diffuse + specular;

    // final color
    out_Color = color * vec4(finalLight, 1);
}
Render function
public void render(Entity entity) {
    TexturedModel texModel = entity.getModel();
    RawModel model = texModel.getModel();
    ModelTexture texture = texModel.getTexture();

    GL13.glActiveTexture(GL13.GL_TEXTURE0);
    GL11.glBindTexture(GL11.GL_TEXTURE_2D, texture.getBaseId());

    GL30.glBindVertexArray(model.getVaoId());
    GL20.glEnableVertexAttribArray(0);
    GL20.glEnableVertexAttribArray(1);
    GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, model.getVboiId());

    shader.loadTransformation(Maths.createTransformationMatrix(entity.getPosition(),
            entity.getRx(), entity.getRy(), entity.getRz(), entity.getScale()));

    GL11.glDrawElements(GL11.GL_TRIANGLES, model.getVertexCount(), GL11.GL_UNSIGNED_INT, 0);

    GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);
    GL20.glDisableVertexAttribArray(0);
    GL20.glDisableVertexAttribArray(1);
    GL30.glBindVertexArray(0);
}
GPU: GTX 970
In your video I see no flickering; however, you may be experiencing screen tearing. If that is the case, add Display.setVSyncEnabled(true). You could also have fragmentation, which would be a hardware issue and nothing wrong with your code.
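If it is tearing, a minimal sketch of where the v-sync call would go, assuming the legacy LWJGL 2 Display class that setVSyncEnabled belongs to; the window size and loop body are placeholders:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

// Sketch only: create the window with v-sync enabled so buffer swaps wait for the refresh.
public final class WindowSetup {
    public static void run() throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(1280, 720));
        Display.setVSyncEnabled(true); // sync swaps to the monitor refresh rate
        Display.create();

        while (!Display.isCloseRequested()) {
            // ... render the scene here ...
            Display.update(); // swaps buffers; with v-sync on, this waits for vblank
        }
        Display.destroy();
    }
}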
