GLSL Moving Point Light - java

I have been writing a point light shader for my LWJGL + Java application, based on this tutorial. My problem is that when I "walk around" with the camera, the light moves as well. Also, when I rotate the sphere, the light rotates with it.
I believe that the problem is in the Vertex Shader, but I put the fragment shader in also just in case.
Example 1 (No movement)
Example 2 (Moved Left and rotated the camera)
Vertex Shader
#version 330
in vec4 in_Position;
in vec3 in_Normal;
in vec2 in_TextureCoord;
uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;
uniform mat3 normal;
uniform vec4 light_Pos; //Set to 0, 3, 0, 1
out Data {
vec3 normal;
vec3 eye;
vec3 lDir;
vec2 st;
} Out;
void main(void) {
vec4 vert = view * model * light_Pos;
vec4 pos = model * view * in_Position;
Out.normal = normalize(in_Normal);
Out.lDir = vec3(vert - pos);
Out.eye = vec3(-pos);
Out.st = in_TextureCoord;
gl_Position = projection * view * model * in_Position;
}
Fragment Shader
#version 330
uniform sampler2D texture_diffuse;
in Data {
vec3 normal;
vec3 eye;
vec3 lDir;
vec2 st;
} In;
out vec4 color;
void main(void) {
vec4 diffuse = texture(texture_diffuse, In.st);
vec4 spec = vec4(0.0);
vec3 n = normalize(In.normal);
vec3 l = normalize(In.lDir);
vec3 e = normalize(In.eye);
float i = max(dot(n,l), 0.0);
if (i > 0.0) {
vec3 h = normalize(l+e);
float intSpec = max(dot(h,n), 0.0);
spec = vec4(1) * pow(intSpec, 50); //50 is the shininess
}
color = max(i * diffuse + spec, vec4(0.2));
}
I already tried the solution presented in this question; it did not solve my problem.

Just from a quick glance, looks like you're multiplying the light's position by the view and model matrix:
vec4 vert = view * model * light_Pos;
This means that whenever you walk around or move the camera you're changing the view matrix, which affects the light's position; likewise, when you move the sphere you're changing the model matrix, which also affects the light's position.
In other words, if you want the light to be stationary relative to the world, don't transform it by any matrices.
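One consistent way to keep the light fixed in the world while still lighting in eye space (which the rest of the shader assumes, since Out.eye is the eye-space view vector) is to transform the world-space light by the view matrix only. A minimal sketch of the corrected vertex-shader lines, keeping the original uniform names and assuming light_Pos is given in world coordinates:

```glsl
// light_Pos is a world-space position, so only the view matrix applies to it
vec4 vert = view * light_Pos;

// the vertex goes model -> world -> eye (note: view * model, not model * view)
vec4 pos = view * model * in_Position;

Out.lDir = vec3(vert - pos);  // eye-space direction from vertex to light
Out.eye  = vec3(-pos);        // eye-space direction from vertex to the eye at the origin
```

This way the light stays fixed in the world no matter how the camera or the sphere moves.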

The problem is that your normal, Out.lDir and Out.eye are not in the same coordinate system. The normal is in your model's coordinates, while the other two are in eye space. Try passing the eye position as a uniform, in a similar way to light_Pos.
light_Pos and eye_Pos are then both in the world coordinate system. Now just calculate
vec4 pos = model * in_Position;
vec4 vert = light_Pos;
and
Out.eye = vec3(eye_Pos - pos);
It should work. When performing spatial operations, always make sure that all points and vectors are in the same coordinate system.
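Putting the suggestion together, a sketch of a world-space version of the vertex shader body might look like the following (eye_Pos is a new world-space camera-position uniform, and note the normal must also be brought into world space, which the answer above does not mention):

```glsl
// everything below is expressed in world coordinates
vec4 pos    = model * in_Position;                 // vertex: model -> world
Out.normal  = normalize(mat3(model) * in_Normal);  // normal into world space too
Out.lDir    = vec3(light_Pos - pos);               // vertex-to-light vector
Out.eye     = vec3(eye_Pos - pos);                 // vertex-to-eye vector
Out.st      = in_TextureCoord;
gl_Position = projection * view * model * in_Position;
```

(Using mat3(model) for the normal is only valid for rotations and uniform scale; with non-uniform scale you need the inverse-transpose of that matrix.)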

Related

libgdx - how to set sprite batch to blend/shade like this?

I currently have a default sprite batch:
batch = new SpriteBatch();
I have read about using shaders to modify how the batch draws each sprite. In my game, I am trying to create a 'nighttime' effect: I want every pixel on screen to be black, except for pixels that are already white. For the white pixels, I want a shade of blue. Obviously I am new to libgdx and OpenGL. Can anyone who is familiar with blending or shaders help me out with this? What should I do to my SpriteBatch to achieve the effect that I am describing?
The effect you would like to achieve could be done with something like this.
Vertex Shader
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main()
{
v_color = a_color;
v_texCoords = a_texCoord0;
gl_Position = u_projTrans * a_position;
}
Fragment shader
precision mediump float;
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
bool equals(float a, float b) {
return abs(a-b) < 0.0001;
}
bool isWhiteShade(vec4 color) {
return equals(color.r, color.g) && equals(color.r, color.b);
}
void main() {
vec4 color = texture2D(u_texture, v_texCoords) * v_color;
if(isWhiteShade(color)) {
color *= vec4(0, 0, 1, 1);
}
else {
color *= vec4(0, 0, 0, 1);
}
gl_FragColor = color;
}
Add them to the assets folder, pass them as arguments when creating an instance of ShaderProgram, and of course don't forget to apply the shader program to your SpriteBatch.
To be honest, this has (almost) nothing to do with SpriteBatch itself; all you have to do is apply the created shader:
batch.setShader(ShaderProgram shader)
The shader topic is very wide (and, to be honest, independent of libgdx), and there is no simple answer to your question without more information about what the shader should do. Try reading something about shaders first, and come back to SO when you run into concrete trouble.
You can start right here:
https://github.com/libgdx/libgdx/wiki/Shaders

OpenGL ES 2.0 Android Clipping Color

I'm using my fragment shader to clip objects in OpenGL ES 2.0. Everything is working well; however, the colour of the clipped surface is all black. I cannot figure out how to change the colour (ideally I'd want a texture similar to the rest of the object). I have included the code for my fragment shader below.
precision mediump float;
varying vec2 texCoord;
varying vec3 v_Normal;
varying vec3 v_Position;
varying vec4 originalPosition;
uniform sampler2D texSampler2D;
uniform vec3 lightPosition;
uniform vec4 lightColor;
void main()
{
vec3 L = normalize(lightPosition - v_Position);
vec3 N = normalize(v_Normal);
float NdotL = max(dot(N,L),0.0);
if(originalPosition.y >= 2.0){
discard;
}else{
gl_FragColor = NdotL * lightColor * texture2D(texSampler2D, texCoord);
}
}
Using discard in a fragment shader doesn't render anything to the pixel; it just leaves it exactly as it was before, so the black colour is probably your clear colour or whatever you had rendered previously. If you want to render a particular colour to the pixels you are currently discarding, add another uniform to your shader for the clip colour, like this:
uniform vec4 clipColor;
Set it in the same way you set the lightColor, then instead of discarding the pixel when clipping, you can set the pixel to the clip color:
if(originalPosition.y >= 2.0) {
gl_FragColor = clipColor;
} else {
gl_FragColor = NdotL * lightColor * texture2D(texSampler2D, texCoord);
}

2D glsl shader transformation

I would like to create a shader to simulate a pseudo-3D water surface on a 2D scene built with libgdx. The idea is to recreate the following effect:
http://forums.tigsource.com/index.php?topic=40539.msg1104986#msg1104986
But I am stuck at creating the trapezoid shape; I think I didn't understand how texture coordinates are calculated in OpenGL shaders. Should I modify the vertices in the vertex shader, or displace the texture in the fragment shader?
Here is a test I have done, but it doesn't work as expected.
vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoord0;
void main()
{
v_color = a_color;
float factor = (a_texCoord0.x - 0.5) * (a_texCoord0.y);
v_texCoord0 = a_texCoord0;
v_texCoord0.x += factor;
gl_Position = u_projTrans * a_position;
}
The fragment shader is just a passthrough:
varying vec4 v_color;
varying vec2 v_texCoord0;
uniform sampler2D u_texture;
void main()
{
gl_FragColor = v_color * texture2D(u_texture, v_texCoord0);
}
and the resulting image.
I am sure that my approach is too naive.
Finally, I decided to use only the fragment shader to achieve the effect.
The fake depth is obtained by applying an X displacement that depends on the Y coordinate.
Here is the snippet:
// simulate 3D surface
vec2 new_coord = v_texCoord0;
// calculate the stretching factor
float factor = ((0.5 - new_coord.x) * new_coord.y);
// apply the factor and stretch the image
new_coord.x += factor;
gl_FragColor = texture2D(u_texture, new_coord);
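For reference, the snippet dropped into a complete libgdx-style fragment shader might look like this (a sketch using the v_texCoord0 and u_texture names from the shaders above, not the exact shader used):

```glsl
#ifdef GL_ES
precision mediump float;
#endif

varying vec4 v_color;
varying vec2 v_texCoord0;
uniform sampler2D u_texture;

void main() {
    // simulate a 3D surface: the X displacement grows with Y,
    // so rows further down the texture are stretched more
    vec2 new_coord = v_texCoord0;
    float factor = (0.5 - new_coord.x) * new_coord.y;
    new_coord.x += factor;
    gl_FragColor = v_color * texture2D(u_texture, new_coord);
}
```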

Uniform used in shader is inactive

I have the following shaders:
My fragment shader:
#version 110
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
void main() {
gl_FragColor = vec4(1, 0, 0, 1);
}
And my vertex shader:
#version 110
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
attribute vec3 vertex;
void main() {
vec4 world = modelMatrix * vec4(vertex, 1);
vec4 camera = world * viewMatrix;
gl_Position = projectionMatrix * world;
}
They both compile and link just fine. When I print out my active uniforms I get
projectionMatrix
modelMatrix
but no viewMatrix. When I try to get the uniforms with glGetUniformLocation, I can get projectionMatrix, modelMatrix, and my vertex attribute just fine, so why is viewMatrix inactive?
The problem lies in the last line of your vertex shader:
gl_Position = projectionMatrix * world;
You probably meant projectionMatrix * camera. Otherwise, the GLSL compiler sees that camera isn't being used and optimizes it away, which means that viewMatrix is no longer being used either. Unused uniforms are not considered active, which leads to your predicament.
Note: Your viewing transform is also backwards. You probably want vec4 camera = viewMatrix * world.
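Putting both fixes together, the corrected main might read:

```glsl
void main() {
    vec4 world  = modelMatrix * vec4(vertex, 1.0);
    vec4 camera = viewMatrix * world;        // matrix on the left of the vector
    gl_Position = projectionMatrix * camera;
}
```

With camera actually used, the GLSL compiler can no longer optimize viewMatrix away, so it will be reported as an active uniform.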

Normals transforming, but lighting still weird?

I have a basic model viewer which displays a parsed OBJ model. I have a light positioned where the camera is, so the side of the model being viewed should be fully lit. For some reason, my transformations aren't doing what I expected in my renderer. When I rotate my model 90 degrees about the x- or y-axis, the mesh is completely dark, but rotate another 90 degrees and it's fully lit again.
Am I doing the transformations wrong? Or are my normals wrong to begin with?
I calculated and applied the proper transform to the normals in my app (transpose of the inverse of the ModelView matrix)
Matrix.invertM(mNormalMatrix, 0, mMVMatrix, 0);
Matrix.transposeM(mNormalMatrix, 0, mNormalMatrix, 0);
before passing it into my shaders:
/*Vertex shader*/
attribute vec3 vPosition;
attribute vec3 vNormal;
uniform mat4 modelViewMatrix;
uniform mat4 mMVPMatrix;
uniform mat4 mViewMatrix;
uniform mat4 normalMatrix;
uniform float lightingEnabled;
varying float lightsEnabled;
varying vec3 lightPosEye;
varying vec3 normalEye;
varying vec3 vertEye;
void main() {
/*Calculate normal matrix*/
vec4 normal = vec4(vNormal, 0.0);
normalEye = normalize(vec3(normalMatrix * normal));
lightsEnabled = lightingEnabled;
lightPosEye = vec3(mViewMatrix * vec4(0.0, 0.0, 3.0, 1.0));
vertEye = vec3(modelViewMatrix * vec4(vPosition, 1.0));
gl_Position = mMVPMatrix * vec4(vPosition, 1.0);
}
/*Fragment shader*/
precision mediump float;
/*uniform vec4 vColor; */
varying float lightsEnabled;
varying vec3 lightPosEye;
varying vec3 normalEye;
varying vec3 vertEye;
void main() {
/*Light output components*/
vec3 Ia;
vec3 Id;
/*light source components*/
vec3 La = vec3(0.5);
vec3 Ld = vec3(1.0);
/*vec3 Ls = vec3(1.0);*/
vec3 Ka = vec3(0.3); /*ambient reflectance term*/
vec3 Kd = vec3(1.0); /*diffuse term*/
/*ambient light term*/
Ia = La * Ka;
float dotProd;
vec3 lightToSurface;
if(lightsEnabled > 0.5){
/*diffuse light term*/
lightToSurface = normalize(lightPosEye - vertEye);
dotProd = dot(lightToSurface, normalEye);
dotProd = max(dotProd, 0.0);
}
else {
dotProd = 1.0;
}
Id = Ld * Kd * dotProd;
gl_FragColor = vec4(Ia + Id, 1.0);
}
I know its been a while, but I've found the solution. The problem was this line in my vertex shader:
normalEye = normalize(vec3(normalMatrix * normal));
I changed it to:
normalEye = normalize(vec3(modelViewMatrix * normal));
and everything works fine, although I have no idea why the second line works when the first one should. (For rigid transforms, i.e. rotation and translation with at most uniform scale, the inverse-transpose of the modelview matrix has the same rotational part as the modelview matrix itself, up to a scale factor that the normalize removes, so both lines give the same normal in that case.)
