I have the following shaders:
My fragment shader:
#version 110
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
void main() {
    gl_FragColor = vec4(1, 0, 0, 1);
}
And my vertex shader:
#version 110
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
attribute vec3 vertex;
void main() {
    vec4 world = modelMatrix * vec4(vertex, 1);
    vec4 camera = world * viewMatrix;
    gl_Position = projectionMatrix * world;
}
They both compile and link just fine. When I print out my active uniforms I get
projectionMatrix
modelMatrix
but no viewMatrix. When I try to get the uniforms with glGetUniformLocation, I can get projectionMatrix, modelMatrix, and my vertex attribute just fine, so why is viewMatrix inactive?
The problem lies in the last line of your vertex shader:
gl_Position = projectionMatrix * world;
You probably meant projectionMatrix * camera. Otherwise, the GLSL compiler sees that camera isn't being used and optimizes it away, which means that viewMatrix is no longer being used either. Unused uniforms are not considered active, which leads to your predicament.
Note: Your viewing transform is also backwards. You probably want vec4 camera = viewMatrix * world.
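If you want to confirm this from the application side, note that glGetUniformLocation returns -1 for uniforms that were optimized away. A minimal Java sketch (assuming LWJGL-style bindings, since the question doesn't name one):
import static org.lwjgl.opengl.GL20.glGetUniformLocation;

// Returns true if the named uniform survived linking as an active uniform.
// Uniforms the GLSL compiler optimizes away are not active, so
// glGetUniformLocation reports -1 for them.
static boolean isUniformActive(int program, String name) {
    return glGetUniformLocation(program, name) != -1;
}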
Related
I currently have a default sprite batch:
batch = new SpriteBatch();
I have read about using shaders to modify how the batch draws each sprite. In my game, I am trying to create a 'nighttime' effect: I want every pixel on screen to be black, except for pixels that are already white, and for the white pixels I want a shade of blue. Obviously I am new to libgdx and OpenGL; can anyone who is familiar with blending or shaders help me out with this? What should I do to my SpriteBatch to achieve the effect that I am describing?
The effect you would like to achieve could be done with something like this.
Vertex Shader
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main()
{
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}
Fragment shader
precision mediump float;
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
bool equals(float a, float b) {
    return abs(a - b) < 0.0001;
}

bool isWhiteShade(vec4 color) {
    return equals(color.r, color.g) && equals(color.r, color.b);
}
void main() {
    vec4 color = texture2D(u_texture, v_texCoords) * v_color;
    if (isWhiteShade(color)) {
        color *= vec4(0, 0, 1, 1);
    } else {
        color *= vec4(0, 0, 0, 1);
    }
    gl_FragColor = color;
}
Add them to the assets folder, then pass them as arguments when creating the ShaderProgram instance, and of course don't forget to apply the shader program to your SpriteBatch. To be honest, the effect has (almost) nothing to do with SpriteBatch itself; all you have to do is apply the created shader:
batch.setShader(shader);
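For example, a minimal setup might look like this (a sketch; the file names night.vert and night.frag are assumptions):
ShaderProgram.pedantic = false; // don't fail on unused uniforms/attributes
ShaderProgram nightShader = new ShaderProgram(
        Gdx.files.internal("night.vert"),   // the vertex shader above
        Gdx.files.internal("night.frag"));  // the fragment shader above
if (!nightShader.isCompiled()) {
    Gdx.app.error("Shader", nightShader.getLog());
}
batch.setShader(nightShader); // affects all subsequent batch.draw calls
Calling batch.setShader(null) afterwards restores the default shader.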
The topic of shaders is very wide (and, to be honest, independent of libgdx), and there is no simple answer to your question without more information about what the shader should do. Try reading something about shaders first and come back to SO when you run into concrete trouble.
You can start right here:
https://github.com/libgdx/libgdx/wiki/Shaders
I am following a tutorial by dermetfan on shaders, and although my code seems to be pretty much identical, it does not do what is intended. No compile errors occur and no logs are written.
ShaderProgram.pedantic = false;
shader = new ShaderProgram(Gdx.files.internal("shaders/red.vsh"), Gdx.files.internal("shaders/red.fsh"));
System.out.println(shader.isCompiled() ? "Compiled shader!" : shader.getLog());
batch.setShader(shader);
and this is the .vsh:
attribute vec4 a_color;
attribute vec3 a_position;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoord0;
void main(void) {
    v_color = a_color;
    v_texCoord0 = a_texCoord0;
    gl_Position = u_projTrans * vec4(a_position, 1.);
}
and here's the .fsh:
varying vec4 v_color;
varying vec2 v_texCoord0;
uniform sampler2D u_sampler2D;
void main() {
    gl_FragColor = texture2D(u_sampler2D, v_texCoord0) * v_color;
    gl_FragColor = vec4(1., 0., 0., 1.);
}
If I get rid of the shader, my texture loads just fine; however, when I run with the shader, my textures seem to vanish.
EDIT
Please refrain from looking for errors in the code; ShaderProgram.pedantic = false is set, and I know for a fact my code is identical to the tutorial I am watching, yet his works and mine doesn't. Thank you for any help!
I have been writing a point light shader for my LWJGL + Java application, based on this tutorial. My problem is that when I "walk around" with the camera, the light moves as well. Also, when I rotate the sphere, the light rotates with it.
I believe the problem is in the vertex shader, but I have included the fragment shader as well, just in case.
Example 1 (No movement)
Example 2 (Moved Left and rotated the camera)
Vertex Shader
#version 330
in vec4 in_Position;
in vec3 in_Normal;
in vec2 in_TextureCoord;
uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;
uniform mat3 normal;
uniform vec4 light_Pos; //Set to 0, 3, 0, 1
out Data {
    vec3 normal;
    vec3 eye;
    vec3 lDir;
    vec2 st;
} Out;
void main(void) {
    vec4 vert = view * model * light_Pos;
    vec4 pos = model * view * in_Position;
    Out.normal = normalize(in_Normal);
    Out.lDir = vec3(vert - pos);
    Out.eye = vec3(-pos);
    Out.st = in_TextureCoord;
    gl_Position = projection * view * model * in_Position;
}
Fragment Shader
#version 330
uniform sampler2D texture_diffuse;
in Data {
    vec3 normal;
    vec3 eye;
    vec3 lDir;
    vec2 st;
} In;
out vec4 color;
void main(void) {
    vec4 diffuse = texture(texture_diffuse, In.st);
    vec4 spec = vec4(0.0);
    vec3 n = normalize(In.normal);
    vec3 l = normalize(In.lDir);
    vec3 e = normalize(In.eye);
    float i = max(dot(n, l), 0.0);
    if (i > 0.0) {
        vec3 h = normalize(l + e);
        float intSpec = max(dot(h, n), 0.0);
        spec = vec4(1) * pow(intSpec, 50); // 50 is the shininess
    }
    color = max(i * diffuse + spec, vec4(0.2));
}
I already tried the solution presented in this question, it did not solve my problem.
Just from a quick glance, it looks like you're multiplying the light's position by the view and model matrices:
vec4 vert = view * model * light_Pos;
This means that whenever you walk around or move the camera, you're changing the view matrix, which affects the light's position; likewise, when you move the sphere, you're changing the model matrix, which also affects the light's position.
In other words if you want the light to be stationary in relation to the world then don't transform it by any matrices.
The problem is that your normal, Out.lDir, and Out.eye are not in the same coordinate system. The normal is in your model's coordinates, while the other two are in eye space. Try passing the eye position as a uniform, in a similar way to light_Pos.
light_Pos and eye_Pos are then both in the world coordinate system. Now just calculate
vec4 pos = model * in_Position;
vec4 vert = light_Pos;
and
Out.eye = vec3(eye_Pos);
It should work. When performing spatial operations, always make sure that all points and vectors are in the same coordinate system.
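On the application side, uploading the camera position as that uniform might look like this (a sketch using LWJGL's GL20 bindings; program is the linked shader program id, eye_Pos must match the name declared in the shader, and camX/camY/camZ stand in for your camera's world-space position):
// Upload the camera's world-space position once per frame.
int eyeLoc = GL20.glGetUniformLocation(program, "eye_Pos");
GL20.glUniform4f(eyeLoc, camX, camY, camZ, 1.0f);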
I would like to create a shader to simulate a pseudo-3D water surface on a 2D scene built with libgdx. The idea is to recreate the following effect:
http://forums.tigsource.com/index.php?topic=40539.msg1104986#msg1104986
But I am stuck at creating the trapezoid shape; I don't think I understand how texture coordinates are computed in OpenGL shaders. Should I modify the vertices in the vertex shader, or should I displace the texture in the fragment shader?
Here is a test I have done, but it doesn't work as expected.
vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoord0;
void main()
{
    v_color = a_color;
    float factor = (a_texCoord0.x - 0.5) * (a_texCoord0.y);
    v_texCoord0 = a_texCoord0;
    v_texCoord0.x += factor;
    gl_Position = u_projTrans * a_position;
}
The fragment shader is just a pass-through:
varying vec4 v_color;
varying vec2 v_texCoord0;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoord0);
}
And the resulting image:
I am sure that my approach is too naive.
Finally, I decided to use only the fragment shader to achieve the effect.
The fake depth is computed by applying an X displacement that depends on the Y coordinate. Here is the fragment shader:
varying vec2 v_texCoord0;
uniform sampler2D u_texture;

void main()
{
    // simulate a 3D surface
    vec2 new_coord = v_texCoord0;
    // calculate the stretching factor
    float factor = (0.5 - new_coord.x) * new_coord.y;
    // apply the factor and stretch the image
    new_coord.x += factor;
    gl_FragColor = texture2D(u_texture, new_coord);
}
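Since the displacement should only affect the water, one way to apply it (a sketch; waterShader, waterRegion, x, and y are assumed names) is to swap the shader in just for that draw call:
batch.setShader(waterShader);  // the ShaderProgram built from the shader above
batch.draw(waterRegion, x, y); // draw only the water layer distorted
batch.setShader(null);         // back to the default shader for everything else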
I'm making a program with OpenGL ES 2.0. I need to render one texture on top of another, like a clock hand over a clock face. The textures are both 1024 x 1024 and have transparent areas. The transparency always renders as black, which prevents me from overlaying the clock hand texture on the clock.
simple_fragment_shader.glsl
precision mediump float;
varying vec4 v_Color;
void main()
{
    gl_FragColor = v_Color;
}
simple_vertex_shader.glsl
uniform mat4 u_Matrix;
attribute vec4 a_Position;
attribute vec4 a_Color;
varying vec4 v_Color;
void main()
{
    v_Color = a_Color;
    gl_Position = u_Matrix * a_Position;
    gl_PointSize = 10.0;
}
texture_fragment_shader.glsl
precision mediump float;
uniform sampler2D u_TextureUnit;
varying vec2 v_TextureCoordinates;
void main()
{
    gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);
}
texture_vertex_shader.glsl
uniform mat4 u_Matrix;
attribute vec4 a_Position;
attribute vec2 a_TextureCoordinates;
varying vec2 v_TextureCoordinates;
void main()
{
    v_TextureCoordinates = a_TextureCoordinates;
    gl_Position = u_Matrix * a_Position;
}
I'm kind of new to OpenGL; I have used textures before, but I don't know how to get transparency working.
Also, if this helps, I am sort of following the methods in OpenGL ES 2 for Android by Kevin Brothaler from The Pragmatic Programmers.
The alpha value is ignored unless blending is enabled with the following settings:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
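In the Android Java binding that the Brothaler book uses, the equivalent calls would be (a sketch; typically placed in onSurfaceCreated or right before drawing the transparent layer):
// Enable alpha blending so transparent texels of the clock hand let the
// clock face underneath show through instead of rendering as black.
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
Also make sure to draw the clock face first and the hand second, since blending combines each incoming fragment with what is already in the framebuffer.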
I hope this helps you :)