I am interested in creating a shader effect similar to that of the game (don't shoot me for using this example) Animal Crossing. As you move forward and backward along the terrain, the world "curves" giving the sense of being on a round surface. The thing is, I want to apply this kind of effect to a 2D side-scroller.
Imagine a game like Terraria where both ends of the screen (left and right sides) are slightly bent downward to give the illusion of curvature.
I have tried to find an article explaining how to achieve such an effect, but I haven't found much in the way of helpful direction. I know this isn't the most organized or well-put question, but any help would be appreciated.
Although I am not a fan of answering my own questions, I think I have found a way to achieve this effect and would like to share it. By setting up a basic vertex shader, I was able to shift each vertex along the y-axis depending on how far it was from the center of the screen (the origin in my case). I originally used a linear absolute-value equation to see how it worked, and I got something like this:
This is obviously a strange effect, but it is getting very close to what I want to achieve. I figured I would also try leveling the effect out by dividing the absolute value of each vertex's x-offset from the origin by some scalar. I started with 32, and the result was much more reasonable:
As you can see, there is only a slight bend in the terrain.
This is all nice, but it isn't a "curve" yet; it's just an upside-down 'V' with a bit of squashing. From here, it was easy to get a nice curve by using a parabola and flattening it out in the same fashion. The result was this:
The result was a very nice curve whose intensity I could tune however I wanted. I also tried the graph of a third-degree equation, but it gave more of a try-hard 3D feel. I suppose I could use the graph of a circle to get the exact curvature for a planet of a specified radius, but I am satisfied for now.
The code turned out to be only a few lines long. Here is the GLSL code for the vertex shader:
#version 120 // compatibility-profile GLSL: varying, gl_Vertex and
             // gl_ModelViewProjectionMatrix are not available in core 150

varying vec4 vertColor; // passes the color data through to the fragment shader

uniform float tx; // set by the Java program so the terrain curvature stays
                  // synced with the player's movement; it is usually updated
                  // in the "lookThroughCamera" method where you handle the
                  // game world's translation when the player moves

void main(void) {
    // New y-value for the vertex: take the original y and subtract a
    // flattened parabola centered on the player. (x + tx) is the vertex's
    // offset from the player on the x-axis; dividing by 64 before squaring
    // levels the curve out enough that it looks acceptable.
    float v0 = gl_Vertex.y - pow((gl_Vertex.x + tx) / 64.0, 2.0);

    // Rebuild the position with the displaced y-value
    vec4 pos = vec4(gl_Vertex.x, v0, gl_Vertex.z, gl_Vertex.w);

    // Transform by the modelview and projection matrices as usual
    gl_Position = gl_ModelViewProjectionMatrix * pos;

    vertColor = gl_Color.argb; // color pass-through (swizzled for this engine)
}
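For reference, the Java side only has to update the tx uniform once per frame while the shader program is bound. A minimal sketch assuming LWJGL-style bindings; the names program and cameraX are placeholders for whatever your engine uses:

import static org.lwjgl.opengl.GL20.*;

public class CurvatureUniform {
    private final int txLocation;

    public CurvatureUniform(int program) {
        // look the uniform up once, after the shader program is linked
        txLocation = glGetUniformLocation(program, "tx");
    }

    // call once per frame while the program is bound, e.g. from the
    // "lookThroughCamera" method, with the camera's x translation
    public void update(float cameraX) {
        glUniform1f(txLocation, cameraX);
    }
}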
EDIT: This approach does have a serious issue, though. All vertical lines remain vertical, because they are not shifted along the x-axis accordingly. This is shown in the following image:
This gives an extremely unnatural look, and I have yet to come up with a proper solution to this.
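For anyone experimenting with this: one candidate fix (untested, and not part of the approach above) is to map vertices onto a circular arc instead of only displacing y, so that x shifts as well and vertical lines tilt toward the circle's center. A sketch of the math in Java; radius is a hypothetical planet radius, and for small angles this reduces to the flattened parabola used above (radius = 2048 matches the division by 64):

// Treat the x-axis as arc length on a circle of the given radius, so a
// column of vertices with the same x rotates as a unit toward the center.
static float[] curve(float x, float y, float tx, float radius) {
    float angle = (x + tx) / radius; // angle subtended by the arc-length offset
    float newX = (float) ((radius + y) * Math.sin(angle)) - tx;
    float newY = (float) ((radius + y) * Math.cos(angle)) - radius;
    return new float[] { newX, newY };
}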
Related
The following problem I'm working on is for one of my favorite pastimes: game development.
Problem:
We're in 3D space. I'm trying to determine whether a line between two vectors in that space passes through a circle; the latter consists of a center vector, a radius, and yaw & pitch.
To determine that, my aim is to convert the circle into a plane, which can either be infinite or just have the diameter of the circle for all its sides.
Should the line between the two vectors in fact pass through that plane, I am left with the simple task of determining whether the intersection point is within the radius of the circle, in which case I can return either true or false.
What's already working:
I have my circles set up and the general framework is there. The circles are rendered in the 3D space exactly as specified, great!
What was already tried:
I copied some GitHub gist code and tried to make it work for my purposes. It kinda worked, sometimes at least. Unfortunately, due to the way the code was written, I had no idea what it was doing, so I just scrapped all of that.
I researched the topic a lot, too. But since I don't really understand the language people use when talking about line/plane intersections, I could have read the answer without recognizing it as such.
Question:
I'm stuck at line intersections: no idea where to go or how it works logically! So, where do I go from here, and how can one comprehend all of this?
Note:
I did tag this issue as "java", but I'm not looking for spoon-fed code. It's a logical issue I'm trying to get past. If it is explained well enough, I will make the code work through trial and error!
Say your circle is a circle in the XY plane with its centre at (0, 0, 0) and radius 1. How would you solve that?
You would check the values of X and Y where Z equals zero: the line passes through the circle if X squared plus Y squared is less than 1 (the radius squared).
In other words, you can transform the 3D coordinates into a simpler reference frame. So I think you need to learn transformation of 3D coordinates, which is really not too hard to do. You need to rotate the 3D space around until the circle's normal only has a Z component and yaw and pitch are zero, and then offset the coordinates so the circle centre is at (0, 0, 0). Then apply the same transformation to the line. You could lastly scale by the radius, but to be honest that is not so important, since the circle math is easy.
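Since you asked for the logic rather than finished code, treat the following bare sketch in plain Java as a map of the steps, not a drop-in solution. The yaw/pitch convention (yaw around Y, pitch around X, circle initially in the XY plane) is an assumption; adjust it to match how you render your circles.

final class CircleLineTest {

    /** True if the segment from a to b passes through the circle. */
    static boolean intersects(double ax, double ay, double az,
                              double bx, double by, double bz,
                              double cx, double cy, double cz,
                              double radius, double yaw, double pitch) {
        // Step 1: translate so the circle center is at the origin,
        // then undo the circle's rotation
        double[] a = unrotate(ax - cx, ay - cy, az - cz, yaw, pitch);
        double[] b = unrotate(bx - cx, by - cy, bz - cz, yaw, pitch);

        // Step 2: find where the segment crosses the circle's plane (z == 0)
        double dz = b[2] - a[2];
        if (dz == 0) return false;        // segment parallel to the plane
        double t = -a[2] / dz;
        if (t < 0 || t > 1) return false; // crossing lies outside the segment

        // Step 3: is the crossing point within the radius? (no sqrt needed)
        double x = a[0] + t * (b[0] - a[0]);
        double y = a[1] + t * (b[1] - a[1]);
        return x * x + y * y <= radius * radius;
    }

    // Undo the circle's yaw (around Y), then its pitch (around X)
    private static double[] unrotate(double x, double y, double z,
                                     double yaw, double pitch) {
        double cy = Math.cos(-yaw), sy = Math.sin(-yaw);
        double x1 = x * cy + z * sy;
        double z1 = -x * sy + z * cy;
        double cp = Math.cos(-pitch), sp = Math.sin(-pitch);
        double y2 = y * cp - z1 * sp;
        double z2 = y * sp + z1 * cp;
        return new double[] { x1, y2, z2 };
    }
}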
I am using a lookAt matrix calculated by JOML, a free, open-source math library for LWJGL, for the free cam in my game. It works well when rotating left and right, but looking up and down causes major distortion, similar to heavily increasing the FOV.
Looking straight forward:
But when looking up:
And when looking down:
I haven't been able to find anyone with a similar error, and no one using JOML has reported this. I'm not the best at matrix math, so all my attempts at calculating my own lookAt matrix failed.
If someone could show a working lookAt matrix using JOML, or point out any of my (most likely several) possible errors, that would be much appreciated. Thank you.
Well, the lookAt code provided by that library is just this (I'm leaving the actual source code out and keeping only the comments, as they nicely explain the steps being done):
public final static void lookAt(Vector3f position, Vector3f centre, Vector3f up, Matrix4f dest) {
// Compute direction from position to lookAt
// Normalize direction
// Normalize up
// right = direction x up
// up = right x direction
// Set matrix elements
}
And this code is just wrong. Interestingly, I've seen this mistake before: it is the same error that the "official" gluLookAt() manpage still contains (the actual GLU implementations do not have the error; just the documentation is wrong).
What this code does is build an orthonormal basis. The problem is that the up vector is normalized before the cross product that calculates right. The assumption seems to be that the cross product of two unit-length vectors is again a unit-length vector, but that is a common misconception. What actually holds is just:
length(cross(a, b)) == length(a) * length(b) * sin(alpha)
where alpha is the angle between a and b. So the unit-length assumption only holds if the vectors are already orthogonal; two unit vectors 45 degrees apart, for example, yield a cross product of length sin(45°) ≈ 0.707. As the vectors are never re-normalized after the cross product, the resulting basis is not orthonormal and introduces some non-uniform scaling. The lookAt code assumes that the inverse rotation can be calculated via the transposed matrix, which completely fails in this case.
The distortion you see gets more severe as the angle between the viewing direction and your up vector moves away from 90 degrees.
The correct way to deal with this is simply to normalize at a different point: don't normalize the up vector before the cross product, normalize its result instead. Then you have two unit-length vectors orthogonal to each other, and the second cross product will also work as expected. So the actual lookAt function should be:
// Compute direction from position to lookAt
// Normalize direction
// right = direction x up
// Normalize right
// up = right x direction
// Set matrix elements
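For completeness, a minimal sketch of that corrected basis construction using JOML's own vector operations; the matrix write-out is left as a comment, as in the outline above:

import org.joml.Matrix4f;
import org.joml.Vector3f;

static void lookAt(Vector3f position, Vector3f centre, Vector3f up, Matrix4f dest) {
    // Compute and normalize the direction from position to centre
    Vector3f dir = new Vector3f(centre).sub(position).normalize();
    // right = direction x up -- normalized AFTER the cross product
    Vector3f right = new Vector3f(dir).cross(up).normalize();
    // up = right x direction: two orthogonal unit vectors, so the result
    // is already unit length and needs no further normalization
    Vector3f newUp = new Vector3f(right).cross(dir);
    // Set matrix elements in dest from right, newUp and dir as before
}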
I have this camera that is set up with vecmath.lookatMatrix(eye, center, up).
The movement works fine: forwards, backwards, right and left all behave as expected.
What does not seem to work fine is the rotation.
I am not really good at math, so I assume I may be missing some logic here, but I thought the rotation would work like this:
On rotation around the Y-axis I add/sub a value to the X value of the center vector.
On rotation around the X-axis I add/sub a value to the Y value of the center vector.
For example here is rotation to the right: center = center.add(vecmath.vector(turnSpeed, 0, 0))
This actually works, but with some strange behaviour: the higher the x/y value of the center vector gets, the slower the rotation. I guess that through the addition/subtraction the center vector moves too far away, or something similar; I would really like to know what is actually happening.
Actually, while writing this I realized it can't work like this anyway, because once I have moved around and rotated a bit, and am, for example, in "mid air", the rotation would be wrong...
I really hope someone can help me here.
Rotating a vector for OpenGL should be done using matrices. Linear movement can be executed by simply adding vectors together, but for rotation it is not enough to change just one of the coordinates; if that were the case, how would you get from the (X,0,0) direction to (0,X,0)?
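To make that concrete, here is a minimal sketch of a yaw rotation done properly: rotate the view direction (center - eye) around the world Y axis and rebuild center from it. The vecmath accessors (getX() and friends) are assumptions; substitute whatever your library provides.

double dx = center.getX() - eye.getX();
double dz = center.getZ() - eye.getZ();
double cos = Math.cos(turnSpeed), sin = Math.sin(turnSpeed);
double rx = dx * cos - dz * sin; // standard 2D rotation in the XZ plane
double rz = dx * sin + dz * cos;
center = vecmath.vector(eye.getX() + rx, center.getY(), eye.getZ() + rz);

Because this rotates by a fixed angle instead of adding a fixed offset, the turn speed no longer depends on how far center has drifted from eye, which is exactly the slowdown described in the question.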
Here is another tutorial, which is C++, but there are Java samples too.
There is a bit of math behind all this - you seem to be familiar with vectors, and probably have a 'feel' of them, which helps.
EDIT - if you are to use matrices in OpenGL properly, you'll need to familiarize yourself with the MVP concepts. You have something to display (the model) which is placed somewhere in your world (view) at which you are looking through a camera (projection).
A working solution for a free-flight camera with eye, center and up vectors was posted here:
Free Flight Camera - strange rotation around X-axis
I've been wanting to write this for some time now. As a project for university, I wrote (with a friend) a game that needed good explosion & particle effects. We encountered some problems, which we solved quite elegantly (I think), and I'd like to share the knowledge.
OK then, so we found this tutorial: Make a Particle Explosion Effect, which seemed easy enough to implement using Java with JOGL. Before I explain how exactly we implemented it, I'll describe how rendering is done:
Camera: just an orthonormal basis, which basically means it contains 3 normalized orthogonal vectors plus a 4th vector representing the camera position. Rendering is done using gluLookAt:
glu.gluLookAt(cam.getPosition().getX(), cam.getPosition().getY(), cam.getPosition().getZ(),
cam.getZ_Vector().getX(), cam.getZ_Vector().getY(), cam.getZ_Vector().getZ(),
cam.getY_Vector().getX(), cam.getY_Vector().getY(), cam.getY_Vector().getZ());
such that the camera's z vector is actually the target, the y vector is the "up" vector, and position is, well... the position.
So (to put it in question form): how do you implement a good particle effect?
P.S: all the code samples and in-game screenshots (both in answer & question) are taken from the game, which is hosted here: Astroid Shooter
OK then, let's look at how we first approached the implementation of the particles. We had an abstract class Sprite, which represented a single particle:
protected void draw(GLAutoDrawable gLDrawable) {
// each sprite has a different blending function.
changeBlendingFunc(gLDrawable);
// getting the quad as an array of length 4, containing vectors
Vector[] bb = getQuadBillboard();
GL gl = gLDrawable.getGL();
// getting the texture
getTexture().bind();
// getting the colors
float[] rgba = getRGBA();
gl.glColor4f(rgba[0],rgba[1],rgba[2],rgba[3]);
//draw the sprite on the computed quad
gl.glBegin(GL.GL_QUADS);
gl.glTexCoord2f(0.0f, 0.0f); gl.glVertex3d(bb[0].x, bb[0].y, bb[0].z);
gl.glTexCoord2f(1.0f, 0.0f); gl.glVertex3d(bb[1].x, bb[1].y, bb[1].z);
gl.glTexCoord2f(1.0f, 1.0f); gl.glVertex3d(bb[2].x, bb[2].y, bb[2].z);
gl.glTexCoord2f(0.0f, 1.0f); gl.glVertex3d(bb[3].x, bb[3].y, bb[3].z);
gl.glEnd();
}
Most of the method calls here are pretty much self-explanatory; no surprises. The rendering is quite simple: in the display method we first draw all the opaque objects, then take all the Sprites, sort them by squared distance from the camera, and draw the particles so that those farther from the camera are drawn first. But the real thing we have to look deeper into here is the method getQuadBillboard. Each particle has to "sit" on a plane that is perpendicular to the direction toward the camera, like here:
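The sort itself is just a comparator on squared distance from the camera (no square root needed for ordering); a minimal sketch, with the Sprite and Vector accessors assumed:

Collections.sort(sprites, new Comparator<Sprite>() {
    @Override public int compare(Sprite a, Sprite b) {
        // farther from the camera first, so sort descending by distance
        return Double.compare(distSq(b.getPosition()), distSq(a.getPosition()));
    }
    private double distSq(Vector p) {
        double dx = p.getX() - cam.getPosition().getX();
        double dy = p.getY() - cam.getPosition().getY();
        double dz = p.getZ() - cam.getPosition().getZ();
        return dx * dx + dy * dy + dz * dz;
    }
});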
The way to compute a perpendicular plane like that is not hard (a code sketch follows the list):
Subtract the particle position from the camera position to get a vector perpendicular to the plane, and normalize it so it can be used as the plane's normal. A plane is defined tightly by a normal and a position, which we now have (the particle position is a point the plane goes through).
Compute the "height" of the quad by normalizing the projection of the camera's Y vector onto the plane. You get the projected vector by computing: H = cam.Y - normal * (cam.Y dot normal)
Create the "width" of the quad by computing W = H cross normal
Return the 4 points/vectors: {position+H+W, position+H-W, position-H-W, position-H+W}
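A sketch of those four steps in Java, assuming immutable Vector helpers (sub, normalize, dot, scale, cross, add); halfSize is a hypothetical world-space particle size, since the steps above produce unit-length vectors:

// plane normal: from the particle toward the camera
Vector normal = camPos.sub(particlePos).normalize();
// "height": camera Y projected onto the plane, normalized, then sized
Vector H = camY.sub(normal.scale(camY.dot(normal))).normalize().scale(halfSize);
// "width": H x normal is already the same length as H (they are orthogonal)
Vector W = H.cross(normal);
// the four corners of the quad billboard
Vector[] quad = {
    particlePos.add(H).add(W),
    particlePos.add(H).sub(W),
    particlePos.sub(H).sub(W),
    particlePos.sub(H).add(W),
};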
But not all sprites act like that; some are not perpendicular. For instance, the shockwave ring sprite, or the flying sparks/smoke trails:
So each sprite had to supply its own unique "billboard". BTW, the computation of the smoke trails & flying sparks was a bit of a challenge as well. We created another abstract class, which we called LineSprite. I'll skip the explanations here; you can see the code here: LineSprite.
Well, this first try was nice, but there was an unexpected problem. Here's a screenshot that illustrates it:
As you can see, the sprites intersect with each other. If we look at 2 sprites that intersect, part of the 1st sprite is behind the 2nd sprite, and another part of it is in front of the 2nd sprite. This resulted in some weird rendering, where the lines of the intersection are visible. Note that even if we disabled glDepthMask when rendering the particles, the result would still show the lines of intersection, because of the different blending that takes place in each sprite. So we had to somehow keep the sprites from intersecting. The idea we had was really cool.
You know all that really cool 3D street art?
here's an image that emphasizes the idea:
We thought the idea could be implemented in our game, so the sprites won't intersect each other. Here's an image to illustrate the idea:
Basically, we made all the sprites lie on parallel planes, so no intersection could take place. And it did not affect what you actually see, since that stayed the same: from every other angle it would look stretched, but from the camera's point of view it still looked great. So, for the implementation:
Given the 4 vectors representing a quad billboard and the position of the particle, we need to output a new set of 4 vectors representing the original quad billboard, projected onto a common plane. The idea of how to do this is explained very well here: Intersection of a plane and a line. We have the "line", defined by the camera position and each of the 4 vectors. We have the plane, since we can use our camera's Z vector as the normal and the position of the particle as a point on it. A small change is also needed in the comparison function for sorting the sprites: it should now use the homogeneous matrix defined by our camera's orthonormal basis, and the computation is actually as easy as: cam.getZ_Vector().getX()*pos.getX() + cam.getZ_Vector().getY()*pos.getY() + cam.getZ_Vector().getZ()*pos.getZ();. One more thing to notice: if a particle is outside the camera's viewing angle, i.e. behind the camera, we don't want to see it, and especially we don't want to compute its projection (that could result in some very weird and psychedelic effects...).
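A sketch of that projection for a single billboard corner, again with immutable Vector helpers assumed:

// Intersect the line (camera -> corner) with the plane that passes
// through the particle and has the camera's Z vector as its normal.
static Vector projectOntoPlane(Vector corner, Vector camPos,
                               Vector camZ, Vector particlePos) {
    Vector ray = corner.sub(camPos);     // line direction, camera to corner
    double denom = camZ.dot(ray);
    if (denom <= 0) return null;         // behind the camera: skip this particle
    double t = camZ.dot(particlePos.sub(camPos)) / denom;
    return camPos.add(ray.scale(t));     // the intersection point
}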
And all that's left is to show the final Sprite class.
The result is quite nice:
Hope it helps; I'd love to get your comments on this "article" (or on the game :}, which you can explore, fork, and use however you want...)
I'm working on an Android game and would like to implement a 2D grid to visualize the effects of gravity on the playing field. I'd like to distort the grid based on various objects on my playing field. The effect I'm looking for is similar to the following from the Processing library:
Except that my grid will be simpler: 2D, and viewed strictly from the top, as if looking down at the playfield.
Can someone point me to an algorithm for drawing such a grid?
The one idea I came up with was to draw the lines as if they were "particles": start at one end of the screen and draw each line in multiple segments, treating each segment as a particle and calculating the effect of gravity at each segment's location.
The application is intended to run on Android.
Thanks
I would draw each line as separate segments, as you mentioned. If the grid is sparse, this might be fastest.
If you are viewing the grid from above, you need to calculate x and y coordinate displacements. The easiest way is to actually do the displacement along the z axis and then fake the perspective with x_result = x/z and y_result = y/z. Set z = 1 and make sure to vary it only slightly (by ±0.1, for instance).
Your z should be proportional to the sum of 1/(distance to the sphere)^2 over all spheres. This simulates how gravity works: it tapers off with the square of the distance. Great news: the square of the distance is just delta_x^2 + delta_y^2, so you save yourself the square root calculation == faster.
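Putting those two paragraphs together, a sketch of displacing a single grid point; masses is a hypothetical array of {x, y, strength} triples, and strength should be tuned so z stays within roughly ±0.1 of 1:

static float[] displace(float x, float y, float[][] masses) {
    float z = 1.0f;
    for (float[] m : masses) {
        float dx = x - m[0], dy = y - m[1];
        float d2 = dx * dx + dy * dy + 1e-4f; // squared distance; epsilon avoids /0
        z += m[2] / d2;                       // gravity tapers off with distance^2
    }
    return new float[] { x / z, y / z };      // fake the perspective
}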