Changing existing GL texture size and pixel data - java

To make things short: a GL texture is allocated and created before my code runs (I cannot prevent the first texture from being created, but I know the GL id it is attached to). I need to redefine that existing texture using a buffered image that is twice the size of the old image. Does anybody know how to approach something like this? Based on some Google searches, it looks like I need to use glTexSubImage2D, but I'm not sure how.
Any help on the matter would be appreciated. Thanks.

In my understanding you cannot replace the texture in place; you have to allocate a new texture, and on the next render pass remap the texture id you bind when drawing.
OpenGL is too high-level to give you access to the previous texture's memory so you could edit it.
There is a tutorial on swapping the texture applied to the same shape:
http://nehe.gamedev.net/tutorial/playing_avi_files_in_opengl/23001/
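For illustration, here is a minimal sketch of that approach, assuming an LWJGL-style GL11 binding and that newImage is the doubled-size BufferedImage; the class and method names are placeholders, and the old texture id would simply stop being used (or be deleted with glDeleteTextures):
import java.awt.image.BufferedImage;
import java.nio.ByteBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;

public class TextureReplacer {

    // Allocates a brand new texture from the larger BufferedImage and returns
    // its id; the renderer should bind this id instead of the old one.
    public static int createTextureFrom(BufferedImage newImage) {
        int width = newImage.getWidth();
        int height = newImage.getHeight();

        // Copy the ARGB pixels into an RGBA ByteBuffer for glTexImage2D.
        int[] pixels = newImage.getRGB(0, 0, width, height, null, 0, width);
        ByteBuffer buffer = BufferUtils.createByteBuffer(width * height * 4);
        for (int pixel : pixels) {
            buffer.put((byte) ((pixel >> 16) & 0xFF)); // red
            buffer.put((byte) ((pixel >> 8) & 0xFF));  // green
            buffer.put((byte) (pixel & 0xFF));         // blue
            buffer.put((byte) ((pixel >> 24) & 0xFF)); // alpha
        }
        buffer.flip();

        int newId = GL11.glGenTextures();
        GL11.glBindTexture(GL11.GL_TEXTURE_2D, newId);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
        GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, width, height, 0,
                GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, buffer);
        return newId;
    }
}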

Related

JOGL Loading textures individually and using multiple at once

Alright, I couldn't find a good title for this, so I will explain in a bit more detail.
I am making a game using LWJGL and I have gotten some basic rendering done, but now I want to do something a bit more advanced.
Here is the situation:
I have a mesh (positions, normals, texture coords, indices) that I generate, and it can currently support one texture. That would be great if I had a single image containing all of the textures, but sadly that isn't the case: I have an individual image for each texture, and each needs to be loaded individually.
Now, I do see one way I could do this, but it doesn't seem practical or like a good use of memory:
-Load all the textures into one image and record where each one sits in that image, for use with the texture coords.
The textures should NOT blend together. Hard-coding anything is not an option, as I want modding to be easy to implement, and anywhere from 1 texture (best case) to 65,536+ textures (worst case) may be used in the same "mesh".
I am simply going to use a texture atlas, as doing anything else seems impractical. Thanks @httpdigest for the suggestion.
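For reference, a minimal sketch of the atlas lookup, assuming square tiles packed in a regular grid; the tile size, atlas size, and method name are made up for illustration:
public class AtlasCoords {

    // Computes the UV rectangle of one tile inside a grid-packed atlas.
    // atlasSize and tileSize are in pixels; index counts tiles row by row.
    public static float[] atlasUV(int index, int atlasSize, int tileSize) {
        int tilesPerRow = atlasSize / tileSize;
        int col = index % tilesPerRow;
        int row = index / tilesPerRow;
        float step = (float) tileSize / atlasSize;
        float u = col * step;
        float v = row * step;
        // {u0, v0, u1, v1} — plug these into the mesh's texture coordinates.
        return new float[] { u, v, u + step, v + step };
    }
}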

Java2D best way to pick various image with Alpha background overlapped?

I am developing an isometric game in Java2D, i.e., I do not have direct access to hardware pixel shaders (real-time software pixel shaders aren't practical, although I can do a single pass over every entity texture without a noticeable performance hit).
I know the typical method would be to somehow encode the depth of the individual pixels into a depth buffer and look that up. However, I don't know how I can do that efficiently in Java2D. How would I store the depth buffer? How would I filter out the alpha in an image? Etc.
Up until now I have just been reversing the projection matrix I use to calculate the tile coordinates. However, that doesn't work well when you have entities that render outside of those tiles' bounds.
Another method I considered was using a color-map, however I have the same problems with this as I do with the depth buffer (and if I can get the depth buffer working I'd much rather use that.)
I've resolved this quite nicely. The solution is actually very simple, just unconventional.
The graphics are depth-sorted via a TreeMap and then rendered to the screen. One can simply traverse this TreeMap in reverse (keeping it around until the next render cycle) to translate the cursor location to the proper image it falls over, by testing the pixels in reverse render order and checking whether they are transparent.
The solution is in the open-source project, in the pick method of the io.github.jevaengine.world.World class: https://github.com/JeremyWildsmith/JevaEngine/blob/master/jevaengine/src/main/java/io/github/jevaengine/world/World.java
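A minimal sketch of that reverse-traversal pick, assuming each depth-sorted entry keeps its screen-space bounds and the BufferedImage that was drawn there; the class and field names here are illustrative, not the engine's actual API:
import java.awt.Point;
import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.util.Map;
import java.util.TreeMap;

public class IsometricPicker {

    // One rendered sprite: its screen bounds and the image drawn at those bounds.
    public static class Sprite {
        Rectangle bounds;
        BufferedImage image; // same size as bounds
    }

    // Walks the depth-sorted map from front to back and returns the first
    // sprite with a non-transparent pixel under the cursor.
    public static Sprite pick(TreeMap<Float, Sprite> depthSorted, Point cursor) {
        for (Map.Entry<Float, Sprite> entry : depthSorted.descendingMap().entrySet()) {
            Sprite sprite = entry.getValue();
            if (!sprite.bounds.contains(cursor)) {
                continue;
            }
            int x = cursor.x - sprite.bounds.x;
            int y = cursor.y - sprite.bounds.y;
            int alpha = (sprite.image.getRGB(x, y) >> 24) & 0xFF;
            if (alpha != 0) {
                return sprite; // cursor is over an opaque pixel of this sprite
            }
        }
        return null; // cursor falls only on transparent pixels or empty space
    }
}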

LibGDX FrameBuffer scaling

I'm working on a painting application using the LibGDX framework, and I am using their FrameBuffer class to merge what the user draws onto a solid texture, which is what they see as their drawing. That aspect works just fine; however, the area the user can draw on isn't always going to be the same size, and I am having trouble getting it to display properly at resolutions other than that of the entire window.
I have tested this very extensively, and what seems to be happening is that the FrameBuffer creates the texture at the same resolution as the window itself and then simply stretches or shrinks it to fit the actual area it is meant to be in, which is a very unpleasant effect for any drawing larger or smaller than the window.
I have verified, at every single step of my process, that I am never doing any of this stretching myself, and that everything is being drawn how and where it should, with the right dimensions and locations. I've also looked into the FrameBuffer class itself to try to find the answer, but strangely found nothing in there either; given all of the testing I've done, though, it seems to be the only possible place this issue could be coming from.
I am simply completely out of ideas, having spent a considerable amount of time trying to troubleshoot this problem.
Thank you so much Synthetik for finding the core issue. Here is the proper way to fix the situation you allude to. (I think!)
The way to make the frame buffer produce a correctly scaled texture with the right ratio, regardless of the actual device window size, is to set the projection matrix to the size required, like so:
SpriteBatch batch = new SpriteBatch();
Matrix4 matrix = new Matrix4();
matrix.setToOrtho2D(0, 0, 480, 800); // here is the actual size you want
batch.setProjectionMatrix(matrix);
I believe I've solved my problem, and I will give a very brief overview of what the problem is.
Basically, the cause of this issue lies within the SpriteBatch class. Specifically, assuming I am not using an outdated version of the class, the problem lies on line 181, where the projection matrix is set. The line:
projectionMatrix.setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
This causes everything to be drawn at the scale of the window/screen and then stretched to fit where it needs to go afterwards. I am not sure whether there is a more "proper" way to handle this, but I simply created another method within the SpriteBatch class that allows me to call this again with my own dimensions, and I call that when necessary. Note that it isn't required on every draw or anything like that, only once, or any time the dimensions may change.
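Putting the two pieces together, a minimal LibGDX sketch, assuming the drawing area is 480x800 (the sizes are placeholders) and the usual com.badlogic.gdx.graphics imports (FrameBuffer, Pixmap, Matrix4, SpriteBatch, Texture):
// Frame buffer sized to the drawing area, not the window.
FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGBA8888, 480, 800, false);

// Projection matrix that matches the frame buffer, so nothing is stretched.
Matrix4 fboProjection = new Matrix4();
fboProjection.setToOrtho2D(0, 0, 480, 800);

SpriteBatch batch = new SpriteBatch();

fbo.begin();
batch.setProjectionMatrix(fboProjection);
batch.begin();
// ... draw the user's strokes here in 480x800 coordinates ...
batch.end();
fbo.end();

// When drawing the result to the screen, switch back to a window-sized projection.
Texture drawing = fbo.getColorBufferTexture();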

How can I do off screen rendering using LWJGL?

I'm trying to find a way to do off-screen rendering with LWJGL. What I want to do is render something and keep it in memory as a texture, then at a later point use it to texture a shape I'm drawing in the main window. I'm pretty sure this should be done using a Frame Buffer Object, but I haven't been able to find any useful documentation online. I'm fairly new to OpenGL and LWJGL, so I'm sure there is some fundamental concept I'm missing.
Could someone possibly provide a simple example that renders something (I don't really care what) off screen to a texture? Ideally I would like to end up with a slick-util Texture object.
Create a frame buffer object and bind it as the primary render target. Here is a tutorial:
http://www.gamedev.net/page/resources/_/technical/opengl/opengl-frame-buffer-object-101-r2331
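A minimal LWJGL sketch of that setup, assuming a GL30-capable context; the texture format and filter choices are placeholders, and error checking (glCheckFramebufferStatus) is omitted:
import java.nio.ByteBuffer;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL30;

public class OffscreenTarget {

    public final int textureId;
    public final int fboId;

    // Creates an empty texture and attaches it to a new frame buffer object.
    public OffscreenTarget(int width, int height) {
        textureId = GL11.glGenTextures();
        GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
        GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA8, width, height, 0,
                GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, (ByteBuffer) null);

        fboId = GL30.glGenFramebuffers();
        GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fboId);
        GL30.glFramebufferTexture2D(GL30.GL_FRAMEBUFFER, GL30.GL_COLOR_ATTACHMENT0,
                GL11.GL_TEXTURE_2D, textureId, 0);
        GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, 0);
    }

    // Bind before rendering the off-screen content, unbind to return to the window.
    public void begin() { GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fboId); }
    public void end()   { GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, 0); }
}
Afterwards, textureId can be bound like any other texture when drawing the shape in the main window.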

Render string to texture in Android and OpenGL ES

I've googled around everywhere, but cannot find much for rendering strings to textures and then displaying that texture on a quad on the screen. Can someone provide a run-down on the process or provide good resources that describe how? Is rendering strings to textures even the best method for displaying text in an Android OpenGL ES app?
EDIT:
Okay, so LabelMaker interferes with alpha blending: the texture (created from a PNG with a transparent background) now has a solid black background rather than a transparent one. If I comment out all the LabelMaker-related code, it works fine.
UPDATE:
Never mind. I took a look at the code and found that LabelMaker was disabling blending after drawing the labels.
I think this is what you are looking for.
If you don't want to use GL extensions, you need to create the font as a bitmap and then create a class that converts a string into quads you can draw.
I use this method with the two fonts in my game. I have a class that takes a wide texture with all the letters evenly spaced, plus a string that matches the image, and then uses lookups on the letters to find how far into the bitmap each character sits.
Your other option is to render your text to an offscreen bitmap using Android, and then bind that bitmap as a texture. This lets you use Android's built-in font processing and rendering to create texture-based fonts.
I have not used the second method yet, but I have rendered Google Maps to an offscreen canvas and then bound the bitmap as a GL texture, so doing it for text should be much simpler.
If you are planning to modify string data in a GL loop, you also really need to worry about StringBuilder, because it causes GC and performance issues. I hardcode all my strings so nothing allocates, and all my rapidly changing numbers are drawn through a second draw function dedicated to drawing changing numbers without using StringBuilder.
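A minimal sketch of that second approach, assuming OpenGL ES 2.0 (GLES20) and that this runs on the GL thread; the text size and bitmap dimensions are arbitrary:
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.opengl.GLES20;
import android.opengl.GLUtils;

public class TextTexture {

    // Draws the string into an offscreen Bitmap with Android's font rendering,
    // then uploads that Bitmap as a GL texture and returns the texture id.
    public static int create(String text) {
        Bitmap bitmap = Bitmap.createBitmap(256, 64, Bitmap.Config.ARGB_8888);
        bitmap.eraseColor(Color.TRANSPARENT);

        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        paint.setTextSize(32f);
        paint.setColor(Color.WHITE);

        Canvas canvas = new Canvas(bitmap);
        canvas.drawText(text, 0, 48, paint); // baseline roughly near the bottom

        int[] ids = new int[1];
        GLES20.glGenTextures(1, ids, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, ids[0]);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);

        bitmap.recycle(); // GL owns a copy of the pixels now; free the Android bitmap
        return ids[0];
    }
}
The returned texture id can then be mapped onto a quad like any other texture (with blending enabled so the transparent background shows through).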
