I'm looking into running a libGDX-based app on an Ubuntu server. The server has no GPU. The purpose is to render animations defined as libGDX "games". The ideal setup would pipe frames out to standard output. I'm OK with some kind of software-simulated GPU.
I'm aware of libGDX's headless mode, but as far as I understand it, it will not render any graphics.
LibGDX on Linux depends on the LWJGL library for setting up rendering support, so what you want to find out is if/how LWJGL can run without a GPU. Both LibGDX and LWJGL expose fairly low-level OpenGL commands, so I think what you need is the Mesa library (it exposes OpenGL support but does everything in software).
From this question:
Does LWJGL (bare) use software or hardware rendering? it looks like you can ask LWJGL not to require hardware support. But I think you still need some software renderer available (and I believe that would be Mesa).
If you get this stack working, I think the next step is to have your "game" render to a FrameBuffer and then read the FrameBuffer's pixels back and write them out from inside libGDX or OpenGL. (There may be ways to get lower layers of the stack to render to a file, too.)
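Once frames can be read back from the FrameBuffer as raw RGBA bytes (e.g. via libGDX's ScreenUtils), the "pipe to standard output" part needs no GL at all; it is plain java.io. A minimal sketch, assuming the bytes come from a frame-buffer read-back and that a downstream consumer such as ffmpeg reads the raw stream (the class name and ffmpeg invocation are my own illustration, not part of libGDX):

```java
import java.io.IOException;
import java.io.OutputStream;

// Sketch: push raw RGBA frames to stdout so a consumer such as
//   ffmpeg -f rawvideo -pix_fmt rgba -s 640x480 -r 30 -i - out.mp4
// can encode them. The byte[] is assumed to come from a FrameBuffer
// read-back (e.g. via ScreenUtils in libGDX).
public class FramePipe {
    private final OutputStream out;

    public FramePipe(OutputStream out) {
        this.out = out;
    }

    // Writes one frame; the caller guarantees rgba.length == width * height * 4.
    public void writeFrame(byte[] rgba) throws IOException {
        out.write(rgba);
        out.flush(); // keep the downstream encoder fed frame by frame
    }
}
```

In the real app you would pass `System.out` as the sink and call `writeFrame` once per render loop iteration.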
Related
Is it possible to have an anti-aliasing shader (and if so, how), or is it only possible with FBOs, based on the information below?
I am trying to add anti-aliasing to the GUI in Minecraft, which runs on LWJGL.
I first came across this video: https://www.youtube.com/watch?v=Pdn13TRWEM0
in which he talks about how to do anti-aliasing in LWJGL. But since Minecraft uses another framebuffer,
Display.create((new PixelFormat()).withDepthBits(24).withSamples(8));
GL11.glEnable(GL13.GL_MULTISAMPLE);
doesn't work.
From there, in another video, he describes doing it with FBOs:
https://www.youtube.com/watch?v=HCBoIvVmYgk
Would this be the only way for me to implement anti-aliasing, or is there an easier way?
OptiFine allows for anti-aliasing, but that only applies within the Minecraft world itself.
With no anti-aliasing, a circle looks like this:
Which game version and modding framework are you using? There are a few ways of doing this; I happen to know one for FabricMC on 1.17 and up.
(disclaimer: this library is made and maintained by me)
https://github.com/0x3C50/Renderer is a rendering library for modern Fabric that includes an MSAAFramebuffer you can use to render things anti-aliased. Just wrap your rendering code in:
MSAAFramebuffer.use(samples, () -> {
// render
});
TL;DR: Is there any way of forcing Java2D graphics to only go through the OpenGL pipeline, and entirely and completely ignore Direct3D and DirectDraw (desired effect: opengl32.dll should be used before ddraw.dll)?
I have created a game using Java and Swing with the intent of publishing it on Steam. A problem has arisen: the Steam overlay doesn't appear. After asking in this discussion (not viewable for most people; keep reading for the gist), the possible source of the problem was identified: the game is initializing Direct3D and DirectDraw before OpenGL, and the overlay is hooking on to the first of these, while OpenGL is the acceleration type supported by the Steam overlay.
The problem would hopefully be fixed if Direct3D and DirectDraw were not used. That way the Steam overlay would not try to hook on to them first, and would just hook on to OpenGL.
I have tried the following flags to disable Direct3D and DirectDraw:
-Dsun.java2d.d3d=false -Dsun.java2d.ddoffscreen=false -Dsun.java2d.noddraw=true
This has had no noticeable effect (ddraw.dll is still being used).
OpenGL is enabled programmatically because this is the only way the Steam API can be initialized prior to the creation of an OpenGL device:
//init steam api here. api init is required before OpenGL device init
System.out.println("The api was initialized successfully");
System.setProperty("sun.java2d.opengl", "True");
//continue with the game initialization: creates a frame, game loop, callbacks, etc.
Console output:
The api was initialized successfully
OpenGL pipeline enabled for default config on screen 0
It is worth noting that JavaFX is also used in the game, but purely for audio; nothing graphical is initialized. I am not sure whether this could cause a problem or conflict.
In summary:
I'm trying to use only the OpenGL pipeline for Java2D so that the Steam overlay will hook on to the OpenGL device. ddraw.dll (DirectDraw) is being loaded just before opengl32.dll, so the overlay tries and fails to hook on to ddraw.dll because it comes first. I'm trying to stop ddraw.dll from being used at all, so that the overlay will only try to hook on to opengl32.dll.
Side note: I'm not noticing any difference when using OpenGL versus standard Java2D. I'm getting the OpenGL pipeline enabled message (above), but is it possible that it's still not using it somehow?
As far as I know, the options you have used only apply to AWT, not JavaFX, so JavaFX could indeed be the problem here. As you are not using any JavaFX graphics anyway, it might help to switch off JavaFX's hardware acceleration via -Dprism.order=sw. To verify that you are actually not using it, you could also switch on logging via -Dprism.verbose=true. I am not sure this helps, but at least it is worth a try.
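Combining the flags from the question with the Prism flags suggested here, a launch line might look like the following (the jar name is a placeholder; which flags actually take effect depends on the JDK build and version):

```shell
# Force the OpenGL pipeline for Java2D, disable Direct3D/DirectDraw,
# force JavaFX (Prism) into software rendering, and log what Prism picks.
java \
  -Dsun.java2d.opengl=True \
  -Dsun.java2d.d3d=false \
  -Dsun.java2d.noddraw=true \
  -Dprism.order=sw \
  -Dprism.verbose=true \
  -jar game.jar
```

With -Dprism.verbose=true the console should report which Prism pipeline was selected, which confirms whether the software fallback is actually in use.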
We have spent many days of research but can't find a solution for the following project. We need to convert a Flex project to native iOS and Android apps. Because Flex supports Flash scripting, it easily implements some 3D effects like shadow, emboss, gradient, etc. Please check the link here to see the SWF file we have: http://projects.zoondia.org/signfabcreator/signCreator.swf. We need to convert all these features into native iOS and Android apps. We have researched the area and found that we can implement most of the items except one icon. The fourth icon has some 3D effects: shadow, border, emboss, contour, gradient, etc. Can anybody check this and guide us on whether it can be implemented on iOS and Android? If it is possible, it would be helpful if anybody could give me a clue about implementing those on both Android and iOS.
Very interesting! But I'm afraid you have to reimplement all this functionality yourself. Don't be upset; there is good news for you: OpenGL ES and GLSL are extremely portable, so you can reuse 100% of your shaders. What is even better, you can now share the other code too and stay native. Not long ago Intel announced the Multi-OS Engine, which enables you to develop native mobile applications for iOS and Android with Java. There are a bunch of tutorials inside the installation package; one of them is especially dedicated to cross-platform OpenGL capabilities. Please check out my OpenGLBox sample.
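To illustrate the point about shader portability: a GLSL ES fragment shader, such as the simple gradient below, can live as a plain string in shared Java code and be handed unchanged to any GLES2 driver on iOS or Android. This shader is a hypothetical example of my own, not taken from the linked project:

```java
// A tiny GLSL ES 2.0 fragment shader producing a vertical color gradient.
// Because the source is just a string, it can sit in shared cross-platform
// Java code and be compiled by the GLES2 driver on either OS.
public final class Shaders {
    public static final String GRADIENT_FRAGMENT =
        "precision mediump float;\n" +
        "varying vec2 vTexCoord;\n" +
        "void main() {\n" +
        "    vec3 top = vec3(1.0, 0.8, 0.2);\n" +
        "    vec3 bottom = vec3(0.6, 0.1, 0.1);\n" +
        "    gl_FragColor = vec4(mix(bottom, top, vTexCoord.y), 1.0);\n" +
        "}\n";

    private Shaders() {} // no instances; this is just a holder for shared sources
}
```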
My Android OpenGL ES 2.0 project doesn't run on the Android emulator (GPU host) or in virtual machines (Android-x86 ISO + VirtualBox), but there is no problem with physical Android devices.
So, can I use the libGDX desktop port to emulate Android's GLES20 without integrating other libGDX stuff into my Android project?
If yes, then how do I load shaders from Android's raw folder without the libGDX framework?
Besides libGDX, I found a working wrapper,
but if I have to maintain a wrapper myself, then isn't NDK+GLUT the faster option?
You can do something like that. Here is what the libgdx wiki says:
Clearly there's no OpenGL ES implementation on the desktop useable via
Java. Well, there is now. Libgdx emulates OpenGL ES via the standard
OpenGL API on Windows and Linux. Nearly all features are supported
such as fixed point math and so on. The only feature missing at the
moment is support for fixed point vertex buffer objects and some minor
getter methods.
Here is a tutorial on how you can use libgdx with the lwjgl backend.
Here is how shader programs are initialized.
You can see from the link above how shaders are loaded from resources.
Here is what ShaderProgram does under the hood.
You can see that it calls functions on a gl object, which contains methods emulating OpenGL ES 2.0.
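As for loading shader sources from the raw folder without libGDX: on Android the InputStream would come from Resources.openRawResource, and the stream-to-string part is plain JDK code. A minimal sketch (the class and method names are my own):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Sketch: load GLSL source without any libGDX dependency. On Android the
// InputStream would come from context.getResources().openRawResource(R.raw.vertex);
// the helper below is pure JDK, so the same code runs on the desktop port too.
public class ShaderLoader {
    public static String readAll(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            sb.append(new String(buf, 0, n, StandardCharsets.UTF_8));
        }
        return sb.toString();
    }
}
```

The returned string can then be passed to glShaderSource on either the real GLES20 driver or the desktop emulation layer.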
I am currently working on a notification app, using Java.
I need the window to be on top of any app for a short time (about 10 sec.), including those running OpenGL and DirectX (e.g. computer games). I know that JFrame.setAlwaysOnTop(true); only works in window-based environments, which OpenGL and DirectX are not.
I have thought about the OpenGL bindings, but can't seem to find anything about DirectX.
Maybe it is possible using JNI?
Any ideas are welcome :-)
Edit: Thanks for the answers guys, but I actually gave up on it and decided to force the game I am making the overlay for into windowed fullscreen mode. I'll just have to wait and see what the beta testers have to say about the reduced FPS :-) Thanks again.
Don't bother with DirectX and Java; there is an impedance mismatch there. Java is cross-platform and OpenGL is cross-platform. If you insist on using DirectX, you can try SWT; since it uses native operating-system resources, you might have a chance.
For OpenGL, what you are looking for is canvas integration with a panel/JFrame; you can use JOGL for that. I think LWJGL has an implementation too, but the JOGL one is better. Then you can set alwaysOnTop on that JFrame.
kenai.com/projects/jogl
Your problem is not related to your code but to the graphics device instead.
The device can have only one accelerated surface in full-screen mode. This is why you cannot launch two DirectX games in fullscreen even if you have a dual-head GPU.
Your best bet is to try to find the DX/OpenGL surface pointer and attach to it using the required APIs. I don't know if this can be done in Java, but the approach would be:
get a pointer to the accelerated surface of the game or app that is running
create an accelerated graphics device
create a non-mipmap texture (screenshot) from your UI
get a pointer to the texture surface
use a stretch rectangle from this surface to the one you got in step 1
present
This may introduce flickering if you are not using vsync and refresh-rate-synced repetition for each frame. It may introduce a substantial framerate drop too.
One example of this is FRAPS, which draws the current framerate on top of the accelerated app.