TL;DR: Is there any way of forcing Java2D graphics to go only through the OpenGL pipeline and ignore Direct3D and DirectDraw entirely (desired effect: opengl32.dll should be used before ddraw.dll)?
I have created a game using Java and Swing with the intent of publishing it on Steam. A problem has arisen: the Steam overlay doesn't appear. After asking in this discussion (not viewable for most people; the gist follows), the likely source of the problem was identified: the game initializes Direct3D and DirectDraw before OpenGL, so the overlay hooks onto the first of those, even though OpenGL is the acceleration type the Steam overlay actually supports here.
The problem would hopefully be fixed if Direct3D and DirectDraw were not used at all. That way the Steam overlay would not try to hook onto them first, and would just hook onto OpenGL.
I have tried the following flags to disable Direct3D and DirectDraw:
-Dsun.java2d.d3d=false -Dsun.java2d.ddoffscreen=false -Dsun.java2d.noddraw=true
This has had no noticeable effect (ddraw.dll is still being used).
OpenGL is enabled programmatically, because that is the only way the Steam API can be initialized before the OpenGL device is initialized:
//init the Steam API here; the API init is required before the OpenGL device init
System.out.println("The api was initialized successfully");
System.setProperty("sun.java2d.opengl", "True");
//continue with the game initialization: create a frame, game loop, callbacks, etc.
Console output:
The api was initialized successfully
OpenGL pipeline enabled for default config on screen 0
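For reference, the equivalent programmatic form of those flags would look like the sketch below, and it only has an effect if it runs before any AWT/Swing class is loaded (otherwise Java2D has already chosen its pipeline). A minimal sketch; the surrounding structure is illustrative:
public static void main(String[] args) {
    // Must run before anything touches AWT/Swing, or the pipeline is already chosen
    System.setProperty("sun.java2d.noddraw", "true");
    System.setProperty("sun.java2d.d3d", "false");
    System.setProperty("sun.java2d.ddoffscreen", "false");
    System.setProperty("sun.java2d.opengl", "True"); // capital "True" also prints the verbose pipeline message

    // init the Steam API, then create the frame, start the game loop, etc.
}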
It is worth noting that JavaFX is also used in the game but purely for audio. Nothing graphical is initialized. I am not sure if this would cause a problem or conflict.
In summary:
I'm trying to use only the OpenGL pipeline for Java2D so that the Steam overlay hooks onto the OpenGL device. At the moment ddraw.dll (DirectDraw) is used just before opengl32.dll, so the overlay tries, and fails, to hook onto ddraw.dll because it comes first. I'm trying to stop ddraw.dll from being used at all, so that the overlay only tries to hook onto opengl32.dll.
Side note: I'm not noticing any difference when using OpenGL versus standard Java2D. I'm getting the OpenGL pipeline enabled message (above), but is it possible that it's still not using it somehow?
As far as I know, the options you have used only apply to AWT, not to JavaFX. So JavaFX could indeed be the problem here. As you are not using any JavaFX graphics anyway, it might help to switch off JavaFX's hardware acceleration via -Dprism.order=sw. To verify that you are actually not using it, you can also switch on logging via -Dprism.verbose=true. I am not sure this helps, but at least it is worth a try.
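For example, a minimal sketch of setting those Prism properties programmatically; this only works if it runs before any JavaFX class (e.g. the audio/MediaPlayer code) is loaded, so that Prism has not yet picked a pipeline:
// Run at the very start of main(), before any JavaFX class is loaded
System.setProperty("prism.order", "sw");     // force the software renderer
System.setProperty("prism.verbose", "true"); // log which pipeline Prism actually picks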
Is it possible to do antialiasing with a shader (and if so, how), or is it only possible with FBOs, based on the information below?
I am trying to add antialiasing to the GUI in Minecraft, which runs on LWJGL.
I first came upon this video: https://www.youtube.com/watch?v=Pdn13TRWEM0 in which he talks about how to do antialiasing in LWJGL. But since Minecraft uses another framebuffer,
Display.create((new PixelFormat()).withDepthBits(24).withSamples(8));
GL11.glEnable(GL13.GL_MULTISAMPLE);
doesn't work.
From there, in another video, he describes doing it with FBOs:
https://www.youtube.com/watch?v=HCBoIvVmYgk
Would this be the only way for me to implement antialiasing, or is there an easier way?
OptiFine offers antialiasing, but that only takes effect within the game world itself.
With no antialiasing, a circle looks like this:
Which game version and modding framework are you using? There are a few ways of doing this; I happen to know one for Fabric (fabricmc) on everything from 1.17 upwards.
(disclaimer: this library is made and maintained by me)
https://github.com/0x3C50/Renderer is a rendering library for modern Fabric that includes an MSAAFramebuffer you can use to render things antialiased. Just wrap your rendering code in:
MSAAFramebuffer.use(samples, () -> {
// render
});
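If you'd rather not add a dependency, here is a rough sketch of the raw FBO approach from the second video, written against org.lwjgl.opengl.GL11/GL30 (the size, sample count and the target you blit into are illustrative; Minecraft binds its own main framebuffer rather than 0):
int samples = 8;
int width = 854, height = 480; // use the actual framebuffer size

// one-time setup: a multisampled FBO backed by a renderbuffer
int fbo = GL30.glGenFramebuffers();
int colorRb = GL30.glGenRenderbuffers();
GL30.glBindRenderbuffer(GL30.GL_RENDERBUFFER, colorRb);
GL30.glRenderbufferStorageMultisample(GL30.GL_RENDERBUFFER, samples, GL11.GL_RGBA8, width, height);
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fbo);
GL30.glFramebufferRenderbuffer(GL30.GL_FRAMEBUFFER, GL30.GL_COLOR_ATTACHMENT0, GL30.GL_RENDERBUFFER, colorRb);

// each frame: draw the GUI into the multisampled FBO...
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fbo);
// ... render the GUI here ...

// ...then resolve (blit) it into the target framebuffer (0 = the window's default framebuffer)
GL30.glBindFramebuffer(GL30.GL_READ_FRAMEBUFFER, fbo);
GL30.glBindFramebuffer(GL30.GL_DRAW_FRAMEBUFFER, 0);
GL30.glBlitFramebuffer(0, 0, width, height, 0, 0, width, height, GL11.GL_COLOR_BUFFER_BIT, GL11.GL_NEAREST);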
I am using Java OpenGL (JOGL).
I know that a GL context is tied to the thread it is current on.
But I have to convert local coordinates to global ones when the mouse is clicked.
Can I use a cached GL context from the AWT event thread?
If not, is there an alternative approach?
Please read the JOGL user's guide. You mustn't use a GL instance while its OpenGL context isn't current, you mustn't store the GL instance, you should use it inside a GLEventListener, and you should avoid passing it around. You can use GLAutoDrawable.invoke() to execute a task when the drawable is displayed. I don't advise making the OpenGL context current yourself, even though it is possible.
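For example, a rough sketch of running one-off GL work through invoke() (the canvas variable and the work inside are illustrative; package names differ between JOGL releases, javax.media.opengl in older ones and com.jogamp.opengl in newer ones):
// e.g. called from a MouseListener on the AWT event thread
canvas.invoke(false, new GLRunnable() {
    @Override
    public boolean run(GLAutoDrawable drawable) {
        GL2 gl = drawable.getGL().getGL2();
        // do the coordinate conversion here, while the context is current on the GL thread
        return true; // see the GLRunnable javadoc for the meaning of the return value
    }
});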
Are you sure that you need something OpenGL related to convert local coordinates to global ones?
Finally, you should rather ask JOGL-specific questions on our official forum, as only very few JogAmp contributors come here. We can't be everywhere.
It seems that on Android with OpenGL, the activity gets recreated when you rotate the screen. Does this cause all the OpenGL programs to be unloaded from memory? When I call GLES20.glUseProgram(savedProgramId); it says that there is no such program. What am I doing wrong? (By the way, I keep my program id in a static field.)
You can make changes to your manifest to indicate that you will handle changes in screen orientation yourself.
See 'configChanges' and 'orientation' here: http://developer.android.com/guide/topics/manifest/activity-element.html
However, you'll still have the problem that your OpenGL context will be lost when the user switches between apps.
The most correct thing to do is to fully handle loss and recreation of the OpenGL context and all associated resources. In a large and complex project this can be very difficult.
A reasonable alternative is to use setPreserveEGLContextOnPause (http://developer.android.com/reference/android/opengl/GLSurfaceView.html#setPreserveEGLContextOnPause%28boolean%29) which is available on Android 4.0 and above.
The documentation states that the OpenGL context might not always be preserved, but my opinion is that it works well enough to ship with and avoids a lot of complicated code. When your app is in the background, it might get terminated due to memory pressure anyway, so if it's terminated occasionally due to a device's limit on EGL contexts then that seems acceptable to me.
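A minimal sketch of that setup (the renderer class name is illustrative):
GLSurfaceView view = new GLSurfaceView(context);
view.setEGLContextClientVersion(2);
view.setPreserveEGLContextOnPause(true); // API 11+, best effort only
view.setRenderer(new MyRenderer());      // MyRenderer implements GLSurfaceView.Renderer

// Even with this, be prepared to rebuild shaders/programs/textures in
// onSurfaceCreated(), since that callback fires whenever the context was lost.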
I'm looking into running a libGDX-based app on an Ubuntu server. The server has no GPU. The purpose is to render animations defined as libGDX "games". The perfect setup would pipe frames out to standard output. I'm OK with some kind of software-simulated GPU.
I'm aware of libGDX headless mode, but as far as I understand it will not render any graphics.
LibGDX on Linux depends upon the LWJGL library for setting up rendering support. So, what you want to find out is if/how LWJGL can run without a GPU. Both LibGDX and LWJGL expose fairly low-level OpenGL commands, so I think what you need is the Mesa library (it exposes OpenGL support, but does everything in software).
From this question: Does LWJGL (bare) use software or hardware rendering? it looks like you can ask LWJGL not to require hardware support. But I think you still need some software support enabled (and I believe that would be Mesa).
If you get this stack working, I think the next step is to have your "game" render to a FrameBuffer and then write the FrameBuffer contents out to a file using libGDX or plain OpenGL calls, as sketched below. (There may be ways to get lower layers of the stack to render to a file, too.)
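A rough sketch of the FrameBuffer-then-write-to-file step in libGDX (the file name, format and sizes are illustrative; adapt the output side if you want to pipe raw frames to stdout instead):
// created once, e.g. in create():
// FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGB888, width, height, false);

// each render() call:
fbo.begin();
// ... draw the frame here ...
byte[] pixels = ScreenUtils.getFrameBufferPixels(0, 0, fbo.getWidth(), fbo.getHeight(), true);
fbo.end();

Pixmap pixmap = new Pixmap(fbo.getWidth(), fbo.getHeight(), Pixmap.Format.RGBA8888);
BufferUtils.copy(pixels, 0, pixmap.getPixels(), pixels.length);
PixmapIO.writePNG(Gdx.files.local("frame.png"), pixmap);
pixmap.dispose();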
I am currently working on a notification app, using Java.
I need the window to be on top of any app for a short time (about 10 seconds), including apps running OpenGL and DirectX (e.g. computer games). I know that JFrame.setAlwaysOnTop(true); only works in windowed environments, which fullscreen OpenGL and DirectX applications are not.
I thought about the OpenGL bindings, but I can't seem to find anything about DirectX.
Maybe it is possible using JNI?
Any ideas are welcome :-)
Edit: Thanks for the answers guys, but I actually gave up on it and decided to force the game I am making the overlay for into windowed fullscreen mode. I'll just have to wait and see what the beta testers have to say about the reduced FPS :-) Thanks again.
Don't bother with DirectX and Java; there is an impedance mismatch there. Java is cross-platform and OpenGL is cross-platform. If you insist on using DirectX, you could try SWT: since it uses native operating system resources, you might have a chance.
For OpenGL, what you are looking for is canvas integration with a panel/JFrame; you can use JOGL for that. I think LWJGL has an implementation too, but the JOGL one is better. Then you can set alwaysOnTop on that JFrame, as sketched below.
kenai.com/projects/jogl
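A minimal sketch of that setup (the window size and the GLEventListener are illustrative):
GLCanvas canvas = new GLCanvas(new GLCapabilities(GLProfile.getDefault()));
canvas.addGLEventListener(new MyNotificationRenderer()); // your GLEventListener

JFrame frame = new JFrame("Notification");
frame.setUndecorated(true);
frame.setAlwaysOnTop(true);
frame.add(canvas);
frame.setSize(300, 80);
frame.setVisible(true);
Note that this still won't show above a game running in exclusive fullscreen mode, which matches the edit in the question about switching the game to windowed fullscreen.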
Your problem is not related to your code but to the graphics device.
The device can have only one accelerated surface in fullscreen mode. This is why you cannot launch two DirectX games in fullscreen even if you have a dual-head GPU.
Your best bet is to try to find the DX/OpenGL surface pointer and attach to it using the required APIs. I don't know whether this can be done in Java, but the approach would be:
1. get a pointer to the accelerated surface of the game or app that is running
2. create an accelerated graphics device
3. create a non-mipmap texture (a screenshot) from your UI
4. get a pointer to that texture's surface
5. stretch-rect from this surface onto the one you got in step 1
6. present
This may introduce flickering if you are not using vsync and refresh-rate-synced repetition for each frame. It may also cause a substantial framerate drop.
One example of this approach is FRAPS, which draws the current framerate on top of the accelerated app.