I'm seeing a barely noticeable but annoying, random glitch while tweening an actor with the Universal Tween Engine. The actor is just a group with an image, and the tween is a simple linear move from the right of the screen to the left, repeating forever.
FPS is always showing 60.
Any idea?
This is the code:
public class BackgroundScreen extends AbstractScreen {
public BackgroundScreen() {
stage = new Stage();
stage.setViewport(Properties.VIRTUAL_WIDTH, Properties.VIRTUAL_HEIGHT, false);
createRock();
}
private void createRock() {
rock = new GameElement(atlas.createSprite("obj-stone"));
rock.setX(Properties.VIRTUAL_WIDTH);
rock.setY(100);
float duration=5f;
Tween.to(rock, ActorAccessor.POSITION_XY, duration / 2)
.ease(Linear.INOUT)
.target(-rock.getWidth(), rock.getY())
.repeat(Tween.INFINITY, 0)
.start(Resources.tweenManager);
stage.addActor(rock);
}
@Override
public void render(float delta) {
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
Resources.tweenManager.update(delta);
stage.act(delta);
stage.draw();
}
}
UPDATE
This is a link to a YouTube video showing the effect. A few considerations:
http://www.youtube.com/watch?v=0pVJbGFciyo
a) When screen-recording the video, the glitch is more pronounced, as you can see.
b) I can't see any glitch on my Galaxy Nexus.
c) I did the same tween with plain libGDX actions and the glitch is the same.
d) Somehow it is related to my PC.
You should play with the tween equation.
Off the bat, I'd say that Linear.INOUT is what's causing the sprite to glitch. Maybe simply running it with Linear.IN will do the trick.
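As a sketch of "playing with the equation", here is the tween from the question with only the easing swapped out. Quad.OUT (from aurelienribon.tweenengine.equations) is just one example I picked; any TweenEquation can be dropped in to see whether the glitch changes:

// Identifiers (rock, ActorAccessor, duration, Resources.tweenManager) are taken
// from the question's code; only the ease() call differs.
Tween.to(rock, ActorAccessor.POSITION_XY, duration / 2)
    .ease(Quad.OUT)
    .target(-rock.getWidth(), rock.getY())
    .repeat(Tween.INFINITY, 0)
    .start(Resources.tweenManager);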
I've been struggling with how to use and set up Viewports in libGDX for quite some time. I want to render everything as if it's on a 1920x1080 display and have it scale to fit whatever display it's actually on, and I need some help getting it to work like that.
This is what I want it to look like (taken from a computer with a 1920x1080 monitor), but when I run the same code on my laptop, which is 1440x800, it looks like this. I apologize for the poor photo of the screen; I couldn't get it to take a screenshot of the game running for whatever reason, but it shows that the top of the display remains unused and that not everything fits the display. This is the main code running the show:
public class Main extends Game {
...
public void create() {
...
Gdx.graphics.setFullscreenMode(Gdx.graphics.getDisplayMode());
//last
this.setScreen(new MainMenu(this));
}
public void render() {
super.render(); //important!
}
...
}
And then the MainMenu class
public class MainMenu implements Screen{
...
public MainMenu(final Main game) {
this.game = game;
cam = new OrthographicCamera();
cam.setToOrtho(false, 1920, 1080);
...
}
@Override
public void render(float delta) {
Gdx.gl.glClearColor(0.025f, .025f, 0.025f, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
cam.update();
game.batch.setProjectionMatrix(cam.combined);
...
if(Gdx.input.isKeyPressed(Keys.ESCAPE)) {
Gdx.app.exit();
}
}
...
}
How would I implement a Viewport or something of the like to get it to look the same on the smaller screen as it does on the larger? Any help is really appreciated! If you want to see the code that I left out for brevity, it's all on my GitHub. Thanks again!
That didn't take me long; hopefully someone will learn from me, though, ha ha.
Turns out that when you use the camera with a fixed width and height like that, it does fill up the whole monitor, but the camera's width DOES NOT equal the value returned by Gdx.graphics.getWidth(). Because of this, all my code was rendering as if it were compressed, because it was referencing the width returned by Gdx.graphics and not camera.viewportWidth.
Lesson learned: Gdx.graphics.getWidth() can and will change depending on the device, and cam.viewportWidth won't. Oops!
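To make the difference concrete, a small hypothetical example (the sprite and the 100px size are made up, not from the question):

// Hypothetical: centre a 100px-wide sprite horizontally.
// This value depends on the physical window, so the layout shifts between devices:
float wrongX = Gdx.graphics.getWidth() / 2f - 50f;

// This value is always the virtual width the camera was set up with (1920 here),
// so the layout looks the same everywhere:
float rightX = cam.viewportWidth / 2f - 50f;
sprite.setPosition(rightX, cam.viewportHeight / 2f - 50f);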
I am trying out libGDX, and I have an actor which performs some action whenever we click on it. So far it has been working fine. Now I want to add light to the actor, and after doing some research I came across Box2DLights. When I tried adding it to my project, the actor's onClick, which was working fine, no longer seems to work. I am pretty sure this is due to the RayHandler/Box2DLights, because that is the only change I am making. Here is the minimal change that I made to include Box2DLights:
public class GameScreen implements Screen {
private RayHandler rayHandler;
private World world;
public GameScreen(Game game) {
this.game = game;
world = new World(new Vector2(0, 0), true);
rayHandler = new RayHandler(world);
rayHandler.setAmbientLight(0.1f, 0.1f, 0.1f, 1f);
rayHandler.setBlurNum(3);
}
@Override
public void show() {
viewport = new FitViewport(1080, 720);
stage = new Stage(viewport);
rayHandler.setCombinedMatrix(stage.getCamera().combined);
Gdx.input.setInputProcessor(stage);
}
@Override
public void render(float delta) {
//some custom rendering logic, but nothing related to rayHandler, excluding this for brevity.
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
stage.act(Gdx.graphics.getDeltaTime());
stage.draw();
rayHandler.updateAndRender();
}
Now, when I debugged, I realised that the onClick is registering a little below the actual actor, which means the coordinates somehow shifted (I know, weird).
Can you please help?
Thanks @Mikhail Churbanov for your response here.
If somebody else stumbles on this, here is the solution that worked:
viewport = new FitViewport(1080, 720);
rayHandler.useCustomViewport(viewport.getScreenX(),
viewport.getScreenY(),
viewport.getScreenWidth(),
viewport.getScreenHeight());
The explanation is that Box2DLights doesn't auto-acquire custom viewports and restores the 'default' one after updateAndRender() is called - you need to pass your custom 'fitted' viewport to the rayHandler so that it restores it correctly, using the rayHandler.useCustomViewport(...) method.
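If the window can be resized, the same call can be repeated in resize() after the viewport has been updated. A rough sketch using the field names from this answer (the resize body itself is an assumption, not from the original post):

@Override
public void resize(int width, int height) {
    viewport.update(width, height, true);
    // Re-apply the fitted viewport so RayHandler keeps restoring the right one
    // after each updateAndRender() call.
    rayHandler.useCustomViewport(viewport.getScreenX(), viewport.getScreenY(),
            viewport.getScreenWidth(), viewport.getScreenHeight());
    rayHandler.setCombinedMatrix(stage.getCamera().combined);
}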
All credits to @Mikhail Churbanov.
I'm currently working on a libGDX game, and before I give it the final touches I wanted to hear from experienced users about something that has been bothering me for a few days already.
If I want to support as many devices as possible, essentially I will be designing graphics for the biggest possible resolution, which is then scaled down if needed for smaller screens, right? How do I go about developing for a resolution that is even bigger than my laptop's (the 2015/16 generation of phones)? My laptop has a resolution of 1920x1080 px and the Samsung S7 is more than 2K wide.
Thank you!
I think what you are looking for is Viewports. You have to decide which strategy best fits your needs. For example, a FitViewport always keeps the aspect ratio you define, which might lead to black bars on some devices.
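A minimal sketch of the FitViewport approach, assuming a 1920x1080 virtual resolution (the class name is made up for illustration):

import com.badlogic.gdx.ScreenAdapter;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.utils.viewport.FitViewport;

// Hypothetical screen: everything is laid out in 1920x1080 virtual units and
// FitViewport scales that to the real window, letterboxing the leftover space.
public class FittedScreen extends ScreenAdapter {
    private final OrthographicCamera cam = new OrthographicCamera();
    private final FitViewport viewport = new FitViewport(1920, 1080, cam);
    private final SpriteBatch batch = new SpriteBatch();

    @Override
    public void resize(int width, int height) {
        viewport.update(width, height, true); // true re-centres the camera
    }

    @Override
    public void render(float delta) {
        viewport.apply();
        batch.setProjectionMatrix(cam.combined);
        batch.begin();
        // ... draw using 1920x1080 coordinates
        batch.end();
    }

    @Override
    public void dispose() {
        batch.dispose();
    }
}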
When I personally develop with libGDX, I place and size all objects relative to the screen width and height. This includes images, fonts, buttons, etc. This gives me a pretty consistent result across all devices, because most devices today have a 16:9 ratio or something close to it. For developing an image larger than your screen size, what's wrong with just using Photoshop to create an image of the specified size?
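A rough illustration of that relative placement (the button and the fractions are made-up values, not from any particular project):

// Hypothetical: size and place a Scene2D button as fractions of the screen,
// so the layout stays proportional on any roughly-16:9 device.
float w = Gdx.graphics.getWidth();
float h = Gdx.graphics.getHeight();

button.setSize(w * 0.25f, h * 0.10f);                           // 25% wide, 10% tall
button.setPosition(w / 2f - button.getWidth() / 2f, h * 0.15f); // centred, near the bottom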
Better to choose the screen width as 1280 and the screen height as 800, and also use a FillViewport. That way you will be able to render your game on almost all screens without stretching issues.
Viewports are the mechanism provided by libGDX to solve this multi-screen compatibility issue. Here is some sample code you can use as a reference:
public class myGame extends ApplicationAdapter {
public OrthographicCamera camera;
public Viewport viewPort;
private SpriteBatch batch;
private BitmapFont myScoreFont;
private Texture texture;
public myGame() {
}
@Override
public void create() {
myScoreFont = new BitmapFont(Gdx.files.internal(Constants.PATH_TO_MY_SCORE_FONT), true);
batch = new SpriteBatch();
float w = Gdx.graphics.getWidth();
float h = Gdx.graphics.getHeight();
texture = new Texture(Gdx.files.internal(Constants.PATH_TO_LEFT_BAR));
camera = new OrthographicCamera();
camera.position.set(0, 0, 0);
camera.update();
camera.setToOrtho(false, Constants.APP_WIDTH, Constants.APP_HEIGHT);
// Here is the viewport is setting up with the camera and the screen size
viewPort = new FillViewport(1280, 800, camera);
}
@Override
public void dispose() {
batch.dispose();
myScoreFont.dispose(); // also release the font and texture created in create()
texture.dispose();
}
@Override
public void render() {
Gdx.gl.glClearColor(1, 1, 1, 1);
Gdx.gl.glClear(GL30.GL_COLOR_BUFFER_BIT);
float deltaTime = Gdx.graphics.getDeltaTime();
batch.setProjectionMatrix(camera.combined);
batch.begin();
myScoreFont.draw(batch, "Score", 0, 0);
batch.end();
}
@Override
public void resize(int width, int height) {
// the game area will be resized as per the screen size of the device
viewPort.update(width, height);
}
@Override
public void pause() {
}
@Override
public void resume() {
}
}
It's my first time posting and I'm self-taught, so please be gentle!
I've been building a bomberman replica game in libGDX using Game and Screen classes:
public class Main extends Game {
...
@Override
public void create() {
levelScreen = new LevelScreen(playerCount, new int[playerCount]);
levelScreen.level.addAction(Actions.sequence(Actions.alpha(0), Actions.fadeIn(2f)));
this.setScreen(levelScreen);
}
However when the game launches there is no fade effect.
public class LevelScreen implements Screen {
...
@Override
public void render(float delta) {
Gdx.gl.glClearColor(1, 0.1f, 0.5f, 0);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.begin();
level.act();
level.draw();
batch.end();
}
I want this levelScreen to fade in from black but it just doesn't!
When the round is over I want to fadeOut of this levelScreen to black, then fadeIn to a trophyScreen from black:
(From Main Class)
@Override
public void render() {
super.render();
if (endRoundTimer <= 0) {
trophyScreen = new TrophyScreen(playerCount, levelScreen.getScore());
levelScreen.level.addAction(Actions.sequence(Actions.fadeOut(1), Actions.run(new Runnable() {
@Override
public void run() {
setScreen(trophyScreen);
}
})));
}
}
And I've tried using the show() method in the TrophyScreen:
public class TrophyScreen implements Screen {
...
@Override
public void show() {
stage.addAction(Actions.sequence(Actions.alpha(0), Actions.fadeIn(1)));
}
I've done loads of searching and tried various things but no joy. I'm sure I'm missing something somewhere in a draw() or render() method that is preventing the fade Action from taking place.
UPDATE1
@Override
public void draw() {
super.draw();
if (roundOver) {
this.getBatch().begin();
String s = String.format("%s", message);
font_text.draw(this.getBatch(), s, (90 + (2 * 30)), (this.getHeight() / 2));
this.getBatch().end();
}
}
For fading to work on actors, they must properly apply their own color's alpha in the draw method. And for an entire hierarchy of objects to fade at once, they must all also apply the parentAlpha parameter from the draw method signature.
So your draw method in any custom Actor subclass should look like this:
public void draw (Batch batch, float parentAlpha) {
Color color = getColor();
batch.setColor(color.r, color.g, color.b, color.a * parentAlpha);
//..do drawing
}
If you are using a Sprite in your Actor instead of a TextureRegion (which I don't recommend due to redundancies) you must apply the color to the Sprite instead of Batch.
Note that this method of fading the whole game is not a "clean" fade. Any actors that are overlapping other actors will show through each other when the parent alpha is less than 1 during the fade. An alternative that would provide a clean-looking fade would be to draw a copy of your background (or black) over your entire scene and fade that instead.
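As a rough sketch of that overlay idea (the 1x1 pixmap texture and the durations are placeholders, not taken from the question's code):

// Hypothetical fade overlay: a 1x1 black texture stretched across the whole stage,
// added last so it draws on top of every other actor.
Pixmap pixmap = new Pixmap(1, 1, Pixmap.Format.RGBA8888);
pixmap.setColor(Color.BLACK);
pixmap.fill();
Texture blackTexture = new Texture(pixmap); // remember to dispose() this later
pixmap.dispose();

Image fadeOverlay = new Image(blackTexture);
fadeOverlay.setSize(stage.getWidth(), stage.getHeight());
stage.addActor(fadeOverlay);

// Fade in from black: the overlay starts fully opaque and fades away.
fadeOverlay.addAction(Actions.sequence(Actions.alpha(1f), Actions.fadeOut(2f)));

For the fade to black, run Actions.fadeIn(1f) on the same overlay and switch screens in a following Actions.run(...), mirroring the sequence already used in the question's render().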
I assume that level is an object of a class that extends Stage and that you are drawing a control inside the stage itself, which is weird. You are not applying any color to your font_text, which I assume is a BitmapFont.
Solution, the weird way
If you want to do it in this way you will need something like that:
@Override
public void draw() {
super.draw();
if (roundOver) {
getBatch().begin();
String s = String.format("%s", message);
font_text.setColor(getRoot().getColor());
font_text.draw(this.getBatch(), s, (90 + (2 * 30)), (this.getHeight() / 2));
getBatch().end();
}
}
getRoot() gets the root Group from the Stage; we use it because every action applied to the Stage is actually applied to this root Group. We take its color (which has an alpha channel) and copy that color to the BitmapFont.
This solution is weird, because you are essentially re-creating a Label inside the Stage itself. It is pointless; actors play on the stage, not inside it.
Solution, the good way
You want to draw text, right? So just use a Label, which is an actor that shows text. Actors do the job for you:
stage = new Stage();
Label.LabelStyle labelStyle = new Label.LabelStyle(bitmapFont, Color.WHITE);
Label label = new Label("Hi, I am a label!", labelStyle);
stage.addActor(label);
Then you can apply actions and they will work fine (and every actor can have its own actions applied).
stage.addAction(Actions.sequence(Actions.alpha(0), Actions.fadeIn(5)));
label.addAction(Actions.moveBy(0, 300, 15));
There are a lot of different actors, like TextButton, Image and ScrollPane. They are customizable, easy to manage, and they can be combined in groups and tables.
A better way would be to just start by drawing a black image over everything, so you don't have to mess with every scene object's alpha. Use layering to do that. This post may be helpful.
Then you can control its alpha channel, dropping it to 0 right before unpausing the game action so the game gets its drawing cycles back. Reactivate it when the stage ends for your fade-out effect.
Thank you cray, it's way better like this.
I am trying to use OpenCV background subtraction to detect moving objects. It works well for some videos, but for one particular video (captured by a still camera) it does not detect the moving pedestrians. Also, there are very light snow showers in the video which are hard to see with the naked eye. Could this be the reason it does not detect the moving objects, or could there be other reasons, like similar pixel values for the background and foreground objects?
This is the program code:
import processing.core.*;
import processing.video.*;
import gab.opencv.*;
public class BackgroundSubtraction extends PApplet {
Movie video;
OpenCV opencv;
public void setup(){
size(720,680);
video = new Movie(this, "/home/gurinderbeer/Downloads/IMG_1570.MOV");
opencv = new OpenCV(this, width, height);
opencv.startBackgroundSubtraction(0, 3, .5); // 5,3, .5
video.play();
}
public void draw() {
image(video, 0, 0);
opencv.loadImage(video);
opencv.updateBackground();
opencv.dilate();
opencv.erode();
noFill();
stroke(255, 0, 0);
strokeWeight(3);
for (Contour contour : opencv.findContours()) {
contour.draw();
}
}
public void movieEvent(Movie m) {
m.read();
}
public static void main(String _args[]){
PApplet.main(new String[] { BackgroundSubtraction.class.getName()});
}
}
These are a couple of snapshots from the video. We can hardly see any snow showers (although there actually are very light ones), and there are two pedestrians walking, but they are not captured by the contour detection.
Could these very light snow showers be the reason the walking pedestrians are not detected?
It could be due to background noise from the environment; you could apply a smoothing effect to your video before processing.
The smoothing will blur out the noise, which will improve the overall detection rate.
There is an example here.
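As a rough sketch of that, assuming the blur() filter exposed by the OpenCV for Processing (gab.opencv) library used in the question, with an arbitrary kernel size of 5:

public void draw() {
  image(video, 0, 0);
  opencv.loadImage(video);

  // Assumed API: gab.opencv's blur() smooths the working image in place.
  // Blurring before updating the background model averages out per-pixel
  // noise such as the light snow, so it is less likely to mask real motion.
  opencv.blur(5);

  opencv.updateBackground();
  opencv.dilate();
  opencv.erode();

  noFill();
  stroke(255, 0, 0);
  strokeWeight(3);
  for (Contour contour : opencv.findContours()) {
    contour.draw();
  }
}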