How to detect the zoom factor of an image - java

I read the related question on this topic here: How to detect subjective image quality
I would like to detect whether an image uploaded by the user could be zoomed in at a high percentage (say, 500%) and still be of good image quality. I understand that "good image quality" is subjective, but for my purpose I take it to mean "is the image pixelated?". I'm trying to find ways to do that.
Should I calculate the size of the uploaded image? The larger the size, the more it can be zoomed in?
Should I calculate the total pixel count of the image?
Should I read the metadata tags of the uploaded image?
Combination of multiple things?
Are there solutions/libraries out there that can detect whether an image is badly pixelated?

Use the horizontal and vertical pixel count.
As long as the image has more pixels than you're displaying, you can zoom. Once you have more display pixels than image pixels, you have to interpolate the pixels. The higher the interpolation, the blurrier the picture.
Say you have a 4000 x 3000 pixel picture, and you're displaying 640 x 480 pixels.
You can zoom up to 625% horizontally (4000 / 640) and 625% vertically (3000 / 480). The smaller zoom number would be 625%. Rounding to an even zoom number, this picture could be zoomed up to 600% without pixelation.
Now, you could zoom even higher than 600%, if you're willing to interpolate the pixels. How high can you zoom? That would depend on what you consider acceptable interpolation.
My guess would be 25% higher. For this example picture, you could go to 800% before the picture got too blurry.
If you want to be on the safe side, keep it below the calculated zoom size. If not, go as high as you wish and let the user determine how high is too high.
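A minimal Java sketch of the calculation above, assuming you know the image and display dimensions (the class and method names are just illustrative):

public final class ZoomLimit {

    // Largest zoom percentage at which the image still has at least one
    // source pixel per display pixel in both directions (no interpolation).
    public static int maxZoomPercent(int imageWidth, int imageHeight,
                                     int displayWidth, int displayHeight) {
        double horizontal = 100.0 * imageWidth / displayWidth;   // e.g. 4000 / 640 -> 625%
        double vertical   = 100.0 * imageHeight / displayHeight; // e.g. 3000 / 480 -> 625%
        // The limiting direction is the smaller of the two.
        return (int) Math.floor(Math.min(horizontal, vertical));
    }

    public static void main(String[] args) {
        System.out.println(maxZoomPercent(4000, 3000, 640, 480)); // prints 625
    }
}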

Related

How to use DIP/DP in LibGDX

I have been trying for the past few hours to find a solution to this problem, but I can't seem to find anything.
I am developing a game for Android using LibGDX. In the emulator the game looks fine, but when I play it on my phone everything is different and misplaced. The solution I found for this is to use density-independent pixels (dp) instead of regular pixels, so everything is placed correctly no matter what device I use. However, I can't seem to find a proper way to do that. The only relevant solution I have found was to use this:
public static float pixelToDP(float dp) {
    return dp * Gdx.graphics.getDensity();
}
I tried resizing some of the objects using the formula above, but they still look different from the emulator.
Please, if anyone has a solution that doesn't involve changing the OrthographicCamera (I already tried those), help me!
This answer is just to add to what TomGrill said in the comments.
The reason your game looks fine in the emulator is that you have used values that fit the resolution of the emulator.
If you position a sprite at 100,100 on a 1920x1080 resolution, the sprite will be in the upper (or lower, depending on how you orient your y axis) left corner.
On a 200x200 resolution, the sprite will be placed in the middle of the screen.
The size of the sprite also depends on the resolution / pixel density. If you have 1 pixel per inch, a 32x32 pixel sprite will be 32 inches wide and high. But on a screen with a high pixel density, let's say 100 px per inch, the 32x32 sprite will look pretty small.
This is where viewports come in. You choose a resolution, let's say 900x540, and you just code for that resolution. The viewport will make sure your game scales up or down to fit any resolution and pixel density. If you place a sprite in the middle of your 900x540 screen, the viewport will make sure it is placed in the middle of a 1920x1080 screen as well.
Even if you wanted to do these calculations yourself, Gdx.graphics.getDensity() is not of any use on its own. You need the width and the height of the physical screen to find the resolution, and what you would be doing next is reinventing the wheel.
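To illustrate the viewport approach, here is a minimal LibGDX sketch. The 900x540 virtual size is just the example figure from above, and the asset name is hypothetical; adapt both to your game:

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.utils.viewport.FitViewport;

public class MyGame extends ApplicationAdapter {
    private OrthographicCamera camera;
    private FitViewport viewport;
    private SpriteBatch batch;
    private Texture sprite;

    @Override
    public void create() {
        camera = new OrthographicCamera();
        viewport = new FitViewport(900, 540, camera); // virtual resolution you code against
        batch = new SpriteBatch();
        sprite = new Texture("sprite.png"); // hypothetical 32x32 asset
    }

    @Override
    public void resize(int width, int height) {
        // Rescales the 900x540 virtual world to whatever the real screen is.
        viewport.update(width, height, true);
    }

    @Override
    public void render() {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.setProjectionMatrix(camera.combined);
        batch.begin();
        batch.draw(sprite, 450 - 16, 270 - 16); // centre of the virtual screen
        batch.end();
    }

    @Override
    public void dispose() {
        batch.dispose();
        sprite.dispose();
    }
}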

How meters to pixel and scaling works in Box2d?

This is my first time using Box2D with LibGDX and I'm confused because now I'm dealing with meters and not pixels. For example, if I want to set the size of my shape to 80x80 pixels, how do I do that in meters? Obviously I would have to zoom the camera, which I have no idea how to do. I want to know how you can set the number of pixels that make up one meter.
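A common convention, sketched below, is to pick a pixels-per-meter constant and convert between the two units everywhere; the value 100 here is an arbitrary example, not something Box2D prescribes:

public final class Units {
    // Arbitrary example scale: 100 screen pixels correspond to 1 Box2D meter.
    public static final float PIXELS_PER_METER = 100f;

    public static float toMeters(float pixels) {
        return pixels / PIXELS_PER_METER;
    }

    public static float toPixels(float meters) {
        return meters * PIXELS_PER_METER;
    }
}

With that constant, an 80x80 pixel box becomes a 0.8 x 0.8 m body (note that PolygonShape.setAsBox takes half-extents, so you would pass 0.4), and the camera can simply view the world in meters, e.g. new OrthographicCamera(Gdx.graphics.getWidth() / PIXELS_PER_METER, Gdx.graphics.getHeight() / PIXELS_PER_METER).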

Android positioning in pixels works but shouldn't

I've been positioning things (bitmaps, text) on a surface view using pixels. For example, to centre something I would take half of the width and height of the rectangle representing the screen. Screen width and height are being returned as expected. My phone (480*800) is reporting the available screen as 442*800, with density 1.5, and running on a 240*320 emulator the available screen is reported as 221*320, density 0.75. All as expected.
But these values, surely, are pixels, and so surely to centre something I should first multiply by the density factor before halving. The strange thing is everything is centred perfectly, on both the low res emulator and my high res phone. This doesn't make sense to me.
On both screens, in order to get text to appear the same, I set the size using a density scaled value, as I'd expect, and this works.
In the manifest I've declared support for all screen sizes (although I was seeing the same 'correct' behaviour before I did this as well).
Why are the pixel values 'working' without me adjusting them for density? I'm now very confused.
A pixel is a pixel is a pixel!
It is the lowest common denominator of screens and sizes.
Density is how many pixels are packed into a unit of measure, typically an inch, on the physical display.
For example, taking a physical width of 5 inches, with a resolution of 500x1000 yields a density of 200dpi (1000 pixels in 5 inches). A physical width of 4 inches with the same resolution yields a density of 250dpi (1000 pixels in 4 inches).
x=500 (half the width) is the centre of the screen on both, and it is independent of density. Indeed, half the width in pixels is the centre of the screen for any size screen at any density.
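As a rough illustration, assuming you are drawing onto a Canvas obtained from the SurfaceView (the class and method names are just illustrative):

import android.graphics.Bitmap;
import android.graphics.Canvas;

class PixelCentering {
    // Centre a bitmap using raw pixel coordinates; no density factor is needed,
    // because canvas.getWidth()/getHeight() are already reported in pixels.
    static void drawCentered(Canvas canvas, Bitmap bitmap) {
        float left = (canvas.getWidth() - bitmap.getWidth()) / 2f;
        float top = (canvas.getHeight() - bitmap.getHeight()) / 2f;
        canvas.drawBitmap(bitmap, left, top, null);
    }
}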

Java drawImage interpolation

My question involves the drawImage method in Java Graphics2D (this is for a desktop app, not Android).
My BufferedImage that I'd like to draw contains high-resolution binary data: most pixels are black, but I have some sparse green pixels (the green pixels represent data points from an incoming raw data stream). The bitmap is quite large, larger than my typical panel size. I made it large so I could zoom in and out. The problem is that when I zoom out I lose some of my green pixels. As an example, if my image is 1000 pixels and my panel is 250 pixels, only 1 out of 4 pixels in each direction (X and Y) survives. If I use nearest-neighbour interpolation when I scale, pixels can simply disappear to black. If I use something like bilinear interpolation, my green pixel gets recoloured to somewhere between black and green.
I understand all this behaviour, but my question is: is there any way to get the behaviour I want, which is to make sure that any non-black pixel is drawn at its full intensity? Perhaps something like a "max-hold" interpolation.
I realize I could probably do what I want by drawing shape primitives over a black background, and maybe this is what I'll have to do. But there is a reason I'm using bitmaps (it has to do with the fact that I'm showing the data in a falling spectrogram-type display, and it does have a mode where all the pixels can be coloured, not just black and green).
Thanks,
You could look at the implementation of drawImage and override it to get your desired behaviour; however, the core of the scaling probably uses hardware acceleration, so reimplementing it in Java would be really slow.
You could look into JOGL, but my impression is that, if your pixels are really sparse, just drawing the green pixels over a black background (or over an image) would be both easy to code and very fast.
You could even have a heuristic that switches from painting the dots to scaling the image if the number of dots starts getting too high.
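A minimal sketch of the "paint the dots as primitives" idea; the data-point list and the scale parameter are hypothetical stand-ins for however your panel stores its points and zoom factor:

import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.Point;
import java.util.List;

class SparsePointPainter {
    // Instead of scaling a huge bitmap, fill the panel with black and plot each
    // data point at its scaled position, so no point is lost to interpolation.
    void paint(Graphics2D g, int panelWidth, int panelHeight,
               List<Point> dataPoints, double scale) {
        g.setColor(Color.BLACK);
        g.fillRect(0, 0, panelWidth, panelHeight);
        g.setColor(Color.GREEN);
        // Dots stay at least 1 px when zoomed out and grow with the zoom factor.
        int dot = Math.max(1, (int) Math.round(scale));
        for (Point p : dataPoints) {
            int x = (int) Math.round(p.x * scale);
            int y = (int) Math.round(p.y * scale);
            g.fillRect(x, y, dot, dot);
        }
    }
}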

Multiple drawables background images to fit the screen resolutions

Alright, so I have an image whose resolution is 1054x1054.
I want to set that image so it fits the screen size of Android devices exactly. (I'll cut the image in an editor as necessary.)
So my question is: how can I know what resolution my image should be so it covers the device background without needing to be resized (for mdpi, hdpi and ldpi)?
I'm asking because I keep misunderstanding how "multiple screen resolutions" really work.
Even if you design your image to resize for different screen sizes, you'll discover that each design requires a minimum amount of space. So each generalized screen size has an associated minimum resolution that's defined by the system.
The Android OS counts these sizes in "dp" units (the same units you should use when defining your layouts), which allows the system to avoid worrying about changes in screen density.
xlarge screens: image size at least 960dp x 720dp
large screens: image size at least 640dp x 480dp
normal screens: image size at least 470dp x 320dp
small screens: image size at least 426dp x 320dp
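If it helps to see the dp/pixel relationship in code, here is a small, purely illustrative sketch that reports a device's screen size in both pixels and dp using DisplayMetrics:

import android.content.Context;
import android.util.DisplayMetrics;

class ScreenSizeInfo {
    // Converts the physical screen resolution to dp so it can be compared
    // against the generalized size buckets listed above.
    static String describe(Context context) {
        DisplayMetrics dm = context.getResources().getDisplayMetrics();
        float widthDp = dm.widthPixels / dm.density;
        float heightDp = dm.heightPixels / dm.density;
        return dm.widthPixels + "x" + dm.heightPixels + " px = "
                + Math.round(widthDp) + "x" + Math.round(heightDp) + " dp"
                + " (density " + dm.density + ")";
    }
}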
