I was wondering if there is a way to find out the size of a person's finger when they touch an Android device. I want to know this so I can adjust the sensitivity of certain objects in my game.
For devices that don't offer the touch getSize() function, you can show a high-resolution grid of 2D points on the start screen. On touch, detect which points were hit; that gives you a fairly accurate estimate of the finger's contact area.
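On devices that do report touch size, a minimal sketch using the standard MotionEvent API inside a View could look like this; scaleSensitivity() is a hypothetical helper in your game:

// getSize() returns a normalized, device-scaled value in 0..1;
// getTouchMajor() gives the major axis of the contact ellipse in pixels.
@Override
public boolean onTouchEvent(MotionEvent event) {
    float size = event.getSize();          // normalized contact size
    float majorPx = event.getTouchMajor(); // contact ellipse major axis (px)
    scaleSensitivity(size, majorPx);       // hypothetical game hook
    return true;
}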
I am writing a program that is supposed to display 3D point clouds. For this purpose, I am using the jMonkeyEngine. Unfortunately, I do not like the default camera behavior of jMonkey. Especially the mouse dragging and mouse wheel do not really do what I want. What I want is for them to behave like in the PCD viewer of the Point Cloud Library.
Mouse wheel: Should be faster, and the effect of the turning direction should be reversed.
Mouse dragging: In jMonkey, mouse dragging seems to change the camera's viewing direction in the world. I am not sure what exactly happens in the PCD viewer, but I believe the camera is moved through the world while staying fixed on the centroid of the displayed point cloud.
How can I change the behavior of the camera to fulfill my wishes? :)
1.
In the simpleInitApp() method (where 100 is an arbitrary number):
getFlyByCamera().setZoomSpeed(100);     // speed up the mouse-wheel zoom
getFlyByCamera().setDragToRotate(true); // only rotate while dragging
Note that zooming doesn't actually change the position of the camera, just the FOV.
2.
The normal behavior of the camera is to rotate around its own axis. By offsetting the location of the camera as well, the effect you want can be achieved. In simpleUpdate():
// pull the camera back along its viewing direction so it orbits the origin
cam.setLocation(cam.getDirection().negate().multLocal(cam.getLocation().length()));
I consider the answer to the second question a bit of a quick hack, but it does the trick.
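Put together, a minimal sketch of both parts might look like this (assuming jMonkeyEngine 3's SimpleApplication; OrbitDemo is just an illustrative name):

import com.jme3.app.SimpleApplication;

public class OrbitDemo extends SimpleApplication {
    @Override
    public void simpleInitApp() {
        getFlyByCamera().setZoomSpeed(100);     // 100 is arbitrary
        getFlyByCamera().setDragToRotate(true);
    }

    @Override
    public void simpleUpdate(float tpf) {
        // keep the camera at its current distance from the origin, looking
        // at it, so dragging orbits the scene instead of turning in place
        cam.setLocation(cam.getDirection().negate()
                .multLocal(cam.getLocation().length()));
    }

    public static void main(String[] args) {
        new OrbitDemo().start();
    }
}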
I'm working on an Android project. Its goal is to detect a predefined movement gesture of the device: if the device rotates 45 degrees over an axis (X, Y, or Z) and then rotates back to its first position, the gesture has happened and the app should detect it. (The positions don't have to be exact; if it rotated 50 degrees instead of 45, that's not important!)
I tried to do that using the accelerometer and magnetic sensors of the device to continually monitor its orientation and detect the gesture, but the results weren't acceptable (explained here). Any starting point or idea, please?
It doesn't seem like anybody is going to provide a concrete and valuable answer. So let me try to do that.
First of all, even a somewhat primitive and straightforward approach makes it clear that you do not need to process all the data coming from the sensors. Moreover, humans are not that fast, so there is no need to process 10,000 values per second to identify any specific move.
What you actually need is just to identify key points and make your decision. Does that sound like a tangent to you?
What I'm actually suggesting is to test your solution using an ordinary mouse and an available gesture recognition framework, because the actual idea is pretty much the same. So please check:
iGesture - Gesture Recognition Framework
Mouse Gestures
That way it might be easier to develop a proper solution.
Update
Let's imagine I'm holding my phone and I need to rotate it 90 degrees counterclockwise and then 180 degrees clockwise. I hope you will not expect me to draw complex 3D shapes in the air (it would hurt usability, and frankly I do not want to lose my phone), so it is fair to say there might be a point we can track, or we can easily simulate one.
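For example, a rough sketch of that key-point idea on Android could integrate the gyroscope around one axis and watch for just two key points, "rotated past the threshold" and "came back". The class name, axis choice, and thresholds here are illustrative assumptions; register the listener for Sensor.TYPE_GYROSCOPE:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class TiltGestureDetector implements SensorEventListener {
    private static final float TRIGGER = (float) Math.toRadians(45);
    private float angle;         // integrated rotation around Z, in radians
    private long lastTimestamp;  // nanoseconds
    private boolean rotatedOut;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (lastTimestamp != 0) {
            float dt = (event.timestamp - lastTimestamp) * 1e-9f; // ns -> s
            angle += event.values[2] * dt;  // angular speed around Z (rad/s)
            if (!rotatedOut && Math.abs(angle) > TRIGGER) {
                rotatedOut = true;              // key point 1: rotated away
            } else if (rotatedOut && Math.abs(angle) < TRIGGER / 4) {
                rotatedOut = false;             // key point 2: rotated back
                onGestureDetected();            // the gesture has happened
            }
        }
        lastTimestamp = event.timestamp;
    }

    private void onGestureDetected() { /* notify your app here */ }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}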
Please see my other answer for a simple but working solution to a similar problem:
I just noticed that when I turn the screen off on my mobile device and turn it on again, some graphics resize or disappear. My game is coded for an 800x480 resolution, and my HTC Desire HD (which has an 800x480 screen) doesn't have this problem. However, when tested on my HTC One or a Samsung Galaxy S3, the graphics scale or behave weirdly.
The only thing the misbehaving objects have in common is that they rotate every frame. Stationary objects don't seem to be affected at all.
I have stars that rotate and scale up/down every frame, and I have a moving block that goes left/right or up/down. The moving block seems to ignore collision when the screen is turned back on and disappears into places it should not be able to reach.
Any ideas?
Thanks in advance.
When you turn off the screen, rendering is paused (i.e. no calls to render() are made). When you resume, Gdx.graphics.getDeltaTime() will be very large, because the last frame was rendered at least several seconds ago. So the delta time values that are normally on the order of 0.0166 (60 FPS) will now be on the order of a hundred times greater.
If you are using this delta to step a physics simulation or collision check, that will go berserk because the step is far too large. Rotation shouldn't be a real problem, but scaling will also go through the roof.
A simple way to guard against this is to put in something like:
if (delta > 0.1f)
    delta = 0.0166f;
so you never take a really large step.
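A slightly different but common variant is to cap the delta where you read it. In a libGDX render() method that might look like the following sketch; update() and draw() stand in for your own movement/collision step and rendering:

@Override
public void render() {
    // cap the frame delta so the first frame after resume
    // doesn't take a huge simulation step
    float delta = Math.min(Gdx.graphics.getDeltaTime(), 0.1f);
    update(delta);  // your own movement / collision step
    draw();         // your own rendering
}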
So I want to know how you would measure different light intensities when a finger is pressed on the Android device's camera with the flash on. I have read all over the internet about exposure, light sensors, etc., but I don't know where to start :(. So far I have made a program that opens the camera using SurfaceHolders and SurfaceViews, with the flash on as well. When I put my thumb against the camera, I can see that it turns a pinkish color with small color changes throughout the area of my thumb. How can I take this information from the camera and use it for other things, like measuring heart rate? Thank you so much.
You might want to investigate looking at ratios between red and blue light instead of absolute brightness. You may find that this measurement helps get rid of some of the common-mode noise likely to exist in an absolute brightness measurement.
Your blood doesn't actually turn blue when it isn't oxygenated, but it does change to a different shade of red. You might be able to make a primitive O2 saturation measurement with that camera. You can pick up an actual home O2 saturation / pulse meter at a local pharmacy for less than $50 if you want some real data to correlate against. I believe that the "real" sensors correlate an IR measurement with red light.
You also might want to see if there is some kind of auto white balance going on in the image sensor that needs to be disabled (this will be specific to whatever device you are using).
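As a sketch, using the (pre-Camera2) android.hardware.Camera API the question is already working with, locking auto white balance where supported could look like this; camera is assumed to be your already-opened Camera instance:

Camera.Parameters params = camera.getParameters();
if (params.isAutoWhiteBalanceLockSupported()) {
    params.setAutoWhiteBalanceLock(true);  // freeze AWB while measuring
}
params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH); // keep the flash on
camera.setParameters(params);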
What are you trying to do? I'll assume that you're trying to measure your heart rate by the amount of blood in your finger, so basically you have two states: one with more blood and one with less.
I would start by measuring the average brightness of the picture, like Totoo mentioned. After you know how to do this, make the program identify which state the finger is in from the picture: say, if the average brightness is less than 50, your heart just pumped, putting it in state 2; otherwise it hasn't, and it is in state 1.
After you know how to do this, you can tell when it switches from state 1 to state 2 and back. And by dividing the number of state switches by (time passed * 2), you get the heart rate.
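A rough sketch of that state machine, fed from a Camera.PreviewCallback, might look like the following. The NV21 preview format puts the luminance (Y) plane first, so averaging the first width*height bytes is a cheap brightness estimate; THRESHOLD is an assumed value you would have to tune per device:

private static final int THRESHOLD = 50;  // assumed, tune per device
private boolean inState2;
private int switches;
private final long startMs = System.currentTimeMillis();

public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    int pixels = size.width * size.height;
    long sum = 0;
    for (int i = 0; i < pixels; i++) {
        sum += data[i] & 0xFF;               // Y (luminance) plane, unsigned
    }
    int brightness = (int) (sum / pixels);

    boolean state2 = brightness < THRESHOLD; // darker = more blood = state 2
    if (state2 != inState2) {                // a state switch happened
        inState2 = state2;
        switches++;
    }
    double minutesPassed = (System.currentTimeMillis() - startMs) / 60000.0;
    if (minutesPassed > 0) {
        double bpm = switches / (minutesPassed * 2); // two switches per beat
    }
}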
hope I helped :)
Dear programmers/scripters/engineers/other people,
The problem:
I'm currently developing an augmented reality application for an Android 3.2 tablet and having some issues getting an accurate compass reading. I need to know exactly which direction the tablet is facing (rotation around the z axis), measured from north. It doesn't matter if it's in degrees or radians.
What I have tried so far:
I used the magnetometer and accelerometer to calculate the angle, but it has one big disadvantage: if you rotate 90 degrees, the sensors measure a larger or smaller angle, even when I'm in an open field far away from metal or any magnetic objects. Even applying the declination doesn't solve it.
Using the gyroscope would be an option. I have tried measuring the rotation speed and accumulating it into a variable to track the exact view direction. There seems to be a factor that causes distortion, though: I found that fast rotations throw off the direction measurement. The gyro's drift wasn't that troublesome; the application checks the other sensors for movement, and if none is detected, the gyro's rotation change isn't taken into account.
The rotation vector works okay, but it has some of the same issues as the gyroscope. If you move slowly and then stop suddenly, it drifts away for a few seconds. Another problem is that it becomes inaccurate with quick rotations, depending on the speed and how many turns you've made. (You don't want to know how my co-workers look at me when I'm swinging the tablet in all directions...)
Sensor.TYPE_ORIENTATION: not much to say about it. It is deprecated for some reason, so I won't use it. A lot of examples on the internet use this sensor, and it's probably the same thing as the magnetometer/accelerometer combination anyway.
So I'm currently out of ideas. Could you help me brainstorm a solution?
Thanks in advance, yours sincerely, Roland
EDIT 1:
I am willing to provide the code I have tried.
I'm summing up our comments:
It's clear from this video that the sensors in phones are not very accurate to begin with. Also interesting to read is this.
It is important that the user calibrates the sensors by doing a figure-8 motion while holding the phone flat. An app can programmatically check if such a calibration is necessary and notify the user; see this question for details.
To eliminate jitter, the values obtained from the sensors need to be filtered by a low-pass filter of some kind. This has also been discussed on Stack Overflow; see the sketch after this list.
The orientation obtained is not relative to true north. To obtain true north one must use GeomagneticField.
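Putting the last two points together, a sketch of the accelerometer/magnetometer path with a simple low-pass filter and a true-north correction might look like this. ALPHA and the hard-coded latitude/longitude are placeholder assumptions; in a real app the coordinates would come from a location fix:

private static final float ALPHA = 0.15f;  // assumed filter strength
private final float[] gravity = new float[3];
private final float[] geomagnetic = new float[3];
private final float[] rotationMatrix = new float[9];
private final float[] orientation = new float[3];

public void onSensorChanged(SensorEvent event) {
    // low-pass filter both sensors to suppress jitter
    float[] target = event.sensor.getType() == Sensor.TYPE_ACCELEROMETER
            ? gravity : geomagnetic;
    for (int i = 0; i < 3; i++) {
        target[i] += ALPHA * (event.values[i] - target[i]);
    }
    if (SensorManager.getRotationMatrix(rotationMatrix, null,
            gravity, geomagnetic)) {
        SensorManager.getOrientation(rotationMatrix, orientation);
        float magneticAzimuth = (float) Math.toDegrees(orientation[0]);

        // correct magnetic north to true north with the local declination
        GeomagneticField field = new GeomagneticField(
                52.0f, 5.0f, 0f, System.currentTimeMillis());
        float trueAzimuth = magneticAzimuth + field.getDeclination();
    }
}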