Trilateration of 3 Calculated Distances from WiFi Signal Strengths - java

I am using Android to scan WiFi APs at regular intervals. From each AP I get the signal strength (RSSI in dBm), and I am calculating the distance with this formula:
public double calculateDistance(double levelInDb, double freqInMHz) {
    // Free-space path loss solved for distance in metres:
    // FSPL(dB) = 20*log10(d) + 20*log10(f) - 27.55, with f in MHz.
    // (27.55 is the constant for metres; 32.44 is the kilometre constant.)
    double exp = (27.55 - (20 * Math.log10(freqInMHz)) + Math.abs(levelInDb)) / 20.0;
    return Math.pow(10.0, exp);
}
That is working fine. So I have three or more distances, and now I need to draw all APs at their fixed locations on a map. I did some reading on the internet and found trilateration (the process of determining absolute or relative locations of points by measurement of distances), but it looks like I need at least one point (x, y); at the moment I only have the distances calculated from the signal strength, which can be taken as the radii of the different circles.
I am confused because I don't have any concrete point (x, y) from which to start calculating the location of the mobile phone.
I just need to know whether there is a way to calculate that point, whether I can assume an initial point, or whether I am missing something.
Thank you, I really appreciate any help.

As Paulw11 mentioned in his comment, you have to know the exact position of all of the APs, or at least the position of one of them and the relative positions of the other two. The trilateration procedure then produces one circle per AP, and the intersection of those circles is the device's location.
Keep in mind that trilateration with Wi-Fi will produce an area instead of a point, i.e. a region of uncertainty with an accuracy of 2-3 m at best. Moreover, you are calculating the distance with the free-space path loss model, which is generic and does not hold in every environment, so this assumption will make your estimate even worse.
A better approach is to build a radio map of your area first from Wi-Fi measurements taken with the device, and then position against those Wi-Fi fingerprints; in other words, run a training period first. There are many tutorials on that.
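As for the missing (x, y): you can simply define your own local frame, e.g. put the first AP at the origin and measure the other AP positions relative to it. Below is a minimal Java sketch of the circle-intersection algebra under that assumption (all names are illustrative, and the AP coordinates must be surveyed in advance):

public class Trilateration {
    // Each AP i is at (xi, yi) in local metres with estimated distance ri.
    // Returns {x, y} of the device, or null if the APs are collinear.
    public static double[] trilaterate(double x1, double y1, double r1,
                                       double x2, double y2, double r2,
                                       double x3, double y3, double r3) {
        // Subtracting the circle equations pairwise removes the x^2 + y^2
        // terms and leaves a 2x2 linear system in x and y.
        double a = 2 * (x2 - x1), b = 2 * (y2 - y1);
        double c = r1 * r1 - r2 * r2 - x1 * x1 + x2 * x2 - y1 * y1 + y2 * y2;
        double d = 2 * (x3 - x1), e = 2 * (y3 - y1);
        double f = r1 * r1 - r3 * r3 - x1 * x1 + x3 * x3 - y1 * y1 + y3 * y3;
        double det = a * e - b * d;
        if (Math.abs(det) < 1e-9) return null; // collinear APs: no unique fix
        return new double[] { (c * e - b * f) / det, (a * f - c * d) / det };
    }
}

Given the 2-3 m uncertainty mentioned above, the three circles will rarely meet in a single point; with more than three APs, a least-squares fit over all of them is usually more robust than this exact solve.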

Related

How do I get the distance from an object to another object with a camera?

My FRC (robotics) team is having issues with image processing, and tomorrow is our last testing day before competition.
The camera is facing downward and tilted in the x direction. We are trying to calculate the distance that an object is to a fixed point on the same surface. We only need to calculate the x distance (in inches).
Here's a diagram.
The object could be anywhere on the line with the fixed point.
Here is the view from the camera
The tape measure represents the line in the diagram.
I know it's low res and not the best picture; I took it just before I left today. The tape measure is where the object could be, and we only care about its x position.
Other info if needed:
Camera: Pixy
Focal length: 28mm (1.1024")
Sensor size: 0.25"
Height of camera from surface (the ground in our case): 8"
We always know the x position (in pixels) of the object, we just need to calculate the distance (in inches) that the object is from the fixed point.
If you have any other questions please ask. Thanks.
You are on the right track with your image of the tape measure. All you need to do is manually determine, from that image, the inches (from zero) for each x-position (pixel). Create a lookup table that you can use in the code.
When you determine the x-position of the object and the x-position of the fixed point, look up the inches for each of these x-positions and subtract to get the distance between the object and the fixed point.
This approach is super simple, but also depends on proper calibration of the system. In particular, the operational setup (height, angle, camera optics, etc.) has to exactly match the setup when the test image was taken that was used to create the lookup table.
A standard technique is to calibrate the system by taking and processing a calibration image whenever the operational setup might have changed. For example, you could place a grid pattern (e.g., with one-inch squares) in the field of view. The idea is that you code a calibration analysis that determines the proper lookup-table values from that standard image.
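As an illustration of the lookup-table idea, here is a minimal Java sketch; the pixel/inch calibration pairs are made-up placeholders, and the real values would come from measuring your tape-measure image:

// Hypothetical calibration data: pixel x-positions and the inches they
// correspond to, read manually off the calibration image.
static final int[]    PIXELS = {   0,  40,  90, 150, 220, 319 };
static final double[] INCHES = { 0.0, 6.0, 12.0, 18.0, 24.0, 30.0 };

// Linear interpolation between the two nearest calibration points.
static double pixelToInches(int px) {
    for (int i = 1; i < PIXELS.length; i++) {
        if (px <= PIXELS[i]) {
            double t = (px - PIXELS[i - 1]) / (double) (PIXELS[i] - PIXELS[i - 1]);
            return INCHES[i - 1] + t * (INCHES[i] - INCHES[i - 1]);
        }
    }
    return INCHES[INCHES.length - 1]; // clamp beyond the last calibration point
}

// x distance in inches between the object and the fixed point (both in pixels).
static double distanceInches(int objectPx, int fixedPx) {
    return Math.abs(pixelToInches(objectPx) - pixelToInches(fixedPx));
}

Interpolating between calibration points keeps the table short; with enough measured pairs you could skip the interpolation entirely.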

Convert Latitude and Longitude values to a custom sized grid

I am making a Java program that classifies a set of lat/lng coordinates into rectangles of a custom size; in effect, it maps the surface of the earth onto a custom grid and identifies which rectangle/polygon a point lies in.
The way to do this I am looking into is by using a map projection (possibly Mercator).
For example, assuming I want to classify lat/lng pairs into 'squares' of 100m x 100m:
44.727549, 10.419704 and 44.727572, 10.420460 would classify to area X,
and
44.732496, 10.528092 and 44.732999, 10.529465 would classify to area Y,
as each pair of points is less than 100m apart.
(This assumes they lie within the same boundary, of course.)
I'm not too worried about distortion, as I will not need to display the map, but I do need to be able to tell which polygon a set of coordinates belongs to.
Is this possible? Any suggestions welcome. Thanks.
Edit
Omitting projection of the poles is also an acceptable loss
Here is my final solution (in PHP); it creates a bin for every 100m square:
function get_static_pointer_table_id($lat, $lng)
{
    $earth_circumference = 40000; // km
    // 0.0009 degrees of latitude is roughly 100 m anywhere on earth.
    $lat_bin = round($lat / 0.0009);
    // Length of the circle of latitude at this latitude, in km.
    $lng_length = $earth_circumference * cos(deg2rad($lat));
    // 10 bins per km gives 100 m wide longitude bins.
    $number_of_bins_on_lng = $lng_length * 10;
    $lng_bin = round($number_of_bins_on_lng * $lng / 360);
    // The bin's unique identifier.
    return $lat_bin . "_" . $lng_bin;
}
If I understand correctly, you are looking for
1. a way to divide the surface of the earth into approximately 100m x 100m squares, and
2. a way to find the square in which a point lies.
Question 1 is mission impossible with squares, but much less so with polygons. A very simple way to create the polygons would be to use the coordinates themselves. If each polygon spans 0.0009° in latitude and longitude, you will have an approximately square 100m x 100m grid at the equator, but the slices will become very thin close to the poles.
Question 2 depends on the approximation used to solve the challenge outlined above. If you use the very simple method above, then placing each coordinate into a bin is just a division by 0.0009 (and rounding down to the closest integer).
So, first you will have to decide what you can compromise. Is it important to have equal area in the polygons, equal longitudinal distance, equal latitude distance, etc.? Is it important to have four corners in the polygon? Is it important to have similar or almost similar polygons close to the poles and close to the equator? Once you know the limitations set by your application, choosing the projection becomes easier.
What you are trying to do here is a projection of an ellipsoid onto a flat surface. As long as your points are close together, and you don't mind getting the answer slightly wrong, you can assume that your projection plane touches the ellipsoid at the centre of your collection of points and that each degree of latitude and longitude covers a constant number of metres. The problem then becomes a simple planar calculation.
This is wrong, of course, so I would actually recommend that you look into map projections, pick one that makes sense, and go with that. Remember that you can move the centre of the projection to the centre of your set of points, which will reduce distortion.
I suspect that PROJ.4 might help you in terms of libraries. There also must be a good Java one but that is not my speciality.
Finally, you could assume that the earth is a sphere and do your calculations on the sphere. Or, if you really want to get it right, you can pick a standard earth ellipsoid and do the calculations on that.
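As a hedged illustration of the planar shortcut described above (the origin choice and the metres-per-degree constant are the usual small-area approximations, and all names are placeholders), in Java:

// Converts lat/lng to x/y metres in a local plane centred on an origin
// point. Only valid for points near the origin, and it ignores the
// ellipsoidal shape of the earth.
static double[] toLocalMetres(double lat, double lng,
                              double originLat, double originLng) {
    final double METRES_PER_DEGREE = 111320.0; // ~ one degree of latitude
    double x = (lng - originLng) * METRES_PER_DEGREE * Math.cos(Math.toRadians(originLat));
    double y = (lat - originLat) * METRES_PER_DEGREE;
    return new double[] { x, y };
}

Once the points are in local metres, snapping them to a 100m grid is just integer division of x and y by 100.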

Algorithm to make balloon fly to specified altitude/height

I'm looking for a way/algorithm to make a robot balloon fly to a certain altitude. The robot is controlled by a Raspberry Pi and has a propeller. Propeller speed can be set to several values (it uses PWM so technically 1024 different power outputs).
The balloon has a distance sensor pointing down, so it's possible to get the current height several times per second.
The only idea I have had so far is to measure the height constantly and set the speed based on the height left to travel. This doesn't seem like the best option, though, and I can't figure out how to make use of all the power outputs.
Any ideas would be welcome. I'm using Java to code the project but any high-level algorithm/description would be great!
Thx,
Magic
There is a great "game" available that lets you try and play around with exactly that problem: Colobot (seems to be open source now). Create a Winged-Grabber (or shooter if you are more the FPS type of person) and try to get it to fly to a specific destination using only the altitude and motor controls.
In general the pseudo-code by MadConan below is the way to go; the main task lies in writing a smart setPower function. In the end you need some smoothing function that reduces the power as you approach the desired altitude, but the exact values of that function depend completely on your hardware and the weight of the final system.
Depending on how valuable and/or fragile your setup is, you might also want to develop a learning system that uses the measured undershoot/overshoot to adjust the smoothing function while it runs. Make sure to take factors like up- and down-wind into account.
Pseudo code:
while (true) {
    double height = getHeight(); // from the distance sensor

    // Difference between the current height and the TARGET height:
    // positive values mean too low, negative values mean too high.
    double offset = TARGET_HEIGHT - height;

    // Set the power to some direct ratio of the offset. When the balloon
    // is at height 0 the offset is relatively large, so the power will be
    // set high. A negative offset lowers the power from its current value.
    setPower(offset); // I'll leave it up to you to figure out the ratio
}
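A standard refinement of this proportional loop is a PID controller, which adds an integral term (to remove the steady-state offset, e.g. the constant lift needed to hover) and a derivative term (to damp overshoot). Here is a hedged Java sketch, where the gains, the loop period and the hardware calls are all placeholders to be tuned on the real balloon:

class AltitudeController {
    static final double TARGET_HEIGHT = 10.0;          // metres, assumed target
    static final double DT = 0.1;                      // seconds between readings
    static final double KP = 0.8, KI = 0.05, KD = 0.3; // gains: tune these!

    void run() throws InterruptedException {
        double integral = 0, previousError = 0;
        while (true) {
            double error = TARGET_HEIGHT - getHeight();        // positive = too low
            integral += error * DT;                            // removes steady-state offset
            double derivative = (error - previousError) / DT;  // damps overshoot
            previousError = error;
            setPower(KP * error + KI * integral + KD * derivative);
            Thread.sleep((long) (DT * 1000));
        }
    }

    double getHeight() { /* read the distance sensor */ return 0; }
    void setPower(double power) { /* clamp to the 0..1023 PWM range and write */ }
}

Start with KI = KD = 0 and raise KP until the balloon oscillates around the target, then add KD to damp the oscillation and a small KI to close the remaining gap.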

Determining if an object is moving away from a point versus towards it

I am trying to practice my skills with using latitude and longitude and I'm attempting to determine the following: given a center point X on a map and a point around it called Y, how do I tell whether or not the points around the center are moving away from the center object or towards it using latitude and longitude?
Right now I have the center latitude and longitude and am focusing on one of the points around it. I have used the haversine formula to calculate the distance in miles between two lat/long pairs. Using this, I measured the initial distance from X to Y and assigned it to a variable. Upon Y's first move I recalculated the overall distance from X to Y and compared it with the initial one. If the new measurement is greater than the old, the distance from point X is increasing; if not, it's decreasing. Also, I check to make sure that point Y is ACTUALLY moving some distance with each move, not just going around the radius of point X in some weird fashion.
Does the way I'm doing things sound all right? I keep feeling like I need to fine-tune something, but I just can't put my finger on it.
Hopefully everything I'm saying makes sense and is not falling on deaf ears, and this doesn't get flagged as a non-constructive question; it definitely isn't one.
Yes, this is the correct way; I have done this some years ago:
In practice you get the coordinates from a GPS device, so you may want to consider additional filtering, e.g. ignoring situations where the device stands still, because those may introduce positional jumps.
In your question I saw that you already filter by distance moved: this is suitable!
You can use the haversine formula, like you propose. For high-load situations there are faster distance formulas for your task (small distances) that do not need as many trigonometric calls, but this is a minor topic.
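A minimal Java sketch of that approach, with the haversine distance and a movement threshold to filter GPS jitter (the threshold value is an assumption to be tuned):

static final double EARTH_RADIUS_M = 6371000.0;
static final double MIN_MOVE_M = 2.0; // ignore moves smaller than this

// Great-circle distance in metres between two lat/long points (haversine).
static double haversine(double lat1, double lon1, double lat2, double lon2) {
    double dLat = Math.toRadians(lat2 - lat1);
    double dLon = Math.toRadians(lon2 - lon1);
    double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
             + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
             * Math.sin(dLon / 2) * Math.sin(dLon / 2);
    return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

// Compares consecutive distances from the center X to decide Y's direction.
// Returns +1 if moving away, -1 if moving closer, 0 if below the threshold.
static int direction(double previousDistance, double newDistance) {
    if (Math.abs(newDistance - previousDistance) < MIN_MOVE_M) return 0;
    return newDistance > previousDistance ? 1 : -1;
}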

Using accelerometer, gyroscope and compass to calculate device's movement in 3D world

I'm working on an Android application which can calculate the device's movement in 6 directions. I think I can use acceleration as
"x = a * t^2", but a is not a constant, and this is the problem. How can I calculate the total movement?
The accelerometer gives you three axes (x, y, z). These are acceleration measurements, so it is harder to know what the position of the device is. But remember that acceleration is related to position through integration; for a constant acceleration a along one axis:
a(t) = a
v(t) = a*t + c
x(t) = (1/2)*a*t^2 + c*t + d
The problem is that you can't know c (the initial velocity) or d (the initial position): they are the constants of integration, which drop out when you differentiate, so the accelerometer alone cannot recover them. You can attempt to compensate by remembering the values you estimated last; after grabbing a few samples you can start to calculate position numerically from them.
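As a rough illustration, the numerical double integration might look like this Java sketch (trapezoidal rule, one axis; the drift problem discussed in the next answer applies in full):

class PositionIntegrator {
    private double velocity = 0, position = 0, lastAccel = 0;

    // Call once per sensor sample; dt = seconds since the previous sample,
    // accel = linear acceleration on one axis (gravity already subtracted).
    void onSample(double accel, double dt) {
        velocity += 0.5 * (lastAccel + accel) * dt; // integrate a -> v
        position += velocity * dt;                  // integrate v -> x
        lastAccel = accel;
    }

    double getPosition() { return position; }
}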
There is a significant amount of information about how to interpret the data from the sensors. Like figuring out where gravity is for orientation, and subtracting out gravity to get linear acceleration.
http://developer.android.com/reference/android/hardware/SensorEvent.html
Here is a way to come up with position using an accelerometer along with an algorithm for finding position in detail:
https://www.nxp.com/docs/en/application-note/AN3397.pdf
It is true, you get position by integrating the linear acceleration twice. But the error is horrible. It is useless in practice.
Here is an explanation why (Google Tech Talk) at 23:20. I highly recommend this video.
It is not the accelerometer noise that causes the problem but the gyro white noise, see subsection 6.2.3 Propagation of Errors. (By the way, you will need the gyroscopes too.)
A similar question is Distance moved by Accelerometer.
